Experimental Design of a UCAV-Based High-Energy Laser Weapon
2016-12-01
propagation. The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the...
NASA Astrophysics Data System (ADS)
Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik
2017-08-01
Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and Shainin Systems (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a company manufacturing perfume bottle caps (made of acrylic) using TM and SS to find the root cause of defects and to optimize the process parameters for minimum rejection. The experiments reduced the rejection rate from approximately 40% to 8.57% during trial runs, which is quite low, representing a successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a different order of significance. Taguchi Methods also require more experiments and consume more time than the Shainin System. The Shainin System is less complicated and easier to implement, whereas Taguchi Methods are statistically more reliable for optimization of process parameters. Finally, the experiments indicated that DoE methods are robust and reliable in implementation as organizations attempt to improve quality through optimization.
Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K. N.; Knobeloch, Daniel; Gerlach, Jörg C.; Zeilinger, Katrin
2008-01-01
Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes. PMID:19003182
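Optimum factor levels like the 30 ng/ml hepatocyte growth factor value above are typically located by fitting a second-order response surface and finding its stationary point. A minimal one-factor sketch, using synthetic (hypothetical) response data rather than the study's measurements:

```python
import numpy as np

# Hypothetical screening data: metabolic response peaking near 30 ng/ml
dose = np.array([10.0, 20.0, 30.0, 40.0, 50.0])        # growth factor dose, ng/ml
response = 100.0 - (dose - 30.0) ** 2 / 10.0           # synthetic quadratic response

# Fit y = a*x^2 + b*x + c and take the stationary point of the parabola
a, b, c = np.polyfit(dose, response, 2)
optimum = -b / (2.0 * a)
print(round(optimum, 2))  # 30.0
```

With more than one factor, the same idea generalizes to a multivariate quadratic model whose stationary point is found by solving the gradient equations, which is what response surface software does internally.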
Inductive Learning: Does Interleaving Exemplars Affect Long-Term Retention?
ERIC Educational Resources Information Center
Zulkiply, Norehan; Burt, Jennifer S.
2013-01-01
Purpose: The present study investigated whether or not the benefits of interleaving of exemplars from several categories vary with retention interval in inductive learning. Methodology: Two experiments were conducted using paintings (Experiment 1) and textual materials (Experiment 2), and the experiments used a mixed factorial design. Forty…
Does Class Matter? Mentoring Small Businesses' Owner-Managers
ERIC Educational Resources Information Center
Greenbank, Paul
2006-01-01
Purpose: This paper examines the way social class influences the relationship between business mentors and small business owner-managers. Design/methodology/approach: The paper is based on the author's experience of mentoring businesses with The Prince's Trust. Three businesses were selected as cases. The methodological approach involved…
Optimization of EGFR high positive cell isolation procedure by design of experiments methodology.
Levi, Ofer; Tal, Baruch; Hileli, Sagi; Shapira, Assaf; Benhar, Itai; Grabov, Pavel; Eliaz, Noam
2015-01-01
Circulating tumor cells (CTCs) in the blood circulation may play a role in monitoring, and even in early detection of, metastasis. Because of the limited presence of CTCs in circulation, a viable CTC isolation technology must provide a very high recovery rate. Here, we implement design of experiments (DOE) methodology to optimize the Bio-Ferrography (BF) immunomagnetic isolation (IMI) procedure for the EGFR high positive CTC application. All consecutive DOE phases (screening design, optimization experiments, and validation experiments) were used. A recovery rate of more than 95% was achieved while isolating 100 EGFR high positive CTCs from 1 mL of human whole blood. This recovery positions BF technology as one of the most efficient IMI technologies, ready to be challenged with patients' blood samples. © 2015 International Clinical Cytometry Society.
Periasamy, Rathinasamy; Palvannan, Thayumanavan
2010-12-01
Production of laccase using a submerged culture of Pleurotus ostreatus IMI 395545 was optimized by the Taguchi orthogonal array (OA) design of experiments (DOE) methodology. This approach facilitates the study of the interactions of a large number of variables spanned by factors and their settings with a small number of experiments, leading to considerable savings in time and cost for process optimization. The methodology identifies the factors with the greatest impact and enables calculation of their interactions in the production of industrial enzymes. Eight factors, viz. glucose, yeast extract, malt extract, inoculum, mineral solution, inducer (1 mM CuSO₄) and amino acid (L-asparagine) at three levels, and pH at two levels, with an OA layout of L18 (2¹ × 3⁷), were selected for the proposed experimental design. The laccase yield obtained from the 18 fermentation experiments performed with the selected factors and levels was further processed with Qualitek-4 software. The optimized conditions showed an enhanced laccase expression of 86.8% (from 485.0 to 906.3 U). The combination of factors was further validated for laccase production and Reactive Blue 221 decolorization. The results revealed an enhanced laccase yield of 32.6% and dye decolorization of up to 84.6%. This methodology allows the complete evaluation of main and interaction factors. © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
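The economy of the L18 layout above can be checked with plain arithmetic: a full factorial over one two-level factor and seven three-level factors would require every level combination, versus 18 runs for the orthogonal array. A quick sketch (standard arithmetic only, no study-specific data):

```python
import math

# Factor levels from the study: pH at 2 levels, seven medium components at 3 levels
levels = [2] + [3] * 7

full_factorial_runs = math.prod(levels)  # every possible level combination
oa_runs = 18                             # Taguchi L18(2^1 x 3^7) layout

print(full_factorial_runs)               # 4374
print(full_factorial_runs // oa_runs)    # 243-fold reduction in runs
```

The trade-off is that an L18 array estimates main effects (and a limited set of interactions) rather than the full interaction structure a factorial would resolve.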
Coscollà, Clara; Navarro-Olivares, Santiago; Martí, Pedro; Yusà, Vicent
2014-02-01
When attempting to discover the important factors and then optimise a response by tuning them, experimental design (design of experiments, DoE) provides a powerful suite of statistical methods for identifying significant factors and optimising a response with respect to them during method development. In this work, a headspace solid-phase micro-extraction (HS-SPME) method combined with gas chromatography tandem mass spectrometry (GC-MS/MS) for the simultaneous determination of six important organotin compounds, namely monobutyltin (MBT), dibutyltin (DBT), tributyltin (TBT), monophenyltin (MPhT), diphenyltin (DPhT) and triphenyltin (TPhT), has been optimized using a statistical design of experiments (DOE). The analytical method is based on ethylation with NaBEt4 and simultaneous headspace solid-phase micro-extraction of the derivatized compounds, followed by GC-MS/MS analysis. The main experimental parameters influencing the extraction efficiency selected for optimization were pre-incubation time, incubation temperature, agitator speed, extraction time, desorption temperature, buffer (pH, concentration and volume), headspace volume, sample salinity, preparation of standards, ultrasonic time and desorption time in the injector. The main factors (excitation voltage, excitation time, ion source temperature, isolation time and electron energy) affecting the GC-IT-MS/MS response were also optimized using the same statistical design of experiments. The proposed method showed good linearity (coefficient of determination R² > 0.99) and repeatability (1-25%) for all the compounds under study. The accuracy of the method, measured as the average percentage recovery of the compounds in spiked surface and marine waters, was higher than 70% for all compounds studied.
Finally, the optimized methodology was applied to real aqueous samples, enabling the simultaneous determination of all compounds under study in surface and marine water samples obtained from the Valencia region (Spain). © 2013 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Rhee, Jeong-eun
2013-01-01
This project began as a content analysis of five South Korean high school Social Studies textbooks. Yet, it has evolved into an epistemological experiment to pursue the question of "what does it mean to leave America for Asia, at least methodologically, for the researcher who left Asia for America?" Using the textbooks as a mediating…
Pant, Apourv; Rai, J P N
2018-04-15
A two-phase bioreactor was designed, constructed, and operated to evaluate chlorpyrifos remediation. Six biotic and abiotic factors (substrate-loading rate, slurry-phase pH, slurry-phase dissolved oxygen (DO), soil-water ratio, temperature and soil microflora load) were evaluated by design of experiments (DOE) methodology employing Taguchi's orthogonal array (OA). The six selected factors were considered at two levels in an L-8 array (2^7; 15 experiments) in the experimental design. The optimum operating conditions obtained from the methodology enhanced chlorpyrifos degradation from 283.86 µg/g to 955.364 µg/g, an overall enhancement of 70.34%. In the present study, a mathematical model was constructed from a few well-defined experimental parameters to understand the complex bioremediation process and to optimize the parameters to high accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
On the hunt for the gene of perspective taking: pitfalls in methodology.
Miklósi, Adám; Topál, József
2011-12-01
In this commentary, we evaluate the methodology of Udell, Dorey, and Wynne's (Learning & Behavior, in press) experiment in controlling for environmental factors and argue that their conclusion is not supported. In particular, we emphasise that comparative studies on dogs and wolves need to ensure that both species enjoyed the same rearing history, are comparable in age, and have the same experience with the testing conditions. We also argue that the utilisation of shelter dogs does not control for genetic effects on social behaviour. Finally, we propose a synergetic model to account for both genetic and environmental effects on interspecific social behaviour in dogs and wolves.
Sunderland, N; Bristed, H; Gudes, O; Boddy, J; Da Silva, M
2012-09-01
This paper introduces sensory ethnography as a methodology for studying residents' daily lived experience of social determinants of health (SDOH) in place. Sensory ethnography is an expansive option for SDOH research because it encourages participating researchers and residents to "turn up" their senses to identify how previously ignored or "invisible" sensory experiences shape local health and wellbeing. Sensory ethnography creates a richer and deeper understanding of the relationships between place and health than existing research methods that focus on things that are more readily observable or quantifiable. To highlight the methodology in use we outline our research activities and learnings from the Sensory Ethnography of Logan-Beaudesert (SELB) pilot study. We discuss theory, data collection methods, preliminary outcomes, and methodological learnings that will be relevant to researchers who wish to use sensory ethnography or develop deeper understandings of place and health generally. Copyright © 2012 Elsevier Ltd. All rights reserved.
Academic Advising: Does It Really Impact Student Success?
ERIC Educational Resources Information Center
Young-Jones, Adena D.; Burt, Tracie D.; Dixon, Stephanie; Hawthorne, Melissa J.
2013-01-01
Purpose: This study was designed to evaluate academic advising in terms of student needs, expectations, and success rather than through the traditional lens of student satisfaction with the process. Design/methodology/approach: Student participants (n = 611) completed a survey exploring their expectations of and experience with academic advising.…
NASA Astrophysics Data System (ADS)
Unger, Johannes; Hametner, Christoph; Jakubek, Stefan; Quasthoff, Marcus
2014-12-01
An accurate state of charge (SoC) estimation of a traction battery in hybrid electric non-road vehicles, which exhibit higher dynamics and power densities than on-road vehicles, requires a precise battery cell terminal voltage model. This paper presents a novel methodology for non-linear system identification of battery cells to obtain precise battery models. The methodology combines the architecture of local model networks (LMN) with optimal model-based design of experiments (DoE). Three main novelties are proposed: 1) optimal model-based DoE, which aims to excite the battery cells highly dynamically at the load ranges most frequently used in operation; 2) the integration of corresponding inputs in the LMN to capture the non-linearities of SoC, relaxation and hysteresis, as well as temperature effects; and 3) enhancements to the local linear model tree (LOLIMOT) construction algorithm to achieve a physically appropriate interpretation of the LMN. The framework is applicable to different battery cell chemistries and temperatures, and is real-time capable, as demonstrated on an industrial PC. The accuracy of the obtained non-linear battery model is demonstrated on cells with different chemistries and at different temperatures. The results show significant improvement due to optimal experiment design and integration of the battery non-linearities within the LMN structure.
Applying thematic analysis theory to practice: a researcher's experience.
Tuckett, Anthony G
2005-01-01
This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.
New Developments in the Technology Readiness Assessment Process in US DOE-EM - 13247
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krahn, Steven; Sutter, Herbert; Johnson, Hoyt
2013-07-01
A Technology Readiness Assessment (TRA) is a systematic, metric-based process and accompanying report that evaluates the maturity of the technologies used in systems; it is designed to measure technology maturity using the Technology Readiness Level (TRL) scale pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s. More recently, DoD has adopted and provided systematic guidance for performing TRAs and determining TRLs. In 2007 the GAO recommended that the DOE adopt the NASA/DoD methodology for evaluating technology maturity. Earlier, in 2006-2007, DOE-EM had conducted pilot TRAs on a number of projects at Hanford and Savannah River. In March 2008, DOE-EM issued a process guide, which established TRAs as an integral part of DOE-EM's Project Management Critical Decision Process. Since the development of its detailed TRA guidance in 2008, DOE-EM has continued to accumulate experience in the conduct of TRAs and the process for evaluating technology maturity. DOE has developed guidance on TRAs applicable department-wide. This paper describes DOE-EM's experience with the TRA process, the evaluations that led to recently proposed revisions to the DOE-EM TRA/TMP Guide, and the content of the proposed changes that incorporate the above lessons learned and insights. (authors)
The Disabled Student Experience: Does the SERVQUAL Scale Measure Up?
ERIC Educational Resources Information Center
Vaughan, Elizabeth; Woodruffe-Burton, Helen
2011-01-01
Purpose: The purpose of this paper is to empirically test a new disabled service user-specific service quality model ARCHSECRET against a modified SERVQUAL model in the context of disabled students within higher education. Design/methodology/approach: The application of SERVQUAL in the voluntary sector had raised serious issues on its portability…
Valuing Student Teachers' Perspectives: Researching Inclusively in Inclusive Education?
ERIC Educational Resources Information Center
Black-Hawkins, Kristine; Amrhein, Bettina
2014-01-01
This paper considers how engaging with the principles of inclusive research can enhance research studies that set out to understand the experiences of student teachers on initial teacher education programmes. It does so by describing the methodological development of an on-going study of student teachers' perspectives on working with diverse…
Hands-on Science: Does It Matter What Students' Hands Are On?
ERIC Educational Resources Information Center
Triona, Lara M.; Klahr, David
2007-01-01
Hands-on science typically uses physical materials to give students first-hand experience in scientific methodologies, but the recent availability of virtual laboratories raises an important question about whether what students' hands are on matters to their learning. The overall findings of two articles that employed simple comparisons of…
Warpage analysis on thin shell part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Zulhasif, Z.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.
2017-09-01
This study optimises moulding parameters to reduce warpage defects, using Autodesk Moldflow Insight (AMI) 2012 software. The product is injection moulded from Acrylonitrile-Butadiene-Styrene (ABS) material. The analysis varies the processing parameters of melt temperature, mould temperature, packing pressure and packing time. Design of Experiments (DOE) has been integrated to obtain a polynomial model using Response Surface Methodology (RSM). The Glowworm Swarm Optimisation (GSO) method is then used to predict the best combination of parameters to minimise warpage and produce high-quality parts.
Martinello, Tiago; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Taqueda, Maria Elena Santos; Consiglieri, Vladi O
2006-09-28
The poor flowability and bad compressibility characteristics of paracetamol are well known. As a result, the production of paracetamol tablets is almost exclusively by wet granulation, a disadvantageous method when compared to direct compression. The development of a new tablet formulation is still based on a large number of experiments and often relies merely on the experience of the analyst. The purpose of this study was to apply experimental design methodology (DOE) to the development and optimization of tablet formulations containing high amounts of paracetamol (more than 70%) and manufactured by direct compression. Nineteen formulations, screened by DOE methodology, were produced with different proportions of Microcel 102, Kollydon VA 64, Flowlac, Kollydon CL 30, PEG 4000, Aerosil, and magnesium stearate. Tablet properties, except friability, were in accordance with the USP 28th ed. requirements. These results were used to generate plots for optimization, mainly for friability. The physical-chemical data found for the optimized formulation were very close to those from the regression analysis, demonstrating that the mixture design is a valuable tool for the research and development of new formulations.
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can more efficiently and robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
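The SPC side of this approach reduces to tracking each monitored coefficient against control limits, conventionally the process mean plus or minus three standard deviations (a Shewhart individuals chart). A minimal sketch with made-up check-standard values, not NASA Langley data:

```python
import statistics

# Hypothetical repeat measurements of one check-standard force coefficient
runs = [0.512, 0.508, 0.515, 0.510, 0.509, 0.511, 0.507, 0.513]

center = statistics.fmean(runs)
sigma = statistics.stdev(runs)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # Shewhart 3-sigma limits

# Any point outside the limits signals a possible change in the process
out_of_control = [r for r in runs if not (lcl <= r <= ucl)]
print(len(out_of_control))  # 0 -> process in statistical control
```

In practice the limits are estimated from an in-control baseline period and then held fixed, so that later drifts in the facility show up as points beyond the limits or as non-random run patterns.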
Design of experiments on 135 cloned poplar trees to map environmental influence in greenhouse.
Pinto, Rui Climaco; Stenlund, Hans; Hertzberg, Magnus; Lundstedt, Torbjörn; Johansson, Erik; Trygg, Johan
2011-01-31
To find and ascertain phenotypic differences, minimal variation between biological replicates is always desired. Variation between the replicates can originate from genetic transformation but also from environmental effects in the greenhouse. Design of experiments (DoE) has been used in field trials for many years and proven its value but is underused within functional genomics including greenhouse experiments. We propose a strategy to estimate the effect of environmental factors with the ultimate goal of minimizing variation between biological replicates, based on DoE. DoE can be analyzed in many ways. We present a graphical solution together with solutions based on classical statistics as well as the newly developed OPLS methodology. In this study, we used DoE to evaluate the influence of plant specific factors (plant size, shoot type, plant quality, and amount of fertilizer) and rotation of plant positions on height and section area of 135 cloned wild type poplar trees grown in the greenhouse. Statistical analysis revealed that plant position was the main contributor to variability among biological replicates and applying a plant rotation scheme could reduce this variation. Copyright © 2010 Elsevier B.V. All rights reserved.
Design of experiments applications in bioprocessing: concepts and approach.
Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S
2014-01-01
Most biotechnology unit operations are complex in nature with numerous process variables, feed material attributes, and raw material attributes that can have significant impact on the performance of the process. Design of experiments (DOE)-based approach offers a solution to this conundrum and allows for an efficient estimation of the main effects and the interactions with minimal number of experiments. Numerous publications illustrate application of DOE towards development of different bioprocessing unit operations. However, a systematic approach for evaluation of the different DOE designs and for choosing the optimal design for a given application has not been published yet. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for the three case studies involving some of the commonly used unit operations in downstream processing. Use of Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers.
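The Akaike information criterion mentioned above scores candidate regression models by penalized fit; for least squares it takes the form AIC = n·ln(RSS/n) + 2k, where k counts the fitted parameters. A minimal sketch on synthetic (hypothetical) data where a quadratic model should beat a linear one:

```python
import math
import numpy as np

# Synthetic response data, roughly 1 + x^2 with small perturbations
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 2.1, 5.0, 9.9, 17.2, 26.0])

def aic_polyfit(degree):
    """Least-squares AIC for a polynomial model of the given degree."""
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    n, k = len(x), degree + 1                  # k = number of fitted parameters
    return n * math.log(rss / n) + 2 * k

aic_linear, aic_quadratic = aic_polyfit(1), aic_polyfit(2)
print(aic_quadratic < aic_linear)  # True -> quadratic model preferred
```

The 2k penalty is what keeps AIC from always favoring the most flexible model: adding a term must reduce the residual sum of squares enough to pay for the extra parameter.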
NASA Technical Reports Server (NTRS)
Malpica, Carlos; Decker, William A.; Theodore, Colin R.; Blanken, Christopher L.; Berger, Tom
2010-01-01
A piloted simulation investigation was conducted using the NASA Ames Vertical Motion Simulator to study the impact of pitch, roll and yaw attitude bandwidth and phase delay on handling qualities of large tilt-rotor aircraft. Multiple bandwidth and phase delay pairs were investigated for each axis. The simulation also investigated the effect that the pilot offset from the center of gravity has on handling qualities. While pilot offset does not change the dynamics of the vehicle, it does affect the proprioceptive and visual cues and it can have an impact on handling qualities. The experiment concentrated on two primary evaluation tasks: a precision hover task and a simple hover pedal turn. Six pilots flew over 1400 data runs with evaluation comments and objective performance data recorded. The paper will describe the experiment design and methodology, discuss the results of the experiment and summarize the findings.
System Synthesis in Preliminary Aircraft Design using Statistical Methods
NASA Technical Reports Server (NTRS)
DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.
1996-01-01
This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it yields a point-design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions that are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses into conceptual aircraft synthesis and to provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
Metalexical awareness: development, methodology or written language? A cross-linguistic comparison.
Kurvers, Jeanne; Uri, Helene
2006-07-01
This study explores the ability of pre-school children to access word boundaries, using an on-line methodology (Karmiloff-Smith, Grant, Sims, Jones, & Cockle (1996). Cognition, 58, 197-219.), which has hardly been used outside English-speaking countries. In a cross-linguistic study in the Netherlands and Norway, four- and five-year-old children were asked to repeat the last word every time a narrator stopped reading a story. In total 32 target words were used, both closed and open class words, and both monosyllabic and disyllabic words. The outcomes in both countries were different from those of the original English study (Karmiloff-Smith et al., 1996): four- and five-year-olds were successful in only about 26% of the cases, whereas the success rate in the earlier English experiment was 75% for the younger and 96% for the older children. No differences were found between age groups or between open and closed class words. This methodology does reveal the ability to access word boundaries, but probably not because of the ease of the on-line methodology in itself; rather, literacy introduces new representations of language, even in on-line processing. The outcomes imply that the ability to mark word boundaries does not seem to be a valid indication of who is ready for reading.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... between the DOE test data and the data submitted by NEMA; describe the methodological changes DOE is... differences between test data obtained by DOE and test data submitted by NEMA; (3) describe the methodological...
ERIC Educational Resources Information Center
Hurst, Judith; Quinsee, Susannah
2005-01-01
The inclusion of online learning technologies into the higher education (HE) curriculum is frequently associated with the design and development of new models of learning. One could argue that e-learning even demands a reconfiguration of traditional methods of learning and teaching. However, this transformation in pedagogic methodology does not…
The use of control groups in artificial grammar learning.
Reber, Rolf; Perruchet, Pierre
2003-01-01
Experimenters assume that participants of an experimental group have learned an artificial grammar if they classify test items with significantly higher accuracy than does a control group without training. The validity of such a comparison, however, depends on an additivity assumption: Learning is superimposed on the action of non-specific variables-for example, repetitions of letters, which modulate the performance of the experimental group and the control group to the same extent. In two experiments we were able to show that this additivity assumption does not hold. Grammaticality classifications in control groups without training (Experiments 1 and 2) depended on non-specific features. There were no such biases in the experimental groups. Control groups with training on randomized strings (Experiment 2) showed fewer biases than did control groups without training. Furthermore, we reanalysed published research and demonstrated that earlier experiments using control groups without training had produced similar biases in control group performances, bolstering the finding that using control groups without training is methodologically unsound.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology, implemented in software, that serves as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on direct observation of detection occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit/miss and signal-amplitude testing. DOEPOD does not assume prescribed logarithmic or similar POD functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications supporting inspector qualification are also included.
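The 90/95 POD criterion has a classical zero-failure benchmark: if a flaw is detected in every one of n demonstration trials, the 95% lower confidence bound on POD exceeds 90% once 0.9^n falls below 0.05, which first happens at n = 29 (the well-known "29 of 29" rule). A quick check of that arithmetic:

```python
# Smallest number of consecutive hits n such that a true POD of 0.90
# would produce an all-hit outcome with probability below 5%
n = 1
while 0.9 ** n > 0.05:
    n += 1
print(n)  # 29
```

Demonstrations with any misses require larger sample sizes, since the binomial lower confidence bound drops quickly with each failure; that is part of what POD analysis tools account for.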
Accelerating Vaccine Formulation Development Using Design of Experiment Stability Studies.
Ahl, Patrick L; Mensch, Christopher; Hu, Binghua; Pixley, Heidi; Zhang, Lan; Dieter, Lance; Russell, Ryann; Smith, William J; Przysiecki, Craig; Kosinski, Mike; Blue, Jeffrey T
2016-10-01
Vaccine drug product thermal stability often depends on formulation input factors and how they interact. Scientific understanding and professional experience typically allow vaccine formulators to accurately predict the thermal stability output based on formulation input factors such as pH, ionic strength, and excipients. Thermal stability predictions, however, are not enough for regulators. Stability claims must be supported by experimental data. The Quality by Design approach of Design of Experiments (DoE) is well suited to describe formulation outputs such as thermal stability in terms of formulation input factors. A DoE approach, particularly at elevated temperatures that induce accelerated degradation, can provide empirical understanding of how vaccine formulation input factors and their interactions affect vaccine stability output performance. This is possible even when a clear scientific understanding of particular formulation stability mechanisms is lacking. A DoE approach was used in an accelerated 37 °C stability study of an aluminum-adjuvanted Neisseria meningitidis serogroup B vaccine. Formulation stability differences were identified only 15 days into the study. We believe this study demonstrates the power of combining DoE methodology with accelerated stress stability studies to accelerate and improve vaccine formulation development programs, particularly during the preformulation stage.
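A minimal sketch of how the study arms of a full factorial DoE stability study can be enumerated; the factor names and levels below are invented for illustration and are not taken from the study above.

```python
from itertools import product

# Hypothetical formulation factors and levels (illustrative only).
factors = {
    "pH":             [6.0, 7.0, 8.0],
    "ionic_strength": [50, 150],          # mM
    "excipient":      ["none", "sucrose"],
}

# Full factorial design: every combination of factor levels becomes one
# stability-study arm placed at the accelerated (e.g. 37 C) condition.
design = [dict(zip(factors, combo)) for combo in product(*factors.values())]

for run_id, run in enumerate(design, 1):
    print(run_id, run)
# 3 x 2 x 2 = 12 runs
```

A fractional design would trim this run count when higher-order interactions can be assumed negligible.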
NASA Astrophysics Data System (ADS)
Klees, R.; Slobbe, D. C.; Farahani, H. H.
2018-04-01
The paper is about a methodology to combine a noisy satellite-only global gravity field model (GGM) with other noisy datasets to estimate a local quasi-geoid model using weighted least-squares techniques. In this way, we attempt to improve the quality of the estimated quasi-geoid model and to complement it with a full noise covariance matrix for quality control and further data processing. The methodology goes beyond the classical remove-compute-restore approach, which does not account for the noise in the satellite-only GGM. We suggest and analyse three different approaches of data combination. Two of them are based on a local single-scale spherical radial basis function (SRBF) model of the disturbing potential, and one is based on a two-scale SRBF model. Using numerical experiments, we show that a single-scale SRBF model does not fully exploit the information in the satellite-only GGM. We explain this by a lack of flexibility of a single-scale SRBF model to deal with datasets of significantly different bandwidths. The two-scale SRBF model performs well in this respect, provided that the model coefficients representing the two scales are estimated separately. The corresponding methodology is developed in this paper. Using the statistics of the least-squares residuals and the statistics of the errors in the estimated two-scale quasi-geoid model, we demonstrate that the developed methodology provides a two-scale quasi-geoid model, which exploits the information in all datasets.
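The weighted least-squares combination idea can be illustrated in miniature with inverse-variance fusion of two noisy estimates of the same quantity (a toy sketch, not the SRBF machinery of the paper); the numbers are invented.

```python
def combine(estimates, variances):
    """Weighted least-squares fusion of independent noisy estimates of the
    same scalar; returns the combined value and its variance."""
    weights = [1.0 / v for v in variances]
    value = sum(w * x for w, x in zip(weights, estimates)) / sum(weights)
    var = 1.0 / sum(weights)       # always smaller than any input variance
    return value, var

# A satellite-only model gives 12.0 m with variance 4; local data give
# 12.6 m with variance 1.  The fusion leans toward the better dataset.
value, var = combine([12.0, 12.6], [4.0, 1.0])
print(value, var)   # combined estimate ≈ 12.48, variance 0.8
```

The same weighting principle, generalized to full noise covariance matrices, is what allows the combined quasi-geoid model to carry a covariance matrix for quality control.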
Methodology of shooting training using modern IT techniques
NASA Astrophysics Data System (ADS)
Gudzbeler, Grzegorz; Struniawski, Jarosław
2017-08-01
Mastering, improving, shaping and preserving the skills of safe, efficient and effective use of firearms requires a relevant methodology of conducting shooting training. The reality of police training, however, does not usually allow for intensive shooting practice with live ammunition. An alternative solution is the use of modern training technologies, an example of which is the "Virtual system of improvement tactics of intervention services responsible for security and shooting training." Introducing the simulator into police training will enable complete staff preparation for its tasks, creating a potential of knowledge and experience in many areas far exceeding the capabilities of conventional training.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boissonnade, A; Hossain, Q; Kimball, J
Since the mid-1980s, assessment of the wind and tornado risks at the Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526, are different from those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Theron D.; McDonald, Jimmie M.; Cadwallader, Lee C.
2000-01-15
This paper discusses the thermal response of two prototypical International Thermonuclear Experimental Reactor (ITER) divertor channels during simulated loss-of-flow-accident (LOFA) experiments. The thermal response was characterized by the time-to-burnout (TBO), which is a figure of merit on the mockups' survivability. Data from the LOFA experiments illustrate that (a) the pre-LOFA inlet velocity does not significantly influence the TBO, (b) the incident heat flux (IHF) does influence the TBO, and (c) a swirl tape insert significantly improves the TBO and promotes the initiation of natural circulation. This natural circulation enabled the mockup to absorb steady-state IHFs after the coolant circulation pump was disabled. Several methodologies for thermal-hydraulic modeling of the LOFA were attempted.
Carson, Susan J; Abuhaloob, Lamis; Richards, Derek; Hector, Mark P; Freeman, Ruth
2017-10-25
Obesity and dental caries are global public health problems which can have an impact in childhood and throughout the life course. In simple terms, childhood dental caries and body weight are linked via the common risk factor of diet. An association between dental caries and obesity has been described in a number of studies and reviews. However, a relationship has also been noted between low body weight and caries experience in children. This protocol will provide the framework for an umbrella review to address the following question: Does the available evidence support a relationship between dental caries experience and body weight in the child population? This review protocol outlines the process to carry out an umbrella systematic review which will synthesise previous reviews of childhood dental caries experience and body weight. An umbrella review methodology will be used to examine the methodological and reporting quality of existing reviews. The final umbrella review aims to aggregate the available evidence in order to provide a summary for policymakers and to inform healthcare interventions. PROSPERO CRD42016047304.
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes are reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes a description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIMs development using any other formalism.
Evaluating Astronomy Laboratories
NASA Astrophysics Data System (ADS)
Zirbel, E. L.
2002-12-01
A set of non-traditional astronomy laboratories for non-science majors will be presented along with evaluations by the lab technicians (these labs were originally developed at the College of Staten Island of the City University of New York). The goal of these labs is twofold: (a) to provide the students with hands-on experience of scientific methodology and (b) to provoke critical thinking. Because non-science majors are often rather resistant to learning the relevant methodology, and especially to thinking critically, this manual is structured differently. It not only provides traditional cook-book recipes but also contains several leading questions to make the students realize why they are doing what they are doing. The students are encouraged to write full sentences and explain how they reach their conclusions. This poster summarizes the experiences of the laboratory assistants who worked with the instructor and presents how they judge the effectiveness of the laboratories.
Goodman, Angela; Hakala, J. Alexandra; Bromhal, Grant; Deel, Dawn; Rodosta, Traci; Frailey, Scott; Small, Michael; Allen, Doug; Romanov, Vyacheslav; Fazio, Jim; Huerta, Nicolas; McIntyre, Dustin; Kutchko, Barbara; Guthrie, George
2011-01-01
A detailed description of the United States Department of Energy (US-DOE) methodology for estimating CO2 storage potential for oil and gas reservoirs, saline formations, and unmineable coal seams is provided. The oil and gas reservoirs are assessed at the field level, while saline formations and unmineable coal seams are assessed at the basin level. The US-DOE methodology is intended for external users such as the Regional Carbon Sequestration Partnerships (RCSPs), future project developers, and governmental entities to produce high-level CO2 resource assessments of potential CO2 storage reservoirs in the United States and Canada at the regional and national scale; however, this methodology is general enough that it could be applied globally. The purpose of the US-DOE CO2 storage methodology, definitions of storage terms, and a CO2 storage classification are provided. Methodology for CO2 storage resource estimate calculation is outlined. The Log Odds Method when applied with Monte Carlo Sampling is presented in detail for estimation of CO2 storage efficiency needed for CO2 storage resource estimates at the regional and national scale. CO2 storage potential reported in the US-DOE's assessment are intended to be distributed online by a geographic information system in NatCarb and made available as hard-copy in the Carbon Sequestration Atlas of the United States and Canada. US-DOE's methodology will be continuously refined, incorporating results of the Development Phase projects conducted by the RCSPs from 2008 to 2018. Estimates will be formally updated every two years in subsequent versions of the Carbon Sequestration Atlas of the United States and Canada.
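Monte Carlo sampling of the storage efficiency factor, as mentioned above, can be sketched with the general volumetric form G = A · h · φ · ρ · E; every parameter value and the distribution assumed for E below are invented for illustration and do not reproduce the US-DOE figures or the Log Odds Method itself.

```python
import random

random.seed(42)

# Illustrative saline-formation parameters (not from any real assessment).
area_m2     = 5.0e9        # formation area A
thickness_m = 100.0        # net thickness h
porosity    = 0.15         # phi
rho_co2     = 700.0        # CO2 density at reservoir conditions, kg/m^3

def storage_mass_kg():
    # Storage efficiency E is the uncertain term; sample it from an
    # assumed triangular distribution (args are low, high, mode).
    eff = random.triangular(0.01, 0.06, 0.03)
    return area_m2 * thickness_m * porosity * rho_co2 * eff

samples = sorted(storage_mass_kg() for _ in range(10_000))
p10, p50, p90 = (samples[int(q * len(samples))] for q in (0.10, 0.50, 0.90))
print(f"P10={p10:.2e} kg  P50={p50:.2e} kg  P90={p90:.2e} kg")
```

Reporting a P10/P50/P90 range rather than a single number is what makes such resource estimates comparable across basins with very different data quality.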
The student perspective of high school laboratory experiences
NASA Astrophysics Data System (ADS)
Lambert, R. Mitch
High school science laboratory experiences are an accepted teaching practice across the nation despite a lack of research evidence to support them. The purpose of this study was to examine the perspective of students---stakeholders often ignored---on these experiences. Insight into the students' perspective was explored progressively using a grounded theory methodology. Field observations of science classrooms led to an open-ended survey of high school science students, garnering 665 responses. Twelve student interviews then focused on the data and questions evolving from the survey. The student perspective on laboratory experiences revealed varied information based on individual experience. Concurrent analysis of the data revealed that although most students like (348/665) or sometimes like (270/665) these experiences, some consistent factors yielded negative experiences and prompted suggestions for improvement. The category of responses that emerged as the core idea focused on student understanding of the experience. Students desire to understand the why do, the how to, and the what it means of laboratory experiences. Lacking any one of these, the experience loses educational value for them. This single recurring theme crossed the boundaries of age, level in school, gender, and even the student view of lab experiences as positive or negative. This study suggests reflection on the current laboratory activities in which science teachers engage their students. Is the activity appropriate (as opposed to being merely a favorite), does it encourage learning, does it fit, does it operate at the appropriate level of inquiry, and finally what can science teachers do to integrate these activities into the classroom curriculum more effectively? Simply stated, what can teachers do so that students understand what to do, what's the point, and how that point fits into what they are learning outside the laboratory?
Debrus, B; Lebrun, P; Kindenge, J Mbinze; Lecomte, F; Ceccato, A; Caliaro, G; Mbay, J Mavar Tayey; Boulanger, B; Marini, R D; Rozet, E; Hubert, Ph
2011-08-05
An innovative methodology based on design of experiments (DoE), independent component analysis (ICA) and design space (DS) was developed in previous works and was tested with a mixture of 19 antimalarial drugs. This global LC method development methodology (i.e. DoE-ICA-DS) was used to optimize the separation of the 19 antimalarial drugs to obtain a screening method. The DoE-ICA-DS methodology is fully compliant with the current trend of quality by design. DoE was used to define the set of experiments to model the retention times at the beginning, the apex and the end of each peak. ICA was used to numerically separate coeluting peaks and estimate their unbiased retention times. Gradient time, temperature and pH were selected as the factors of a full factorial design. The retention times were modelled by stepwise multiple linear regressions. A recently introduced critical quality attribute, the separation criterion (S), was used to assess the quality of separations rather than the resolution. Furthermore, the resulting mathematical models were studied from a chromatographic point of view to understand and investigate the chromatographic behaviour of each compound. Good agreement was found between the mathematical models and the behaviours predicted by chromatographic theory. Finally, with a focus on quality risk management, the DS was computed as the multidimensional subspace where the probability for the separation criterion to lie within the acceptance limits is higher than a defined quality level. The DS was computed by propagating the prediction error from the modelled responses to the quality criterion using Monte Carlo simulations. DoE-ICA-DS made it possible to find optimal operating conditions for a robust screening method for the 19 antimalarial drugs considered, in the framework of the fight against counterfeit medicines.
Moreover, on the basis of the same data set alone, a dedicated method for the determination of three antimalarial compounds in a pharmaceutical formulation was optimized, demonstrating both the efficiency and the flexibility of the proposed methodology.
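The design-space computation described above can be sketched schematically: sample the model's prediction error by Monte Carlo at each candidate operating condition, and keep the conditions whose probability of meeting the acceptance limit exceeds the quality level. The response model, noise level, limits and grid below are all invented for illustration.

```python
import random

random.seed(1)

def predicted_S(pH, gradient_min):
    # Invented second-order model of the separation criterion S; stands in
    # for the stepwise regression models fitted from the DoE.
    return 1.2 - 0.15 * (pH - 3.0) ** 2 - 0.002 * (gradient_min - 30) ** 2

SIGMA = 0.15        # assumed prediction standard deviation of the model
S_MIN = 0.8         # acceptance limit on the separation criterion
QUALITY = 0.90      # required probability of meeting the limit

def p_success(pH, gradient_min, n=2000):
    """Monte Carlo estimate of P(S >= S_MIN) at one operating condition."""
    hits = sum(random.gauss(predicted_S(pH, gradient_min), SIGMA) >= S_MIN
               for _ in range(n))
    return hits / n

# Design space = grid points whose success probability exceeds QUALITY.
design_space = [(pH, g)
                for pH in [2.6, 2.8, 3.0, 3.2, 3.4]
                for g in [20, 25, 30, 35, 40]
                if p_success(pH, g) >= QUALITY]
print(design_space)
```

The key design choice, as in the paper, is that the DS is defined by a probability of success rather than by the nominal predicted response, so model uncertainty shrinks the acceptable region.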
Sharing stories: life history narratives in stuttering research.
Kathard, H
2001-01-01
The life experiences of people who stutter (PWS) have not featured prominently in research. Historically, the profession of speech and language therapy has amassed data and developed its theory of stuttering within a positivistic frame. As a consequence, the existing over-arching theory of research and practice does not engage holistically with the dynamic personal, socio-cultural and political contexts of the individual who stutters. A conceptual shift is therefore required in the ontology, epistemology and methodology underpinning the knowledge construction process. The use of life history narratives as a research tool is promoted. An exploratory study of a single participant is presented to illuminate the methodological approach and the emerging theoretical constructs.
The influence of biodegradability of sewer solids for the management of CSOs.
Sakrabani, R; Ashley, R M; Vollertsen, J
2005-01-01
The re-suspension of sediments in combined sewers and the associated pollutants into the bulk water during wet weather flows can cause pollutants to be carried further downstream to receiving waters or discharged via Combined Sewer Overflows (CSO). A typical pollutograph shows the trend of released bulk pollutants with time but does not consider information on the biodegradability of these pollutants. A new prediction methodology based on Oxygen Utilisation Rate (respirometric method) and Erosionmeter (laboratory device replicating in-sewer erosion) experiments is proposed which is able to predict the trends in biodegradability during in-sewer sediment erosion in wet weather conditions. The proposed new prediction methodology is also based on COD fractionation techniques.
A Methodology to Assess the Strategic Benefits of New Production Technologies
1990-01-01
... focus on reducing labor costs, does not capture these strategic advantages that make new technologies attractive. Our methodology integrates investments in new production technologies into the ... In many cases, retaining the existing ...
Arigo, Danielle; Pagoto, Sherry; Carter-Harris, Lisa; Lillie, Sarah E; Nebeker, Camille
2018-01-01
As the popularity and diversity of social media platforms increase, so does their utility for health research. Using social media for recruitment into clinical studies and/or for delivering health behavior interventions may increase reach to a broader audience. However, evidence supporting the efficacy of these approaches is limited, and key questions remain with respect to optimal benchmarks, intervention development and methodology, participant engagement, informed consent, privacy, and data management. Little methodological guidance is available to researchers interested in using social media for health research. In this Tutorial, we summarize the content of the 2017 Society for Behavioral Medicine Pre-Conference Course entitled 'Using Social Media for Research,' at which the authors presented their experiences with methodological and ethical issues relating to social media-enabled research recruitment and intervention delivery. We identify common pitfalls and provide recommendations for recruitment and intervention via social media. We also discuss the ethical and responsible conduct of research using social media for each of these purposes.
Optimal Modality Selection for Cooperative Human-Robot Task Completion.
Jacob, Mithun George; Wachs, Juan P
2016-12-01
Human-robot cooperation in complex environments must be fast, accurate, and resilient. This requires efficient communication channels where robots need to assimilate information using a plethora of verbal and nonverbal modalities such as hand gestures, speech, and gaze. However, even though hybrid human-robot communication frameworks and multimodal communication have been studied, a systematic methodology for designing multimodal interfaces does not exist. This paper addresses the gap by proposing a novel methodology to generate multimodal lexicons which maximizes multiple performance metrics over a wide range of communication modalities (i.e., lexicons). The metrics are obtained through a mixture of simulation and real-world experiments. The methodology is tested in a surgical setting where a robot cooperates with a surgeon to complete a mock abdominal incision and closure task by delivering surgical instruments. Experimental results show that predicted optimal lexicons significantly outperform predicted suboptimal lexicons (p < 0.05) in all metrics, validating the predictability of the methodology. The methodology is validated in two scenarios (with and without modeling the risk of a human-robot collision) and the differences in the lexicons are analyzed.
Knöspel, Fanny; Schindler, Rudolf K; Lübberstedt, Marc; Petzolt, Stephanie; Gerlach, Jörg C; Zeilinger, Katrin
2010-12-01
The in vitro culture behaviour of embryonic stem cells (ESC) is strongly influenced by the culture conditions. Current culture media for expansion of ESC contain some undefined substances. Considering potential clinical translation work with such cells, the use of defined media is desirable. We have used Design of Experiments (DoE) methods to investigate the composition of a serum-free chemically defined culture medium for expansion of mouse embryonic stem cells (mESC). Factor screening analysis according to Plackett-Burman revealed that insulin and leukaemia inhibitory factor (LIF) had a significant positive influence on the proliferation activity of the cells, while zinc and L-cysteine reduced the cell growth. Further analysis using minimum run resolution IV (MinRes IV) design indicates that following factor adjustment LIF becomes the main factor for the survival and proliferation of mESC. In conclusion, DoE screening assays are applicable to develop and to refine culture media for stem cells and could also be employed to optimize culture media for human embryonic stem cells (hESC).
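A Plackett-Burman screening design of the kind used in such factor screening can be generated from the standard published 12-run generating row; the factor assignment mentioned in the comment is illustrative.

```python
# Standard 12-run Plackett-Burman screening design: the published
# generating row is cyclically shifted 11 times, then a row of all -1
# is appended.  Up to 11 two-level factors (e.g. insulin, LIF, zinc,
# L-cysteine, ...) can be assigned to the columns; unused columns
# serve to estimate error.
GENERATOR = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def plackett_burman_12():
    rows = [GENERATOR[-i:] + GENERATOR[:-i] for i in range(11)]
    rows.append([-1] * 11)
    return rows

design = plackett_burman_12()
for row in design:
    print(" ".join(f"{v:+d}" for v in row))
```

Every column is balanced (six high, six low settings) and any two columns are orthogonal, which is what lets 11 main effects be screened in only 12 runs.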
Ashengroph, Morahem; Ababaf, Sajad
2014-12-01
Microbial caffeine removal is a green solution for the treatment of caffeinated products and agro-industrial effluents. We directed this investigation at optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through the Taguchi methodology, a structured statistical approach that can lower variation in a process through Design of Experiments (DOE). Five parameters, i.e. initial fructose, tryptone, Zn(2+) ion and caffeine concentrations, and incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4(4) × 1(3)). Data analysis was performed using the analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and were verified by a confirming experiment. Measurement of the residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of the Taguchi methodology for optimization of design parameters resulted in about 86.14% reduction of caffeine in 48 h of incubation when 5 g/l fructose, 3 mM Zn(2+) ion and 4.5 g/l of caffeine are present in the designed media. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain of Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, the Taguchi methodology provides a powerful approach for identifying the favorable parameters for caffeine removal using strain TPS8, suggesting that the approach also has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
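Taguchi-style factor screening can be sketched with a main-effect ("range") analysis on an orthogonal array; the sketch below uses the smaller standard L9(3^4) array and invented responses rather than the L16 array and data of the study above.

```python
# Taguchi-style main-effect ("range") analysis on the standard L9(3^4)
# orthogonal array.  Each row is one run; each column is a factor level.
L9 = [
    (1, 1, 1, 1), (1, 2, 2, 2), (1, 3, 3, 3),
    (2, 1, 2, 3), (2, 2, 3, 1), (2, 3, 1, 2),
    (3, 1, 3, 2), (3, 2, 1, 3), (3, 3, 2, 1),
]
FACTORS = "ABCD"

# Toy responses (e.g. % caffeine removal per run); factor A is
# deliberately constructed to dominate.
response = [20, 22, 21, 50, 52, 51, 80, 81, 82]

def effect_range(factor_idx):
    """Spread between the best and worst mean response over the 3 levels."""
    means = []
    for level in (1, 2, 3):
        vals = [y for run, y in zip(L9, response) if run[factor_idx] == level]
        means.append(sum(vals) / len(vals))
    return max(means) - min(means)

ranking = sorted(FACTORS, key=lambda f: -effect_range(FACTORS.index(f)))
print(ranking)   # factor A has by far the largest effect
```

The orthogonality of the array is what lets each factor's level means be compared fairly: every level of A sees every level of B, C and D equally often.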
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.
1996-08-01
The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.
Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K
2015-01-01
Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (eg, variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708
Simulation-Based Methodologies for Global Optimization and Planning
2013-10-11
(GESK) Stochastic kriging (SK) was introduced by Ankenman, Nelson, and Staum [1] to handle the stochastic simulation setting, where the noise in the ... is used for the kriging. Four experiments will be used to illustrate some characteristics of SK, SKG, and GESK, with respect to the choice of ... samples at each point. Because GESK is able to explore the design space more via extrapolation, it does a better job of capturing the fluctuations of the ...
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.
2017-09-01
In this study, Computer Aided Engineering was used for injection moulding simulation. The Design of Experiments (DOE) method was applied according to a Latin square orthogonal array. The relationships between the injection moulding parameters and warpage were identified based on the experimental data obtained. Response Surface Methodology (RSM) was used to validate the model accuracy. The RSM and genetic algorithm (GA) methods were then combined to determine the optimum injection moulding process parameters. The optimisation of injection moulding is thereby largely improved, and the results show increased accuracy and reliability. The proposed method of combining RSM and GA also contributes to minimising the warpage that occurs.
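Coupling a response-surface model with a genetic algorithm can be sketched as follows; the quadratic "warpage" model, its coefficients, and the parameter bounds are all invented for illustration and stand in for an RSM model fitted from simulation runs.

```python
import random

random.seed(7)

def warpage(packing_pressure, melt_temp):
    """Stand-in RSM model of warpage (mm); quadratic coefficients are
    invented, with a known minimum at (80, 220)."""
    return 0.30 + 0.0004 * (packing_pressure - 80) ** 2 \
                + 0.0001 * (melt_temp - 220) ** 2

BOUNDS = [(60, 100), (200, 240)]   # parameter ranges (MPa, deg C)

def genetic_minimize(f, bounds, pop_size=30, generations=60):
    """Minimal real-coded GA: truncation selection, blend crossover,
    Gaussian mutation clipped to the bounds."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: f(*ind))
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # blend crossover
            for i, (lo, hi) in enumerate(bounds):          # mutation
                child[i] += random.gauss(0, 0.05 * (hi - lo))
                child[i] = min(max(child[i], lo), hi)
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda ind: f(*ind))

best = genetic_minimize(warpage, BOUNDS)
print(best, warpage(*best))   # close to (80, 220) and 0.30 mm
```

Because the GA only evaluates the cheap fitted surface rather than the full moulding simulation, the optimisation cost is dominated by the handful of DOE runs used to build the model.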
Optimization of Nd: YAG Laser Marking of Alumina Ceramic Using RSM And ANN
NASA Astrophysics Data System (ADS)
Peter, Josephine; Doloi, B.; Bhattacharyya, B.
2011-01-01
The present research paper deals with artificial neural network (ANN) and response surface methodology (RSM) based mathematical modelling, together with an optimization analysis of marking characteristics on alumina ceramic. The experiments were planned and carried out based on Design of Experiments (DOE). The paper also analyses the influence of the major laser marking process parameters, and the optimal combination of laser marking process parameter settings has been obtained. The RSM optimum is validated through experimentation and an ANN predictive model. Good agreement is observed between the results of the ANN predictive model and the actual experimental observations.
Navigation in endoscopic sinus surgery: the first Indian experience.
Rai, Devinder; Munjal, Manish; Rai, Varun
2013-08-01
Although the use of image-guided surgery (IGS) has been standard practice in developed countries since its clinical inception in 1994, it has not been in use in Indian otolaryngology. Some clinically interesting applications, relevant indications, practical tips and results in the Indian context are presented, along with the usage technique and data presentation. Indications based on the AAO-HNS 2002 guidelines seem valid, and though the accuracy parameters remain guarded, in line with the best technology available and based on the evidence of scattered reports and expert opinions, the use of navigation can be recommended as state of the art. IGS provides reliable information to the sinus surgeon in difficult circumstances. Its adoption fortunately does not require a significant learning curve, as it does not change the methodology of the surgical procedure. It can be an excellent teaching tool, but its use does not replace proper surgical training.
Exogenous attention and color perception: performance and appearance of saturation and hue.
Fuller, Stuart; Carrasco, Marisa
2006-11-01
Exogenous covert attention is an automatic, transient form of attention that can be triggered by sudden changes in the periphery. Here we test for the effects of attention on color perception. We used the methodology developed by Carrasco, Ling, and Read [Carrasco, M., Ling, S., & Read, S. (2004). Attention alters appearance. Nature Neuroscience, 7 (3) 308-313] to explore the effects of exogenous attention on appearance of saturation (Experiment 1) and of hue (Experiment 2). We also tested orientation discrimination performance for single stimuli defined by saturation or hue (Experiment 3). The results indicate that attention increases apparent saturation, but does not change apparent hue, notwithstanding the fact that it improves orientation discrimination for both saturation and hue stimuli.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, B.P.; Legg, J.; Travis, C.C.
1995-06-01
This document describes a worker health risk evaluation methodology for assessing risks associated with Environmental Restoration (ER) and Waste Management (WM). The methodology is appropriate for estimating worker risks across the Department of Energy (DOE) Complex at both programmatic and site-specific levels. This document supports the worker health risk methodology used to perform the human health risk assessment portion of the DOE Programmatic Environmental Impact Statement (PEIS) although it has applications beyond the PEIS, such as installation-wide worker risk assessments, screening-level assessments, and site-specific assessments.
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
78 FR 47677 - DOE Activities and Methodology for Assessing Compliance With Building Energy Codes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... construction. Post-construction evaluations were implemented in one study in an effort to reduce these costs... these pilot studies have led to a number of recommendations and potential changes to the DOE methodology... fundamental assumptions and approaches to measuring compliance with building energy codes. This notice...
Wang, Shuang; Yue, Bo; Liang, Xuefeng; Jiao, Licheng
2018-03-01
Wisely utilizing the internal and external learning methods is a new challenge in the super-resolution problem. To address this issue, we analyze the attributes of the two methodologies and make two observations about their recovered details: 1) they are complementary in both the feature space and the image plane and 2) they distribute sparsely in the spatial space. These inspire us to propose a low-rank solution which effectively integrates the two learning methods and thereby achieves a superior result. To fit this solution, the internal learning method and the external learning method are tailored to produce multiple preliminary results. Our theoretical analysis and experiments prove that the proposed low-rank solution does not require massive inputs to guarantee performance, thereby simplifying the design of the two learning methods. Intensive experiments show the proposed solution improves on each single learning method in both qualitative and quantitative assessments. Surprisingly, it shows superior capability on noisy images and outperforms state-of-the-art methods.
Empirical cost models for estimating power and energy consumption in database servers
NASA Astrophysics Data System (ADS)
Valdivia Garcia, Harold Dwight
The explosive growth in the size of data centers, coupled with the widespread use of virtualization technology, has made power and energy consumption major concerns for data center administrators. Provisioning decisions must take into consideration not only target application performance but also the power demands and total energy consumption incurred by the hardware and software to be deployed at the data center. Failure to do so will result in damaged equipment, power outages, and inefficient operation. Since database servers comprise one of the most popular and important server applications deployed in such facilities, it becomes necessary to have accurate cost models that can predict the power and energy demands that each database workload will impose on the system. In this work we present an empirical methodology to estimate the power and energy cost of database operations. Our methodology uses multiple-linear regression to derive accurate cost models that depend only on readily available statistics such as selectivity factors, tuple size, number of columns and relational cardinality. Moreover, our method does not need measurement of individual hardware components, but rather total power and energy consumption measured at the server. We have implemented our methodology and run experiments with several server configurations. Our experiments indicate that we can predict power and energy more accurately than alternative methods found in the literature.
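As an illustration of the regression approach described above, the sketch below fits a multiple-linear power model from whole-server measurements, using only the normal equations. The coefficient values, noise level, and statistic ranges are invented for the example and are not the dissertation's measured numbers.

```python
import random

random.seed(1)

# Invented "ground truth" for a server's power draw (W) as a function of
# query-level statistics; the coefficients and noise are illustrative only.
def measured_power(sel, tuple_bytes, n_cols, card):
    return (120.0 + 45.0 * sel + 0.02 * tuple_bytes
            + 1.5 * n_cols + 3e-5 * card + random.gauss(0, 0.5))

# Collect 200 (statistics, wall-power) observations, as if read from the
# optimizer's catalog and a power meter at the server -- no per-component
# hardware instrumentation involved.
X, y = [], []
for _ in range(200):
    sel, tb = random.random(), random.uniform(64, 512)
    nc, card = random.randint(2, 30), random.uniform(1e4, 1e6)
    X.append([1.0, sel, tb, nc, card])
    y.append(measured_power(sel, tb, nc, card))

def ols(X, y):
    # Multiple linear regression via the normal equations (X^T X) beta = X^T y,
    # solved by Gaussian elimination with partial pivoting.
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p], b[c], b[p] = A[p], A[c], b[p], b[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            A[r] = [ar - f * ac for ar, ac in zip(A[r], A[c])]
            b[r] -= f * b[c]
    beta = [0.0] * n
    for i in reversed(range(n)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, n))) / A[i][i]
    return beta

beta = ols(X, y)

def predict_power(sel, tb, nc, card):
    return sum(c * v for c, v in zip(beta, [1.0, sel, tb, nc, card]))
```

With enough observations the fitted coefficients recover the underlying model closely, so new workloads can be costed from catalog statistics alone.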
Basu, Anindya; Leong, Susanna Su Jan
2012-02-03
The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacterial host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that facilitate its direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process, which delivered bioactive HBx at a productivity of 0.21 mg/ml/h at a refolding yield of 54% (at 10 mg/ml refolding concentration), 4.4-fold higher than that achieved in dilution refolding. The systematic DoE methodology adopted for this study enabled us to obtain important insights into the effects of different bioprocess parameters, such as buffer-exchange gradients, on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, hence helping to resolve the validation and speed-to-market challenges faced by the biopharmaceutical industry today. Copyright © 2011 Elsevier B.V. All rights reserved.
Does sadness impair color perception? Flawed evidence and faulty methods.
Holcombe, Alex O; Brown, Nicholas J L; Goodbourn, Patrick T; Etz, Alexander; Geukes, Sebastian
2016-01-01
In their 2015 paper, Thorstenson, Pazda, and Elliot offered evidence from two experiments that perception of colors on the blue-yellow axis was impaired if the participants had watched a sad movie clip, compared to participants who watched clips designed to induce a happy or neutral mood. Subsequently, these authors retracted their article, citing a mistake in their statistical analyses and a problem with the data in one of their experiments. Here, we discuss a number of other methodological problems with Thorstenson et al.'s experimental design, and also demonstrate that the problems with the data go beyond what these authors reported. We conclude that repeating one of the two experiments, with the minor revisions proposed by Thorstenson et al., will not be sufficient to address the problems with this work.
When expectation confounds iconic memory.
Bachmann, Talis; Aru, Jaan
2016-10-01
In response to the methodological criticism (Bachmann & Aru, 2015) of the interpretation of their earlier experimental results (Mack, Erol, & Clarke, 2015), Mack, Erol, Clarke, and Bert (2016) presented new results that they interpret again in favor of the stance that an attention-free phenomenal iconic store does not exist. Here we once more question their conclusions. When their subjects were unexpectedly asked to report the letters instead of the post-cued circles in the 101st trial, where letters were actually absent, they likely failed to see the empty display area because prior experience with letters in the preceding trials produced an expectancy-based illusory experience of letter-like objects. Copyright © 2016 Elsevier Inc. All rights reserved.
Does movement influence representations of time and space?
Loeffler, Jonna; Raab, Markus; Cañal-Bruland, Rouwen
2017-01-01
Embodied cognition posits that abstract conceptual knowledge such as mental representations of time and space are at least partially grounded in sensorimotor experiences. If true, then the execution of whole-body movements should result in modulations of temporal and spatial reference frames. To scrutinize this hypothesis, in two experiments participants either walked forward, backward or stood on a treadmill and responded either to an ambiguous temporal question (Experiment 1) or an ambiguous spatial question (Experiment 2) at the end of the walking manipulation. Results confirmed the ambiguousness of the questions in the control condition. Nevertheless, despite large power, walking forward or backward did not influence the answers or response times to the temporal (Experiment 1) or spatial (Experiment 2) question. A follow-up Experiment 3 indicated that this is also true for walking actively (or passively) in free space (as opposed to a treadmill). We explore possible reasons for the null-finding as concerns the modulation of temporal and spatial reference frames by movements and we critically discuss the methodological and theoretical implications. PMID:28376130
NASA Astrophysics Data System (ADS)
Zhang, Hongjuan; Kurtz, Wolfgang; Kollet, Stefan; Vereecken, Harry; Franssen, Harrie-Jan Hendricks
2018-01-01
The linkage between root zone soil moisture and groundwater is either neglected or simplified in most land surface models. The fully-coupled subsurface-land surface model TerrSysMP including variably saturated groundwater dynamics is used in this work. We test and compare five data assimilation methodologies for assimilating groundwater level data via the ensemble Kalman filter (EnKF) to improve root zone soil moisture estimation with TerrSysMP. Groundwater level data are assimilated in the form of pressure head or soil moisture (set equal to porosity in the saturated zone) to update state vectors. In the five assimilation methodologies, the state vector contains either (i) pressure head, or (ii) log-transformed pressure head, or (iii) soil moisture, or (iv) pressure head for the saturated zone only, or (v) a combination of pressure head and soil moisture: pressure head for the saturated zone and soil moisture for the unsaturated zone. These methodologies are evaluated in synthetic experiments which are performed for different climate conditions, soil types and plant functional types to simulate various root zone soil moisture distributions and groundwater levels. The results demonstrate that EnKF cannot properly handle strongly skewed pressure distributions which are caused by extreme negative pressure heads in the unsaturated zone during dry periods. This problem can only be alleviated by methodology (iii), (iv) and (v). The last approach gives the best results and avoids unphysical updates related to strongly skewed pressure heads in the unsaturated zone. If groundwater level data are assimilated by methodology (iii), EnKF fails to update the state vector containing the soil moisture values if for (almost) all the realizations the observation does not bring significant new information.
Synthetic experiments for the joint assimilation of groundwater levels and surface soil moisture support methodology (v) and show great potential for improving the representation of root zone soil moisture.
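A minimal sketch of the EnKF analysis step underlying these methodologies may help. The numbers below (prior head statistics, moisture-head correlations, observation error) are invented; the point is just the mechanics: a scalar groundwater-level observation updates every state component through its ensemble covariance with the observed variable, which is how head data can correct root zone soil moisture.

```python
import random

random.seed(42)

# Toy ensemble for illustration: each state vector holds the pressure head h
# (m) at an observation well plus soil moisture theta at two root-zone depths.
# The prior statistics and the h-theta correlations are invented numbers.
n_ens = 100
ensemble = []
for _ in range(n_ens):
    h = random.gauss(-2.0, 0.4)                        # prior head (m)
    t1 = 0.30 + 0.05 * h + random.gauss(0, 0.01)       # moisture, depth 1
    t2 = 0.25 + 0.03 * h + random.gauss(0, 0.01)       # moisture, depth 2
    ensemble.append([h, t1, t2])

h_obs, obs_var = -1.5, 0.05 ** 2   # observed groundwater level and error var

# EnKF analysis for one scalar observation of state component 0:
# gain_i = cov(x_i, h) / (var(h) + R); every component, including the
# unobserved soil moisture, is updated through its covariance with h.
h_vals = [m[0] for m in ensemble]
h_bar = sum(h_vals) / n_ens
var_h = sum((v - h_bar) ** 2 for v in h_vals) / (n_ens - 1)
gains = []
for i in range(3):
    x_bar = sum(m[i] for m in ensemble) / n_ens
    cov = sum((m[i] - x_bar) * (m[0] - h_bar) for m in ensemble) / (n_ens - 1)
    gains.append(cov / (var_h + obs_var))
for m in ensemble:
    # perturbed-observation EnKF: each member sees a noisy copy of the obs
    innovation = (h_obs + random.gauss(0, obs_var ** 0.5)) - m[0]
    for i in range(3):
        m[i] += gains[i] * innovation
```

After the update the ensemble-mean head sits near the observation, and the positively correlated root-zone moisture is pulled along with it.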
Andri, Bertyl; Dispas, Amandine; Marini, Roland Djang'Eing'a; Hubert, Philippe; Sassiat, Patrick; Al Bakain, Ramia; Thiébaut, Didier; Vial, Jérôme
2017-03-31
This work presents a first attempt to establish a model of the retention behaviour of pharmaceutical compounds in gradient-mode SFC. For this purpose, multivariate statistics were applied to data gathered with the Design of Experiments (DoE) methodology, which made it possible to build the required experiments optimally and served as a basis for providing relevant physicochemical interpretation of the observed effects. Data gathered over a broad experimental domain enabled the establishment of well-fitting linear models of the retention of the individual compounds in the presence of methanol as co-solvent. These models also allowed the impact of each experimental parameter and their factorial combinations to be assessed. This approach was carried out with two organic modifiers (i.e. methanol and ethanol) and provided comparable results. It therefore demonstrates the feasibility of modelling retention in gradient-mode SFC for individual compounds as a function of the experimental conditions. The approach also highlighted the predominant effect of some parameters (e.g. gradient slope and pressure) on the retention of compounds. Because building individual models of retention was possible, the next step considered the establishment of a global model of retention to predict the behaviour of given compounds on the basis of, on the one hand, the physicochemical descriptors of the compounds (e.g. Linear Solvation Energy Relationship (LSER) descriptors) and, on the other hand, the experimental conditions. This global model was established by means of partial least squares regression for the selected compounds, in an experimental domain defined by the Design of Experiments (DoE) methodology. Assessment of the model's predictive capabilities revealed satisfactory agreement between predicted and actual retention (i.e. R² = 0.942, slope = 1.004) of the assessed compounds, which is unprecedented in the field. Copyright © 2017 Elsevier B.V. All rights reserved.
Revalidation studies of Mark 16 experiments: J70
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S.Y.
1993-10-25
The MGBS-TGAL combination of the J70 criticality modules was validated for Mark 16 lattices by H. K. Clark as reported in DPST-83-1025. Unfortunately, the records of the calculations reported cannot be retrieved, and the descriptions of the modeling used are not fully provided in DPST-83-1025. The report does not describe in detail how to model the experiments and how to set up the input. The computer output for the cases reported in the memorandum cannot be located in files. The MGBS-TGAL calculations reported in DPST-83-1025 have been independently re-performed to provide retrievable record copies of the calculations, to provide a detailed description and discussion of the methodology used, and to serve as a training exercise for a novice criticality safety engineer. The current results reproduce Clark's reported results to within about 0.01% or better. A procedure to perform these and similar calculations is given in this report, with explanation of the methodology choices provided. Copies of the computer output have been made via microfiche and will be maintained in APG files.
Does dishonesty really invite third-party punishment? Results of a more stringent test.
Konishi, Naoki; Ohtsubo, Yohsuke
2015-05-01
Many experiments have demonstrated that people are willing to incur cost to punish norm violators even when they are not directly harmed by the violation. Such altruistic third-party punishment is often considered an evolutionary underpinning of large-scale human cooperation. However, some scholars argue that previously demonstrated altruistic third-party punishment against fairness-norm violations may be an experimental artefact. For example, envy-driven retaliatory behaviour (i.e. spite) towards better-off unfair game players may be misidentified as altruistic punishment. Indeed, a recent experiment demonstrated that participants ceased to inflict third-party punishment against an unfair player once a series of key methodological problems were systematically controlled for. Noticing that a previous finding regarding apparently altruistic third-party punishment against honesty-norm violations may have been subject to methodological issues, we used a different and what we consider to be a more sound design to evaluate these findings. Third-party punishment against dishonest players withstood this more stringent test. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kammerer, Annie
Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities, which is a function not only of the level of the seismic hazard but also of the capacity of the facility to withstand vibratory ground motions, the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order.
While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the "seismic hazard periodic review methodology," or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.
Mindfulness in PTSD treatment.
Lang, Ariel J
2017-04-01
In recent years, the field has witnessed considerable enthusiasm for the ancient practice of mindfulness. The skills derived from this experience, including focused attention, nonjudgmental acceptance of internal experiences and reduced autonomic reactivity, may be helpful in counteracting pathological responses to trauma. Several types of interventions that incorporate principles of mindfulness have been examined for treatment of posttraumatic stress disorder (PTSD). This nascent literature is inconsistent and methodologically limited but does suggest that mindfulness is a potentially important tool for creating psychological change. The interventions described herein generally would not yet be considered first-line treatments for PTSD. Nonetheless, evidence is building, and mindfulness may provide an impetus for better understanding how to personalize psychological interventions and to evaluate their outcomes. Published by Elsevier Ltd.
Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P.; Kumar, Ambuj
2011-01-01
Objectives: To assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects the actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Study design: Systematic review. Setting: Retrospective analysis of all consecutive phase III RCTs published by 8 National Cancer Institute Cooperative Groups until year 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Results: 429 RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of the RCTs. The results showed no association between sample size and the actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratios [RHR]: 0.94, 95% CI: 0.88, 0.99) and 24% (RHR: 1.24, 95% CI: 1.05, 1.43), respectively. However, assessment of actual quality showed no association between ES and methodological quality. Conclusion: The largest study to date shows that poor quality of reporting does not reflect the actual high methodological quality. Assessment of the impact of quality on the ES based on reported quality can produce misleading results. PMID:22424985
Kazemi, Khoshrooz; Zhang, Baiyu; Lye, Leonard M; Cai, Qinghong; Cao, Tong
2016-12-01
A design of experiments (DOE) based methodology was adopted in this study to investigate the effects of multiple factors and their interactions on the performance of a municipal solid waste (MSW) composting process. The impact of four factors (carbon/nitrogen ratio (C/N), moisture content (MC), type of bulking agent (BA) and aeration rate (AR)) on the maturity, stability and toxicity of the compost product was investigated. The statistically significant factors were identified using final C/N, germination index (GI) and especially the enzyme activities as responses. Experimental results validated the use of enzyme activities as proper indices during the course of composting. Maximum enzyme activities occurred during the active phase of decomposition. MC has a significant effect on dehydrogenase activity (DGH), β-glucosidase activity (BGH), phosphodiesterase activity (PDE) and the final moisture content of the compost. C/N is statistically significant for final C/N, DGH, BGH, and GI. The results provide guidance for optimizing an MSW composting system, leading to an increased decomposition rate and the production of more stable and mature compost. Copyright © 2016 Elsevier Ltd. All rights reserved.
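The factor-screening logic can be illustrated with a two-level full factorial on the four factors named above. The response function and its coefficients are fabricated for the example (they are not the study's data); the sketch just shows how main and interaction effects are estimated from coded ±1 runs.

```python
from itertools import product

# Hypothetical two-level factorial over the four composting factors, coded
# -1/+1. The germination-index (GI, %) response below is fabricated so that
# C/N and MC dominate; it is not the study's measured data.
def gi(cn, mc, ba, ar):
    return 60 + 12 * cn + 8 * mc + 1 * ba + 0.5 * ar + 3 * cn * mc

runs = list(product([-1, 1], repeat=4))          # 2^4 = 16 runs
responses = [gi(*r) for r in runs]

# Main effect of factor k = mean GI at its +1 level minus mean GI at -1.
def main_effect(k):
    hi = [y for r, y in zip(runs, responses) if r[k] == 1]
    lo = [y for r, y in zip(runs, responses) if r[k] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(k)
           for k, name in enumerate(["C/N", "MC", "BA", "AR"])}

# A two-factor interaction uses the product of the coded columns as contrast.
cn_mc_interaction = sum(y * r[0] * r[1] for r, y in zip(runs, responses)) / 8
```

With these made-up coefficients the C/N and MC main effects (24 and 16 GI points) dwarf BA and AR, mirroring the kind of screening conclusion reported above.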
Galick, Aimee; D'Arrigo-Patrick, Elizabeth; Knudson-Martin, Carmen
2015-08-01
Female heart patients are underdiagnosed and undertreated. The purpose of this qualitative meta-data-analysis was to explain how societal expectations related to gender and the treatment environment influence women's experiences and can inform optimal care. The authors used grounded theory methodology and a social constructionist gender lens to analyze 43 studies (1993-2012) of women's experiences of heart disease. The analysis illustrates how social expectations within both medical and relational contexts led to women experiencing barriers to diagnosis and treatment and inadvertent minimization of their experience and knowledge. Women's descriptions of their experiences suggest three kinds of health care strategies that have the potential to increase women's engagement with heart disease treatment and rehabilitation: (a) support give and take in relational connections, (b) identify and acknowledge unique health-promoting behavior, and (c) focus on empowerment. These findings have interdisciplinary implications for practice with women with heart disease. © The Author(s) 2015.
A Numerical, Literal, and Converged Perturbation Algorithm
NASA Astrophysics Data System (ADS)
Wiesel, William E.
2017-09-01
The KAM theorem and von Zeipel's method are applied to a perturbed harmonic oscillator, and it is noted that the KAM methodology does not allow for necessary frequency or angle corrections, while von Zeipel's does. The KAM methodology can be carried out with purely numerical methods, since its generating function does not contain momentum dependence. The KAM iteration is extended to allow for frequency and angle changes, and in the process it can apparently be applied successfully to degenerate systems normally ruled out by the classical KAM theorem. Convergence is observed to be geometric, not exponential, but it proceeds smoothly to machine precision. The algorithm produces a converged perturbation solution by numerical methods, while still retaining literal variable dependence, at least in the vicinity of a given trajectory.
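For readers unfamiliar with perturbation-corrected frequencies, the sketch below numerically checks a first-order frequency correction on a classic perturbed harmonic oscillator, the Duffing equation x'' + x + εx³ = 0, whose perturbation-series frequency is ω ≈ 1 + 3εa²/8 for amplitude a. This is a generic textbook check, not the paper's KAM/von Zeipel algorithm.

```python
import math

# x'' + x + eps*x^3 = 0, started at the turning point x(0)=a, x'(0)=0.
eps = 0.05
a = 1.0

def deriv(x, v):
    return v, -x - eps * x ** 3

def rk4_step(x, v, h):
    k1x, k1v = deriv(x, v)
    k2x, k2v = deriv(x + h/2*k1x, v + h/2*k1v)
    k3x, k3v = deriv(x + h/2*k2x, v + h/2*k2v)
    k4x, k4v = deriv(x + h*k3x, v + h*k3v)
    return (x + h/6*(k1x + 2*k2x + 2*k3x + k4x),
            v + h/6*(k1v + 2*k2v + 2*k3v + k4v))

# Integrate and record the times where v changes sign (the extrema of x),
# which are spaced half a period apart; interpolate the crossing time.
h = 0.001
x, v, t = a, 0.0, 0.0
crossings = []
while len(crossings) < 2 and t < 20.0:
    x_new, v_new = rk4_step(x, v, h)
    if v * v_new < 0:
        crossings.append(t + h * v / (v - v_new))
    x, v, t = x_new, v_new, t + h

period = 2 * (crossings[1] - crossings[0])
omega_numerical = 2 * math.pi / period
omega_perturbation = 1 + 3 * eps * a ** 2 / 8
```

For ε = 0.05 the first-order prediction agrees with the measured frequency to a few parts in 10⁴, the residual being the O(ε²) term of the series.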
Prestigious Science Journals Struggle to Reach Even Average Reliability
Brembs, Björn
2018-01-01
In which journal a scientist publishes is considered one of the most crucial factors determining their career. The underlying common assumption is that only the best scientists manage to publish in a highly selective tier of the most prestigious journals. However, data from several lines of evidence suggest that the methodological quality of scientific experiments does not increase with increasing rank of the journal. On the contrary, an accumulating body of evidence suggests the inverse: methodological quality and, consequently, reliability of published research works in several fields may be decreasing with increasing journal rank. The data supporting these conclusions circumvent confounding factors such as increased readership and scrutiny for these journals, focusing instead on quantifiable indicators of methodological soundness in the published literature and relying in part on semi-automated data extraction from often thousands of publications at a time. As evidence accumulated over the last decade, so grew the realization that the very existence of scholarly journals, due to their inherent hierarchy, constitutes one of the major threats to publicly funded science: hiring, promoting, and funding scientists who publish unreliable science eventually erodes public trust in science. PMID:29515380
Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions
NASA Technical Reports Server (NTRS)
Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.
2011-01-01
A surrogate model methodology is described for predicting, during flight, the residual strength of aircraft structures that sustain discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. Two ductile fracture simulations are presented to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data does, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high fidelity fracture simulation framework provide useful tools for adaptive flight technology.
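A minimal sketch of the surrogate idea above: a tiny one-hidden-layer network (plain NumPy, full-batch gradient descent) fit to an invented residual-strength function of two hypothetical damage parameters, standing in for the paper's finite-element-derived training data. Everything here (data, architecture, hyperparameters) is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: residual strength vs. two damage parameters
# (e.g. normalized hole size and crack length); the ground truth is invented.
X = rng.uniform(0, 1, size=(200, 2))
y = 1.0 - 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2

# One hidden layer of tanh units, trained by plain gradient descent on MSE.
W1 = rng.normal(0, 0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, (h @ W2 + b2).ravel()

def mse(pred):
    return float(np.mean((pred - y) ** 2))

lr = 0.05
_, pred0 = forward(X)
loss_initial = mse(pred0)
for _ in range(2000):
    h, pred = forward(X)
    err = (pred - y)[:, None] / len(y)      # d(MSE)/d(pred), factor 2 folded into lr
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = err @ W2.T * (1 - h ** 2)          # backprop through tanh
    gW1 = X.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
_, pred1 = forward(X)
loss_final = mse(pred1)
```

Once trained, `forward` is cheap to evaluate, which is the point of a surrogate: the expensive fracture simulation is replaced by a millisecond prediction suitable for in-flight use.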
Optimisation Of Cutting Parameters Of Composite Material Laser Cutting Process By Taguchi Method
NASA Astrophysics Data System (ADS)
Lokesh, S.; Niresh, J.; Neelakrishnan, S.; Rahul, S. P. Deepak
2018-03-01
The aim of this work is to develop a laser cutting process model that can predict the relationship between the process input parameters and the resultant surface roughness and kerf width characteristics. The research is based on Design of Experiments (DOE) analysis, using Response Surface Methodology (RSM), one of the most practical and effective techniques for developing a process model. Although RSM has been used previously to optimize the laser process, this work investigates the laser cutting of materials such as composite wood (veneer) to determine the best cutting conditions using the RSM process. The input parameters evaluated are focal length, power supply, and cutting speed; the output responses are kerf width, surface roughness, and temperature. To efficiently optimize and customize the kerf width and surface roughness characteristics, a machine laser cutting process model using a Taguchi L9 orthogonal methodology was proposed.
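The Taguchi L9(3^4) orthogonal array mentioned above can be written out directly. The sketch below checks its balance properties and computes smaller-the-better signal-to-noise ratios for invented kerf-width measurements; the factor-to-column assignment and the data are hypothetical, not from the study.

```python
import math

# Standard L9(3^4) orthogonal array, levels coded 1..3; the first three
# columns stand in for focal length, power, and cutting speed.
L9 = [
    (1, 1, 1, 1), (1, 2, 2, 2), (1, 3, 3, 3),
    (2, 1, 2, 3), (2, 2, 3, 1), (2, 3, 1, 2),
    (3, 1, 3, 2), (3, 2, 1, 3), (3, 3, 2, 1),
]

# Hypothetical measured kerf widths (mm) for the nine runs.
kerf = [0.42, 0.35, 0.51, 0.38, 0.33, 0.44, 0.47, 0.40, 0.36]

# Smaller-the-better S/N ratio: SN = -10*log10(mean(y^2)); one value per run.
sn = [-10 * math.log10(y ** 2) for y in kerf]

def mean_sn_by_level(col):
    """Average S/N at each level of one factor column (3 runs per level)."""
    return {lvl: sum(s for row, s in zip(L9, sn) if row[col] == lvl) / 3
            for lvl in (1, 2, 3)}

best_levels = {}
for col in range(3):
    by_level = mean_sn_by_level(col)
    best_levels[col] = max(by_level, key=by_level.get)  # highest S/N wins
```

The two assertions worth knowing about an L9 array are that every level appears three times in each column and every pair of columns contains all nine level combinations exactly once, which is what lets nine runs estimate four 3-level factors.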
Summarizing Monte Carlo Results in Methodological Research.
ERIC Educational Resources Information Center
Harwell, Michael R.
Monte Carlo studies of statistical tests are prominently featured in the methodological research literature. Unfortunately, the information from these studies does not appear to have significantly influenced methodological practice in educational and psychological research. One reason is that Monte Carlo studies lack an overarching theory to guide…
NASA Astrophysics Data System (ADS)
Saavedra, Juan Alejandro
Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector, depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology that includes rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered in optimizing these stations are cost, cycle time, reworkability, and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost and allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of several candidate solutions in order to obtain feasible optimal solutions.
The GA evaluates possible solutions based on cost, cycle time, reworkability, and rework benefit. Because this is a multi-objective optimization problem, it provides several possible solutions, presented as chromosomes that clearly state the number and location of the rework stations. The user analyzes these solutions and selects one by deciding which of the four factors considered is most important for the product being manufactured or the company's objective. The major contribution of this study is to provide the user with a methodology for identifying an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time, and to maximize reworkability and rework benefit.
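A minimal sketch of the chromosome encoding described above: a binary genome marks which candidate stations receive a rework substation, and a small elitist GA maximizes an invented scalarized fitness (benefit minus weighted cost). All station data, weights, and GA settings are hypothetical, not the study's.

```python
import random

random.seed(1)

N_STATIONS = 8  # candidate positions along a hypothetical line
COST = [4, 3, 5, 2, 6, 3, 4, 2]        # invented per-station cost
BENEFIT = [5, 2, 7, 3, 9, 4, 5, 1]     # invented rework benefit

def fitness(chrom):
    # Scalarized stand-in for the multi-objective criterion; weight is invented.
    benefit = sum(b * g for g, b in zip(chrom, BENEFIT))
    cost = sum(c * g for g, c in zip(chrom, COST))
    return benefit - 0.8 * cost

def tournament(pop):
    a, b = random.sample(pop, 2)
    return max(a, b, key=fitness)

def crossover(p1, p2):
    cut = random.randrange(1, N_STATIONS)   # single-point crossover
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.1):
    return [g ^ 1 if random.random() < rate else g for g in chrom]

pop = [[random.randint(0, 1) for _ in range(N_STATIONS)] for _ in range(30)]
best_initial_fitness = fitness(max(pop, key=fitness))
for _ in range(40):
    elite = max(pop, key=fitness)           # elitism: keep the best unchanged
    pop = [elite] + [mutate(crossover(tournament(pop), tournament(pop)))
                     for _ in range(len(pop) - 1)]
best = max(pop, key=fitness)
```

The final `best` chromosome reads directly as "which stations get a rework substation", mirroring the paper's description of solutions; a true multi-objective GA would instead keep a Pareto front rather than a single scalarized score.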
Methodology Series Module 7: Ecologic Studies and Natural Experiments
Setia, Maninder Singh
2017-01-01
In this module, we have discussed study designs that have not been covered in the previous modules – ecologic studies and natural experiments. In an ecologic study, the unit of analysis is a group or aggregate rather than the individual. It may be the characteristics of districts, states, or countries. For example, per capita income across countries, income quintiles across districts, and proportion of college graduates in states. If the data already exist (such as global measures and prevalence of diseases, data sets such as the National Family Health Survey, census data), then ecologic studies are cheap and data are easy to collect. However, one needs to be aware of the “ecologic fallacy.” The researcher should not interpret ecologic level results at the individual level. In “natural experiments,” the researcher does not assign the exposure (as is the case in interventional studies) to the groups in the study. The exposure is assigned by a natural process. This may be due to existing policies or services (for example, one city has laws against specific vehicles and the other city does not); changes in services or policies; or the introduction of new laws (such as helmets for bikers and seat belts for cars). We would like to encourage researchers to explore the possibility of using these study designs to conduct studies. PMID:28216721
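The ecologic fallacy warned about above can be shown in a few lines of code: in the constructed data below, district-level means correlate perfectly positively while the individual-level correlation is negative. The data are fabricated solely to illustrate the fallacy.

```python
def pearson(xs, ys):
    # Plain Pearson correlation coefficient (no external libraries).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Five fictional "districts": within each district the individual exposure and
# outcome move in OPPOSITE directions, but district means rise together.
exposure, outcome = [], []
for g in range(5):            # g is both the district index and its mean level
    for d in (-3, 0, 3):      # within-district spread, larger than between-district
        exposure.append(g + d)
        outcome.append(g - d)

# Aggregate (district-level) analysis uses the district means.
agg_exposure = [sum(exposure[3 * g:3 * g + 3]) / 3 for g in range(5)]
agg_outcome = [sum(outcome[3 * g:3 * g + 3]) / 3 for g in range(5)]

r_group = pearson(agg_exposure, agg_outcome)       # strongly positive
r_individual = pearson(exposure, outcome)          # negative
```

An analyst who saw only the district-level correlation would conclude the opposite of what is true for individuals, which is exactly the interpretive mistake the module cautions against.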
Power Analysis Tutorial for Experimental Design Software
2014-11-01
INSTITUTE FOR DEFENSE ANALYSES, IDA Document D-5205, November 2014: Power Analysis Tutorial for Experimental Design Software. The Test and Evaluation (T&E) community is increasing its employment of Design of Experiments (DOE), a rigorous methodology for planning and evaluating
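A minimal sketch of the kind of power calculation such tutorials cover, using a normal approximation for a two-sided, two-sample comparison of means (stdlib-only). The approximation slightly overstates power at small n relative to an exact t-based calculation; the numbers below are a standard textbook scenario, not from the IDA document.

```python
import math

def normal_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def z_quantile(p):
    # Invert the normal CDF by bisection (keeps the sketch dependency-free).
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = (lo + hi) / 2
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def power_two_sample(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample comparison of means,
    for a standardized effect size (Cohen's d) and equal group sizes."""
    z_crit = z_quantile(1 - alpha / 2)
    noncentrality = effect_size * math.sqrt(n_per_group / 2)
    return normal_cdf(noncentrality - z_crit)   # lower-tail term is negligible

p = power_two_sample(0.5, 64)   # classic case: d = 0.5, n = 64 per group
```

The classic benchmark holds: detecting a medium effect (d = 0.5) at α = 0.05 with 64 subjects per group yields roughly 80% power, and power grows monotonically with n.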
Mhaskar, Rahul; Djulbegovic, Benjamin; Magazin, Anja; Soares, Heloisa P; Kumar, Ambuj
2012-06-01
Objective: to assess whether the reported methodological quality of randomized controlled trials (RCTs) reflects the actual methodological quality, and to evaluate the association of effect size (ES) and sample size with methodological quality. Design: systematic review; a retrospective analysis of all consecutive phase III RCTs published by eight National Cancer Institute Cooperative Groups up to 2006. Data were extracted from protocols (actual quality) and publications (reported quality) for each study. Four hundred twenty-nine RCTs met the inclusion criteria. Overall reporting of methodological quality was poor and did not reflect the actual high methodological quality of the RCTs. The results showed no association between sample size and the actual methodological quality of a trial. Poor reporting of allocation concealment and blinding exaggerated the ES by 6% (ratio of hazard ratios [RHR]: 0.94; 95% confidence interval [CI]: 0.88, 0.99) and 24% (RHR: 1.24; 95% CI: 1.05, 1.43), respectively. However, assessment of actual quality showed no association between ES and methodological quality. The largest study to date shows that poor quality of reporting does not reflect actual high methodological quality. Assessing the impact of quality on the ES based on reported quality can produce misleading results. Copyright © 2012 Elsevier Inc. All rights reserved.
Experiment on building Sundanese lexical database based on WordNet
NASA Astrophysics Data System (ADS)
Dewi Budiwati, Sari; Nurani Setiawan, Novihana
2018-03-01
Sundanese is the second-largest local language used in Indonesia. Currently, however, it is rarely used, since Indonesian serves as the national language and the language of everyday conversation. We built a Sundanese lexical database based on WordNet and the Indonesian WordNet as an alternative way to preserve the language as part of the local culture. WordNet was chosen because Sundanese has three levels of word delivery, called the language code of conduct. Web user participants were involved in this research to specify Sundanese semantic relations, and an expert linguist validated the relations. The merge methodology was implemented in this experiment. Some words have equivalents in WordNet, while others do not, since some words do not exist in the other culture.
Artificial neural networks in evaluation and optimization of modified release solid dosage forms.
Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica
2012-10-18
Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress in computer science has had an impact on pharmaceutical development as well. Simultaneously with the implementation of statistical methods, machine learning tools have taken an important place in drug formulation. Twenty years ago, the first papers describing the application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN), in modeling of production, drug release, and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms.
The local-ladder effect: social status and subjective well-being.
Anderson, Cameron; Kraus, Michael W; Galinsky, Adam D; Keltner, Dacher
2012-07-01
Dozens of studies in different nations have revealed that socioeconomic status only weakly predicts an individual's subjective well-being (SWB). These results imply that although the pursuit of social status is a fundamental human motivation, achieving high status has little impact on one's SWB. However, we propose that sociometric status-the respect and admiration one has in face-to-face groups (e.g., among friends or coworkers)-has a stronger effect on SWB than does socioeconomic status. Using correlational, experimental, and longitudinal methodologies, four studies found consistent evidence for a local-ladder effect: Sociometric status significantly predicted satisfaction with life and the experience of positive and negative emotions. Longitudinally, as sociometric status rose or fell, SWB rose or fell accordingly. Furthermore, these effects were driven by feelings of power and social acceptance. Overall, individuals' sociometric status matters more to their SWB than does their socioeconomic status.
Chen, Po-Hao; Loehfelm, Thomas W; Kamer, Aaron P; Lemmon, Andrew B; Cook, Tessa S; Kohli, Marc D
2016-12-01
The residency review committee of the Accreditation Council of Graduate Medical Education (ACGME) collects data on resident exam volume and sets minimum requirements. However, this data is not made readily available, and the ACGME does not share their tools or methodology. It is therefore difficult to assess the integrity of the data and determine if it truly reflects relevant aspects of the resident experience. This manuscript describes our experience creating a multi-institutional case log, incorporating data from three American diagnostic radiology residency programs. Each of the three sites independently established automated query pipelines from the various radiology information systems in their respective hospital groups, thereby creating a resident-specific database. Then, the three institutional resident case log databases were aggregated into a single centralized database schema. Three hundred thirty residents and 2,905,923 radiologic examinations over a 4-year span were catalogued using 11 ACGME categories. Our experience highlights big data challenges including internal data heterogeneity and external data discrepancies faced by informatics researchers.
Political Transformation and Research Methodology in Doctoral Education
ERIC Educational Resources Information Center
Herman, Chaya
2010-01-01
This paper examines the relationship between political change and epistemologies and methodologies employed at doctorate level. It does so by analysing the range of topics, questions and methodologies used by doctoral students at the University of Pretoria's Faculty of Education between 1985 and 2005--a time-frame that covers the decade before and…
The ethics of placebo-controlled trials: methodological justifications.
Millum, Joseph; Grady, Christine
2013-11-01
The use of placebo controls in clinical trials remains controversial. Ethical analysis and international ethical guidance permit the use of placebo controls in randomized trials when scientifically indicated in four cases: (1) when there is no proven effective treatment for the condition under study; (2) when withholding treatment poses negligible risks to participants; (3) when there are compelling methodological reasons for using placebo, and withholding treatment does not pose a risk of serious harm to participants; and, more controversially, (4) when there are compelling methodological reasons for using placebo, and the research is intended to develop interventions that can be implemented in the population from which trial participants are drawn, and the trial does not require participants to forgo treatment they would otherwise receive. The concept of methodological reasons is essential to assessing the ethics of placebo controls in these controversial last two cases. This article sets out key considerations relevant to considering whether methodological reasons for a placebo control are compelling. © 2013.
Methodology to identify risk-significant components for inservice inspection and testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.
1992-08-01
Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues.
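A minimal sketch of probabilistic risk ranking in the spirit described above: relative risk taken as likelihood times consequence, then sorted to prioritize inspection and testing. The component names and values are invented for illustration, not from the report.

```python
# Hypothetical component list: (name, annual failure probability, consequence score).
components = [
    ("primary pump seal", 0.020, 8.0),
    ("relief valve",      0.005, 9.5),
    ("instrument line",   0.050, 1.5),
    ("vessel weld",       0.001, 10.0),
]

# Rank by relative risk = likelihood x consequence; highest risk first.
ranked = sorted(components, key=lambda c: c[1] * c[2], reverse=True)
inspection_order = [name for name, p, c in ranked]
```

Note how the ranking differs from sorting on either input alone: the instrument line outranks the relief valve despite a far smaller consequence, because it fails ten times as often, which is precisely the trade-off a risk-based program is meant to capture.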
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Brockway, Walter F.; Lung, Bruce
The primary objective of the Department of Energy's (DOE) Energy Treasure Hunt In-Plant Training (INPLT) is to train Better Plants partner employees to lead and conduct future energy efficiency Treasure Hunts within their facilities without DOE assistance. By taking a "learning-by-doing" approach, this INPLT, like other DOE INPLT trainings, has the added benefit of uncovering real energy and cost-saving opportunities. This INPLT leverages DOE and Better Plants technical staff, resources, and tools, and the EPA "Energy Treasure Hunt Guide: Simple Steps to Finding Energy Savings" process. While Treasure Hunts are a relatively well-known approach to identifying energy savings in manufacturing plants, DOE is adding several additional elements in its Treasure Hunt Exchanges. The first element is technical assistance and methodology: DOE provides high-quality technical resources, such as energy efficiency calculators, fact sheets, and source books, to facilitate the Treasure Hunt process, and teaches the fundamentals of 1) profiling equipment, 2) collecting data, and 3) data and ROI calculation methodologies. Another element is the "train the trainer" approach, wherein the training facilitator trains at least one partner employee to facilitate future treasure hunts. In addition, DOE provides energy diagnostic equipment and teaches the participants how to use it. Finally, DOE also offers partners the opportunity to exchange teams of employees, either within a partner's enterprise or with other partners, to conduct the treasure hunt in each other's facilities. This exchange of teams is important because each team can bring different insights and uncover energy-saving opportunities that would otherwise be missed. This paper will discuss the DOE methodology and the early results and lessons learned from DOE's Energy Treasure Hunt In-Plant Trainings at Better Plants partner facilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Paul D. Bayless; Richard W. Johnson
2010-09-01
The Oregon State University (OSU) High Temperature Test Facility (HTTF) is an integral experimental facility that will be constructed on the OSU campus in Corvallis, Oregon. The HTTF project was initiated by the U.S. Nuclear Regulatory Commission (NRC) on September 5, 2008 as Task 4 of the 5-year High Temperature Gas Reactor Cooperative Agreement via NRC Contract 04-08-138. Until August 2010, when a DOE contract was initiated to fund additional capabilities for the HTTF project, all of the funding support for the HTTF was provided by the NRC via their cooperative agreement. The U.S. Department of Energy (DOE) began their involvement with the HTTF project in late 2009 via the Next Generation Nuclear Plant (NGNP) project. Because the NRC interests in HTTF experiments centered only on the depressurized conduction cooldown (DCC) scenario, NGNP involvement focused on expanding the experimental envelope of the HTTF to include steady-state operations and also the pressurized conduction cooldown (PCC). Since DOE has incorporated the HTTF as an ingredient in the NGNP thermal-fluids validation program, several important outcomes should be noted: 1. The reference prismatic reactor design that serves as the basis for scaling the HTTF became the modular high temperature gas-cooled reactor (MHTGR). The MHTGR has also been chosen as the reference design for all of the other NGNP thermal-fluid experiments. 2. The NGNP validation matrix is being planned using the same scaling strategy that has been implemented to design the HTTF, i.e., the hierarchical two-tiered scaling methodology developed by Zuber in 1991. Using this approach, a preliminary validation matrix has been designed that integrates the HTTF experiments with the other experiments planned for the NGNP thermal-fluids verification and validation project. 3.
Initial analyses showed that the inherent power capability of the OSU infrastructure, which only allowed a total operational facility power capability of 0.6 MW, is inadequate to permit steady-state operation at reasonable conditions. 4. To enable the HTTF to operate at more representative steady-state conditions, DOE recently allocated funding via a DOE subcontract to HTTF to permit an OSU infrastructure upgrade such that 2.2 MW will become available for HTTF experiments. 5. Analyses have been performed to study the relationship between the HTTF and the MHTGR via the hierarchical two-tiered scaling methodology, which has been used successfully in the past, e.g., in scaling the APEX facility to the Westinghouse AP600 plant. These analyses have focused on the relationship between key variables that will be measured in the HTTF and the counterpart variables in the MHTGR, with a focus on natural circulation, using nitrogen as a working fluid, and core heat transfer. 6. Both RELAP5-3D and computational fluid dynamics (CD-Adapco's STAR-CCM+) numerical models of the MHTGR and the HTTF have been constructed, and analyses are underway to study the relationship between the reference reactor and the HTTF. The HTTF is presently being designed. It has a ¼-scale relationship to the MHTGR in both height and diameter. Decisions have been made to design the reactor cavity cooling system (RCCS) simulation as a boundary condition for the HTTF to ensure that (a) the boundary condition is well defined and (b) the boundary condition can be modified easily to achieve the desired heat transfer sink for HTTF experimental operations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buelt, J.L.; Stottlemyre, J.A.; White, M.K.
1991-09-01
Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses information on technologies in a graphical and tabular manner, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buelt, J.L.; Stottlemyre, J.A.; White, M.K.
1991-02-01
Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses information on technologies in a graphical and tabular manner, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
TRAC Innovative Visualization Techniques
2016-11-14
Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and
Optimization of a chondrogenic medium through the use of factorial design of experiments.
Enochson, Lars; Brittberg, Mats; Lindahl, Anders
2012-12-01
The standard culture system for in vitro cartilage research is based on cells in a three-dimensional micromass culture and a defined medium containing the chondrogenic key growth factor, transforming growth factor (TGF)-β1. The aim of this study was to optimize the medium for chondrocyte micromass culture. Human chondrocytes were cultured in different media formulations, designed with a factorial design of experiments (DoE) approach and based on the standard medium for redifferentiation. The significant factors for the redifferentiation of the chondrocytes were determined and optimized in a two-step process through the use of response surface methodology. TGF-β1, dexamethasone, and glucose were significant factors for differentiating the chondrocytes. Compared to the standard medium, TGF-β1 was increased 30%, dexamethasone reduced 50%, and glucose increased 22%. The potency of the optimized medium was validated in a comparative study against the standard medium. The optimized medium resulted in micromass cultures with increased expression of genes important for the articular chondrocyte phenotype and in cultures with increased glycosaminoglycan/DNA content. Optimizing the standard medium with the efficient DoE method, a new medium that gave better redifferentiation for articular chondrocytes was determined.
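A minimal sketch of the response-surface step described above: fit a second-order model to a one-factor dose-response screen and locate the stationary point of the fitted quadratic. The dose grid and response values are invented, noiseless stand-ins, not the study's measurements.

```python
import numpy as np

# Hypothetical screen: one factor level x (e.g. a growth-factor dose relative
# to the standard medium) vs. a redifferentiation score y. Values are invented;
# the true optimum of this fake response sits at x = 1.3.
x = np.array([0.5, 0.75, 1.0, 1.25, 1.5, 1.75, 2.0])
y = -(x - 1.3) ** 2 + 5.0

# Second-order response surface in one factor: y ~ b2*x^2 + b1*x + b0.
# np.polyfit returns coefficients highest-degree first.
b2, b1, b0 = np.polyfit(x, y, 2)

# Stationary point of the fitted quadratic; a maximum when b2 < 0.
x_opt = -b1 / (2 * b2)
```

With real, noisy data the same two lines of algebra apply, but the curvature sign (b2 < 0) should be checked before calling the stationary point an optimum, and a multi-factor screen would use a full quadratic surface with interaction terms.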
Leça, João M; Pereira, Ana C; Vieira, Ana C; Reis, Marco S; Marques, José C
2015-08-05
Vicinal diketones (VDK), namely diacetyl (DC) and 2,3-pentanedione (PN), are compounds naturally found in beer that play a key role in the definition of its aroma. In lager beer, they are responsible for off-flavors (a buttery flavor), and therefore their detection and quantification are of paramount importance to beer producers. Aiming at developing an accurate quantitative monitoring scheme to follow these off-flavor compounds during beer production and in the final product, the headspace solid-phase microextraction (HS-SPME) analytical procedure was tuned through optimally planned experiments, and the final settings were fully validated. Optimal design of experiments (O-DOE) is a computational, statistically oriented approach for designing experiments that are most informative according to a well-defined criterion. This methodology was applied for HS-SPME optimization, leading to the following optimal extraction conditions for the quantification of VDK: a CAR/PDMS fiber, 5 mL of sample in a 20 mL vial, 5 min of pre-incubation followed by 25 min of extraction at 30 °C, with agitation. The validation of the final analytical methodology was performed using a matrix-matched calibration, in order to minimize matrix effects. The following key features were obtained: linearity (R(2) > 0.999, both for diacetyl and 2,3-pentanedione), high sensitivity (LOD of 0.92 μg L(-1) and 2.80 μg L(-1), and LOQ of 3.30 μg L(-1) and 10.01 μg L(-1), for diacetyl and 2,3-pentanedione, respectively), recoveries of approximately 100%, and suitable precision (repeatability and reproducibility lower than 3% and 7.5%, respectively). The applicability of the methodology was fully confirmed through an independent analysis of several beer samples, with analyte concentrations ranging from 4 to 200 μg L(-1). Copyright © 2015 Elsevier B.V. All rights reserved.
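A minimal sketch of deriving LOD and LOQ from a matrix-matched calibration line, using the common residual-standard-deviation convention (LOD = 3.3 s/slope, LOQ = 10 s/slope). The calibration points below are invented; the study's own figures come from its validated method.

```python
# Invented calibration data: spiked concentration (ug/L) vs. instrument response.
conc = [0, 5, 10, 20, 50, 100]
area = [0.8, 10.6, 20.1, 41.0, 100.9, 200.5]

# Ordinary least-squares line through the calibration points.
n = len(conc)
mx = sum(conc) / n
my = sum(area) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, area))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

# Residual standard deviation about the regression (n - 2 degrees of freedom).
resid = [y - (intercept + slope * x) for x, y in zip(conc, area)]
s = (sum(r * r for r in resid) / (n - 2)) ** 0.5

lod = 3.3 * s / slope   # limit of detection
loq = 10 * s / slope    # limit of quantification
```

By construction LOQ/LOD = 10/3.3 under this convention; alternatives (e.g. blank-based signal-to-noise estimates) give numerically different but conceptually analogous limits.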
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.
2016-07-26
It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are used, respectively, for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.
Advanced flight control system study
NASA Technical Reports Server (NTRS)
Hartmann, G. L.; Wall, J. E., Jr.; Rang, E. R.; Lee, H. P.; Schulte, R. W.; Ng, W. K.
1982-01-01
A fly by wire flight control system architecture designed for high reliability includes spare sensor and computer elements to permit safe dispatch with failed elements, thereby reducing unscheduled maintenance. A methodology capable of demonstrating that the architecture does achieve the predicted performance characteristics consists of a hierarchy of activities ranging from analytical calculations of system reliability and formal methods of software verification to iron bird testing followed by flight evaluation. Interfacing this architecture to the Lockheed S-3A aircraft for flight test is discussed. This testbed vehicle can be expanded to support flight experiments in advanced aerodynamics, electromechanical actuators, secondary power systems, flight management, new displays, and air traffic control concepts.
Prediction of microcracking in composite laminates under thermomechanical loading
NASA Technical Reports Server (NTRS)
Maddocks, Jason R.; Mcmanus, Hugh L.
1995-01-01
Composite laminates used in space structures are exposed to both thermal and mechanical loads. Cracks in the matrix form, changing the laminate thermoelastic properties. An analytical methodology is developed to predict microcrack density in a general laminate exposed to an arbitrary thermomechanical load history. The analysis uses a shear lag stress solution in conjunction with an energy-based cracking criterion. Experimental investigation was used to verify the analysis. Correlation between analysis and experiment is generally excellent. The analysis does not capture machining-induced cracking, or observed delayed crack initiation in a few ply groups, but these errors do not prevent the model from being a useful preliminary design tool.
Graphics performance in rich Internet applications.
Hoetzlein, Rama C
2012-01-01
Rendering performance for rich Internet applications (RIAs) has recently focused on the debate between using Flash and HTML5 for streaming video and gaming on mobile devices. A key area not widely explored, however, is the scalability of raw bitmap graphics performance for RIAs. Does Flash render animated sprites faster than HTML5? How much faster is WebGL than Flash? Answers to these questions are essential for developing large-scale data visualizations, online games, and truly dynamic websites. A new test methodology analyzes graphics performance across RIA frameworks and browsers, revealing specific performance outliers in existing frameworks. The results point toward a future in which all online experiences might be GPU accelerated.
Development of a Response Surface Thermal Model for Orion Mated to the International Space Station
NASA Technical Reports Server (NTRS)
Miller, Stephen W.; Meier, Eric J.
2010-01-01
A study was performed to determine if a Design of Experiments (DOE)/Response Surface Methodology could be applied to on-orbit thermal analysis and produce a set of Response Surface Equations (RSEs) that accurately predict vehicle temperatures. The study used an integrated thermal model of the International Space Station and the Orion outer mold line model. Five separate factors were identified for study: yaw, pitch, roll, beta angle, and the environmental parameters. Twenty external Orion temperatures were selected as the responses. A DOE case matrix of 110 runs was developed. The data from these cases were analyzed to produce an RSE for each of the temperature responses. The initial agreement between the engineering data and the RSE predictions was encouraging, although many RSEs had large uncertainties on their predictions. Fourteen verification cases were developed to test the predictive powers of the RSEs. The verification showed mixed results, with some RSEs predicting temperatures matching the engineering data within the uncertainty bands, while others had very large errors. While this study does not irrefutably prove that the DOE/RSM approach can be applied to on-orbit thermal analysis, it does demonstrate that the technique has the potential to predict temperatures. Additional work is needed to better identify the cases needed to produce the RSEs.
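The fit-then-verify workflow in this abstract (build RSEs from a DOE case matrix, then test them on held-out verification cases) can be sketched as follows. The quadratic model form, the factor count, and the synthetic "engineering data" are assumptions for illustration; the study's actual RSEs are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def quad_features(X):
    """[1, x_i, x_i^2, x_i*x_j] feature expansion for a quadratic RSE."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

def true_temp(X):
    # Stand-in "engineering data" surface for one temperature response.
    return 250 + 30 * X[:, 0] - 12 * X[:, 1] ** 2 + 8 * X[:, 0] * X[:, 2]

# 110 DOE runs over 3 coded factors (e.g. yaw, pitch, beta angle).
X_doe = rng.uniform(-1, 1, size=(110, 3))
beta, *_ = np.linalg.lstsq(quad_features(X_doe), true_temp(X_doe), rcond=None)

# Verification: predict 14 held-out cases and inspect the worst-case error.
X_ver = rng.uniform(-1, 1, size=(14, 3))
err = np.abs(quad_features(X_ver) @ beta - true_temp(X_ver))
worst = err.max()  # near zero here, since the stand-in surface is quadratic
```

In practice the verification errors are the interesting output: large errors on held-out cases, as reported above, indicate the DOE matrix did not span the behavior of that response.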
Xie, Jinfu; Horton, Melanie; Zorman, Julie; Antonello, Joseph M.; Zhang, Yuhua; Arnold, Beth A.; Secore, Susan; Xoconostle, Rachel; Miezeiewski, Matthew; Wang, Su; Price, Colleen E.; Thiriot, David; Goerke, Aaron; Gentile, Marie-Pierre; Skinner, Julie M.
2014-01-01
Clostridium difficile strains producing binary toxin, in addition to toxin A (TcdA) and toxin B (TcdB), have been associated with more severe disease and increased recurrence of C. difficile infection in recent outbreaks. Binary toxin comprises two subunits (CDTa and CDTb) and catalyzes the ADP-ribosylation of globular actin (G-actin), which leads to the depolymerization of filamentous actin (F-actin) filaments. A robust assay is highly desirable for detecting the cytotoxic effect of the toxin and the presence of neutralizing antibodies in animal and human sera to evaluate vaccine efficacy. We describe here the optimization, using design-of-experiment (DOE) methodology, of a high-throughput assay to measure the toxin potency and neutralizing antibodies (NAb) against binary toxin. Vero cells were chosen from a panel of cells screened for sensitivity and specificity. We have successfully optimized the CDTa-to-CDTb molar ratio, toxin concentration, cell-seeding density, and sera-toxin preincubation time in the NAb assay using DOE methodology. This assay is robust, produces linear results across serial dilutions of hyperimmune serum, and can be used to quantify neutralizing antibodies in sera from hamsters and monkeys immunized with C. difficile binary toxin-containing vaccines. The assay will be useful for C. difficile diagnosis, for epidemiology studies, and for selecting and optimizing vaccine candidates. PMID:24623624
ERIC Educational Resources Information Center
Quinlan, K. M.; Male, S.; Baillie, C.; Stamboulis, A.; Fill, J.; Jaffer, Z.
2013-01-01
Threshold concepts were introduced nearly 10 years ago by Ray Land and Jan Meyer. This work has spawned four international conferences and hundreds of papers. Although the idea has clearly gained traction in higher education, this sub-field does not yet have a fully fledged research methodology or a strong critical discourse about methodology.…
Does Maltreatment Beget Maltreatment? A Systematic Review of the Intergenerational Literature
Thornberry, Terence P.; Knight, Kelly E.; Lovegrove, Peter J.
2014-01-01
In this paper, we critically review the literature testing the cycle of maltreatment hypothesis which posits continuity in maltreatment across adjacent generations. That is, we examine whether a history of maltreatment victimization is a significant risk factor for the later perpetration of maltreatment. We begin by establishing 11 methodological criteria that studies testing this hypothesis should meet. They include such basic standards as using representative samples, valid and reliable measures, prospective designs, and different reporters for each generation. We identify 47 studies that investigated this issue and then evaluate them with regard to the 11 methodological criteria. Overall, most of these studies report findings consistent with the cycle of maltreatment hypothesis. Unfortunately, at the same time, few of them satisfy the basic methodological criteria that we established; indeed, even the stronger studies in this area only meet about half of them. Moreover, the methodologically stronger studies present mixed support for the hypothesis. As a result, the positive association often reported in the literature appears to be based largely on the methodologically weaker designs. Based on our systematic methodological review, we conclude that this small and methodologically weak body of literature does not provide a definitive test of the cycle of maltreatment hypothesis. We conclude that it is imperative to develop more robust and methodologically adequate assessments of this hypothesis to more accurately inform the development of prevention and treatment programs. PMID:22673145
A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.
In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. This paper studies the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers with eigenvalue calculations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Disney, R.K.
1994-10-01
The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.
NASA Astrophysics Data System (ADS)
Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.
2017-09-01
This study presents the application of an optimisation method to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. A design of experiments (DOE) for response surface methodology (RSM) was constructed, and by using the equation from RSM, particle swarm optimisation (PSO) was applied. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time, and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage as stated by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO. The warpage improvement of PSO over RSM is only 0.01%. Thus, the optimisation using RSM is already efficient to give the best combination of parameters and an optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
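The RSM-then-PSO step described above can be sketched with a minimal particle swarm minimising a stand-in warpage surrogate. The polynomial below and all PSO hyperparameters are assumptions for illustration; the study's fitted RSM equation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

def warpage(x):
    # Toy quadratic RSM surrogate in two coded parameters (e.g. packing
    # pressure, melt temperature), with its minimum at (0.3, -0.5).
    return 1.0 + (x[..., 0] - 0.3) ** 2 + 2.0 * (x[..., 1] + 0.5) ** 2

def pso(f, dim=2, n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Global-best PSO over the coded box [-1, 1]^dim."""
    pos = rng.uniform(-1, 1, (n, dim))
    vel = np.zeros((n, dim))
    pbest, pbest_val = pos.copy(), f(pos)
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, -1, 1)
        val = f(pos)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, f(g[None, :])[0]

best_x, best_w = pso(warpage)  # best_x near (0.3, -0.5), best_w near 1.0
```

Since the surrogate is a smooth quadratic, the swarm and a direct RSM optimum land at essentially the same point, mirroring the 0.01% difference the study reports.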
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Justin Matthew
These are the slides for a graduate presentation at Mississippi State University. It covers the following: the BRL shaped-charge geometry in PAGOSA, a mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of simulation output using a brute force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method, where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
Comparison of a 3-D CFD-DSMC Solution Methodology With a Wind Tunnel Experiment
NASA Technical Reports Server (NTRS)
Glass, Christopher E.; Horvath, Thomas J.
2002-01-01
A solution method for problems that contain both continuum and rarefied flow regions is presented. The methodology is applied to flow about the 3-D Mars Sample Return Orbiter (MSRO) that has a highly compressed forebody flow, a shear layer where the flow separates from a forebody lip, and a low density wake. Because blunt body flow fields contain such disparate regions, employing a single numerical technique to solve the entire 3-D flow field is often impractical, or the technique does not apply. Direct simulation Monte Carlo (DSMC) could be employed to solve the entire flow field; however, the technique requires inordinate computational resources for continuum and near-continuum regions, and is best suited for the wake region. Computational fluid dynamics (CFD) will solve the high-density forebody flow, but continuum assumptions do not apply in the rarefied wake region. The CFD-DSMC approach presented herein may be a suitable way to obtain a higher fidelity solution.
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiments (DOE) methods: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and then the laser marking process is performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed, and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
Vachhani, Kathak; Pagotto, Andrea; Wang, Yufa; Whyne, Cari; Nam, Diane
2018-01-03
Fracture healing is a lengthy process which fails in 5-10% of cases. Lithium, a low-cost therapeutic used in psychiatric medicine, up-regulates the canonical Wingless pathway crucial for osteoblastic mineralization in fracture healing. A design-of-experiments (DOE) methodology was used to optimize lithium administration parameters (dose, onset time, and treatment duration) to enhance healing in a rat femoral fracture model. In the previously completed first stage (screening), onset time was found to significantly impact healing, with later (day 7 vs. day 3 post-fracture) treatment yielding improved maximum yield torque. The greatest strength was found in healing femurs treated at day 7 post-fracture with a low lithium dose (20 mg/kg) for 2 weeks' duration. This paper describes the findings of the second (optimization) and third (verification) stages of the DOE investigation. Closed traumatic diaphyseal femur fractures were induced in 3-month-old rats. Healing was evaluated on day 28 post-fracture by CT-based morphometry and torsional loading. In optimization, later onset times of day 10 and 14 did not perform as well as day 7 onset. As such, efficacy of the best regimen (20 mg/kg dose given at day 7 onset for 2 weeks' duration) was reassessed in a distinct cohort of animals to complete the DOE verification. A significant 44% higher maximum yield torque (primary outcome) was seen with optimized lithium treatment vs. controls, which paralleled the 46% improvement seen in the screening stage. Successful completion of this robustly designed preclinical DOE study delineates the optimal lithium regimen for enhancing preclinical long-bone fracture healing.
Kierkegaard, Axel; Boij, Susann; Efraimsson, Gunilla
2010-02-01
Acoustic wave propagation in flow ducts is commonly modeled with time-domain non-linear Navier-Stokes equation methodologies. To reduce computational effort, investigations of a linearized approach in frequency domain are carried out. Calculations of sound wave propagation in a straight duct are presented with an orifice plate and a mean flow present. Results of transmission and reflections at the orifice are presented on a two-port scattering matrix form and are compared to measurements with good agreement. The wave propagation is modeled with a frequency domain linearized Navier-Stokes equation methodology. This methodology is found to be efficient for cases where the acoustic field does not alter the mean flow field, i.e., when whistling does not occur.
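The two-port scattering-matrix form mentioned in this abstract relates incoming acoustic wave amplitudes on each side of the orifice to the outgoing ones. A minimal numerical sketch, with wholly assumed example coefficients (not values from the measurements):

```python
import numpy as np

# Two-port scattering matrix: [b1, b2] = S @ [a1, a2], where a_i are
# incoming and b_i outgoing complex wave amplitudes at ports 1 and 2.
# Diagonal terms are reflection coefficients, off-diagonal transmission.
S = np.array([[0.2 + 0.1j, 0.9 + 0.0j],   # assumed illustrative values
              [0.9 + 0.0j, 0.2 + 0.1j]])

incoming = np.array([1.0 + 0j, 0.0 + 0j])  # unit wave incident from port 1 only
outgoing = S @ incoming

R = abs(outgoing[0])  # reflection magnitude at the orifice
T = abs(outgoing[1])  # transmission magnitude past the orifice
```

Comparing such computed R and T against measured scattering-matrix entries, frequency by frequency, is how the simulation/measurement agreement reported above would be checked; R² + T² < 1 here reflects dissipation at the orifice.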
Capturing the 'art' of emergency medicine: Does film foster reflection in medical students?
Brand, Gabrielle; Wise, Steve; Siddiqui, Zarrin S; Celenza, Antonio; Fatovich, Daniel M
2017-08-01
Integrating arts- and humanities-based pedagogy into curricula is of growing interest among medical educators, particularly how it promotes reflection and empathy. Our aim was to explore whether a 2.5-minute film titled 'The Art of the ED' stimulated reflective learning processes in a group of first-year medical students. The film was shown prior to their first clinical placement in an ED. Student participation was voluntary and not assessable. Using an exploratory qualitative research approach, this study drew on data collected from students' individual written reflections, exploring their perceptions of clinical experience in an emergency medicine (EM) attachment. A total of 123 (51% of 240) students submitted a reflection. The qualitative data revealed three main themes: the film gave students the opportunity to preview EM ('While watching the film, I felt like I was the patient and the doctor all at once, in that I was living the experience both from within and as an observer …'); it exposed the reality of the ED; and it fostered a growing awareness of the fragility of human life. These findings highlight how visual methodologies (like film) create a safe, non-threatening space to access, experience and process emotion around students' perceptions of EM, and to anticipate and emotionally prepare for their impending clinical experience in the ED. These data support the use of visual methodologies to foster reflective processes that assist medical students to integrate the 'art' of EM, and the development and commitment of core doctoring values of empathy, service and respect for patients.
Meta-analysis of field experiments shows no change in racial discrimination in hiring over time.
Quillian, Lincoln; Pager, Devah; Hexel, Ole; Midtbøen, Arnfinn H
2017-10-10
This study investigates change over time in the level of hiring discrimination in US labor markets. We perform a meta-analysis of every available field experiment of hiring discrimination against African Americans or Latinos (n = 28). Together, these studies represent 55,842 applications submitted for 26,326 positions. We focus on trends since 1989 (n = 24 studies), when field experiments became more common and improved methodologically. Since 1989, whites receive on average 36% more callbacks than African Americans and 24% more callbacks than Latinos. We observe no change in the level of hiring discrimination against African Americans over the past 25 years, although we find modest evidence of a decline in discrimination against Latinos. Accounting for applicant education, applicant gender, study method, occupational groups, and local labor market conditions does little to alter this result. Contrary to claims of declining discrimination in American society, our estimates suggest that levels of discrimination remain largely unchanged, at least at the point of hire.
Venkata Mohan, S; Chandrasekhara Rao, N; Krishna Prasad, K; Murali Krishna, P; Sreenivas Rao, R; Sarma, P N
2005-06-20
The Taguchi robust experimental design (DOE) methodology has been applied to a dynamic anaerobic process treating complex wastewater by an anaerobic sequencing batch biofilm reactor (AnSBBR). For optimizing the process as well as to evaluate the influence of different factors on the process, the uncontrollable (noise) factors have been considered. The Taguchi methodology adopting a dynamic approach is the first of its kind for studying anaerobic process evaluation and process optimization. The designed experimental methodology consisted of four phases--planning, conducting, analysis, and validation--connected sequence-wise to achieve the overall optimization. In the experimental design, five controllable factors, i.e., organic loading rate (OLR), inlet pH, biodegradability (BOD/COD ratio), temperature, and sulfate concentration, along with two uncontrollable (noise) factors, volatile fatty acids (VFA) and alkalinity, at two levels were considered for optimization of the anaerobic system. Thirty-two anaerobic experiments were conducted with different combinations of factors, and the results obtained in terms of substrate degradation rates were processed in Qualitek-4 software to study the main effect of individual factors, the interaction between individual factors, and signal-to-noise (S/N) ratio analysis. Attempts were also made to achieve optimum conditions. Studies on the influence of individual factors on process performance revealed the intensive effect of OLR. In multiple-factor interaction studies, biodegradability combined with other factors, such as temperature, pH, and sulfate, showed maximum influence over the process performance.
The optimum conditions for the efficient performance of the anaerobic system in treating complex wastewater, obtained by considering dynamic (noise) factors, are a higher organic loading rate of 3.5 kg COD/m3/day, neutral pH with high biodegradability (BOD/COD ratio of 0.5), a mesophilic temperature range (40 °C), and low sulfate concentration (700 mg/L). The optimization resulted in enhanced anaerobic performance (56.7%), from a substrate degradation rate (SDR) of 1.99 to 3.13 kg COD/m3/day. Considering the obtained optimum factors, further validation experiments were carried out, which showed enhanced process performance (3.04 kg COD/m3/day from 1.99 kg COD/m3/day), accounting for a 52.13% improvement with the optimized process conditions. The proposed method facilitated a systematic mathematical approach to understanding the complex multi-species anaerobic process treating complex chemical wastewater while accounting for the uncontrollable factors.
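The S/N ratio analysis step named in the abstract above can be sketched for a larger-the-better response such as substrate degradation rate. The degradation rates under the two noise-factor (VFA/alkalinity) conditions are invented numbers for illustration only.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio (dB) for a larger-the-better response:
    -10 * log10(mean(1 / y^2)) over the noise-condition replicates."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical SDR values (kg COD/m3/day) for two factor combinations,
# each run under two VFA/alkalinity noise conditions:
run_a = [1.9, 2.1]
run_b = [3.0, 3.2]

sn_a = sn_larger_is_better(run_a)
sn_b = sn_larger_is_better(run_b)  # higher S/N -> more robust, preferred
```

Ranking factor-level combinations by S/N rather than by the raw mean is what makes the Taguchi result robust: a combination that degrades well under both noise conditions beats one that is good on average but sensitive to VFA/alkalinity swings.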
Högden, Fabia; Hütter, Mandy; Unkelbach, Christian
2018-02-26
The role of awareness in evaluative learning has been thoroughly investigated with a variety of theoretical and methodological approaches. We investigated evaluative conditioning (EC) without awareness with an approach that conceptually provides optimal conditions for unaware learning: the Continuous Flash Suppression (CFS) paradigm. In CFS, a stimulus presented to one eye can be rendered invisible for a prolonged duration by presenting a high-contrast dynamic pattern to the other eye. The suppressed stimulus is nevertheless processed. First, Experiment 1 established EC effects in a pseudo-CFS setup without suppression. Experiment 2 then employed CFS to suppress conditioned stimuli (CSs) from awareness while the unconditioned stimuli (USs) were visible. While Experiments 1 and 2 used a between-participants manipulation of CS suppression, Experiments 3 and 4 both manipulated suppression within participants. We observed EC effects when CSs were not suppressed, but found no EC effects when the CS was suppressed from awareness. We relate our finding to previous research and discuss theoretical implications for EC.
Development of a High-Throughput Ion-Exchange Resin Characterization Workflow.
Liu, Chun; Dermody, Daniel; Harris, Keith; Boomgaard, Thomas; Sweeney, Jeff; Gisch, Daryl; Goltz, Bob
2017-06-12
A novel high-throughput (HTR) ion-exchange (IEX) resin workflow has been developed for characterizing the ion exchange equilibrium of commercial and experimental IEX resins across a range of different applications where the water environment differs from site to site. Because of its much higher throughput, design of experiments (DOE) methodology can be easily applied for studying the effects of multiple factors on resin performance. Two case studies are presented to illustrate the efficacy of the combined HTR workflow and DOE method. In case study one, a series of anion exchange resins were screened for selective removal of NO3- and NO2- in water environments consisting of multiple other anions, varied pH, and ionic strength. A response surface model (RSM) is developed to statistically correlate the resin performance with the water composition and predict the best resin candidate. In case study two, the same HTR workflow and DOE method have been applied for screening different cation exchange resins in terms of the selective removal of Mg2+, Ca2+, and Ba2+ from high total dissolved salt (TDS) water. A master DOE model including all of the cation exchange resins is created to predict divalent cation removal by different IEX resins under specific conditions, from which the best resin candidates can be identified. The successful adoption of the HTR workflow and DOE method for studying the ion exchange of IEX resins can significantly reduce the resources and time needed to address industry and application needs.
Building Efficiency Evaluation and Uncertainty Analysis with DOE's Asset Score Preview
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
Building Energy Asset Score Tool, developed by the U.S. Department of Energy (DOE), is a program to encourage energy efficiency improvement by helping building owners and managers assess a building's energy-related systems independent of operations and maintenance. Asset Score Tool uses a simplified EnergyPlus model to provide an assessment of building systems, through minimum user inputs of basic building characteristics. Asset Score Preview is a newly developed option that allows users to assess their building's systems and the potential value of a more in-depth analysis via an even more simplified approach. This methodology provides a preliminary approach to estimating a building's energy efficiency and potential for improvement. This paper provides an overview of the methodology used for the development of Asset Score Preview and the scoring methodology.
NASA Astrophysics Data System (ADS)
Arafa, Mona G.; Ayoub, Bassam M.
2017-01-01
Niosomes entrapping pregabalin (PG) were prepared using Span 60 and cholesterol in different molar ratios by the hydration method; the remaining PG in the hydrating solution was separated from the vesicles by freeze centrifugation. Optimization of the nano-based carrier of pregabalin (PG) was achieved. A Quality by Design strategy was successfully employed to obtain PG-loaded niosomes with the desired properties. The optimal particle size, drug release, and entrapment efficiency were attained with the Minitab® program using design of experiments (DOE), which predicted the best parameters by investigating the combined effect of different factors simultaneously. A Pareto chart was used in the screening step to exclude the insignificant variables, while response surface methodology (RSM) was used in the optimization step to study the significant factors. The best formula was selected to prepare topical hydrogels loaded with niosomal PG using HPMC and Carbopol 934. It was verified, by means of mechanical and rheological tests, that addition of the vesicles to the gel matrix significantly affected the gel network. In vitro release and ex vivo permeation experiments were carried out. Delivery of PG molecules followed Higuchi kinetics with non-Fickian diffusion. The present work will be of interest to the pharmaceutical industry as a controlled transdermal alternative to the conventional oral route.
2009 DOE Vehicle Technologies Program Annual Merit Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2009-10-01
Annual Merit Review and Peer Evaluation Meeting to review the FY2008 accomplishments and FY2009 plans for the Vehicle Technologies Program, and to provide an opportunity for industry, government, and academia to give input to DOE on the Program through a structured and formal methodology.
ERIC Educational Resources Information Center
Zelnio, Ryan J.
2013-01-01
This dissertation seeks to contribute to a fuller understanding of how international scientific collaboration has affected national scientific systems. It does this by developing three methodological approaches grounded in social complexity theory and applying them to the evaluation of national scientific systems. The first methodology identifies…
Capturing Individual Uptake: Toward a Disruptive Research Methodology
ERIC Educational Resources Information Center
Bastian, Heather
2015-01-01
This article presents and illustrates a qualitative research methodology for studies of uptake. It does so by articulating a theoretical framework for qualitative investigations of uptake and detailing a research study designed to invoke and capture students' uptakes in a first-year writing classroom. The research design sought to make uptake…
Lithographic process window optimization for mask aligner proximity lithography
NASA Astrophysics Data System (ADS)
Voelkel, Reinhard; Vogler, Uwe; Bramati, Arianna; Erdmann, Andreas; Ünal, Nezih; Hofmann, Ulrich; Hennemeyer, Marc; Zoberbier, Ralph; Nguyen, David; Brugger, Juergen
2014-03-01
We introduce a complete methodology for process window optimization in proximity mask aligner lithography. The commercially available lithography simulation software LAB from GenISys GmbH was used for simulation of light propagation and 3D resist development. The methodology was tested for the practical example of lines and spaces, 5 micron half-pitch, printed in a 1 micron thick layer of AZ® 1512HS positive photoresist on a silicon wafer. A SUSS MicroTec MA8 mask aligner, equipped with MO Exposure Optics®, was used in both simulation and experiment. MO Exposure Optics® is the latest generation of illumination systems for mask aligners, providing telecentric illumination and excellent light uniformity over the full mask field. It allows the lithography engineer to freely shape the angular spectrum of the illumination light (customized illumination), which is a mandatory requirement for process window optimization. Three different illumination settings were tested for proximity gaps from 0 to 100 microns. The results obtained prove that the introduced process window methodology is a major step toward more robust processes in mask aligner lithography. The most remarkable outcome of the study is that a smaller exposure gap does not automatically lead to better print results in proximity lithography, contrary to what a lithographer's instinct would suggest. With more than 5,000 mask aligners installed in research and industry worldwide, the proposed process window methodology could have a significant impact on yield improvement and cost savings in industry.
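A process window, as used in the abstract above, is the set of exposure conditions that keeps the printed feature within specification; the window common to several proximity gaps is simply the intersection of the per-gap dose intervals. The sketch below illustrates that intersection step only; all dose values are hypothetical.

```python
def overlapping_window(windows):
    """Intersect per-gap exposure-dose windows given as (min_dose, max_dose).

    Returns the dose interval that prints within spec at every gap,
    or None if the windows do not overlap.
    """
    lo = max(w[0] for w in windows)
    hi = min(w[1] for w in windows)
    return (lo, hi) if lo < hi else None

# Hypothetical dose windows (mJ/cm^2) at three proximity gaps (microns).
windows = {0: (38.0, 55.0), 50: (42.0, 60.0), 100: (47.0, 58.0)}
common = overlapping_window(list(windows.values()))
print(common)  # dose range usable at all three gaps
```

In a real flow the per-gap intervals would come from simulated or measured CD-versus-dose curves rather than being supplied by hand.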
On Improving the Experiment Methodology in Pedagogical Research
ERIC Educational Resources Information Center
Horakova, Tereza; Houska, Milan
2014-01-01
The paper shows how the methodology for a pedagogical experiment can be improved by including a pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using, for example, the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerdes, K.D.; Holtzscheiter, E.W.
2006-07-01
The U.S. Department of Energy's (DOE) Office of Environmental Management (EM) has collaborated with the Russian Federal Atomic Energy Agency - Rosatom (formerly Minatom) for 14 years on waste management challenges of mutual concern. Currently, EM is cooperating with Rosatom to explore issues related to high-level waste and investigate Russian experience and technologies that could support EM site cleanup needs. EM and Rosatom are currently implementing six collaborative projects on high-level waste issues: 1) Advanced Melter Technology Application to the U.S. DOE Defense Waste Processing Facility (DWPF) - Cold Crucible Induction Heated Melter (CCIM); 2) Design Improvements to the Cold Crucible Induction Heated Melter; 3) Long-term Performance of Hanford Low-Activity Glasses in Burial Environments; 4) Low-Activity-Waste (LAW) Glass Sulfur Tolerance; 5) Improved Retention of Key Contaminants of Concern in Low Temperature Immobilized Waste Forms; and 6) Documentation of Mixing and Retrieval Experience at Zheleznogorsk. Preliminary results and the path forward for these projects are discussed. An overview of two new projects, 7) Entombment Technology Performance and Methodology for the Future and 8) Radiation Migration Studies at Key Russian Nuclear Disposal Sites, is also provided. The purpose of this paper is to provide an overview of EM's objectives for participating in cooperative activities with the Russian Federal Atomic Energy Agency, present programmatic and technical information on these activities, and outline specific technical collaborations currently underway and planned to support DOE's cleanup and closure mission. (authors)
Design of experiments (DoE) in pharmaceutical development.
N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios
2017-06-01
At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it becomes possible to build quality into the product by adopting Deming's profound knowledge approach, comprising system thinking, variation understanding, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It focused heavily on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies rather than by implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation. As such, DoE is presented in detail, since it represents the first choice for rational pharmaceutical development.
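The contrast the review draws between OFAT and DoE can be made concrete with the simplest DoE building block, a two-level full factorial design. The sketch below enumerates a 2^3 design in coded units and estimates each factor's main effect; the response values are invented for illustration.

```python
from itertools import product

def full_factorial(levels_per_factor):
    """All runs of a full factorial design in coded levels (e.g. -1/+1)."""
    return list(product(*levels_per_factor))

def main_effect(runs, responses, factor_index):
    """Main effect of one factor: mean response at +1 minus mean at -1."""
    hi = [y for run, y in zip(runs, responses) if run[factor_index] == 1]
    lo = [y for run, y in zip(runs, responses) if run[factor_index] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# 2^3 design (8 runs); hypothetical yield responses for illustration only.
runs = full_factorial([(-1, 1)] * 3)
y = [60, 72, 54, 68, 52, 83, 45, 80]
effects = [round(main_effect(runs, y, i), 1) for i in range(3)]
print(effects)  # one main-effect estimate per factor
```

Unlike OFAT, every run here informs every effect estimate, which is why factorial designs extract more information from the same experimental budget.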
Hybrid CMS methods with model reduction for assembly of structures
NASA Technical Reports Server (NTRS)
Farhat, Charbel
1991-01-01
Future on-orbit structures will be designed and built in several stages, each with specific control requirements. Therefore there must be a methodology which can predict the dynamic characteristics of the assembled structure, based on the dynamic characteristics of the subassemblies and their interfaces. The methodology developed by CSC to address this issue is Hybrid Component Mode Synthesis (HCMS). HCMS distinguishes itself from standard component mode synthesis algorithms in the following features: (1) it does not require the subcomponents to have displacement compatible models, which makes it ideal for analyzing the deployment of heterogeneous flexible multibody systems, (2) it incorporates a second-level model reduction scheme at the interface, which makes it much faster than other algorithms and therefore suitable for control purposes, and (3) it does answer specific questions such as 'how does the global fundamental frequency vary if I change the physical parameters of substructure k by a specified amount?'. Because it is based on an energy principle rather than displacement compatibility, this methodology can also help the designer to define an assembly process. Current and future efforts are devoted to applying the HCMS method to design and analyze docking and berthing procedures in orbital construction.
Brain Drain, Brain Gain, and Mobility: Theories and Prospective Methods
ERIC Educational Resources Information Center
Jalowiecki, Bohdan; Gorzelak, Grzegorz Jerzy
2004-01-01
This paper presents some theoretical and methodological considerations associated with the geographical and professional mobility of science professionals, including the conduct by the authors of a large scale survey questionnaire in Poland in 1994. It does not directly relate to research conducted elsewhere in the region, but does reflect…
The RAAF Logistics Study. Volume 4,
1986-10-01
Use of Issue-Based Root Definitions; Application of Soft Systems Methodology to Information Systems Analysis; Conclusion; List of Abbreviations. 'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The soft systems methodology was developed to tackle… while the soft systems methodology has many advantages which recommend it to this type of study area, it does not model the time evolution of a system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Katherine R.; Wall, Anna M.; Dobson, Patrick F.
This paper reviews existing methodologies and reporting codes used to describe extracted energy resources such as coal and oil and describes a comparable proposed methodology to describe geothermal resources. The goal is to provide the U.S. Department of Energy's (DOE) Geothermal Technologies Office (GTO) with a consistent and comprehensible means of assessing the impacts of its funding programs. This framework will allow GTO to assess the effectiveness of research, development, and deployment (RD&D) funding, prioritize funding requests, and demonstrate the value of RD&D programs to the U.S. Congress. Standards and reporting codes used in other countries and energy sectors provide guidance to inform development of a geothermal methodology, but industry feedback and our analysis suggest that the existing models have drawbacks that should be addressed. In order to formulate a comprehensive metric for use by GTO, we analyzed existing resource assessments and reporting methodologies for the geothermal, mining, and oil and gas industries, and we sought input from industry, investors, academia, national labs, and other government agencies. Using this background research as a guide, we describe a methodology for assessing and reporting on GTO funding according to resource knowledge and resource grade (or quality). This methodology would allow GTO to target funding or measure impact by progression of projects or geological potential for development.
Waste treatability guidance program. User's guide. Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toth, C.
1995-12-21
DOE sites across the country generate and manage radioactive, hazardous, mixed, and sanitary wastes. It is necessary for each site to find the technologies and associated capacities required to manage its waste. One role of the DOE HQ Office of Environmental Restoration and Waste Management is to facilitate the integration of the site-specific plans into coherent national plans. DOE has developed a standard methodology for defining and categorizing waste streams into treatability groups based on characteristic parameters that influence waste management technology needs. This Waste Treatability Guidance Program automates the Guidance Document for the categorization of waste information into treatability groups; this application provides a consistent implementation of the methodology across the National TRU Program. This User's Guide provides instructions on how to use the program, including installation instructions and program operation. This document satisfies the requirements of the Software Quality Assurance Plan.
Sobczyk, Mateusz; Michno, Klaudia; Kosztyła, Paulina; Stec, Daniel; Michalczyk, Łukasz
2015-12-01
The acute toxicity of ammonia on Thulinius ruffoi (Bertolani, 1981), a eutardigrade isolated from a small waste water treatment plant (WWTP) in Poland, was estimated. Our results show that no active individuals survived a 24 h exposure to solutions equal to or higher than 125 mg/L of total ammonia nitrogen (NH3-N + NH4+-N), which, under the conditions in our experiment, was equivalent to 1.17 mg/L of un-ionised ammonia (NH3). The LC50 concentration of total ammonia nitrogen was equal to 52 mg/L (or 0.65 mg/L un-ionised ammonia). Given that the norms for the concentration of ammonia in treated waters leaving WWTPs are usually several times lower than the LC50 for T. ruffoi, this species does not seem to be a good bioindicator candidate for WWTPs. In this paper we also note that various ecotoxicological studies use different methodological approaches and we suggest that a more uniform methodology may aid interspecific comparisons of LC50 values.
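The conversion between total ammonia nitrogen and un-ionised NH3 used in studies like the one above depends on pH and temperature. A common way to compute it is the Emerson et al. (1975) pKa approximation; the pH and temperature below are assumed example conditions, not values taken from the study.

```python
def unionised_fraction(ph, temp_c):
    """Fraction of total ammonia present as un-ionised NH3.

    Uses the Emerson et al. (1975) approximation for the ammonia pKa:
    pKa = 0.09018 + 2729.92 / T(K), then f = 1 / (1 + 10**(pKa - pH)).
    """
    pka = 0.09018 + 2729.92 / (273.15 + temp_c)
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

# Convert a total-ammonia-nitrogen LC50 to un-ionised NH3 at assumed conditions.
lc50_total = 52.0  # mg/L total ammonia nitrogen, from the abstract
nh3 = lc50_total * unionised_fraction(7.5, 20.0)  # pH and T are assumptions
print(round(nh3, 2))  # mg/L un-ionised ammonia
```

The strong pH dependence (roughly a tenfold change in un-ionised fraction per pH unit near neutral pH) is one reason the authors call for a more uniform methodology across ecotoxicological studies.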
Abdominal surgery process modeling framework for simulation using spreadsheets.
Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja
2015-08-01
We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheets simulation. The simulation model developed is comprised of 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, thus programming experience is not essential for replication or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and its easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulation of servicing multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". A patient's movement from one activity to the next is tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
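The spreadsheet mechanics described above (rand() for uncertain activity durations, first-in-first-served sequencing) translate directly into a few lines of code. The sketch below simulates a single activity rather than the paper's 28-element model, and all parameters are invented for illustration.

```python
import random

def simulate(n_patients, mean_service_h, seed=0):
    """First-in-first-served flow of patients through a single activity.

    Service times are uniform random draws, mirroring the spreadsheet's
    rand(); each patient starts when both they and the resource are free,
    mirroring the nested if() logic. Returns the mean wait in hours.
    """
    random.seed(seed)
    free_at = 0.0
    waits = []
    for arrival in range(n_patients):  # one arrival per hour, for simplicity
        start = max(arrival, free_at)  # wait if the resource is still busy
        waits.append(start - arrival)
        free_at = start + random.uniform(0.5, 1.5) * mean_service_h
    return sum(waits) / len(waits)

print(round(simulate(75, 1.0), 2))  # mean wait (hours) over 75 patients
```

The appeal of the spreadsheet version is exactly that this logic lives in cell formulas, so clinicians can inspect and modify it without programming.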
Eckert, Marion; Jones, Tina
2002-06-01
This study aimed to identify the lived experience of patients with implantable cardioverter defibrillators (ICD) and their families. The methodology used was interpretative phenomenology. Unstructured interviews were conducted with three family members and three ICD recipients. Using a methodological approach outlined by van Manen, the participants' transcribed texts were analysed looking for similar concepts and ideas that developed into themes that explicated the meaning of this phenomenon. The themes that emerged were: dependence, which encompassed their perceptions about the life-saving device; the memory of their first defibrillation experience; lifestyle changes, which incorporated modification techniques; lack of control, which highlighted feelings such as fear, anxiety and powerlessness; mind game, which illustrated psychological challenges; and the issue of security, demonstrating how 'being there' and not 'being there' impacted on their everyday lives. The long-term outcomes of living with an ICD are important considerations for all health-care providers. This research highlights the everyday activities of recipients, the lifestyle changes they have made, the emotional significance of the device and the psychological coping strategies that the participants have adopted. The findings of this research will allow health-care professionals to be better prepared to provide education and support for ICD recipients and their families with regard to issues related to insertion of the device during the postinsertion recovery period and for long-term management after hospital discharge.
Defining process design space for monoclonal antibody cell culture.
Abu-Absi, Susan Fugett; Yang, LiYing; Thompson, Patrick; Jiang, Canping; Kandula, Sunitha; Schilling, Bernhard; Shukla, Abhinav A
2010-08-15
The concept of design space has been taking root as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. During mapping of the process design space, the multidimensional combination of operational variables is studied to quantify the impact on process performance in terms of productivity and product quality. An efficient methodology to map the design space for a monoclonal antibody cell culture process is described. A failure modes and effects analysis (FMEA) was used as the basis for the process characterization exercise. This was followed by an integrated study of the inoculum stage of the process which includes progressive shake flask and seed bioreactor steps. The operating conditions for the seed bioreactor were studied in an integrated fashion with the production bioreactor using a two stage design of experiments (DOE) methodology to enable optimization of operating conditions. A two level Resolution IV design was followed by a central composite design (CCD). These experiments enabled identification of the edge of failure and classification of the operational parameters as non-key, key or critical. In addition, the models generated from the data provide further insight into balancing productivity of the cell culture process with product quality considerations. Finally, process and product-related impurity clearance was evaluated by studies linking the upstream process with downstream purification. Production bioreactor parameters that directly influence antibody charge variants and glycosylation in CHO systems were identified.
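The abstract above describes following a two-level Resolution IV screening design with a central composite design (CCD). A CCD in coded units is just factorial corners plus axial and center points, which the sketch below generates; the rotatable alpha choice is a standard convention, not something stated in the abstract.

```python
from itertools import product

def central_composite(k, alpha=None, n_center=4):
    """Coded runs of a central composite design for k factors.

    Factorial corners at +/-1, axial points at +/-alpha on each axis,
    plus replicated center points. A rotatable design uses
    alpha = (2**k) ** 0.25.
    """
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    corners = [list(run) for run in product((-1, 1), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            point = [0.0] * k
            point[i] = a
            axial.append(point)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centers

design = central_composite(3)
print(len(design))  # 8 corners + 6 axial + 4 center points = 18 runs
```

The axial points are what let a CCD estimate the quadratic curvature needed to locate an optimum, which a two-level factorial alone cannot do.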
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehrmann, Henning; Perdue, Robert
2012-07-01
Cementation of radioactive waste is a common technology. The waste is mixed with cement and water and forms a stable, solid block. Physical properties such as compression strength and low leachability depend strongly on the cement recipe. Because the waste-cement mixture has to fulfill special requirements, recipe development is necessary. The Six Sigma™ DMAIC methodology, together with the Design of Experiments (DoE) approach, was employed to optimize the process of recipe development for cementation at the Ling Ao nuclear power plant (NPP) in China. DMAIC offers a structured, systematic, and traceable process for deriving test parameters. The DoE test plans and statistical analysis are efficient in terms of the number of test runs and the benefit gained by obtaining a transfer function. A transfer function enables simulation, which is useful for optimizing the later process and responding to changes. The DoE method was successfully applied to develop a cementation recipe for both evaporator concentrate and resin waste in the plant. The key input parameters were determined and evaluated, and the control of these parameters was included in the design. The applied Six Sigma™ tools can help to organize the thinking during the engineering process. Data are organized and clearly presented. Numerous variables can be narrowed down to the most important ones. The Six Sigma™ tools help to make the thinking and decision process traceable and support data-driven decisions (e.g., the C&E Matrix). But the tools are not a guaranteed path to success: results from scoring tools like the C&E Matrix need close review before use. The DoE is an effective tool for generating test plans. DoE can be used with a small number of test runs, yet gives a valuable result from an engineering perspective in the form of a transfer function. The DoE predictions, however, are only valid within the tested area, so careful selection of input parameters and their limits when setting up a DoE is very important. Extrapolation is not recommended because the results are not reliable outside the tested area. (authors)
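The "transfer function" the abstract refers to is a regression model mapping coded input factors to a response. For an orthogonal 2^2 factorial the least-squares coefficients reduce to simple averages, as sketched below; the factor names and response values are invented for illustration.

```python
def transfer_function(runs, y):
    """Regression coefficients from a coded 2^2 factorial design.

    With orthogonal +/-1 coding the least-squares fit of
    y = b0 + b1*x1 + b2*x2 + b12*x1*x2 reduces to column averages:
    each coefficient is mean(y_i * corresponding coded column).
    """
    n = len(y)
    b0 = sum(y) / n
    b1 = sum(yi * r[0] for r, yi in zip(runs, y)) / n
    b2 = sum(yi * r[1] for r, yi in zip(runs, y)) / n
    b12 = sum(yi * r[0] * r[1] for r, yi in zip(runs, y)) / n
    return b0, b1, b2, b12

# Hypothetical compressive strengths (MPa) for a 2^2 design in, say,
# water/cement ratio (x1) and waste loading (x2); values are invented.
runs = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
strength = [32.0, 24.0, 28.0, 18.0]
b0, b1, b2, b12 = transfer_function(runs, strength)
print(b0, b1, b2, b12)
```

Once fitted, the polynomial can be evaluated at any coded point inside the tested region, which is exactly the simulation use the abstract describes, and exactly why extrapolating it outside that region is unreliable.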
Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays
NASA Technical Reports Server (NTRS)
Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.
1977-01-01
A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with regard to aging behavior in those material classes which are expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. The experiences were reviewed, and results are discussed in detail.
PIA and REWIND: Two New Methodologies for Cross Section Adjustment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmiotti, G.; Salvatores, M.
2017-02-01
This paper presents two new cross section adjustment methodologies intended to cope with the problem of compensations. The first, PIA (Progressive Incremental Adjustment), gives priority to the utilization of experiments of elemental type (those sensitive to a specific cross section), following a definite hierarchy for which type of experiment to use. Once the adjustment is performed, both the new adjusted data and the new covariance matrix are kept. The second methodology is called REWIND (Ranking Experiments by Weighting for Improved Nuclear Data). This proposed approach tries to establish a methodology for ranking experiments by looking at the potential gain they can produce in an adjustment. Practical applications for different adjustments illustrate the results of the two methodologies against the current one and show the potential improvement for reducing uncertainties in target reactors.
Hospital cost control in Norway: a decade's experience with prospective payment.
Crane, T S
1985-01-01
Under Norway's prospective payment system, which was in existence from 1972 to 1980, hospital costs increased 15.8 percent annually, compared with 15.3 percent in the United States. In 1980 the Norwegian national government started paying for all institutional services according to a population-based, morbidity-adjusted formula. Norway's prospective payment system provides important insights into problems of controlling hospital costs despite significant differences, including ownership of medical facilities and payment and spending as a percent of GNP. Yet striking similarities exist. Annual real growth in health expenditures from 1972 to 1980 in Norway was 2.2 percent, compared with 2.4 percent in the United States. In both countries, public demands for cost control were accompanied by demands for more services. And problems of geographic dispersion of new technology and distribution of resources were similar. Norway's experience in the 1970s demonstrates that prospective payment is no panacea. The annual budget process created disincentives to hospitals to control costs. But Norway's changes in 1980 to a population-based methodology suggest a useful approach to achieve a more equitable distribution of resources. This method of payment provides incentives to control variations in both admissions and cost per case. In contrast, the Medicare approach based on Diagnostic Related Groups (DRGs) is limited, and it does not affect variations in admissions and capital costs. Population-based methodologies can be used in adjusting DRG rates to control both problems. In addition, the DRG system only applies to Medicare payments; the Norwegian experience demonstrates that this system may result in significant shifting of costs onto other payors. PMID:3927385
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
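The Wald sequential analysis underlying DOEPOD can be illustrated with a generic sequential probability ratio test (SPRT) on hit/miss inspection outcomes. The hypothesis PODs, error rates, and data below are invented for illustration and are not DOEPOD's actual acceptance criteria.

```python
import math

def sprt(outcomes, alpha=0.05, beta=0.05, p0=0.70, p1=0.90):
    """Wald SPRT on a sequence of hit/miss inspection outcomes.

    Tests H1: POD = p1 against H0: POD = p0, stopping as soon as the
    accumulated log-likelihood ratio crosses either decision boundary.
    Returns 'accept H1', 'accept H0', or 'continue' (need more data).
    """
    upper = math.log((1 - beta) / alpha)   # cross above -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross below -> accept H0
    llr = 0.0
    for hit in outcomes:
        if hit:
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1"
        if llr <= lower:
            return "accept H0"
    return "continue"

print(sprt([True] * 12))  # a run of detections reaches a decision early
```

This captures the property the data book highlights: the number of observations is not fixed in advance, and a decisive run of outcomes terminates the experiment with far fewer trials than a fixed-sample test.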
Safety Issues at the Defense Production Reactors. A Report to the U.S. Department of Energy.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC. Commission on Physical Sciences, Mathematics, and Resources.
This report provides an assessment of safety management, safety review, and safety methodology employed by the Department of Energy (DOE) and private contractors. Chapter 1, "The DOE Safety Framework," examines safety objectives for production reactors and processes to implement the objectives. Chapter 2, "Technical Issues,"…
Derivatization of peptides as quaternary ammonium salts for sensitive detection by ESI-MS.
Cydzik, Marzena; Rudowska, Magdalena; Stefanowicz, Piotr; Szewczuk, Zbigniew
2011-06-01
A series of model peptides in the form of quaternary ammonium salts at the N-terminus was efficiently prepared by the solid-phase synthesis. Tandem mass spectrometric analysis of the peptide quaternary ammonium derivatives was shown to provide sequence confirmation and enhanced detection. We designed the 2-(1,4-diazabicyclo[2.2.2] octylammonium)acetyl quaternary ammonium group which does not suffer from neutral losses during MS/MS experiments. The presented quaternization of 1,4-diazabicyclo[2.2.2]octane (DABCO) by iodoacetylated peptides is relatively easy and compatible with standard solid-phase peptide synthesis. This methodology offers a novel sensitive approach to analyze peptides and other compounds. Copyright © 2011 European Peptide Society and John Wiley & Sons, Ltd.
Acoustics outreach program for the deaf
NASA Astrophysics Data System (ADS)
Vongsawad, Cameron T.; Berardi, Mark L.; Whiting, Jennifer K.; Lawler, M. Jeannette; Gee, Kent L.; Neilsen, Tracianne B.
2016-03-01
The Hear and See methodology has often been used as a means of enhancing pedagogy by focusing on the two strongest learning senses, but this naturally does not apply to deaf or hard of hearing students. Because deaf students' prior nonaural experiences with sound will vary significantly from those of students with typical hearing, different methods must be used to build understanding. However, the sensory-focused pedagogical principle can be applied in a different way for the Deaf by utilizing the senses of touch and sight, called here the ``See and Feel'' method. This presentation will provide several examples of how acoustics demonstrations have been adapted to create an outreach program for a group of junior high students from a school for the Deaf and discuss challenges encountered.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
... Methodology for Boiling Water Reactors, June 2011. To support use of Topical Report ANP-10307PA, Revision 0... the NRC's E-Filing system does not support unlisted software, and the NRC Meta System Help Desk will... Water Reactors with AREVA Topical Report ANP-10307PA, Revision 0, ``AREVA MCPR Safety Limit Methodology...
ERIC Educational Resources Information Center
Bath, Caroline
2009-01-01
This paper explores how ethnographic and action research methodologies can be justifiably combined to create a new methodological approach in educational research. It draws on existing examples in both educational research and development studies that have discussed the use of ethnography and action research in specific projects. Interpretations…
What Does Global Migration Network Say about Recent Changes in the World System Structure?
ERIC Educational Resources Information Center
Zinkina, Julia; Korotayev, Andrey
2014-01-01
Purpose: The aim of this paper is to investigate whether the structure of the international migration system has remained stable through the recent turbulent changes in the world system. Design/methodology/approach: The methodology draws on the social network analysis framework--but with some noteworthy limitations stipulated by the specifics of…
Sovány, Tamás; Tislér, Zsófia; Kristó, Katalin; Kelemen, András; Regdon, Géza
2016-09-01
The application of Quality by Design principles is one of the key issues in recent pharmaceutical development. In the past decade much knowledge has been collected about the practical realization of the concept, but many questions remain unanswered. The key requirement of the concept is the mathematical description of the effect of the critical factors and their interactions on the critical quality attributes (CQAs) of the product. The process design space (PDS) is usually determined by the use of design of experiments (DoE) based response surface methodologies (RSM), but inaccuracies in the applied polynomial models often result in over- or underestimation of the real trends and changes, making the calculations uncertain, especially in the edge regions of the PDS. Completing RSM with artificial neural network (ANN) based models is therefore a commonly used method to reduce the uncertainties. Nevertheless, since individual studies focus on the use of a given DoE, there is a lack of comparative studies on different experimental layouts. Therefore, the aim of the present study was to investigate the effect of different DoE layouts (2 level full factorial, Central Composite, Box-Behnken, 3 level fractional and 3 level full factorial design) on model predictability and to compare model sensitivities according to the organization of the experimental data set. It was revealed that the size of the design space could differ by more than 40% when calculated with different polynomial models, which was associated with a considerable shift in its position when higher level layouts were applied. The shift was more considerable when the calculation was based on RSM. Model predictability was also better with ANN based models. Nevertheless, both modelling methods exhibit considerable sensitivity to the organization of the experimental data set, and the use of design layouts in which the extreme values of the factors are better represented is recommended. Copyright © 2016 Elsevier B.V. All rights reserved.
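The practical difference between the layouts compared above starts with their run counts. The sketch below tabulates the standard formulas for k factors (a Box-Behnken design uses 4 runs per factor pair plus center points); the number of center points is an assumed convention.

```python
from itertools import combinations

def runs_count(k, n_center=3):
    """Nominal run counts (excluding replicates) for common DoE layouts."""
    n_pairs = len(list(combinations(range(k), 2)))
    return {
        "2-level full factorial": 2 ** k,
        "3-level full factorial": 3 ** k,
        "Box-Behnken": 4 * n_pairs + n_center,
        "central composite": 2 ** k + 2 * k + n_center,
    }

print(runs_count(3))  # run budget per layout for three factors
```

For three factors the Box-Behnken and central composite layouts cost roughly half a 3-level full factorial, which is why the choice of layout, and how well it covers the extremes of each factor, matters so much for the resulting design space.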
A comparative reliability analysis of free-piston Stirling machines
NASA Astrophysics Data System (ADS)
Schreiber, Jeffrey G.
2001-02-01
A free-piston Stirling power convertor is being developed for use in an advanced radioisotope power system to provide electric power for NASA deep space missions. These missions are typically long lived, lasting for up to 14 years. The Department of Energy (DOE) is responsible for providing the radioisotope power system for the NASA missions, and has managed the development of the free-piston power convertor for this application. The NASA Glenn Research Center has been involved in the development of Stirling power conversion technology for over 25 years and is currently providing support to DOE. Due to the nature of the potential missions, long life and high reliability are important features for the power system. Substantial resources have been spent on the development of long life Stirling cryocoolers for space applications. As a very general statement, free-piston Stirling power convertors have many features in common with free-piston Stirling cryocoolers, however there are also significant differences. For example, designs exist for both power convertors and cryocoolers that use the flexure bearing support system to provide noncontacting operation of the close-clearance moving parts. This technology and the operating experience derived from one application may be readily applied to the other application. This similarity does not pertain in the case of outgassing and contamination. In the cryocooler, the contaminants normally condense in the critical heat exchangers and foul the performance. In the Stirling power convertor just the opposite is true as contaminants condense on non-critical surfaces. A methodology was recently published that provides a relative comparison of reliability, and is applicable to systems. The methodology has been applied to compare the reliability of a Stirling cryocooler relative to that of a free-piston Stirling power convertor. 
The reliability analysis indicates that the power convertor should be able to achieve superior reliability compared to the cryocooler.
Maddineni, Sindhuri; Battu, Sunil Kumar; Morott, Joe; Majumdar, Soumyajit; Repka, Michael A.
2014-01-01
The objective of the present study was to develop techniques for an abuse-deterrent (AD) platform utilizing a hot-melt extrusion (HME) process. Formulation optimization was accomplished by utilizing a Box-Behnken design of experiments to determine the effect of three formulation factors, PolyOx™ WSR301, Benecel™ K15M, and Carbopol 71G, each studied at three levels, on the tamper-resistant (TR) attributes of the produced melt-extruded pellets. A response surface methodology was utilized to identify the optimized formulation. Lidocaine hydrochloride was used as a model drug, and suitable formulation ingredients were employed as carrier matrices and processing aids. All of the formulations were evaluated for TR attributes such as particle size post-milling, gelling, and percentage of drug extracted in water and alcohol. All of the DOE formulations demonstrated sufficient hardness and elasticity and could not be reduced to fine particles (<150 µm), a desirable feature to prevent snorting. In addition, all of the formulations exhibited good gelling tendency in water with minimal extraction of drug in the aqueous medium. Moreover, Benecel™ K15M in combination with PolyOx™ WSR301 could be utilized to produce pellets with TR potential. HME has been demonstrated to be a viable technique with the potential to develop novel abuse-deterrent formulations. PMID:24433429
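For readers unfamiliar with the Box-Behnken layout used in this study, it places runs at the midpoints of the design-space edges (each pair of factors at ±1 with all remaining factors at 0) plus center points, avoiding the extreme corners. A minimal sketch in coded units, not tied to the paper's specific formulation factors:

```python
import itertools
import numpy as np

def box_behnken(k, n_center=1):
    """Box-Behnken design in coded units: every pair of factors at +/-1
    with all remaining factors held at 0, plus n_center center runs."""
    runs = []
    for i, j in itertools.combinations(range(k), 2):
        for a, b in itertools.product([-1.0, 1.0], repeat=2):
            point = [0.0] * k
            point[i], point[j] = a, b
            runs.append(point)
    runs.extend([[0.0] * k] * n_center)
    return np.array(runs)

# Three factors at three levels, as in the study: 12 edge runs + 1 center.
design = box_behnken(3)
print(design.shape)  # (13, 3)
```

Because no run sits at a corner, the design never asks for all three excipients at their extreme levels simultaneously, which is often a practical advantage in formulation work.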
Reference Model 6 (RM6): Oscillating Wave Energy Converter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Diana L; Smith, Chris; Jenne, Dale Scott
This report is an addendum to SAND2013-9040: Methodology for Design and Economic Analysis of Marine Energy Conversion (MEC) Technologies. This report describes an Oscillating Water Column Wave Energy Converter reference model design in a complementary manner to Reference Models 1-4 contained in the above report. In this report, a conceptual design for an Oscillating Water Column Wave Energy Converter (WEC) device appropriate for the modeled reference resource site was identified, and a detailed backward bent duct buoy (BBDB) device design was developed using a combination of numerical modeling tools and scaled physical models. Our team used the methodology in SAND2013-9040 for the economic analysis that included costs for designing, manufacturing, deploying, and operating commercial-scale MEC arrays, up to 100 devices. The methodology was applied to identify key cost drivers and to estimate levelized cost of energy (LCOE) for this RM6 Oscillating Water Column device in dollars per kilowatt-hour ($/kWh). Although many costs were difficult to estimate at this time due to the lack of operational experience, the main contribution of this work was to disseminate a detailed set of methodologies and models that allow for an initial cost analysis of this emerging technology. This project is sponsored by the U.S. Department of Energy's (DOE) Wind and Water Power Technologies Program Office (WWPTO), within the Office of Energy Efficiency & Renewable Energy (EERE). Sandia National Laboratories, the lead in this effort, collaborated with partners from National Laboratories, industry, and universities to design and test this reference model.
The ability of winter grazing to reduce wildfire size, intensity ...
A recent study by Davies et al. sought to test whether winter grazing could reduce wildfire size, fire behavior metrics, and fire-induced plant mortality in shrub-grasslands. The authors concluded that ungrazed rangelands may experience more fire-induced mortality of native perennial bunchgrasses. The authors also presented several statements regarding the benefits of winter grazing on post-fire plant community responses. However, this commentary will show that the study by Davies et al. has underlying methodological flaws, lacks data necessary to support their conclusions, and does not provide an accurate discussion of the effect of grazing on rangeland ecosystems. Importantly, Davies et al. presented no data on the post-fire mortality of the perennial bunchgrasses or on the changes in plant community composition following their experimental fires. Rather, Davies et al. inferred these conclusions from their observed fire behavior metrics of maximum temperature and a term described as the "heat load". However, neither metric is appropriate for elucidating the heat flux impacts on plants. This lack of post-fire data, several methodological flaws, and the use of inadequate metrics describing heat cast doubt on the authors' ability to support their stated conclusions. This commentary highlights these scientific shortcomings in a forthcoming paper by Davies et al. in the International Journal of Wildland Fire.
NASA Technical Reports Server (NTRS)
Hoppa, Mary Ann; Wilson, Larry W.
1994-01-01
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.
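The order-dependence this abstract describes is visible even in a toy deterministic model. Assuming, hypothetically, that each remaining fault fails at its own constant rate, the expected gap between failures is the reciprocal of the summed rates of the faults still present, so the sequence of gaps a reliability model is fitted to depends on which fault is removed first:

```python
def expected_gaps(order, rates):
    """Expected inter-failure times along one debugging path: the program
    fails at the summed rate of remaining faults; `order` fixes which
    fault is removed after each failure."""
    remaining = list(order)
    gaps = []
    while remaining:
        gaps.append(1.0 / sum(rates[i] for i in remaining))
        remaining.pop(0)
    return gaps

rates = [5.0, 1.0, 0.2]  # hypothetical per-fault failure rates

# Removing the high-rate fault first shows strong apparent reliability
# growth; removing it last shows almost none, from the same fault set.
print(expected_gaps([0, 1, 2], rates))  # gaps grow: ~0.16, ~0.83, 5.0
print(expected_gaps([2, 1, 0], rates))  # gaps barely move: ~0.16, ~0.17, 0.2
```

A model fitted to the first path would predict far higher residual reliability than one fitted to the second, echoing the paper's observation that predictive performance varies greatly along different debugging paths.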
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... not listed on the Web site, but should note that the NRC's E-Filing system does not support unlisted... (COLR), to update the methodology reference list to support the core design with the new AREVA fuel... methodologies listed in Technical Specification 5.7.1.5 has no impact on any plant configuration or system...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobi, Rober
2007-03-28
This Topical Report (#6 of 9) consists of the figures 3.6-13 to (and including) 3.6-18 (and appropriate figure captions) that accompany the Final Technical Progress Report entitled: “Innovative Methodology for Detection of Fracture-Controlled Sweet Spots in the Northern Appalachian Basin” for DOE/NETL Award DE-AC26-00NT40698.
ERIC Educational Resources Information Center
Bierema, Andrea M.-K.; Schwartz, Renee S.; Gill, Sharon A.
2017-01-01
Recent calls for reform in education recommend science curricula to be based on central ideas instead of a larger number of topics and for alignment between current scientific research and curricula. Because alignment is rarely studied, especially for central ideas, we developed a methodology to discover the extent of alignment between primary…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-26
... excludes (1) polyethylene bags that are not printed with logos or store names and that are closeable with... comparison methodology to TCI's targeted sales and the average-to-average comparison methodology to TCI's non... average-to-average comparison method does not account for such price differences and results in the...
de Almeida, Daniela Tonizza; de Oliveira Echternacht, Eliza Helena
2012-01-01
The study investigated, through a literature review, how the research community has addressed the difficulties experienced by teams facing new prescriptions for the treatment of alcohol and drug users in Centers of Psycho-social Attention, and discusses the relevance of the conceptual and methodological references of Ergonomics for understanding and transforming work situations. Such studies tend to offer an analysis that does not account for the variability present in both the workers and the work organization, and prioritize the analysis of final results, disqualifying local inventions arising from tasks imposed by policy guidelines. It is expected that Ergonomics, considering the diversity of training, learning and experience, contributes to the implementation of means to promote the continued development of competences that can meet the demands of production, and expands knowledge about the problems experienced and the possibilities of overcoming them.
Do we need methodological theory to do qualitative research?
Avis, Mark
2003-09-01
Positivism is frequently used to stand for the epistemological assumption that empirical science based on principles of verificationism, objectivity, and reproducibility is the foundation of all genuine knowledge. Qualitative researchers sometimes feel obliged to provide methodological alternatives to positivism that recognize their different ethical, ontological, and epistemological commitments and have provided three theories: phenomenology, grounded theory, and ethnography. The author argues that positivism was a doomed attempt to define empirical foundations for knowledge through a rigorous separation of theory and evidence; offers a pragmatic, coherent view of knowledge; and suggests that rigorous, rational empirical investigation does not need methodological theory. Therefore, qualitative methodological theory is unnecessary and counterproductive because it hinders critical reflection on the relation between methodological theory and empirical evidence.
ERIC Educational Resources Information Center
Reynolds-Keefer, Laura
2010-01-01
This study evaluates the impact of teaching basic qualitative methodology to preservice teachers enrolled in an educational psychology course in the quality of observation journals. Preservice teachers enrolled in an educational psychology course requiring 45 hr of field experience were given qualitative methodological training as a part of the…
Assessing the Fire Risk for a Historic Hangar
NASA Technical Reports Server (NTRS)
Datta, Koushik; Morrison, Richard S.
2010-01-01
NASA Ames Research Center (ARC) is evaluating options of reuse of its historic Hangar 1. As a part of this evaluation, a qualitative fire risk assessment study was performed to evaluate the potential threat of combustion of the historic hangar. The study focused on the fire risk trade-off of either installing or not installing a Special Hazard Fire Suppression System in the Hangar 1 deck areas. The assessment methodology was useful in discussing the important issues among various groups within the Center. Once the methodology was deemed acceptable, the results were assessed. The results showed that the risk remained in the same risk category, whether Hangar 1 does or does not have a Special Hazard Fire Suppression System. Note that the methodology assessed the risk to Hangar 1 and not the risk to an aircraft in the hangar. If one had a high value aircraft, the aircraft risk analysis could potentially show a different result. The assessed risk results were then communicated to management and other stakeholders.
Review of Natural Phenomena Hazard (NPH) Assessments for the DOE Hanford Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snow, Robert L.; Ross, Steven B.
2011-09-15
The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the DOE's Hanford Site, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. This review is an update and expansion of the September 2010 review of PNNL-19751, Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic).
Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M.K.; Buelt, J.L.; Stottlemyre, J.A.
1991-02-01
Because of the complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site and access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
NASA Technical Reports Server (NTRS)
Miller, James; Leggett, Jay; Kramer-White, Julie
2008-01-01
A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best-practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.
Platt, Tyson L; Zachar, Peter; Ray, Glen E; Lobello, Steven G; Underhill, Andrea T
2007-04-01
Studies have found that Wechsler scale administration and scoring proficiency is not easily attained during graduate training. These findings may be related to methodological issues. Using a single-group repeated measures design, this study documents statistically significant, though modest, error reduction on the WAIS-III and WISC-III during a graduate course in assessment. The study design does not permit the isolation of training factors related to error reduction, or assessment of whether error reduction is a function of mere practice. However, the results do indicate that previous study findings of no or inconsistent improvement in scoring proficiency may have been the result of methodological factors. Implications for teaching individual intelligence testing and further research are discussed.
Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments
2016-03-24
NUWC-NPT Technical Report 12,186, 24 March 2016: Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments. Author: Anthony B... Report sections include Measurements and Experimental Apparatus, and Sample Preparation.
WIPP waste characterization program sampling and analysis guidance manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-01-01
The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PECH, S.H.
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
Andrew Hill; Jay Beaman; Joseph O' Leary
2001-01-01
This paper is about estimating a salience scale for trip reporting. The measurement project began as a way of establishing the effects of methodological changes between 1994 and 1997 in the Canadian Travel Survey, a survey that Canada uses to study the travel of its residents. There were several changes in methodology that could be expected to influence how...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-01
This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)
ERIC Educational Resources Information Center
MacBride, Owen
This survey of studies of medical school costs was made in order to evaluate and compare the methodologies and findings of those studies. The survey covered studies of one or more medical schools that either produced figures for average annual per-student cost of education and/or discussed the methodologies and problems involved in producing such…
Maintaining space shuttle safety within an environment of change
NASA Astrophysics Data System (ADS)
Greenfield, Michael A.
1999-09-01
In the 10 years since the Challenger accident, NASA has developed a set of stable and capable processes to prepare the Space Shuttle for safe launch and return. Capitalizing on the extensive experience gained from a string of over 50 successful flights, NASA today is changing the way it does business in an effort to reduce cost. A single Shuttle Flight Operations Contractor (SFOC) has been chosen to operate the Shuttle. The Government role will change from direct "oversight" to "insight" gained through understanding and measuring the contractor's processes. This paper describes the program management changes underway and the NASA Safety and Mission Assurance (S&MA) organization's philosophy, role, and methodology for pursuing this new approach. It describes how audit and surveillance will replace direct oversight and how meaningful performance metrics will be implemented.
Does Bertrand's rule apply to macronutrients?
Raubenheimer, D; Lee, K.P; Simpson, S.J
2005-01-01
It has been known for over a century that the dose–response curve for many micronutrients is non-monotonic, having an initial stage of increasing benefits with increased intake, followed by increasing costs as excesses become toxic. This phenomenon, termed Bertrand's rule, is widely assumed not to apply to caloric macronutrients. To date this assumption has been safe, owing to the considerable methodological challenges involved in coaxing animals to over-ingest macronutrients in a way that enables the effects of specific food components to be isolated. Here we report an experiment which overcomes these difficulties, to test whether the second phase (incurring costs with excessive intake) applies to carbohydrate intake by the generalist-feeding caterpillar Spodoptera littoralis. The results showed that excess carbohydrate intake caused increased mortality, thus extending Bertrand's rule to macronutrients. PMID:16243690
Heusser, P
2000-03-01
The study by Sommer et al. recently reported in Complementary Therapies in Medicine has been heavily criticised in Switzerland since its original publication. Its major problems are an inadequate reflection of real practice, an inadequate study design relative to the central research objective, questionable value of the applied instrument and procedure for health assessment, methodological and statistical problems, and failure to consider literature relevant to the topic. For these reasons, this experimental study does not allow an answer to its central questions as to costs and effectiveness of complementary medicine made available within Switzerland's mandatory basic health insurance provisions. We propose more practice-related, non-experimental prospective study designs to realistically answer these questions.
Women's experience of being well during peri-menopause: a phenomenological study.
Mackey, Sandra
2007-01-01
A research study was conducted to investigate women's experience of being well during the peri-menopause because much of the research investigating the experience of menopause has concentrated on its problematic and pathological aspects. For the majority of western women the reproductive transition of menopause is not problematic, however, the nature of the unproblematic or healthy menopause has not been investigated. The aim in conducting this research was to enhance understanding of the experience of being healthy or well during menopause. In so doing, recognition of the diversity of menopausal experiences may be strengthened. The research was approached from the disciplinary perspective of nursing, and was grounded in the methodology of Heideggerian interpretive phenomenology. Data was collected via unstructured, in-depth interviews and analysis was conducted utilising the repetitive and circular process developed by van Manen. The phenomenon of being healthy or well during menopause was expressed in the form of three major themes. These were the continuity of menstrual experience, the embodiment of menopausal symptoms, and the containment of menopause and menopausal symptoms. The experience of health and well being during menopause can accommodate the experience of symptoms when the experience of symptoms does not disrupt embodied existence and the continuity of menstrual patterns. Menopause is widely studied, yet only partly understood. While much is now known about the nature and influence of ovarian hormones, the physiology of menopausal changes, and the treatment of menopausal symptoms, little is known and understood about the experience of menopause. Research that has investigated the experience of menopause has largely focused on the problematic experiences. It is now known that the majority of women, regardless of cultural background, do not experience menopause in a problematic way (Utian 1977; Porter et al. 1996). 
However, the nature of such experience has not been revealed and it is not known whether this experience of a non-problematic menopause constitutes wellness at menopause. The research reported here aimed to achieve greater understanding of the nature of this experience of menopause, through an investigation of women's everyday experience of wellness and wellbeing during menopause. Wellness, by its very nature, is an elusive state. It is elusive because it is a non-problematic state, thus difficult to mark out by measurement, events or experiences. In wellness, nothing 'stands out' to notice, observe or disrupt as it does in illness (van Manen 1990). Nevertheless, the term wellness describes a particular and recognisable state of being which, in this study, is revealed through interpretative analysis of post-menopausal women's descriptions of their experiences.
Influence of communal and private folklore on bringing meaning to the experience of persistent pain.
Hendricks, Joyce Marie
2015-11-01
To provide an overview of the relevance and strengths of using the literary folkloristic methodology to explore the ways in which people with persistent pain relate to and make sense of their experiences through narrative accounts. Storytelling is a conversation with a purpose. The reciprocal bond between researcher and storyteller enables the examination of the meaning of experiences. Life narratives, in the context of wider traditional and communal folklore, can be analysed to discover how people make sense of their circumstances. This paper draws from the experience of the author, who has previously used this narrative approach. It is a reflection of how the approach may be used to understand those experiencing persistent pain without a consensual diagnosis. Using an integrative method, peer-reviewed research and discussion papers published between January 1990 and December 2014 and listed in the CINAHL, Science Direct, PsycINFO and Google Scholar databases were reviewed. In addition, texts that addressed research methodologies such as literary folkloristic methodology and Marxist literary theory were used. The unique role that nurses play in managing pain is couched in the historical and cultural context of nursing. Literary folkloristic methodology offers an opportunity to gain a better understanding and appreciation of how the experience of pain is constructed and to connect with sufferers. Literary folkloristic methodology reveals that those with persistent pain are often rendered powerless to live their lives. Increasing awareness of how this experience is constructed and maintained also allows an understanding of societal influences on nursing practice. Nurse researchers try to understand experiences in light of specific situations. Literary folkloristic methodology can enable them to understand the inter-relationship between people in persistent pain and how they construct their experiences.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
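The statistical structure of PFA, in which analytical failure models are driven by uncertain parameters to yield a failure-probability estimate, can be caricatured with a Monte Carlo loop. Everything below is a hedged toy, not NASA's actual models: the life equation, distributions, and numbers are invented purely for illustration.

```python
import math
import random

random.seed(0)

def fatigue_life_cycles(stress, coeff):
    """Toy S-N style life model: life falls off as stress cubed."""
    return coeff / stress ** 3

N_TRIALS = 100_000
MISSION_CYCLES = 1.0e4  # hypothetical required life for one failure mode

failures = 0
for _ in range(N_TRIALS):
    # Uncertain inputs: operating stress and a life coefficient whose
    # scatter stands in for test-to-test variability and modeling error.
    stress = random.gauss(1.0, 0.1)                      # normalized stress
    coeff = random.lognormvariate(math.log(2.0e4), 0.5)  # life coefficient
    if fatigue_life_cycles(stress, coeff) < MISSION_CYCLES:
        failures += 1

p_fail = failures / N_TRIALS
print(f"estimated failure probability: {p_fail:.3f}")
```

In the real methodology these prior distributions would additionally be updated with test and flight experience; the loop above only shows the forward propagation step.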
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
ERIC Educational Resources Information Center
Rodriguez, Maria A.; Niaz, Mansoor
2004-01-01
The objectives of this study are: (1) evaluation of the methodology used in the recent search for particles with fractional electrical charge (quarks) and its implications for understanding the scientific research methodology of Millikan; (2) evaluation of 43 general physics textbooks and 11 laboratory manuals, with respect to the oil drop experiment,…
ERIC Educational Resources Information Center
Gaven, Patricia; Williams, R. David
An experiment is proposed which will study the advantages of satellite technology as a means for the standardization of teaching methodology in an attempt to socially integrate the rural Alaskan native. With "Man: A Course of Study" as the curricular base of the experiment, there will be a Library Experiment Program for Adults using…
ERIC Educational Resources Information Center
Socha, Teresa; Potter, Tom; Potter, Stephanie; Jickling, Bob
2016-01-01
This paper shares our experiences using pinhole photography with adolescents as both a pedagogical tool to support and deepen adolescent experiences in wild nature, and as a visual methodological tool to elucidate their experiences. Reflecting on a journey that explored the nature-based experiences of two adolescents on a family canoe trip in…
Quantitative Comparison of Tandem Mass Spectra Obtained on Various Instruments
NASA Astrophysics Data System (ADS)
Bazsó, Fanni Laura; Ozohanics, Oliver; Schlosser, Gitta; Ludányi, Krisztina; Vékey, Károly; Drahos, László
2016-08-01
The similarity between two tandem mass spectra, which were measured on different instruments, was compared quantitatively using the similarity index (SI), defined as the dot product of the square root of peak intensities in the respective spectra. This function was found to be useful for comparing energy-dependent tandem mass spectra obtained on various instruments. Spectral comparisons show the similarity index in a 2D "heat map", indicating which collision energy combinations result in similar spectra, and how good this agreement is. The results and methodology can be used in the pharma industry to design experiments and equipment well suited for good reproducibility. We suggest that to get good long-term reproducibility, it is best to adjust the collision energy to yield a spectrum very similar to a reference spectrum. It is likely to yield better results than using the same tuning file, which, for example, does not take into account that contamination of the ion source due to extended use may influence instrument tuning. The methodology may be used to characterize energy dependence on various instrument types, to optimize instrumentation, and to study the influence or correlation between various experimental parameters.
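The similarity index can be sketched in a few lines of Python. Representing a spectrum as a dict of binned m/z values mapped to intensities, and normalizing the dot product so identical spectra score 1.0, are assumptions of this illustration rather than the paper's exact definition:

```python
import numpy as np

def similarity_index(spec_a, spec_b):
    # Dot product of square-root peak intensities, normalized so that
    # identical spectra score 1.0; spectra are {binned m/z: intensity} dicts.
    mz = sorted(set(spec_a) | set(spec_b))
    a = np.sqrt([spec_a.get(m, 0.0) for m in mz])
    b = np.sqrt([spec_b.get(m, 0.0) for m in mz])
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical spectra of one precursor measured on two instruments
inst1 = {91: 100.0, 119: 45.0, 147: 20.0}
inst2 = {91: 80.0, 119: 60.0, 147: 10.0}
si = similarity_index(inst1, inst2)
print(f"SI = {si:.3f}")
```

Scanning such SI values over pairs of collision-energy settings is what produces the 2D heat map described in the abstract.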
Natural fracture systems studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lorenz, J.C.; Warpinski, N.R.
The objectives of this program are (1) to develop a basinal-analysis methodology for natural fracture exploration and exploitation, and (2) to determine the important characteristics of natural fracture systems for use in completion, stimulation, and production operations. Natural-fracture basinal analysis begins with studies of fractures in outcrop, core, and logs in order to determine the type of fracturing and the relationship of the fractures to the lithologic environment. Of particular interest are the regional fracture systems that are pervasive in western US tight sand basins. A methodology for applying this analysis is being developed, with the goal of providing a structure for rationally characterizing natural fracture systems basin-wide. Such basin-wide characterizations can then be expanded and supplemented locally, at sites where production may be favorable. Initial application of this analysis is to the Piceance basin, where there is a wealth of data from the Multiwell Experiment (MWX), DOE cooperative wells, and other basin studies conducted by Sandia, CER Corporation, and the USGS (Lorenz and Finley, 1989; Lorenz et al., 1989; and Spencer and Keighin, 1984). Such a basinal approach has been capable of explaining the fracture characteristics found throughout the southern part of the Piceance basin and along the Grand Hogback.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This document contains a listing, description, and selected references for documented human radiation experiments sponsored, supported, or performed by the US Department of Energy (DOE) or its predecessors, including the US Energy Research and Development Administration (ERDA), the US Atomic Energy Commission (AEC), the Manhattan Engineer District (MED), and the Office of Scientific Research and Development (OSRD). The list represents work completed by DOE's Office of Human Radiation Experiments (OHRE) through June 1995. The experiment list is available on the Internet via a Home Page on the World Wide Web (http://www.ohre.doe.gov). The Home Page also includes the full text of Human Radiation Experiments: The Department of Energy Roadmap to the Story and the Records (DOE/EH-0445), published in February 1995, to which this publication is a supplement. This list includes experiments released at Secretary O'Leary's June 1994 press conference, as well as additional studies identified during the 12 months that followed. Cross-references are provided for experiments originally released at the press conference; for experiments released as part of The DOE Roadmap; and for experiments published in the 1986 congressional report entitled American Nuclear Guinea Pigs: Three Decades of Radiation Experiments on US Citizens. An appendix of radiation terms is also provided.
Transforming the patient care environment with Lean Six Sigma and realistic evaluation.
Black, Jason
2009-01-01
Lean Six Sigma (LSS) is a structured methodology for transforming processes, but it does not fully consider the complex social interactions that cause processes to form in hospital organizations. By combining LSS implementations with the concept of Realistic Evaluation, a methodology that promotes change by assessing and considering the individual characteristics of an organization's social environment, successful and sustainable process improvement is more likely.
Assessment of methodologies for analysis of the dungeness B accidental aircraft crash risk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaChance, Jeffrey L.; Hansen, Clifford W.
2010-09-01
The Health and Safety Executive (HSE) has requested Sandia National Laboratories (SNL) to review the aircraft crash methodology for nuclear facilities that are being used in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additional Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is currently low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK and the three alternative methodologies (IAEA, NRC, and DOE), this assessment concludes that use of any of these alternative methodologies would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.
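The airfield-related crash-frequency methodologies being compared broadly share a multiplicative structure. A hedged sketch of that four-factor form in Python follows; the numbers are hypothetical and the function illustrates the general structure, not any one agency's prescribed model:

```python
def crash_frequency(n_ops, p_crash, f_xy, a_eff):
    """Four-factor crash-frequency structure:
    expected hits/yr = (operations/yr) x (crashes per operation)
                       x (crash-location probability per square mile)
                       x (effective facility area, square miles)."""
    return n_ops * p_crash * f_xy * a_eff

# Hypothetical inputs for one aircraft category near a single runway
freq = crash_frequency(n_ops=50_000, p_crash=1.0e-7, f_xy=0.2, a_eff=0.01)
print(f"{freq:.1e} crashes per year")
```

A full assessment would sum such terms over aircraft categories and flight phases, which is where the methodologies' data and factor definitions diverge.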
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2014-03-01
A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis, the unity of formal logic and of rational dialectics; (b) it does not contain the correct definitions of ``movement,'' ``direction,'' and ``vector''; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of ``physical vector,'' and, therefore, it has no natural-scientific meaning; (d) operations on ``physical vectors'' and the vector calculus propositions relating to the ``physical vectors'' are contrary to formal logic.
Journalists and substance use: A systematic literature review.
MacDonald, Jasmine B; Saliba, Anthony J; Hodgins, Gene
2016-01-01
Journalists' exposure to potentially traumatic events (PTEs), high levels of job stress, and anecdotal reports within the industry seem to suggest that journalists are at greater risk than the general population to experience substance use disorders. The present systematic literature review (SLR) aims to provide a concise, comprehensive, and systematic review of the quantitative literature relating to journalists' experience of substance use. The systematic review method adopted within the present study was based on that prescribed by Fink in the 2010 book, Conducting systematic literature reviews: From the internet to paper, 3rd ed., which contains three main elements: sampling the literature, screening the literature, and extracting data. Alcohol consumption is the most widely studied substance in journalist samples and is discussed in relation to quantity, level of risk, and potential alcoholism. The review also considers journalists' use of substances, including cigarettes, cannabis, and other illicit substances. In particular, comparisons are made between journalistic roles and gender. The research is piecemeal in nature, in that more recent research does not build upon the research that has come before it. Much of what has been reported does not reflect the progress that has taken place in recent years within the alcohol consumption and substance use field in terms of theory, assessment, scale development, practice, and interventions with those who use or are addicted to various substances. This SLR raises a number of methodological and theoretical issues to be explored and addressed in future research.
Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine
2013-08-06
We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks using design of experiment (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, which is based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default setting, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized by using CCD for further improvement. The approach combining optimal parameter setting and the threshold method improved the reliability index about 9.5 times for a standards mixture and 14.5 times for human urine data, which required a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
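The Plackett-Burman screening step can be illustrated with a hand-rolled 8-run design in Python, avoiding any dependence on a DoE package. The response values and the use of seven generic factors are hypothetical; a real XCMS tuning run would substitute measured reliability-index values:

```python
import numpy as np

def plackett_burman_8():
    # 8-run Plackett-Burman design for up to 7 two-level factors,
    # built from the standard cyclic generator plus an all-minus run.
    gen = np.array([1, 1, 1, -1, 1, -1, -1])
    rows = [np.roll(gen, i) for i in range(7)]
    rows.append(-np.ones(7, dtype=int))
    return np.array(rows, dtype=int)

X = plackett_burman_8()

# Hypothetical responses, e.g. a reliability index measured at each run
y = np.array([82.0, 75.0, 88.0, 60.0, 90.0, 58.0, 55.0, 50.0])

# Main effect of each factor: mean response at +1 minus mean at -1
effects = (X.T @ y) / 4.0   # each column has four +1s and four -1s
for i, e in enumerate(effects, start=1):
    print(f"factor {i}: effect = {e:+.1f}")
```

Factors with the largest absolute effects would then be carried into the central composite design for response-surface optimization.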
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2011-01-01
The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss or signal amplitude testing, where signal amplitudes are reduced to Hit-Miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are sequentially executed in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture critical inspection are established.
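The binomial estimation underlying a 90/95 POD demonstration can be sketched with an exact Clopper-Pearson lower confidence bound in Python; the bisection approach and the example hit counts are choices of this illustration, not the DOEPOD procedure itself:

```python
import math

def binom_tail_ge(n, x, p):
    # P(X >= x) for X ~ Binomial(n, p)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(x, n + 1))

def pod_lower_bound(n, x, conf=0.95):
    # Clopper-Pearson lower confidence bound on POD given x hits in n trials,
    # found by bisection on P(X >= x | p) = 1 - conf.
    if x == 0:
        return 0.0
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if binom_tail_ge(n, x, mid) < 1 - conf:
            lo = mid
        else:
            hi = mid
    return lo

# Classic demonstration: 29 hits in 29 trials at one flaw size
# yields a lower bound of about 0.902, i.e. 90/95 POD is met.
print(round(pod_lower_bound(29, 29), 4))
```

Any miss forces additional trials: for example, 29 hits in 30 trials no longer meets the 0.90 lower bound at 95% confidence.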
Olonisakin, Tolani F.; Pribis, John P.; Zupetic, Jill; Yoon, Joo Heung; Holleran, Kyle M.; Jeong, Kwonho; Shaikh, Nader; Rubio, Doris M.; Lee, Janet S.
2017-01-01
Irreproducibility of preclinical biomedical research has gained recent attention. It is suggested that requiring authors to complete a checklist at the time of manuscript submission would improve the quality and transparency of scientific reporting, and ultimately enhance reproducibility. Whether a checklist enhances quality and transparency in reporting preclinical animal studies, however, has not been empirically studied. Here we searched two highly cited life science journals, one that requires a checklist at submission (Nature) and one that does not (Cell), to identify in vivo animal studies. After screening 943 articles, a total of 80 articles were identified in 2013 (pre-checklist) and 2015 (post-checklist), and included for the detailed evaluation of reporting methodological and analytical information. We compared the quality of reporting preclinical animal studies between the two journals, accounting for differences between journals and changes over time in reporting. We find that reporting of randomization, blinding, and sample-size estimation significantly improved when comparing Nature to Cell from 2013 to 2015, likely due to implementation of a checklist. Specifically, improvement in reporting of the three methodological information was at least three times greater when a mandatory checklist was implemented than when it was not. Reporting the sex of animals and the number of independent experiments performed also improved from 2013 to 2015, likely from factors not related to a checklist. Our study demonstrates that completing a checklist at manuscript submission is associated with improved reporting of key methodological information in preclinical animal studies. PMID:28902887
42 CFR 480.120 - Information subject to disclosure.
Code of Federal Regulations, 2010 CFR
2010-10-01
... subcontracts under those contracts (except for proprietary or business information); (3) Copies of documents..., including a study design and methodology. (b) Aggregate statistical information that does not implicitly or...
Fingelkurts, Alexander A; Fingelkurts, Andrew A
2009-11-01
To figure out whether the main empirical question "Is our brain hardwired to believe in and produce God, or is our brain hardwired to perceive and experience God?" is answered, this paper presents systematic critical review of the positions, arguments and controversies of each side of the neuroscientific-theological debate and puts forward an integral view where the human is seen as a psycho-somatic entity consisting of the multiple levels and dimensions of human existence (physical, biological, psychological, and spiritual reality), allowing consciousness/mind/spirit and brain/body/matter to be seen as different sides of the same phenomenon, neither reducible to each other. The emergence of a form of causation distinctive from physics where mental/conscious agency (a) is neither identical with nor reducible to brain processes and (b) does exert "downward" causal influence on brain plasticity and the various levels of brain functioning is discussed. This manuscript also discusses the role of cognitive processes in religious experience and outlines what can neuroscience offer for study of religious experience and what is the significance of this study for neuroscience, clinicians, theology and philosophy. A methodological shift from "explanation" to "description" of religious experience is suggested. This paper contributes to the ongoing discussion between theologians, cognitive psychologists and neuroscientists.
Low-cost Active Structural Control Space Experiment (LASC)
NASA Technical Reports Server (NTRS)
Robinett, Rush; Bukley, Angelia P.
1992-01-01
The DOE Lab Director's Conference identified the need for the DOE National Laboratories to actively and aggressively pursue ways to apply DOE technology to problems of national need. Space structures are key elements of DOD and NASA space systems and a space technology area in which DOE can have a significant impact. LASC is a joint agency space technology experiment (DOD Phillips, NASA Marshall, and DOE Sandia). The topics are presented in viewgraph form and include the following: the phase 4 investigator testbed; control of large flexible structures in orbit; INFLEX; the Controls, Astrophysics, and Structures Experiments in Space; SARSAT; and LASC mission objectives.
ERIC Educational Resources Information Center
Adamson, John; Muller, Theron
2018-01-01
This manuscript uses a joint autoethnographic methodology to explore the experiences of two language teacher scholars working in the academy outside the global centre in Japan. Emphasis is given to how the methodology used, cycles of reflective writing, reveals commonalities and differences in our respective experiences of working in the Japanese…
Brisbois, Ben W; Cole, Donald C; Davison, Colleen M; Di Ruggiero, Erica; Hanson, Lori; Janes, Craig R; Larson, Charles P; Nixon, Stephanie; Plamondon, Katrina; Stime, Bjorn
2016-12-27
Funding options for global health research prominently include grants from corporations, as well as from foundations linked to specific corporations. While such funds can enable urgently-needed research and interventions, they can carry the risk of skewing health research priorities and exacerbating health inequities. With the objective of promoting critical reflection on potential corporate funding options for global health research, we propose a set of three questions developed through an open conference workshop and reflection on experiences of global health researchers and their institutions: 1) Does this funding allow me/us to retain control over research design, methodology and dissemination processes? 2) Does accessing this funding source involve altering my/our research agenda (i.e., what is the impact of this funding source on research priorities)? 3) What are the potential "unintended consequences" of accepting corporate funding, in terms of legitimizing corporations or models of development that are at the root of many global health problems? These questions outline an intentional and cautionary approach to decision-making when corporate funding for global health research is being considered by funding agencies, institutions, researchers and research stakeholders.
U.S. techniques, methodologies, business practices, and expertise, which increase mutual understanding not support the pursuance of a waiver of the J-1 Exchange Visitor's Visa. NREL does not generally
43 CFR 11.31 - What does the Assessment Plan include?
Code of Federal Regulations, 2010 CFR
2010-10-01
... recovery period, and other such information required to perform the selected methodologies. (3) The... coordinated to the extent possible with any remedial investigation feasibility study or other investigation...
METHODOLOGICAL QUALITY OF ECONOMIC EVALUATIONS ALONGSIDE TRIALS OF KNEE PHYSIOTHERAPY.
García-Pérez, Lidia; Linertová, Renata; Arvelo-Martín, Alejandro; Guerra-Marrero, Carolina; Martínez-Alberto, Carlos Enrique; Cuéllar-Pompa, Leticia; Escobar, Antonio; Serrano-Aguilar, Pedro
2017-01-01
The methodological quality of an economic evaluation performed alongside a clinical trial can be underestimated if the paper does not report key methodological features. This study discusses methodological assessment issues on the example of a systematic review on cost-effectiveness of physiotherapy for knee osteoarthritis. Six economic evaluation studies included in the systematic review and related clinical trials were assessed using the 10-question check-list by Drummond and the Physiotherapy Evidence Database (PEDro) scale. All economic evaluations were performed alongside a clinical trial but the studied interventions were too heterogeneous to be synthesized. Methodological quality of the economic evaluations reported in the papers was not free of drawbacks, and in some cases, it improved when information from the related clinical trial was taken into account. Economic evaluation papers dedicate little space to methodological features of related clinical trials; therefore, the methodological quality can be underestimated if evaluated separately from the trials. Future economic evaluations should follow more strictly the recommendations about methodology and the authors should pay special attention to the quality of reporting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, Paolo; Theiler, C.; Fasoli, A.
A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
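A minimal sketch of a quantitative agreement metric in this spirit is given below. The chi-squared-style per-observable distance and the weighting scheme are assumptions of this illustration, not the exact composite metric defined in the paper:

```python
import numpy as np

def observable_agreement(exp, sim, err_exp, err_sim):
    # Chi-squared-like normalized distance between an experimental and a
    # simulated profile of one observable, scaled by combined uncertainty.
    num = (np.asarray(exp, float) - np.asarray(sim, float)) ** 2
    return float(np.mean(num / (err_exp**2 + err_sim**2)))

def composite_metric(discrepancies, weights):
    # Weighted average of per-observable discrepancies; the weighting
    # scheme here is an assumption of this sketch.
    w = np.asarray(weights, float)
    d = np.asarray(discrepancies, float)
    return float((w * d).sum() / w.sum())

# Hypothetical radial profiles of two observables (e.g. density, potential)
d1 = observable_agreement([1.0, 0.8, 0.5], [0.9, 0.85, 0.55], 0.1, 0.05)
d2 = observable_agreement([2.0, 1.5], [2.2, 1.4], 0.2, 0.1)
print(f"composite metric = {composite_metric([d1, d2], [1.0, 0.5]):.2f}")
```

A small composite value indicates agreement within combined uncertainties; weighting lets directly measured observables count more than heavily processed ones.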
2010-08-01
This study presents a methodology for computing stochastic sensitivities with respect to the design variables, which are the ... random variables.
ERIC Educational Resources Information Center
Kyburz-Graber, Regula
2004-01-01
There is a tendency to use case-study research methodology for research issues aiming at simply describing a complex situation, and to draw conclusions with insufficient rigour. Sound case-study research, however, follows discriminate rules which can be described in all the dimensions of a full case-study research process. This paper examines…
78 FR 5444 - Submission for OMB Review; Payments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... challenging that the agency's methodology for calculating it is insufficient and inadequate and does not... hours of burden per response is unrealistically low given the level of documentation required to support...
Conducting interactive experiments online.
Arechar, Antonio A; Gächter, Simon; Molleman, Lucas
2018-01-01
Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a potentially valuable complement to laboratory studies.
Enhancement of 2,3-Butanediol Production by Klebsiella oxytoca PTCC 1402
Anvari, Maesomeh; Safari Motlagh, Mohammad Reza
2011-01-01
Optimal operating parameters for 2,3-butanediol production by Klebsiella oxytoca under submerged culture conditions were determined using the Taguchi method. The effects of different factors, including medium composition, pH, temperature, mixing intensity, and inoculum size, on 2,3-butanediol production were analyzed using the Taguchi method at three levels. Based on these analyses, the optimum concentrations of glucose, acetic acid, and succinic acid were found to be 6, 0.5, and 1.0 (% w/v), respectively. Furthermore, the optimum values for temperature, inoculum size, pH, and shaking speed were determined as 37°C, 8 g/L, 6.1, and 150 rpm, respectively. The optimal combination of factors obtained from the proposed DOE methodology was further validated by conducting fermentation experiments, which revealed an enhanced 2,3-butanediol yield of 44%. PMID:21318172
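Taguchi analyses like the one above rank factor levels by a signal-to-noise (S/N) ratio; for a yield response, the "larger-is-better" form applies. The sketch below is illustrative only: the replicate yields and the two temperature levels are hypothetical, not data from this experiment.

```python
import math

def sn_larger_is_better(values):
    """Taguchi 'larger-is-better' signal-to-noise ratio in dB."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / (v * v) for v in values) / n)

# Hypothetical replicate yields (%) observed at two temperature levels
yields_37C = [42.0, 44.0, 43.5]
yields_30C = [35.0, 33.0, 36.5]

sn_37 = sn_larger_is_better(yields_37C)
sn_30 = sn_larger_is_better(yields_30C)
# The level with the higher S/N ratio is preferred
best = "37C" if sn_37 > sn_30 else "30C"
```

In a full Taguchi study this comparison is made for every factor across the orthogonal-array runs, and the highest-S/N level of each factor is selected.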
Outdoor chamber measurements of biological aerosols with a passive FTIR spectrometer
NASA Astrophysics Data System (ADS)
D'Amico, Francis M.; Emge, Darren K.; Roelant, Geoffrey J.
2004-02-01
Outdoor measurements of dry Bacillus subtilis (BG) spores were conducted with a passive Fourier transform infrared (FTIR) spectrometer using two types of chambers. One was a large open-ended cell, and the other was a canyon of similar dimensions. The canyon exposes the aerosol plume to downwelling sky radiance, while the open-ended cell does not. The goal of the experiments was to develop a suitable test methodology for evaluation of passive standoff detectors for open-air aerosol measurements. Dry BG aerosol particles were dispersed with a blower through an opening in the side of the chamber to create a pseudo-stationary plume, wind conditions permitting. Numerous trials were performed with the FTIR spectrometer positioned to view mountain, sky, and mixed mountain-sky backgrounds. This paper will discuss the results of the FTIR measurements for BG and kaolin dust releases.
Nishawala, Vinesh V.; Ostoja-Starzewski, Martin; Leamy, Michael J.; ...
2015-09-10
Peridynamics is a non-local continuum mechanics formulation that can handle spatial discontinuities, as the governing equations are integro-differential equations that do not involve gradients such as strains and deformation rates. This paper employs bond-based peridynamics. Cellular automata is a local computational method which, in its rectangular variant on interior domains, is mathematically equivalent to the central-difference finite difference method. However, cellular automata does not require the derivation of the governing partial differential equations and provides for common boundary conditions based on physical reasoning. Both methodologies are used to solve a half-space subjected to a normal load, known as Lamb's Problem. The results are compared with the theoretical solution from classical elasticity and with experimental results. Furthermore, this paper is used to validate our implementation of these methods.
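The equivalence noted above between the rectangular cellular-automaton rule and the central-difference scheme can be seen in one dimension. Below is a minimal sketch for the 1-D wave equation, not the authors' 2-D Lamb's problem implementation; the grid size, initial pulse, and unit Courant number are assumptions chosen for illustration.

```python
# Central-difference update for the 1-D wave equation u_tt = c^2 u_xx,
# which is also the local rule a rectangular cellular automaton applies
# at each interior cell. A Courant number of 1 makes the scheme exact.
n = 101
u_prev = [0.0] * n
u_curr = [0.0] * n
u_curr[50] = 1.0           # unit displacement pulse at the midpoint
u_prev[49] = u_prev[51] = 0.5  # exact one-step-back profile (zero velocity)
r2 = 1.0                   # (c*dt/dx)^2, Courant number = 1

for _ in range(20):        # advance 20 time steps
    u_next = [0.0] * n
    for i in range(1, n - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i+1] - 2.0 * u_curr[i] + u_curr[i-1]))
    u_prev, u_curr = u_next and u_curr, u_next
    u_prev, u_curr = u_curr if u_prev is u_curr else u_prev, u_curr
```

After 20 steps the pulse has split, as d'Alembert's solution predicts, into two half-amplitude pulses 20 cells to either side of the midpoint.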
Levine, Jonathan S.; Fukai, Isis; Soeder, Daniel J.; ...
2016-05-31
While the majority of shale formations will serve as reservoir seals for stored anthropogenic carbon dioxide (CO2), hydrocarbon-bearing shale formations may be potential geologic sinks after depletion through primary production. In this paper we present the United States Department of Energy-National Energy Technology Laboratory (US-DOE-NETL) methodology for screening-level assessment of prospective CO2 storage resources in shale using a volumetric equation. Volumetric resource estimates are produced from the bulk volume, porosity, and sorptivity of the shale and storage efficiency factors based on formation-scale properties and petrophysical limitations on fluid transport. Prospective shale formations require: (1) prior hydrocarbon production using horizontal drilling and stimulation via staged, high-volume hydraulic fracturing, (2) depths sufficient to maintain CO2 in a supercritical state, generally >800 m, and (3) an overlying seal. The US-DOE-NETL methodology accounts for storage of CO2 in shale as a free fluid phase within fractures and matrix pores and as a sorbed phase on organic matter and clays. Uncertainties include, but are not limited to, poorly constrained geologic variability in formation thickness, porosity, existing fluid content, organic richness, and mineralogy. Knowledge of how these parameters may be linked to depositional environments, facies, and diagenetic history of the shale will improve the understanding of pore-to-reservoir scale behavior and provide improved estimates of prospective CO2 storage.
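The general shape of such a volumetric assessment, a free phase in pore space plus a sorbed phase, scaled by an efficiency factor, can be sketched as below. All parameter values and the sorption term's form are illustrative assumptions, not the actual US-DOE-NETL coefficients.

```python
def co2_storage_estimate(area_m2, thickness_m, porosity, rho_co2,
                         rock_density, sorbed_capacity, efficiency):
    """Screening-level CO2 storage estimate (kg): free supercritical
    phase in the pore space plus a sorbed phase on the rock mass,
    reduced by a storage efficiency factor. Illustrative form only."""
    bulk_volume = area_m2 * thickness_m                     # m^3
    free_phase = bulk_volume * porosity * rho_co2           # kg in pores
    sorbed = (bulk_volume * (1 - porosity) * rock_density
              * sorbed_capacity)                            # kg sorbed
    return efficiency * (free_phase + sorbed)

# Hypothetical shale play: 100 km^2 area, 50 m thick, 6% porosity
estimate = co2_storage_estimate(
    area_m2=100e6, thickness_m=50.0, porosity=0.06,
    rho_co2=600.0,          # supercritical CO2 density, kg/m^3 (assumed)
    rock_density=2500.0,    # kg/m^3
    sorbed_capacity=0.005,  # kg CO2 sorbed per kg rock (assumed)
    efficiency=0.05)        # storage efficiency factor (assumed)
```

For these made-up inputs the estimate comes to roughly 12 Mt of CO2, with the sorbed phase contributing about a quarter of the total.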
Social Information Transmission in Animals: Lessons from Studies of Diffusion
Duboscq, Julie; Romano, Valéria; MacIntosh, Andrew; Sueur, Cédric
2016-01-01
The capacity to use information provided by others to guide behavior is a widespread phenomenon in animal societies. A standard paradigm to test if and/or how animals use and transfer social information is through social diffusion experiments, by which researchers observe how information spreads within a group, sometimes by seeding new behavior in the population. In this article, we review the context, methodology and products of such social diffusion experiments. Our major focus is the transmission of information from an individual (or group thereof) to another, and the factors that can enhance or, more interestingly, inhibit it. We therefore also discuss reasons why social transmission sometimes does not occur despite being expected to. We span a full range of mechanisms and processes, from the nature of social information itself and the cognitive abilities of various species, to the idea of social competency and the constraints imposed by the social networks in which animals are embedded. We ultimately aim at a broad reflection on practical and theoretical issues arising when studying how social information spreads within animal groups. PMID:27540368
Generalized Subset Designs in Analytical Chemistry.
Surowiec, Izabella; Vikström, Ludvig; Hector, Gustaf; Johansson, Erik; Vikström, Conny; Trygg, Johan
2017-06-20
Design of experiments (DOE) is an established methodology in research, development, manufacturing, and production for screening, optimization, and robustness testing. Two-level fractional factorial designs remain the preferred approach due to their high information content while keeping the number of experiments low. These types of designs, however, have never been extended to a generalized multilevel reduced design type capable of including both qualitative and quantitative factors. In this Article we describe a novel generalized fractional factorial design. In addition, it provides complementary and balanced subdesigns analogous to a fold-over in two-level reduced factorial designs. We demonstrate how this design type can be applied with good results in three different applications in analytical chemistry, including (a) multivariate calibration using microwave resonance spectroscopy for the determination of water in tablets, (b) a stability study in drug product development, and (c) representative sample selection in clinical studies. This demonstrates the potential of generalized fractional factorial designs to be applied in many other areas of analytical chemistry where representative, balanced, and complementary subsets are required, especially when a combination of quantitative and qualitative factors at multiple levels exists.
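The fold-over relationship mentioned above can be made concrete in the simplest case, a two-level 2^(3-1) half-fraction with generator C = AB. This is a generic textbook illustration, not the article's generalized multilevel algorithm.

```python
from itertools import product

def fractional_factorial():
    """2^(3-1) half-fraction: full 2^2 design in A and B, with the
    third factor aliased through the generator C = A*B."""
    return [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]

def fold_over(design):
    """Complementary half-fraction: switch the sign of every level."""
    return [tuple(-x for x in run) for run in design]

half = fractional_factorial()
complement = fold_over(half)
# Together the two complementary half-fractions recover the full
# 2^3 factorial, which is what de-aliases the confounded effects.
full = sorted(set(half + complement))
```

Running the half-fraction costs 4 experiments instead of 8; the fold-over is run only if the aliased effects later need to be separated.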
Daskalakis, Markos I; Magoulas, Antonis; Kotoulas, Georgios; Katsikis, Ioannis; Bakolas, Asterios; Karageorgis, Aristomenis P; Mavridou, Athena; Doulia, Danae; Rigas, Fotis
2014-08-01
Bacterially induced calcium carbonate precipitation by a Cupriavidus metallidurans isolate was investigated to develop an environmentally friendly method for the restoration and preservation of ornamental stones. Biomineralization performance was evaluated in a growth medium via a Design of Experiments (DoE) approach using temperature, growth medium concentration, and inoculum concentration as design factors. The optimum conditions were determined with the aid of consecutive experiments based on response surface methodology (RSM) and were successfully validated thereafter. Statistical analysis can be utilized as a tool for screening bacterial bioprecipitation, as it considerably reduced the experimental time and effort needed for bacterial evaluation. Analytical methods provided insight into the biomineral characteristics, and sonication tests proved that our isolate could create a solid new layer of vaterite on a marble substrate withstanding sonication forces. C. metallidurans ACA-DC 4073 provided a compact vaterite layer on the marble substrate with morphological characteristics that assisted in its differentiation. The latter proved valuable when spraying a minimal amount of inoculated medium onto the marble substrate under conditions close to an in situ application. A sufficient and clearly distinguishable layer was identified.
Cognitive processes in dissociation: an analysis of core theoretical assumptions.
Giesbrecht, Timo; Lynn, Steven Jay; Lilienfeld, Scott O; Merckelbach, Harald
2008-09-01
Dissociation is typically defined as the lack of normal integration of thoughts, feelings, and experiences into consciousness and memory. The present article critically evaluates the research literature on cognitive processes in dissociation. The authors' review indicates that dissociation is characterized by subtle deficits in neuropsychological performance (e.g., heightened distractibility). Some of the cognitive phenomena (e.g., weakened cognitive inhibition) associated with dissociation appear to be dependent on the emotional or attentional context. Contrary to a widespread assumption in the clinical literature, dissociation does not appear to be related to avoidant information processing. Rather, it is associated with an enhanced propensity toward pseudo-memories, possibly mediated by heightened levels of interrogative suggestibility, fantasy proneness, and cognitive failures. Evidence for a link between dissociation and either memory fragmentation or early trauma based on objective measures is conspicuously lacking. The authors identify a variety of methodological issues and discrepancies that make it difficult to articulate a comprehensive framework for cognitive mechanisms in dissociation. The authors conclude with a discussion of research domains (e.g., sleep-related experiences, drug-related dissociation) that promise to advance our understanding of cognition and dissociation.
Purity homophily in social networks.
Dehghani, Morteza; Johnson, Kate; Hoover, Joe; Sagi, Eyal; Garten, Justin; Parmar, Niki Jitendra; Vaisey, Stephen; Iliev, Rumen; Graham, Jesse
2016-03-01
Does sharing moral values encourage people to connect and form communities? The importance of moral homophily (love of same) has been recognized by social scientists, but the types of moral similarities that drive this phenomenon are still unknown. Using both large-scale, observational social-media analyses and behavioral lab experiments, the authors investigated which types of moral similarities influence tie formations. Analysis of a corpus of over 700,000 tweets revealed that the distance between two people in a social network can be predicted based on differences in the moral purity content, but not other moral content, of their messages. The authors replicated this finding by experimentally manipulating perceived moral difference (Study 2) and similarity (Study 3) in the lab and demonstrating that purity differences play a significant role in social distancing. These results indicate that social network processes reflect moral selection, and both online and offline differences in moral purity concerns are particularly predictive of social distance. This research is an attempt to study morality indirectly using an observational big-data study complemented with two confirmatory behavioral experiments carried out using traditional social-psychology methodology.
Sakkas, Vasilios A; Islam, Md Azharul; Stalikas, Constantine; Albanis, Triantafyllos A
2010-03-15
The use of chemometric methods such as response surface methodology (RSM) based on statistical design of experiments (DOE) is becoming increasingly widespread in several sciences, such as analytical chemistry, engineering, and environmental chemistry. Applied catalysis is certainly no exception. It is clear that photocatalytic processes combined with chemometric experimental design play a crucial role in the ability to reach the optimum of catalytic reactions. The present article reviews the major applications of RSM in modern experimental design combined with photocatalytic degradation processes. Moreover, the theoretical principles and designs that enable one to obtain a polynomial regression equation, which expresses the influence of process parameters on the response, are thoroughly discussed. An original experimental work, the photocatalytic degradation of the dye Congo red (CR) using TiO2 suspensions and H2O2 in natural surface water (river water), is comprehensively described as a case study, in order to provide sufficient guidelines to deal with this subject in a rational and integrated way.
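The polynomial regression step of RSM amounts to an ordinary least-squares fit of a second-order model, from which the stationary point gives the optimum. The two coded factors, the "true" surface, and the noise level below are synthetic stand-ins for illustration, not the Congo red data.

```python
import numpy as np

# Synthetic degradation-efficiency data on a two-factor coded design:
# x1 = catalyst load, x2 = H2O2 dose (both coded to [-1, 1])
rng = np.random.default_rng(0)
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
# Assumed "true" surface with an interior optimum, plus small noise
y = 80 + 5*x1 + 3*x2 - 4*x1**2 - 6*x2**2 + 1.5*x1*x2
y = y + rng.normal(0, 0.1, y.size)

# Second-order model: b0 + b1 x1 + b2 x2 + b11 x1^2 + b22 x2^2 + b12 x1 x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = coef

# Stationary point: set both partial derivatives of the model to zero
denom = 4*b11*b22 - b12**2
x1_opt = (b12*b2 - 2*b22*b1) / denom
x2_opt = (b12*b1 - 2*b11*b2) / denom
```

Negative fitted curvature terms (b11, b22 < 0) confirm that the stationary point is a maximum of the response surface, i.e. the optimum operating condition in coded units.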
Nandal, Preeti; Ravella, Sreenivas Rao; Kuhad, Ramesh Chander
2013-01-01
Laccase production by Coriolopsis caperata RCK2011 under solid-state fermentation was optimized following a Taguchi design of experiments. An orthogonal array layout of L18 (2^1 × 3^7) was constructed using Qualitek-4 software with the eight most influential factors on laccase production. At the individual level, pH contributed the highest influence, whereas corn steep liquor (CSL) accounted for more than 50% of the severity index with biotin and KH2PO4 at the interactive level. The optimum conditions derived were: temperature 30°C, pH 5.0, wheat bran 5.0 g, inoculum size 0.5 ml (fungal cell mass = 0.015 g dry wt.), biotin 0.5% w/v, KH2PO4 0.013% w/v, CSL 0.1% v/v, and 0.5 mM xylidine as an inducer. The validation experiments using optimized conditions confirmed an improvement in enzyme production of 58.01%. The laccase production level of 1623.55 U gds^-1 indicates that the fungus C. caperata RCK2011 has commercial potential for laccase production. PMID:23463372
Makrakis, Vassilios; Kostoulas-Makrakis, Nelly
2016-02-01
Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heaney, Mike
Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials while producing more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
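The factorial-design concepts this presentation introduces can be made concrete with a main-effect calculation on a full 2^3 design. The response function below is a hypothetical stand-in where factor A matters strongly, B modestly, and C barely at all.

```python
from itertools import product

# Full 2^3 factorial in coded units (-1 = low level, +1 = high level)
runs = list(product((-1, 1), repeat=3))

def response(a, b, c):
    """Hypothetical noiseless response for illustration."""
    return 50 + 8*a + 3*b + 0.2*c

ys = [response(*run) for run in runs]

def main_effect(factor_index):
    """Average response at the high level minus at the low level."""
    hi = [y for run, y in zip(runs, ys) if run[factor_index] == 1]
    lo = [y for run, y in zip(runs, ys) if run[factor_index] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = [main_effect(i) for i in range(3)]  # approx [16.0, 6.0, 0.4]
```

Because every factor appears at each level in exactly half the runs, each main effect is estimated from all 8 observations at once, which is the efficiency argument the presentation makes for designed experiments.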
An ontological case base engineering methodology for diabetes management.
El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema
2014-08-01
Ontology engineering covers issues related to ontology development and use. In Case Based Reasoning (CBR) system, ontology plays two main roles; the first as case base and the second as domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It mainly focuses on the research of case representation in the form of ontology to support the case semantic retrieval and enhance all knowledge intensive CBR processes. A case study on diabetes diagnosis case base will be provided to evaluate the proposed methodology.
Pantex Falling Man - Independent Review Panel Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertolini, Louis; Brannon, Nathan; Olson, Jared
2014-11-01
Consolidated Nuclear Security (CNS) Pantex took the initiative to organize a review panel of subject matter experts to independently assess the adequacy of the Pantex Tripping Man Analysis methodology. The purpose of this report is to capture the details of the assessment, including the scope, approach, results, and detailed appendices. Along with the assessment of the analysis methodology, the panel evaluated the adequacy with which the methodology was applied, as well as congruence with Department of Energy (DOE) standards 3009 and 3016. The approach included the review of relevant documentation, interactive discussion with Pantex staff, and the iterative process of evaluating critical lines of inquiry.
NASA Astrophysics Data System (ADS)
Meng, Xuelian
Urban land-use research is a key component in analyzing the interactions between human activities and environmental change. Researchers have conducted many experiments to classify urban or built-up land, forest, water, agriculture, and other land-use and land-cover types. Separating residential land uses from other land uses within urban areas, however, has proven to be surprisingly troublesome. Although high-resolution images have recently become more available for land-use classification, an increase in spatial resolution does not guarantee improved classification accuracy by traditional classifiers, due to the increase in class complexity. This research presents an approach to detect and separate residential land uses on a building scale directly from remotely sensed imagery to enhance urban land-use analysis. Specifically, the proposed methodology applies a multi-directional ground filter to generate a bare ground surface from lidar data, then utilizes a morphology-based building detection algorithm to identify buildings from lidar and aerial photographs, and finally separates residential buildings using a supervised C4.5 decision tree analysis based on seven selected building land-use indicators. Successful execution of this study produced three independent methods, each corresponding to a step of the methodology: lidar ground filtering, building detection, and building-based object-oriented land-use classification. Furthermore, this research provides a prototype as one of the few early explorations of building-based land-use analysis, and it successfully separated more than 85% of residential buildings in an experiment on an 8.25 km² study site located in Austin, Texas.
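The C4.5 classifier used in the final step chooses splits by information gain over candidate attribute thresholds. A minimal entropy-based sketch on a single hypothetical indicator (building footprint area, which is not necessarily one of the seven indicators used in the study):

```python
import math

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(samples, labels, threshold):
    """Gain from splitting a numeric attribute at `threshold`."""
    left = [lab for x, lab in zip(samples, labels) if x <= threshold]
    right = [lab for x, lab in zip(samples, labels) if x > threshold]
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

# Hypothetical footprint areas (m^2) labeled residential / non-residential
areas = [120, 150, 180, 200, 900, 1200, 1500, 2000]
labels = ["res", "res", "res", "res", "non", "non", "non", "non"]

# C4.5 evaluates candidate thresholds and keeps the most informative one
best = max((information_gain(areas, labels, t), t) for t in (165, 550, 1350))
```

The 550 m² threshold separates the two classes perfectly here (gain of 1 bit); in the real study a tree of such splits is grown over all seven indicators.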
Accurate quantum chemical calculations
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.
1989-01-01
An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.
Rosati, Alexandra G; Warneken, Felix
2016-06-01
We recently reported a study (Warneken & Rosati Proceedings of the Royal Society B, 282, 20150229, 2015) examining whether chimpanzees possess several cognitive capacities that are critical to engage in cooking. In a subsequent commentary, Beran, Hopper, de Waal, Sayers, and Brosnan Learning & Behavior (2015) asserted that our paper has several flaws. Their commentary (1) critiques some aspects of our methodology and argues that our work does not constitute evidence that chimpanzees can actually cook; (2) claims that these results are old news, as previous work had already demonstrated that chimpanzees possess most or all of these capacities; and, finally, (3) argues that comparative psychological studies of chimpanzees cannot adequately address questions about human evolution anyway. However, their critique of the premise of our study simply reiterates several points we made in the original paper. To quote ourselves: "As chimpanzees neither control fire nor cook food in their natural behavior, these experiments therefore focus not on whether chimpanzees can actually cook food, but rather whether they can apply their cognitive skills to novel problems that emulate cooking" (Warneken & Rosati Proceedings of the Royal Society B, 282, 20150229, 2015, p. 2). Furthermore, the methodological issues they raise are standard points about psychological research with animals, many of which were addressed synthetically across our 9 experiments or else are orthogonal to our claims. Finally, we argue that comparative studies of extant apes (and other nonhuman species) are a powerful and indispensable method for understanding human cognitive evolution.
An LMI approach for the Integral Sliding Mode and H∞ State Feedback Control Problem
NASA Astrophysics Data System (ADS)
Bezzaoucha, Souad; Henry, David
2015-11-01
This paper deals with the state feedback control problem for linear uncertain systems subject to both matched and unmatched perturbations. The proposed control law is based on the Integral Sliding Mode Control (ISMC) approach to tackle matched perturbations, as well as the H∞ paradigm for robustness against unmatched perturbations. The proposed method parallels the work presented in [1], which addressed the same problem and proposed a solution involving an Algebraic Riccati Equation (ARE)-based formulation. The contribution of this paper is the establishment of a Linear Matrix Inequality (LMI)-based solution, which offers the possibility to consider other types of constraints such as 𝓓-stability constraints (pole assignment-like constraints). The proposed methodology is applied to a pilot three-tank system, and experimental results illustrate its feasibility. Note that real experiments using SMC have rarely been considered in the past, due to the highly energetic behaviour of the control signal. It is important to outline that the paper does not aim at proposing an LMI formulation of an ARE. This has been done since 1971 [2] and is further discussed in [3], where the link between AREs and ARIs (algebraic Riccati inequalities) is established for the H∞ control problem. The main contribution of this paper is to establish the adequate LMI-based methodology (changes of matrix variables) so that the ARE that corresponds to the particular structure of the mixed ISMC/H∞ structure proposed by [1] can be re-formulated within the LMI paradigm.
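For contrast with the LMI route, the ARE-based state feedback design of the kind referenced in [1] can be sketched with a numpy-only Riccati solver built on the Hamiltonian matrix's stable invariant subspace. The double-integrator plant and the identity weights below are illustrative assumptions, not the paper's three-tank model.

```python
import numpy as np

def care(A, B, Q, R):
    """Solve A'P + PA - P B R^-1 B' P + Q = 0 via the stable invariant
    subspace of the Hamiltonian matrix (numpy only)."""
    Rinv = np.linalg.inv(R)
    H = np.block([[A, -B @ Rinv @ B.T],
                  [-Q, -A.T]])
    w, V = np.linalg.eig(H)
    stable = V[:, w.real < 0]            # eigenvectors of stable eigenvalues
    n = A.shape[0]
    X1, X2 = stable[:n, :], stable[n:, :]
    return np.real(X2 @ np.linalg.inv(X1))

# Double integrator: x1' = x2, x2' = u
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])

P = care(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P           # optimal state feedback u = -Kx
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
```

For this plant the ARE solution is known in closed form, P = [[√3, 1], [1, √3]] and K = [1, √3], so the solver can be checked against it; the LMI formulation the paper advocates replaces this equation solve with a semidefinite feasibility problem.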
Experience-based co-design in an adult psychological therapies service.
Cooper, Kate; Gillmore, Chris; Hogg, Lorna
2016-01-01
Experience-based co-design (EBCD) is a methodology for service improvement and development, which puts service-user voices at the heart of improving health services. The aim of this paper was to implement the EBCD methodology in a mental health setting, and to investigate the challenges which arise during this process. In order to achieve this, a modified version of the EBCD methodology was undertaken, which involved listening to the experiences of the people who work in and use the mental health setting and sharing these experiences with the people who could effect change within the service, through collaborative work between service-users, staff and managers. EBCD was implemented within the mental health setting and was well received by service-users, staff and stakeholders. A number of modifications were necessary in this setting, for example high levels of support available to participants. It was concluded that EBCD is a suitable methodology for service improvement in mental health settings.
Galbusera, Laura; Fellin, Lisa
2014-01-01
Research in psychopathology may be considered as an intersubjective endeavor mainly concerned with understanding other minds. Thus, the way we conceive of social understanding influences how we do research in psychology in the first place. In this paper, we focus on psychopathology research as a paradigmatic case for this methodological issue, since the relation between the researcher and the object of study is characterized by a major component of “otherness.” We critically review different methodologies in psychopathology research, highlighting their relation to different social cognition theories (the third-, first-, and second-person approaches). Hence we outline the methodological implications arising from each theoretical stance. Firstly, we critically discuss the dominant paradigm in psychopathology research, based on the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 2013) and on quantitative methodology, as an example of a third-person methodology. Secondly, we contrast this mainstream view with phenomenological psychopathology which—by rejecting the reductionist view exclusively focused on behavioral symptoms—takes consciousness as its main object of study: it therefore attempts to grasp patients’ first-person experience. But how can we speak about a first-person perspective in psychopathology if the problem at stake is the experience of the other? How is it possible to understand the experience from “within,” if the person who is having this experience is another? By addressing these issues, we critically explore the feasibility and usefulness of a second-person methodology in psychopathology research. Notwithstanding the importance of methodological pluralism, we argue that a second-person perspective should inform the epistemology and methods of research in psychopathology, as it recognizes the fundamental circular and intersubjective construction of knowledge. PMID:25368589
Radiation Assurance for the Space Environment
NASA Technical Reports Server (NTRS)
Barth, Janet L.; LaBel, Kenneth A.; Poivey, Christian
2004-01-01
The space radiation environment can lead to extremely harsh operating conditions for spacecraft electronic systems. A hardness assurance methodology must be followed to assure that the space radiation environment does not compromise the functionality and performance of space-based systems during the mission lifetime. The methodology includes a definition of the radiation environment, assessment of the radiation sensitivity of parts, worst-case analysis of the impact of radiation effects, and part acceptance decisions which are likely to include mitigation measures.
High Specific Heat Dielectrics and Kapitza Resistance at Dielectric Boundaries.
1985-09-30
CsI rods. The results of the Kapitza measurements are shown in Fig. 1 for both interfaces. The methodology consisted of establishing a reservoir...measurements, however, and can be used as a check on the methodology. For instance, in this case the thermal conductivity of both copper pieces was...to Eq. (5)...to be localized (i.e., this excitation does not carry heat)...thermal-conductivity data at the higher temperatures on both...using smoothed
DOE Office of Scientific and Technical Information (OSTI.GOV)
SVIRIDOVA, V.V.; ERASTOV, V.V.; ISAEV, N.V.
2005-05-16
The MC&A Equipment and Methodological Support Strategic Plan (MEMS SP) for implementing modern MC&A equipment and methodologies at Rosatom facilities has been developed within the framework of the U.S.-Russian MPC&A Program. The plan was developed by Rosatom's Russian MC&A Equipment and Methodologies (MEM) Working Group and is coordinated by that group, with support and coordination provided by the MC&A Measurements Project, Office of National Infrastructure and Sustainability, US DOE. Implementation of the various tasks of the MEMS Strategic Plan is coordinated by Rosatom and US DOE in cooperation with different U.S.-Russian MC&A-related working groups and joint site project teams. This cooperation makes it possible to obtain and analyze information about problems, current needs, and successes at Rosatom facilities, and it facilitates solving problems, satisfying the facilities' needs, and effectively exchanging expertise and lessons learned. The objective of the MEMS Strategic Plan is to enhance the effectiveness of activities implementing modern equipment and methodologies in the Russian State MC&A system. These activities are conducted within the joint Russian-US MPC&A program, which aims to reduce the possibility of theft or diversion of nuclear materials and to enhance control of nuclear materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaminsky, J.; Tschanz, J.F.
In order to address barriers to community energy-conservation efforts, DOE has established the Comprehensive Community Energy Management (CCEM) program. The role of CCEM is to provide direction and technical support for energy-conservation efforts at the local level. The program to date has included project efforts to develop combinations and variations of community energy planning and management tools applicable to communities of diverse characteristics. This paper describes the salient features of some of the tools and relates them to the testing program soon to begin in several pilot-study communities. Two methodologies that arose within such an actual planning context are taken from DOE-sponsored projects in Clarksburg, West Virginia and the proposed new capital city for Alaska. Energy management in smaller communities and/or communities with limited funding and manpower resources has received special attention. One project of this type developed a general methodology that emphasizes efficient ways for small communities to reach agreement on local energy problems and potential solutions; by this guidance, the community is led to understand where it should concentrate its efforts in subsequent management activities. Another project concerns rapid growth of either a new or an existing community that could easily outstrip the management resources available locally. This methodology strives to enable the community to seize the opportunity for energy conservation through integrating the design of its energy systems and its development pattern. The last methodology creates applicable tools for comprehensive community energy planning. (MCW)
Perceptions of a Methodology for the Development of Productivity Indicators.
1982-09-01
leadership it must concentrate, as do the Japanese, on improving productivity in all areas of both the public and private sectors. Maintaining industrial...systems. The private sector has developed a very strong productivity measurement system based on profit and economic standing in the market. However, as...the federal sector (including the Department of Defense) does not produce for profit nor does it compete in the private-sector markets, it is not
O'Brien, Kelly K; Colquhoun, Heather; Levac, Danielle; Baxter, Larry; Tricco, Andrea C; Straus, Sharon; Wickerson, Lisa; Nayar, Ayesha; Moher, David; O'Malley, Lisa
2016-07-26
Scoping studies (or reviews) are a method used to comprehensively map evidence across a range of study designs in an area, with the aim of informing future research practice, programs and policy. However, no universal agreement exists on terminology, definition or methodological steps. Our aim was to understand the experiences of, and considerations for, conducting scoping studies from the perspective of academic and community partners. Primary objectives were to 1) describe experiences conducting scoping studies, including strengths and challenges; and 2) describe perspectives on terminology, definition, and methodological steps. We conducted a cross-sectional web-based survey with clinicians, educators, researchers, knowledge users, representatives from community-based organizations, graduate students, and policy stakeholders with experience and/or interest in conducting scoping studies to gain an understanding of experiences and perspectives on the conduct and reporting of scoping studies. We administered an electronic self-reported questionnaire comprising 22 items related to experiences with scoping studies, strengths and challenges, opinions on terminology, and methodological steps. We analyzed questionnaire data using descriptive statistics and content analytical techniques. Survey results were discussed during a multi-stakeholder consultation to identify key considerations in the conduct and reporting of scoping studies. Of the 83 invitations, 54 individuals (65%) completed the scoping questionnaire, and 48 (58%) attended the scoping study meeting, with participants from Canada, the United Kingdom, and the United States. Many scoping study strengths, such as breadth of scope and the iterative process, were also identified as challenges.
No consensus on terminology emerged; however, key defining features that comprised a working definition of scoping studies included: exploratory mapping of literature in a field; an iterative process; inclusion of grey literature; no quality assessment of included studies; and an optional consultation phase. We offer considerations for the conduct and reporting of scoping studies for researchers, clinicians and knowledge users engaging in this methodology. Lack of consensus on scoping terminology, definition and methodological steps persists. Reasons for this may be attributed to the diversity of disciplines adopting this methodology for differing purposes. Further work is needed to establish guidelines on the reporting and methodological quality assessment of scoping studies.
Grau, P; Vanrolleghem, P; Ayesa, E
2007-01-01
In this paper, a new methodology for integrated modelling of the WWTP has been used for the construction of the Benchmark Simulation Model No. 2 (BSM2). The transformations approach proposed in this methodology does not require the development of specific transformers to interface unit-process models, and it allows the construction of tailored models for a particular WWTP while guaranteeing mass and charge continuity for the whole model. The BSM2 PWM, constructed as a case study, is evaluated by means of simulations under different scenarios, and its validity in reproducing the water and sludge lines in a WWTP is demonstrated. Furthermore, the advantages that this methodology presents compared to other approaches to integrated modelling are verified in terms of flexibility and coherence.
2000-02-01
HIDS] Program: Power Drive Train Crack Detection Diagnostics and Prognostics, Life Usage Monitoring and Damage Tolerance; Techniques, Methodologies, and...and Prognostics, Life Usage Monitoring, and Damage Tolerance; Techniques, Methodologies, and Experiences. Andrew Hess, Harrison Chin, William Hardman...continuing program to evaluate helicopter diagnostic, prognostic, and...deployed engine monitoring systems in fixed-wing aircraft, notably on the A
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... Methodology II. Summary of the Comparative Analysis A. Qualitative Analysis 1. Discussion of Detailed Textual... used for this preliminary determination. II. Summary of the Comparative Analysis DOE carried out both a...
Developing International Managers: The Contribution of Cultural Experience to Learning
ERIC Educational Resources Information Center
Townsend, Peter; Regan, Padraic; Li, Liang Liang
2015-01-01
Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…
Millennials and the World of Work: Experiences in Paid Work During Adolescence
Schulenberg, John E.
2010-01-01
Purpose This article considers some important questions faced by youth as they enter and adapt to paid work. We focus on two key questions: (1) how many hours should teenagers work during the school year and (2) what available jobs are desirable? Design/Methodology/Approach To help answer these questions, we review studies that have examined the effects of early work experiences on academic achievement, positive youth development, and health-risk behaviors. We also draw upon nationally representative data from the Monitoring the Future (MTF) study to illustrate some new findings on youth employment. Findings Moderate work hours, especially in jobs of higher-quality, are associated with a broad range of positive developmental outcomes. Implications These questions are not only important to teenagers and their parents, they also reflect key debates among scholars in sociology, developmental psychology, and economics regarding the potential short- and long-term consequences of early work experiences for social development and socioeconomic achievement. Originality/Value Although work intensity is an important dimension of adolescent work experience, it is clearly not the only one and we argue that it may not even be the most important one. By focusing on types and qualities of jobs, more can be gained in terms of understanding for whom and under what conditions teenage work does provide benefits for and detriments to youth development. PMID:20495611
The Muslim Headscarf and Face Perception: “They All Look the Same, Don't They?”
Toseeb, Umar; Bryant, Eleanor J.; Keeble, David R. T.
2014-01-01
The headscarf conceals hair and other external features of a head (such as the ears). It therefore may have implications for the way in which such faces are perceived. Images of faces with hair (H) or alternatively, covered by a headscarf (HS) were used in three experiments. In Experiment 1 participants saw both H and HS faces in a yes/no recognition task in which the external features either remained the same between learning and test (Same) or switched (Switch). Performance was similar for H and HS faces in both the Same and Switch condition, but in the Switch condition it dropped substantially compared to the Same condition. This implies that the mere presence of the headscarf does not reduce performance, rather, the change between the type of external feature (hair or headscarf) causes the drop in performance. In Experiment 2, which used eye-tracking methodology, it was found that almost all fixations were to internal regions, and that there was no difference in the proportion of fixations to external features between the Same and Switch conditions, implying that the headscarf influenced processing by virtue of extrafoveal viewing. In Experiment 3, similarity ratings of the internal features of pairs of HS faces were higher than pairs of H faces, confirming that the internal and external features of a face are perceived as a whole rather than as separate components. PMID:24520313
Oliveira Neto, Alfredo de; Pinheiro, Roseni
2013-02-01
The field of Communication and Health in Brazil has been developing and growing stronger after each National Health Conference (NHC). In the final report of the XII NHC, in 2003, there was clear recognition that community radio is an instrument for the dissemination and treatment of issues related to the Brazilian Unified Health System (SUS). This study seeks to analyze the relationships established between health professionals, listeners/users, and popular communicators as a means of understanding the nexus between a radio program on health and the imagination of the listeners. A qualitative approach combining ethnographic and media-audience methodologies was used. The field was a radio program about health, Bloco Mulher Saúde, broadcast by Rádio Comunidade FM 104,9 in Nova Friburgo, State of Rio de Janeiro. The discussions were divided into analytical categories. The conclusion drawn is that community communication can be a cultural and political mediator for the expression of the community's demands on health; that the predominant medical jargon is maintained and reproduced by the physicians when participating on radio; and that community communication can contribute to the creation of strategies that broaden the social control of SUS.
Does Augmented Reality Affect High School Students' Learning Outcomes in Chemistry?
NASA Astrophysics Data System (ADS)
Renner, Jonathan Christopher
Some teens may prefer a self-directed, constructivist, and technological approach to learning rather than traditional classroom instruction. If such a preference can be demonstrated, educators may adjust their teaching methodology accordingly. The guiding research question for this study focused on how augmented reality affects high school students' learning outcomes in chemistry, as measured by a pretest/posttest methodology, while ensuring that individual outcomes were not the result of group collaboration. This study employed a quantitative, quasi-experimental design with a comparison group and an experimental group, and inferential statistical analysis was employed. The study was conducted at a high school in southwest Colorado. Eighty-nine respondents returned completed and signed consent forms, and 78 participants completed the study. Results demonstrated that augmented reality instruction significantly increased posttest scores relative to pretest scores, but it was not as effective as traditional classroom instruction. Scores improved under both types of instruction; therefore, more research is needed in this area. The present study was the first quantitative experiment controlling for individual learning to validate augmented reality delivered on mobile handheld digital devices in terms of its effect on individual students' learning outcomes without group collaboration. This topic is important to the field of education, as it may help educators understand how students learn and may also change the way students are taught.
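The pretest/posttest comparison described above is typically analyzed with a paired test on each student's score change. A minimal sketch of that analysis follows; the scores are hypothetical illustration, not the study's data.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t-statistic: mean score change divided by its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical pretest/posttest scores for 8 students (not the study's data)
pre = [52, 60, 48, 55, 63, 50, 58, 61]
post = [58, 66, 51, 60, 70, 55, 64, 66]
print(round(paired_t(pre, post), 1))  # large positive t => posttest gain
```

The t-statistic would then be compared against a t-distribution with n−1 degrees of freedom to judge significance, as in the study's inferential analysis.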
"Describing our whole experience": the statistical philosophies of W. F. R. Weldon and Karl Pearson.
Pence, Charles H
2011-12-01
There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton's footsteps. I argue for two related theses in light of this standard interpretation, based on a reading of several sources in which Weldon, independently of Pearson, reflects on his own motivations. First, while Pearson does approach statistics from this "Galtonian" perspective, he is, consistent with his positivist philosophy of science, utilizing statistics to simplify the highly variable data of biology. Weldon, on the other hand, is brought to statistics by a rich empiricism and a desire to preserve the diversity of biological data. Secondly, we have here a counterexample to the claim that divergence in motivation will lead to a corresponding separation in methodology. Pearson and Weldon, despite embracing biometry for different reasons, settled on precisely the same set of statistical tools for the investigation of evolution. Copyright © 2011 Elsevier Ltd. All rights reserved.
Mind and Body Practices for Fibromyalgia: What the Science Says
... by a small number of studies with low methodological quality. What Does the Research Show? A 2015 ... and alternative medical therapies in fibromyalgia. Current Pharmaceutical Design. 2006;12(1):47-57. Schneider M, Vernon ...
Create full-scale predictive economic models on ROI and innovation with performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joseph, Earl C.; Conway, Steve
The U.S. Department of Energy (DOE), the world's largest buyer and user of supercomputers, awarded IDC Research, Inc. a grant to create two macroeconomic models capable of quantifying, respectively, financial and non-financial (innovation) returns on investments in HPC resources. Following a 2013 pilot study in which we created the models and tested them on about 200 real-world HPC cases, DOE authorized us to conduct a full three-year grant study to collect and measure many more examples, a process that would also subject the methodology to further testing and validation. A secondary, "stretch" goal of the full study was to advance the methodology from association toward (but not all the way to) causation, by eliminating the effects of some of the other factors that might be contributing, along with HPC investments, to the returns produced in the investigated projects.
[Methodologic inconsistency in anamnesis education at medical schools].
Zago, M A
1989-01-01
Some relevant points of the process of obtaining the medical anamnesis and physical examination, and of formulating diagnostic hypotheses, are analyzed. The main methodological features include: preponderance of qualitative data, absence of preselected hypotheses, direct involvement of the observer (physician) with the data source (patient), and selection of hypotheses and changes in the patient during the process. Thus, diagnostic investigation does not follow the paradigm of the quantitative scientific method, rooted in logical positivism, which dominates medical research and education.
Higgins, A; Barnett, J; Meads, C; Singh, J; Longworth, L
2014-12-01
To systematically review the existing literature on the value associated with convenience in health care delivery, independent of health outcomes, and to try to estimate the likely magnitude of any value found. A systematic search was conducted for previously published studies that reported preferences for convenience-related aspects of health care delivery in a manner that was consistent with either cost-utility analysis or cost-benefit analysis. Data were analyzed in terms of the methodologies used, the aspects of convenience considered, and the values reported. Literature searches generated 4715 records. Following a review of abstracts or full-text articles, 27 were selected for inclusion. Twenty-six studies reported some evidence of convenience-related process utility, in the form of either a positive utility or a positive willingness to pay. The aspects of convenience valued most often were mode of administration (n = 11) and location of treatment (n = 6). The most common valuation methodology was a discrete-choice experiment containing a cost component (n = 15). A preference for convenience-related process utility exists, independent of health outcomes. Given the diverse methodologies used to calculate it, and the range of aspects being valued, however, it is difficult to assess how large such a preference might be, or how it may be effectively incorporated into an economic evaluation. Increased consistency in reporting these preferences is required to assess these issues more accurately. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Employing an ethnographic approach: key characteristics.
Lambert, Veronica; Glacken, Michele; McCarron, Mary
2011-01-01
Nurses are increasingly embracing ethnography as a useful research methodology. This paper presents an overview of some of the main characteristics we considered and the challenges encountered when using ethnography to explore the nature of communication between children and health professionals in a children's hospital. There is no consensual definition or single procedure to follow when using ethnography. This is largely attributable to the re-contextualisation of ethnography over time through diversification in and across many disciplines. Thus, it is imperative to consider some of ethnography's trademark features. To identify core trademark features of ethnography, we collated data following a scoping review of pertinent ethnographic textbooks and journal articles, attendance at ethnographic workshops, and discussions with principal ethnographers. This is a methodological paper. Essentially, ethnography is a field-orientated activity that has cultural interpretations at its core, although the levels of those interpretations vary. We identified six trademark features to be considered when embracing an ethnographic approach: naturalism; context; multiple data sources; small case numbers; 'emic' and 'etic' perspectives; and ethical considerations. Ethnography has an assortment of meanings, so it is not often used in a wholly orthodox way and does not fall under the auspices of one epistemological belief. Yet, there are core criteria and trademark features that researchers should take into account alongside their particular epistemological beliefs when embracing an ethnographic inquiry. We hope this paper promotes a clearer vision of the methodological processes to consider when embarking on ethnography and creates an avenue for others to disseminate their experiences of, and challenges encountered when, applying ethnography's trademark features in different healthcare contexts.
Energy Return on Investment - Fuel Recycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halsey, W; Simon, A J; Fratoni, M
2012-06-06
This report provides a methodology and requisite data to assess the potential Energy Return On Investment (EROI) for nuclear fuel cycle alternatives, and applies that methodology to a limited set of used fuel recycle scenarios. This paper is based on a study by Lawrence Livermore National Laboratory and a parallel evaluation by AREVA Federal Services LLC, both of which were sponsored by the DOE Fuel Cycle Technologies (FCT) Program. The focus of the LLNL effort was to develop a methodology that can be used by the FCT program for such analysis that is consistent with the broader energy modeling community, and the focus of the AREVA effort was to bring industrial experience and operational data into the analysis. This cooperative effort successfully combined expertise from the energy modeling community with expertise from the nuclear industry. Energy Return on Investment is one of many figures of merit on which investment in a new energy facility or process may be judged. EROI is the ratio of the energy delivered by a facility divided by the energy used to construct, operate and decommission that facility. While EROI is not the only criterion used to make an investment decision, it has been shown that, in technologically advanced societies, energy supplies must exceed a minimum EROI. Furthermore, technological history shows a trend towards higher-EROI energy supplies. EROI calculations have been performed for many components of energy technology: oil wells, wind turbines, photovoltaic modules, biofuels, and nuclear reactors. This report represents the first standalone EROI analysis of nuclear fuel reprocessing (or recycling) facilities.
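The EROI definition above is a simple lifecycle ratio; the sketch below illustrates it with made-up energy figures, not values from the report.

```python
def eroi(energy_delivered, energy_construct, energy_operate, energy_decommission):
    """Energy Return On Investment: lifetime energy out / lifecycle energy in."""
    invested = energy_construct + energy_operate + energy_decommission
    return energy_delivered / invested

# Illustrative lifetime figures in PJ (hypothetical, not from the report):
# a facility delivering 3000 PJ against 200 PJ total invested energy
print(eroi(3000, 40, 150, 10))  # 3000 / 200 = 15.0
```

A facility whose EROI falls below the minimum threshold the report mentions would, on this metric alone, be an unattractive energy investment regardless of its monetary cost.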
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.
1998-10-01
The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).
A Quantitative Methodology to Examine the Development of Moral Judgment
ERIC Educational Resources Information Center
Buchanan, James P.; Thompson, Spencer K.
1973-01-01
Unlike Piaget's clinical procedure, the experiment's methodology allowed substantiation of the ability of children to simultaneously weigh damage and intent information when making a moral judgment. Other advantages of this quantitative methodology are also presented. (Authors)
Methodological Issues and Practices in Qualitative Research.
ERIC Educational Resources Information Center
Bradley, Jana
1993-01-01
Discusses methodological issues concerning qualitative research and describes research practices that qualitative researchers use to address these methodological issues. Topics discussed include the researcher as interpreter, the emergent nature of qualitative research, understanding the experience of others, trustworthiness in qualitative…
Noonan, Robert J; Fairclough, Stuart J; Knowles, Zoe R; Boddy, Lynne M
2017-07-14
Understanding family physical activity (PA) behaviour is essential for designing effective family-based PA interventions. However, effective approaches to capture the perceptions and "lived experiences" of families are not yet well established. The aims of the study were to: (1) demonstrate how a "write, draw, show and tell" (WDST) methodological approach can be appropriate to family-based PA research, and (2) present two distinct family case studies to provide insights into the habitual PA behaviour and experiences of a nuclear and single-parent family. Six participants (including two "target" children aged 9-11 years, two mothers and two siblings aged 6-8 years) from two families were purposefully selected to take part in the study, based on their family structure. Participants completed a paper-based PA diary and wore an ActiGraph GT9X accelerometer on their left wrist for up to 10 weekdays and 16 weekend days. A range of WDST tasks were then undertaken by each family to offer contextual insight into their family-based PA. The selected families participated in different levels and modes of PA, and reported contrasting leisure opportunities and experiences. These novel findings encourage researchers to tailor family-based PA intervention programmes to the characteristics of the family.
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100-1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, cleaner, more fuel-efficient cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
34 CFR 644.22 - How does the Secretary evaluate prior experience?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary evaluate prior experience? 644.22 Section 644.22 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION EDUCATIONAL OPPORTUNITY CENTERS How Does the...
Design of experiments (DOE) - history, concepts, and relevance to in vitro culture
USDA-ARS?s Scientific Manuscript database
Design of experiments (DOE) is a large and well-developed field for understanding and improving the performance of complex systems. Because in vitro culture systems are complex, but easily manipulated in controlled conditions, they are particularly well-suited for the application of DOE principle...
Chue, Amanda E; Gunthert, Kathleen C; Ahrens, Anthony H; Skalina, Lauren M
2017-02-01
Research has suggested that there are benefits to socially sharing anger as an emotion regulation strategy. We hypothesized that these benefits may depend on the frequency with which one is experiencing anger. We used an experience sampling methodology to explore the interaction between frequency of anger and reliance on social expression of anger as a predictor of changes in depression symptoms 4 months later. We found that a strong reliance on social expression prospectively predicted lower depression symptoms when participants endorsed anger infrequently but predicted an increase in subsequent depression symptoms when anger was endorsed frequently. This interaction was specific to anger and did not extend to sadness or anxiety. These results highlight the importance of considering the effectiveness of emotion regulation strategies in the context of specific emotions and the frequency of the experienced emotion in everyday life. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Rioland, Guillaume; Dutournié, Patrick; Faye, Delphine; Daou, T Jean; Patarin, Joël
2016-01-01
Zeolite pellets containing 5 wt % of binder (methylcellulose or sodium metasilicate) were formed with a hydraulic press. This paper describes a mathematical model to predict the mechanical properties (uniaxial and diametric compression) of these pellets for arbitrary dimensions (height and diameter) using a design of experiments (DOE) methodology. A second-degree polynomial equation including interactions was used to approximate the experimental results. This leads to an empirical model for the estimation of the mechanical properties of zeolite pellets with 5 wt % of binder. The model was verified by additional experimental tests including pellets of different dimensions created with different applied pressures. The optimum dimensions were found to be a diameter of 10-23 mm, a height of 1-3.5 mm and an applied pressure higher than 200 MPa. These pellets are promising for technological uses in molecular decontamination for aerospace-based applications.
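The second-degree polynomial DOE model described above can be fitted by ordinary least squares. A minimal pure-Python sketch, assuming a model in pellet diameter d and height h with one interaction term; the data points and coefficients below are synthetic placeholders, not the study's measurements:

```python
# Sketch: fitting y = b0 + b1*d + b2*h + b12*d*h + b11*d^2 + b22*h^2
# to (diameter, height, response) points via the normal equations.
# Illustrative only; the paper's experimental data are not reproduced here.

def design_row(d, h):
    # One row of the design matrix for the quadratic-with-interaction model
    return [1.0, d, h, d * h, d * d, h * h]

def solve(A, y):
    """Least-squares coefficients via normal equations + Gaussian elimination."""
    n = len(A[0])
    # Normal equations M b = v, with M = A^T A and v = A^T y
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(n)]
         for i in range(n)]
    v = [sum(A[k][i] * y[k] for k in range(len(A))) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    # Back substitution
    b = [0.0] * n
    for r in range(n - 1, -1, -1):
        b[r] = (v[r] - sum(M[r][c] * b[c] for c in range(r + 1, n))) / M[r][r]
    return b
```

With responses measured over a grid of pellet dimensions, `solve` returns the six coefficients of the empirical model, which can then predict uniaxial or diametric compression strength for arbitrary dimensions.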
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-12-31
The "Environmental Management Technology Leveraging Initiative," a cooperative agreement between the Global Environment and Technology Foundation and the Department of Energy-Morgantown Energy Technology Center, has completed its second year. This program, referred to as the Global Environmental Technology Enterprise (GETE), is an experiment to bring together the public and private sectors to identify, formulate, promote and refine methods to develop more cost-effective clean-up treatments. Working closely with Department of Energy officials, National Laboratory representatives, business people, academia, community groups, and other stakeholders, this program attempts to commercialize innovative, DOE-developed technologies. The methodology to do so incorporates three elements: business assistance, information, and outreach. A key advance this year was the development of a commercialization guidance document which can be used to diagnose the commercialization level and needs of innovative technologies.
Smith, Pamela K; Hofmann, Wilhelm
2016-01-01
How does power manifest itself in everyday life? Using experience-sampling methodology, we investigated the prevalence, sources, and correlates of power in people’s natural environments. Participants experienced power-relevant situations regularly, though not frequently. High power was not restricted to a limited few: almost half of the sample reported experiencing high-power positions. Positional power and subjective feelings of power were strongly related but had unique relations with several individual difference measures and independent effects on participants’ affect, cognition, and interpersonal relations. Subjective feelings of power resulted more from within-participant situational fluctuation, such as the social roles participants held at different times, than from stable differences between people. Our data supported some theoretical predictions about power’s effects on affect, cognition, and interpersonal relations, but qualified others, particularly highlighting the role of responsibility in power’s effects. Although the power literature has focused on high power, we found stronger effects of low power than high power. PMID:27551069
Search for neutrino-generated air shower candidates with energy ≥ 10¹⁹ eV and zenith angle θ
NASA Astrophysics Data System (ADS)
Knurenko, Stanislav; Petrov, Igor; Sabourov, Artem
2017-06-01
The methodology and results of a search for air showers generated by neutral particles such as high-energy gamma quanta and astroneutrinos are presented. For this purpose, we conducted a comprehensive analysis of the data: the electron, the muon and the EAS Cherenkov light components, and their response times in scintillation and Cherenkov detectors. Air showers with energy above 5·10¹⁸ eV and zenith angle θ ≥ 55° were selected and analyzed. The search results indicate a lack of air shower events formed by gamma rays or high-energy neutrinos, but this does not mean that such air showers do not exist in nature; for example, experiments that recorded showers having a markedly low muon content, i.e., "muonless" showers, are likely to be candidates for showers produced by neutral primary particles.
Homan, J. Michael
2010-01-01
Objective: The 2009 Janet Doe Lecture reflects on the continuing value and increasing return on investment of librarian-mediated services in the constantly evolving digital ecology and complex knowledge environment of the health sciences. Setting: The interrelationship of knowledge, decision making based on knowledge, technology used to access and retrieve knowledge, and the important linkage roles of expert librarian intermediaries is examined. Methodology: Professional experiences from 1969 to 2009, occurring during a time of unprecedented changes in the digital ecology of librarianship, are the base on which the evolving role and value of librarians as knowledge coaches and expert intermediaries are examined. Conclusion: Librarian-mediated services linking knowledge and critical decision making in health care have become more valuable than ever as technology continues to reshape an increasingly complex knowledge environment. PMID:20098655
Effects of crude Dubai oil on Salmo gairdneri Rich. and Carassius auratus L
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramusino, M.C.; Dellavedova, P.; Zanzottera, D.
1984-03-01
In 1980, owing to the bursting of an oil pipeline, about 663 tons of crude Dubai oil poured into the river Po, the largest Italian river (northern Italy). The aim of this work was to try to identify, by way of laboratory tests, the lethal concentrations and subsequently calculate the concentration threshold value below which mortality does not occur (safety dose). In planning and carrying out the experiments, many methodological problems were encountered, arising from the nature of the toxicants and the lack of bibliographical references about pollution of a river ecosystem by hydrocarbons. The focal point of the problem is precisely the absence of a working method for tests with hydrocarbons. This makes the preparation of test solutions and the exposure of the animals to the toxicant important for the validity of the entire work.
Size does matter - span of control in hospitals.
Holm-Petersen, Christina; Østergaard, Sussanne; Andersen, Per Bo Noergaard
2017-04-10
Purpose Centralization, mergers and cost reductions have generally led to increasing levels of span of control (SOC), and thus potentially to lower leadership capacity. The purpose of this paper is to explore how a large SOC impacts hospital staff and their leaders. Design/methodology/approach The study is based on a qualitative explorative case study of three large inpatient wards. Findings The study finds that the nursing staff and their frontline leaders experience challenges regarding the visibility and role of the leader, e.g., in creating overview, coordinating, setting up clear goals, following up and being in touch. However, large wards also provide flexibility and development possibilities. Practical implications The authors discuss the implications of these findings for decision makers in deciding future SOC and for future SOC research. Originality/value Only a few studies have qualitatively explored the consequences of large SOC in hospitals.
Human punishment is not primarily motivated by inequality.
Marczyk, Jesse
2017-01-01
Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player's payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value.
Crozier, Sarah E; Cassell, Catherine M
2016-06-01
The use of longitudinal methodology as a means of capturing the intricacies in complex organizational phenomena is well documented, and many different research strategies for longitudinal designs have been put forward from both a qualitative and quantitative stance. This study explores a specific emergent qualitative methodology, audio diaries, and assesses their utility for work psychology research, drawing on the findings from a four-stage study addressing transient working patterns and stress in UK temporary workers. Specifically, we explore some important methodological, analytical and technical issues for practitioners and researchers who seek to use these methods and explain how this type of methodology has much to offer when studying stress and affective experiences at work. We provide support for the need to implement pluralistic and complementary methodological approaches in unearthing the depth in sense-making and assert their capacity to further illuminate the process orientation of stress. This study illustrates the importance of verbalization in documenting stress and affective experience as a mechanism for accessing cognitive processes in making sense of such experience. This study compares audio diaries with more traditional qualitative methods to assess applicability to different research contexts. This study provides practical guidance and a methodological framework for the design of audio diary research, taking into account challenges and solutions for researchers and practitioners.
Q and you: The application of Q methodology in recreation research
Whitney Ward
2010-01-01
Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...
ERIC Educational Resources Information Center
Bejan, Stelian Andrei; Janatuinen, Tero; Jurvelin, Jouni; Klöpping, Susanne; Malinen, Heikki; Minke, Bernhard; Vacareanu, Radu
2015-01-01
This paper reports on methodological approaches, experiences and expectations referring to impact analysis of quality assurance from the perspective of three higher education institutions (students, teaching staff, quality managers) from Germany, Finland and Romania. The presentations of the three sample institutions focus on discussing the core…
ERIC Educational Resources Information Center
Chae, Yoojin; Goodman, Gail S.; Bederian-Gardner, Daniel; Lindsay, Adam
2011-01-01
Scientific studies of child maltreatment victims' memory abilities and court experiences have important legal, psychological, and clinical implications. However, state-of-the-art research on child witnesses is often hindered by methodological challenges. In this paper, we address specific problems investigators may encounter when attempting such…
Using Phenomenology to Conduct Environmental Education Research: Experience and Issues
ERIC Educational Resources Information Center
Nazir, Joanne
2016-01-01
Recently, I applied a phenomenological methodology to study environmental education at an outdoor education center. In this article, I reflect on my experience of doing phenomenological research to highlight issues researchers may want to consider in using this type of methodology. The main premise of the article is that phenomenology, with its…
Resisting Coherence: Trans Men's Experiences and the Use of Grounded Theory Methods
ERIC Educational Resources Information Center
Catalano, D. Chase J.
2017-01-01
In this methodological reflective manuscript, I explore my decision to use a grounded theoretical approach to my dissertation study on trans* men in higher education. Specifically, I question whether grounded theory as a methodology is capable of capturing the complexity and capaciousness of trans*-masculine experiences. Through the lenses of…
A Methodology for the Assessment of Experiential Learning Lean: The Lean Experience Factory Study
ERIC Educational Resources Information Center
De Zan, Giovanni; De Toni, Alberto Felice; Fornasier, Andrea; Battistella, Cinzia
2015-01-01
Purpose: The purpose of this paper is to present a methodology to assess the experiential learning processes of learning lean in an innovative learning environment: the lean model factories. Design/methodology/approach: A literature review on learning and lean management literatures was carried out to design the methodology. Then, a case study…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khaleel, R.; Mehta, S.; Nichols, W. E.
This annual review provides the projected dose estimates of radionuclide inventories disposed of in the active 200 West Area Low-Level Burial Grounds (LLBGs) since September 26, 1988. These estimates are calculated using the original dose methodology developed in the performance assessment (PA) analysis (WHC-EP-0645).
The Joint Expeditionary Culture Gap
2004-05-26
and three, politically powerful lobbies (feminists, homosexuals, even the disabled) will be enraged that the force you have assembled does not look... therapy of victimhood. Present recruiting methodology stresses incentives and boasts benefits, while undermining the privilege and honor of
Expanding perspective on music therapy for symptom management in cancer care.
Potvin, Noah; Bradt, Joke; Kesslick, Amy
2015-01-01
Symptom management is a frequently researched treatment topic in music therapy and cancer care. Representations in the literature of music interventions for symptom management, however, have often overlooked the human experiences shaping those symptoms. This may result in music therapy being perceived as a linear intervention process that does not take into account underlying experiences that contribute to symptom experiences. This study explored patient experiences underlying symptoms and symptom management in cancer care, and examined the role of music therapy in that clinical process. This study analyzed semi-structured, open-ended exit interviews obtained from 30 participants during a randomized controlled trial investigating the differential impact of music therapy versus music medicine interventions on symptom management in participants with cancer. Interviews were conducted by a research assistant not involved with the clinical interventions. Exit interview transcripts for 30 participants were analyzed using an inductive, latent, constructivist method of thematic analysis. Three themes (Relaxation, Therapeutic relationship, and Intrapersonal relating) capture elements of the music therapy process that (a) modified participants' experiences of adjustments in their symptoms and (b) highlighted the depth of human experience shaping their symptoms. These underlying human experiences naturally emerged in the therapeutic setting, requiring the music therapist's clinical expertise for appropriate support. Symptom management extends beyond fluctuation in levels and intensity of a surface-level symptom to incorporate deeper lived experiences. The authors provide recommendations for clinical work, entry-level training as related to symptom management, implications for evidence-based practice in music therapy, and methodology for future mixed methods research. © the American Music Therapy Association 2015. All rights reserved.
NASA Astrophysics Data System (ADS)
Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa
2013-01-01
The objective of this study was to evaluate the adsorption capacity of toluene and hexane over activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio and the activation time. Response surface methodology was applied to optimize the adsorption capacity of the carbons with respect to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation methodology produced activated carbons with surface areas and micropore volumes as high as 1128 m²/g and 0.52 cm³/g, respectively. Moreover, the activated carbons exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR and acid-base titration; the techniques gave different values for the surface groups because of the limitations of each technique, but similar trends were obtained for the activated carbons studied. The exhaustive characterization of the activated carbons allows us to state that the measured surface area does not explain the adsorption capacity for either toluene or n-hexane. On the other hand, the surface chemistry does not explain the adsorption results either. A compromise between physical and chemical characteristics can be obtained from the appropriate activation conditions, and the response surface methodology gives the optimal activated carbon to maximize adsorption capacity. A low activation temperature and an intermediate impregnation ratio lead to high toluene and n-hexane adsorption capacities depending on the activation time, which is a determining factor in maximizing toluene adsorption.
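Once the response surface has been fitted, the optimization step described above amounts to evaluating the model over the design space. A minimal sketch, assuming a quadratic surface in activation temperature T, impregnation ratio r and activation time t; the coefficients and grid values are hypothetical placeholders, not the study's fitted model:

```python
# Sketch of locating the optimum of a fitted response surface by grid
# evaluation. Coefficient vector b and grids are illustrative only.

def fitted_capacity(T, r, t, b):
    # y = b0 + b1*T + b2*r + b3*t + b11*T^2 + b22*r^2 + b33*t^2 + b12*T*r
    return (b[0] + b[1] * T + b[2] * r + b[3] * t
            + b[4] * T * T + b[5] * r * r + b[6] * t * t + b[7] * T * r)

def grid_optimum(b, Ts, rs, ts):
    """Return (best_y, T, r, t) maximizing the fitted surface on the grid."""
    best = None
    for T in Ts:
        for r in rs:
            for t in ts:
                y = fitted_capacity(T, r, t, b)
                if best is None or y > best[0]:
                    best = (y, T, r, t)
    return best
```

In practice the maximizer would be compared against a confirmation experiment at the predicted optimum, as is standard in response surface methodology.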
Chitosan production by psychrotolerant Rhizopus oryzae in non-sterile open fermentation conditions.
Tasar, Ozden Canli; Erdal, Serkan; Taskin, Mesut
2016-08-01
A new chitosan-producing fungus was locally isolated from soil samples collected around Erzurum, Turkey and identified as Rhizopus oryzae PAS 17 (GenBank accession number KU318422.1). Cultivation in low-cost non-sterile conditions was achieved by exploiting its ability to grow at low temperature and pH; thus, undesired microbial contamination could be eliminated when appropriate culture conditions (incubation temperature of 15°C and initial medium pH of 4.5) were selected. Medium composition and culture conditions were optimized using a Taguchi orthogonal array (OA) design of experiment (DOE). An OA layout of L16 (4⁵) was constructed with the five factors most influential on chitosan production, each at four levels: carbon source (molasses), metal ion (Mg²⁺), inoculum amount, agitation speed and incubation time. The optimal combination of factors (molasses, 70 ml/l; MgSO4·7H2O, 0.5 g/l; inoculum, 6.7×10⁶ spores/disc; agitation speed, 150 rpm; and incubation time, 8 days) obtained from the proposed DOE methodology was further validated by an analysis of variance (ANOVA) test, and the results revealed 14.45-fold and 8.58-fold increases in chitosan and biomass yields, respectively, over the unoptimized condition. Copyright © 2016 Elsevier B.V. All rights reserved.
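The L16 (4⁵) layout used above is an orthogonal array accommodating five four-level factors in 16 runs. A minimal sketch of one standard construction over GF(4); the column ordering and factor assignment are arbitrary, and this illustrates the array type rather than the study's exact worksheet:

```python
# Constructing an OA(16, 5, 4, 2) orthogonal array (Taguchi L16 with
# five 4-level columns) from two base columns over GF(4).

def gf4_mul(a, b):
    # Multiplication table for GF(4) with elements {0, 1, 2, 3}
    table = [[0, 0, 0, 0],
             [0, 1, 2, 3],
             [0, 2, 3, 1],
             [0, 3, 1, 2]]
    return table[a][b]

def l16_oa():
    """16 x 5 array; columns are a, b, a+b, a+2b, a+3b over GF(4).

    Addition in GF(4) is bitwise XOR of the 2-bit element codes.
    """
    rows = []
    for a in range(4):
        for b in range(4):
            rows.append([a, b,
                         a ^ b,
                         a ^ gf4_mul(2, b),
                         a ^ gf4_mul(3, b)])
    return rows
```

Any pair of columns contains each of the 16 level combinations exactly once (strength 2), which is what allows each factor's main effect to be estimated from run averages with only 16 experiments.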
Multi-disciplinary optimization of railway wheels
NASA Astrophysics Data System (ADS)
Nielsen, J. C. O.; Fredö, C. R.
2006-06-01
A numerical procedure for multi-disciplinary optimization of railway wheels, based on Design of Experiments (DOE) methodology and automated design, is presented. The target is a wheel design that meets the requirements for fatigue strength, while minimizing the unsprung mass and rolling noise. A 3-level full factorial (3LFF) DOE is used to collect data points required to set up Response Surface Models (RSM) relating design and response variables in the design space. Computationally efficient simulations are thereafter performed using the RSM to identify the solution that best fits the design target. A demonstration example, including four geometric design variables in a parametric finite element (FE) model, is presented. The design variables are wheel radius, web thickness, lateral offset between rim and hub, and radii at the transitions rim/web and hub/web, but more variables (including material properties) can be added if needed. To improve further the performance of the wheel design, a constrained layer damping (CLD) treatment is applied on the web. For a given load case, compared to a reference wheel design without CLD, a combination of wheel shape and damping optimization leads to the conclusion that a reduction in the wheel component of A-weighted rolling noise of 11 dB can be achieved if a simultaneous increase in wheel mass of 14 kg is accepted.
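The 3-level full factorial (3LFF) design mentioned above simply enumerates all 3⁴ = 81 combinations of low/centre/high levels for the four design variables. A minimal sketch; the factor names follow the abstract, but the numeric levels are placeholders, not the paper's values:

```python
import itertools

# Illustrative levels for the four geometric design variables named in the
# abstract; the numbers are hypothetical, not the study's actual ranges.
factors = {
    "wheel_radius_mm":      (390.0, 420.0, 450.0),
    "web_thickness_mm":     (20.0, 25.0, 30.0),
    "rim_hub_offset_mm":    (0.0, 15.0, 30.0),
    "transition_radius_mm": (10.0, 20.0, 30.0),
}

def full_factorial(levels_by_factor):
    """All level combinations: 3^4 = 81 runs for a 3LFF with 4 factors."""
    names = list(levels_by_factor)
    return [dict(zip(names, combo))
            for combo in itertools.product(*levels_by_factor.values())]

runs = full_factorial(factors)
```

Each of the 81 runs would then be evaluated (in the paper, via the parametric FE model) and the responses regressed into the response surface models used for optimization.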
Developments in Sensitivity Methodologies and the Validation of Reactor Physics Calculations
Palmiotti, Giuseppe; Salvatores, Massimo
2012-01-01
Sensitivity methodologies have a remarkable record of success in the reactor physics field. Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. A review of the methods used is provided, and several examples illustrate the success of the methodology in reactor physics. A new application, the improvement of basic nuclear parameters using integral experiments, is also described.
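As a toy illustration of the sensitivity-coefficient workflow described above (not the paper's method or data): relative sensitivities S_i = (∂R/∂p_i)(p_i/R) computed by central finite differences for a hypothetical response R(p), and the usual "sandwich" propagation of relative parameter covariance to the response uncertainty:

```python
import math

def response(p):
    # Toy response depending on two input parameters (hypothetical)
    return p[0] ** 2 * math.exp(-p[1])

def rel_sensitivities(f, p, h=1e-6):
    """S_i = (dR/dp_i) * (p_i / R), by central finite differences."""
    R = f(p)
    S = []
    for i in range(len(p)):
        up = list(p); up[i] += h * p[i]
        dn = list(p); dn[i] -= h * p[i]
        dRdp = (f(up) - f(dn)) / (2 * h * p[i])
        S.append(dRdp * p[i] / R)
    return S

def rel_variance(S, cov_rel):
    """(dR/R)^2 = S . C_rel . S^T for a relative covariance matrix C_rel."""
    n = len(S)
    return sum(S[i] * cov_rel[i][j] * S[j]
               for i in range(n) for j in range(n))
```

The same coefficients drive the other uses the abstract lists: target accuracy requirements invert the sandwich formula, and parameter adjustment weights integral-experiment discrepancies by these sensitivities.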
The methodological defense of realism scrutinized.
Wray, K Brad
2015-12-01
I revisit an older defense of scientific realism, the methodological defense, a defense developed by both Popper and Feyerabend. The methodological defense of realism concerns the attitude of scientists, not philosophers of science. The methodological defense is as follows: a commitment to realism leads scientists to pursue the truth, which in turn is apt to put them in a better position to get at the truth. In contrast, anti-realists lack the tenacity required to develop a theory to its fullest. As a consequence, they are less likely to get at the truth. My aim is to show that the methodological defense is flawed. I argue that a commitment to realism does not always benefit science, and that there is reason to believe that a research community with both realists and anti-realists in it may be better suited to advancing science. A case study of the Copernican Revolution in astronomy supports this claim. Copyright © 2015 Elsevier Ltd. All rights reserved.
Appleton, Jane V; King, Lindy
2002-12-01
This paper presents our journey through a contemplation of the philosophical origins of constructivism to consider its role as an active methodology in qualitative research. The first part of the paper summarizes the philosophical background of constructivism and the five principles underpinning this paradigm as described through the works of Guba and Lincoln. The philosophical roots of constructivism are then compared with postpositivism, critical realism and participatory inquiry. The paper moves on to consider their common methodological steps, before examining how the constructivist research strategy is being adopted and adapted within the pragmatics of health service research. Recent studies will be drawn upon to illustrate the use of constructivist methodology. Questions are raised about the role of philosophy and the extent to which it should or does underpin or influence qualitative research strategies. We believe that if researchers gain an understanding of both philosophy and methodology a richer and more robust study is likely to result.
Estimating Agricultural Nitrous Oxide Emissions
USDA-ARS?s Scientific Manuscript database
Nitrous oxide emissions are highly variable in space and time and different methodologies have not agreed closely, especially at small scales. However, as scale increases, so does the agreement between estimates based on soil surface measurements (bottom up approach) and estimates derived from chang...
Daycare Staff Emotions and Coping Related to Children of Divorce: A Q Methodological Study
ERIC Educational Resources Information Center
Øverland, Klara; Størksen, Ingunn; Bru, Edvin; Thorsen, Arlene Arstad
2014-01-01
This Q methodological study explores emotional experiences and coping of daycare staff when working with children of divorce and their families. Two main coping strategies among daycare staff were identified: 1) Confident copers, and 2) Non-confident copers. Interviews exemplify the two main experiences. Both groups may struggle with coping in…
Second Life in the Library: An Empirical Study of New Users' Experiences
ERIC Educational Resources Information Center
Clarke, Christopher Peter
2012-01-01
Purpose: This paper aims to examine the experiences of new users of Second Life in order to identify potential barriers and attractors to the expansion of the userbase and therefore the market for in-world information services. Design/methodology/approach: A multi-faceted methodological approach was taken utilising two questionnaires (pre- and…
Feeling to See: Black Graduate Student Women (Re)Membering Black Womanhood through Study Abroad
ERIC Educational Resources Information Center
Green, Qiana
2017-01-01
This qualitative research study illuminates the lived experiences of Black graduate student women who study abroad. I provide insights on how these students made meaning of themselves through study abroad. I utilized sista circle methodology, a culturally responsive methodology, to examine the study abroad experiences of 23 Black graduate student…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, D; Vile, D; Rosu, M
Purpose: To assess the correct implementation of the risk-based methodology of TG 100 to optimize quality management and patient safety procedures for stereotactic body radiation therapy (SBRT). Methods: A detailed process map of the SBRT treatment procedure was generated by a team of three physicists with varying clinical experience at our institution to assess the potential high-risk failure modes. The probabilities of occurrence (O), severity (S) and detectability (D) for each potential failure mode in each step of the process map were assigned by these individuals independently on a scale from 1 to 10. The risk priority numbers (RPN) were computed and analyzed. The 30 highest-ranked potential failure modes from each physicist's analysis were then compared. Results: The RPN values assessed by the three physicists ranged from 30 to 300. The magnitudes of the RPN values from each physicist were different, and there was no concordance in the highest RPN values recorded by the three physicists independently. The 10 highest RPN values belonged to sub-steps of CT simulation, contouring and delivery in the SBRT process map. For these 10 highest RPN values, at least two physicists, irrespective of their length of experience, had concordance, but no general conclusions emerged. Conclusion: This study clearly shows that risk-based assessment of a clinical process map requires a great deal of preparation, group discussions, and participation by all stakeholders. A single group, even one of physicists, cannot effectively implement the risk-based methodology proposed by TG 100; it should be a team effort, in which physicists can certainly play the leading role. This also corroborates the TG 100 recommendation that risk-based assessment of clinical processes is a multidisciplinary team effort.
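The risk metric analyzed above is the TG-100 risk priority number, RPN = O × S × D, with each component scored 1-10. A minimal sketch of the ranking step; the failure modes and scores below are illustrative, not the study's actual process map:

```python
# Sketch of failure-mode ranking by risk priority number (RPN), per the
# TG-100 FMEA approach. Failure modes and (O, S, D) scores are hypothetical.

failure_modes = {
    "wrong CT protocol selected":      (3, 8, 5),
    "target contour misses lesion":    (4, 9, 6),
    "couch shift entered incorrectly": (2, 10, 4),
}

def rpn(o, s, d):
    # Occurrence x Severity x Detectability, each on a 1-10 scale
    return o * s * d

# Highest-risk modes first; these are the candidates for added QA checks
ranked = sorted(failure_modes.items(),
                key=lambda kv: rpn(*kv[1]), reverse=True)
```

The study's observation that independent scorers produce discordant RPN rankings is exactly why the sorting step above should be fed by consensus (O, S, D) scores from a multidisciplinary team rather than a single assessor.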
Becoming ecological citizens: connecting people through performance art, food matter and practices
Roe, Emma; Buser, Michael
2016-01-01
Engaging the interest of Western citizens in the complex food connections that shape their own and others’ personal wellbeing around issues such as food security and access is challenging. This article is critical of the food marketplace as the site for informing consumer behaviour and argues instead for arts-based participatory activities to support the performance of ecological citizens in non-commercial spaces. Following the ongoing methodological and conceptual fascination with performance, matter and practice in cultural food studies, we outline what the ecological citizen, formed through food’s agentive potential, does and could do. This is an ecological citizen, defined not in its traditional relation to the state but rather to the world of humans and non-humans whose lives are materially interconnected through nourishment. The article draws on the theories of Berlant, Latour, Bennett and Massumi. Our methodology is a collaborative arts-led research project that explored and juxtaposed diverse food practices with artist Paul Hurley, researchers, community partners, volunteers and participants in Bristol, UK. It centred on a 10-day exhibition where visitors were exposed to a series of interactive explorations with and about food. Our experience leads us to outline two steps for enacting ecological citizenship. The first step is to facilitate sensory experiences that enable the agential qualities of foodstuffs to shape knowledge making. The second is to create a space where people can perform, or relate differently, in unusual manners to food. Through participating in the project and visiting the exhibition, people were invited to respond not only as ‘ethical consumers’ but also as ‘ecological citizens’. This participatory approach to research can contribute to understandings of human-world entanglements. PMID:29708123
Narrative inquiry: Locating Aboriginal epistemology in a relational methodology.
Barton, Sylvia S
2004-03-01
This methodology utilizes narrative analysis and the elicitation of life stories as understood through dimensions of interaction, continuity, and situation. It is congruent with Aboriginal epistemology formulated by oral narratives through representation, connection, storytelling and art. Needed for culturally competent scholarship is an experience of research whereby inquiry into epiphanies, ritual, routines, metaphors and everyday experience creates a process of reflexive thinking for multiple ways of knowing. Based on the sharing of perspectives, narrative inquiry allows for experimentation into creating new forms of knowledge by contextualizing diabetes from the experience of a researcher overlapped with experiences of participants--a reflective practice in itself. The aim of this paper is to present narrative inquiry as a relational methodology and to analyse critically its appropriateness as an innovative research approach for exploring Aboriginal people's experience living with diabetes. Narrative inquiry represents an alternative culture of research for nursing science to generate understanding and explanation of Aboriginal people's 'diabetic self' stories, and to coax open a window for co-constructing a narrative about diabetes as a chronic illness. The ability to adapt a methodology for use in a cultural context, preserve the perspectives of Aboriginal peoples, maintain the holistic nature of social problems, and value co-participation in respectful ways are strengths of an inquiry partial to a responsive and embodied scholarship.
Social Networking Sites' Influence on Travelers' Authentic Experience a Case Study of Couch Surfing
ERIC Educational Resources Information Center
Liu, Xiao
2013-01-01
This study explored travelers' experiences in the era of network hospitality 2.0 using CouchSurfing.org as a case study. The following research questions guided this study: 1) what experience does CouchSurfing create for travelers before, during and after their travel? 2) how does couch surfers' experience relate to authenticity in context of…
Waste Package Component Design Methodology Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.C. Mecham
2004-07-12
This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding a variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested in the design methods at various levels of detail, and the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other less detailed parts are likely to undergo further refinement.
The design methodology is intended to provide designs that satisfy the safety and operational requirements of the YMP. Four waste package configurations have been selected to illustrate the application of the methodology during the licensing process. These four configurations are the 21-pressurized water reactor absorber plate waste package (21-PWRAP), the 44-boiling water reactor waste package (44-BWR), the 5 defense high-level radioactive waste (HLW) DOE spent nuclear fuel (SNF) codisposal short waste package (5-DHLWDOE SNF Short), and the naval canistered SNF long waste package (Naval SNF Long). Design work for the other six waste packages will be completed at a later date using the same design methodology. These include the 24-boiling water reactor waste package (24-BWR), the 21-pressurized water reactor control rod waste package (21-PWRCR), the 12-pressurized water reactor waste package (12-PWR), the 5 defense HLW DOE SNF codisposal long waste package (5-DHLWDOE SNF Long), the 2 defense HLW DOE SNF codisposal waste package (2-MC012-DHLW), and the naval canistered SNF short waste package (Naval SNF Short). This report is only part of the complete design description. Other reports related to the design include the design reports, the waste package system description documents, manufacturing specifications, and numerous documents for the many detailed calculations. The relationships between this report and other design documents are shown in Figure 1.
The use of experimental design to find the operating maximum power point of PEM fuel cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crăciunescu, Aurelian; Pătularu, Laurenţiu; Ciumbulea, Gloria
2015-03-10
Proton Exchange Membrane (PEM) Fuel Cells are difficult to model due to their complex nonlinear nature. In this paper, the development of a PEM Fuel Cell mathematical model based on the Design of Experiments methodology is described. The Design of Experiments provides a very efficient methodology for obtaining a mathematical model of the studied multivariable system with only a few experiments. The obtained results can be used for optimization and control of PEM Fuel Cell systems.
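The core idea, getting a usable model of a multivariable system from only a few runs, can be sketched with a two-level full-factorial design: 2^k runs suffice to estimate a first-order model with interactions via contrast averages. The factor count and response values below are invented for illustration, not the paper's fuel-cell data.

```python
# Two-level full factorial design for k = 2 factors in coded units (-1/+1).
# With 2**k runs, the main effects and the interaction are estimated as
# contrast averages. The responses are illustrative, not the paper's data.
from itertools import product

runs = list(product([-1, 1], repeat=2))      # [(-1,-1), (-1,1), (1,-1), (1,1)]
response = [10.0, 14.0, 12.0, 18.0]          # one measured output per run

n = len(runs)
b0 = sum(response) / n                                             # grand mean
b1 = sum(x1 * y for (x1, _), y in zip(runs, response)) / n         # factor 1
b2 = sum(x2 * y for (_, x2), y in zip(runs, response)) / n         # factor 2
b12 = sum(x1 * x2 * y for (x1, x2), y in zip(runs, response)) / n  # interaction

def model(x1, x2):
    """Fitted first-order model with interaction, in coded units."""
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2

print(b0, b1, b2, b12)
```

With all 2^k runs used, the fitted model reproduces the observed responses exactly at the design points; its value lies in predicting and optimizing between them.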
22 CFR 126.5 - Canadian exemptions.
Code of Federal Regulations, 2011 CFR
2011-04-01
... such persons publicly available through the Internet Web site of the Directorate of Defense Trade... that a foreign consignee can produce a defense article from engineering drawings without any technical...); and does not include (v) Design methodology, such as: The underlying engineering methods and design...
34 CFR 428.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2013 CFR
2013-07-01
... individuals enrolled in vocational education programs, and how those needs should influence teaching strategies and program design. (ii) Understanding of bilingual vocational training methodologies. (iii... points) The Secretary reviews each application for an effective plan of management that ensures proper...
34 CFR 428.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2014 CFR
2014-07-01
... individuals enrolled in vocational education programs, and how those needs should influence teaching strategies and program design. (ii) Understanding of bilingual vocational training methodologies. (iii... points) The Secretary reviews each application for an effective plan of management that ensures proper...
Asteroids: Does Space Weathering Matter?
NASA Technical Reports Server (NTRS)
Gaffey, Michael J.
2001-01-01
The interpretive calibrations and methodologies used to extract mineralogy from asteroidal spectra appear to remain valid until the space weathering process is advanced to a degree which appears to be rare or absent on asteroid surfaces. Additional information is contained in the original extended abstract.
48 CFR 970.1100-1 - Performance-based contracting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... SUPPLEMENTARY REGULATIONS DOE MANAGEMENT AND OPERATING CONTRACTS Describing Agency Needs 970.1100-1 Performance... practicable, performance-based contracting methods in its management and operating contracts. The Office of...-based contracting concepts and methodologies that may be generally applied to management and operating...
Profilometric characterization of DOEs with continuous microrelief
NASA Astrophysics Data System (ADS)
Korolkov, V. P.; Ostapenko, S. V.; Shimansky, R. V.
2008-09-01
The methodology of local characterization of continuous-relief diffractive optical elements is discussed. The local profile depth can be evaluated using an "approximated depth" defined without taking the profile near diffractive zone boundaries into account. Several methods to estimate the approximated depth are offered.
Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.F.A. Deng; M. Saglam; L.J. Gratton
2001-05-23
In accordance with the technical work plan, ''Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages'' (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.
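The probability-evaluation idea, estimating how often a degraded configuration's k_eff exceeds the Critical Limit when degradation parameters vary over their realizable ranges, can be sketched as a Monte Carlo screen. The linear k_eff surrogate, the parameter names, and all numbers below are hypothetical stand-ins, not the report's criticality model.

```python
# Monte Carlo screening sketch: estimate P(k_eff > CL) by sampling assumed
# degradation-parameter ranges. The k_eff surrogate and all numbers here
# are hypothetical illustrations, not the report's model or data.
import random

random.seed(42)
CL = 0.93  # critical limit (illustrative value)

def k_eff(water_fraction, absorber_loss):
    # Hypothetical surrogate: moderation raises k_eff, absorber loss
    # removes negative reactivity. A real analysis would use a transport code.
    return 0.80 + 0.15 * water_fraction + 0.10 * absorber_loss

trials = 100_000
exceed = 0
for _ in range(trials):
    w = random.uniform(0.0, 1.0)   # degraded-configuration water fraction
    a = random.uniform(0.0, 1.0)   # fraction of neutron absorber lost
    if k_eff(w, a) > CL:
        exceed += 1

p = exceed / trials
print(f"estimated P(k_eff > CL) = {p:.3f}")
```

A configuration family whose estimated exceedance probability falls below a screening threshold would be argued out of further criticality analysis; one above it would be retained.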
NASA Technical Reports Server (NTRS)
Parsons, David S.; Ordway, David; Johnson, Kenneth
2013-01-01
This experimental study seeks to quantify the impact various composite parameters have on the structural response of a composite structure in a pyroshock environment. The prediction of an aerospace structure's response to pyroshock induced loading is largely dependent on empirical databases created from collections of development and flight test data. While there is significant structural response data due to pyroshock induced loading for metallic structures, there is much less data available for composite structures. One challenge of developing a composite pyroshock response database as well as empirical prediction methods for composite structures is the large number of parameters associated with composite materials. This experimental study uses data from a test series planned using design of experiments (DOE) methods. Statistical analysis methods are then used to identify which composite material parameters most greatly influence a flat composite panel's structural response to pyroshock induced loading. The parameters considered are panel thickness, type of ply, ply orientation, and pyroshock level induced into the panel. The results of this test will aid in future large scale testing by eliminating insignificant parameters as well as aid in the development of empirical scaling methods for composite structures' response to pyroshock induced loading.
NASA Astrophysics Data System (ADS)
Nouri, N. M.; Mostafapour, K.; Kamran, M.
2018-02-01
In a closed water-tunnel circuit, multi-component strain gauge force and moment sensors (also known as balances) are generally used to measure hydrodynamic forces and moments acting on scaled models. These balances are periodically calibrated by static loading. Their performance and accuracy depend significantly on the rig and the method of calibration. In this research, a new calibration rig was designed and constructed to calibrate multi-component internal strain gauge balances. The calibration rig has six degrees of freedom and six different component-loading structures that can be applied separately and synchronously. The system was designed based on the applicability of formal experimental design techniques, using gravity for balance loading and for balance positioning and alignment relative to gravity. To evaluate the calibration rig, a six-component internal balance developed by Iran University of Science and Technology was calibrated using response surface methodology. According to the results, the calibration rig met all design criteria. This rig provides the means by which various formal experimental design techniques can be implemented. The simplicity of the rig saves time and money in the design of experiments and in balance calibration while simultaneously increasing the accuracy of these activities.
Selby, Edward A; Nock, Matthew K; Kranzler, Amy
2014-02-28
One of the most frequently reported, yet understudied, motivations for non-suicidal self-injury (NSSI) involves automatic positive reinforcement (APR), wherein sensations arising from NSSI reinforce and promote the behavior. The current study used experience sampling methodology with a clinical sample of self-injuring adolescents (N=30) over a 2-week period, during which the adolescents reported NSSI behaviors and rated whether an APR motivation was present and, if so, whether that motivation pertained to feeling "pain," "stimulation," or "satisfaction." Over 50% of the sample reported at least one instance of NSSI for APR reasons. No significant differences were found on demographic factors or psychiatric comorbidity for those with and without an APR motivation. However, those with an APR motivation reported elevated NSSI thoughts, longer duration of those thoughts, and more NSSI behaviors. They also reported more alcohol use thoughts, alcohol use, impulsive spending, and binge eating. The most commonly reported sensation following NSSI for APR was "satisfaction"; however, those endorsing feeling pain reported the most NSSI behaviors. These findings provide new information about the APR motivations for NSSI and shed light on the different sensations felt. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Assimilation of pseudo-tree-ring-width observations into an atmospheric general circulation model
NASA Astrophysics Data System (ADS)
Acevedo, Walter; Fallah, Bijan; Reich, Sebastian; Cubasch, Ulrich
2017-05-01
Paleoclimate data assimilation (DA) is a promising technique to systematically combine the information from climate model simulations and proxy records. Here, we investigate the assimilation of tree-ring-width (TRW) chronologies into an atmospheric global climate model using ensemble Kalman filter (EnKF) techniques and a process-based tree-growth forward model as an observation operator. Our results, within a perfect-model experiment setting, indicate that the "online DA" approach did not outperform the "off-line" one, despite its considerable additional implementation complexity. On the other hand, it was observed that the nonlinear response of tree growth to surface temperature and soil moisture degrades the performance of the time-averaged EnKF methodology. Moreover, for the first time we show that this skill loss appears significantly sensitive to the structure of the growth rate function, used to represent the principle of limiting factors (PLF) within the forward model. In general, our experiments showed that the error reduction achieved by assimilating pseudo-TRW chronologies is modulated by the magnitude of the yearly internal variability in the model. This result might help the dendrochronology community to optimize their sampling efforts.
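The EnKF analysis step at the heart of such experiments can be sketched for a scalar state with an identity observation operator; in the paper's setting the operator would instead be the process-based tree-growth forward model. The ensemble values, observation, and error variance below are illustrative only.

```python
# Minimal stochastic (perturbed-observations) EnKF analysis step for a
# scalar state with an identity observation operator. In the paper's setup
# the operator is a tree-growth forward model. Numbers are illustrative.
import random

random.seed(0)

ensemble = [14.2, 15.1, 13.8, 15.6, 14.9]   # prior (forecast) ensemble
obs, obs_var = 16.0, 0.5                    # observation and its error variance

n = len(ensemble)
mean_f = sum(ensemble) / n
var_f = sum((x - mean_f) ** 2 for x in ensemble) / (n - 1)

# Kalman gain for an identity observation operator.
gain = var_f / (var_f + obs_var)

# Each member assimilates its own perturbed copy of the observation.
analysis = [x + gain * (obs + random.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

mean_a = sum(analysis) / n
print(f"forecast mean {mean_f:.2f} -> analysis mean {mean_a:.2f}, "
      f"gain {gain:.2f}")
```

The update pulls the ensemble mean toward the observation by an amount set by the gain, which is exactly where a strongly nonlinear observation operator (such as a growth model with limiting factors) can degrade the time-averaged update.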
1986-01-01
Research Methodology: This chapter describes the methodology and the experimental design used for this research, covering the experimental design, task/treatment, and task design; its figures present the interface experiment elements, the experimental design, and the subject assignment.
Investigation of Statistical Inference Methodologies Through Scale Model Propagation Experiments
2015-09-30
statistical inference methodologies for ocean-acoustic problems by investigating and applying statistical methods to data collected from scale-model...to begin planning experiments for statistical inference applications. APPROACH In the ocean acoustics community over the past two decades...solutions for waveguide parameters. With the introduction of statistical inference to the field of ocean acoustics came the desire to interpret marginal
Menopause and Methodological Doubt
ERIC Educational Resources Information Center
Spence, Sheila
2005-01-01
Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…
Residential Two-Stage Gas Furnaces - Do They Save Energy?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lekov, Alex; Franco, Victor; Lutz, James
2006-05-12
Residential two-stage gas furnaces account for almost a quarter of the total number of models listed in the March 2005 GAMA directory of equipment certified for sale in the United States. Two-stage furnaces are expanding their presence in the market mostly because they meet consumer expectations for improved comfort. Currently, the U.S. Department of Energy (DOE) test procedure serves as the method for reporting furnace total fuel and electricity consumption under laboratory conditions. In 2006, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) proposed an update to its test procedure which corrects some of the discrepancies found in the DOE test procedure and provides an improved methodology for calculating the energy consumption of two-stage furnaces. The objectives of this paper are to explore the differences in the methods for calculating two-stage residential gas furnace energy consumption in the DOE test procedure and in the 2006 ASHRAE test procedure, and to compare test results to research results from field tests. Overall, the DOE test procedure shows a reduction in the total site energy consumption of about 3 percent for two-stage compared to single-stage furnaces at the same efficiency level. In contrast, the 2006 ASHRAE test procedure shows almost no difference in the total site energy consumption. The 2006 ASHRAE test procedure appears to provide a better methodology for calculating the energy consumption of two-stage furnaces. The results indicate that, although two-stage technology by itself does not save site energy, the combination of two-stage furnaces with BPM motors provides electricity savings, which are confirmed by field studies.
NASA Technical Reports Server (NTRS)
Buchner, S.; LaBel, K.; Barth, J.; Campbell, A.
2005-01-01
Space experiments are occasionally launched to study the effects of radiation on electronic and photonic devices. This raises the following questions: Are space experiments necessary? Do the costs justify the benefits? How does one judge the success of a space experiment? What have we learned from past space experiments? How does one design a space experiment? This viewgraph presentation provides information on the usefulness of space and ground tests for simulating radiation damage to spacecraft components.
Negotiating Parenthood: Experiences of Economic Hardship among Parents with Cognitive Difficulties
ERIC Educational Resources Information Center
Fernqvist, Stina
2015-01-01
People with cognitive difficulties often have scarce economic resources, and parents with cognitive difficulties are no exception. In this article, parents' experiences are put forth and discussed, for example, how does economic hardship affect family life? How do the parents experience support, what kind of strain does the scarce economy put on…
Shadows Alter Facial Expressions of Noh Masks
Kawai, Nobuyuki; Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo
2013-01-01
Background A Noh mask, worn by expert actors during performance on the Japanese traditional Noh drama, conveys various emotional expressions despite its fixed physical properties. How does the mask change its expressions? Shadows change subtly during the actual Noh drama, which plays a key role in creating elusive artistic enchantment. We here describe evidence from two experiments regarding how attached shadows of the Noh masks influence the observers’ recognition of the emotional expressions. Methodology/Principal Findings In Experiment 1, neutral-faced Noh masks having the attached shadows of the happy/sad masks were recognized as bearing happy/sad expressions, respectively. This was true for all four types of masks each of which represented a character differing in sex and age, even though the original characteristics of the masks also greatly influenced the evaluation of emotions. Experiment 2 further revealed that frontal Noh mask images having shadows of upward/downward tilted masks were evaluated as sad/happy, respectively. This was consistent with outcomes from preceding studies using actually tilted Noh mask images. Conclusions/Significance Results from the two experiments concur that purely manipulating attached shadows of the different types of Noh masks significantly alters the emotion recognition. These findings go in line with the mysterious facial expressions observed in Western paintings, such as the elusive qualities of Mona Lisa’s smile. They also agree with the aesthetic principle of Japanese traditional art “yugen (profound grace and subtlety)”, which highly appreciates subtle emotional expressions in the darkness. PMID:23940748
[Living with a chronic progressive form of multiple sclerosis--a balance act].
Hellige, Barbara
2002-12-01
There are only a few studies available in the German-speaking countries which address the experience of chronically ill people during the course of their illness. Therefore, the intent of the present study was to find answers to central nursing science questions: How do people who suffer from a chronically progressive course of multiple sclerosis experience the downward trajectory? Which strategies do they develop in order to integrate the disease into their lives? In what ways does the scientific paradigm of medicine influence their perceptions of the illness and their life with it? An analysis of the national and international state of nursing research is followed by a description of the methodology of Grounded Theory and the individual examination steps of the study. The second part examines some essentials of the study. The results show that the trajectory is characterized by four phases. At first, the ill person concentrates on the social discussion of the pathogenesis and the medical paradigm. But with physical experiences, exchange with other affected people, and acquired information, those concerned develop very case-specific knowledge. Often disillusioned by the medication strategies, and based on the experience that their subjectivity and their individuality are not taken into account, they increasingly disapprove of the deficit-oriented perspective of the professionals. They develop distinctly self-caring potentialities. Identifying and supporting these self-care potentialities should be the very core of nursing.
Translating standards into practice - one Semantic Web API for Gene Expression.
Deus, Helena F; Prud'hommeaux, Eric; Miller, Michael; Zhao, Jun; Malone, James; Adamusiak, Tomasz; McCusker, Jim; Das, Sudeshna; Rocca Serra, Philippe; Fox, Ronan; Marshall, M Scott
2012-08-01
Sharing and describing experimental results unambiguously with sufficient detail to enable replication of results is a fundamental tenet of scientific research. In today's cluttered world of "-omics" sciences, data standards and standardized use of terminologies and ontologies for biomedical informatics play an important role in reporting high-throughput experiment results in formats that can be interpreted by both researchers and analytical tools. Increasing adoption of Semantic Web and Linked Data technologies for the integration of heterogeneous and distributed health care and life sciences (HCLS) datasets has made the reuse of standards even more pressing; dynamic semantic query federation can be used for integrative bioinformatics when ontologies and identifiers are reused across data instances. We present here a methodology to integrate the results and experimental context of three different representations of microarray-based transcriptomic experiments: the Gene Expression Atlas, the W3C BioRDF task force approach to reporting Provenance of Microarray Experiments, and the HSCI blood genomics project. Our approach does not attempt to improve the expressivity of existing standards for genomics but, instead, to enable integration of existing datasets published from microarray-based transcriptomic experiments. SPARQL Construct is used to create a posteriori mappings of concepts and properties and linking rules that match entities based on query constraints. We discuss how our integrative approach can encourage reuse of the Experimental Factor Ontology (EFO) and the Ontology for Biomedical Investigations (OBI) for the reporting of experimental context and results of gene expression studies. Copyright © 2012 Elsevier Inc. All rights reserved.
Systematic Approach to Better Understanding Integration Costs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stark, Gregory B.
2015-09-01
This research presents a systematic approach to evaluating the costs of integrating new generation and operational procedures into an existing power system, and the methodology is independent of the type of change or nature of the generation. The work was commissioned by the U.S. Department of Energy and performed by the National Renewable Energy Laboratory to investigate three integration cost-related questions: (1) How does the addition of new generation affect a system's operational costs? (2) How do generation mix and operating parameters and procedures affect costs? (3) How does the amount of variable generation (non-dispatchable wind and solar) impact the accuracy of natural gas orders? A detailed operational analysis was performed for seven sets of experiments: variable generation, large conventional generation, generation mix, gas prices, fast-start generation, self-scheduling, and gas supply constraints. For each experiment, four components of integration costs were examined: cycling costs, non-cycling VO&M costs, fuel costs, and reserves provisioning costs. The investigation was conducted with PLEXOS production cost modeling software utilizing an updated version of the Institute of Electrical and Electronics Engineers 118-bus test system overlaid with projected operating loads from the Western Electricity Coordinating Council for the Sacramento Municipal Utility District, Puget Sound Energy, and Public Service Colorado in the year 2020. The test system was selected in consultation with an industry-based technical review committee to be a reasonable approximation of an interconnection yet small enough to allow the research team to investigate a large number of scenarios and sensitivity combinations. The research should prove useful to market designers, regulators, utilities, and others who want to better understand how system changes can affect production costs.
Pawar, Jaywant; Suryawanshi, Dilipkumar; Moravkar, Kailas; Aware, Rahul; Shetty, Vasant; Maniruzzaman, Mohammed; Amin, Purnima
2018-02-09
The current study investigates the dissolution rate performance of amorphous solid solutions of a poorly water-soluble drug, efavirenz (EFV), in amorphous Soluplus® (SOL) and Kollidon® VA 64 (KVA64) polymeric systems. For the purpose of the study, various formulations with drug loadings of 30, 50, and 70% w/w were developed via hot-melt extrusion processing, adopting a Box-Behnken design of experiments (DoE) approach. The polymers were selected based on the Hansen solubility parameter calculation and the prediction of the possible drug-polymer miscibility. In the DoE experiments, a Box-Behnken factorial design was conducted to evaluate the effect of independent variables such as Soluplus® ratio (A1), HME screw speed (A2), and processing temperature (A3), and Kollidon® VA64 ratio (B1), screw speed (B2), and processing temperature (B3), on responses such as solubility (X1 and Y1) and dissolution rate (X2 and Y2) for both the ASS [EFV:SOL] and BSS [EFV:KVA64] systems. DSC and XRD data confirmed that bulk crystalline EFV transformed to the amorphous form during HME processing. Advanced chemical analyses conducted via 2D COSY NMR, FTIR chemical imaging, AFM analysis, and FTIR showed that EFV was homogeneously dispersed in the respective polymer matrices. The maximum solubility and dissolution rate were observed in formulations containing 30% EFV with both SOL and KVA64 alone. This could be attributed to the maximum drug-polymer miscibility in the optimized formulations. The actual and predicted values of both responses were found to be precise and close to each other.
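For a three-factor study like this one, a Box-Behnken design runs all four ±1 combinations of each factor pair while holding the third factor at its midpoint, plus replicated center points. A small sketch in generic coded units (not the paper's actual factor settings):

```python
# Box-Behnken design generator for k factors in coded units (-1, 0, +1):
# a 2x2 factorial on every factor pair with the remaining factors at their
# midpoints, plus replicated center points. Generic coded levels only.
from itertools import combinations, product

def box_behnken(k, center_points=3):
    runs = []
    for i, j in combinations(range(k), 2):       # every pair of factors
        for a, b in product((-1, 1), repeat=2):  # 2x2 factorial on that pair
            row = [0] * k                        # other factors at midpoint
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k for _ in range(center_points)]
    return runs

design = box_behnken(3)
print(len(design))          # 3 pairs * 4 runs + 3 center points = 15 runs
for row in design:
    print(row)
```

The 15 runs support fitting a full quadratic response-surface model in three factors without ever combining the extremes of all factors at once, which keeps the number of experiments (and extreme processing conditions) low.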
10 CFR 851.21 - Hazard identification and assessment.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...
METHODOLOGIES FOR CALIBRATION AND PREDICTIVE ANALYSIS OF A WATERSHED MODEL
The use of a fitted-parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can l...
Research needs for developing a commodity-driven freight modeling approach.
DOT National Transportation Integrated Search
2003-01-01
It is well known that better freight forecasting models and data are needed, but the literature does not clearly indicate which components of the modeling methodology are most in need of improvement, which is a critical need in an era of limited rese...
10 CFR 851.21 - Hazard identification and assessment.
Code of Federal Regulations, 2010 CFR
2010-01-01
.... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...
10 CFR 851.21 - Hazard identification and assessment.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...
Black, I; Seaton, R; Chackiath, S; Wagland, S T; Pollard, S J T; Longhurst, P J
2011-12-01
The identification of risk and its appropriate allocation to partners in project consortia is essential for minimizing overall project risks, ensuring timely delivery and maximizing benefit for money invested. Risk management guidance available from government bodies, especially in the UK, does not specify methodologies for quantitative risk assessment, nor does it offer a procedure for allocating risk among project partners. Here, a methodology to quantify project risk and potential approaches to allocating risk and their implications are discussed. Construction and operation of a waste management facility through a public-private finance contract are discussed. Public-private partnership contracts are special purpose vehicle (SPV) financing methods promoted by the UK government to boost private sector investment in facilities for public service enhancement. Our findings question the appropriateness of using standard deviation as a measure for project risk and confirm the concept of portfolio theory, suggesting the pooling of risk can reduce total risk and its impact.
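The pooling effect the abstract appeals to is just the subadditivity of standard deviation: the risk (standard deviation) of a pooled portfolio never exceeds the sum of the stand-alone risks. A minimal numerical illustration, with made-up cost-overrun figures for two hypothetical work packages:

```python
import math

# Illustrative (invented) cost overruns for two project work packages.
a = [1.0, 3.0, 2.0, 4.0, 2.5]
b = [3.5, 1.5, 2.5, 1.0, 3.0]

def std(xs):
    """Population standard deviation."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

pooled = [x + y for x, y in zip(a, b)]
# Pooled risk is at most the sum of stand-alone risks (Minkowski inequality),
# with strict reduction whenever the two series are not perfectly correlated.
print(std(a) + std(b), std(pooled))
```

When the two overrun series are negatively correlated, as here, the reduction is substantial, which is the portfolio-theory point the authors confirm.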
Barrett, R. F.; Crozier, P. S.; Doerfler, D. W.; ...
2014-09-28
Computational science and engineering application programs are typically large, complex, and dynamic, and are often constrained by distribution limitations. As a means of making tractable rapid explorations of scientific and engineering application programs in the context of new, emerging, and future computing architectures, a suite of miniapps has been created to serve as proxies for full scale applications. Each miniapp is designed to represent a key performance characteristic that does or is expected to significantly impact the runtime performance of an application program. In this paper we introduce a methodology for assessing the ability of these miniapps to effectively represent these performance issues. We applied this methodology to four miniapps, examining the linkage between them and an application they are intended to represent. Herein we evaluate the fidelity of that linkage. This work represents the initial steps required to begin to answer the question, ''Under what conditions does a miniapp represent a key performance characteristic in a full app?''
Pawar, Rajesh; Bromhal, Grant; Carroll, Susan; ...
2014-12-31
Risk assessment for geologic CO₂ storage including quantification of risks is an area of active investigation. The National Risk Assessment Partnership (NRAP) is a US-Department of Energy (US-DOE) effort focused on developing a defensible, science-based methodology and platform for quantifying risk profiles at geologic CO₂ sequestration sites. NRAP has been developing a methodology that centers around the development of an integrated assessment model (IAM) using a system modeling approach to quantify risks and risk profiles. The IAM has been used to calculate risk profiles with a few key potential impacts due to potential CO₂ and brine leakage. The simulation results are also used to determine long-term storage security relationships and compare the long-term storage effectiveness to the IPCC storage permanence goal. Additionally, we demonstrate application of the IAM for uncertainty quantification in order to determine the parameters to which the uncertainty in model results is most sensitive.
Investigating patients' experiences: methodological usefulness of interpretive interactionism.
Tower, Marion; Rowe, Jennifer; Wallis, Marianne
2012-01-01
To demonstrate the methodological usefulness of interpretive interactionism by applying it to the example of a study investigating healthcare experiences of women affected by domestic violence. Understanding patients' experiences of health, illness and health care is important to nurses. For many years, biomedical discourse has prevailed in healthcare language and research, and has influenced healthcare responses. Contemporary nursing scholarship can be developed by engaging with new ways of understanding therapeutic interactions with patients. Research that uses qualitative methods of inquiry is an important paradigm for nurses who seek to explain and understand or describe experiences rather than predict outcomes. Interpretive interactionism is an interpretive form of inquiry for conducting studies of social or personal problems that have healthcare policy implications. It puts the patient at the centre of the research process and makes visible the experiences of patients as they interact with the healthcare and social systems that surround them. Interpretive interactionism draws on concepts of symbolic interactionism, phenomenology and hermeneutics. Interpretive interactionism is a patient-centred methodology that provides an alternative way of understanding patients' experiences. It can contribute to policy and practice development by drawing on the perspectives and experiences of patients, who are central to the research process. It also allows research findings to be situated in and linked to healthcare policy, professional ethics and organisational approaches to care. Interpretive interactionism has methodological utility because it can contribute to policy and practice development by drawing on the perspectives and experiences of patients who are central to the research process. Interpretive interactionism allows research findings to be situated in and linked to health policy, professional ethics and organisational approaches to caring.
NASA Technical Reports Server (NTRS)
Allan, Brian G.; Owens, Lewis R.; Lin, John C.
2006-01-01
This research will investigate the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan-face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization was performed using four design factors which were vane height and angles-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan-face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vane heights at a constant value from the first DOE optimization with the two vane angles-of-attack as design factors. This DOE only required a second order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5% with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3-Meter Transonic Cryogenic Tunnel in a BLI inlet experiment.
The experimental results showed an 80% reduction of DPCPavg, the circumferential distortion level at the engine fan-face.
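The D-optimality criterion behind the response-surface designs described above can be sketched concretely: among candidate runs, pick the subset that maximises det(XᵀX) for the chosen model, which minimises the volume of the coefficient confidence region. The sketch below is illustrative only (a brute-force search over a small coded grid for a two-factor quadratic model, loosely mirroring the second, two-angle optimization); it is not the authors' procedure or software.

```python
from itertools import combinations
import numpy as np

# Candidate runs: 3x3 grid of coded factor levels (e.g. two vane angles-of-attack).
grid = [(x, y) for x in (-1, 0, 1) for y in (-1, 0, 1)]

def model_row(x, y):
    # Terms of a full second-order (quadratic) response-surface model.
    return [1.0, x, y, x * x, y * y, x * y]

# Brute-force D-optimal subset: maximise det(X'X) over all 6-run subsets.
best_det, best_runs = -1.0, None
for subset in combinations(grid, 6):
    X = np.array([model_row(x, y) for x, y in subset])
    d = np.linalg.det(X.T @ X)
    if d > best_det:
        best_det, best_runs = d, subset

print(best_runs, best_det)
```

With 9 candidates and 6 model terms an exhaustive search is trivial; for realistic candidate sets, exchange algorithms are used instead, but the objective is the same.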
How does our brain constitute defense mechanisms? First-person neuroscience and psychoanalysis.
Northoff, Georg; Bermpohl, Felix; Schoeneich, Frank; Boeker, Heinz
2007-01-01
Current progress in the cognitive and affective neurosciences is constantly influencing the development of psychoanalytic theory and practice. However, despite the emerging dialogue between neuroscience and psychoanalysis, the neuronal processes underlying psychoanalytic constructs such as defense mechanisms remain unclear. One of the main problems in investigating the psychodynamic-neuronal relationship consists in systematically linking the individual contents of first-person subjective experience to third-person observation of neuronal states. We therefore introduced an appropriate methodological strategy, 'first-person neuroscience', which aims at developing methods for systematically linking first- and third-person data. The utility of first-person neuroscience can be demonstrated by the example of the defense mechanism of sensorimotor regression as paradigmatically observed in catatonia. Combined psychodynamic and imaging studies suggest that sensorimotor regression might be associated with dysfunction in the neural network including the orbitofrontal, the medial prefrontal and the premotor cortices. In general sensorimotor regression and other defense mechanisms are psychoanalytic constructs that are hypothesized to be complex emotional-cognitive constellations. In this paper we suggest that specific functional mechanisms which integrate neuronal activity across several brain regions (i.e. neuronal integration) are the physiological substrates of defense mechanisms. We conclude that first-person neuroscience could be an appropriate methodological strategy for opening the door to a better understanding of the neuronal processes of defense mechanisms and their modulation in psychoanalytic psychotherapy. Copyright 2007 S. Karger AG, Basel.
Micro-Ramps for External Compression Low-Boom Inlets
NASA Technical Reports Server (NTRS)
Rybalko, Michael; Loth, Eric; Chima, Rodrick V.; Hirt, Stefanie M.; DeBonis, James R.
2010-01-01
The application of vortex generators for flow control in an external compression, axisymmetric, low-boom concept inlet was investigated using RANS simulations with three-dimensional (3-D), structured, chimera (overset) grids and the WIND-US code. The low-boom inlet design is based on previous scale model 1- by 1-ft wind tunnel tests and features a zero-angle cowl and relaxed isentropic compression centerbody spike, resulting in defocused oblique shocks and a weak terminating normal shock. Validation of the methodology was first performed for micro-ramps in supersonic flow on a flat plate with and without oblique shocks. For the inlet configuration, simulations with several types of vortex generators were conducted for positions both upstream and downstream of the terminating normal shock. The performance parameters included incompressible axisymmetric shape factor, separation area, inlet pressure recovery, and massflow ratio. The design of experiments (DOE) methodology was used to select device size and location, analyze the resulting data, and determine the optimal choice of device geometry. The optimum upstream configuration was found to substantially reduce the post-shock separation area but did not significantly impact recovery at the aerodynamic interface plane (AIP). Downstream device placement allowed for fuller boundary layer velocity profiles and reduced distortion. This resulted in an improved pressure recovery and massflow ratio at the AIP compared to the baseline solid-wall configuration.
Das, Anup Kumar; Mandal, Vivekananda; Mandal, Subhash C
2014-01-01
Extraction forms the very basic step in research on natural products for drug discovery. A poorly optimised and planned extraction methodology can jeopardise the entire mission. To provide a vivid picture of different chemometric tools and planning for process optimisation and method development in extraction of botanical material, with emphasis on microwave-assisted extraction (MAE) of botanical material. A review of studies involving the application of chemometric tools in combination with MAE of botanical materials was undertaken in order to discover what the significant extraction factors were. Optimising a response by fine-tuning those factors, experimental design or statistical design of experiment (DoE), which is a core area of study in chemometrics, was then used for statistical analysis and interpretations. In this review a brief explanation of the different aspects and methodologies related to MAE of botanical materials that were subjected to experimental design, along with some general chemometric tools and the steps involved in the practice of MAE, are presented. A detailed study on various factors and responses involved in the optimisation is also presented. This article will assist in obtaining a better insight into the chemometric strategies of process optimisation and method development, which will in turn improve the decision-making process in selecting influential extraction parameters. Copyright © 2013 John Wiley & Sons, Ltd.
Five-Junction Solar Cell Optimization Using Silvaco Atlas
2017-09-01
experimental sources [1], [4], [6]. f. Numerical Method: The method selected for solving the non-linear equations that make up the simulation can be... and maximize efficiency. Optimization of solar cell efficiency is carried out via a nearly orthogonal balanced design of experiments methodology. Silvaco ATLAS is utilized to
ERIC Educational Resources Information Center
Melendez Alicea, Juan
1992-01-01
Presents steps taken in designing, justifying, and implementing an experimental study designed to investigate the effectiveness of distance education as a methodology for developing thinking skills. A discussion reviews major findings of the study by comparing student experiences from multimedia distance education and student experiences from…
Wennerberg, Mia M T; Lundgren, Solveig M; Danielson, Ella
2012-01-01
This article describes the theoretical foundation and methodology used in a study intended to increase knowledge concerning informal caregivers' resources to health (in salutogenesis; General Resistance Resources, GRRs). A detailed description of how the approach derived from salutogenic theory was used and how it permeated the entire study, from design to findings, is provided. How participation in the study was experienced is discussed and methodological improvements and implications suggested. Using an explorative, mixed method design, data was collected through salutogenically guided interviews with 32 Swedish caregivers to older adults. A constant comparative method of analysis was used to identify caregiver-GRRs, content analysis was further used to describe how participation was experienced. The methodology unraveled GRRs caregivers used to obtain positive experiences of caregiving, but also hindrances for such usage contributing to negative experiences. Mixed data made it possible to venture beyond actual findings to derive a synthesis describing the experienced, communal context of the population reliant on these GRRs; Caregivinghood. Participating in the salutogenic data-collection was found to be a reflective, mainly positive, empowering and enlightening experience. The methodology was advantageous, even if time-consuming, as it in one study unravelled caregiver-GRRs and hindrances for their usage on individual, communal and contextual levels. It is suggested that the ability to describe Caregivinghood may be essential when developing health-promoting strategies for caregivers at individual, municipal and national levels. The methodology makes such a description possible and suggested methodological improvements may enhance its usability and adaptability to other populations.
Active Methodologies in a Queueing Systems Course for Telecommunication Engineering Studies
ERIC Educational Resources Information Center
Garcia, J.; Hernandez, A.
2010-01-01
This paper presents the results of a one-year experiment in incorporating active methodologies in a Queueing Systems course as part of the Telecommunication Engineering degree at the University of Zaragoza, Spain, during the period of adaptation to the European Higher Education Area. A problem-based learning methodology has been introduced, and…
ERIC Educational Resources Information Center
Grim, Brian J.; Harmon, Alison H.; Gromis, Judy C.
2006-01-01
There is a sharp divide between quantitative and qualitative methodologies in the social sciences. We investigate an innovative way to bridge this gap that incorporates quantitative techniques into a qualitative method, the "quanti-qualitative method" (QQM). Specifically, our research utilized small survey questionnaires and experiment-like…
Fraser, Thomas W K; Khezri, Abdolrahman; Jusdado, Juan G H; Lewandowska-Sabat, Anna M; Henry, Theodore; Ropstad, Erik
2017-07-05
Alterations in zebrafish motility are used to identify neurotoxic compounds, but few have reported how methodology may affect results. To investigate this, we exposed embryos to bisphenol A (BPA) or tetrabromobisphenol A (TBBPA) before assessing larval motility. Embryos were maintained on a day/night cycle (DN) or in constant darkness, were reared in 96 or 24 well plates (BPA only), and behavioural tests were carried out at 96, 100, or 118 (BPA only) hours post fertilisation (hpf). We found that the prior photo-regime, larval age, and/or arena size influence behavioural outcomes in response to toxicant exposure. For example, methodology determined whether 10μM BPA induced hyperactivity, hypoactivity, or had no behavioural effect. Furthermore, the minimum effect concentration was not consistent between different methodologies. Finally, we observed a mechanism previously used to explain hyperactivity following BPA exposure does not appear to explain the hypoactivity observed following minor alterations in methodology. Therefore, we demonstrate how methodology can have notable implications on dose responses and behavioural outcomes in larval zebrafish motility following identical chemical exposures. As such, our results have significant consequences for human and environmental risk assessment. Copyright © 2017 Elsevier B.V. All rights reserved.
Buyel, Johannes Felix; Fischer, Rainer
2014-01-31
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
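The screening step in a DoE workflow like the one described above is often a two-level fractional factorial, from which main effects are estimated as the difference between average responses at the high and low levels of each factor. A minimal, self-contained sketch (factor names, the aliasing generator C = A·B, and the response values are all invented for illustration; this is not the authors' design):

```python
from itertools import product

# 2^(3-1) fractional factorial with generator C = A*B: four runs screen
# three two-level factors. Note C's main effect is aliased with the A*B
# interaction -- the usual price of a fractional design.
runs = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]

# Hypothetical measured responses (e.g. expression yield) for the four runs.
response = {(-1, -1, 1): 12.0, (-1, 1, -1): 18.0, (1, -1, -1): 14.0, (1, 1, 1): 22.0}

def main_effect(factor):
    """Mean response at the high level minus mean response at the low level."""
    hi = [response[r] for r in runs if r[factor] == 1]
    lo = [response[r] for r in runs if r[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i) for i, name in enumerate("ABC")}
print(effects)
```

Factors with large estimated effects are carried forward into the augmented, higher-resolution designs the abstract refers to as step-wise design augmentation.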
NASA Technical Reports Server (NTRS)
Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy
2013-01-01
The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.
ERIC Educational Resources Information Center
Hamid, Jamaliah Abdul; Krauss, Steven E.
2013-01-01
Do students' experiences on university campuses cultivate motivation to lead or a sense of readiness to lead that does not necessarily translate to active leadership? To address this question, a study was conducted with 369 undergraduates from Malaysia. Campus experience was more predictive of leadership readiness than motivation. Student…
Reactor safeguards system assessment and design. Volume I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varnado, G.B.; Ericson, D.M. Jr.; Daniel, S.L.
1978-06-01
This report describes the development and application of a methodology for evaluating the effectiveness of nuclear power reactor safeguards systems. Analytic techniques are used to identify the sabotage acts which could lead to release of radioactive material from a nuclear power plant, to determine the areas of a plant which must be protected to assure that significant release does not occur, to model the physical plant layout, and to evaluate the effectiveness of various safeguards systems. The methodology was used to identify those aspects of reactor safeguards systems which have the greatest effect on overall system performance and which, therefore, should be emphasized in the licensing process. With further refinements, the methodology can be used by the licensing reviewer to aid in assessing proposed or existing safeguards systems.
Theoretical and methodological approaches in discourse analysis.
Stevenson, Chris
2004-01-01
Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.
Proposed Objective Odor Control Test Methodology for Waste Containment
NASA Technical Reports Server (NTRS)
Vos, Gordon
2010-01-01
The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented detectable smell threshold for humans of 0.025 PPM, and a limit of quantitation of 15 PPB.
Contributions of the sandwich doctoral program to methodological approaches: an experience report.
Lorenzini, Elisiane; Oelke, Nelly Donszelmann; Marck, Patricia Beryl; Dall'Agnol, Clarice Maria
2016-06-01
Objective: To share our experience on theoretical and methodological insights we have gained as researchers working together during the Sandwich Doctoral Program. Method: This is a descriptive experience report. Results: We have incorporated restoration thinking into a study on patient safety culture and will enhance knowledge translation by applying principles of deliberative dialogue to increase the uptake and implementation of research results. Conclusion: Incorporating new approaches in Brazilian nursing research plays a key role in achieving international participation and visibility in different areas of nursing knowledge.
Risk Assessment and Decision-Making at the Local Level: Who does what when, how, and to what extent?
Problem: Exposure to multiple stressors increases the likelihood of an adverse response in human and ecosystem communities. Challenge: A rigorous and scientifically robust assessment methodology is needed to characterize stressors and receptors of interest, calculate risk estimat...
DOE Office of Scientific and Technical Information (OSTI.GOV)
RITTMANN, P.D.
1999-07-14
This report contains technical information used to determine accident consequences for the Spent Nuclear Fuel Project safety documents. It does not determine accident consequences or describe specific accident scenarios, but instead provides generic information.
Verbal Strategies in the ML Classroom.
ERIC Educational Resources Information Center
Di Pietro, Robert J.
Second language teaching methodologies have been oriented toward their subject matter as the acquisition of form. This orientation does not distinguish sufficiently between language as grammatical competence and language as verbal strategy. There are many matters falling under the heading of "verbal strategy" which should become part of an…
Does Direct Instruction Cause Delinquency?
ERIC Educational Resources Information Center
Bereiter, Carl
1986-01-01
Takes issue with the High/Scope Educational Research Foundation report (Schweinhart, Weikart and Larner, 1986) indicating that preschoolers taught by direct instruction end up with twice the rate of delinquency of children who participated in High/Scope's own kind of preschool education. Argues that self-report methodology invalidates High/Scope…
A global parallel model based design of experiments method to minimize model output uncertainty.
Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E
2012-03-01
Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.
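The core idea of choosing design points that most constrain predicted dynamics from a bounded parameter space alone can be sketched with simple Monte Carlo sampling (a hedged illustration only; the authors use computationally efficient sparse grids and scenario trees, and the one-parameter decay model here is a stand-in, not their T cell activation model):

```python
import numpy as np

# Sketch: with only bounds on an uncertain parameter (no point estimate),
# propagate many parameter samples through the model and pick the
# candidate measurement time where predicted responses disagree most.
# Measuring there shrinks the dynamical uncertainty the fastest.
rng = np.random.default_rng(0)

def model(k, t):
    # toy first-order decay model standing in for the biological system
    return np.exp(-k * t)

k_samples = rng.uniform(0.1, 2.0, size=500)   # bounded uncertain parameter
times = np.linspace(0.1, 5.0, 50)             # candidate design points

# predicted response for every (parameter sample, candidate time) pair
preds = model(k_samples[:, None], times[None, :])
spread = preds.std(axis=0)          # prediction uncertainty at each time
best_t = times[np.argmax(spread)]   # most informative measurement time
print(f"most informative measurement time: {best_t:.2f}")
```

Early times are uninformative (all parameter values predict a response near 1) and late times equally so (all responses have decayed), so the spread, and hence the information from a measurement, peaks at an intermediate time.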
Emilio Segrè, the Antiproton, Technetium, and Astatine
of U238, DOE Technical Report, 1942; Spontaneous Fission, DOE Technical Report, November 1950; Observation of Antiprotons, DOE Technical Report, October 1955; Antiprotons, DOE Technical Report, November 1955; The Antiproton-Nucleon Annihilation Process (Antiproton Collaboration Experiment), DOE Technical
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Will E.; Mehta, S.; Nell, R. M.
This annual review provides the projected dose estimates of radionuclide inventories disposed in the active 200 East Area Low-Level Waste Burial Grounds (LLBGs) since September 26, 1988. The estimates are calculated using the original dose methodology developed in the performance assessment (PA) analysis (WHC-SD-WM-TI-730). The estimates are compared with performance objectives defined in U.S. Department of Energy (DOE) requirements (DOE O 435.1 Chg 1 and companion documents DOE M 435.1-1 Chg 1 and DOE G 435.1-1). All performance objectives are currently satisfied, and operational waste acceptance criteria (HNF-EP-0063) and waste acceptance practices continue to be sufficient to maintain compliance with performance objectives. Inventory estimates and associated dose estimates from future waste disposal actions are unchanged from previous years’ evaluations, which indicate potential impacts well below performance objectives. Therefore, future compliance with DOE O 435.1 Chg 1 is expected.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Will E; Nell, R. M.; Mehta, S.
This annual review provides the projected dose estimates of radionuclide inventories disposed in the active 200 West Area Low-Level Waste Burial Grounds (LLBGs) since September 26, 1988. These estimates are calculated using the original dose methodology developed in the performance assessment (PA) analysis (WHC-EP-0645). These estimates are compared with performance objectives defined in U.S. Department of Energy (DOE) requirements (DOE O 435.1 Chg 1 and its companion documents DOE M 435.1-1 Chg 1 and DOE G 435.1-1). All performance objectives are currently satisfied, and operational waste acceptance criteria (HNF-EP-0063) and waste acceptance practices continue to be sufficient to maintain compliance with performance objectives. Inventory estimates and associated dose estimates from future waste disposal actions are unchanged from previous years’ evaluations, which indicate potential impacts well below performance objectives. Therefore, future compliance with DOE O 435.1 Chg 1 is expected.
An Equation of State for Foamed Divinylbenzene (DVB) Based on Multi-Shock Response
NASA Astrophysics Data System (ADS)
Aslam, Tariq; Schroen, Diana; Gustavsen, Richard; Bartram, Brian
2013-06-01
The methodology for making foamed Divinylbenzene (DVB) is described. For a variety of initial densities, foamed DVB is examined through multi-shock compression and release experiments. Results from multi-shock experiments on LANL's 2-stage gas gun will be presented. A simple conservative Lagrangian numerical scheme, utilizing total-variation-diminishing interpolation and an approximate Riemann solver, will be presented as well as the methodology of calibration. It has been previously demonstrated that a single Mie-Gruneisen fitting form can replicate foam multi-shock compression response at a variety of initial densities; such a methodology will be presented for foamed DVB.
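For reference, the single Mie-Gruneisen fitting form alluded to above has the generic shape (only the standard textbook form is shown here; the specific reference curves and Gruneisen function calibrated by the authors for foamed DVB are not reproduced):

```latex
p(v, e) = p_{\mathrm{ref}}(v) + \frac{\Gamma(v)}{v}\,\bigl[\, e - e_{\mathrm{ref}}(v) \,\bigr]
```

where $v$ is specific volume, $e$ is specific internal energy, $\Gamma(v)$ is the Gruneisen parameter, and $p_{\mathrm{ref}}$, $e_{\mathrm{ref}}$ are pressure and energy along a chosen reference curve (e.g., a principal Hugoniot).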
Berlin, R H; Janzon, B; Rybeck, B; Schantz, B; Seeman, T
1982-01-01
A standard methodology for estimating the energy transfer characteristics of small calibre bullets and other fast missiles is proposed, consisting of firings against targets made of soft soap. The target is evaluated by measuring the size of the permanent cavity remaining in it after the shot. The method is very simple to use and does not require access to any sophisticated measuring equipment. It can be applied under all circumstances, even under field conditions. Adequate methods of calibration to ensure good accuracy are suggested. The precision and limitations of the method are discussed.
Biviano, Marilyn B.; Wagner, Lorie A.; Sullivan, Daniel E.
1999-01-01
Materials consumption estimates, such as apparent consumption of raw materials, can be important indicators of sustainability. Apparent consumption of raw materials does not account for material contained in manufactured products that are imported or exported and may thus under- or over-estimate total consumption of materials in the domestic economy. This report demonstrates a methodology to measure the amount of materials contained in net imports (imports minus exports), using lead as an example. The analysis presents illustrations of differences between apparent and total consumption of lead and distributes these differences into individual lead-consuming sectors.
Direct manipulation of wave amplitude and phase through inverse design of isotropic media
NASA Astrophysics Data System (ADS)
Liu, Y.; Vial, B.; Horsley, S. A. R.; Philbin, T. G.; Hao, Y.
2017-07-01
In this article we propose a new design methodology allowing us to control both amplitude and phase of electromagnetic waves from a cylindrical incident wave. This results in isotropic materials and does not resort to transformation optics or its quasi-conformal approximations. Our method leads to two-dimensional isotropic, inhomogeneous material profiles of permittivity and permeability, to which a general class of scattering-free wave solutions arise. Our design is based on the separation of the complex wave solution into amplitude and phase. We give two types of examples to validate our methodology.
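The amplitude-phase separation underlying such designs can be sketched in the scalar case (an illustration of the general idea under simplifying assumptions, not the authors' full derivation): substituting $\psi = A\,e^{i\phi}$ into the Helmholtz equation $(\nabla^2 + k_0^2 n^2)\psi = 0$ and separating real and imaginary parts gives

```latex
n^2(x,y) = \frac{|\nabla\phi|^2 - \nabla^2 A / A}{k_0^2},
\qquad
\nabla \cdot \bigl( A^2 \nabla\phi \bigr) = 0 .
```

The first relation fixes the isotropic index profile that supports a prescribed amplitude $A$ and phase $\phi$; the second is the consistency condition that the implied power flux is divergence-free.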
Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.
2010-09-24
The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that existing non-seismic NPH assessments are consistent with current design methodology and site specific data.
Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika
2017-01-01
Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pugh, C.E.
2001-01-29
Numerous large-scale fracture experiments have been performed over the past thirty years to advance fracture mechanics methodologies applicable to thick-wall pressure vessels. This report first identifies major factors important to nuclear reactor pressure vessel (RPV) integrity under pressurized thermal shock (PTS) conditions. It then covers 20 key experiments that have contributed to identifying fracture behavior of RPVs and to validating applicable assessment methodologies. The experiments are categorized according to four types of specimens: (1) cylindrical specimens, (2) pressurized vessels, (3) large plate specimens, and (4) thick beam specimens. These experiments were performed in laboratories in six different countries. This report serves as a summary of those experiments, and provides a guide to references for detailed information.
Designing a Strategic Plan through an Emerging Knowledge Generation Process: The ATM Experience
ERIC Educational Resources Information Center
Zanotti, Francesco
2012-01-01
Purpose: The aim of this contribution is to describe a new methodology for designing strategic plans and how it was implemented by ATM, a public transportation agency based in Milan, Italy. Design/methodology/approach: This methodology is founded on a new system theory, called "quantum systemics". It is based on models and metaphors both…
Scrum Methodology in Higher Education: Innovation in Teaching, Learning and Assessment
ERIC Educational Resources Information Center
Jurado-Navas, Antonio; Munoz-Luna, Rosa
2017-01-01
The present paper aims to detail the experience developed in a classroom of English Studies from the Spanish University of Málaga, where an alternative project-based learning methodology has been implemented. Such methodology is inspired by scrum sessions widely extended in technological companies where staff members work in teams and are assigned…
Autoethnography as a Methodological Approach in Adult Vocational Education and Technology
ERIC Educational Resources Information Center
Grenier, Robin S.
2016-01-01
Autoethnography is a form of qualitative inquiry intended to connect the lived experience of a researcher to the larger social and cultural world (Ellis, 2004) and it can be a valuable methodological tool for adult educators and learners. This paper explores the use and value of autoethnography as a qualitative methodology and explores the…
Design-of-Experiments Approach to Improving Inferior Vena Cava Filter Retrieval Rates.
Makary, Mina S; Shah, Summit H; Warhadpande, Shantanu; Vargas, Ivan G; Sarbinoff, James; Dowell, Joshua D
2017-01-01
The association of retrievable inferior vena cava filters (IVCFs) with adverse events has led to increased interest in prompt retrieval, particularly in younger patients given the progressive nature of these complications over time. This study takes a design-of-experiments (DOE) approach to investigate methods to best improve filter retrieval rates, with a particular focus on younger (<60 years) patients. A DOE approach was executed in which combinations of variables were tested to best improve retrieval rates. The impact of a virtual IVCF clinic, primary care physician (PCP) letters, and discharge instructions was investigated. The decision for filter retrieval in group 1 was determined solely by the referring physician. Group 2 included those patients prospectively followed in an IVCF virtual clinic in which filter retrieval was coordinated by the interventional radiologist when clinically appropriate. In group 3, in addition to being followed through the IVCF clinic, each patient's PCP was faxed a follow-up letter, and information regarding IVCF retrieval was added to the patient's discharge instructions. A total of 10 IVCFs (8.4%) were retrieved among 119 retrievable IVCFs placed in group 1. Implementation of the IVCF clinic in group 2 significantly improved the retrieval rate to 25.3% (23 of 91 retrievable IVCFs placed, P < .05). The addition of discharge instructions and PCP letters to the virtual clinic (group 3) resulted in a retrieval rate of 33.3% (17 of 51). The retrieval rates demonstrated more pronounced improvement when examining only younger patients, with retrieval rates of 11.3% (7 of 62), 29.5% (13 of 44, P < .05), and 45.2% (14 of 31) for groups 1, 2, and 3, respectively. DOE methodology is not routinely executed in health care, but it is an effective approach to evaluating clinical practice behavior and patient quality measures. 
In this study, implementation of the combination of a virtual clinic, PCP letters, and discharge instructions improved retrieval rates compared with a virtual clinic alone. Quality improvement strategies such as these that augment patient and referring physician knowledge on interventional radiologic procedures may ultimately improve patient safety and personalized care. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
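The reported retrieval rates follow directly from the counts given for each group, which can be reproduced in a few lines (group labels are paraphrases for readability):

```python
# Retrieval rates per study group: (filters retrieved, retrievable filters
# placed), all patients, from the counts reported above.
groups = {
    "referring physician only (group 1)": (10, 119),
    "virtual IVCF clinic (group 2)": (23, 91),
    "clinic + PCP letter + discharge notes (group 3)": (17, 51),
}

for name, (retrieved, placed) in groups.items():
    rate = 100 * retrieved / placed
    print(f"{name}: {retrieved}/{placed} = {rate:.1f}%")
```

The computed rates of 8.4%, 25.3%, and 33.3% match the values quoted in the abstract, with each added intervention raising the retrieval rate.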
De Beer, T R M; Wiggenhorn, M; Hawe, A; Kasper, J C; Almeida, A; Quinten, T; Friess, W; Winter, G; Vervaet, C; Remon, J P
2011-02-15
The aim of the present study was to examine the possibilities/advantages of using recently introduced in-line spectroscopic process analyzers (Raman, NIR and plasma emission spectroscopy), within well-designed experiments, for the optimization of a pharmaceutical formulation and its freeze-drying process. The formulation under investigation was a mannitol (crystalline bulking agent)-sucrose (lyo- and cryoprotector) excipient system. The effects of two formulation variables (mannitol/sucrose ratio and amount of NaCl) and three process variables (freezing rate, annealing temperature and secondary drying temperature) upon several critical process and product responses (onset and duration of ice crystallization, onset and duration of mannitol crystallization, duration of primary drying, residual moisture content and amount of mannitol hemi-hydrate in end product) were examined using a design of experiments (DOE) methodology. A 2-level fractional factorial design (2^(5-1) = 16 experiments + 3 center points = 19 experiments) was employed. All experiments were monitored in-line using Raman, NIR and plasma emission spectroscopy, which supply continuous process and product information during freeze-drying. Off-line X-ray powder diffraction analysis and Karl Fischer titration were performed to determine the morphology and residual moisture content of the end product, respectively. First, the results showed that - besides the findings previously described in De Beer et al., Anal. Chem. 81 (2009) 7639-7649 - Raman and NIR spectroscopy are able to monitor the product behavior throughout the complete annealing step during freeze-drying. The DOE approach allowed predicting the optimum combination of process and formulation parameters leading to the desired responses.
Applying a mannitol/sucrose ratio of 4, without adding NaCl and processing the formulation without an annealing step, using a freezing rate of 0.9°C/min and a secondary drying temperature of 40°C resulted in efficient freeze-drying supplying end products with a residual moisture content below 2% and a mannitol hemi-hydrate content below 20%. Finally, using Monte Carlo simulations it became possible to determine how varying the factor settings around their optimum still leads to fulfilled response criteria, herewith having an idea about the probability to exceed the acceptable response limits. This multi-dimensional combination and interaction of input variables (factor ranges) leading to acceptable response criteria with an acceptable probability reflects the process design space. Copyright © 2010 Elsevier B.V. All rights reserved.
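A 2-level 2^(5-1) fractional factorial design of the size used in this study can be constructed as follows (a hedged sketch: the generator E = ABCD is a common textbook choice and an assumption here; the paper does not state which generator was used):

```python
from itertools import product

# Five factors from the study: two formulation and three process variables.
factors = ["mannitol/sucrose ratio", "NaCl amount", "freezing rate",
           "annealing temperature", "secondary drying temperature"]

# Half-fraction: enumerate full 2^4 design in A..D, then set E = ABCD
# (defining relation I = ABCDE, giving a resolution V design).
runs = []
for a, b, c, d in product([-1, 1], repeat=4):
    e = a * b * c * d
    runs.append((a, b, c, d, e))

center_points = [(0, 0, 0, 0, 0)] * 3   # 3 center points, as in the study
design = runs + center_points
print(f"{len(runs)} factorial runs + {len(center_points)} center points "
      f"= {len(design)} experiments")
```

This yields the 16 + 3 = 19 experiments reported; the center points allow a check for curvature that the two-level factorial runs alone cannot detect.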
Code of Federal Regulations, 2012 CFR
2012-07-01
... multidisciplinary, and emphasizes scientific methodology, and may involve collaboration among institutions. (3... Rehabilitation Research Training Project? 350.12 Section 350.12 Education Regulations of the Offices of the... EDUCATION DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Projects Does the...
Code of Federal Regulations, 2014 CFR
2014-07-01
... multidisciplinary, and emphasizes scientific methodology, and may involve collaboration among institutions. (3... Rehabilitation Research Training Project? 350.12 Section 350.12 Education Regulations of the Offices of the... EDUCATION DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Projects Does the...
Code of Federal Regulations, 2013 CFR
2013-07-01
... multidisciplinary, and emphasizes scientific methodology, and may involve collaboration among institutions. (3... Rehabilitation Research Training Project? 350.12 Section 350.12 Education Regulations of the Offices of the... EDUCATION DISABILITY AND REHABILITATION RESEARCH PROJECTS AND CENTERS PROGRAM What Projects Does the...
Martini Qualitative Research: Shaken, Not Stirred
ERIC Educational Resources Information Center
Nieuwenhuis, F. J.
2015-01-01
Although the number of qualitative research studies has boomed in recent years, close observation reveals that often the research designs and methodological considerations and approaches have developed a type of configuration that does not adhere to purist definitions of the labels attached. Very often so called interpretivist studies are not…
Does Accumulated Knowledge Impact Academic Performance in Cost Accounting?
ERIC Educational Resources Information Center
Alanzi, Khalid A.; Alfraih, Mishari M.
2017-01-01
Purpose: This quantitative study aims to examine the impact of accumulated knowledge of accounting on the academic performance of Cost Accounting students. Design/methodology/approach The sample consisted of 89 students enrolled in the Accounting program run by a business college in Kuwait during 2015. Correlation and linear least squares…
75 FR 32195 - Procedures and Costs for Use of the Research Data Center
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-07
... related variables are published so that outside analysts may conduct original research and special studies... and the proposals should address why the requested data are needed for the proposed study. Overly... their application does not constitute endorsement by NCHS of the substantive, methodological...
Positioning Pedagogy--A Matter of Children's Rights
ERIC Educational Resources Information Center
Devine, Dympna; McGillicuddy, Deirdre
2016-01-01
This paper foregrounds pedagogy in the realisation of children's rights to non-discrimination and serving their best interests, as articulated in the UNCRC. Drawing on a mixed methodological study of teachers in 12 schools it does so through exploring teacher pedagogies in terms of how they "think", "do" and "talk"…
Guidelines for line-oriented flight training, volume 2
NASA Technical Reports Server (NTRS)
Lauber, J. K.; Foushee, H. C.
1981-01-01
Current approaches to line-oriented flight training used by six American airlines are described. This recurrent training methodology makes use of a full-crew and full-mission simulation to teach and assess resource management skills, but does not necessarily fulfill requirements for the training and manipulation of all skills.
Development of the Life Story in Early Adolescence
ERIC Educational Resources Information Center
Steiner, Kristina L.; Pillemer, David B.
2018-01-01
Life span developmental psychology proposes that the ability to create a coherent life narrative does not develop until early adolescence. Using a novel methodology, 10-, 12-, and 14-year-old participants were asked to tell their life stories aloud to a researcher. Later, participants separated their transcribed narratives into self-identified…
34 CFR 377.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Secretary use? The Secretary uses the following criteria to evaluate an application: (a) Plan of operation. (30 points) The Secretary reviews each application to determine the quality of the plan of operation... outcomes; (2) The extent to which the plan of operation specifies the methodology for accomplishing each...
34 CFR 377.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Secretary use? The Secretary uses the following criteria to evaluate an application: (a) Plan of operation. (30 points) The Secretary reviews each application to determine the quality of the plan of operation... outcomes; (2) The extent to which the plan of operation specifies the methodology for accomplishing each...
34 CFR 377.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Secretary use? The Secretary uses the following criteria to evaluate an application: (a) Plan of operation. (30 points) The Secretary reviews each application to determine the quality of the plan of operation... outcomes; (2) The extent to which the plan of operation specifies the methodology for accomplishing each...
34 CFR 377.21 - What selection criteria does the Secretary use?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Secretary use? The Secretary uses the following criteria to evaluate an application: (a) Plan of operation. (30 points) The Secretary reviews each application to determine the quality of the plan of operation... outcomes; (2) The extent to which the plan of operation specifies the methodology for accomplishing each...
Behavior States Are Real and Observable.
ERIC Educational Resources Information Center
Guess, Doug; Roberts, Sally; Rues, Jane
2000-01-01
This article critiques the research methodology used by Mudford, Hogg, and Roberts (1999) that resulted in a failure to achieve inter-observer agreement on adults with mental retardation when using an experimental, 13-category behavior state code. Arguments are provided on why their videotape study does not meet requirements of acceptable…
Does the Recording Medium Influence Phonetic Transcription of Cleft Palate Speech?
ERIC Educational Resources Information Center
Klintö, Kristina; Lohmander, Anette
2017-01-01
Background: In recent years, analyses of cleft palate speech based on phonetic transcriptions have become common. However, the results vary considerably among different studies. It cannot be excluded that differences in assessment methodology, including the recording medium, influence the results. Aims: To compare phonetic transcriptions from…
Teaching Citizenship under an Authoritarian Regime: A Case-Study of Burma
ERIC Educational Resources Information Center
Treadwell, Brooke Andrea
2013-01-01
What does citizenship education look like in a society ruled by an authoritarian military regime? This dissertation seeks to answer this question by examining official citizenship education policy in Burma/Myanmar and how it is implemented in contemporary government primary schools. Using critical qualitative methodology, I identify the…
Fuel Cells | Climate Neutral Research Campuses | NREL
to develop fuel cells on campus. Does your campus support telecommunications networks where there is ... captures waste heat to generate hot water. Additionally, the exhaust carbon dioxide is routed to ... energy conversion calculation methodologies. U.S. Department of Energy - Fuel Cell Animation: Provides an
77 FR 68686 - Xylenesulfonic Acid, Sodium Salt; Exemption From the Requirement of a Tolerance
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... this methodology is compounded by EPA's decision to assume that, for each commodity, the active... be affected. The North American Industrial Classification System (NAICS) codes have been provided to....'' This includes exposure through drinking water and in residential settings, but does not include...
77 FR 14482 - Petroleum Reduction and Alternative Fuel Consumption Requirements for Federal Fleets
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... agencies to use this methodology to determine fleet inventory targets and to prepare fleet management plans.... Department of Energy, Office of Energy Efficiency and Renewable Energy, Federal Energy Management Program (EE... DOE receives will be made available on the Federal Energy Management Program's Federal Fleet...
Does Positive Affect Influence Health?
ERIC Educational Resources Information Center
Pressman, Sarah D.; Cohen, Sheldon
2005-01-01
This review highlights consistent patterns in the literature associating positive affect (PA) and physical health. However, it also raises serious conceptual and methodological reservations. Evidence suggests an association of trait PA and lower morbidity and of state and trait PA and decreased symptoms and pain. Trait PA is also associated with…
Does Training Influence Organisational Performance?: Analysis of the Spanish Hotel Sector
ERIC Educational Resources Information Center
Ubeda-Garcia, Mercedes; Marco-Lajara, Bartolome; Sabater-Sempere, Vicente; Garcia-Lillo, Francisco
2013-01-01
Purpose: The aim of the paper is to identify which variables of training policy have a significant and positive impact on organisational performance. Design/methodology/approach: A targeted literature review was conducted to identify and collate a comprehensive range of human resource management and training conceptualisations/investigations. This…
Does Science Presuppose Naturalism (or Anything at All)?
ERIC Educational Resources Information Center
Fishman, Yonatan I.; Boudry, Maarten
2013-01-01
Several scientists, scientific institutions, and philosophers have argued that science is committed to Methodological Naturalism (MN), the view that science, by virtue of its methods, is limited to studying "natural" phenomena and cannot consider or evaluate hypotheses that refer to supernatural entities. While they may in fact exist, gods,…
Educational Neuroscience: Motivations, Methodology, and Implications
ERIC Educational Resources Information Center
Campbell, Stephen R.
2011-01-01
"What does the brain have to do with learning?" "Prima facie", this may seem like a strange thing for anyone to say, especially educational scholars, researchers, practitioners, and policy makers. There are, however, valid objections to injecting various and sundry neuroscientific considerations piecemeal into the vast field of education. These…
Employability for the Workers--What Does This Mean?
ERIC Educational Resources Information Center
Little, Brenda Margaret
2011-01-01
Purpose: UK government strategies for higher education (HE) continue to emphasise the promotion and enhancement of students' employability skills and subsequent graduate opportunities. The purpose of this paper is to explore what this means for those HE learners already in work. Design/methodology/approach: The paper presents the findings of a…
76 FR 65631 - Energy Conservation Program: Test Procedures for Microwave Ovens
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... Conservation Program: Test Procedures for Microwave Ovens AGENCY: Office of Energy Efficiency and Renewable... (DOE) has initiated a test procedure rulemaking to develop active mode testing methodologies for... Federal Register a final rule for the microwave oven test procedure rulemaking (July TP repeal final rule...
How Does Leadership Development Help Universities Become Learning Organisations?
ERIC Educational Resources Information Center
Gentle, Paul; Clifton, Louise
2017-01-01
Purpose: The purpose of this paper is to draw on empirical data to interrogate the correlation between participation in leadership development programmes by individual leaders and the ability of higher education institutions to learn organisationally from such participation. Design/methodology/approach: Applying a multi-stakeholder perspective,…
Professors Behaving Badly: Faculty Misconduct in Graduate Education
ERIC Educational Resources Information Center
Braxton, John M.; Proper, Eve; Bayer, Alan E.
2011-01-01
A faculty member publishes an article without offering coauthorship to a graduate assistant who has made a substantial conceptual or methodological contribution to the article. A professor does not permit graduate students to express viewpoints different from her own. A graduate student close to finishing his dissertation cannot reach his…
Finding a Voice through Research
ERIC Educational Resources Information Center
Lobo, Jose; Vizcaino, Alida
2006-01-01
One question guided this experimental study: What impact does the change from teacher training to educational research have on university teachers' methodology and attitudes to teaching? To find answers to this question, the researchers selected five teachers of English as a Foreign Language (EFL) at the language centre of a private university on…
Does Teaching Ethics Do Any Good?
ERIC Educational Resources Information Center
Jonson, Elizabeth Prior; McGuire, Linda; Cooper, Brian
2016-01-01
Purpose: This matched-pairs study of undergraduates at an Australian University investigates whether business ethics education has a positive effect on student ethical behaviour. The paper aims to discuss this issue. Design/methodology/approach: This study uses a matched-pairs design to look at responses before and after students have taken a…
Does Organizational Forgetting Matter? Organizational Survival for Life Coaching Companies
ERIC Educational Resources Information Center
Aydin, Erhan; Gormus, Alparslan Sahin
2015-01-01
Purpose: The purposes of this paper are to determine the role of organizational forgetting in different type of coaching companies and to determine organizational survival based on both knowledge structure of coaching companies and organizational forgetting with core features of organizations. Design/methodology/approach: Within the context of…
32 CFR 218.3 - Dose reconstruction methodology.
Code of Federal Regulations, 2010 CFR
2010-07-01
... each source of radiation. Detailed modeling of the human body, in appropriate postures in the trench... radiation field does not reflect the shielding of the film badge afforded by the human body. This shielding has been determined for pertinent body positions by the solution of radiation transport equations as...
32 CFR 218.3 - Dose reconstruction methodology.
Code of Federal Regulations, 2011 CFR
2011-07-01
... each source of radiation. Detailed modeling of the human body, in appropriate postures in the trench... radiation field does not reflect the shielding of the film badge afforded by the human body. This shielding has been determined for pertinent body positions by the solution of radiation transport equations as...
ERIC Educational Resources Information Center
Agbenyega, Joseph S.; Tamakloe, Deborah E.; Klibthong, Sunanta
2017-01-01
This research utilised a "stimulated recall" methodology [Calderhead, J. 1981. "Stimulated Recall: A Method for Research on Teaching." "British Journal of Educational Psychology" 51: 211-217] to explore the potential of African folklore, specifically Ghanaian folk stories in the development of children's reflective…
Students' Entrepreneurial Self-Efficacy: Does the Teaching Method Matter?
ERIC Educational Resources Information Center
Abaho, Ernest; Olomi, Donath R.; Urassa, Goodluck Charles
2015-01-01
Purpose: The purpose of this paper is to examine the various entrepreneurship teaching methods in Uganda and how these methods relate to entrepreneurial self-efficacy (ESE). Design/methodology/approach: A sample of 522 final year students from selected universities and study programs was surveyed using self-reported questionnaires. Findings: There…
Spending on School Infrastructure: Does Money Matter?
ERIC Educational Resources Information Center
Crampton, Faith E.
2009-01-01
Purpose: The purpose of this study is to further develop an emerging thread of quantitative research that grounds investment in school infrastructure in a unified theoretical framework of investment in human, social, and physical capital. Design/methodology/approach: To answer the research question, what is the impact of investment in human,…
10 CFR 830.204 - Documented safety analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Documented safety analysis. 830.204 Section 830.204 Energy DEPARTMENT OF ENERGY NUCLEAR SAFETY MANAGEMENT Safety Basis Requirements § 830.204 Documented safety analysis... approval from DOE for the methodology used to prepare the documented safety analysis for the facility...
Found in Translation: Interdisciplinary Arts Integration in Project AIM
ERIC Educational Resources Information Center
Pruitt, Lara; Ingram, Debra; Weiss, Cynthia
2014-01-01
This paper will share the arts-integration methodology used in Project AIM and address the question; "How is translation evident in interdisciplinary arts instruction, and how does it affect students?" Methods: The staff and researchers from Project AIM, (an arts-integration program of the Center for Community Arts Partnerships at…
ERIC Educational Resources Information Center
Beinicke, Andrea; Pässler, Katja; Hell, Benedikt
2014-01-01
The study investigates consequences of eliminating items showing gender-specific differential item functioning (DIF) on the psychometric structure of a standard RIASEC interest inventory. Holland's hexagonal model was tested for structural invariance using a confirmatory methodological approach (confirmatory factor analysis and randomization…
Chronic Pain and Depression: Does the Evidence Support a Relationship?
ERIC Educational Resources Information Center
Romano, Joan M.; Turner, Judith A.
1985-01-01
A critical evaluation of the relevant literature provides some support for an association between depression and chronic pain. Common conceptual and methodological problems are discussed. Current biological and psychological models of the mechanisms by which the two syndromes may interact are summarized, and suggestions are made for future…
Does Personality Type Effect Online versus In-Class Course Satisfaction?
ERIC Educational Resources Information Center
Daughenbaugh, Richard; Ensminger, David; Frederick, Lynda; Surry, Daniel
This study sought to determine if different personality types express more or less satisfaction with courses delivered online versus those delivered in the classroom. The methodology employed two online surveys--the Keirsey Temperament Sorter (KTS) and a course satisfaction instrument. The participants were 146 college students taking online and…
Olson's "Cognitive Development": A Commentary.
ERIC Educational Resources Information Center
Follettie, Joseph F.
This report is a review of Olson's "Cognitive Development." Unlike a typical book review it does not compare and contrast the author's theoretical framework and methodological practices with those of others in the field, but rather it extensively describes and critiques the reported empirical work. The reasons given for this approach are that…
Researching Entrepreneurship and Education. Part 1: What Is Entrepreneurship and Does It Matter?
ERIC Educational Resources Information Center
Matlay, Harry
2005-01-01
Purpose: This article is the first in a series of conceptual and empirical contributions that, individually and cumulatively, seek to analyse, develop and link two important fields of research: "entrepreneurship" and "entrepreneurship education." Design/Methodology/Approach: The paper undertakes a critical literature review and a methodical…
Design an optimum safety policy for personnel safety management - A system dynamic approach
NASA Astrophysics Data System (ADS)
Balaji, P.
2014-10-01
Personnel safety management (PSM) ensures that employees' work conditions are healthy and safe through various proactive and reactive approaches. It is nowadays a complex phenomenon because of the increasingly dynamic nature of organisations, which results in an increase in accidents. An important part of accident prevention is to understand the existing system properly and to make safety strategies for that system. System dynamics modelling appears to be an appropriate methodology for exploring and making strategy for PSM. Many system dynamics models of industrial systems have been built entirely for specific host firms. This thesis illustrates an alternative approach. A generic system dynamics model of personnel safety management was developed and tested in a host firm. The model underwent various structural, behavioural and policy tests. The utility and effectiveness of the model were further explored by modelling a safety scenario. In order to create an effective safety policy under resource constraints, DOE (Design of Experiments) was used. DOE uses classic designs, namely fractional factorials and central composite designs. It was used to build a second-order regression equation, which served as an objective function. That function was optimized under a budget constraint, and the optimum values were used for the safety policy that showed the greatest improvement in overall PSM. The outcome of this research indicates that the personnel safety management model has the capability to act as an instruction tool to improve understanding of safety management and also as an aid to policy making.
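The DOE-plus-response-surface step described above can be sketched in Python. This is a minimal illustration under invented assumptions: the two coded factors, the synthetic response values, and the budget constraint are made up for demonstration and are not the thesis's actual model.

```python
import numpy as np
from itertools import product

def ccd_2factor(alpha=1.414):
    """Coded runs of a two-factor central composite design."""
    factorial = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
    center = [(0, 0)]
    return np.array(factorial + axial + center)

def fit_quadratic(X, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    return b

def predict(b, x1, x2):
    return b[0] + b[1]*x1 + b[2]*x2 + b[3]*x1**2 + b[4]*x2**2 + b[5]*x1*x2

X = ccd_2factor()
# Synthetic "safety performance" response with a known optimum at (0.5, 0.3)
y = 10 - (X[:, 0] - 0.5)**2 - (X[:, 1] - 0.3)**2
b = fit_quadratic(X, y)

# Grid search for the maximum subject to a budget constraint x1 + x2 <= 0.6
grid = np.linspace(-1, 1, 201)
best = max((predict(b, u, v), u, v)
           for u, v in product(grid, grid) if u + v <= 0.6 + 1e-9)
```

The constraint is active here (the unconstrained optimum spends 0.8 of "budget"), so the constrained optimum lands on the budget line, mirroring the abstract's point that the best policy under a budget differs from the unconstrained one.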
ERIC Educational Resources Information Center
Bradford, Gyndolyn
2017-01-01
The idea of mentoring in higher education is considered a good thing for students and faculty. What is missing in the research is how does mentoring influence and shape the student experience, does mentoring help retention, and how does it contribute to student development? (Crisp, Baker, Griffin, Lunsford, Pifer, 2017). The mentoring relationship…
Simundic, Ana-Maria; Nikolac, Nora; Topic, Elizabeta
2009-01-01
The aims of this article are to evaluate the methodological quality of genetic association studies on the inherited thrombophilia published during 2003 to 2005, to identify the most common mistakes made by authors of those studies, and to examine if overall quality of the article correlates with the quality of the journal. Articles were evaluated by 2 independent reviewers using the checklist of 16 items. A total of 58 eligible studies were identified. Average total score was 7.59 +/- 1.96. Total article score did not correlate with the journal impact factor (r = 0.3971; 95% confidence interval [CI], 0.1547-0.5944, P = .002). Total score did not differ across years (P = .624). Finally, it is concluded that methodological quality of genetic association studies is not optimal, and it does not depend on the quality of the journal. Journals should adopt methodological criteria for reporting the genetic association studies, and editors should encourage authors to strictly adhere to those criteria.
NASA Technical Reports Server (NTRS)
Owe, Manfred; deJeu, Richard; Walker, Jeffrey; Zukor, Dorothy J. (Technical Monitor)
2001-01-01
A methodology for retrieving surface soil moisture and vegetation optical depth from satellite microwave radiometer data is presented. The procedure is tested with historical 6.6 GHz brightness temperature observations from the Scanning Multichannel Microwave Radiometer over several test sites in Illinois. Results using only nighttime data are presented at this time, due to the greater stability of nighttime surface temperature estimation. The methodology uses a radiative transfer model to solve for surface soil moisture and vegetation optical depth simultaneously using a non-linear iterative optimization procedure. It assumes known constant values for the scattering albedo and roughness. Surface temperature is derived by a procedure using high frequency vertically polarized brightness temperatures. The methodology does not require any field observations of soil moisture or canopy biophysical properties for calibration purposes and is totally independent of wavelength. Results compare well with field observations of soil moisture and satellite-derived vegetation index data from optical sensors.
Rabbit Trails, Ephemera, and Other Stories: Feminist Methodology and Collaborative Research.
ERIC Educational Resources Information Center
Burnett, Rebecca E.; Ewald, Helen Rothschild
1994-01-01
Explores the intersections between feminist methodology and collaborative research to identify flashpoints that might derail attempts at collaboration. Analyzes the personal experiences of the authors in four ongoing collaborative research groups. (HB)
2017-01-01
Parkinson’s Disease (PD) is a progressive neurodegenerative movement disease affecting over 6 million people worldwide. Loss of dopamine-producing neurons results in a range of both motor and non-motor symptoms; however, there is currently no definitive test for PD by non-specialist clinicians, especially in the early disease stages where the symptoms may be subtle and poorly characterised. This results in a high misdiagnosis rate (up to 25% by non-specialists), and people can have the disease for many years before diagnosis. There is a need for a more accurate, objective means of early detection, ideally one which can be used by individuals in their home setting. In this investigation, keystroke timing information from 103 subjects (comprising 32 with mild PD severity and the remainder non-PD controls) was captured as they typed on a computer keyboard over an extended period; analysis of these data showed that PD affects various characteristics of hand and finger movement and that these can be detected. A novel methodology was used to classify the subjects’ disease status, by utilising a combination of many keystroke features which were analysed by an ensemble of machine learning classification models. When applied to two separate participant groups, this approach was able to successfully discriminate between early-PD subjects and controls with 96% sensitivity, 97% specificity and an AUC of 0.98. The technique does not require any specialised equipment or medical supervision, and does not rely on the experience and skill of the practitioner. Regarding more general application, it currently does not incorporate a second cardinal disease symptom, so may not differentiate PD from similar movement-related disorders. PMID:29190695
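The ensemble-classification idea above can be illustrated with a toy majority-vote scheme. Everything below is invented for illustration (feature names, thresholds, and subject values); the paper's actual keystroke features and machine learning models are not reproduced here.

```python
def make_threshold_classifier(feature_index, threshold):
    """A weak rule: predict PD (1) when one keystroke feature exceeds a threshold."""
    return lambda x: 1 if x[feature_index] > threshold else 0

def ensemble_predict(classifiers, x):
    """Majority vote across the ensemble of weak classifiers."""
    votes = sum(clf(x) for clf in classifiers)
    return 1 if votes > len(classifiers) / 2 else 0

# Invented feature vector layout: [mean hold time (ms), hold-time variance, flight time (ms)]
ensemble = [
    make_threshold_classifier(0, 110.0),
    make_threshold_classifier(1, 35.0),
    make_threshold_classifier(2, 180.0),
]

subject_a = [125.0, 42.0, 150.0]  # two of three rules fire -> majority says 1
subject_b = [95.0, 20.0, 190.0]   # one of three rules fires -> majority says 0
```

Real ensembles would combine trained models (e.g., trees or SVMs) rather than fixed thresholds, but the voting logic is the same.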
A Prospective Analysis of the Costs, Benefits, and Impacts of U.S. Renewable Portfolio Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mai, Trieu; Wiser, Ryan; Barbose, Galen
As states have gained experience with renewable portfolio standards (RPS) policies, many have made significant revisions to existing programs. In 2015 and 2016, seven states raised and extended their final RPS targets, while another state enacted a new RPS policy (Barbose 2016b). Interest in expanding and strengthening state RPS programs may continue, while efforts like recent proposals in many states to repeal or freeze existing RPS policies may also persist. In either context, questions about the potential costs, benefits, and other impacts of RPS programs are usually central to the decision-making process. This report follows on previous analyses that have focused on the historical costs, benefits, and other impacts of existing state RPS programs (Heeter et al. 2014; Wiser et al. 2016a). This report examines RPS outcomes prospectively, considering both current RPS policies as well as a potential expansion of those policies. The goal of this work is to provide a consistent and independent analytical methodology for that examination. This analysis relies on National Renewable Energy Laboratory’s (NREL’s) Regional Energy Deployment System (ReEDS) model to estimate changes to the U.S. electric power sector across a number of scenarios and sensitivity cases, focusing on the 2015–2050 timeframe. Based on those modeled results, we evaluate the costs, benefits, and other impacts of renewable energy contributing to RPS compliance using the suite of methods employed in a number of recent studies sponsored by the U.S. Department of Energy (DOE): a report examining retrospective benefits and impacts of RPS programs (Wiser et al. 2016a), the Wind Vision report (DOE 2015), the On the Path to SunShot report focusing on environmental benefits (Wiser et al. 2016b), and the Hydropower Vision report (DOE 2016).
Aerodynamic Performance of an Active Flow Control Configuration Using Unstructured-Grid RANS
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Viken, Sally A.
2001-01-01
This research is focused on assessing the value of the Reynolds-Averaged Navier-Stokes (RANS) methodology for active flow control applications. An experimental flow control database exists for a TAU0015 airfoil, which is a modification of a NACA0015 airfoil. The airfoil has discontinuities at the leading edge due to the implementation of a fluidic actuator and aft of mid chord on the upper surface. This paper documents two- and three-dimensional computational results for the baseline wing configuration (no control) against the experimental results. The two-dimensional results suggest that the mid-chord discontinuity does not affect the aerodynamics of the wing and can be ignored for more efficient computations. The leading-edge discontinuity significantly affects the lift and drag; hence, the integrity of the leading-edge notch discontinuity must be maintained in the computations to achieve a good match with the experimental data. The three-dimensional integrated performance results are in good agreement with the experiments in spite of some convergence and grid resolution issues.
Optimizing LX-17 Thermal Decomposition Model Parameters with Evolutionary Algorithms
NASA Astrophysics Data System (ADS)
Moore, Jason; McClelland, Matthew; Tarver, Craig; Hsu, Peter; Springer, H. Keo
2017-06-01
We investigate and model the cook-off behavior of LX-17 because this knowledge is critical to understanding system response in abnormal thermal environments. Thermal decomposition of LX-17 has been explored in conventional ODTX (One-Dimensional Time-to-eXplosion), PODTX (ODTX with pressure-measurement), TGA (thermogravimetric analysis), and DSC (differential scanning calorimetry) experiments using varied temperature profiles. These experimental data are the basis for developing multiple reaction schemes with coupled mechanics in LLNL's multi-physics hydrocode, ALE3D (Arbitrary Lagrangian-Eulerian code in 2D and 3D). We employ evolutionary algorithms to optimize reaction rate parameters on high performance computing clusters. Once experimentally validated, this model will be scalable to a number of applications involving LX-17 and can be used to develop more sophisticated experimental methods. Furthermore, the optimization methodology developed herein should be applicable to other high explosive materials. This work was performed under the auspices of the U.S. DOE by LLNL under contract DE-AC52-07NA27344. LLNS, LLC.
Optimization and kinetic study of ultrasonic assisted esterification process from rubber seed oil.
Trinh, Huong; Yusup, Suzana; Uemura, Yoshimitsu
2018-01-01
Recently, rubber seed oil (RSO) has been considered a promising potential oil source for biodiesel production. However, RSO is a non-edible feedstock with a significantly high free fatty acid (FFA) content, which has an adverse impact on the process of biodiesel production. In this study, an ultrasonic-assisted esterification process was conducted as a pre-treatment step to reduce the high FFA content of RSO from 40.14% to 0.75%. Response surface methodology (RSM) using central composite design (CCD) was applied to the design of experiments (DOE) and the optimization of the esterification process. The results showed that the methanol-to-oil molar ratio was the most influential factor for FFA reduction, whereas the effects of the amount of catalyst and of the reaction were both insignificant. The kinetic study revealed that the activation energy and the frequency factor of the process are 52.577 kJ/mol and 3.53×10⁸ min⁻¹, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
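The reported kinetic parameters can be plugged into the Arrhenius equation, k = A·exp(−Ea/RT). The sketch below uses the abstract's Ea and A; the evaluation temperature (about 60 °C) is an assumption added for illustration, not a condition stated in the abstract.

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius_rate(A, Ea, T):
    """Rate constant k = A * exp(-Ea / (R*T)); Ea in J/mol, T in K."""
    return A * math.exp(-Ea / (R * T))

# Ea = 52.577 kJ/mol and A = 3.53e8 min^-1 from the kinetic study above;
# 333.15 K (~60 degC) is an assumed reaction temperature.
k_60C = arrhenius_rate(3.53e8, 52577.0, 333.15)  # in min^-1
```

As expected for an activated process, the rate constant rises steeply with temperature, which is why esterification pre-treatments are usually run warm.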
Dvorakova, Antonie
2016-12-01
When Hall, Yip, and Zárate (2016) suggested that cultural psychology focused on reporting differences between groups, they described comparative research conducted in other fields, including cross-cultural psychology. Cultural psychology is a different discipline with methodological approaches reflecting its dissimilar goal, which is to highlight the cultural grounding of human psychological characteristics, and ultimately make culture central to psychology in general. When multicultural psychology considers, according to Hall et al., the mechanisms of culture's influence on behavior, it treats culture the same way as cross-cultural psychology does. In contrast, cultural psychology goes beyond treating culture as an external variable when it proposes that culture and psyche are mutually constitutive. True psychology of the human experience must encompass world populations through research of the ways in which (a) historically grounded sociocultural contexts enable the distinct meaning systems that people construct, and (b) these systems simultaneously guide the human formation of the environments. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Verbal overshadowing of face memory does occur in children too!
Dehon, Hedwige; Vanootighem, Valentine; Brédart, Serge
2013-01-01
Verbal descriptions of unfamiliar faces have been found to impair later identification of these faces in adults, a phenomenon known as the “verbal overshadowing effect” (VOE). Although determining whether children are good at describing unfamiliar individuals and whether these descriptions impair their recognition performance is critical to gaining a better understanding of children's eyewitness ability, only a couple of studies have examined this dual issue in children, and these found no evidence of VOE. However, as there are some methodological criticisms of these studies, we decided to conduct two further experiments in 7–8, 10–11, and 13–14-year-old children and in adults using a more optimal method for the VOE to be observed. Evidence of the VOE on face identification was found in both children and adults. Moreover, neither the accuracy of descriptions, nor delay, nor target presence in the lineup was found to be associated with identification accuracy. The theoretical and developmental implications of these findings are discussed. PMID:24399985
On use of the multistage dose-response model for assessing laboratory animal carcinogenicity
Nitcheva, Daniella; Piegorsch, Walter W.; West, R. Webster
2007-01-01
We explore how well a statistical multistage model describes dose-response patterns in laboratory animal carcinogenicity experiments from a large database of quantal response data. The data are collected from the U.S. EPA’s publicly available IRIS data warehouse and examined statistically to determine how often higher-order values in the multistage predictor yield significant improvements in explanatory power over lower-order values. Our results suggest that the addition of a second-order parameter to the model only improves the fit about 20% of the time, while adding even higher-order terms apparently does not contribute to the fit at all, at least with the study designs we captured in the IRIS database. Also included is an examination of statistical tests for assessing significance of higher-order terms in a multistage dose-response model. It is noted that bootstrap testing methodology appears to offer greater stability for performing the hypothesis tests than a more-common, but possibly unstable, “Wald” test. PMID:17490794
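The multistage model referred to above has the quantal form P(d) = 1 − exp(−(q0 + q1·d + q2·d² + …)). A minimal sketch follows; the coefficients are invented for illustration and are not fitted to IRIS data.

```python
import math

def multistage_response(dose, q):
    """Probability of response at the given dose for multistage coefficients q."""
    poly = sum(qk * dose**k for k, qk in enumerate(q))
    return 1.0 - math.exp(-poly)

def extra_risk(dose, q):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0 = multistage_response(0.0, q)
    return (multistage_response(dose, q) - p0) / (1.0 - p0)

# Invented coefficients: background, first-order, and second-order terms.
# Dropping the q2 term (setting it to 0) gives the lower-order model that,
# per the abstract, is often adequate in practice.
q = [0.05, 0.3, 0.02]
p_background = multistage_response(0.0, q)  # = 1 - exp(-q0)
```

Order selection amounts to asking whether adding the q2 (or higher) term significantly improves the likelihood, which is the question the abstract examines with bootstrap and Wald tests.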
Human punishment is not primarily motivated by inequality
Marczyk, Jesse
2017-01-01
Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player’s payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value. PMID:28187166
Trajectory Dispersed Vehicle Process for Space Launch System
NASA Technical Reports Server (NTRS)
Statham, Tamara; Thompson, Seth
2017-01-01
The Space Launch System (SLS) vehicle is part of NASA's deep space exploration plans that include manned missions to Mars. Manufacturing uncertainties in design parameters are key considerations throughout SLS development, as they have significant effects on focus parameters such as lift-off thrust-to-weight, vehicle payload, maximum dynamic pressure, and compression loads. This presentation discusses how the SLS program captures these uncertainties by utilizing a 3 degree-of-freedom (DOF) process called Trajectory Dispersed (TD) analysis. This analysis biases nominal trajectories to identify extremes in the design parameters for various potential SLS configurations and missions. The process utilizes Design of Experiments (DOE) and response surface methodology (RSM) to statistically sample uncertainties, and develops resulting vehicles using a Maximum Likelihood Estimate (MLE) process for targeting uncertainty bias. These vehicles represent various missions and configurations which are used as key inputs into a variety of analyses in the SLS design process, including 6 DOF dispersions, separation clearances, and engine-out failure studies.
NASA Astrophysics Data System (ADS)
Khai Tran, Van
2018-04-01
Socio-economic growth in Vietnam greatly depends on the new rural development process in this country; this work is considered to hold a strategic position in national development. Thus, the study of the principles of the architectural design of new rural housing solutions in accordance with the New Rural Environment required by the Vietnam Ministry of Construction has become urgent. Climate change has become a global concern, so architectural designs that respond to climate change and make the living environment of rural people better are in major demand. Experience has shown that the dogmatic application of current urban-type housing does not meet the requirements of New Rural Housing. This research intends to show that the solutions of the traditional Vietnamese rural house, which retain the advantages of traditional architecture with excellent characteristics that have been tested over thousands of years, when combined with advanced design methodologies and technologies, will be the appropriate solution for ‘New Rural Housing’.
Aguirre, Ana-Maria; Bassi, Amarjeet
2014-07-01
Biofuels from algae are considered a technically viable energy source that overcomes several of the problems present in previous generations of biofuels. In this research high pressure steaming (HPS) was studied as a hydrothermal pre-treatment for extraction of lipids from Chlorella vulgaris, and analysis by response surface methodology allowed finding operational points in terms of target temperature and algae concentration for high lipid and glucose yields. Within the range covered by these experiments the best conditions for high bio-crude yield are temperatures higher than 174°C and low biomass concentrations (<5 g/L). For high glucose yield there are two suitable operational ranges, either low temperatures (<105°C) and low biomass concentrations (<4 g/L); or low temperatures (<105°C) and high biomass concentrations (<110 g/L). High pressure steaming is a good hydrothermal treatment for lipid recovery and does not significantly change the fatty acids profile for the range of temperatures studied. Copyright © 2014 Elsevier Ltd. All rights reserved.
Thermal-Aware Test Access Mechanism and Wrapper Design Optimization for System-on-Chips
NASA Astrophysics Data System (ADS)
Yu, Thomas Edison; Yoneda, Tomokazu; Chakrabarty, Krishnendu; Fujiwara, Hideo
Rapid advances in semiconductor manufacturing technology have led to higher chip power densities, which places greater emphasis on packaging and temperature control during testing. For system-on-chips, peak power-based scheduling algorithms have been used to optimize tests under specified power constraints. However, imposing power constraints does not always solve the problem of overheating due to the non-uniform distribution of power across the chip. This paper presents a TAM/Wrapper co-design methodology for system-on-chips that ensures thermal safety while still optimizing the test schedule. The method combines a simplified thermal-cost model with a traditional bin-packing algorithm to minimize test time while satisfying temperature constraints. Furthermore, for temperature checking, thermal simulation is done using cycle-accurate power profiles for more realistic results. Experiments show that even a minimal sacrifice in test time can yield a considerable decrease in test temperature as well as the possibility of further lowering temperatures beyond those achieved using traditional power-based test scheduling.
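The power-constrained scheduling idea above can be sketched as a simple bin-packing pass. This is a generic first-fit-decreasing illustration with invented power values; it does not reproduce the paper's thermal-cost TAM/wrapper co-design algorithm.

```python
def schedule_tests(test_powers, power_cap):
    """Group core tests into concurrent sessions whose summed power stays
    under power_cap, using first-fit decreasing (a classic bin-packing
    heuristic of the kind the paper combines with a thermal-cost model)."""
    sessions = []  # each session is a list of test powers run in parallel
    for p in sorted(test_powers, reverse=True):
        for s in sessions:
            if sum(s) + p <= power_cap:
                s.append(p)
                break
        else:
            sessions.append([p])  # no session had headroom; open a new one
    return sessions

# Invented per-core test power values (mW) and a 500 mW chip-level cap
sessions = schedule_tests([120, 300, 200, 150, 90, 240], power_cap=500)
```

A thermal-aware variant would replace the `sum(s) + p <= power_cap` check with a thermal-cost estimate, since (as the abstract notes) a chip can overheat locally even while total power stays within its cap.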
Hydrogel design of experiments methodology to optimize hydrogel for iPSC-NPC culture.
Lam, Jonathan; Carmichael, S Thomas; Lowry, William E; Segura, Tatiana
2015-03-11
Bioactive signals can be incorporated in hydrogels to direct encapsulated cell behavior. Design of experiments methodology systematically varies the signals to determine the individual and combinatorial effects of each factor on cell activity. Using this approach enabled the optimization of three ligand concentrations (RGD, YIGSR, IKVAV) for the survival and differentiation of neural progenitor cells. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
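A two-level full-factorial screen of the three ligands is one standard DoE starting point for a study like the one above. The sketch below is hypothetical: the coded design is generic, and the response values are invented to show how a main effect is estimated; they are not the paper's data.

```python
from itertools import product

def full_factorial(levels, n_factors):
    """All combinations of coded levels for n factors (2^n runs for two levels)."""
    return list(product(levels, repeat=n_factors))

def main_effect(design, response, factor):
    """Average response at the +1 level minus average response at the -1 level."""
    high = [y for x, y in zip(design, response) if x[factor] == 1]
    low = [y for x, y in zip(design, response) if x[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

# Coded factors 0, 1, 2 stand in for RGD, YIGSR, IKVAV concentrations
design = full_factorial((-1, 1), 3)  # 2^3 = 8 runs
# Invented survival scores: factor 0 helps strongly, factor 2 is neutral
response = [10 + 4 * x[0] + 1 * x[1] + 0 * x[2] for x in design]

rgd_effect = main_effect(design, response, 0)
```

Ranking factors by the magnitude of their main effects is how a screen like this identifies which ligand concentrations matter before a finer response-surface optimization.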
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, Paul R.; Boyce, Donald E.; Park, Jun-Sang
A robust methodology is presented to extract slip system strengths from lattice strain distributions for polycrystalline samples obtained from high-energy x-ray diffraction (HEXD) experiments with in situ loading. The methodology consists of matching the evolution of coefficients of a harmonic expansion of the distributions from simulation to the coefficients derived from measurements. Simulation results are generated via finite element simulations of virtual polycrystals that are subjected to the loading history applied in the HEXD experiments. Advantages of the methodology include: (1) its ability to utilize extensive data sets generated by HEXD experiments; (2) its ability to capture trends in distributions that may be noisy (both measured and simulated); and (3) its sensitivity to the ratios of the family strengths. The approach is used to evaluate the slip system strengths of Ti-6Al-4V using samples having relatively equiaxed grains. These strength estimates are compared to values in the literature.
Dipolar recoupling in solid state NMR by phase alternating pulse sequences
Lin, J.; Bayro, M.; Griffin, R. G.; Khaneja, N.
2009-01-01
We describe some new developments in the methodology of making heteronuclear and homonuclear recoupling experiments in solid state NMR insensitive to rf-inhomogeneity by phase alternating the irradiation on the spin system every rotor period. By incorporating delays of half rotor periods in the pulse sequences, these phase alternating experiments can be made γ encoded. The proposed methodology is conceptually different from the standard methods of making recoupling experiments robust by the use of ramps and adiabatic pulses in the recoupling periods. We show how the concept of phase alternation can be incorporated in the design of homonuclear recoupling experiments that are both insensitive to chemical-shift dispersion and rf-inhomogeneity. PMID:19157931
ERIC Educational Resources Information Center
Zimmerman, Aaron Samuel; Kim, Jeong-Hee
2017-01-01
Narrative inquiry has been a popular methodology in different disciplines for the last few decades. Using stories, narrative inquiry illuminates lived experience, serving as a valuable complement to research methodologies that are rooted in positivist epistemologies. In this article, we present a brief introduction to narrative inquiry including…
Story-Making as Methodology: Disrupting Dominant Stories through Multimedia Storytelling.
Rice, Carla; Mündel, Ingrid
2018-05-01
In this essay, we discuss multimedia story-making methodologies developed through Re•Vision: The Centre for Art and Social Justice that investigates the power of the arts, especially story, to positively influence decision makers in diverse sectors. Our story-making methodology brings together majority and minoritized creators to represent previously unattended experiences (e.g., around mind-body differences, queer sexuality, urban Indigenous identity, and Inuit cultural voice) with the aim of building understanding and shifting policies/practices that create barriers to social inclusion and justice. We analyze our ongoing efforts to rework our storytelling methodology, spotlighting acts of revising carried out by facilitators and researchers as they/we redefine methodological terms for each storytelling context, by researcher-storytellers as they/we rework material from our lives, and by receivers of the stories as we revise our assumptions about particular embodied histories and how they are defined within dominant cultural narratives and institutional structures. This methodology, we argue, contributes to the existing qualitative lexicon by providing innovative approaches not only for chronicling marginalized/misrepresented experiences and critically researching selves, but also for scaffolding intersectional alliances and for imagining more just futures. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.
Where does good quality qualitative health care research get published?
Richardson, Jane C; Liddle, Jennifer
2017-09-01
This short report aims to give some insight into current publication patterns for high-quality qualitative health research, using the Research Excellence Framework (REF) 2014 database. We explored patterns of publication by range and type of journal, by date and by methodological focus. We also looked at variations between the publications submitted to different Units of Assessment, focussing particularly on the one most closely aligned with our own research area of primary care. Our brief analysis demonstrates that general medical/health journals with high impact factors are the dominant routes of publication, but there is variation according to the methodological approach adopted by articles. The number of qualitative health articles submitted to REF 2014 overall was small, and even more so for articles based on mixed methods research, qualitative methodology or reviews/syntheses that included qualitative articles.
77 FR 53059 - Risk-Based Capital Guidelines: Market Risk
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-30
...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and Federal Deposit Insurance Corporation (FDIC) are revising their market risk capital rules to better capture positions for which the market risk capital rules are appropriate; reduce procyclicality; enhance the rules' sensitivity to risks that are not adequately captured under current methodologies; and increase transparency through enhanced disclosures. The final rule does not include all of the methodologies adopted by the Basel Committee on Banking Supervision for calculating the standardized specific risk capital requirements for debt and securitization positions due to their reliance on credit ratings, which is impermissible under the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. Instead, the final rule includes alternative methodologies for calculating standardized specific risk capital requirements for debt and securitization positions.
Flight Screening Program Effects on Attrition in Undergraduate Pilot Training
1987-08-01
the final five lesson grades (8-12), suggesting that a UPT screening decision could be made at an earlier stage of FSP than is the current practice... Does FSP Provide An Opportunity For SIE?... Training/Experience Effects of FSP: Does the FSP Give a Training/Experience Benefit in UPT... effect. FSP Screening: Does FSP Provide an Opportunity for SIE? Some individuals who have had no previous flying experience (other than as passengers) may
Using Sandelowski and Barroso's Meta-Synthesis Method in Advancing Qualitative Evidence.
Ludvigsen, Mette S; Hall, Elisabeth O C; Meyer, Gabriele; Fegran, Liv; Aagaard, Hanne; Uhrenfeldt, Lisbeth
2016-02-01
The purpose of this article was to iteratively account for and discuss the handling of methodological challenges in two qualitative research syntheses concerning patients' experiences of hospital transition. We applied Sandelowski and Barroso's guidelines for synthesizing qualitative research, and to our knowledge, this is the first time researchers discuss their methodological steps. In the process, we identified a need for prolonged discussions to determine mutual understandings of the methodology. We discussed how to identify the appropriate qualitative research literature and how to best conduct exhaustive literature searches on our target phenomena. Another finding concerned our status as third-order interpreters of participants' experiences and what this meant for synthesizing the primary findings. Finally, we discussed whether our studies could be classified as metasummaries or metasyntheses. Although we have some concerns regarding the applicability of the methodology, we conclude that following Sandelowski and Barroso's guidelines contributed to valid syntheses of our studies. © The Author(s) 2015.
The work of Galileo and conformation of the experiment in physics
NASA Astrophysics Data System (ADS)
Alvarez, J. L.; Posadas, Y.
2003-02-01
It is common to find comments and references to Galileo's work suggesting that he based his claims on logical reasoning rather than on observations. In this paper we present an analysis of some experiments that he performed, which remained unknown in the XVI and XVII centuries; in them we find a clear description of the methodology that Galileo followed in order to reach the results presented in his formal work, particularly in the Discorsi. In contrast with Aristotelian philosophy, in these manuscripts Galileo adopts a methodology that made great contributions to the modern conformation of the experimental method, thereby founding a methodology for the study of motion. We use this analysis as an example of the difficulties present in the development of modern experimentation, and we point out the necessity of stressing the importance of scientific methodology in the teaching of physics.
Fudin, R
1999-08-01
Analyses of procedures in Lloyd H. Silverman's subliminal psychodynamic activation experiments identify problems and questions. Given the information provided, none of his experiments can be replicated, and none of his positive results were obtained under the luminance conditions he reckoned in 1983 were typical of such outcomes. Furthermore, there is no evidence in any of his experiments that all stimuli were presented completely within the fovea, a condition critical to the production of positive findings (Silverman & Geisler, 1986). These considerations, and the fact that no experiment using Silverman's procedures can yield unambiguous positive results (Fudin, 1986), underscore the need to start research in this area anew. Such research should be undertaken with a greater appreciation of the methodological issues involved in exposing and encoding subliminal stimuli than that found in all but a few experiments on subliminal psychodynamic activation.
Synthesis and use of 2-[18F]fluoromalondialdehyde, an accessible synthon for bioconjugation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hooker, Jacob M.
We proposed methods for the synthesis and purification of 2-[18F]fluoromalondialdehyde, which will be a readily accessible synthon for bioconjugation. Our achievements in these areas will specifically address a stated goal of the DOE by providing a transformational technology for macromolecule radiolabeling. Accomplishment of our aims will serve both DOE mission-related research as well as nuclear medicine research supported by the NIH and industry. At the heart of our proposal is the aim to “improve synthetic methodology for rapidly and efficiently incorporating radionuclides into a wide range of organic compounds.”
NASA Technical Reports Server (NTRS)
Schulte, Peter Z.; Moore, James W.
2011-01-01
The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
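As a toy illustration of the kind of DoE sweep the abstract describes (not CPAS code), the sketch below runs a two-level factorial of uncertain simulation inputs through a simple terminal-descent balance and flags the combinations that violate a hypothetical rate-of-descent requirement. All input levels and the 9.0 m/s limit are invented.

```python
# Sweep a 2^3 factorial of simulation inputs through a steady-state descent
# model and collect the requirement-violating combinations.
import math
from itertools import product

G, RHO = 9.81, 1.225          # gravity (m/s^2), sea-level air density (kg/m^3)
REQ_MAX_DESCENT = 9.0         # hypothetical requirement limit (m/s)

def terminal_velocity(mass, cd, area):
    """Descent rate where drag balances weight: v = sqrt(2mg / (rho*Cd*A))."""
    return math.sqrt(2 * mass * G / (RHO * cd * area))

# Low/high levels for each uncertain input (illustrative values only).
masses = [8000.0, 9500.0]     # vehicle mass, kg
cds    = [0.75, 0.9]          # parachute drag coefficient
areas  = [2000.0, 2400.0]     # total canopy reference area, m^2

violations = [(m, cd, a) for m, cd, a in product(masses, cds, areas)
              if terminal_velocity(m, cd, a) > REQ_MAX_DESCENT]
```

Interpreting such a sweep as a DoE study amounts to asking which input, or combination of inputs, drives the response across the requirement limit; here the high-mass/low-drag corner of the design space is where violations cluster.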
How to survive (and enjoy) doing a thesis: the experiences of a methodological working group.
Giddings, Lynne S; Wood, Pamela J
2006-03-01
'Doing a thesis', whether for Masters or PhD, can be a lonely and tortuous journey. This article offers a complementary process to the traditional apprenticeship supervision model. It describes the experiences of students who during their thesis research met monthly in a grounded theory working group. They reflected on their experiences during a focus group interview. After describing the background to how the group started in 1999 and exploring some of the ideas in the literature concerning the thesis experience, the article presents the interview. To focus the presentation, specific questions are used as category headings. Overall, the participants found attending the group was a "life-line" that gave them "hope" and was complementary to the supervision process. Through the support of peers, guidance from those ahead in the process, and consultancy with teachers and visiting methodological scholars, these students not only successfully completed their theses, but reported that they had some enjoyment along the way. This is the fifteenth in a series of articles which have been based on interviews with nursing and midwifery researchers, and were primarily designed to offer the beginning researcher a first-hand account of the experience of using particular methodologies.
Bayesian cross-entropy methodology for optimal design of validation experiments
NASA Astrophysics Data System (ADS)
Jiang, X.; Mahadevan, S.
2006-07-01
An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
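A minimal sketch of the selection loop described above, under strong simplifying assumptions: the model-prediction and experimental-output distributions are taken to be Gaussians, for which the cross entropy has the closed form H(p,q) = ½·log(2π·σq²) + (σp² + (μp − μq)²)/(2·σq²), and simulated annealing searches a single scalar design input. The toy "model" and "experiment" mean functions are invented, and the Bayesian updating step of the full methodology is omitted.

```python
# Simulated annealing over a scalar design variable x, scoring candidates by
# the closed-form cross entropy between two Gaussians that stand in for the
# model-prediction and experimental-output distributions.
import math, random

def gaussian_cross_entropy(mu_p, var_p, mu_q, var_q):
    """H(p, q) for Gaussians p and q (closed form)."""
    return 0.5 * math.log(2 * math.pi * var_q) + (var_p + (mu_p - mu_q) ** 2) / (2 * var_q)

def utility(x):
    # Hypothetical: prediction and observation means both depend on the
    # design input x; variances are fixed. They agree exactly at x = 3.
    mu_model, mu_exp = 2.0 * x, 2.0 * x + 0.5 * (x - 3.0) ** 2
    return gaussian_cross_entropy(mu_model, 0.25, mu_exp, 0.25)

def anneal(x0, steps=2000, temp0=1.0, seed=1):
    rng = random.Random(seed)
    x, e = x0, utility(x0)
    best_x, best_e = x, e
    for k in range(steps):
        temp = temp0 * (1 - k / steps) + 1e-9   # linear cooling schedule
        cand = x + rng.gauss(0, 0.2)
        e_cand = utility(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_cand < e or rng.random() < math.exp(-(e_cand - e) / temp):
            x, e = cand, e_cand
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

best_x, best_e = anneal(0.0)
```

In the full methodology the returned design point is where the next validation experiment is run; the measured output then updates the experimental-output distribution via Bayes' theorem and the search repeats.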
2017 DOE Vehicle Technologies Office Annual Merit Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The 2017 U.S. Department of Energy (DOE) Hydrogen and Fuel Cells Program and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held June 5-9, 2017, in Washington, DC. The review encompassed work done by the Hydrogen and Fuel Cells Program and VTO: 263 individual activities were reviewed for VTO by 191 reviewers. A total of 1,241 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
2016 DOE Vehicle Technologies Office Annual Merit Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The 2016 U.S. Department of Energy (DOE) Hydrogen and Fuel Cells Program and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held June 6-9, 2016, in Washington, DC. The review encompassed work done by the Hydrogen and Fuel Cells Program and VTO: 226 individual activities were reviewed for VTO, by 171 reviewers. A total of 1,044 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
2015 DOE Vehicle Technologies Office Annual Merit Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The 2015 U.S. Department of Energy (DOE) Fuel Cell Technologies Office (FCTO) and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held June 8-12, 2015, in Arlington, Virginia. The review encompassed all of the work done by the FCTO and the VTO: 258 individual activities were reviewed for VTO, by 170 reviewers. A total of 1,095 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE on the Office with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
2010-01-01
Background In rubber hand illusions and full body illusions, touch sensations are projected to non-body objects such as rubber hands, dolls or virtual bodies. The robustness, limits and further perceptual consequences of such illusions are not yet fully explored or understood. A number of experiments are reported that test the limits of a variant of the rubber hand illusion. Methodology/Principal Findings A variant of the rubber hand illusion is explored, in which the real and foreign hands are aligned in personal space. The presence of the illusion is ascertained with participants' scores and temperature changes of the real arm. This generates a basic illusion of touch projected to a foreign arm. Participants are presented with further, unusual visuotactile stimuli subsequent to onset of the basic illusion. Such further visuotactile stimulation is found to generate very unusual experiences of supernatural touch and touch on a non-hand object. The finding of touch on a non-hand object conflicts with prior findings, and to resolve this conflict a further hypothesis is successfully tested: that without prior onset of the basic illusion this unusual experience does not occur. Conclusions/Significance A rubber hand illusion is found that can arise when the real and the foreign arm are aligned in personal space. This illusion persists through periods of no tactile stimulation and is strong enough to allow very unusual experiences of touch felt on a cardboard box and experiences of touch produced at a distance, as if by supernatural causation. These findings suggest that one's visual body image is explained away during experience of the illusion and they may be of further importance to understanding the role of experience in delusion formation. The findings of touch on non-hand objects may help reconcile conflicting results in this area of research. 
In addition, new evidence is provided that relates to the recently discovered psychologically induced temperature changes that occur during the illusion. PMID:20195378
The Split Fovea Theory and the Leicester critique: what do the data say?
Van der Haegen, Lise; Drieghe, Denis; Brysbaert, Marc
2010-01-01
According to the Split Fovea Theory (SFT) recognition of foveally presented words involves interhemispheric transfer. This is because letters to the left of the fixation location are initially sent to the right hemisphere, whereas letters to the right of the fixation position are projected to the left hemisphere. Both sources of information must be integrated for words to be recognized. Evidence for the SFT comes from the Optimal Viewing Position (OVP) paradigm, in which foveal word recognition is examined as a function of the letter fixated. OVP curves are different for left and right language dominant participants, indicating a time cost when information is presented in the half-field ipsilateral to the dominant hemisphere (Hunter, Brysbaert, & Knecht, 2007). The methodology of the SFT research has recently been questioned, because not enough efforts were made to ensure adequate fixation. The aim of the present study is to test the validity of this argument. Experiment 1 replicated the OVP effect in a naming task by presenting words at different fixation positions, with the experimental settings applied in previous OVP research. Experiment 2 monitored and controlled eye fixations of the participants and presented the stimuli within the boundaries of the fovea. Exactly the same OVP curve was obtained. In Experiment 3, the eyes were also tracked and monocular viewing was used. Results again revealed the same OVP effect, although latencies were remarkably higher than in the previous experiments. From these results we can conclude that although noise is present in classical SFT studies without eye-tracking, this does not change the OVP effect observed with left dominant individuals.
Perspectives of optic nerve prostheses.
Lane, Frank John; Nitsch, Kristian; Huyck, Margaret; Troyk, Philip; Schug, Ken
2016-01-01
A number of projects are investigating the ability to restore visual percepts to individuals who are blind through a visual prosthesis. While many projects have reported results from a technical standpoint, very little exists in the professional literature on the human experience of visual implant technology. The current study uses an ethnographic methodological approach to document the experiences of the research participants and study personnel of an optic nerve vision prosthesis project in Brussels, Belgium. The findings have implications for motivation for participating in clinical trials, ethical safeguards for participants, and the role of the participant in a research study. Implications for Rehabilitation Rehabilitation practitioners are often solicited by prospective participants to assist in evaluating a clinical trial before making a decision about participation. Rehabilitation professionals should be aware that: The decision to participate in a clinical trial is ultimately up to the individual participant. However, participants should be aware that family members might experience stress from a lack of knowledge about the research study. The more opportunities a participant has to share thoughts and feelings about the research study with investigators, the more positive the overall experience is likely to be. Ethical safeguards put in place to protect the interests of an individual participant may have the opposite effect and create stress. Rehabilitation professionals can play an important role as participant advocates from recruitment through termination of the research study. Participant hope is an important component of participation in a research study. Information provided to participants by investigators during the consent process should be balanced carefully with potential benefits, so that it does not destroy a participant's hope.
34 CFR 647.22 - How does the Secretary evaluate prior experience?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary evaluate prior experience? 647.22 Section 647.22 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION RONALD E. MCNAIR POSTBACCALAUREATE ACHIEVEMENT...
Practical aspects of running DOE for improving growth media for in vitro plants
USDA-ARS?s Scientific Manuscript database
Experiments using DOE software to improve plant tissue culture growth medium are complicated and require complex setups. Once the experimental design is set and the treatment points calculated, media sheets and mixing charts must be developed. Since these experiments require three passages on the sa...
24 CFR 904.205 - Training methodology.
Code of Federal Regulations, 2010 CFR
2010-04-01
... DEVELOPMENT LOW RENT HOUSING HOMEOWNERSHIP OPPORTUNITIES Homeownership Counseling and Training § 904.205... training methodology. Because groups vary, there should be adaptability in the communication and learning experience. Methods to be utilized may include group presentations, small discussion groups, special classes...
Power-sharing Partnerships: Teachers' Experiences of Participatory Methodology.
Ferreira, Ronél; Ebersöhn, Liesel; Mbongwe, Bathsheba B
2015-01-01
This article reports on the experiences of teachers as coresearchers in a long-term partnership with university researchers, who participated in an asset-based intervention project known as Supportive Teachers, Assets and Resilience (STAR). In an attempt to inform participatory research methodology, the study investigated how coresearchers (teachers) experienced power relations. We utilized Gaventa's power cube as a theoretical framework and participatory research as our methodologic paradigm. Ten teachers of a primary school in the Eastern Cape and five teachers of a secondary school in a remote area of the Mpumalanga Province in South Africa participated (n=15). We employed multiple data generation techniques, namely Participatory Reflection and Action (PRA) activities, observation, focus group discussions, and semistructured interviews, using thematic analysis and categorical aggregation for data analysis. We identified three themes, related to (1) the nature of power in participatory partnerships, (2) coresearchers' meaning making of power and partnerships, and (3) their role in taking agency. Based on these findings, we developed a framework of power-sharing partnerships to extend Gaventa's power cube theory. This framework, and its five interrelated elements (leadership as power, identifying vision and mission, synergy, interdependent role of partners, and determination), provide insight into the way coresearchers shared their experiences of participatory research methodology. We theorise power-sharing partnerships as a complementary platform hosting partners' shared strengths, skills, and experience, creating synergy in collaborative projects.
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. 
Results We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. Conclusions We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. PMID:28302594
Where Does Latin "Sum" Come From?
ERIC Educational Resources Information Center
Nyman, Martti A.
1977-01-01
The derivation of Latin "sum," "es(s)," "est" from Indo-European "esmi," "esi," "esti" involves methodological problems. It is claimed here that the development of "sum" from "esmi" is related to the origin of the variation "est-st" (< "esti"). The study is primarily concerned with this process, but chronological suggestions are also made. (CHK)
21 CFR 514.8 - Supplements and other changes to an approved application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the drug as manufactured without the change; (C) Changes that may affect drug substance or drug... the proposed change; (C) The drug(s) involved; (D) The manufacturing site(s) or area(s) affected; (E...) Replacement of equipment with that of a different design that does not affect the process methodology or...
21 CFR 514.8 - Supplements and other changes to an approved application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the drug as manufactured without the change; (C) Changes that may affect drug substance or drug... the proposed change; (C) The drug(s) involved; (D) The manufacturing site(s) or area(s) affected; (E...) Replacement of equipment with that of a different design that does not affect the process methodology or...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
...-Filing system does not support unlisted software, and the NRC Meta System Help Desk will not be able to... Setpoint Methodology for LSSS [Limiting Safety System Setting] Functions,'' which included the instrument... System Instrumentation,'' Function 3, Condensate Storage Tank Level--Low. The supporting TS Bases will...
Reading Rocks: Creating a Space for Preservice Teachers to Become Responsive Teachers
ERIC Educational Resources Information Center
Assaf, Lori Czop; Lopez, Minda
2012-01-01
Set in a yearlong, school-based tutoring program, designed as a community of practice, we use qualitative methodology to examine how 14 preservice teachers learned to become responsive teachers. We focus on one question: In what ways does participating in a yearlong, supervised tutoring program mediate preservice teachers' learning about…
A Degradation Analysis Methodology for Maintenance Tasks
1985-05-12
Support Maintenance Company... Versus Individual Effectiveness... Effectiveness Results for Threat One... Effectiveness... model does not differentiate between a task which requires a high degree of fine finger dexterity from another task involving only gross body
Growing and Growing: Promoting Functional Thinking with Geometric Growing Patterns
ERIC Educational Resources Information Center
Markworth, Kimberly A.
2010-01-01
Design research methodology is used in this study to develop an empirically-substantiated instruction theory about students' development of functional thinking in the context of geometric growing patterns. The two research questions are: (1) How does students' functional thinking develop in the context of geometric growing patterns? (2) What are…
Does Adolescent Bullying Distinguish between Male Offending Trajectories in Late Middle Age?
ERIC Educational Resources Information Center
Piquero, Alex R.; Connell, Nadine M.; Piquero, Nicole Leeper; Farrington, David P.; Jennings, Wesley G.
2013-01-01
The perpetration of bullying is a significant issue among researchers, policymakers, and the general public. Although researchers have examined the link between bullying and subsequent antisocial behavior, data and methodological limitations have hampered firm conclusions. This study uses longitudinal data from 411 males in the Cambridge Study in…
ERIC Educational Resources Information Center
Zhou, Wencang; Hu, Huajing; Shi, Xuli
2015-01-01
Purpose: The purpose of this paper is to develop a framework for studying organizational learning, firm innovation and firm financial performance. Design/methodology/approach: This paper examines the effects of organizational learning on innovation and performance among 287 listed Chinese companies. Findings: The results indicate a positive…
30 CFR 1206.170 - What does this subpart contain?
Code of Federal Regulations, 2011 CFR
2011-07-01
... applies to all gas production from Indian (tribal and allotted) oil and gas leases (except leases on the... Indian oil and gas lease are inconsistent with any regulation in this subpart, then the Federal statute... valuation methodology. (d) All royalty payments you make to ONRR are subject to monitoring, review, audit...
Regression Discontinuity Design in Gifted and Talented Education Research
ERIC Educational Resources Information Center
Matthews, Michael S.; Peters, Scott J.; Housand, Angela M.
2012-01-01
This Methodological Brief introduces the reader to the regression discontinuity design (RDD), which is a method that when used correctly can yield estimates of research treatment effects that are equivalent to those obtained through randomized control trials and can therefore be used to infer causality. However, RDD does not require the random…
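The regression discontinuity logic summarized above lends itself to a short illustration. The following is a minimal sketch only, not the brief's own procedure; the data, cutoff, and bandwidth are entirely hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: students enter a gifted program when a screening
# score crosses a cutoff; the simulated outcome has a true jump of 5.0.
cutoff = 50.0
score = rng.uniform(0, 100, 2000)
treated = (score >= cutoff).astype(float)
outcome = 0.3 * score + 5.0 * treated + rng.normal(0, 2.0, 2000)

def rdd_estimate(score, outcome, cutoff, bandwidth):
    """Sharp RDD: fit a separate line on each side of the cutoff within
    the bandwidth and take the difference of the fitted values there."""
    left = (score >= cutoff - bandwidth) & (score < cutoff)
    right = (score >= cutoff) & (score <= cutoff + bandwidth)
    # np.polyfit returns [slope, intercept]; with scores centered at the
    # cutoff, each intercept is the fitted outcome at the cutoff itself.
    b_left = np.polyfit(score[left] - cutoff, outcome[left], 1)
    b_right = np.polyfit(score[right] - cutoff, outcome[right], 1)
    return b_right[1] - b_left[1]

effect = rdd_estimate(score, outcome, cutoff, bandwidth=10.0)
print(round(effect, 2))  # close to the true jump of 5.0
```

The jump in the fitted lines at the cutoff is the treatment-effect estimate; as the abstract notes, no random assignment is required, only assignment by the cutoff rule.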
Learning Gains for Core Concepts in a Serious Game on Scientific Reasoning
ERIC Educational Resources Information Center
Forsyth, Carol; Pavlik, Philip, Jr.; Graesser, Arthur C.; Cai, Zhiqiang; Germany, Mae-lynn; Millis, Keith; Dolan, Robert P.; Butler, Heather; Halpern, Diane
2012-01-01
"OperationARIES!" is an Intelligent Tutoring System that teaches scientific inquiry skills in a game-like atmosphere. Students complete three different training modules, each with natural language conversations, in order to acquire deep-level knowledge of 21 core concepts of research methodology (e.g., correlation does not mean…
Factors for Successful E-Learning: Does Age Matter?
ERIC Educational Resources Information Center
Fleming, Julie; Becker, Karen; Newton, Cameron
2017-01-01
Purpose: The purpose of this paper is to examine the factors affecting employees' overall acceptance, satisfaction and future use of e-learning, specifically exploring the impact that age has on the intended future use of e-learning relative to the other potential predictors. Design/Methodology/Approach: The project developed an online survey and…
ERIC Educational Resources Information Center
Barron, Kenneth E.; Apple, Kevin J.
2014-01-01
Coursework in statistics and research methods is a core requirement in most undergraduate psychology programs. However, is there an optimal way to structure and sequence methodology courses to facilitate student learning? For example, should statistics be required before research methods, should research methods be required before statistics, or…
Taking-On: A Grounded Theory of Addressing Barriers in Task Completion
ERIC Educational Resources Information Center
Austinson, Julie Ann
2011-01-01
This study of taking-on was conducted using classical grounded theory methodology (Glaser, 1978, 1992, 1998, 2001, 2005; Glaser & Strauss, 1967). Classical grounded theory is inductive, empirical, and naturalistic; it does not utilize manipulation or constrained time frames. Classical grounded theory is a systematic research method used to generate…
Handbook of the Economics of Education. Volume 3
ERIC Educational Resources Information Center
Hanushek, Eric A., Ed.; Machin, Stephen J., Ed.; Woessmann, Ludger, Ed.
2011-01-01
How does education affect economic and social outcomes, and how can it inform public policy? Volume 3 of the Handbooks in the Economics of Education uses newly available high quality data from around the world to address these and other core questions. With the help of new methodological approaches, contributors cover econometric methods and…
ERIC Educational Resources Information Center
Olsson, Tina M.
2012-01-01
Background: As research moves from questions of efficacy (can an intervention work) to questions of effectiveness (does an intervention work in practice), questions of efficiency (what are the costs and consequences of the intervention) become increasingly important. The incorporation of economic evaluation into the planning and execution of…
Democratic Schooling and Citizenship Education: What Does the Research Reveal?
ERIC Educational Resources Information Center
Hepburn, Mary A.
This paper examines four major research studies spanning approximately 45 years which provide solid evidence that democratic schooling is possible and is an extremely important factor in the education of young citizens for a democratic society. The objectives, methodology, limitations, and results of each study are examined. From the studies, the…
DOT National Transportation Integrated Search
2016-04-22
Each table below represents the list of Cost Centers associated with each APT Subfamily at the time the data for this report was gathered from APT in April 2016. The #701 Capital Family is not included as it does not have Cost Centers in a traditiona...
Research in the Field of Inclusive Education: Time for a Rethink?
ERIC Educational Resources Information Center
Messiou, Kyriaki
2017-01-01
This paper sets out to challenge thinking and practice amongst researchers in the field of inclusive education. It does this based on an analysis of published articles in the "International Journal of Inclusive Education" between 2005 and 2015, which identified topics and methodologies used in studies of inclusive education. The analysis…
Toward a General Research Process for Using Dubin's Theory Building Model
ERIC Educational Resources Information Center
Holton, Elwood F.; Lowe, Janis S.
2007-01-01
Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…
Does the Availability of Vocational Qualifications through Work Assist Social Inclusion?
ERIC Educational Resources Information Center
Smith, Erica; Smith, Andy
2011-01-01
Purpose: The purpose of this paper is to discuss whether the availability of qualifications through work-based traineeships in Australia assists social inclusion. Design/methodology/approach: Industry case studies, of the finance and cleaning industries, were undertaken as part of a national research project on quality in traineeships. The two…
ERIC Educational Resources Information Center
de Haan, Haijing Helen
2015-01-01
Purpose: The purpose of this paper is to critically investigate the discourse on "competitive advantage", a concept that has been widely applied in the public higher education sector, but rarely defined and conceptualised. Design/methodology/approach: In order to get some insightful understanding about how "competitive…
ERIC Educational Resources Information Center
Shawer, Saad F.
2013-01-01
This quantitative investigation examined the influence of low and high self-efficacy on teacher candidates' academic performance in a foreign language teaching methodology course, testing the hypothesis that high self-efficacy levels would improve pedagogical content knowledge (PCK). Positivism guided the research design at the levels of…
Getting inside the Insider Researcher: Does Race-Symmetry Help or Hinder Research?
ERIC Educational Resources Information Center
Vass, Greg
2017-01-01
This article engages with methodological concerns connected to insider education research and the "race-symmetry" shared between the researcher and teacher participants. To do this, race critical reflexive strategies are utilized to show how and why this practice productively contributed to the knowledge about race making constructed in…
Recycling as a Result of "Cultural Greening"?
ERIC Educational Resources Information Center
Flagg, Julia A.; Bates, Diane C.
2016-01-01
Purpose: This study aims to test whether faculty and students who have developed the most pro-environmental values and concerns are also the most likely to reduce the on-campus waste stream. It does so by using the theory of ecological modernization. Design/methodology/approach: Questionnaires were created and disseminated to a representative…
From I to We: Collaboration in Entrepreneurship Education and Learning?
ERIC Educational Resources Information Center
Warhuus, Jan P.; Tanggaard, Lene; Robinson, Sarah; Ernø, Steffen Moltrup
2017-01-01
Purpose: The purpose of this paper is to ask: what effect does moving from individual to collective understandings of the entrepreneur in enterprising education have on the student's learning? And given this shift in understanding, is there a need for a new paradigm in entrepreneurship learning? Design/methodology/approach: This paper draws on…
78 FR 76183 - Request for Information for the 2014; Trafficking in Persons Report
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-16
..., interviews, direct observations, or other sources of quantitative or qualitative data, details on the research or data-gathering methodology should be provided. The Department does not include in the Report... (``AG Report'') mandated by section 105 of the TVPA (22 U.S.C. 7103(d)(7)). In addition, the United...
Initiating "The Methodology of Jacques Rancière": How Does It All Start?
ERIC Educational Resources Information Center
Mercieca, Duncan P.
2012-01-01
Educationalists are currently engaging with Jacques Rancière's thought on emancipation and equality. The focus of this paper is on what initiates the process that starts emancipation. With reference to teachers the question is: how do teachers become emancipated? This paper discusses how the teacher's life is made "sensible" and how sense is…
Improved False Discovery Rate Estimation Procedure for Shotgun Proteomics.
Keich, Uri; Kertesz-Farkas, Attila; Noble, William Stafford
2015-08-07
Interpreting the potentially vast number of hypotheses generated by a shotgun proteomics experiment requires a valid and accurate procedure for assigning statistical confidence estimates to identified tandem mass spectra. Despite the crucial role such procedures play in most high-throughput proteomics experiments, the scientific literature has not reached a consensus about the best confidence estimation methodology. In this work, we evaluate, using theoretical and empirical analysis, four previously proposed protocols for estimating the false discovery rate (FDR) associated with a set of identified tandem mass spectra: two variants of the target-decoy competition protocol (TDC) of Elias and Gygi and two variants of the separate target-decoy search protocol of Käll et al. Our analysis reveals significant biases in the two separate target-decoy search protocols. Moreover, the one TDC protocol that provides an unbiased FDR estimate among the target PSMs does so at the cost of forfeiting a random subset of high-scoring spectrum identifications. We therefore propose the mix-max procedure to provide unbiased, accurate FDR estimates in the presence of well-calibrated scores. The method avoids biases associated with the two separate target-decoy search protocols and also avoids the propensity for target-decoy competition to discard a random subset of high-scoring target identifications.
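The classic target-decoy competition (TDC) estimate evaluated in this work can be illustrated with a minimal sketch; the scores below are synthetic and the helper name `tdc_fdr` is ours, not the paper's:

```python
import numpy as np

def tdc_fdr(target_scores, decoy_scores, threshold):
    """Basic target-decoy FDR estimate: the number of decoy matches above
    the threshold approximates the number of incorrect target matches
    above it, so their ratio estimates the FDR among accepted targets."""
    n_target = np.sum(np.asarray(target_scores) >= threshold)
    n_decoy = np.sum(np.asarray(decoy_scores) >= threshold)
    if n_target == 0:
        return 0.0
    return min(1.0, n_decoy / n_target)

# Synthetic example: target scores mix a high-scoring correct population
# with noise, while decoy scores are noise only.
rng = np.random.default_rng(1)
targets = np.concatenate([rng.normal(8, 1, 300), rng.normal(3, 1, 700)])
decoys = rng.normal(3, 1, 1000)

print(round(tdc_fdr(targets, decoys, threshold=6.0), 3))
```

A strict threshold yields a small estimated FDR, while a threshold below all scores drives the estimate toward 1. The mix-max procedure proposed in the paper refines this basic ratio; the sketch shows only the underlying counting logic.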
NASA Astrophysics Data System (ADS)
Baris, Engin
Distributed electric propulsion systems benefit from the inherent scale independence of electric propulsion. This property allows the designer to place multiple small electric motors along the wing of an aircraft instead of using one or several internal combustion engines with gearboxes or other powertrain components. Aircraft operating at low Reynolds numbers are ideal candidates for benefiting from the increased local flow velocities provided by distributed propulsion systems. In this study, a distributed electric propulsion system made up of eight motor/propeller units was integrated into the leading edge of a small fixed wing-body model to investigate the aerodynamic improvements available to small UAVs operating at low Reynolds numbers. Wind tunnel tests featuring a Design of Experiments (DOE) methodology were used for aerodynamic characterization. Experiments were performed in four modes: all propellers on, wing-tip propellers alone, two inboard propellers alone, and wing alone. In addition, the all-propellers-on, wing-alone, and single-tractor configurations were analyzed using VSPAERO, a vortex lattice code, to compare these configurations. Results show that the distributed propulsion system provides higher normal force, endurance, and range, despite a potential weight penalty.
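A DOE-style test matrix of the kind used in such a wind-tunnel campaign can be sketched as a two-level full factorial; the factors and levels below are illustrative stand-ins, not the study's actual variables:

```python
from itertools import product

# Illustrative two-level full-factorial design in the spirit of a DOE
# wind-tunnel campaign; these factors are hypothetical examples.
factors = {
    "angle_of_attack_deg": [0, 8],
    "airspeed_m_s": [10, 20],
    "propellers_on": [False, True],
}

# One run per combination of factor levels: 2^3 = 8 runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

for i, run in enumerate(runs, start=1):
    print(i, run)
```

Running every factor-level combination lets main effects and interactions be separated in the analysis, which is the point of applying DOE rather than varying one factor at a time.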
Fairclough, Stuart J.; Knowles, Zoe R.; Boddy, Lynne M.
2017-01-01
Understanding family physical activity (PA) behaviour is essential for designing effective family-based PA interventions. However, effective approaches to capture the perceptions and “lived experiences” of families are not yet well established. The aims of the study were to: (1) demonstrate how a “write, draw, show and tell” (WDST) methodological approach can be appropriate to family-based PA research, and (2) present two distinct family case studies to provide insights into the habitual PA behaviour and experiences of a nuclear and single-parent family. Six participants (including two “target” children aged 9–11 years, two mothers and two siblings aged 6–8 years) from two families were purposefully selected to take part in the study, based on their family structure. Participants completed a paper-based PA diary and wore an ActiGraph GT9X accelerometer on their left wrist for up to 10 weekdays and 16 weekend days. A range of WDST tasks were then undertaken by each family to offer contextual insight into their family-based PA. The selected families participated in different levels and modes of PA, and reported contrasting leisure opportunities and experiences. These novel findings encourage researchers to tailor family-based PA intervention programmes to the characteristics of the family. PMID:28708114
Seponski, Desiree M; Lewis, Denise C; Megginson, Maegan C
2014-01-01
Mental health issues are significant contributors to the global burden of disease, with the highest incidence in resource-poor countries; 90% of those in need of mental health treatment reside in low-resource countries but receive only 10% of the world's resources. Cambodia, the eighth least developed country in the world, serves as one example of the need to address mental health concerns in low-income, resource-poor countries. The current study utilises responsive evaluation methodology to explore how poverty-stricken Cambodian clients, therapists and supervisors experience Western models of therapy as culturally responsive to their unique needs. Quantitative and qualitative data were triangulated across multiple stakeholders using numerous methods, including a focus group, interviews, surveys, case illustrations and live supervision observation, and analysed using constant comparative analysis. Emerging findings suggest that poverty, material needs, therapy location and financial situations greatly impact the daily lives and mental health conditions of Cambodians and hinder clients' therapeutic progress. When therapy did not directly address the culture of poverty, clients did not experience it as valuable, despite some temporary decreases in mental health symptoms.