Avila, Mónica; Zougagh, Mohammed; Escarpa, Alberto; Ríos, Angel
2009-10-01
A new strategy based on the fast separation of the fingerprint markers of Vanilla planifolia extracts and vanilla-related samples on a microfluidic-electrochemistry chip is proposed. This methodology allowed the detection of all markers required for confirmation of common frauds in this field. The elution order was strategically connected with sequential sample screening and analyte confirmation steps: first, ethyl vanillin was detected to distinguish natural from adulterated samples; second, vanillin, the prominent marker in V. planifolia but frequently added in its synthetic form; and third, the fingerprint markers (p-hydroxybenzaldehyde, vanillic acid, and p-hydroxybenzoic acid) of V. planifolia were detected for confirmation purposes. The reliability of the proposed methodology was demonstrated by confirming the natural or non-natural origin of vanilla in V. planifolia extracts and other selected food samples containing this flavor.
ERIC Educational Resources Information Center
Pennsylvania State Univ., University Park. Inst. for the Study of Adult Literacy.
The goal of this research project was to create a guide on the effective use of assessment instruments and methodologies, related resources, and guidelines for measuring adult learners' attainment of basic skills and competencies to document educational gains and demonstrate program quality. The project focused on confirming current use of…
Design of the rivaroxaban for heparin-induced thrombocytopenia study.
Linkins, Lori-Ann; Warkentin, Theodore E; Pai, Menaka; Shivakumar, Sudeep; Manji, Rizwan A; Wells, Philip S; Crowther, Mark A
2014-11-01
Rivaroxaban is an ideal potential candidate for treatment of heparin-induced thrombocytopenia (HIT) because it is administered orally by fixed dosing, requires no laboratory monitoring and is effective in the treatment of venous and arterial thromboembolism in other settings. The Rivaroxaban for HIT study is a prospective, multicentre, single-arm, cohort study evaluating the incidence of new symptomatic venous and arterial thromboembolism in patients with suspected or confirmed HIT who are treated with rivaroxaban. Methodological challenges faced in the design of this study include heterogeneity of the patient population, differences in the baseline risk of thrombosis and bleeding dependent on whether HIT is confirmed or just suspected, and heterogeneity in laboratory confirmation of HIT. The rationale for how these challenges were addressed and the final design of the Rivaroxaban for HIT study is reviewed.
Methodological Issues in Questionnaire Design.
Song, Youngshin; Son, Youn Jung; Oh, Doonam
2015-06-01
The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed. Moreover, the triangulation method in questionnaire development was introduced. Steps were recommended for designing questions, such as appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated prior to administration when using an existing questionnaire and performing cultural adaptation. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.
Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek
2013-11-15
A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL for surface water and CRM samples were achieved. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is simpler, less time-consuming and less labour-intensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry), avoids the use of toxic chloroform, and is significantly less expensive. Copyright © 2013 Elsevier B.V. All rights reserved.
Beaufays, Jérôme; Adam, Benoît; Decrem, Yves; Prévôt, Pierre-Paul; Santini, Sébastien; Brasseur, Robert; Brossard, Michel; Lins, Laurence
2008-01-01
Background During their blood meal, ticks secrete a wide variety of proteins that interfere with their host's defense mechanisms. Among these proteins, lipocalins play a major role in the modulation of the inflammatory response. Methodology/Principal Findings Screening a cDNA library in association with RT-PCR and RACE methodologies allowed us to identify 14 new lipocalin genes in the salivary glands of the Ixodes ricinus hard tick. These proteins were called LIR for "Lipocalin from I. ricinus" and numbered from 1 to 14 (LIR1 to LIR14). A computational in-depth structural analysis confirmed that LIRs belong to the lipocalin family. According to their percentage identity/similarity, LIR proteins may be assigned to 6 distinct phylogenetic groups. The mature proteins have calculated molecular masses and pI values varying from 21.8 kDa to 37.2 kDa and from 4.45 to 9.57, respectively. In a western blot analysis, all recombinant LIRs appeared as a series of thin bands at 50–70 kDa, suggesting extensive glycosylation, which was experimentally confirmed by treatment with N-glycosidase F. In addition, the in vivo expression analysis of LIRs in I. ricinus, examined by RT-PCR, showed homogeneous expression profiles for certain phylogenetic groups and relatively heterogeneous profiles for other groups. Finally, we demonstrated that LIR6 codes for a protein that specifically binds leukotriene B4. Conclusions/Significance This work confirms that, regarding their biochemical properties, expression profiles, and sequence signatures, lipocalins in the Ixodes hard tick genus, and more specifically in the Ixodes ricinus species, are segregated into distinct phylogenetic groups, suggesting potentially distinct functions. This was particularly demonstrated by the ability of LIR6 to scavenge leukotriene B4. The other LIRs did not bind any of the ligands tested, such as 5-hydroxytryptamine, ADP, norepinephrine, platelet activating factor, prostaglandins D2 and E2, and finally leukotrienes B4 and C4. PMID:19096708
77 FR 66471 - Methodology for Designation of Frontier and Remote Areas
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-05
... the use of a shorter, more intuitively appealing descriptive label in research publications and other...) Selection of final methodological approach; and (8) Analyses using final methodology on 2000 data. All the...
High Mitochondrial DNA Stability in B-Cell Chronic Lymphocytic Leukemia
Cerezo, María; Bandelt, Hans-Jürgen; Martín-Guerrero, Idoia; Ardanaz, Maite; Vega, Ana; Carracedo, Ángel; García-Orad, África; Salas, Antonio
2009-01-01
Background Chronic Lymphocytic Leukemia (CLL) leads to progressive accumulation of lymphocytes in the blood, bone marrow, and lymphatic tissues. Previous findings have suggested that the mtDNA could play an important role in CLL. Methodology/Principal Findings The mitochondrial DNA (mtDNA) control-region was analyzed in lymphocyte cell DNA extracts and compared with the granulocyte counterpart extracts of 146 patients suffering from B-cell CLL (B-CLL), all recruited from the Basque Country. Major efforts were undertaken to rule out methodological artefacts that would render a high false positive rate for mtDNA instabilities and thus lead to erroneous interpretation of sequence instabilities. Only twenty instabilities were finally confirmed, most of them affecting the homopolymeric stretch located in the second hypervariable segment (HVS-II) around position 310, which is well known to constitute an extreme mutational hotspot of length polymorphism, as these mutations are frequently observed in the general human population. A critical revision of the findings in previous studies indicates a lack of proper methodological standards, which eventually led to an overinterpretation of the role of the mtDNA in CLL tumorigenesis. Conclusions/Significance Our results suggest that mtDNA instability is not the primary causal factor in B-CLL. A secondary role of mtDNA mutations cannot be fully ruled out under the hypothesis that the progressive accumulation of mtDNA instabilities could finally contribute to the tumoral process. Recommendations are given that would help to minimize erroneous interpretation of sequencing results in mtDNA studies in tumorigenesis. PMID:19924307
Corradini, M G; Normand, M D; Newcomer, C; Schaffner, D W; Peleg, M
2009-01-01
Theoretically, if an organism's resistance can be characterized by 3 survival parameters, they can be found by solving 3 simultaneous equations that relate the final survival ratio to the lethal agent's intensity. (For 2 resistance parameters, 2 equations will suffice.) In practice, the inevitable experimental scatter would distort the results of such a calculation or render the method unworkable. Averaging the results obtained with more than 3 final survival ratio triplet combinations, determined in four or more treatments, can remove this impediment. This can be confirmed by the ability of a kinetic inactivation model derived from the averaged parameters to predict survival patterns under conditions not employed in their determination, as demonstrated with published isothermal survival data of Clostridium botulinum spores, isobaric data of Escherichia coli under HPP, and Pseudomonas exposed to hydrogen peroxide. Both the method and the underlying assumption that the inactivation followed a Weibull-Log logistic (WeLL) kinetics were confirmed in this way, indicating that when an appropriate survival model is available, it is possible to predict the entire inactivation curves from several experimental final survival ratios alone. Where applicable, the method could simplify the experimental procedure and lower the cost of microbial resistance determinations. In principle, the methodology can be extended to deteriorative chemical reactions if they too can be characterized by 2 or 3 kinetic parameters.
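For the two-parameter case mentioned above (Weibullian kinetics, log10 S(t) = -b t^n at a fixed lethal-agent intensity), each pair of final survival ratios yields a closed-form solution, and averaging over all pair combinations damps the experimental scatter, as the abstract describes. The sketch below is illustrative only, with hypothetical numbers, and is not the authors' data or code:

```python
import math
from itertools import combinations

def solve_pair(t1, ls1, t2, ls2):
    # Closed-form solution of the two simultaneous Weibullian equations
    # log10 S(t_i) = -b * t_i**n for (b, n); ls_i = final log10 survival ratio
    n = math.log(ls1 / ls2) / math.log(t1 / t2)
    b = -ls1 / t1**n
    return b, n

# Hypothetical final log10 survival ratios from four treatments of
# increasing duration (minutes) at one lethal-agent intensity
data = [(2.0, -0.8), (5.0, -2.4), (10.0, -5.5), (15.0, -9.4)]

# Average over all pair combinations to damp experimental scatter
estimates = [solve_pair(t1, s1, t2, s2)
             for (t1, s1), (t2, s2) in combinations(data, 2)]
b_avg = sum(b for b, _ in estimates) / len(estimates)
n_avg = sum(n for _, n in estimates) / len(estimates)
```

The averaged pair (b_avg, n_avg) can then be plugged back into the model to predict survival curves at conditions not used in the determination, which is the confirmation strategy the abstract outlines.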
Creativity and psychopathology: a systematic review.
Thys, Erik; Sabbe, Bernard; De Hert, Marc
2014-01-01
The possible link between creativity and psychopathology has been a long-standing focus of research up to the present day. However, the research results in this field are heterogeneous and contradictory: links between creativity and specific psychiatric disorders have been confirmed and refuted in different studies. This disparity is partly explained by the methodological challenges peculiar to this field. In this systematic review of the literature from 1950 onward, research articles in the field of creativity and psychopathology are presented, focusing on the methodology and results of the collected studies. This review confirms the methodological problems and the heterogeneity of the study designs and results. The assessment of psychopathology, but more so of creativity, remains a fundamental challenge. On the whole, study results cautiously confirm an association between creativity and both bipolar disorder and schizotypy. The research on creativity and psychopathology is hampered by serious methodological problems. Study results are to be interpreted with caution, and future research needs more methodological rigor. © 2014 S. Karger AG, Basel.
Leça, João M; Pereira, Ana C; Vieira, Ana C; Reis, Marco S; Marques, José C
2015-08-05
Vicinal diketones (VDK), namely diacetyl (DC) and pentanedione (PN), are compounds naturally found in beer that play a key role in the definition of its aroma. In lager beer, they are responsible for off-flavors (buttery flavor) and therefore their presence and quantification is of paramount importance to beer producers. Aiming at developing an accurate quantitative monitoring scheme to follow these off-flavor compounds during beer production and in the final product, the headspace solid-phase microextraction (HS-SPME) analytical procedure was tuned through experiments planned in an optimal way, and the final settings were fully validated. Optimal design of experiments (O-DOE) is a computational, statistically oriented approach for designing experiments that are most informative according to a well-defined criterion. This methodology was applied to HS-SPME optimization, leading to the following optimal extraction conditions for the quantification of VDK: use of a CAR/PDMS fiber, 5 mL of sample in a 20 mL vial, 5 min of pre-incubation time followed by 25 min of extraction at 30 °C, with agitation. The validation of the final analytical methodology was performed using a matrix-matched calibration, in order to minimize matrix effects. The following key features were obtained: linearity (R(2) > 0.999, both for diacetyl and 2,3-pentanedione), high sensitivity (LOD of 0.92 μg L(-1) and 2.80 μg L(-1), and LOQ of 3.30 μg L(-1) and 10.01 μg L(-1), for diacetyl and 2,3-pentanedione, respectively), recoveries of approximately 100% and suitable precision (repeatability and reproducibility lower than 3% and 7.5%, respectively). The applicability of the methodology was fully confirmed through an independent analysis of several beer samples, with analyte concentrations ranging from 4 to 200 μg L(-1). Copyright © 2015 Elsevier B.V. All rights reserved.
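The reported LOD and LOQ figures are consistent with the common 3.3σ/S and 10σ/S convention applied to the slope of a matrix-matched calibration line. A minimal sketch with hypothetical calibration data (not the authors' code or measurements):

```python
def linear_fit(xs, ys):
    # Ordinary least squares for a calibration line y = m*x + b
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def lod_loq(sd_blank, slope):
    # Common 3.3*sigma/S and 10*sigma/S estimates of the detection
    # and quantification limits from blank noise and calibration slope
    return 3.3 * sd_blank / slope, 10.0 * sd_blank / slope

# Hypothetical matrix-matched calibration: concentration (ug/L) vs. response
slope, intercept = linear_fit([0.0, 5.0, 10.0, 20.0], [0.1, 10.2, 20.0, 40.1])
lod, loq = lod_loq(0.3, slope)
```

Under this convention LOQ/LOD is fixed at 10/3.3, close to the roughly 3.5-fold ratio between the LOQ and LOD values quoted in the abstract.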
Gruber, Dieter P; Buder-Stroisznigg, Michael; Wallner, Gernot; Strauß, Bernhard; Jandel, Lothar; Lang, Reinhold W
2012-07-10
With one measurement configuration, existing gloss measurement methodologies are generally restricted to specific gloss levels. A newly developed image-analytical gloss parameter called "clarity" makes it possible to describe the perceptual result of a broad range of different gloss levels with one setup. In order to analyze, and ultimately monitor, the perceived gloss of products, a fast and flexible method suitable for automated inspection is in high demand. The clarity parameter is very fast to calculate and therefore usable for fast in-line surface inspection. Coated metal specimens were deformed to varying degrees and then polished in order to study the clarity parameter with respect to the quantification of varying surface gloss types and levels. In order to analyze the correlation with human gloss perception, a study was carried out in which experts were asked to assess gloss properties of a series of surface samples under standardized conditions. The study confirmed that clarity exhibits considerably better correlation with human perception than alternative gloss parameters.
NASA Astrophysics Data System (ADS)
Orlaineta-Agüero, S.; Del Sol-Fernández, S.; Sánchez-Guzmán, D.; García-Salcedo, R.
2017-01-01
In the present work we show the implementation of a learning sequence based on an active learning methodology for teaching Physics. This proposal aims to promote better learning in high school students through the use of a comic book combined with different low-cost experimental activities for teaching the electrical concepts of Current, Resistance and Voltage. We consider that this kind of strategy can be easily extrapolated to higher education levels, such as engineering at the college/university level, and to other disciplines of Science. To evaluate this proposal, we used conceptual questions from the Electric Circuits Concept Evaluation survey developed by Sokoloff. The results were analysed with the Normalized Conceptual Gain proposed by Hake, to identify the effectiveness of the methodology, and with the Concentration Factor proposed by Bao and Redish, to identify the models the students presented before and after instruction. We found that this methodology was more effective than traditional lectures alone. These results cannot be generalized, but they gave us the opportunity to examine several important approaches in Physics Education. Finally, we will continue to apply the same experiment with more students, at the same and higher levels of education, to confirm and validate the effectiveness of this methodological proposal.
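Hake's normalized conceptual gain cited above has the simple form g = (post - pre) / (100 - pre) for class-average scores in percent, i.e. the fraction of the maximum possible improvement actually achieved. A minimal sketch with hypothetical class averages:

```python
def normalized_gain(pre_pct, post_pct):
    # Hake's normalized gain: fraction of the maximum possible
    # improvement achieved between pre-test and post-test averages
    return (post_pct - pre_pct) / (100.0 - pre_pct)

g = normalized_gain(30.0, 65.0)  # hypothetical class averages -> 0.5
```

Values of g above about 0.3 are conventionally read as the signature of interactive-engagement instruction in Hake's original analysis.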
Extension of Companion Modeling Using Classification Learning
NASA Astrophysics Data System (ADS)
Torii, Daisuke; Bousquet, François; Ishida, Toru
Companion Modeling is a methodology for refining initial models for understanding reality through a role-playing game (RPG) and a multiagent simulation. In this research, we propose a novel agent model construction methodology in which classification learning is applied to the RPG log data in Companion Modeling. This methodology enables a systematic model construction that handles multiple parameters, independently of the modeler's ability. There are three problems in applying classification learning to the RPG log data: 1) It is difficult to gather enough data for the number of features because the cost of gathering data is high. 2) Noise data can affect the learning results because the amount of data may be insufficient. 3) The learning results should be explainable as a human decision-making model and should be recognized by the expert as a result that reflects reality. We realized an agent model construction system using the following two approaches: 1) Using a feature selection method, the feature subset that has the best prediction accuracy is identified. In this process, the important features chosen by the expert are always included. 2) The expert eliminates irrelevant features from the learning results after evaluating the learning model through a visualization of the results. Finally, using the RPG log data from the Companion Modeling of agricultural economics in northeastern Thailand, we confirm the capability of this methodology.
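The first approach above, a feature-subset search in which expert-chosen features are always retained, can be sketched as an exhaustive wrapper search. The classifier, toy data, and feature indices below are hypothetical illustrations, not the authors' implementation:

```python
from itertools import combinations

def accuracy(rows, labels, feats, classify):
    # Fraction of rows the classifier predicts correctly using only `feats`
    return sum(classify(r, feats) == y for r, y in zip(rows, labels)) / len(labels)

def select_features(rows, labels, n_features, locked, classify, max_size):
    # Exhaustive wrapper search: every candidate subset contains the
    # expert-locked features; the subset with best accuracy wins
    free = [f for f in range(n_features) if f not in locked]
    best = sorted(locked)
    best_acc = accuracy(rows, labels, best, classify)
    for r in range(1, max_size - len(locked) + 1):
        for extra in combinations(free, r):
            subset = sorted(locked) + list(extra)
            acc = accuracy(rows, labels, subset, classify)
            if acc > best_acc:
                best, best_acc = subset, acc
    return best, best_acc

# Toy demo: feature 1 perfectly predicts the label; the expert locks it in
rows = [[0, 0, 1], [1, 0, 1], [0, 1, 0], [1, 1, 0]]
labels = [0, 0, 1, 1]
majority = lambda row, feats: 1 if 2 * sum(row[f] for f in feats) > len(feats) else 0
best, best_acc = select_features(rows, labels, 3, {1}, majority, max_size=2)
```

A wrapper search of this kind is only feasible for small feature counts, which matches the small-data regime the abstract describes for RPG logs.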
Nalavany, Blace Arthur; Carawan, Lena Williams; Rennick, Robyn A
2011-01-01
Concept mapping (a mixed qualitative-quantitative methodology) was used to describe and understand the psychosocial experiences of adults with confirmed and self-identified dyslexia. Using innovative processes of art and photography, Phase 1 of the study included 15 adults who participated in focus groups and in-depth interviews and were asked to elucidate their experiences with dyslexia. On index cards, 75 statements and experiences with dyslexia were recorded. The second phase of the study included 39 participants who sorted these statements into self-defined categories and rated each statement to reflect their personal experiences to produce a visual representation, or concept map, of their experience. The final concept map generated nine distinct cluster themes: Organization Skills for Success; Finding Success; A Good Support System Makes the Difference; On Being Overwhelmed; Emotional Downside; Why Can't They See It?; Pain, Hurt, and Embarrassment From Past to Present; Fear of Disclosure; and Moving Forward. Implications of these findings are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... wage determinations based on the new prevailing wage methodology set forth in the Wage Rule, as to the... comment, we published a Final Rule on August 1, 2011, which set the new effective date for the Wage Rule... date of the Wage Methodology for the Temporary Non- agricultural Employment H-2B Program Final Rule...
Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak
2015-07-01
This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the preferred reporting items for systematic reviews and meta-analyses systematic review methodology, the authors reviewed published papers between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
Asemota, Eseosa; Crawford, Glen; Kovarik, Carrie; Brod, Bruce A
There is currently no standardized protocol for photopatch testing and phototesting in the United States. Certain testing parameters (such as chemicals tested, time between test application and irradiation, and time of final interpretation) vary from provider to provider. These variations may impact the comparability and consistency of test results. The goal of our survey-based study was to outline the photopatch test and phototest protocols used by US contact dermatologists. The information obtained will aid in the development of a national consensus on testing methodologies. Based on a literature search conducted on differences in testing methodologies, we constructed a questionnaire. The survey was distributed at the American Contact Dermatitis Society annual meeting and via the American Contact Dermatitis Society Web site. Standard descriptive analysis was performed on the data obtained. Of the 800 dermatologists contacted, 117 agreed to participate in the survey. Among these respondents, 64 (54.8%) conduct photopatch testing. Results of the survey are presented, and they confirm that a variety of techniques and testing materials are used. It would be beneficial to enlist a panel of expert contact dermatologists to create by formal consensus, using these research findings, a standard photopatch test protocol for use in this country.
Oxygen consumption rates by different oenological tannins in a model wine solution.
Pascual, Olga; Vignault, Adeline; Gombau, Jordi; Navarro, Maria; Gómez-Alonso, Sergio; García-Romero, Esteban; Canals, Joan Miquel; Hermosín-Gutíerrez, Isidro; Teissedre, Pierre-Louis; Zamora, Fernando
2017-11-01
The kinetics of oxygen consumption by different oenological tannins were measured in a model wine solution using a non-invasive method based on luminescence. The results indicate that the oxygen consumption rate follows second-order kinetics, depending on the tannin and oxygen concentrations. They also confirm that the oxygen consumption rate is influenced by temperature in accordance with the Arrhenius law. The indications are that ellagitannins are the fastest oxygen consumers of the different oenological tannins, followed in decreasing order by quebracho tannins, skin tannins, seed tannins and finally gallotannins. This methodology can therefore be proposed as an index for determining the effectiveness of different commercial tannins in protecting wines against oxidation. Copyright © 2017 Elsevier Ltd. All rights reserved.
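A second-order rate law with Arrhenius temperature dependence, as described above, can be sketched numerically. All rate constants and concentrations below are hypothetical placeholders, not values from the study:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius(A, Ea, T):
    # Arrhenius law: k(T) = A * exp(-Ea / (R*T))
    return A * math.exp(-Ea / (R * T))

def oxygen_profile(o2_0, tannin, k, dt, steps):
    # Explicit Euler integration of d[O2]/dt = -k * [tannin] * [O2],
    # with the tannin concentration held constant (assumed large excess)
    o2, profile = o2_0, [o2_0]
    for _ in range(steps):
        o2 += -k * tannin * o2 * dt
        profile.append(o2)
    return profile

# Hypothetical run: 8 mg/L dissolved O2 at a constant tannin level
profile = oxygen_profile(8.0, 2.0, 0.1, 0.01, 1000)
```

With the tannin in excess the decay is pseudo-first-order in oxygen, so the numerical profile should track o2_0 * exp(-k * [tannin] * t), which provides a quick sanity check on the integration step.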
NASA Astrophysics Data System (ADS)
Rajeshkumar, S.; Malarkodi, C.
2017-11-01
In this study, we used the bacterial strain Serratia nematodiphila for the synthesis of silver nanoparticles using optimized biomass growth. In this RSM study, variables such as sodium sulphate (0.5, 1, 1.5 g/L), magnesium sulphate (0.3, 0.5, 0.7 g/L), pH (6.4, 7.4, 8.4), temperature (25, 30, 35 °C), sodium lactate and peptone were used to maximize biomass production. The formation of the silver nanoparticles was confirmed using UV-vis spectrophotometry and transmission electron microscopy, with very good results. Finally, we conclude that the use of RSM for nanoparticle synthesis may be applied in industrial biotechnology and related technologies for large-scale production.
Interactive effects of aging parameters of AA6056
NASA Astrophysics Data System (ADS)
Dehghani, Kamran; Nekahi, Atiye
2012-10-01
The effect of thermomechanical treatment on the aging behavior of AA6056 aluminum alloy was modeled using response surface methodology (RSM). Two models were developed to predict the final yield stress (FYS) and elongation, as well as the optimum conditions of the aging process. The models were based on the interactive effects of the applied thermomechanical parameters. The optimum condition predicted by the model to attain the maximum strength was pre-aging at 80 °C for 15 h, followed by 70% cold work and subsequent final aging at 165 °C for 4 h, which resulted in a FYS of about 480 MPa. As for the elongation, the optimum condition was pre-aging at 80 °C for 15 h, followed by 30% cold work and final aging at 165 °C for 4 h, which led to 21% elongation. To verify the suggested optimum conditions, tests were carried out confirming the high accuracy (above 94%) of the RSM technique as well as of the developed models. It is shown that RSM can be used successfully to optimize the aging process, to determine the significance of aging parameters and to model the combined effect of process variables on the aging behavior of AA6056.
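A one-factor slice of a response-surface model has the quadratic form y = a + b*x + c*x**2, whose stationary point locates an optimum of the response. A self-contained sketch using hypothetical data (not the AA6056 measurements or the authors' RSM code):

```python
def solve3(A, v):
    # Gaussian elimination with partial pivoting for a 3x3 linear system
    M = [row[:] + [v[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_quadratic(xs, ys):
    # Least-squares normal equations for y = a + b*x + c*x**2
    S = lambda p: sum(x**p for x in xs)
    Sy = lambda p: sum((x**p) * y for x, y in zip(xs, ys))
    A = [[len(xs), S(1), S(2)], [S(1), S(2), S(3)], [S(2), S(3), S(4)]]
    return solve3(A, [Sy(0), Sy(1), Sy(2)])

# Hypothetical response data lying on y = 1 + 2x - x**2
a, b, c = fit_quadratic([0.0, 1.0, 2.0, 3.0], [1.0, 2.0, 1.0, -2.0])
x_opt = -b / (2.0 * c)  # stationary point of the fitted parabola
```

Full RSM fits the same way but with cross terms (x1*x2, etc.) added to the design matrix, which is how interactive effects such as pre-aging time versus cold-work level enter the model.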
Valenzuela, Aníbal; Lespes, Gaëtane; Quiroz, Waldo; Aguilar, Luis F; Bravo, Manuel A
2014-07-01
A new headspace solid-phase micro-extraction (HS-SPME) method followed by gas chromatography with pulsed flame photometric detection (GC-PFPD) analysis has been developed for the simultaneous determination of 11 organotin compounds, including methyl-, butyl-, phenyl- and octyltin derivatives, in human urine. The methodology has been validated by the analysis of urine samples fortified with all analytes at different concentration levels, and recovery rates above 87% and relative precisions between 2% and 7% were obtained. Additionally, an experimental-design approach has been used to model the storage stability of organotin compounds in human urine, demonstrating that organotins are highly degraded in this medium, although their stability is satisfactory during the first 4 days of storage at 4 °C and pH=4. Finally, this methodology was applied to urine samples collected from harbor workers exposed to antifouling paints; methyl- and butyltins were detected, confirming human exposure in this type of work environment. Copyright © 2014 Elsevier B.V. All rights reserved.
77 FR 53769 - Receipts-Based, Small Business Size Standard; Confirmation of Effective Date
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-04
...-Based, Small Business Size Standard; Confirmation of Effective Date AGENCY: Nuclear Regulatory Commission. ACTION: Direct final rule; confirmation of effective date. SUMMARY: The U.S. Nuclear Regulatory Commission (NRC) is confirming the effective date of August 22, 2012, for the direct final rule that appeared...
2008-07-23
This final rule applies to the Temporary Assistance for Needy Families (TANF) program and requires States, the District of Columbia and the Territories (hereinafter referred to as the "States") to use the "benefiting program" cost allocation methodology in U.S. Office of Management and Budget (OMB) Circular A-87 (2 CFR part 225). It is the judgment and determination of HHS/ACF that the "benefiting program" cost allocation methodology is the appropriate methodology for the proper use of Federal TANF funds. The Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996 gave federally-recognized Tribes the opportunity to operate their own Tribal TANF programs. Federally-recognized Indian tribes operating approved Tribal TANF programs have always followed the "benefiting program" cost allocation methodology in accordance with OMB Circular A-87 (2 CFR part 225) and the applicable regulatory provisions at 45 CFR 286.45(c) and (d). This final rule contains no substantive changes to the proposed rule published on September 27, 2006.
75 FR 10413 - New Animal Drug Applications; Confirmation of Effective Date
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-08
.... FDA-2009-N-0436] New Animal Drug Applications; Confirmation of Effective Date AGENCY: Food and Drug Administration, HHS. ACTION: Direct final rule; confirmation of effective date. SUMMARY: The Food and Drug Administration (FDA) is confirming the effective date of March 8, 2010, for the final rule that appeared in the...
Micoli, Alessandra; Turco, Antonio; Araujo-Palomo, Elsie; Encinas, Armando; Quintana, Mildred; Prato, Maurizio
2014-04-25
Nucleoside-functionalized multi-walled carbon nanotubes (N-MWCNTs) were synthesized and characterized. A self-organization process using hydrogen bonding interactions was then used for the fabrication of self-assembled N-MWCNTs films free of stabilizing agents, polymers, or surfactants. Membranes were produced by using a simple water-dispersion-based vacuum-filtration method. Hydrogen-bond recognition was confirmed by analysis with IR spectroscopy and TEM images. Restoration of the electronic conduction properties in the N-MWCNTs membranes was achieved by removing the organic portion by thermal treatment under an argon atmosphere to give d-N-MWCNTs. Electrical conductivity and thermal gravimetric analysis (TGA) measurements confirmed the efficiency of the annealing process. Finally, oxidative biodegradation of the N-MWCNTs and d-N-MWCNTs films was performed by using horseradish peroxidase (HRP) and low concentrations of H2O2. Our results confirm that functional groups play an important role in the biodegradation of CNTs by HRP: N-MWCNTs films were completely biodegraded, whereas for d-N-MWCNTs films no degradation was observed, showing that the pristine CNT undergoes minimal enzyme-catalyzed oxidation. This novel methodology offers a straightforward supramolecular strategy for the construction of conductive and biodegradable carbon nanotube films. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1
NASA Astrophysics Data System (ADS)
Wilson, J. L.; Jolly, M. B.
1984-01-01
A methodology (i.e., consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine tune the form of a final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to control both the excruciating detail and user requirement for transparency.
NASA Astrophysics Data System (ADS)
Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.
2015-02-01
Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its central role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments in on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to test different acquisition systems and their effectiveness, considering each methodology individually and in combination. This research builds on the awareness that integrating different survey methodologies can in fact increase the representative efficacy of the final representations, which are then based on a wider, verified set of georeferenced metric data. In particular, integrating methods reduces or neutralizes the issues involved in surveying composite and complex objects, since the most appropriate tools and techniques can be chosen for the characteristics of each part of an archaeological site (i.e. urban structures, architectural monuments, small findings). This paper describes the experience in several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of urban structures and monuments, some of them hard to reach, was realized using active and passive techniques (range-based and image-based methods). This acquisition was planned to obtain not only the basic support for interpretation analysis, but also models of the actual state of conservation of the site on which reconstructive hypotheses can be based. Laser scanning data were merged with point clouds from Structure from Motion techniques in the same reference system, given by a topographical and GPS survey.
From a research point of view, these 3D models are not only the final results of the metric survey but also the starting point for the reconstruction of the whole city and its urban context. This reconstruction process will also cover areas that have not yet been excavated, where procedural modelling can offer important support to the reconstructive hypotheses.
Validating a new methodology for strain estimation from cardiac cine MRI
NASA Astrophysics Data System (ADS)
Elnakib, Ahmed; Beache, Garth M.; Gimel'farb, Georgy; Inanc, Tamer; El-Baz, Ayman
2013-10-01
This paper focuses on validating a novel framework for estimating the functional strain from cine cardiac magnetic resonance imaging (CMRI). The framework consists of three processing steps. First, the left ventricle (LV) wall borders are segmented using a level-set based deformable model. Second, the points on the wall borders are tracked during the cardiac cycle based on solving the Laplace equation between the LV edges. Finally, the circumferential and radial strains are estimated at the inner, mid-wall, and outer borders of the LV wall. The proposed framework is validated using synthetic phantoms of the material strains that account for the physiological features and the LV response during the cardiac cycle. Experimental results on simulated phantom images confirm the accuracy and robustness of our method.
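The final strain-estimation step of the framework can be illustrated with a short sketch. Assuming the tracked border points are available as (x, y) coordinates (the level-set segmentation and Laplace-based tracking steps are not reproduced here), circumferential Lagrangian strain follows from the relative change in contour length between the reference frame and the current frame; the function names and contour representation below are illustrative, not taken from the paper.

```python
import math

def contour_length(points):
    """Perimeter of a closed contour given as a list of (x, y) points."""
    n = len(points)
    return sum(math.dist(points[i], points[(i + 1) % n]) for i in range(n))

def circumferential_strain(ref_points, cur_points):
    """Lagrangian strain: relative change in contour length with
    respect to the reference (e.g. end-diastolic) frame."""
    l0 = contour_length(ref_points)
    return (contour_length(cur_points) - l0) / l0

# Toy example: a circular border of radius 10 contracting to radius 9
angles = [2 * math.pi * k / 100 for k in range(100)]
ref = [(10 * math.cos(a), 10 * math.sin(a)) for a in angles]
cur = [(9 * math.cos(a), 9 * math.sin(a)) for a in angles]
strain = circumferential_strain(ref, cur)  # ≈ -0.10 (10% shortening)
```

The same computation would be repeated at the inner, mid-wall, and outer borders to obtain the three strain curves described above.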
EPA announced the availability of the final report, Causal Assessment of Biological Impairment in the Bogue Homo River, Mississippi Using the U.S. EPA’s Stressor Identification Methodology. This assessment is taken from more than 700 court ordered assessments of the cau...
Rodriguez-Lazaro, David; Gonzalez-García, Patricia; Delibato, Elisabetta; De Medici, Dario; García-Gimeno, Rosa Maria; Valero, Antonio; Hernandez, Marta
2014-08-01
The microbiological standard for detection of Salmonella relies on several cultural steps and requires more than 5 days for final confirmation; as a consequence, there is a need for an alternative rapid methodology for its detection. The aim of this study was to compare different detection strategies based on real-time PCR for rapid and sensitive detection in a wide range of food products: raw pork and poultry meat, ready-to-eat lettuce salad, and raw sheep milk cured cheese. Three main parameters were evaluated to reduce the time and cost of final results: the initial sample size (25 and 50 g), the incubation time (6, 10 and 18 h) and the bacterial DNA extraction method (simple boiling of the culture after washing the bacterial pellet, use of Chelex resin, and a commercial silica column). The results demonstrate that an 18 h incubation of a 25 g sample in buffered peptone water, coupled to DNA extraction by boiling and a real-time PCR assay, detected down to 2-4 Salmonella spp. CFU per sample in less than 21 h in different types of food products. This RTi-PCR-based method is fully compatible with the ISO standard, providing results more rapidly and cost-effectively. The results were confirmed in a large number of naturally contaminated food samples, with at least the same analytical performance as the reference method. Copyright © 2014 Elsevier B.V. All rights reserved.
Cost-Utility Analysis: Current Methodological Issues and Future Perspectives
Nuijten, Mark J. C.; Dubois, Dominique J.
2011-01-01
The use of cost-effectiveness as a final criterion in the reimbursement process for listing new pharmaceuticals can be questioned from both scientific and policy points of view. There is a lack of consensus on key methodological issues, and consequently we may question the appropriateness of using cost-effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost-effectiveness threshold (cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of other important decision criteria, such as social values and equity. PMID:21713127
Opinion Dynamics with Confirmation Bias
Allahverdyan, Armen E.; Galstyan, Aram
2014-01-01
Background Confirmation bias is the tendency to acquire or evaluate new information in a way that is consistent with one's preexisting beliefs. It is omnipresent in psychology, economics, and even scientific practice. Prior theoretical research on this phenomenon has mainly focused on its economic implications, possibly missing its potential connections with broader notions of cognitive science. Methodology/Principal Findings We formulate a (non-Bayesian) model for revising the subjective probabilistic opinion of a confirmationally biased agent in the light of a persuasive opinion. The revision rule ensures that the agent does not react to persuasion that is either far from his current opinion or coincides with it. We demonstrate that the model accounts for the basic phenomenology of social judgment theory, and allows us to study various phenomena such as cognitive dissonance and the boomerang effect. The model also displays the order-of-presentation effect, in which, when consecutively exposed to two opinions, the agent gives preference to the last opinion (recency) or the first opinion (primacy), and it relates recency to confirmation bias. Finally, we study the model in the case of repeated persuasion and analyze its convergence properties. Conclusions The standard Bayesian approach to probabilistic opinion revision is inadequate for describing the observed phenomenology of the persuasion process. The simple non-Bayesian model proposed here does agree with this phenomenology and is capable of reproducing a spectrum of effects observed in psychology: the primacy-recency phenomenon, the boomerang effect and cognitive dissonance. We point out several limitations of the model that should motivate its future development. PMID:25007078
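The qualitative behavior of such a revision rule can be sketched in a few lines. This is NOT the authors' model (the abstract does not give its functional form); it is a toy rule whose pull toward the persuader vanishes both when the two opinions coincide and when they are far apart, as the abstract describes. The parameters `scale` and `gain` are purely illustrative.

```python
import math

def revise(opinion, persuader, scale=0.5, gain=0.6):
    """Toy confirmation-biased revision rule (illustrative only).

    The move toward the persuader's opinion is damped both when the
    two opinions coincide and when they are far apart.
    """
    d = persuader - opinion
    # Weight vanishes at d = 0, peaks at |d| = scale, and decays
    # for distant opinions.
    weight = gain * (abs(d) / scale) * math.exp(1 - abs(d) / scale)
    return opinion + weight * d

unchanged = revise(0.2, 0.2)   # coinciding opinion: no revision
nearby = revise(0.2, 0.7)      # |d| = scale: maximal pull
distant = revise(0.2, 5.2)     # far-away opinion: almost ignored
```

Iterating such a rule over a sequence of persuasive opinions is the kind of repeated-persuasion dynamics whose convergence the paper analyzes.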
Gianfranceschi, Monica Virginia; Rodriguez-Lazaro, David; Hernandez, Marta; González-García, Patricia; Comin, Damiano; Gattuso, Antonietta; Delibato, Elisabetta; Sonnessa, Michele; Pasquali, Frederique; Prencipe, Vincenza; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Kozačinski, Lidija; Tomic, Danijela Horvatek; Zdolec, Nevijo; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John Elmerdahl; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Paiusco, Antonella; De Cesare, Alessandra; Manfreda, Gerardo; De Medici, Dario
2014-08-01
The classical microbiological method for detection of Listeria monocytogenes requires around 7 days for final confirmation, and due to the perishable nature of RTE food products, there is a clear need for an alternative methodology for detection of this pathogen. This study presents an international (European-level) ISO 16140-based validation trial of a non-proprietary real-time PCR-based methodology that can generate final results on the day after the analysis begins. This methodology is based on an ISO-compatible enrichment coupled to a bacterial DNA extraction and a consolidated real-time PCR assay. Twelve laboratories from six European countries participated in this trial, and soft cheese was selected as the food model since it can represent a difficult matrix for bacterial DNA extraction and real-time PCR amplification. The limit of detection observed was down to 10 CFU per 25 g of sample, showing excellent concordance and accordance values between samples and laboratories (>75%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (82.75%, 96.70% and 97.62%, respectively) when the results of the real-time PCR-based method were compared to those of the ISO 11290-1 standard method. An interesting observation was that L. monocytogenes detection by the real-time PCR method was less affected by the presence of Listeria innocua in the contaminated samples, proving to be more reliable than the reference method. The results of this international trial demonstrate that the evaluated real-time PCR-based method represents an excellent alternative to the ISO standard, since it shows higher performance, reduces the duration of the analytical process, and can be easily implemented routinely by competent authorities and food industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
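The relative performance figures reported above (accuracy, sensitivity, specificity) can be computed from paired alternative-versus-reference results per sample. The sketch below uses the usual ISO 16140-style definitions; the function name and the toy data are illustrative, not taken from the trial.

```python
def relative_performance(results):
    """Compare an alternative method against a reference method.

    `results` is a list of (alt, ref) boolean pairs, one per sample.
    Returns relative accuracy, sensitivity and specificity in percent:
      accuracy    = agreement over all samples
      sensitivity = alt-positive fraction of reference-positive samples
      specificity = alt-negative fraction of reference-negative samples
    """
    pa = sum(1 for a, r in results if a and r)           # positive agreement
    na = sum(1 for a, r in results if not a and not r)   # negative agreement
    pos = sum(1 for _, r in results if r)
    neg = len(results) - pos
    accuracy = 100 * (pa + na) / len(results)
    sensitivity = 100 * pa / pos if pos else float("nan")
    specificity = 100 * na / neg if neg else float("nan")
    return accuracy, sensitivity, specificity

# Toy data: 10 samples, one false negative and one false positive
pairs = [(True, True)] * 4 + [(False, True)] + \
        [(False, False)] * 4 + [(True, False)]
acc, sens, spec = relative_performance(pairs)  # 80.0, 80.0, 80.0
```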
NASA Astrophysics Data System (ADS)
Gautam, Girish Dutt; Pandey, Arun Kumar
2018-03-01
Kevlar is the most popular aramid fiber and is widely used in technologically advanced industries for various applications. However, precise cutting of Kevlar composite laminates is a difficult task. Conventional cutting methods suffer from defects such as delamination, burr formation, and fiber pullout with poor surface quality, and the mechanical performance of the laminates is greatly affected by these defects. Laser beam machining may be an alternative to conventional cutting processes due to its non-contact nature, low specific energy requirement and higher production rate. This process, however, also faces some problems, which may be minimized by operating the machine at optimum parameter levels. This paper examines the effective utilization of an Nd:YAG laser cutting system on difficult-to-cut Kevlar-29 composite laminates. The objective of the work is to find the optimum process parameter settings that minimize kerf deviation on both sides of the cut. Experiments were conducted on Kevlar-29 composite laminates of 1.25 mm thickness using a Box-Behnken design with two center points. The experimental data were then used for optimization: a teaching-learning-based optimization algorithm was employed to minimize the kerf deviation on the bottom and top sides, implemented in a self-coded MATLAB program. Finally, confirmation tests were performed to compare the experimental results with the optimum results obtained by the proposed methodology. The comparison shows that machining performance in the laser beam cutting process is remarkably improved through the proposed approach.
Finally, the influence of laser cutting parameters such as lamp current, pulse frequency, pulse width, compressed air pressure and cutting speed on the top and bottom kerf deviations during Nd:YAG laser cutting of Kevlar-29 laminates is discussed.
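The optimization step can be sketched with a generic teaching-learning-based optimization (TLBO) loop. This is an illustrative Python re-implementation of the standard TLBO scheme (teacher phase pulling learners toward the best solution, learner phase of pairwise interactions), not the authors' MATLAB program; the quadratic test objective merely stands in for their kerf-deviation model.

```python
import random

def tlbo_minimize(f, bounds, pop_size=20, iters=100, seed=0):
    """Teaching-Learning-Based Optimization for box-constrained minimization."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda x: [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(iters):
        # Teacher phase: move each learner toward the best solution,
        # away from the class mean (teaching factor TF in {1, 2}).
        teacher = pop[cost.index(min(cost))]
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        for i in range(pop_size):
            tf = rng.choice([1, 2])
            cand = clip([pop[i][d] + rng.random() * (teacher[d] - tf * mean[d])
                         for d in range(dim)])
            c = f(cand)
            if c < cost[i]:
                pop[i], cost[i] = cand, c
        # Learner phase: each learner interacts with a random peer.
        for i in range(pop_size):
            j = rng.randrange(pop_size)
            if j == i:
                continue
            sign = 1 if cost[i] < cost[j] else -1
            cand = clip([pop[i][d] + sign * rng.random() * (pop[i][d] - pop[j][d])
                         for d in range(dim)])
            c = f(cand)
            if c < cost[i]:
                pop[i], cost[i] = cand, c
    best = cost.index(min(cost))
    return pop[best], cost[best]

# Toy objective standing in for the kerf-deviation response model
best_x, best_f = tlbo_minimize(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,
                               [(-10, 10), (-10, 10)])
```

TLBO is attractive in this setting because it has no algorithm-specific tuning parameters beyond population size and iteration count.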
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-04
... local HUD program staff. Questions on how to conduct FMR surveys or further methodological explanations... version of the RDD survey methodology for smaller, nonmetropolitan PHAs. This methodology is designed to...
Developing Army Leaders through Increased Rigor in Professional Military Training and Education
2017-06-09
leadership. Research Methodology: An applied, exploratory, qualitative research methodology via a structured and focused case study comparison was... Finally, it will discuss how the methodology will be conducted to make...development models; it serves as the base data for case study comparison. Research Methodology and Data Analysis: A qualitative research
González-Sáiz, José-María; Esteban-Díez, Isabel; Rodríguez-Tecedor, Sofía; Pizarro, Consuelo
2008-11-01
The overall purpose of the project, of which this study is a part, was to examine the feasibility of onion waste as a support-substrate for the profitable production of food-grade products. This study focused on the efficient production of ethanol from worthless onions by transforming the onion juice into onion liquor via alcoholic fermentation with the yeast Saccharomyces cerevisiae. The onion bioethanol produced could be later used as a favorable substrate for acetic fermentation to finally obtain onion vinegar. Near-infrared spectroscopy (NIRS), coupled with the multivariate curve resolution-alternating least squares (MCR-ALS) method, has been used to reveal the compositional and spectral profiles for both substrates and products of alcoholic fermentation runs, that is, total sugars, ethanol, and biomass concentration. The ambiguity associated with the ALS calculation was resolved by applying suitable inequality and equality constraints. The quality of the results provided by the NIR-based MCR-ALS methodology adopted was evaluated by several performance indicators, including the variance explained by the model, the lack of fit and the agreement between the MCR-ALS achieved solution and the results computed by applying previously validated PLS reference models. An additional fermentation run was employed to test the actual predictive ability of the ALS model developed. For all the components resolved in the fermentation system studied (i.e., total sugars, ethanol, and biomass), the final model obtained showed a high predictive ability and suitable accuracy and precision, both in calibration and external validation, confirmed by the very good agreement between the ALS responses and the reference values (the coefficient of determination was, in all cases, very close to 1, and the statistics confirmed that no significant difference was found between PLS reference models and the MCR-ALS methodology applied). 
Thus, the proven reliability of the MCR-ALS model presented in this study, based only on NIR measurements, makes it suitable for monitoring of the key species involved in the alcoholic fermentation of onion juice, allowing the process to be modeled and controlled in real time.
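The core MCR-ALS computation can be sketched as an alternating least-squares loop on the bilinear model D ≈ C Sᵀ. This is a minimal sketch with a non-negativity constraint applied by clipping; the published methodology also uses equality and inequality constraints to resolve the ALS ambiguity, and the simulated two-component "fermentation" data below are illustrative, not the study's NIR measurements.

```python
import numpy as np

def mcr_als(D, C0, n_iter=200):
    """Multivariate curve resolution by alternating least squares.

    D  : (times x wavelengths) data matrix, modeled as D ≈ C @ S.T
    C0 : initial guess for the (times x components) concentration profiles
    Returns resolved C, S and the relative lack of fit.
    """
    C = C0.copy()
    for _ in range(n_iter):
        # Solve for spectra given concentrations, then vice versa,
        # clipping to enforce non-negativity.
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    lack_of_fit = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
    return C, S, lack_of_fit

# Simulated two-component process: "sugars" decay while "ethanol" grows
t = np.linspace(0, 1, 50)[:, None]
C_true = np.hstack([np.exp(-3 * t), 1 - np.exp(-3 * t)])
S_true = np.array([[1.0, 0.2, 0.0], [0.1, 0.8, 1.0]]).T  # wavelengths x comp.
D = C_true @ S_true.T
C0 = C_true + 0.05 * np.random.default_rng(0).random(C_true.shape)
C_hat, S_hat, lof = mcr_als(D, C0)
```

With noiseless bilinear data and a reasonable initial estimate, the lack of fit drops essentially to zero, mirroring the high explained variance reported above.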
NASA Astrophysics Data System (ADS)
Singh, Jaswinder; Chauhan, Amit
2017-12-01
This study investigates the mechanical behavior of aluminum 2024 matrix composites reinforced with silicon carbide and red mud particles. The hybrid reinforcements were successfully incorporated into the alloy matrix using the stir casting process. An orthogonal array based on Taguchi's technique was used to acquire experimental data for mechanical properties (hardness and impact energy) of the composites. The analysis of variance (ANOVA) and response surface methodology (RSM) techniques were used to evaluate the influence of test parameters (reinforcement ratio, particle size and ageing time). The morphological analysis of the surfaces (fractured during impact tests) was conducted to identify the failure mechanism. Finally, a confirmation experiment was performed to check the adequacy of the developed model. The results indicate that the ageing time is the most effective parameter as far as the hardness of the hybrid composites is concerned. It has also been revealed that red mud wt.% has maximum influence on the impact energy characteristics of the hybrid composites. The study concludes that Al2024/SiC/red mud hybrid composites possess superior mechanical performance in comparison to pure alloy under optimized conditions.
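The Taguchi analysis of an orthogonal array typically summarizes replicated responses at each setting with a signal-to-noise (S/N) ratio. The sketch below shows the standard larger-the-better form (appropriate for hardness and impact energy) and its smaller-the-better counterpart; the sample hardness values are illustrative, not the study's data.

```python
import math

def sn_larger_the_better(values):
    """Taguchi S/N ratio for a larger-the-better response:
    S/N = -10 * log10(mean(1 / y^2))."""
    return -10 * math.log10(sum(1 / y ** 2 for y in values) / len(values))

def sn_smaller_the_better(values):
    """Taguchi S/N ratio for a smaller-the-better response:
    S/N = -10 * log10(mean(y^2))."""
    return -10 * math.log10(sum(y ** 2 for y in values) / len(values))

# Replicated hardness measurements at one orthogonal-array setting
sn = sn_larger_the_better([92.0, 95.0, 94.0])  # ~39.4 dB
```

Averaging such S/N ratios per factor level is what identifies, e.g., ageing time as the dominant factor for hardness.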
Traffic-related particulate air pollution exposure in urban areas
NASA Astrophysics Data System (ADS)
Borrego, C.; Tchepel, O.; Costa, A. M.; Martins, H.; Ferreira, J.; Miranda, A. I.
In recent years, a growing number of scientific studies have confirmed that long- and short-term exposure to particulate matter (PM) pollution leads to adverse health effects. The main objective of the current work is the development of a methodology for determining accumulated human exposure in urban areas, combining information on concentrations in different microenvironments with population time-activity pattern data. A link between a mesoscale meteorological and dispersion model and a local-scale air quality model was developed to define the boundary conditions for the local-scale application. The time-activity pattern of the population was derived from statistical information for different sub-population groups and linked to digital city maps. Finally, hourly PM10 concentrations for indoor and outdoor microenvironments were estimated for the Lisbon city centre, chosen as the case study, based on the local-scale air quality model application for a selected period. This methodology is a first approach to estimating population exposure, calculated as the total daily values above the thresholds recommended for long- and short-term health effects. The results reveal that in the Lisbon city centre a large number of people are exposed to PM levels exceeding the legislated limit value.
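The core of such an exposure calculation is a time-weighted combination of microenvironment concentrations and time-activity data. The sketch below shows the basic arithmetic for one population group; the microenvironment names and concentration values are hypothetical, and the real methodology works with hourly modelled concentrations rather than daily means.

```python
def accumulated_exposure(schedule, concentrations):
    """Time-weighted PM10 exposure for one population group.

    schedule:       {microenvironment: hours spent per day}
    concentrations: {microenvironment: mean PM10 concentration, ug/m3}
    Returns the 24 h time-weighted average concentration.
    """
    total_h = sum(schedule.values())
    return sum(schedule[m] * concentrations[m] for m in schedule) / total_h

# Hypothetical day for an office-worker group
daily = accumulated_exposure(
    {"home_indoor": 14, "office_indoor": 8, "outdoor_street": 2},
    {"home_indoor": 25.0, "office_indoor": 30.0, "outdoor_street": 55.0},
)  # ug/m3, to be compared against the legislated limit value
```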
Garcia Nieto, P J; Sánchez Lasheras, F; de Cos Juez, F J; Alonso Fernández, J R
2011-11-15
There is an increasing need to describe cyanobacteria blooms, since some cyanobacteria produce toxins, termed cyanotoxins. These can be toxic and dangerous to humans as well as to other animals and life in general. It must be remarked that cyanobacteria reproduce explosively under certain conditions, resulting in algae blooms that can become harmful to other species if the cyanobacteria involved produce cyanotoxins. In this research work, the evolution of cyanotoxins in the Trasona reservoir (Principality of Asturias, Northern Spain) was successfully studied using a data mining methodology based on the multivariate adaptive regression splines (MARS) technique. The results of the present study are two-fold: on one hand, the importance of the different kinds of cyanobacteria for the presence of cyanotoxins in the reservoir is revealed through the MARS model; on the other hand, a predictive model able to forecast the possible presence of cyanotoxins in the short term was obtained. The agreement of the MARS model with experimental data confirmed its good performance. Finally, conclusions of this research are presented. Copyright © 2011 Elsevier B.V. All rights reserved.
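MARS models are built from piecewise-linear hinge basis functions of the form max(0, ±(x − t)). The sketch below fits a least-squares model on a fixed pair of hinges to show the basis MARS builds on; real MARS selects knots adaptively through forward growth and backward pruning, and the kinked "toxin trend" data here are synthetic, not the reservoir measurements.

```python
import numpy as np

def hinge(x, knot, sign=+1):
    """MARS basis function: max(0, sign * (x - knot))."""
    return np.maximum(0.0, sign * (x - knot))

def fit_mars_like(x, y, knots):
    """Least-squares fit on an intercept plus paired hinge functions.
    (Knot selection, the adaptive part of MARS, is omitted.)"""
    X = np.column_stack([np.ones_like(x)] +
                        [hinge(x, k, s) for k in knots for s in (+1, -1)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ coef  # fitted values

x = np.linspace(0, 10, 200)
y = np.where(x < 5, 2 * x, 10 + 0.5 * (x - 5))  # synthetic kinked trend
y_hat = fit_mars_like(x, y, knots=[5.0])        # exact fit: kink at the knot
```

Because the synthetic trend is piecewise linear with its kink at the knot, the hinge basis reproduces it exactly, which is precisely the flexibility that makes MARS suitable for threshold-like ecological responses.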
2008-03-14
OSHA is confirming the effective date of its direct final rule that revises a number of standards for general industry that refer to national consensus standards. The direct final rule states that it would become effective on March 13, 2008 unless OSHA receives significant adverse comment on these revisions by January 14, 2008. OSHA received no adverse comments by that date and, therefore, is confirming that the rule will become effective on March 13, 2008.
Analysis of Additive Manufacturing for Sustainment of Naval Aviation Systems
2017-09-01
selection methodology to query the aviation spare-parts inventory for identification of additive manufacturing candidates. The methodology organizes the resultant data using a top-down approach that aligns technical feasibility with programmatic objectives. Finally, a discrete event
Evaluation of Methodology for Estimating the Cost of Air Force On-The-Job Training. Final Report.
ERIC Educational Resources Information Center
Samers, Bernard N.; And Others
Described is the final phase of a study directed at the development of an on-the-job training (OJT) costing methodology. Utilizing a modification of survey techniques tested and evaluated during the previous phase, estimates were obtained for the cost of OJT for airman training from the 1-level (unskilled) to the 3-level (semiskilled) in five…
DOT National Transportation Integrated Search
1998-09-14
A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis (2) establish the mea...
Human albumin solders for clinical application during laser tissue welding.
Poppas, D P; Wright, E J; Guthrie, P D; Shlahet, L T; Retik, A B
1996-01-01
Fifty percent human albumin solder significantly improves weld strength when compared to lower concentrations [Wright et al., ASLMS meeting, April 1995]. We developed a method for preparing 50% human albumin that may be considered compatible with clinical applications. Fifty percent human albumin solder was prepared from commercially available 25% human albumin using a lyophilization technique. Assessments of sterility, viscosity, pH, and peak absorption wavelength were performed. This report describes the methodology used to prepare a 50% human albumin solder that is compatible with clinical use. Maintenance of the structural integrity of the albumin was confirmed by polyacrylamide gel electrophoresis. This solder preparation can be used alone or with the addition of exogenous chromophores. The final product is sterile, prepared under viral-free protocols, maintains high viscosity, and can be applied easily during open or laparoscopic procedures.
2016-09-15
Weapon Simulator Test Methodology Investigation: Comparison of Live Fire and Weapon Simulator Test Methodologies and the Effects of Clothing and Individual Equipment on Marksmanship. Final report, covering October 2014 to August 2015.
Nasserie, Tahmina; Tuite, Ashleigh R; Whitmore, Lindsay; Hatchette, Todd; Drews, Steven J; Peci, Adriana; Kwong, Jeffrey C; Friedman, Dara; Garber, Gary; Gubbay, Jonathan; Fisman, David N
2017-01-01
Seasonal influenza epidemics occur frequently. Rapid characterization of seasonal dynamics and forecasting of epidemic peaks and final sizes could help support real-time decision-making related to vaccination and other control measures, but real-time forecasting remains challenging. We used the previously described "incidence decay with exponential adjustment" (IDEA) model, a 2-parameter phenomenological model, to evaluate the characteristics of the 2015-2016 influenza season in 4 Canadian jurisdictions: the Provinces of Alberta, Nova Scotia and Ontario, and the City of Ottawa. Model fits were updated weekly upon receipt of incident virologically confirmed case counts. Best-fit models were used to project seasonal influenza peaks and epidemic final sizes. The 2015-2016 influenza season was mild and late-peaking. Parameter estimates generated through fitting were consistent in the 2 largest jurisdictions (Ontario and Alberta) and with pooled data including Nova Scotia counts (R0 approximately 1.4 for all fits). Lower R0 estimates were generated in Nova Scotia and Ottawa. Final size projections that made use of complete time series were accurate to within 6% of true final sizes, but final size projections were less accurate when based on pre-peak data. Projections of epidemic peaks stabilized before the true epidemic peak, but were persistently early (~2 weeks) relative to the true peak. A simple, 2-parameter influenza model provided reasonably accurate real-time projections of influenza seasonal dynamics in an atypically late, mild influenza season. The challenges are similar to those seen with more complex forecasting methodologies. Future work includes identification of seasonal characteristics associated with variability in model performance.
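The IDEA model's two parameters can be fit to weekly confirmed-case counts by simple least squares. The sketch below uses the published IDEA form, I(t) = (R0 / (1 + d)^t)^t with t counted in serial-interval generations, and a brute-force grid search as a stand-in for whatever fitting routine the authors used; the synthetic season is illustrative, not surveillance data.

```python
def idea_incidence(t, r0, d):
    """IDEA model: incident cases in serial-interval generation t,
    I(t) = (R0 / (1 + d)**t) ** t, with reproduction number R0 and
    discount factor d."""
    return (r0 / (1.0 + d) ** t) ** t

def fit_idea(counts, grid_r0, grid_d):
    """Brute-force least-squares fit of (R0, d) to generation counts."""
    best = None
    for r0 in grid_r0:
        for d in grid_d:
            sse = sum((idea_incidence(t, r0, d) - c) ** 2
                      for t, c in enumerate(counts, start=1))
            if best is None or sse < best[0]:
                best = (sse, r0, d)
    return best[1], best[2]

# Synthetic season generated with R0 = 1.4, d = 0.01, then refit
true_counts = [idea_incidence(t, 1.4, 0.01) for t in range(1, 20)]
r0_hat, d_hat = fit_idea(true_counts,
                         [1.2 + 0.05 * i for i in range(9)],   # R0 grid
                         [0.005 * i for i in range(9)])        # d grid
```

Refitting this grid each week as new counts arrive mimics the weekly-update procedure described above; the model's peak and final size follow directly from the fitted (R0, d) pair.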
ECSIN's methodological approach for hazard evaluation of engineered nanomaterials
NASA Astrophysics Data System (ADS)
Bregoli, Lisa; Benetti, Federico; Venturini, Marco; Sabbioni, Enrico
2013-04-01
The increasing production volumes and commercialization of engineered nanomaterials (ENM), together with data on their higher biological reactivity compared to their bulk counterparts and their ability to cross biological barriers, have caused concerns about their potential impacts on the health and safety of both humans and the environment. A multidisciplinary component of the scientific community has been called upon to evaluate the real risks associated with the use of products containing ENM, and is today in the process of developing specific definitions and testing strategies for nanomaterials. At ECSIN we are developing an integrated multidisciplinary methodological approach for the evaluation of the biological effects of ENM on the environment and human health. While our testing strategy agrees with the most widely advanced line of work at the European level, the choice of methods and optimization of protocols is made with extended attention to detail. This attention to methodological and technical details is based on the acknowledgment that the innovative characteristics of matter at the nanoscale may influence existing testing methods in a partially unpredictable manner, an aspect frequently recognized at the discussion level but oftentimes disregarded at the laboratory bench. This work outlines the most important steps of our testing approach. In particular, each step is briefly discussed in terms of the potential technical and methodological pitfalls that we have encountered, and which are often ignored in nanotoxicology research. The final aim is to draw attention to the need for preliminary studies in developing reliable tests, a crucial aspect in confirming the suitability of the chosen analytical and toxicological methods for the specific nanoparticle tested, and to express the idea that in nanotoxicology, "the devil is in the detail".
2012 Review on the Extension of the AMedP-8(C) Methodology to New Agents, Materials, and Conditions
2013-10-01
Atlantic Treaty Organization (NATO) to estimate casualties from chemical, biological, radiological, and nuclear (CBRN) weapons. The final draft documenting this methodology was published by IDA in 2009 and was...from Battlefield Exposure to Chemical, Biological and Radiological Agents and Nuclear Weapon Effects. IDA Document D-4465. Alexandria, VA: IDA
Mezcua, Milagros; Ferrer, Carmen; García-Reyes, Juan F; Martínez-Bueno, María Jesús; Albarracín, Micaela; Claret, María; Fernández-Alba, Amadeo R
2008-05-01
In this work, two analytical methods based on liquid chromatography coupled to electrospray time-of-flight mass spectrometry (LC/ESI-TOFMS) and tandem mass spectrometry (LC/ESI-MS/MS) are described for the identification, confirmation and quantitation of three insecticides not authorized in the European Union (nitenpyram, isocarbophos and isofenphos-methyl) but detected in recent monitoring programmes in pepper samples. The proposed methodologies involve sample extraction by liquid-liquid partition with acetonitrile followed by a cleanup step based on dispersive solid-phase extraction. Recovery studies performed on peppers spiked at two fortification levels (10 and 50 microg kg(-1)) yielded average recoveries in the range 76-100%, with relative standard deviation (RSD) values below 10%. Identification, confirmation and quantitation were carried out by LC/TOFMS and by LC/MS/MS using a hybrid triple quadrupole linear ion trap (QqLIT) instrument in multiple-reaction monitoring (MRM) mode. The limits of quantitation (LOQs) obtained were in the range 0.1-5 microg kg(-1), depending on the individual technique. Finally, the proposed methods were successfully applied to the analysis of suspected pepper samples. Copyright (c) 2008 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Brooks, Keith; And Others
1979-01-01
Discusses the benefits of the International Communication Association Communication Audit as a methodology for evaluation of organizational communication processes and outcomes. An "after" survey of 16 audited organizations confirmed the audit as a valid diagnostic methodology and organization development intervention technique which…
Park, SoMi; Hur, Hea Kung; Kim, Ki Kyong; Song, Hee Young
2017-08-01
This study was undertaken to develop and test a mastery learning program of nursing skills for undergraduate nursing students. In this methodological study, first, the preliminary draft of a mastery learning program to provide training for nursing skills was developed based on Bloom's framework for mastery learning. Second, to test the developed program, a single-blinded, nonequivalent control group nonsynchronized study was conducted on 50 senior nursing students in a University selected by convenient sampling. Thirteen students were assigned to a control group; 13, 12, and 13 of them were assigned to intravenous therapy, transfusion, and patient transfer groups, respectively. The achievement levels and performance scores of the selected nursing skills were measured before and after the completion of the program in all the groups. Lastly, the final program was confirmed based on the results of the program testing. Intravenous therapy, transfusion, and patient transfer were selected as essential nursing skills for the program based on the priorities rated by clinical instructors and staff nurses. The achievement levels of selected nursing skills were determined by Angoff scores. After participating in the program, the proportion of passers and performance scores of the nursing skills in the experimental groups were significantly higher than those in the control group. The final program was confirmed which included a diagnostic test, enrichment activities for the passers and three repetitions of corrective activities and formative assessments for non-passers. The results suggest that a mastery learning program for undergraduate students can lead to better improvement and performance of essential nursing skills. © 2017 Korean Society of Nursing Science
Recovering galaxy cluster gas density profiles with XMM-Newton and Chandra
NASA Astrophysics Data System (ADS)
Bartalucci, I.; Arnaud, M.; Pratt, G. W.; Vikhlinin, A.; Pointecouteau, E.; Forman, W. R.; Jones, C.; Mazzotta, P.; Andrade-Santos, F.
2017-12-01
We examined the reconstruction of galaxy cluster radial density profiles obtained from Chandra and XMM-Newton X-ray observations, using high quality data for a sample of twelve objects covering a range of morphologies and redshifts. By comparing the results obtained from the two observatories and by varying key aspects of the analysis procedure, we examined the impact of instrumental effects and of differences in the methodology used in the recovery of the density profiles. We find that the final density profile shape is particularly robust. We adapted the photon weighting vignetting correction method developed for XMM-Newton for use with Chandra data, and confirm that the resulting Chandra profiles are consistent with those corrected a posteriori for vignetting effects. Profiles obtained from direct deprojection and those derived using parametric models are consistent at the 1% level. At radii larger than 6″, the agreement between Chandra and XMM-Newton is better than 1%, confirming an excellent understanding of the XMM-Newton PSF. Furthermore, we find no significant energy dependence. The impact of the well-known offset between Chandra and XMM-Newton gas temperature determinations on the density profiles is found to be negligible. However, we find an overall normalisation offset in density profiles of the order of 2.5%, which is linked to absolute flux cross-calibration issues. As a final result, the weighted ratios of Chandra to XMM-Newton gas masses computed at R2500 and R500 are r = 1.03 ± 0.01 and r = 1.03 ± 0.03, respectively. Our study confirms that the radial density profiles are robustly recovered, and that any differences between Chandra and XMM-Newton can be constrained to the 2.5% level, regardless of the exact data analysis details. These encouraging results open the way for the true combination of X-ray observations of galaxy clusters, fully leveraging the high resolution of Chandra and the high throughput of XMM-Newton.
Martín, Carlos; Pastor, Loly
2018-01-01
Objectives The purpose of this study is to provide an updated systematic review identifying studies that describe the prevalence of psychosis, in order to explore methodological factors that could account for the variation in prevalence estimates. Methods Studies with original data related to the prevalence of psychosis (published between 1990 and 2015) were identified by searching electronic databases and manually reviewing citations. Prevalence estimates were sorted according to prevalence type (point, 12-month, and lifetime). The independent association between key methodological variables (prevalence type, case-finding setting, method of confirming diagnosis, international classification of diseases version, diagnostic category, and study quality) and the pooled prevalence estimate was examined by meta-analytical techniques and random-effects meta-regression. Results Seventy-three primary studies were included, providing a total of 101 estimates of prevalence rates of psychosis. Across these studies, the pooled median point and 12-month prevalence for persons were 3.89 and 4.03 per 1000, respectively, and the median lifetime prevalence was 7.49 per 1000. The random-effects meta-regression analysis revealed a significant effect of prevalence type, with higher rates of lifetime prevalence than 12-month prevalence (p<0.001). Studies conducted in the general population presented higher prevalence rates than those carried out in populations attended in health/social services (p = 0.006). Compared to the diagnosis of schizophrenia only, prevalence rates were higher for probable psychotic disorder (p = 0.022) and non-affective psychosis (p = 0.009). Finally, higher study quality was associated with a lower estimated prevalence of psychotic disorders (p<0.001).
Conclusions This systematic review provides a comprehensive comparison of the methodologies used in studies of the prevalence of psychosis, which can inform the choice of methodological approach in future epidemiological studies. PMID:29649252
Management of health care expenditure by soft computing methodology
NASA Astrophysics Data System (ADS)
Maksimović, Goran; Jović, Srđan; Jovanović, Radomir; Aničić, Obrad
2017-01-01
In this study, health care expenditure was managed by a soft computing methodology. The main goal was to predict gross domestic product (GDP) from several factors of health care expenditure. Soft computing methodologies were applied since GDP prediction is a very complex task. The performance of the proposed predictors was confirmed by the simulation results. According to the results, support vector regression (SVR) has better prediction accuracy than the other soft computing methodologies. The soft computing methods benefit from global optimization capabilities that avoid local minimum issues.
Simple automatic strategy for background drift correction in chromatographic data analysis.
Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin
2016-06-03
Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
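The pipeline this abstract outlines (local-minima baseline vector, iterative replacement of peak-belonging outliers, linear interpolation back to the full chromatogram) can be sketched roughly as below. The window size, smoothing kernel, and outlier rule are illustrative assumptions, not the authors' actual parameters.

```python
import numpy as np

def correct_baseline(signal, window=5, max_iter=50, tol=1e-6):
    """Estimate and subtract background drift from a 1-D chromatogram.

    Follows the scheme described in the abstract: collect local minima,
    iteratively pull down outliers (points sitting on chromatographic
    peaks) in that baseline vector, then linearly interpolate back to
    the full chromatogram length.
    """
    signal = np.asarray(signal, dtype=float)
    n = len(signal)
    # 1. Local minima within a sliding window form the initial baseline vector.
    idx = [i for i in range(n)
           if signal[i] == signal[max(0, i - window):i + window + 1].min()]
    base = signal[np.array(idx)].copy()
    # 2. Iteratively replace points that still belong to peaks (outlier rule
    #    here: far above an edge-padded 3-point moving average).
    for _ in range(max_iter):
        padded = np.concatenate(([base[0]], base, [base[-1]]))
        smooth = np.convolve(padded, np.ones(3) / 3, mode="valid")
        outliers = base > smooth + base.std()
        if not outliers.any():
            break
        new = base.copy()
        new[outliers] = smooth[outliers]
        if np.abs(new - base).max() < tol:
            base = new
            break
        base = new
    # 3. Expand to the original axis by linear interpolation and subtract.
    drift = np.interp(np.arange(n), idx, base)
    return signal - drift, drift
```

On a synthetic chromatogram (linear drift plus one Gaussian peak plus noise), the routine removes the drift while leaving the peak essentially untouched.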
Batchu, Sudha Rani; Ramirez, Cesar E; Gardinali, Piero R
2015-05-01
Because of its widespread consumption and its persistence during wastewater treatment, the artificial sweetener sucralose has gained considerable interest as a proxy to detect wastewater intrusion into usable water resources. The molecular resilience of this compound dictates that coastal and oceanic waters are its final recipients, with unknown effects on ecosystems. Furthermore, no suitable methodologies have been reported for routine, ultra-trace detection of sucralose in seawater, as the sensitivity of traditional liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis is limited by a low yield of product ions upon collision-induced dissociation (CID). In this work, we report the development and field testing of an alternative analysis tool for sucralose in environmental waters, with enough sensitivity for proper quantitation and confirmation of this analyte in seawater. The methodology is based on automated online solid-phase extraction (SPE) and high-resolving-power orbitrap MS detection. Operating in full scan (no CID), detection of the unique isotopic pattern (100:96:31 for [M-H](-), [M-H+2](-), and [M-H+4](-), respectively) was used for ultra-trace quantitation and analyte identification. The method offers fast analysis (14 min per run) and low sample consumption (10 mL per sample), with method detection and confirmation limits (MDLs and MCLs) of 1.4 and 5.7 ng/L in seawater, respectively. The methodology involves low operating costs because there are virtually no sample preparation steps or consumables. As an application example, samples were collected from 17 oceanic and estuarine sites in Broward County, FL, with varying salinity (6-40 PSU). Samples included the ocean outfall of the Southern Regional Wastewater Treatment Plant (WWTP) that serves Hollywood, FL.
Sucralose was detected above MCL in 78% of the samples at concentrations ranging from 8 to 148 ng/L, with the exception of the WWTP ocean outfall (at pipe end, 28 m below the surface) where the measured concentration was 8418 ± 3813 ng/L. These results demonstrate the applicability of this monitoring tool for the trace-level detection of this wastewater marker in very dilute environmental waters.
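As a rough illustration of the full-scan confirmation step, the measured M, M+2, M+4 intensities can be screened against the theoretical 100:96:31 pattern. The fractional tolerance below is an arbitrary placeholder, not the acceptance criterion used in the study.

```python
SUCRALOSE_PATTERN = (100.0, 96.0, 31.0)  # theoretical [M-H]-, +2, +4 abundances

def matches_isotope_pattern(intensities, theory=SUCRALOSE_PATTERN, tol=0.15):
    """Return True when a measured (M, M+2, M+4) intensity triplet agrees
    with the theoretical isotopic pattern within a fractional tolerance.

    `tol` is an illustrative acceptance window, not a value from the paper.
    """
    mono = intensities[0]
    if mono <= 0:
        return False
    for measured, reference in zip(intensities, theory):
        expected = reference / theory[0]  # ratio relative to monoisotopic peak
        if abs(measured / mono - expected) > tol * expected:
            return False
    return True
```

A triplet close to 100:96:31 passes; one with a distorted M+2 channel is rejected.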
[Preliminarily application of content analysis to qualitative nursing data].
Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang
2012-10-01
Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.
NASA Astrophysics Data System (ADS)
Razafindratsima, Stephen; Guérin, Roger; Bendjoudi, Hocine; de Marsily, Ghislain
2014-09-01
A methodological approach is described which combines geophysical and geochemical data to delineate the extent of a chlorinated ethenes plume in northern France; the methodology was used to calibrate a hydrogeological model of the contaminants' migration and degradation. The existence of strong reducing conditions in some parts of the aquifer is first determined by measuring in situ the redox potential and dissolved oxygen, dissolved ferrous iron and chloride concentrations. Electrical resistivity imaging and electromagnetic mapping, using the Slingram method, are then used to determine the shape of the pollutant plume. A decreasing empirical exponential relation between measured chloride concentrations in the water and aquifer electrical resistivity is observed; the resistivity formation factor calculated at a few points also shows a major contribution of chloride concentration in the resistivity of the saturated porous medium. MODFLOW software and MT3D99 first-order parent-daughter chain reaction and the RT3D aerobic-anaerobic model for tetrachloroethene (PCE)/trichloroethene (TCE) dechlorination are finally used for a first attempt at modeling the degradation of the chlorinated ethenes. After calibration, the distribution of the chlorinated ethenes and their degradation products simulated with the model approximately reflects the mean measured values in the observation wells, confirming the data-derived image of the plume.
Multivariate Analysis and Prediction of Dioxin-Furan ...
Peer Review Draft of Regional Methods Initiative Final Report. Dioxins, which are bioaccumulative and environmentally persistent, pose an ongoing risk to human and ecosystem health. Fish constitute a significant source of dioxin exposure for humans and fish-eating wildlife. Current dioxin analytical methods are costly, time-consuming, and produce hazardous by-products. A Danish team developed a novel multivariate statistical methodology based on the covariance of dioxin-furan congener Toxic Equivalences (TEQs) and fatty acid methyl esters (FAMEs) and applied it to North Atlantic Ocean fishmeal samples. The goal of the current study was to extend this Danish methodology to 77 whole and composite fish samples from three trophic groups: predator (whole largemouth bass), benthic (whole flathead and channel catfish), and forage fish (composite bluegill, pumpkinseed, and green sunfish) from two dioxin-contaminated rivers (Pocatalico R. and Kanawha R.) in West Virginia, USA. Multivariate statistical analyses, including Principal Components Analysis (PCA), Hierarchical Clustering, and Partial Least Squares Regression (PLS), were used to assess the relationship between the FAMEs and TEQs in these dioxin-contaminated freshwater fish from the Kanawha and Pocatalico Rivers. All three multivariate statistical methods confirm that the pattern of Fatty Acid Methyl Esters (FAMEs) in these freshwater fish covaries with and is predictive of the WHO TE
Quantitative trait Loci analysis using the false discovery rate.
Benjamini, Yoav; Yekutieli, Daniel
2005-10-01
False discovery rate control has become an essential tool in any study that has a very large multiplicity problem. False discovery rate-controlling procedures have also been found to be very effective in QTL analysis, ensuring reproducible results with few falsely discovered linkages and offering increased power to discover QTL, although their acceptance has been slower than in microarray analysis, for example. The reason is partly that the methodological aspects of applying the false discovery rate to QTL mapping are not well developed. Our aim in this work is to lay a solid foundation for the use of the false discovery rate in QTL mapping. We review the false discovery rate criterion, the appropriate interpretation of the FDR, and alternative formulations of the FDR that have appeared in the statistical and genetics literature. We discuss important features of the FDR approach, some stemming from new developments in FDR theory and methodology, that make it especially useful in linkage analysis. We review false discovery rate-controlling procedures (the BH procedure, the resampling procedure, and the adaptive two-stage procedure) and discuss the validity of these procedures in single- and multiple-trait QTL mapping. Finally, we argue that control of the false discovery rate has an important role in suggesting, indicating the significance of, and confirming QTL, and we present guidelines for its use.
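The BH step-up procedure reviewed here is simple to state: sort the m p-values, find the largest k with p_(k) ≤ (k/m)·q, and reject the k hypotheses with the smallest p-values. A minimal sketch:

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Benjamini-Hochberg step-up procedure.

    Returns a boolean rejection decision for each p-value, controlling the
    false discovery rate at level q under independence (or positive
    dependence) of the test statistics.
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Largest rank k whose ordered p-value clears its step-up threshold.
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * q:
            k = rank
    # Reject the k smallest p-values (step-up: all below the crossing point).
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k:
            reject[i] = True
    return reject
```

On a small worked example with 15 tests, the four smallest p-values are rejected at q = 0.05 even though the fifth through ninth would each pass an unadjusted 0.05 cutoff.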
Optimization of Fenton's oxidation of herbicide dicamba in water using response surface methodology
NASA Astrophysics Data System (ADS)
Sangami, Sanjeev; Manu, Basavaraju
2017-12-01
In this study, Fenton's oxidation of dicamba in aqueous medium was investigated using the response surface methodology. The influence of the independent variables H2O2/COD (A), H2O2/Fe2+ (B), pH (C) and reaction time (D) was studied on two responses (COD and dicamba removal efficiency). The dosages of H2O2 (5.35-17.4 mM) and Fe2+ (0.09-2.13 mM) were varied, and an optimum dicamba removal of 84.01% was obtained with H2O2 and Fe2+ dosages of 11.38 and 0.33 mM, respectively. The whole oxidation process was monitored by high performance liquid chromatography (HPLC) along with liquid chromatography/mass spectrometry (LC/MS). It was found that 82% of dicamba was mineralized to oxalic acid, chloride ion, CO2 and H2O, consistent with a COD removal of 81.53%. Regression analysis was performed, in which the standard deviation (<4%), coefficient of variation (<8), F value (Fisher's test) (>2.74), coefficient of correlation (R2 ≈ adjusted R2) and adequate precision (>12) were in good agreement with model values. Finally, the treatment process was validated by performing additional experiments.
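Response surface methodology of the kind used here fits a full second-order polynomial in the coded factors by least squares. The generic sketch below is not the authors' design or data; it simply shows the model family underlying a four-factor RSM study like this one.

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    """Least-squares fit of a full second-order response surface
    y ~ b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj),
    the model family underlying central-composite RSM designs.

    Returns the coefficient vector and matching term names.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, k = X.shape
    cols, names = [np.ones(n)], ["1"]
    for i in range(k):                       # linear terms
        cols.append(X[:, i])
        names.append(f"x{i}")
    for i in range(k):                       # pure quadratic terms
        cols.append(X[:, i] ** 2)
        names.append(f"x{i}^2")
    for i in range(k):                       # two-factor interactions
        for j in range(i + 1, k):
            cols.append(X[:, i] * X[:, j])
            names.append(f"x{i}*x{j}")
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return coef, names
```

Fitting a noiseless quadratic response generated on a 3-level factorial grid recovers the known coefficients exactly.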
2016-01-01
We systematically assessed the current clinical evidence of Gualouxiebaibanxia (GLXBBX) decoction for the treatment of angina pectoris (AP). We included RCTs testing GLXBBX against conventional drugs, and GLXBBX combined with conventional drugs versus conventional drugs alone. 19 RCTs involving 1730 patients were finally identified, and their methodological quality was evaluated as generally low. The results of the meta-analysis showed that GLXBBX alone had a significant effect on improving angina symptoms (RR: 1.24, 95% CI 1.14 to 1.35; P < 0.00001), ECG (RR: 1.28 [1.13,1.44]; P < 0.0001), and HDL-C (MD: 0.56 [0.54,0.58]; P < 0.00001) compared with anti-arrhythmic drugs. A significant improvement in angina symptoms (RR: 1.17 [1.12,1.22]; P < 0.00001) and ECG (RR: 1.22 [1.14,1.30]; P < 0.00001) was observed for GLXBBX plus conventional drugs when compared with conventional drugs alone. Eight trials reported adverse events, none of them serious. GLXBBX appears to have beneficial effects on improvement of ECG and reduction of angina symptoms in participants with AP. However, the evidence remains weak due to the poor methodological quality of the included studies. More rigorous trials are needed to confirm the results. PMID:27777598
The Relationship between Ethical Positions and Methodological Approaches: A Scandinavian Perspective
ERIC Educational Resources Information Center
Beach, Dennis; Eriksson, Anita
2010-01-01
In this article, based on reading ethnographic theses, books and articles and conversations with nine key informants, we have tried to describe how research ethics are approached and written about in educational ethnography in Scandinavia. The article confirms findings from previous research that there are different methodological forms of…
Nasserie, Tahmina; Tuite, Ashleigh R; Whitmore, Lindsay; Hatchette, Todd; Drews, Steven J; Peci, Adriana; Kwong, Jeffrey C; Friedman, Dara; Garber, Gary; Gubbay, Jonathan
2017-01-01
Abstract Background Seasonal influenza epidemics occur frequently. Rapid characterization of seasonal dynamics and forecasting of epidemic peaks and final sizes could help support real-time decision-making related to vaccination and other control measures. Real-time forecasting remains challenging. Methods We used the previously described “incidence decay with exponential adjustment” (IDEA) model, a 2-parameter phenomenological model, to evaluate the characteristics of the 2015–2016 influenza season in 4 Canadian jurisdictions: the Provinces of Alberta, Nova Scotia and Ontario, and the City of Ottawa. Model fits were updated weekly on receipt of incident virologically confirmed case counts. Best-fit models were used to project seasonal influenza peaks and epidemic final sizes. Results The 2015–2016 influenza season was mild and late-peaking. Parameter estimates generated through fitting were consistent in the 2 largest jurisdictions (Ontario and Alberta) and with pooled data including Nova Scotia counts (R0 approximately 1.4 for all fits). Lower R0 estimates were generated in Nova Scotia and Ottawa. Final size projections that made use of complete time series were accurate to within 6% of true final sizes, but final size projections were less accurate when only pre-peak data were used. Projections of epidemic peaks stabilized before the true epidemic peak, but these were persistently early (~2 weeks) relative to the true peak. Conclusions A simple, 2-parameter influenza model provided reasonably accurate real-time projections of influenza seasonal dynamics in an atypically late, mild influenza season. Challenges are similar to those seen with more complex forecasting methodologies. Future work includes identification of seasonal characteristics associated with variability in model performance. PMID:29497629
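The previously described IDEA model expresses incident cases in scaled generation t as I(t) = (R0 / (1+d)^t)^t, so the epidemic peaks at t = ln(R0) / (2 ln(1+d)). The toy grid-search fit below is a simplistic stand-in for whatever estimation routine the authors used, purely to illustrate how weekly refitting of the two parameters works.

```python
import math

def idea_incidence(t, r0, d):
    """IDEA model: incident case count in generation t (t measured in
    units of the serial interval), with control parameter d damping growth."""
    return (r0 / (1.0 + d) ** t) ** t

def idea_peak_time(r0, d):
    """Generation at which I(t) is maximised: ln(R0) / (2 ln(1+d))."""
    return math.log(r0) / (2.0 * math.log(1.0 + d))

def fit_idea(counts, r0_grid, d_grid):
    """Crude least-squares grid search for (R0, d) over candidate values;
    a stand-in for a proper optimiser, purely for illustration."""
    best = None
    for r0 in r0_grid:
        for d in d_grid:
            sse = sum((idea_incidence(t, r0, d) - c) ** 2
                      for t, c in enumerate(counts, start=1))
            if best is None or sse < best[0]:
                best = (sse, r0, d)
    return best[1], best[2]
```

With R0 = 1.4 (the value reported for the larger jurisdictions) and a small d, the model produces a late single-peaked season, and the grid search recovers the generating parameters from a noiseless count series.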
75 FR 43 - Modification of Class E Airspace; Beckley, WV
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... and management of the aircraft operations at Raleigh County Memorial Airport. DATES: Effective 0901... Aviation Administration (FAA), DOT. ACTION: Direct final rule, confirmation of effective date. SUMMARY: This action confirms the effective date of a direct final rule published in the Federal Register that...
75 FR 43 - Modification of Class E Airspace; Sarasota, FL
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... Aviation Administration (FAA), DOT. ACTION: Direct final rule, confirmation of effective date. SUMMARY: This action confirms the effective date of a direct final rule published in the Federal Register that... increases the safety and management of the aircraft operations at Sarasota/Bradenton International Airport...
Sheeran, Paul S; Luois, Samantha; Dayton, Paul A; Matsunaga, Terry O
2011-09-06
Recent efforts in the area of acoustic droplet vaporization with the objective of designing extravascular ultrasound contrast agents have led to the development of stabilized, lipid-encapsulated nanodroplets of the highly volatile compound decafluorobutane (DFB). We developed two methods of generating DFB droplets, the first of which involves condensing DFB gas (boiling point from -1.1 to -2 °C) followed by extrusion with a lipid formulation in HEPES buffer. Acoustic droplet vaporization of micrometer-sized lipid-coated droplets at diagnostic ultrasound frequencies and mechanical indices was confirmed optically. In our second formulation methodology, we demonstrate the formulation of submicrometer-sized lipid-coated nanodroplets based upon condensation of preformed microbubbles containing DFB. The droplets are routinely in the 200-300 nm range and yield microbubbles on the order of 1-5 μm once vaporized, consistent with ideal gas law expansion predictions. The simple and effective nature of this methodology allows for the development of a variety of different formulations that can be used for imaging, drug and gene delivery, and therapy. This study is the first to our knowledge to demonstrate both a method of generating ADV agents by microbubble condensation and a formulation of primarily submicrometer droplets of decafluorobutane that remain stable at physiological temperatures. Finally, activation of DFB nanodroplets is demonstrated using pressures within the FDA guidelines for diagnostic imaging, which may minimize the potential for bioeffects in humans. This methodology offers a new means of developing extravascular contrast agents for diagnostic and therapeutic applications. © 2011 American Chemical Society
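The "ideal gas law expansion prediction" mentioned above can be checked with a back-of-the-envelope calculation: the gas volume produced by a vaporized liquid droplet scales diameters by the cube root of ρ_liquid·R·T/(M·P). The liquid DFB density used here is an assumed literature-order value, and body temperature and atmospheric pressure are assumed conditions; none of these numbers are taken from the paper.

```python
def vaporized_diameter(droplet_diameter_m, rho_liquid=1517.0,
                       molar_mass=0.23803, temperature=310.0,
                       pressure=101325.0):
    """Ideal-gas prediction of the bubble diameter produced when a liquid
    decafluorobutane droplet vaporizes.

    rho_liquid (kg/m^3) is an ASSUMED order-of-magnitude value for liquid
    DFB; molar_mass is that of C4F10 (kg/mol); temperature (K) and
    pressure (Pa) default to physiological and atmospheric conditions.
    """
    R = 8.314  # J/(mol K)
    # Volume expansion factor V_gas / V_liquid = rho_l * R * T / (M * P);
    # diameters scale as its cube root.
    expansion = (rho_liquid * R * temperature
                 / (molar_mass * pressure)) ** (1.0 / 3.0)
    return droplet_diameter_m * expansion
```

Under these assumptions a ~250 nm droplet expands to roughly 1.4 μm, consistent with the 1-5 μm microbubbles reported.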
75 FR 42 - Establishment of Class E Airspace; Spencer, WV
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-04
... Aviation Administration (FAA), DOT. ACTION: Direct final rule; confirmation of effective date. SUMMARY: This action confirms the effective date of a direct final rule published in the Federal Register that establishes Class E Airspace at Spencer, WV. This action enhances the safety and airspace management of Boggs...
77 FR 13969 - Revising Standards Referenced in the Acetylene Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-08
.... OSHA-2011-0183] RIN 1218-AC64 Revising Standards Referenced in the Acetylene Standard AGENCY: Occupational Safety and Health Administration (OSHA), Department of Labor. ACTION: Final rule; confirmation of effective date. SUMMARY: OSHA is confirming the effective date of its direct final rule that revises the...
18 CFR 300.21 - Final confirmation and approval.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Final confirmation and approval. 300.21 Section 300.21 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... reasonable number of years after first meeting the Administrator's other costs. (ii) The rates must be based...
18 CFR 300.21 - Final confirmation and approval.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Final confirmation and approval. 300.21 Section 300.21 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... reasonable number of years after first meeting the Administrator's other costs. (ii) The rates must be based...
2016-06-01
characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA... rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira
ERIC Educational Resources Information Center
Lauckner, Heidi; Paterson, Margo; Krupa, Terry
2012-01-01
Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimates are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to support an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.
Influence de la pression de mise en forme sur le detourage de stratifies carbone/epoxy
NASA Astrophysics Data System (ADS)
Coulon, Pierre
The need to reduce the weight of structures has led to an increasing use of composite materials in the aerospace industry. To meet the required tolerances and quality, manufacturing processes must adapt to these new materials. Machining is one of the processes that needs to be optimized to control final part quality. This experimental study aims at understanding the relationship between the manufacturing parameters of quasi-isotropic carbon fibre laminates and their machinability. After a preliminary study, it was concluded that curing pressure in the autoclave was the most influential manufacturing parameter. The pressure is linked, experimentally, to the void content, then to the mechanical properties, and finally to the cutting forces. The research methodology is based on a classic multifactorial design of experiments in which the input factors are the curing pressure, feed rate and cutting speed. This study confirms the correlation between curing pressure and void content as well as the relationship between curing pressure and mechanical properties. The new element of this study is the correlation between curing pressure and cutting forces during trimming. This last point is interesting because it leads to the development of a predictive model for cutting forces. Although the results of this study are hardly generalizable to other materials, the prediction of cutting forces is possible. Quality after machining is also studied through two criteria: roughness measurement and evaluation of delamination. Roughness is measured using roughness depth measuring equipment, optimized to make best use of this technique. The study confirms the patterns already observed, without being able to improve the characterization of cutting quality. Keywords: composites, trimming, curing pressure, cutting forces, void content, ILSS, delamination, roughness.
Technology Assessment for Powertrain Components Final Report CRADA No. TC-1124-95
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tokarz, F.; Gough, C.
LLNL utilized its defense technology assessment methodologies in combination with its capabilities in the energy, manufacturing, and transportation technologies to demonstrate a methodology that synthesized available but incomplete information on advanced automotive technologies into a comprehensive framework.
Roadway safety analysis methodology for Utah : final report.
DOT National Transportation Integrated Search
2016-12-01
This research focuses on the creation of a three-part Roadway Safety Analysis methodology that applies and automates the cumulative work of recently-completed roadway safety research. The first part is to prepare the roadway and crash data for analys...
Feminist methodologies and engineering education research
NASA Astrophysics Data System (ADS)
Beddoes, Kacey
2013-03-01
This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.
76 FR 47301 - Medicare Program; Hospice Wage Index for Fiscal Year 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... aggregate cap calculation methodology. This final rule will also revise the hospice requirement for a face-to-face encounter for recertification of a patient's terminal illness. Finally, this final rule will..., (410) 786-2120 for questions regarding hospice wage index and hospice face-to-face requirement. Katie...
2015-02-01
5202, Draft Final (Alexandria, VA: IDA, April 2015), 10-4. 14 North Atlantic Treaty Organization (NATO) Standardization Agency (NSA), NATO Glossary of... Belgium: NSA, 2012), 2-C-2. 15 Disraelly et al., “A New Methodology for CBRN Casualty Estimation,” 228. 16 Disraelly et al., A Methodology for... 20 NATO NSA, AAP-06, 2-K-1. 21 Ibid., 2-D-6. 22 Disraelly et al., A Methodology for Examining Collateral Effects on Military Operations during
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-14
... Health Administration (OSHA); Department of Labor. ACTION: Final rule; confirmation of effective date. SUMMARY: OSHA is confirming the effective date of its direct final rule (DFR) revising the employee...(VI)). In the March 17, 2010, DFR document, OSHA stated that the DFR would become effective on June 15...
Conceptual and Preliminary Design of a Low-Cost Precision Aerial Delivery System
2016-06-01
test results. It includes an analysis of the failure modes encountered during flight experimentation, methodology used for conducting coordinate... and experimentation. Additionally, the current and desired end state of the research is addressed. Finally, this chapter outlines the methodology... preliminary design phases are utilized to investigate and develop a potentially low-cost alternative to existing systems. Using an Agile methodology
It is Time the United States Air Force Changes the way it Feeds its Airmen
2008-03-01
narrative, phenomenology, ethnography, case study, and grounded theory. In purpose, these strategies are... methodology) the research will be analyzed. Methodology: A qualitative research methodology, and specifically a case study strategy for the... well as theory building in chapter five. Finally, in regards to reliability, Yin's (2003) case study protocol guidance was used as a means to
Load and resistance factor rating (LRFR) in NYS : volume II final report.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...
Retinal image registration for eye movement estimation.
Kolar, Radim; Tornow, Ralf P; Odstrcilik, Jan
2015-01-01
This paper describes a novel methodology for eye fixation measurement using a unique videoophthalmoscope setup and advanced image registration approach. The representation of the eye movements via Poincare plot is also introduced. The properties, limitations and perspective of this methodology are finally discussed.
Load and resistance factor rating (LRFR) in NYS : volume I final report.
DOT National Transportation Integrated Search
2011-09-01
This report develops a Load and Resistance Factor Rating (NYS-LRFR) methodology for New York bridges. The methodology is applicable for the rating of existing bridges, the posting of under-strength bridges, and checking Permit trucks. The proposed LR...
Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen
2015-01-01
The current key challenge in the floating offshore wind turbine industry and research is on designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. PMID:25583870
Teaching Research Methodology through Active Learning
ERIC Educational Resources Information Center
Lundahl, Brad W.
2008-01-01
To complement traditional learning activities in a masters-level research methodology course, social work students worked on a formal research project which involved: designing the study, constructing measures, selecting a sampling strategy, collecting data, reducing and analyzing data, and finally interpreting and communicating the results. The…
Ball, Lauren; Ball, Dianne; Leveritt, Michael; Ray, Sumantra; Collins, Clare; Patterson, Elizabeth; Ambrosini, Gina; Lee, Patricia; Chaboyer, Wendy
2017-04-01
The methodological designs underpinning many primary health-care interventions are not rigorous. Logic models can be used to support intervention planning, implementation and evaluation in the primary health-care setting. Logic models provide a systematic and visual way of facilitating shared understanding of the rationale for the intervention, the planned activities, expected outcomes, evaluation strategy and required resources. This article provides guidance for primary health-care practitioners and researchers on the use of logic models for enhancing methodological rigour of interventions. The article outlines the recommended steps in developing a logic model using the 'NutriCare' intervention as an example. The 'NutriCare' intervention is based in the Australian primary health-care setting and promotes nutrition care by general practitioners and practice nurses. The recommended approach involves canvassing the views of all stakeholders who have valuable and informed opinions about the planned project. The following four targeted, iterative steps are recommended: (1) confirm situation, intervention aim and target population; (2) document expected outcomes and outputs of the intervention; (3) identify and describe assumptions, external factors and inputs; and (4) confirm intervention components. Over a period of 2 months, three primary health-care researchers and one health-services consultant led the collaborative development of the 'NutriCare' logic model. Primary health-care practitioners and researchers are encouraged to develop a logic model when planning interventions to maximise the methodological rigour of studies, confirm that data required to answer the question are captured and ensure that the intervention meets the project goals.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Secretary of the Interior NATURAL RESOURCE DAMAGE ASSESSMENTS Assessment Plan Phase § 11.37 Must the... methodologies in the Assessment Plan, the authorized official must confirm that at least one of the natural resources identified as potentially injured in the preassessment screen has in fact been exposed to the...
Code of Federal Regulations, 2011 CFR
2011-10-01
... Secretary of the Interior NATURAL RESOURCE DAMAGE ASSESSMENTS Assessment Plan Phase § 11.37 Must the... methodologies in the Assessment Plan, the authorized official must confirm that at least one of the natural resources identified as potentially injured in the preassessment screen has in fact been exposed to the...
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This program report is the final report covering all the work done on this project. The goal of this project is technology transfer of methodologies to improve design process. The specific objectives are: 1. To learn and understand the Probabilistic design analysis using NESSUS. 2. To assign Design Projects to either undergraduate or graduate students on the application of NESSUS. 3. To integrate the application of NESSUS into some selected senior level courses in Civil and Mechanical Engineering curricula. 4. To develop courseware in Probabilistic Design methodology to be included in a graduate level Design Methodology course. 5. To study the relationship between the Probabilistic design methodology and Axiomatic design methodology.
Lorenz, Eric; Schmacht, Maximilian; Senz, Martin
2016-11-01
Economical yeast-based glutathione (GSH) production is a process influenced by several factors, such as raw material and production costs, biomass production, and the efficient biotransformation of adequate precursors into the final product GSH. The use of cysteine for microbial conversion into GSH is current industrial practice. In the following study, the potential of different inducers to increase the GSH content was evaluated by means of design-of-experiments methodology. Investigations were carried out with three natural Saccharomyces strains, S. cerevisiae, S. bayanus and S. boulardii, in a well-suited 50 ml shake-tube system. Results of the shake-tube experiments were confirmed in traditional baffled shake flasks and finally via batch cultivation in lab-scale bioreactors under controlled conditions. Comprehensive studies showed that batch-wise biotransformation of cysteine ethyl ester (CEE) into GSH gave a more than 2.2 times higher yield than cysteine as inducer. Additionally, the intracellular GSH content in bioreactors increased significantly for all strains, from 2.29±0.29% with cysteine to 3.65±0.23% with CEE. Thus, the use of CEE provides a highly attractive induction strategy for GSH overproduction. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Goaller, C.; Doutreluingne, C.; Berton, M.A.
2007-07-01
This paper describes the methodology followed by the French Atomic Energy Commission (CEA) to decommission the buildings of former research facilities for demolition or possible reuse. It is a well known fact that the French nuclear safety authority has decided not to define any general release level for the decommissioning of nuclear facilities, thus effectively prohibiting radiological measurement-driven decommissioning. The decommissioning procedure therefore requires an intensive in-depth examination of each nuclear plant. This requires a good knowledge of the past history of the plant, and should be initiated as early as possible. The paper first describes the regulatory framework recently unveiled by the French Safety Authority, then reviews its application to ongoing decommissioning projects. The cornerstone of the strategy is the definition of waste zoning in the buildings to segregate areas producing conventional waste from those generating nuclear waste. After dismantling, suitable measurements are carried out to confirm the conventional state of the remaining walls. This requires low-level measurement methods providing a suitable detection limit within an acceptable measuring time. Although this generally involves particle counting and in-situ low-level gamma spectrometry, the paper focuses on γ spectrometry. Finally, the lessons learned from ongoing projects are discussed. (authors)
Helal-Neto, Edward; Cabezas, Santiago Sánchez; Sancenón, Félix; Martínez-Máñez, Ramón; Santos-Oliveira, Ralph
2018-05-10
The use of monoclonal antibodies (Mab) in current medicine is increasing. Antibody-drug conjugates (ADCs) represent an increasingly important modality for treating several types of cancer. In this area, the use of Mab associated with nanoparticles is a valuable strategy. However, the methodology used to calculate Mab entrapment efficiency and content is extremely expensive. In this study we developed and tested a novel, very simple one-step methodology to calculate monoclonal antibody entrapment in mesoporous silica (magnetic core) nanoparticles, using the radiolabeling process as the primary methodology. The magnetic-core mesoporous silica were successfully developed and characterised. PXRD analysis at high angles confirmed the presence of magnetic cores in the structures, and transmission electron microscopy allowed the structure size to be determined (58.9 ± 8.1 nm). From the isotherm curve, a specific surface area of 872 m2/g was estimated, along with a pore volume of 0.85 cm3/g and an average pore diameter of 3.15 nm. The radiolabeling process used for the indirect determination performed well: trastuzumab was successfully labeled (>97%) with Tc-99m, generating a clear suspension. Moreover, almost all the Tc-99m used (labeling the trastuzumab) remained trapped on the surface of the mesoporous silica for a period as long as 8 h. The indirect methodology demonstrated high entrapment of Tc-99m-trastuzumab on the magnetic-core mesoporous silica surface. The results confirmed the potential of the indirect entrapment-efficiency methodology using the radiolabeling process as a one-step, easy and cheap methodology. Copyright © 2018 Elsevier B.V. All rights reserved.
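The indirect determination described above amounts to a simple activity balance: the fraction of radiolabeled antibody activity retained on the particle fraction relative to the activity added. A minimal sketch of that calculation, with an illustrative function name and example numbers that are not from the study:

```python
def entrapment_efficiency(bound_activity, total_activity):
    """Percent of radiolabeled antibody retained by the nanoparticles.

    bound_activity: activity measured on the particle fraction
    total_activity: activity of the labeled antibody initially added
    (both in the same units, e.g. counts or MBq)
    """
    if total_activity <= 0:
        raise ValueError("total activity must be positive")
    return 100.0 * bound_activity / total_activity

# Illustrative numbers only (not values reported in the study):
eff = entrapment_efficiency(bound_activity=9.7, total_activity=10.0)  # 97.0 %
```

The appeal of the indirect approach is that a single gamma-counting step replaces the protein-quantification assays usually needed to measure antibody loading.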
Bossard, N; Descotes, F; Bremond, A G; Bobin, Y; De Saint Hilaire, P; Golfier, F; Awada, A; Mathevet, P M; Berrerd, L; Barbier, Y; Estève, J
2003-11-01
The prognostic value of cathepsin D has recently been recognized, but as with many quantitative tumor markers, its clinical use remains unclear, partly because of methodological issues in defining cut-off values. Guidelines have been proposed for analyzing quantitative prognostic factors, underlining the need to keep data continuous instead of categorizing them. Flexible approaches, parametric and non-parametric, have been proposed to improve knowledge of the functional form relating a continuous factor to the risk. We studied the prognostic value of cathepsin D in a retrospective hospital cohort of 771 patients with breast cancer, and focused our overall survival analysis, based on Cox regression, on two flexible approaches: smoothing splines and fractional polynomials. We also determined a cut-off value from the maximum likelihood estimate of a threshold model. These different approaches complemented each other for (1) identifying the functional form relating cathepsin D to the risk and obtaining a cut-off value, and (2) optimizing the adjustment for complex covariates such as age at diagnosis in the final multivariate Cox model. We found a significant increase in the death rate, reaching 70% with a doubling of the level of cathepsin D, above the threshold of 37.5 pmol/mg. The proper prognostic impact of this marker could be confirmed, and a methodology providing appropriate ways to use markers in clinical practice was proposed.
Small Hot Jet Acoustic Rig Validation
NASA Technical Reports Server (NTRS)
Brown, Cliff; Bridges, James
2006-01-01
The Small Hot Jet Acoustic Rig (SHJAR), located in the Aeroacoustic Propulsion Laboratory (AAPL) at the NASA Glenn Research Center in Cleveland, Ohio, was commissioned in 2001 to test jet noise reduction concepts at low technology readiness levels (TRL 1-3) and develop advanced measurement techniques. The first series of tests on the SHJAR were designed to prove its capabilities and establish the quality of the jet noise data produced. Towards this goal, a methodology was employed dividing all noise sources into three categories: background noise, jet noise, and rig noise. Background noise was directly measured. Jet noise and rig noise were separated by using the distance and velocity scaling properties of jet noise. Effectively, any noise source that did not follow these rules of jet noise was labeled as rig noise. This method led to the identification of a high frequency noise source related to the Reynolds number. Experiments using boundary layer treatment and hot wire probes documented this noise source and its removal, allowing clean testing of low Reynolds number jets. Other tests performed characterized the amplitude and frequency of the valve noise, confirmed the location of the acoustic far field, and documented the background noise levels under several conditions. Finally, a full set of baseline data was acquired. This paper contains the methodology and test results used to verify the quality of the SHJAR rig.
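The separation of jet noise from rig noise rests on the scaling rules the abstract mentions: under classical Lighthill scaling, jet-mixing-noise intensity grows roughly with the eighth power of jet velocity, and spherical spreading gives a 1/r² falloff with distance. A hedged sketch of the expected level shift (the exact exponent used on the SHJAR is not stated in the abstract; 8 is the textbook value):

```python
import math

def jet_noise_spl_shift(v_ratio, r_ratio, velocity_exponent=8):
    """Predicted change in sound pressure level (dB) for a jet-mixing-noise
    source when jet velocity and observer distance change.

    Lighthill-type scaling: intensity ~ V**velocity_exponent;
    spherical spreading: intensity ~ 1/r**2.  A measured source that does
    not follow this shift can be flagged as rig noise, not jet noise.
    """
    return (10 * velocity_exponent * math.log10(v_ratio)
            - 20 * math.log10(r_ratio))

# Doubling jet velocity at fixed distance raises levels by ~24 dB;
# doubling distance at fixed velocity lowers them by ~6 dB.
dv = jet_noise_spl_shift(v_ratio=2.0, r_ratio=1.0)
dr = jet_noise_spl_shift(v_ratio=1.0, r_ratio=2.0)
```

Any narrowband component whose level stays fixed while `dv` or `dr` predicts a change is a candidate rig-noise source, which is how the Reynolds-number-related high-frequency source was isolated.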
Park, Seong Ho; Han, Kyunghwa
2018-03-01
The use of artificial intelligence in medicine is currently an issue of great interest, especially with regard to the diagnostic or predictive analysis of medical images. Adoption of an artificial intelligence tool in clinical practice requires careful confirmation of its clinical utility. Herein, the authors explain key methodology points involved in a clinical evaluation of artificial intelligence technology for use in medicine, especially high-dimensional or overparameterized diagnostic or predictive models in which artificial deep neural networks are used, mainly from the standpoints of clinical epidemiology and biostatistics. First, statistical methods for assessing the discrimination and calibration performances of a diagnostic or predictive model are summarized. Next, the effects of disease manifestation spectrum and disease prevalence on the performance results are explained, followed by a discussion of the difference between evaluating the performance with use of internal and external datasets, the importance of using an adequate external dataset obtained from a well-defined clinical cohort to avoid overestimating the clinical performance as a result of overfitting in high-dimensional or overparameterized classification models and spectrum bias, and the essentials for achieving a more robust clinical evaluation. Finally, the authors review the role of clinical trials and observational outcome studies for ultimate clinical verification of diagnostic or predictive artificial intelligence tools through patient outcomes, beyond performance metrics, and how to design such studies. © RSNA, 2018.
Sibbett, Ruth A; Russ, Tom C; Deary, Ian J; Starr, John M
2017-07-03
Studies investigating the risk factors for or causation of dementia must consider subjects prior to disease onset. To overcome the limitations of prospective studies and self-reported recall of information, the use of existing data is key. This review provides a narrative account of dementia ascertainment methods using sources of existing data. The literature search was performed using: MEDLINE, EMBASE, PsychInfo and Web of Science. Included articles reported a UK-based study of dementia in which cases were ascertained using existing data. Existing data included that which was routinely collected and that which was collected for previous research. After removing duplicates, abstracts were screened and the remaining articles were included for full-text review. A quality tool was used to evaluate the description of the ascertainment methodology. Of the 3545 abstracts screened, 360 articles were selected for full-text review. 47 articles were included for final consideration. Data sources for ascertainment included: death records, national datasets, research databases and hospital records among others. 36 articles used existing data alone for ascertainment, of which 27 used only a single data source. The most frequently used source was a research database. Quality scores ranged from 7/16 to 16/16. Quality scores were better for articles with dementia ascertainment as an outcome. Some papers performed validation studies of dementia ascertainment and most indicated that observed rates of dementia were lower than expected. We identified a lack of consistency in dementia ascertainment methodology using existing data. With no data source identified as a "gold-standard", we suggest the use of multiple sources. Where possible, studies should access records with evidence to confirm the diagnosis. Studies should also calculate the dementia ascertainment rate for the population being studied to enable a comparison with an expected rate.
DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS
2017-10-01
University of Southern California, October 2017, final technical report (contract number FA8750-15-C-0203). The goal of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ
77 FR 53059 - Risk-Based Capital Guidelines: Market Risk
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-30
...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and Federal Deposit Insurance Corporation (FDIC) are revising their market risk capital rules to better capture positions for which the market risk capital rules are appropriate; reduce procyclicality; enhance the rules' sensitivity to risks that are not adequately captured under current methodologies; and increase transparency through enhanced disclosures. The final rule does not include all of the methodologies adopted by the Basel Committee on Banking Supervision for calculating the standardized specific risk capital requirements for debt and securitization positions due to their reliance on credit ratings, which is impermissible under the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. Instead, the final rule includes alternative methodologies for calculating standardized specific risk capital requirements for debt and securitization positions.
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Zotova, N V; Chereshnev, V A; Gusev, E Yu
2016-01-01
We defined systemic inflammation (SI) as a "typical, multi-syndrome, phase-specific pathological process, developing from systemic damage and characterized by the total inflammatory reactivity of endotheliocytes, plasma and blood cell factors, connective tissue and, at the final stage, by microcirculatory disorders in vital organs and tissues." The goal of the work was to determine methodological approaches, and particular methodical solutions, to the problem of identifying SI as a common pathological process. SI can be defined by the presence in plasma of products of systemic proinflammatory cell stress (cytokines and other inflammatory mediators), and also by a complex of other process signs. We developed two scales: (1) the Reactivity Level (RL) scale, from 0 to 5 points: 0 is the normal level; RL-5 confirms the systemic nature of inflammatory mediator release; and RL-2 to RL-4 define different degrees of event probability. (2) The SI scale, which considers additional criteria along with RL and addresses more integral criteria of SI: ≥5 points on the SI scale indicates a high probability of developing SI. To calculate the RL scale, plasma concentrations of four cytokines (IL-6, IL-8, IL-10, TNF-α) and C-reactive protein were examined. Additional criteria of the SI scale were: D-dimer >500 ng/ml, cortisol >1380 or <100 nmol/l, and troponin I ≥0.2 ng/ml and/or myoglobin ≥800 ng/ml. 422 patients with different septic (n=207) and aseptic (n=215) pathologies were included in the study. In 190 of the 422 cases there were signs of SI (lethality 38.4%, n=73). In only 5 of 78 cases was lethality not accompanied by the presence of SI. SI was registered in 100% of cases with septic shock (n=31). There were no significant differences between the AU-ROC of CR, the SI scale and SOFA in predicting death in patients with sepsis and trauma.
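The additional SI-scale criteria are stated explicitly in the abstract and can be sketched as a simple rule check. The cytokine-based RL scoring itself is not specified in the abstract, so only the stated additional cutoffs are encoded here; the function name is illustrative:

```python
def additional_si_criteria(d_dimer_ng_ml, cortisol_nmol_l,
                           troponin_ng_ml, myoglobin_ng_ml):
    """Return the list of additional SI-scale criteria met, using the
    cutoffs stated in the abstract.  The RL (cytokine) score, which these
    criteria supplement, is not computed here.
    """
    met = []
    if d_dimer_ng_ml > 500:                              # D-dimer > 500 ng/ml
        met.append("D-dimer")
    if cortisol_nmol_l > 1380 or cortisol_nmol_l < 100:  # stress-axis extremes
        met.append("cortisol")
    if troponin_ng_ml >= 0.2 or myoglobin_ng_ml >= 800:  # myocardial markers
        met.append("myocardial marker")
    return met

flags = additional_si_criteria(600, 1500, 0.1, 100)
```

Under the SI scale, such flags would be combined with the RL points, and a total of ≥5 points indicates a high probability of SI.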
Microsatellite markers: what they mean and why they are so useful
Vieira, Maria Lucia Carneiro; Santini, Luciane; Diniz, Augusto Lima; Munhoz, Carla de Freitas
2016-01-01
Abstract Microsatellites or Simple Sequence Repeats (SSRs) are extensively employed in plant genetics studies, using both low- and high-throughput genotyping approaches. Motivated by the importance of these sequences over the last decades, this review aims to address some theoretical aspects of SSRs, including their definition, characterization and biological function. The methodologies for the development of SSR loci, genotyping and their applications as molecular markers are also reviewed. Finally, two data surveys are presented. The first was conducted using the main database of Web of Science, prospecting for articles published over the period from 2010 to 2015, resulting in approximately 930 records. The second survey focused on papers aimed at SSR marker development, published in the American Journal of Botany's Primer Notes and Protocols in Plant Sciences (2013 up to 2015), resulting in a total of 87 publications. This scenario confirms the current relevance of SSRs and indicates their continued utilization in plant science. PMID:27561112
Radice, Marcela; Marín, Marcelo; Giovanakis, Marta; Vay, Carlos; Almuzara, Marisa; Limansky, Adriana; Casellas, José M; Famiglietti, Angela; Quinteros, Mirta; Bantar, Carlos; Galas, Marcelo; Kovensky Pupko, Jaime; Nicola, Federico; Pasterán, Fernando; Soloaga, Rolando; Gutkind, Gabriel
2011-01-01
This document contains the recommendations for antimicrobial susceptibility testing of the clinically relevant non-fermenting gram-negative bacilli (NFGNB), adopted after adapting those of international committees to the experience of the Antimicrobial Agents Subcommittee members and invited experts. The document includes an update on NFGNB classification and description, as well as specific notes on natural or frequent antimicrobial resistance and a brief account of associated resistance mechanisms. These recommendations not only suggest the antimicrobial drugs to be evaluated in each case, but also provide an optimization of the disk diffusion layout and a selection of results to be reported. Finally, this document also includes a summary of the different methodological approaches that may be used for detection and confirmation of emerging β-lactamases, such as class A and B carbapenemases.
Bulk and Surface Morphologies of ABC Miktoarm Star Terpolymers Composed of PDMS, PI, and PMMA Arms
Chernyy, Sergey; Kirkensgaard, Jacob Judas Kain; Mahalik, Jyoti P.; ...
2018-02-02
DIM miktoarm star copolymers, composed of polydimethylsiloxane [D], poly(1,4-isoprene) [I], and poly(methyl methacrylate) [M], were synthesized using a newly developed linking methodology with 4-allyl-1,1-diphenylethylene as a linking agent. The equilibrium bulk morphologies of the DIM stars were found to range from [6.6.6] tiling patterns to alternating lamellar and alternating cylindrical morphologies, as determined experimentally by small-angle X-ray scattering and transmission electron microscopy and confirmed by dissipative particle dynamics and self-consistent field theory based arguments. The thin film morphologies, which differ from those found in the bulk, were identified by scanning electron microscopy coupled with oxygen plasma etching. Finally, square arrays of PDMS nanodots and empty-core cylinders were formed on silica after oxygen plasma removal of the poly(1,4-isoprene) and poly(methyl methacrylate), generating nanostructured substrates decorated with these readily observable features.
Strain hardening behavior during manufacturing of tube shapes by hydroforming
NASA Astrophysics Data System (ADS)
Park, Hyun Kyu; Yi, Hyae Kyung; Van Tyne, Chester J.; Moon, Young Hoon
2009-12-01
Safe and robust process design relies on knowledge of the evolution of the mechanical properties in a tube during hydroforming. The manufacturing of tubular shapes generally consists of three main stages: bending, preforming, and expansion. The latter is usually called hydroforming. As a result of these three steps, the final product's strain hardening history is nonlinear. In the present study, the strain hardening behavior during hydroforming was experimentally investigated. The variation of local flow stress and/or local hardness was used as an index of the strain hardening during the various steps and the local flow stress and/or local hardness were used with respective correlations to determine the effective strain. The strain hardening behavior during hydroforming after preforming has been successfully analyzed by using the relationships between hardness, flow stress, and effective strain for variable pre-strains prior to hydroforming. The comparison of predicted hardness with measured hardness confirms that the methodology used in this study is feasible, and that the strain hardening behavior can be quantitatively estimated with good accuracy.
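The study's core idea is that local hardness can be mapped to local flow stress, and flow stress to effective strain, via empirical correlations. A hedged sketch of that inversion using a Hollomon power-law flow curve and a Tabor-type hardness-flow stress proportionality; the constants K, n and c here are illustrative placeholders, not the correlations actually fitted in the study:

```python
def effective_strain_from_hardness(hv, k_mpa=700.0, n=0.2, c_tabor=3.0):
    """Estimate effective strain from a hardness measurement.

    Assumes sigma = K * eps**n (Hollomon flow curve) and HV ≈ c * sigma
    (Tabor-type hardness-flow stress relation).  Illustrative constants
    only; a real analysis would fit K, n and c to tensile and hardness
    data for the tube material.
    """
    sigma = hv / c_tabor               # flow stress inferred from hardness
    return (sigma / k_mpa) ** (1.0 / n)  # invert Hollomon for strain

# Round-trip sanity check: hardness generated from eps = 0.5 maps back to 0.5
hv_demo = 3.0 * 700.0 * 0.5 ** 0.2
eps = effective_strain_from_hardness(hv_demo)
```

Applied pointwise along bend, preform and expansion zones, such a mapping lets hardness surveys reconstruct the nonlinear strain-hardening history the abstract describes.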
Muñoz-Redondo, José Manuel; Cuevas, Francisco Julián; León, Juan Manuel; Ramírez, Pilar; Moreno-Rojas, José Manuel; Ruiz-Moreno, María José
2017-04-05
A quantitative approach using HS-SPME-GC-MS was performed to investigate the ester changes related to the second fermentation in bottle. The contribution of the type of base wine to the final wine style is detailed. Furthermore, a discriminant model was developed based on ester changes according to the second fermentation (with 100% sensitivity and specificity values). The application of a double-check criteria according to univariate and multivariate analyses allowed the identification of potential volatile markers related to the second fermentation. Some of them presented a synthesis-ratio around 3-fold higher after this period and they are known to play a key role in wine aroma. Up to date, this is the first study reporting the role of esters as markers of the second fermentation. The methodology described in this study confirmed its suitability for the wine aroma field. The results contribute to enhance our understanding of this fermentative step.
Current status of gene expression profiling in the diagnosis and management of acute leukaemia.
Bacher, Ulrike; Kohlmann, Alexander; Haferlach, Torsten
2009-06-01
Gene expression profiling (GEP) enables the simultaneous investigation of the expression of tens of thousands of genes and was successfully introduced in leukaemia research a decade ago. Aiming to better understand the diversity of genetic aberrations in acute myeloid leukaemia (AML) and acute lymphoblastic leukaemia (ALL), pioneer studies investigated and confirmed the predictability of many cytogenetic and molecular subclasses in AML and ALL. In addition, GEP can define new prognostic subclasses within distinct leukaemia subgroups, as illustrated in AML with normal karyotype. Another approach is the development of treatment-specific sensitivity assays, which might contribute to targeted therapy studies. Finally, GEP might enable the detection of new molecular targets for therapy in patients with acute leukaemia. Meanwhile, large multicentre studies, e.g. the Microarray Innovations in LEukaemia (MILE) study, prepare for a standardised introduction of GEP in leukaemia diagnostic algorithms, aiming to translate this novel methodology into clinical routine for the benefit of patients with the complex disorders of AML and ALL.
NASA Astrophysics Data System (ADS)
Chen, X. Z.; Zhao, X. H.; Chen, X. P.
2018-03-01
Recently, smoggy weather has become a daily occurrence across large parts of China because of rapid economic growth and accelerating urbanization. Given the smog situation and continued economic growth, green and environment-friendly technology is necessary to reduce or eliminate smog and promote the sustainable development of the economy. Previous studies have confirmed that nitrogen oxides (NOx) are one of the crucial factors that form smog. Microorganisms have the advantages of rapid growth and reproduction and of metabolic diversity, and can collaboratively metabolize various NOx. This study will design a bacteria and algae cultivation system that can collaboratively metabolize nitrogen oxides in air and intervene in the local nitrogen cycle. Furthermore, the nitrogen oxides can be transformed into nitrogen gas, or assimilated into protein in microbial cells, by regulating the microorganism types and quantities and the metabolic pathways in the system. Finally, smog will be alleviated or eliminated through the reduction of nitrogen oxide emissions. This study will produce a green developmental methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khan, E.U.; George, T.L.; Rector, D.R.
The natural circulation tests of the Fast Flux Test Facility (FFTF) demonstrated a safe and stable transition from forced convection to natural convection and showed that natural convection can adequately remove decay heat from the reactor core. The COBRA-WC computer code was developed by the Pacific Northwest Laboratory (PNL) to account for buoyancy-induced coolant flow redistribution and interassembly heat transfer, effects that become important in mitigating temperature gradients and reducing reactor core temperatures when the coolant flow rate in the core is low. This report presents work sponsored by the US Department of Energy (DOE) with the objective of checking the validity of COBRA-WC during the first 220 seconds (sec) of the FFTF natural-circulation (plant-startup) tests using recorded data from two instrumented Fuel Open Test Assemblies (FOTAs). Comparison of COBRA-WC predictions with the FOTA data is a part of the final confirmation of the COBRA-WC methodology for core natural-convection analysis.
NASA Astrophysics Data System (ADS)
Pochwat, Kamil; Słyś, Daniel; Kordana, Sabina
2017-06-01
The paper presents issues relating to the influence of the time distribution of rainfall on the required storage capacity of stormwater reservoirs. The research was based on data derived from simulations of existing drainage systems. The necessary models of the catchments and the drainage system were prepared using the hydrodynamic modelling software SWMM 5.0 (Storm Water Management Model). The research results were used to determine the critical rainfall distribution in time, i.e. the one requiring the greatest stormwater reservoir capacity. In addition, the research confirms that the dimensioning of enclosed structures should rely on synthetically generated critical precipitation, as the characteristics of a developed rainfall vary dynamically in time. In the final part of the paper, the results of the analyses are compared and followed by the ensuing conclusions. The results of the research will influence the development of methodologies for dimensioning retention facilities in drainage systems.
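The search for the critical rainfall distribution described above reduces to a storage mass balance: the required reservoir volume is the maximum cumulative excess of inflow over the permitted outflow. A minimal sketch (not SWMM; the hyetograph values and outflow limit below are hypothetical) showing how two distributions with identical rainfall totals demand very different storage:

```python
def required_storage(inflows, max_outflow):
    """Peak storage needed for a series of inflow volumes per time step
    and a constant allowable outflow per step (same volume units)."""
    storage = peak = 0.0
    for q_in in inflows:
        # Reservoir fills by the excess of inflow over outflow, never below empty.
        storage = max(0.0, storage + q_in - max_outflow)
        peak = max(peak, storage)
    return peak

# Two synthetic 6-step hyetographs with the same total volume (80 m^3):
peaked = [40, 20, 5, 5, 5, 5]       # front-loaded burst
uniform = [15, 15, 15, 15, 10, 10]  # spread out in time
print(required_storage(peaked, 12))   # -> 36.0
print(required_storage(uniform, 12))  # -> 12.0
```

The time distribution alone triples the required capacity here, which is why dimensioning on a single design total, without a critical time distribution, can undersize the reservoir.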
Gustavson, Daniel E; Miyake, Akira; Hewitt, John K; Friedman, Naomi P
2014-06-01
Previous research has revealed a moderate and positive correlation between procrastination and impulsivity. However, little is known about why these two constructs are related. In the present study, we used behavior-genetics methodology to test three predictions derived from an evolutionary account that postulates that procrastination arose as a by-product of impulsivity: (a) Procrastination is heritable, (b) the two traits share considerable genetic variation, and (c) goal-management ability is an important component of this shared variation. These predictions were confirmed. First, both procrastination and impulsivity were moderately heritable (46% and 49%, respectively). Second, although the two traits were separable at the phenotypic level (r = .65), they were not separable at the genetic level (r genetic = 1.0). Finally, variation in goal-management ability accounted for much of this shared genetic variation. These results suggest that procrastination and impulsivity are linked primarily through genetic influences on the ability to use high-priority goals to effectively regulate actions. © The Author(s) 2014.
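The heritability figures reported can be illustrated with Falconer's classic back-of-envelope estimate from twin correlations, h² ≈ 2(r_MZ − r_DZ). The study itself used full biometric twin modeling, not this shortcut, and the twin correlations below are hypothetical, chosen only to reproduce estimates near the reported 46% and 49%:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate of heritability from identical (MZ) and
    fraternal (DZ) twin correlations: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

# Hypothetical twin correlations, not values from the study:
h2_procrastination = falconer_h2(r_mz=0.46, r_dz=0.23)
h2_impulsivity = falconer_h2(r_mz=0.49, r_dz=0.245)
print(h2_procrastination, h2_impulsivity)
```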
Gustavson, Daniel E.; Miyake, Akira; Hewitt, John K.; Friedman, Naomi P.
2014-01-01
Previous research has revealed a moderate positive correlation between procrastination and impulsivity. However, little is known about why these two constructs are related. This study used behavioral genetic methodology to test three predictions derived from an evolutionary account that postulates that procrastination arose as a by-product of impulsivity (Steel, 2010): (a) Procrastination is heritable; (b) the two traits share considerable genetic variation; and (c) goal-management ability is an important component of this shared variation. These predictions were confirmed. First, both procrastination and impulsivity were moderately heritable (46% and 49%, respectively). Second, although the two traits were separable at the phenotypic level (r=.65), they were not separable at the genetic level (rg=1.0). Finally, variation in goal-management ability accounted for much of this shared genetic variation. These results suggest that procrastination and impulsivity are linked primarily through genetic influences on the ability to use high-priority goals effectively to regulate actions. PMID:24705635
Theoretical characterisation of point defects on a MoS2 monolayer by scanning tunnelling microscopy.
González, C; Biel, B; Dappe, Y J
2016-03-11
Different S and Mo vacancies as well as their corresponding antisite defects in a free-standing MoS2 monolayer are analysed by means of scanning tunnelling microscopy (STM) simulations. Our theoretical methodology, based on the Keldysh nonequilibrium Green function formalism within the density functional theory (DFT) approach, is applied to simulate STM images for different voltages and tip heights. Combining the geometrical and electronic effects, all features of the different STM images can be explained, providing a valuable guide for future experiments. Our results confirm previous reports on S atom imaging, but also reveal a strong dependence on the applied bias for vacancies and antisite defects that include extra S atoms. By contrast, when additional Mo atoms cover the S vacancies, the MoS2 gap vanishes and a bias-independent bright protrusion is obtained in the STM image. Finally, we show that the inclusion of these point defects promotes the emergence of reactive dangling bonds that may act as efficient adsorption sites for external adsorbates.
Matha, Denis; Sandner, Frank; Molins, Climent; Campos, Alexis; Cheng, Po Wen
2015-02-28
The current key challenge in the floating offshore wind turbine industry and research is designing economic floating systems that can compete with fixed-bottom offshore turbines in terms of levelized cost of energy. The preliminary platform design, as well as early experimental design assessments, are critical elements in the overall design process. In this contribution, a brief review of current floating offshore wind turbine platform pre-design and scaled testing methodologies is provided, with a focus on their ability to accommodate the coupled dynamic behaviour of floating offshore wind systems. The exemplary design and testing methodology for a monolithic concrete spar platform as performed within the European KIC AFOSP project is presented. Results from the experimental tests compared to numerical simulations are presented and analysed and show very good agreement for relevant basic dynamic platform properties. Extreme and fatigue loads and cost analysis of the AFOSP system confirm the viability of the presented design process. In summary, the exemplary application of the reduced design and testing methodology for AFOSP confirms that it represents a viable procedure during pre-design of floating offshore wind turbine platforms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Lihachev, Alexey; Lihacova, Ilze; Plorina, Emilija V.; Lange, Marta; Derjabo, Alexander; Spigulis, Janis
2018-01-01
A clinical trial on the autofluorescence imaging of skin lesions comprising 16 dermatologically confirmed pigmented nevi, 15 seborrheic keratoses, 2 dysplastic nevi, 17 histologically confirmed basal cell carcinomas and 1 melanoma was performed. The autofluorescence spatial properties of the skin lesions were acquired by a smartphone RGB camera under 405 nm LED excitation. The diagnostic criterion is based on the calculation of the mean autofluorescence intensity of the examined lesion in the spectral range of 515 nm–700 nm. The proposed methodology is able to differentiate seborrheic keratosis from basal cell carcinoma, pigmented nevi and melanoma. The sensitivity and specificity of the proposed method were estimated as being close to 100%. The proposed methodology and potential clinical applications are discussed in this article. PMID:29675324
Lihachev, Alexey; Lihacova, Ilze; Plorina, Emilija V; Lange, Marta; Derjabo, Alexander; Spigulis, Janis
2018-04-01
A clinical trial on the autofluorescence imaging of skin lesions comprising 16 dermatologically confirmed pigmented nevi, 15 seborrheic keratoses, 2 dysplastic nevi, 17 histologically confirmed basal cell carcinomas and 1 melanoma was performed. The autofluorescence spatial properties of the skin lesions were acquired by a smartphone RGB camera under 405 nm LED excitation. The diagnostic criterion is based on the calculation of the mean autofluorescence intensity of the examined lesion in the spectral range of 515 nm-700 nm. The proposed methodology is able to differentiate seborrheic keratosis from basal cell carcinoma, pigmented nevi and melanoma. The sensitivity and specificity of the proposed method were estimated as being close to 100%. The proposed methodology and potential clinical applications are discussed in this article.
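The diagnostic criterion above is a mean-intensity calculation over the lesion area. A minimal sketch (the image, mask, and pixel values are synthetic, and mapping the 515-700 nm window onto the camera's green plus red channels is a simplifying assumption, not the authors' calibration):

```python
def mean_af_intensity(pixels, mask):
    """Mean autofluorescence intensity over masked lesion pixels.
    pixels: HxW grid of (R, G, B) tuples; mask: HxW grid of bools.
    The 515-700 nm band is approximated by averaging G and R channels."""
    vals = [(px[0] + px[1]) / 2.0
            for row_px, row_m in zip(pixels, mask)
            for px, m in zip(row_px, row_m) if m]
    return sum(vals) / len(vals)

# Synthetic 4x4 frame: a dim lesion (weak autofluorescence, as with basal
# cell carcinoma under 405 nm excitation) on brighter surrounding skin.
img = [[(120, 120, 120)] * 4 for _ in range(4)]
mask = [[False] * 4 for _ in range(4)]
for i in (1, 2):
    for j in (1, 2):
        img[i][j] = (40, 40, 40)
        mask[i][j] = True
print(mean_af_intensity(img, mask))  # -> 40.0
```

Classification would then compare this mean against thresholds learned from the confirmed-lesion groups.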
78 FR 26250 - Payment for Home Health Services and Hospice Care to Non-VA Providers
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-06
... Hospice Care to Non-VA Providers AGENCY: Department of Veterans Affairs. ACTION: Final rule. SUMMARY: The Department of Veterans Affairs (VA) amends its regulations concerning the billing methodology for non-VA... billing methodology for non-VA providers of home health services and hospice care. The proposed rulemaking...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lala, J.H.; Nagle, G.A.; Harper, R.E.
1993-05-01
The Maglev control computer system should be designed to verifiably possess high reliability and safety as well as high availability to make Maglev a dependable and attractive transportation alternative to the public. A Maglev control computer system has been designed using a design-for-validation methodology developed earlier under NASA and SDIO sponsorship for real-time aerospace applications. The present study starts by defining the maglev mission scenario and ends with the definition of a maglev control computer architecture. Key intermediate steps included definitions of functional and dependability requirements, synthesis of two candidate architectures, development of qualitative and quantitative evaluation criteria, and analytical modeling of the dependability characteristics of the two architectures. Finally, the applicability of the design-for-validation methodology was also illustrated by applying it to the German Transrapid TR07 maglev control system.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
...), the Department is notifying the public that the final CIT judgment in this case is not in harmony with... methodological and calculation issues from the Final Determination.\\5\\ On remand, the Department recalculated the...
NASA Astrophysics Data System (ADS)
Xie, Shunping; Paau, Man Chin; Zhang, Yan; Shuang, Shaomin; Chan, Wan; Choi, Martin M. F.
2012-08-01
Reverse-phase high-performance liquid chromatographic (RP-HPLC) separation and analysis of polydisperse water-soluble gold nanoclusters (AuNCs) stabilised with N,N'-dimethylformamide (DMF) were investigated. Under optimal elution gradient conditions, the separation of DMF-AuNCs was monitored by absorption and fluorescence spectroscopy. The UV-vis spectral characteristics of the separated DMF-AuNCs have been captured and they do not possess distinct surface plasmon resonance bands, indicating that all DMF-AuNCs are small AuNCs. The photoluminescence emission spectra of the separated DMF-AuNCs are in the blue-light region. Moreover, cationic DMF-AuNCs are for the first time identified by ion chromatography. Our proposed RP-HPLC methodology has been successfully applied to separate AuNCs of various Au atoms as well as DMF-stabilised ligands. Finally, the composition of the separated DMF-AuNCs was confirmed by matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry and electrospray ionisation mass spectrometry, proving that the as-synthesised DMF-AuNCs product consists of Au10+, Au10, Au11, Au12, Au13, and Au14 NCs stabilised with various numbers of DMF ligands.
Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James
2012-10-01
The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.
Nguyen, Ha T.; Pearce, Joshua M.; Harrap, Rob; Barber, Gerald
2012-01-01
A methodology is provided for the application of Light Detection and Ranging (LiDAR) to automated solar photovoltaic (PV) deployment analysis on the regional scale. Challenges in urban information extraction and management for solar PV deployment assessment are determined and quantitative solutions are offered. This paper provides the following contributions: (i) a methodology that is consistent with recommendations from existing literature advocating the integration of cross-disciplinary competences in remote sensing (RS), GIS, computer vision and urban environmental studies; (ii) a robust methodology that can work with low-resolution, incomprehensive data and reconstruct vegetation and buildings separately, but concurrently; (iii) recommendations for the future generation of software. A case study is presented as an example of the methodology. Experience from the case study, such as the trade-off between time consumption and data quality, is discussed to highlight the need for connectivity between demographic information, electrical engineering schemes and GIS, and a typical fraction of solar-useful roofs extracted per method is reported. Finally, conclusions are developed to provide a final methodology to extract the most useful information from the lowest resolution and least comprehensive data to provide solar electric assessments over large areas, which can be adapted anywhere in the world. PMID:22666044
Evaluating Multi-Input/Multi-Output Digital Control Systems
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood T.; Mukhopadhyay, Vivek
1994-01-01
A controller-performance-evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems was developed. The procedures identify potentially destabilizing controllers and confirm satisfactory performance of stabilizing ones. The methodology is generic and can be used in many types of multi-loop digital-controller applications, including digital flight-control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. It is also applicable to other complex, highly dynamic digital controllers, such as those in high-performance robot systems.
Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F
2014-11-27
Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally.
We have used the information gathered in this study to adapt an existing framework for impact of clinical research for use in methodological research. Gathering evidence on research impact of methodological research from a variety of sources has enabled us to obtain multiple indicators and thus to demonstrate broad impacts of methodological research. The adapted framework developed can be applied to future methodological research and thus provides a tool for methodologists to better assess and report research impacts.
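The "citation rates per year since publication" indicator used above is a simple descriptive statistic. A minimal sketch of that computation (the publication records and counts below are hypothetical, not the study's 74 publications):

```python
def citations_per_year(records, current_year=2014):
    """Citation rate per year since publication for each output.
    records: iterable of (title, publication_year, total_citations)."""
    rates = {}
    for title, year, cites in records:
        years_since = max(1, current_year - year)  # avoid division by zero
        rates[title] = cites / years_since
    return rates

# Hypothetical outputs of a methodology research programme:
pubs = [
    ("Methodology paper A", 2009, 50),
    ("Methodology paper B", 2012, 12),
]
print(citations_per_year(pubs))  # A: 10.0/yr, B: 6.0/yr
```

As the abstract notes, such rates say little on their own; the interviews and email queries supply the qualitative context.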
DOT National Transportation Integrated Search
2010-10-01
The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...
The National Visitor Use Monitoring methodology and final results for round 1
S.J. Zarnoch; E.M. White; D.B.K. English; Susan M. Kocis; Ross Arnold
2011-01-01
A nationwide, systematic monitoring process has been developed to provide improved estimates of recreation visitation on National Forest System lands. Methodology is presented to provide estimates of site visits and national forest visits based on an onsite sampling design of site-days and last-exiting recreationists. Stratification of the site days, based on site type...
ERIC Educational Resources Information Center
Nickerson, Carol A.; McClelland, Gary H.
1988-01-01
A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)
78 FR 24047 - Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program, Part 2
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-24
... Employment and Training Administration 20 CFR Part 655 RIN 1205-AB69 Wage Methodology for the Temporary Non-Agricultural Employment H- 2B Program, Part 2 AGENCY: Employment and Training Administration, Labor; U.S. Citizenship and Immigration Services, DHS. ACTION: Interim final rule; request for comments. SUMMARY: The...
ERIC Educational Resources Information Center
Vineberg, Robert; Joyner, John N.
Instructional System Development (ISD) methodologies and practices were examined in the Army, Navy, Marine Corps, and Air Force, each of which prescribes the ISD system involving rigorous derivation of training requirements from job requirements, selection of instructional strategies to maximize training efficiency, and revision of instruction…
ERIC Educational Resources Information Center
Roman, Elliott M.
The Alternative Learning Methodologies through Academics Project (Project ALMA) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation in two high schools in Queens and the Bronx (New York). The program served 436 Spanish-speaking students, most of whom were of limited English proficiency.…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-12
... a post-preliminary analysis in which we altered the cost-of- production methodology from that which... scope of the order is dispositive. Alternative Cost Methodology In our Preliminary Results we relied on... Results, 75 FR at 12516), and we compared the home-market prices to POR costs for the cost-of-production...
Treatment of Farm Families under Need Analysis for Student Aid. Final Report.
ERIC Educational Resources Information Center
National Computer Systems, Inc., Arlington, VA.
In response to Congressional request, this report compares the treatment of student financial aid applicants from farm families and non-farm families under two need-analysis formulae. Both the need-analysis methodology for Pell Grants and the Congressional Methodology (CM) for other federal aid calculate ability to pay as a function of income and…
Installation Restoration Program. Confirmation/Quantification Stage 1. Phase 2
1985-03-07
INSTALLATION RESTORATION PROGRAM. PHASE II - CONFIRMATION/QUANTIFICATION, STAGE 1. Kirtland AFB, New Mexico 87117. Prepared by Science Applications International Corporation, 505 Marquette NW, Suite 1200, Albuquerque, New Mexico 87102. March 1985. Final report, February 1983 to March 1985. Headquarters Military Airlift Command, Command Surgeon's Office (HQ MAC).
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite element modeling engineering practice.
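The reliability-plus-sensitivity core of such a method can be illustrated with a first-order second-moment sketch on the simplest limit state, g = R − S, with independent normal strength R and stress S, checked against Monte Carlo as the paper does. This is a simplified stand-in for the paper's perturbation/response-surface/Edgeworth machinery, and all numbers below are hypothetical:

```python
import math
import random

def beta_and_sensitivity(mu_r, sd_r, mu_s, sd_s):
    """Reliability index beta for g = R - S (independent normals) and the
    analytic sensitivity of beta to the mean strength mu_r."""
    sd_g = math.hypot(sd_r, sd_s)          # std dev of g
    beta = (mu_r - mu_s) / sd_g
    d_beta_d_mu_r = 1.0 / sd_g             # d(beta)/d(mu_r)
    return beta, d_beta_d_mu_r

def mc_failure_prob(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=1):
    """Crude Monte Carlo estimate of P(g < 0) for comparison."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0
                for _ in range(n))
    return fails / n

beta, sens = beta_and_sensitivity(mu_r=500.0, sd_r=40.0, mu_s=350.0, sd_s=30.0)
pf_analytic = 0.5 * math.erfc(beta / math.sqrt(2))  # Phi(-beta)
pf_mc = mc_failure_prob(500.0, 40.0, 350.0, 30.0)
print(beta, sens, pf_analytic, pf_mc)  # beta = 3.0, Pf ~ 1.35e-3
```

The sensitivity term (here d beta / d mu_r) is what the RSBDO loop feeds back into the design optimization.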
NASA Astrophysics Data System (ADS)
Huff, A. E.; Skinner, J. A.
2018-06-01
Final progress report on the 1:1,500,000-scale mapping of western Libya Montes and northwestern Tyrrhena Terra. The final unit names, labels, and descriptions are reported as well as the methodology for age determinations and brief geologic history.
ERIC Educational Resources Information Center
Etaio, Iñaki; Churruca, Itziar; Rada, Diego; Miranda, Jonatan; Saracibar, Amaia; Sarrionandia, Fernando; Lasa, Arrate; Simón, Edurne; Labayen, Idoia; Martinez, Olaia
2018-01-01
European Frame for Higher Education has led universities to adapt their teaching schemes. Degrees must train students in competences including specific and cross-curricular skills. Nevertheless, there are important limitations to follow skill improvement through the consecutive academic years. Final-year dissertation (FYD) offers the opportunity…
76 FR 65631 - Energy Conservation Program: Test Procedures for Microwave Ovens
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... Conservation Program: Test Procedures for Microwave Ovens AGENCY: Office of Energy Efficiency and Renewable... (DOE) has initiated a test procedure rulemaking to develop active mode testing methodologies for... Federal Register a final rule for the microwave oven test procedure rulemaking (July TP repeal final rule...
The influences of implementing state-mandated science assessment on teacher practice
NASA Astrophysics Data System (ADS)
Katzmann, Jason Matthew
Four high school Biology teachers, two novice and two experienced, participated in a year-and-a-half case study. By utilizing a naturalistic paradigm, the four individuals were studied in their natural environment, their classrooms. Data sources included three semi-structured interviews, classroom observation field notes, and classroom artifacts. Through cross-case analysis and a constant comparative methodology, coding nodes were combined and refined, resulting in the final themes for discussion. The following research question was investigated: what is the impact of high-stakes testing on high school Biology teachers' instructional planning, instructional practices and classroom assessments? Seven final themes were realized: Assessment, CSAP, Planning, Pressure, Standards, Teaching and Time. Each theme was developed and discussed utilizing each participant's voice. Trustworthiness of this study was established via five avenues: triangulation of data sources, credibility, transferability, dependability and confirmability. A model of the influences of high-stakes testing on teacher practice was developed to describe the seven themes (Figure 5). This model serves as an illustration of the complex nature of teacher practice and the influences upon it. The four participants in this study were influenced by high-stakes assessment. It influenced their instructional decisions, assessment practices, use of time, and planning decisions, and decreased the amount of inquiry that occurred in their classrooms. Implications of this research and future research directions are described.
Analysis of La Dehesa paleo-landslide. Central Pre-Andes of Argentina
NASA Astrophysics Data System (ADS)
Tapia Baldis, Carla; Rothis, Luis Martín; Perucca, Laura; Esper Angillieri, María; Vargas, Horacio; Ponce, David; Allis, Carlos
2018-04-01
The main objective of this paper is to consider the influence of Quaternary faults as a likely triggering factor for rockslide occurrence in the Central Pre-Andes, a region with intense shallow seismic activity. A rockslide deposit was selected as the study case, located on the western flank of the La Dehesa and Talacasto (DT) range (31°3′37″ S and 68°46′8″ W). The applied methodology includes the characterization of the main discontinuities, reconstruction of the topography using a high-resolution digital elevation model, safety factor calculation along the sliding surface and estimation of Newmark displacements for three different hypothetical seismic scenarios, recreated from existing local neotectonic information. The limit-equilibrium results confirm that the study case, the La Dehesa rockslide (LDR), had a stable and safe slope configuration under static conditions. However, a horizontal seismic coefficient between 0.2 and 0.3 decreases the safety factor below the safety threshold. Newmark displacements for the different reconstructed seismic scenarios vary between 4.1 and 15.9 cm, values consistent with a coherent failure process, likely triggered by Pleistocene to Holocene seismogenic sources in the Central Pre-Andes. The LDR trigger can be assigned mainly to an earthquake related to activity of the La Dehesa Quaternary fault (LDF); however, similar movements produced by neighbouring faults should not be discarded. Triggering of the LDR by climatic conditions is ruled out. Finally, the methodology presented in this work is easy to reproduce and may be applied to other rockslides located in the mountainous areas of the Central Pre-Andes of Argentina.
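The pseudostatic check described above (safety factor dropping below threshold for a horizontal seismic coefficient k of 0.2-0.3) can be sketched for a planar sliding block, together with Newmark's classic critical acceleration a_c = (FS_static − 1)·g·sin α. The geometry, weight and strength values below are hypothetical illustrations, not the LDR case data:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def factor_of_safety(weight, alpha_deg, phi_deg, cohesion, area, k=0.0):
    """Planar-block limit equilibrium: FS = resisting / driving forces,
    with a horizontal pseudostatic seismic coefficient k."""
    a = math.radians(alpha_deg)
    phi = math.radians(phi_deg)
    normal = weight * (math.cos(a) - k * math.sin(a))   # normal force on plane
    driving = weight * (math.sin(a) + k * math.cos(a))  # downslope force
    return (cohesion * area + normal * math.tan(phi)) / driving

def critical_acceleration(fs_static, alpha_deg):
    """Newmark (1965) sliding-block threshold acceleration, m/s^2."""
    return (fs_static - 1.0) * G * math.sin(math.radians(alpha_deg))

# Hypothetical block: 5 MN weight, 30 deg plane, phi = 35 deg,
# 50 kPa cohesion acting over 20 m^2.
fs_static = factor_of_safety(5e6, 30, 35, 50e3, 20)
fs_seismic = factor_of_safety(5e6, 30, 35, 50e3, 20, k=0.25)
print(fs_static, fs_seismic)  # stable statically, ~1.0 at k = 0.25
print(critical_acceleration(fs_static, 30))
```

With k = 0.25 the illustrative block sits right at the failure threshold, mirroring the paper's finding that coefficients of 0.2-0.3 push the slope below safety.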
Fish and chips: Various methodologies demonstrate utility of a 16,006-gene salmonid microarray
von Schalburg, Kristian R; Rise, Matthew L; Cooper, Glenn A; Brown, Gordon D; Gibbs, A Ross; Nelson, Colleen C; Davidson, William S; Koop, Ben F
2005-01-01
Background We have developed and fabricated a salmonid microarray containing cDNAs representing 16,006 genes. The genes spotted on the array have been stringently selected from Atlantic salmon and rainbow trout expressed sequence tag (EST) databases. The EST databases presently contain over 300,000 sequences from over 175 salmonid cDNA libraries derived from a wide variety of tissues and different developmental stages. In order to evaluate the utility of the microarray, a number of hybridization techniques and screening methods have been developed and tested. Results We have analyzed and evaluated the utility of a microarray containing 16,006 (16K) salmonid cDNAs in a variety of potential experimental settings. We quantified the amount of transcriptome binding that occurred in cross-species, organ complexity and intraspecific variation hybridization studies. We also developed a methodology to rapidly identify and confirm the contents of a bacterial artificial chromosome (BAC) library containing Atlantic salmon genomic DNA. Conclusion We validate and demonstrate the usefulness of the 16K microarray over a wide range of teleosts, even for transcriptome targets from species distantly related to salmonids. We show the potential of the use of the microarray in a variety of experimental settings through hybridization studies that examine the binding of targets derived from different organs and tissues. Intraspecific variation in transcriptome expression is evaluated and discussed. Finally, BAC hybridizations are demonstrated as a rapid and accurate means to identify gene content. PMID:16164747
Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina
2017-01-01
Appropriate dietary intake can have a significant influence on athletic performance. There is a growing consensus on sports nutrition and professionals working with athletes often provide dietary education. However, due to the limitations of existing sports nutrition knowledge questionnaires, previous reports of athletes' nutrition knowledge may be inaccurate. An updated questionnaire has been developed based on a recent review of sports nutrition guidelines. The tool has been validated using a robust methodology that incorporates relevant techniques from classical test theory (CTT) and Item response theory (IRT), namely, Rasch analysis. The final questionnaire has 89 questions and six sub-sections (weight management, macronutrients, micronutrients, sports nutrition, supplements, and alcohol). The content and face validity of the tool have been confirmed based on feedback from expert sports dietitians and university sports students, respectively. The internal reliability of the questionnaire as a whole is high (KR = 0.88), and most sub-sections achieved an acceptable internal reliability. Construct validity has been confirmed, with an independent T-test revealing a significant ( p < 0.001) difference in knowledge scores of nutrition (64 ± 16%) and non-nutrition students (51 ± 19%). Test-retest reliability has been assured, with a strong correlation ( r = 0.92, p < 0.001) between individuals' scores on two attempts of the test, 10 days to 2 weeks apart. Three of the sub-sections fit the Rasch Unidimensional Model. The final version of the questionnaire represents a significant improvement over previous tools. Each nutrition sub-section is unidimensional, and therefore researchers and practitioners can use these individually, as required. Use of the questionnaire will allow researchers to draw conclusions about the effectiveness of nutrition education programs, and differences in knowledge across athletes of varying ages, genders, and athletic calibres.
ERIC Educational Resources Information Center
Ashton, Karen
2016-01-01
This paper reflects on the methodology used in international comparative education surveys by conducting a systematic review of the European Survey on Language Competences (ESLC). The ESLC was administered from February to March 2011, with final results released in June 2012. The survey tested approximately 55,000 students across 14 European…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobi, Rober
2007-03-28
This Topical Report (#6 of 9) consists of the figures 3.6-13 to (and including) 3.6-18 (and appropriate figure captions) that accompany the Final Technical Progress Report entitled: “Innovative Methodology for Detection of Fracture-Controlled Sweet Spots in the Northern Appalachian Basin” for DOE/NETL Award DE-AC26-00NT40698.
Kagkadis, K A; Rekkas, D M; Dallas, P P; Choulis, N H
1996-01-01
In this study, a complex of ibuprofen and hydroxypropyl-β-cyclodextrin was prepared using a freeze-drying method. The production parameters and the final specifications of the product were optimized using response surface methodology. The results show that the freeze-dried complex meets the solubility requirements to be considered as a possible injectable form.
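Response surface methodology of this kind typically fits a second-order polynomial to the design points and solves for the stationary point. A minimal sketch with two generic factors and synthetic data, not the study's actual freeze-drying measurements:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b11*x1^2
    + b22*x2^2 + b12*x1*x2 (a second-order response surface)."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    return b

def stationary_point(b):
    """Solve grad = 0 of the fitted surface: the candidate optimum."""
    H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
    g = -np.array([b[1], b[2]])
    return np.linalg.solve(H, g)
```

Whether the stationary point is a maximum, minimum, or saddle is read off the sign of the Hessian H, as in any RSM analysis.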
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-26
... excludes (1) polyethylene bags that are not printed with logos or store names and that are closeable with... comparison methodology to TCI's targeted sales and the average-to-average comparison methodology to TCI's non... average-to-average comparison method does not account for such price differences and results in the...
ERIC Educational Resources Information Center
Jovanovic, Aleksandar; Jankovic, Anita; Jovanovic, Snezana Markovic; Peric, Vladan; Vitosevic, Biljana; Pavlovic, Milos
2015-01-01
The paper describes the delivery of the courses in the framework of the project implementation and presents the effect the change in the methodology had on student performance as measured by final grade. Methodology: University of Pristina piloted blended courses in 2013 under the framework of the Tempus BLATT project. The blended learning…
One Controller at a Time (1-CAT): A mimo design methodology
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Lucas, J. C.
1987-01-01
The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses of complex systems. Ensuing chapters provide the technical details that underlie the approach we have taken in performing the safety analysis for the IAPR concept.
Post-Doctoral Fellowship for Merton S. Krause. Final Report.
ERIC Educational Resources Information Center
Jackson, Philip W.
The final quarter of Krause's fellowship year was spent in completing his interviews with political socialization researchers in the eastern United States and his work on methodological problems. Krause also completed a long essay on the nature and implications of the "matrix perspective" for research planning, pursued his study of measurement…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project comprises five volumes. This volume presents the study conclusions, summarizes the methodology used (more detail is found in Volume 3), discusses four case study applications of the model, and contains profiles of coastal communities in an appendix.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... Development and Research, HUD. ACTION: Notice of Final Fiscal Year (FY) 2012 Fair Market Rents (FMRs). SUMMARY... program staff. Questions on how to conduct FMR surveys or concerning further methodological explanations... Economic Affairs, Office of Policy Development and Research, telephone 202-708-0590. Persons with hearing...
2014-08-06
This final rule will update the prospective payment rates for Medicare inpatient hospital services provided by inpatient psychiatric facilities (IPFs). These changes will be applicable to IPF discharges occurring during the fiscal year (FY) beginning October 1, 2014 through September 30, 2015. This final rule will also address implementation of ICD-10-CM and ICD-10-PCS codes, finalize a new methodology for updating the cost of living adjustment (COLA), and finalize new quality measures and reporting requirements under the IPF quality reporting program.
I. DEVELOPMENTAL METHODOLOGY AS A CENTRAL SUBDISCIPLINE OF DEVELOPMENTAL SCIENCE.
Card, Noel A
2017-06-01
This first chapter introduces the main goals of the monograph and previews the remaining chapters. The goals of this monograph are to provide summaries of our current understanding of advanced developmental methodologies, provide information that can advance our understanding of human development, identify shortcomings in our understanding of developmental methodology, and serve as a flagpost for organizing developmental methodology as a subdiscipline within the broader field of developmental science. The remaining chapters in this monograph address issues in design (sampling and big data), longitudinal data analysis, and issues of replication and research accumulation. The final chapter describes the history of developmental methodology, considers how the previous chapters in this monograph fit within this subdiscipline, and offers recommendations for further advancement. © 2017 The Society for Research in Child Development, Inc.
2017-08-03
This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2018 as required by the statute. As required by section 1886(j)(5) of the Social Security Act (the Act), this rule includes the classification and weighting factors for the IRF prospective payment system's (IRF PPS) case-mix groups and a description of the methodologies and data used in computing the prospective payment rates for FY 2018. This final rule also revises the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) diagnosis codes that are used to determine presumptive compliance under the "60 percent rule," removes the 25 percent payment penalty for inpatient rehabilitation facility patient assessment instrument (IRF-PAI) late transmissions, removes the voluntary swallowing status item (Item 27) from the IRF-PAI, summarizes comments regarding the criteria used to classify facilities for payment under the IRF PPS, provides for a subregulatory process for certain annual updates to the presumptive methodology diagnosis code lists, adopts the use of height/weight items on the IRF-PAI to determine patient body mass index (BMI) greater than 50 for cases of single-joint replacement under the presumptive methodology, and revises and updates measures and reporting requirements under the IRF quality reporting program (QRP).
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
This study proposes and tests an integrated methodology combining Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for assessing safety, process, and technology aspects of robotic surgery. The integrated methodology applies specific HTA techniques together with typical reliability engineering models such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The analysis covered 44 robotic procedures: 28 in urology and 16 in general surgery. The main outcomes concern the comparative evaluation of robotic, laparoscopic, and open surgery; the risk analysis and mitigation interventions derive from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm shorter surgical times for robotics compared with the open technique, as well as the clinical benefits of robotics in urology. The situation is more complex for general surgery, where the only directly measured clinical benefit of robotics is a lower blood transfusion rate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Sanders
2006-09-01
Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper discusses the development and revision of these methodologies, the metrological characteristics of the final methodologies, and the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.
Optimal Trajectories Generation in Robotic Fiber Placement Systems
NASA Astrophysics Data System (ADS)
Gao, Jiuchun; Pashkevich, Anatol; Caro, Stéphane
2017-06-01
The paper proposes a methodology for optimal trajectory generation in robotic fiber placement systems. A strategy to tune the parameters of the optimization algorithm at hand is also introduced. The presented technique transforms the original continuous problem into a discrete one in which time-optimal motions are generated by dynamic programming. The developed tuning strategy substantially reduces the computing time and yields trajectories satisfying industrial constraints. The feasibility and advantages of the proposed methodology are confirmed by an application example.
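The discretization-plus-dynamic-programming step described above can be illustrated compactly. This is a generic sketch, not the authors' implementation: candidate robot configurations per waypoint and the transition-time matrices are hypothetical.

```python
import numpy as np

def dp_min_time(stage_costs):
    """Dynamic programming over a discretized trajectory.
    stage_costs[t] is an (n_prev, n_next) matrix of transition times
    between candidate robot configurations at consecutive waypoints.
    Returns the minimal total time and the optimal configuration indices."""
    best = np.zeros(stage_costs[0].shape[0])  # cost-to-come; start states free
    back = []
    for C in stage_costs:
        tot = best[:, None] + C               # cost through each predecessor
        back.append(tot.argmin(axis=0))       # best predecessor per state
        best = tot.min(axis=0)
    path = [int(best.argmin())]               # backtrack from the cheapest end state
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return float(best.min()), path[::-1]
```

Each stage matrix would, in a real system, encode joint-limit and collision constraints by setting infeasible transitions to infinity.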
Optically stimulated luminescence dating of sediments
NASA Astrophysics Data System (ADS)
Troja, S. O.; Amore, C.; Barbagallo, G.; Burrafato, G.; Forzese, R.; Geremia, F.; Gueli, A. M.; Marzo, F.; Pirnaci, D.; Russo, M.; Turrisi, E.
2000-04-01
Optically stimulated luminescence (OSL) dating methodology was applied to the coarse grain fraction (100–500 μm) of quartz crystals (green light stimulated luminescence, GLSL) and feldspar crystals (infrared stimulated luminescence, IRSL) taken from sections at different depths of cores bored in various coastal lagoons (Longarini, Cuba, Bruno) on the south-east coast of Sicily. The results obtained give a sequence of congruent relative ages and maximum absolute ages compatible with the sedimentary structure, thus confirming the excellent potential of the methodology.
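The final age computation in OSL dating reduces to dividing the equivalent dose by the environmental dose rate. A minimal sketch with illustrative numbers, not the Sicilian core data:

```python
def osl_age_ka(equivalent_dose_gy, dose_rate_gy_per_ka, de_err=0.0, dr_err=0.0):
    """OSL age (ka) = equivalent dose (Gy) / environmental dose rate
    (Gy/ka), with simple quadrature propagation of the two uncertainties."""
    age = equivalent_dose_gy / dose_rate_gy_per_ka
    rel_err = ((de_err / equivalent_dose_gy) ** 2
               + (dr_err / dose_rate_gy_per_ka) ** 2) ** 0.5
    return age, age * rel_err
```

For example, an equivalent dose of 10 Gy at a dose rate of 2 Gy/ka gives an age of 5 ka; the separate GLSL and IRSL measurements each feed their own equivalent-dose estimate into this relation.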
2016-08-24
Final report, covering July 2011 to May 2016. Title: Experimental Confirmation of an Aquatic... The report expounds on the robot's propulsive wake, which is being investigated in order to determine whether the theoretical hydrodynamics compare to the experimentally...
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kooymana, Timothée; Buiron, Laurent; Rimpault, Gérald
2017-09-01
Heterogeneous loading of minor actinides in radial blankets is a potential solution for implementing minor actinide transmutation in fast reactors. However, to compensate for the lower flux level experienced by the blankets, the fraction of minor actinides loaded in the blankets must be increased to maintain acceptable performance. This severely increases the decay heat and neutron source of the blanket assemblies, both before and after irradiation, by more than an order of magnitude in the case of the neutron source, for instance. We propose here an optimization methodology for the blanket design with regard to parameters such as the local spectrum or the mass to be loaded, with the objective of minimizing the final neutron source of the spent assembly while maximizing the transmutation performance of the blankets. In a first stage, an analysis of the various contributors to the long- and short-term neutron and gamma sources is carried out; in a second stage, relevant estimators are designed for use in the optimization process itself, which is performed in the last step. A comparison with core calculations is finally made for completeness and validation purposes. It is found that the use of a moderated spectrum in the blankets can be beneficial in terms of the final neutron and gamma sources without impacting minor actinide transmutation performance, compared to the more energetic spectrum that could be achieved using metallic fuel, for instance. It is also confirmed that, if possible, the use of hydrides as moderating material in the blankets is a promising option for limiting the total minor actinide inventory in the fuel cycle. If not, it appears that focus should be put on an increased residence time for the blankets rather than an increase in the acceptable neutron source for handling and reprocessing.
WHT follow-up observations of extremely metal-poor stars identified from SDSS and LAMOST
NASA Astrophysics Data System (ADS)
Aguado, D. S.; González Hernández, J. I.; Allende Prieto, C.; Rebolo, R.
2017-09-01
Aims: We have identified several tens of extremely metal-poor star candidates from SDSS and LAMOST, which we follow up with the 4.2 m William Herschel Telescope (WHT) to confirm their metallicity. Methods: We followed a robust two-step methodology. We first analyzed the SDSS and LAMOST spectra. A first set of stellar parameters was derived from these spectra with the FERRE code, taking advantage of the continuum shape to determine the atmospheric parameters, in particular the effective temperature. Second, we selected interesting targets for follow-up observations, some of them with very low-quality SDSS or LAMOST data. We then obtained and analyzed higher-quality medium-resolution spectra taken with the Intermediate dispersion Spectrograph and Imaging System (ISIS) on the WHT to arrive at a second, more reliable set of atmospheric parameters. This allowed us to derive the metallicity with accuracy, and we confirm the extremely metal-poor nature in most cases. In this second step we also employed FERRE, but we took a running mean to normalize both the observed and the synthetic spectra, and therefore the final parameters do not rely on having an accurate flux calibration or continuum placement. We have analyzed with the same tools and following the same procedure six well-known metal-poor stars, five of them at [Fe/H] < -4, to verify our results. This showed that our methodology is able to derive accurate metallicity determinations down to [Fe/H] < -5.0. Results: The results for these six reference stars give us confidence in the metallicity scale for the rest of the sample. In addition, we present 12 new extremely metal-poor candidates: 2 stars at [Fe/H] ≃ -4, 6 more in the range -4 < [Fe/H] < -3.5, and 4 more at -3.5 < [Fe/H] < -3.0. Conclusions: We conclude that we can reliably determine metallicities for extremely metal-poor stars with a precision of 0.2 dex from medium-resolution spectroscopy with our improved methodology.
This provides a highly effective way of verifying candidates from lower quality data. Our model spectra and the details of the fitting algorithm are made public to facilitate the standardization of the analysis of spectra from the same or similar instruments. The model spectra are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/605/A40
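The running-mean normalization described in the Methods (applied identically to the observed and synthetic spectra so that the fit does not depend on flux calibration or continuum placement) can be sketched as follows; the window length here is an arbitrary illustrative choice, not the value used by the authors:

```python
import numpy as np

def running_mean_normalize(flux, window=51):
    """Divide a spectrum by its running mean so that a subsequent fit
    is insensitive to the continuum shape; apply the same operation to
    both observed and synthetic spectra before comparing them."""
    kernel = np.ones(window) / window
    smooth = np.convolve(flux, kernel, mode="same")
    return flux / smooth
```

Because the operation is linear in the flux, multiplying a spectrum by any smooth calibration factor leaves the normalized result essentially unchanged, which is exactly the property the two-step methodology exploits.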
Dipnall, Joanna F.
2016-01-01
Background Atheoretical large-scale data mining techniques using machine learning algorithms have promise in the analysis of large epidemiological datasets. This study illustrates the use of a hybrid methodology for variable selection that took account of missing data and complex survey design to identify key biomarkers associated with depression from a large epidemiological study. Methods The study used a three-step methodology amalgamating multiple imputation, a machine learning boosted regression algorithm and logistic regression, to identify key biomarkers associated with depression in the National Health and Nutrition Examination Survey (2009–2010). Depression was measured using the Patient Health Questionnaire-9 and 67 biomarkers were analysed. Covariates in this study included gender, age, race, smoking, food security, Poverty Income Ratio, Body Mass Index, physical activity, alcohol use, medical conditions and medications. The final imputed weighted multiple logistic regression model included possible confounders and moderators. Results After the creation of 20 imputation data sets from multiple chained regression sequences, machine learning boosted regression initially identified 21 biomarkers associated with depression. Using traditional logistic regression methods, including controlling for possible confounders and moderators, a final set of three biomarkers was selected. The final three biomarkers from the novel hybrid variable selection methodology were red cell distribution width (OR 1.15; 95% CI 1.01, 1.30), serum glucose (OR 1.01; 95% CI 1.00, 1.01) and total bilirubin (OR 0.12; 95% CI 0.05, 0.28). Significant interactions were found between total bilirubin and the Mexican American/Hispanic group (p = 0.016), and with current smokers (p < 0.001).
Conclusion The systematic use of a hybrid methodology for variable selection, fusing data mining techniques using a machine learning algorithm with traditional statistical modelling, accounted for missing data and complex survey sampling methodology and was demonstrated to be a useful tool for detecting three biomarkers associated with depression for future hypothesis generation: red cell distribution width, serum glucose and total bilirubin. PMID:26848571
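The three-step shape of the hybrid pipeline (imputation, machine-learning screening, traditional logistic regression) can be sketched in a few lines. Everything below is a deliberately simplified stand-in, not the study's code: single mean imputation replaces multiple chained imputation, a correlation screen replaces boosted regression, the complex survey weights are ignored, and the data are synthetic.

```python
import numpy as np

def impute_mean(X):
    """Crude single mean imputation (stand-in for multiple chained imputation)."""
    X = X.copy()
    for j in range(X.shape[1]):
        col = X[:, j]
        col[np.isnan(col)] = np.nanmean(col)
    return X

def screen_features(X, y, keep=3):
    """Rank candidate biomarkers by absolute correlation with the outcome
    (a crude stand-in for the boosted-regression screening step)."""
    r = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    return np.argsort(r)[::-1][:keep]

def fit_logistic(X, y, iters=200, lr=0.1):
    """Plain gradient-ascent logistic regression on the screened subset."""
    Xb = np.column_stack([np.ones(len(y)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w
```

In the real study each of the 20 imputed datasets would be screened and modelled, with the results pooled; the sketch only shows the flow of one pass.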
Wang, Yan; Xi, Chengyu; Zhang, Shuai; Yu, Dejian; Zhang, Wenyu; Li, Yong
2014-01-01
Conducting the government tendering process electronically is becoming inevitable for numerous governmental agencies seeking to extend the advantages of conventional tendering. Developing an effective web-based bid evaluation methodology, so as to realize an efficient and effective government E-tendering (GeT) system, is therefore imperative. This paper first investigates the potential of employing fuzzy analytic hierarchy process (AHP) along with fuzzy gray relational analysis (GRA) for optimal selection of candidate tenderers in the GeT process, considering a hybrid fuzzy environment with incomplete weight information. We propose a novel hybrid fuzzy AHP-GRA (HFAHP-GRA) method that combines an extended fuzzy AHP with a modified fuzzy GRA. The extended fuzzy AHP, which combines typical AHP with interval AHP, is used to obtain exact weight information, and the modified fuzzy GRA is applied to aggregate different types of evaluation information so as to identify the optimal candidate tenderers. Finally, a prototype system is built and validated with an illustrative example for GeT to confirm the feasibility of our approach. PMID:25057506
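As one concrete ingredient of the AHP side of such a method, criteria weights can be derived from a pairwise comparison matrix. The sketch below uses the crisp geometric-mean method and Saaty's consistency ratio as a simplified stand-in for the paper's extended fuzzy AHP:

```python
import numpy as np

def ahp_weights(P):
    """Criteria weights from a pairwise comparison matrix P via the
    geometric-mean method (a crisp simplification of fuzzy AHP)."""
    g = np.prod(P, axis=1) ** (1.0 / P.shape[1])
    return g / g.sum()

def consistency_ratio(P):
    """Saaty consistency ratio CR = CI / RI with CI = (lmax - n)/(n - 1);
    RI values tabulated for small n. CR < 0.1 is conventionally acceptable."""
    n = P.shape[0]
    lmax = np.linalg.eigvals(P).real.max()
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    return ((lmax - n) / (n - 1)) / ri if ri else 0.0
```

The fuzzy and interval extensions in the paper replace the crisp entries of P with fuzzy numbers but keep this overall weight-derivation structure.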
Hot-stage microscopy for determination of API particles in a formulated tablet.
Simek, Michal; Grünwaldová, Veronika; Kratochvíl, Bohumil
2014-01-01
Although methods exist to readily determine the particle size distribution (PSD) of an active pharmaceutical ingredient (API) before its formulation into a final product, the primary challenge is to develop a method to determine the PSD of APIs in a finished tablet. To address the limitations of existing PSD methods, we used hot-stage microscopy to observe tablet disintegration during temperature change and, thus, reveal the API particles in a tablet. Both mechanical and liquid disintegration were evaluated after we had identified optimum milling time for mechanical disintegration and optimum volume of water for liquid disintegration. In each case, hot-stage micrographs, taken before and after the API melting point, were compared with image analysis software to obtain the PSDs. Then, the PSDs of the APIs from the disintegrated tablets were compared with the PSDs of raw APIs. Good agreement was obtained, thereby confirming the robustness of our methodology. The availability of such a method equips pharmaceutical scientists with an in vitro assessment method that will more reliably determine the PSD of active substances in finished tablets.
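Once image analysis has returned particle areas from the hot-stage micrographs, converting them to a PSD is straightforward. A minimal sketch, assuming calibrated pixel areas as input; the segmentation step and the actual image-analysis software used in the study are not reproduced here:

```python
import numpy as np

def equivalent_diameters_um(areas_px, um_per_px):
    """Equivalent circular diameter (um) of each detected particle,
    from pixel areas measured on a calibrated micrograph."""
    areas_um2 = np.asarray(areas_px, dtype=float) * um_per_px ** 2
    return np.sqrt(4.0 * areas_um2 / np.pi)

def d_percentiles(diameters, qs=(10, 50, 90)):
    """D10/D50/D90 summary of a particle size distribution."""
    return np.percentile(diameters, qs)
```

Comparing the D10/D50/D90 of particles recovered from the disintegrated tablet against those of the raw API is one simple way to express the "good agreement" reported above.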
NASA Astrophysics Data System (ADS)
Khan, Akhtar; Maity, Kalipada
2018-03-01
This paper explores some of the vital machinability characteristics of commercially pure titanium (CP-Ti) grade 2. Experiments were conducted based on Taguchi's L9 orthogonal array. The selected material was machined on a heavy-duty lathe (Model: HMT NH26) using uncoated carbide inserts in a dry cutting environment. The inserts, designated by ISO as SNMG 120408 (Model: K313) and manufactured by Kennametal, were rigidly mounted on a right-handed tool holder PSBNR 2020K12. Cutting speed, feed rate, and depth of cut were selected as the three input variables, whereas tool wear (VBc) and surface roughness (Ra) were the responses of interest. To confirm good machinability of the workpiece, an optimal parametric combination was obtained with the help of the grey relational analysis (GRA) approach. Finally, a mathematical model based on multiple regression equations was developed to assess the accuracy and acceptability of the proposed methodology. The results indicated that the suggested model is capable of predicting the overall grey relational grade within an acceptable range.
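Grey relational analysis over an L9 design reduces to normalizing the responses, measuring each run's deviation from the ideal sequence, and averaging the grey relational coefficients into a grade per run. A minimal sketch with made-up response values rather than the measured VBc and Ra data; both responses are treated as smaller-the-better, and the distinguishing coefficient ζ = 0.5 is the conventional choice:

```python
import numpy as np

def grey_relational_grade(Y, zeta=0.5):
    """Grey relational grade for a response matrix Y (rows = runs of the
    orthogonal array, columns = responses such as VBc and Ra), with every
    response treated as smaller-the-better; each column must vary."""
    Z = (Y.max(axis=0) - Y) / (Y.max(axis=0) - Y.min(axis=0))  # 1 = best
    delta = 1.0 - Z                       # deviation from the ideal sequence
    coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coef.mean(axis=1)              # grade per experimental run
```

The run with the highest grade is then taken as the optimal parametric combination, and the regression model mentioned above predicts this grade from the cutting parameters.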
Design of a low-bending-loss large-mode-area photonic crystal fiber
NASA Astrophysics Data System (ADS)
Napierala, Marek; Beres-Pawlik, Elzbieta; Nasilowski, Tomasz; Mergo, Pawel; Berghmans, Francis; Thienpont, Hugo
2012-04-01
We present a design of a photonic crystal fiber for high power laser and amplifier applications. Our fiber comprises a core with a diameter larger than 60 μm and exhibits single mode operation when the fiber is bent around a 10 cm radius at a wavelength of 1064 nm. Single mode guidance is enforced by the high loss of higher order modes which exceeds 80 dB/m whereas the loss of the fundamental mode (FM) is lower than 0.03 dB/m. The fiber can therefore be considered as an active medium for compact high power fiber lasers and amplifiers with a nearly diffraction limited beam output. We also analyze our fiber in terms of tolerance to manufacturing imperfections. To do so we employ a statistical design methodology. This analysis reveals those crucial parameters of the fiber that have to be controlled precisely during the fabrication process not to deteriorate the fiber performance. Finally we show that the fiber can be fabricated according to our design and we present experimental results that confirm the expected fiber performance.
Korecká, Lucie; Jankovicová, Barbora; Krenková, Jana; Hernychová, Lenka; Slováková, Marcela; Le-Nell, Anne; Chmelik, Josef; Foret, Frantisek; Viovy, Jean-Louis; Bilková, Zusana
2008-02-01
We report an efficient and streamlined way to improve the analysis and identification of peptides and proteins in complex mixtures of soluble proteins, cell lysates, etc. By combining the shotgun proteomics methodology with bioaffinity purification, we can remove or minimize interfering contaminants in a complex tryptic digest and so avoid time-consuming separation steps before the final MS analysis. We have shown that enzymatic fragmentation (endoproteinases with Arg-C or Lys-C specificity) coupled with the isolation of specific peptides yields a simplified peptide mixture that makes identification of the entire protein easier. A new bioaffinity sorbent was developed for this purpose. Anhydrotrypsin (AHT), an inactive form of trypsin with an affinity for peptides with arginine (Arg) or lysine (Lys) at the C-terminus, was immobilized onto micro/nanoparticles with superparamagnetic properties (silica magnetite particles (SiMAG)-Carboxyl, Chemicell, Germany). This AHT carrier, with a determined binding capacity of 26.8 nmol/mg of carrier, was tested with a model peptide, human neurotensin, and the resulting MS spectra confirmed the validity of this approach.
Economic Models for Projecting Industrial Capacity for Defense Production: A Review
1983-02-01
macroeconomic forecast to establish the level of civilian final demand; all use the DoD Bridge Table to allocate budget category outlays to industries. Civilian...output table.’ 3. Macroeconomic Assumptions and the Prediction of Final Demand All input-output models require as a starting point a prediction of final... macroeconomic fore- cast of GNP and its components and (2) a methodology to transform these forecast values of consumption, investment, exports, etc. into
Methodology of management of dredging operations II. Applications.
Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D
2006-04-01
This paper presents the new methodology of management of dredging operations. Derived in part from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the strengths and complementarities of the previous approaches. The methodology was applied at the Port of Dunkirk (France). A characterization of the site's sediments allowed a zoning of the Port to be established into zones of probable sediment homogeneity. Moreover, sources of pollution were identified with a view to prevention. Options for the beneficial reuse of dredged material were also developed to meet regional needs, from the perspective of competitive and territorial intelligence. Their development required a pooling of resources between professionals, research centres and local communities, according to the principles of industrial ecology. Lastly, a Multi-Criteria Decision-Making Aid (MCDMA) tool was used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications confirmed the relevance of this methodology for the management of dredging operations.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-03
... Development and Research, HUD. ACTION: Notice of Final Fiscal Year (FY) 2014 Fair Market Rents (FMRs). SUMMARY.... Questions on how to conduct FMR surveys or concerning further methodological explanations may be addressed..., Office of Policy Development and Research, telephone 202-708-0590. Persons with hearing or speech...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-10
... Film, Sheet, and Strip From Taiwan: Extension of Time Limit for Final Results of the Antidumping Duty... preliminary results of this review. See Polyethylene Terephthalate Film, Sheet, and Strip From Taiwan... Results and to issue a post-preliminary analysis regarding whether to use an alternate cost methodology...
ERIC Educational Resources Information Center
Storey, Keith
This final report briefly describes activities of a project which developed and evaluated specific natural support intervention procedures to increase the social integration of employees with severe disabilities using single-subject, clique analysis, and social validation methodologies. The project resulted in the publication of 6 journal articles…
ERIC Educational Resources Information Center
Thouvenelle, Suzanne; And Others
The final document in a series on least restrictive environment (LRE) placement for handicapped students summarizes the objectives and findings of the project. Research questions, methodology, and conclusions are reviewed from each of four research activities: state education agency analysis; local education agency analysis; legal analysis; and…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-02
... the programs covered by this investigation, and the methodologies used to calculate the subsidy rates... determination with the final determinations in the companion AD investigations of MCBs from the PRC and Mexico...\\ The Petitioner in the instant investigation is Resco Products Inc. On January 22, 2010, the GOC filed...
Vehicle mass and injury risk in two-car crashes: A novel methodology.
Tolouei, Reza; Maher, Mike; Titheridge, Helena
2013-01-01
This paper introduces a novel methodology, based on disaggregate analysis of two-car crash data, to estimate the partial effects of mass, through the velocity change, on absolute driver injury risk in each of the vehicles involved, where absolute injury risk is defined as the probability of injury given involvement in a two-car crash. The novel aspect of the methodology is that it provides a solution both to the lack of data on vehicle speeds prior to the crash, which are required to calculate the velocity change, and to the lack of information on non-injury two-car crashes in national accident data. These issues have often led to a focus on relative measures of injury risk that are not independent of the risk in the colliding cars. Furthermore, the methodology is used to investigate whether vehicle size has any effect above and beyond that of mass ratio, and whether there are effects associated with the gender and age of the drivers. The methodology was used to analyse two-car crashes to investigate the partial effects of vehicle mass and size on absolute driver injury risk. The results confirmed that in a two-car collision, vehicle mass has a protective effect on the driver of the same vehicle and an aggressive effect on the driver of the colliding vehicle. The results also confirmed a protective effect of vehicle size above and beyond that of vehicle mass for frontal and front-to-side collisions. Copyright © 2012 Elsevier Ltd. All rights reserved.
Veneziano, Domenico; Ahmed, Kamran; Van Cleynenbreugel, Ben S E P; Gözen, Ali Serdar; Palou, Joan; Sarica, Kemal; Liatsikos, Evangelos N; Sanguedolce, Francesco; Honeck, Patrick; Alvarez-Maestro, Mario; Papatsoris, Athanasios; Kallidonis, Panagiotis; Greco, Francesco; Breda, Alberto; Somani, Bhaskar
2017-07-10
Background: Simulation-based technical-skill assessment is a core topic of debate, especially in high-risk environments. After the introduction of the E-BLUS exam for basic laparoscopy, no further urological technical training/assessment protocols were developed in Europe. Objective: We describe the methodology used in the development of the novel Endoscopic Stone Treatment step 1 (EST s1) assessment curriculum. Materials and Methods: The "full life cycle curriculum development" template was followed for curriculum development. A cognitive task analysis (CTA) was run to define the most important steps and details of retrograde intrarenal surgery (RIRS), in accordance with the EAU Urolithiasis guidelines. Training tasks were created between April 2015 and September 2015. Tasks and metrics were further analyzed at a consensus meeting with the EULIS board in February 2016. A review, aimed at studying available simulators and their accordance with task requirements, was subsequently run in London in March 2016. After initial feedback and further tests, content validity of this protocol was achieved during EUREP 2016. Results: The EST s1 curriculum development took 23 months. Seventy-two participants tested the 5 preliminary tasks during EUREP 2015, in sessions of 45 minutes each. Likert-scale questionnaires were filled out to score the quality of training. The protocol was modified accordingly, and 25 participants tested the 4 tasks during the hands-on training sessions of the ESUT 2016 congress. Finally, 134 participants took part in the validation study at EUREP 2016. During the same event, 10 experts confirmed content validity by filling out a Likert-scale questionnaire. Conclusion: We described a reliable and replicable methodology that can be followed to develop training/assessment protocols for surgical procedures. The expert consensus meetings, strict adherence to guidelines, and an updated literature search enabled correct development of an Endourology training and assessment protocol. It is the first step towards standardized simulation training in Endourology, with potential for worldwide adoption.
Assessing the quality of the volume-outcome relationship in uro-oncology.
Mayer, Erik K; Purkayastha, Sanjay; Athanasiou, Thanos; Darzi, Ara; Vale, Justin A
2009-02-01
To systematically assess the quality of evidence for the volume-outcome relationship in uro-oncology, and thus facilitate the formulation of health policy within this specialty: implementation of 'Improving Outcome Guidance' has led to centralization of uro-oncology based on published studies supporting a 'higher volume-better outcome' relationship, but improved awareness of methodological drawbacks in health services research has called the strength of this proposed relationship into question. We systematically searched previous relevant reports and extracted all articles from 1980 onwards assessing the volume-outcome relationship for cystectomy, prostatectomy and nephrectomy at the institution and/or surgeon level. Studies were assessed for methodological quality using a previously validated rating system. Where possible, meta-analytical methods were used to calculate overall differences in outcome measures between low- and high-volume healthcare providers. In all, 22 studies were included in the final analysis; 19 of these were published in the last 5 years. Only four studies appropriately explored the effect of both institution and surgeon volume on outcome measures. Mortality and length of stay were the most frequently measured outcomes. The median total quality scores within each of the operation types were 8.5, 9 and 8 for cystectomy, prostatectomy and nephrectomy, respectively (possible maximum score 18). Random-effects modelling showed a higher risk of mortality in low-volume institutions than in high-volume institutions for both cystectomy and nephrectomy (odds ratio 1.88, 95% confidence interval 1.54-2.29, and 1.28, 1.10-1.49, respectively). The methodological quality of volume-outcome research as applied to cystectomy, prostatectomy and nephrectomy is only modest at best. Accepting several limitations, pooled analysis confirms a higher-volume, lower-mortality relationship for cystectomy and nephrectomy.
Future research should focus on the development of a quality framework with a validated scoring system for the bench-marking of data to improve validity and facilitate rational policy-making within the speciality of uro-oncology.
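Random-effects pooling of the kind reported above (odds ratios with 95% confidence intervals for low- versus high-volume providers) is conventionally done with DerSimonian-Laird weights on per-study log odds ratios. A minimal sketch, with hypothetical 2×2 study tables invented purely for illustration (not the review's data):

```python
import math

def log_or_with_var(a, b, c, d):
    """Log odds ratio and its variance from a 2x2 table
    (a, b = deaths/survivors at low volume; c, d = at high volume),
    with the usual 0.5 continuity correction if any cell is zero."""
    if 0 in (a, b, c, d):
        a, b, c, d = (x + 0.5 for x in (a, b, c, d))
    return math.log((a * d) / (b * c)), 1/a + 1/b + 1/c + 1/d

def dersimonian_laird(tables):
    """Pool per-study 2x2 tables into a random-effects odds ratio with 95% CI."""
    ys, vs = zip(*(log_or_with_var(*t) for t in tables))
    w = [1 / v for v in vs]                          # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, ys))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(tables) - 1)) / c)     # between-study variance
    w_re = [1 / (v + tau2) for v in vs]
    y_re = sum(wi * yi for wi, yi in zip(w_re, ys)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(y_re), math.exp(y_re - 1.96 * se), math.exp(y_re + 1.96 * se)

# Hypothetical tables: (deaths_low, survivors_low, deaths_high, survivors_high)
tables = [(30, 270, 18, 282), (25, 475, 15, 485), (12, 188, 8, 192)]
or_pooled, lo, hi = dersimonian_laird(tables)
```

An `or_pooled` above 1 with a confidence interval excluding 1 would correspond to the "higher risk of mortality in low-volume institutions" pattern the review reports.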
Major Upgrades to the AIRS Version-6 Water Vapor Profile Methodology
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John; Iredell, Lena
2015-01-01
This research is a continuation of part of what was shown at the last AIRS Science Team Meeting and the AIRS 2015 NetMeeting. AIRS Version 6 was finalized in late 2012 and is now operational. Version 6 contained many significant improvements in retrieval methodology compared to Version 5. Version 6 retrieval methodology used for the water vapor profile q(p) and ozone profile O3(p) retrievals is basically unchanged from Version 5, or even from Version 4. Subsequent research has made significant improvements in both water vapor and O3 profiles compared to Version 6.
Non-linear forecasting in high-frequency financial time series
NASA Astrophysics Data System (ADS)
Strozzi, F.; Zaldívar, J. M.
2005-08-01
A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested on 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying past price series. In our (off-line) analysis, a positive gain was obtained for all of these series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
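The abstract does not detail the reconstruction step, but state space reconstruction from a scalar series is standardly done with delay-coordinate (Takens) embedding. A minimal sketch with an illustrative sine series and hypothetical embedding parameters (the paper's actual choices are not given here):

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Delay-coordinate embedding: map a scalar series x(t) to vectors
    [x(t), x(t+tau), ..., x(t+(dim-1)*tau)] (Takens' theorem)."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)])

# Toy example: a noiseless sine stands in for an FX mid-price series.
x = np.sin(np.linspace(0, 8 * np.pi, 500))
X = delay_embed(x, dim=3, tau=10)   # 3-D reconstructed state vectors
# A nearest-neighbour forecast would then predict a point's next value
# from the evolution of its closest neighbours in this reconstructed space.
```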
75 FR 69586 - New Animal Drugs for Minor Use and Minor Species
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-15
... companion proposed rule and direct final rule are substantively identical. DATES: This rule is effective... confirming the effective date of the final rule in the Federal Register within 30 days after the comment... will publish a document in the Federal Register withdrawing this direct final rule before its effective...
Vertically aligned carbon nanotubes for microelectrode arrays applications.
Castro Smirnov, J R; Jover, Eric; Amade, Roger; Gabriel, Gemma; Villa, Rosa; Bertran, Enric
2012-09-01
In this work, a methodology for fabricating carbon nanotube-based electrodes by plasma-enhanced chemical vapour deposition has been explored and defined. The final integrated microelectrode-based devices should present specific properties that make them suitable for microelectrode array applications. The methodology has focused on the preparation of highly regular and dense vertically aligned carbon nanotube (VACNT) mats compatible with the standard lithography used in microelectrode array technology.
Sketching Designs Using the Five Design-Sheet Methodology.
Roberts, Jonathan C; Headleand, Chris; Ritsos, Panagiotis D
2016-01-01
Sketching designs has been shown to be a useful way of planning and considering alternative solutions. The use of lo-fidelity prototyping, especially paper-based sketching, can save time and money and converge on better solutions more quickly. However, this design process is often viewed as too informal. Consequently, users do not know how to manage their thoughts and ideas (to first think divergently, and then finally converge on a suitable solution). We present the Five Design-Sheet (FdS) methodology. The methodology enables users to create information visualization interfaces through lo-fidelity methods. Users sketch and plan their ideas, helping them express different possibilities and think through these ideas to consider their potential effectiveness as solutions to the task (sheet 1); they create three principal designs (sheets 2, 3 and 4); before converging on a final realization design that can then be implemented (sheet 5). In this article, we present (i) a review of the use of sketching as a planning method for visualization and the benefits of sketching, (ii) a detailed description of the Five Design-Sheet (FdS) methodology, and (iii) an evaluation of the FdS using the System Usability Scale, along with a case study of its use in industry and experience of its use in teaching.
Development of a Flipped Medical School Dermatology Module.
Fox, Joshua; Faber, David; Pikarsky, Solomon; Zhang, Chi; Riley, Richard; Mechaber, Alex; O'Connell, Mark; Kirsner, Robert S
2017-05-01
The flipped classroom module incorporates independent study in advance of in-class instructional sessions. It is unproven whether this methodology is effective within a medical school second-year organ-system module. We report the development, implementation, and effectiveness of the flipped classroom methodology in a second-year medical student dermatology module at the University of Miami Leonard M. Miller School of Medicine. In a retrospective cohort analysis, we compared attitudinal survey data and mean scores on a 50-item multiple-choice final examination for the second-year medical students who participated in this 1-week flipped course with those of the previous year's traditional, lecture-based course. Each group comprised nearly 200 students. Students' age, sex, Medical College Admission Test scores, and undergraduate grade point averages were comparable between the flipped and traditional classroom students. The flipped module students' mean final examination score of 92.71% ± 5.03% was greater than the traditional module students' score of 90.92% ± 5.51% (P < 0.001). Three of the five most commonly missed questions were identical between the two cohorts. The majority of students preferred the flipped methodology to attending live lectures or watching previously recorded lectures. The flipped classroom can be an effective instructional methodology for a medical school second-year organ-system module.
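As a rough check, a Welch two-sample t statistic can be recovered from the summary statistics alone; the group size of 200 used below is an assumption based on the abstract's "nearly 200 students", so the resulting value is only indicative:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for two independent samples, from summary stats."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)  # standard error of the difference
    return (mean1 - mean2) / se

# Exam scores from the abstract; n = 200 per cohort is an assumption.
t = welch_t(92.71, 5.03, 200, 90.92, 5.51, 200)  # flipped vs traditional
```

A t statistic of this magnitude with ~400 students is consistent with the reported P < 0.001.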
Rojas, Kristians Diaz; Montero, Maria L.; Yao, Jorge; Messing, Edward; Fazili, Anees; Joseph, Jean; Ou, Yangming; Rubens, Deborah J.; Parker, Kevin J.; Davatzikos, Christos; Castaneda, Benjamin
2015-01-01
A methodology to study the relationship between clinical variables [e.g., prostate-specific antigen (PSA) or Gleason score] and cancer spatial distribution is described. Three-dimensional (3-D) models of 216 glands are reconstructed from digital images of whole-mount histopathological slices. The models are deformed onto one prostate model, selected as an atlas, using a combination of rigid, affine, and B-spline deformable registration techniques. Spatial cancer distribution is assessed by counting the number of tumor occurrences among all glands at a given position of the 3-D registered atlas. Finally, a test of the difference between proportions is used to compare spatial distributions. As a proof of concept, we compare spatial distributions from patients with PSA greater and less than 5 ng/ml and from patients older and younger than 60 years. Results suggest that prostate cancer shows a significant difference in the right zone of the prostate between populations with PSA greater and less than 5 ng/ml. Age does not have any impact on the spatial distribution of the disease. The proposed methodology can help in understanding prostate cancer through its spatial distribution and how it changes according to clinical parameters. Finally, this methodology can be easily adapted to other organs and pathologies. PMID:26236756
Considering consumer choice in the economic evaluation of mandatory health programmes: a review.
Parkinson, Bonny; Goodall, Stephen
2011-08-01
Governments are increasing their focus on mandatory public health programmes following positive economic evaluations of their impact. This review aims to examine whether the loss of consumer choice should be included in economic evaluations of mandatory health programmes (MHP). A systematic literature review was conducted to identify economic evaluations of MHP, whether they discuss the impact on consumer choice, and any methodological limitations. Overall, 39 economic evaluations were identified, of which 10 discussed the loss of consumer choice and 6 attempted to place a value on it. Methodological limitations included: measuring the marginal cost of compliance, unavailability of price elasticity estimates, the impact of income effects, double counting of health impacts, biased willingness-to-pay responses, and "protest" responses. Overall, the inclusion of the loss of consumer choice rarely affected the final outcome of a study. The impact of MHP on the loss of consumer choice has largely been ignored in economic evaluations. Its importance remains uncertain owing to its infrequent inclusion and significant methodological limitations. Further research into which methodology is best for valuing the loss of consumer choice, and whether it matters to the final implementation decision, is warranted. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Gyori, Miklos; Stefanik, Krisztina; Kanizsai-Nagy, Ildikó
2015-01-01
A growing body of evidence confirms that mobile digital devices have key potential as assistive/educational tools for people with autism spectrum disorders. The aim of this paper is to outline key aspects of development and evaluation methodologies that build on, and provide, systematic evidence on the effects of using such apps. We rely on the results of two R+D projects, both using quantitative and qualitative methods to support development and to evaluate the developed apps (n=54 and n=22). Analyzing the methodological conclusions from these studies, we outline some guidelines for an 'ideal' R+D methodology, but we also point to important trade-offs between the need for the best systematic evidence and the limitations on development time and costs. We see these trade-offs as a key issue to be resolved in this field.
Hörmeyer, Ina; Renner, Gregor
2013-09-01
For individuals with complex communication needs, one of the most frequent communicative strategies is the co-construction of meaning with familiar partners. This preliminary single-case study gives insight into a special sequential pattern of co-construction processes - the search sequence - particularly in relation to the processes of confirming and denying meanings proposed by familiar interaction partners. Five different conversations between an adult with cerebral palsy and complex communication needs and two familiar co-participants were videotaped and analyzed using the methodology of conversation analysis (CA). The study revealed that confirmations and denials are not simply two alternative actions, but that several possibilities to realize confirmations and denials exist that differ in their frequency and that have different consequences for the sequential context. This study of confirmations and denials demonstrates that co-construction processes are more complex than have previously been documented.
2001-08-28
This final rule implements fee schedules for payment of parenteral and enteral nutrition (PEN) items and services furnished under the prosthetic device benefit, defined in section 1861(s)(8) of the Social Security Act. The authority for establishing these fee schedules is provided by the Balanced Budget Act of 1997, which amended the Social Security Act at section 1842(s). Section 1842(s) of the Social Security Act specifies that statewide or other area wide fee schedules may be implemented for the following items and services still subject to the reasonable charge payment methodology: medical supplies; home dialysis supplies and equipment; therapeutic shoes; parenteral and enteral nutrients, equipment, and supplies; electromyogram devices; salivation devices; blood products; and transfusion medicine. This final rule describes changes made to the proposed fee schedule payment methodology for these items and services and provides that the fee schedules for PEN items and services are effective for all covered items and services furnished on or after January 1, 2002. Fee schedules will not be implemented for electromyogram devices and salivation devices at this time since these items are not covered by Medicare. In addition, fee schedules will not be implemented for medical supplies, home dialysis supplies and equipment, therapeutic shoes, blood products, and transfusion medicine at this time since the data required to establish these fee schedules are inadequate.
ERIC Educational Resources Information Center
Filatov, Ksenia; Pill, Shane
2015-01-01
No literature exists on English teaching efficacy or self-efficacy or on pre-service teachers' English teaching self-efficacy and its relationship to pre-service teacher education. This project addressed this conceptual and methodological gap in current teacher efficacy research literature. Five pre-service English teachers in their final year of…
ERIC Educational Resources Information Center
Townley, Charles T.
As the final report on the National Indian Education Association's (NIEA) Library Project, this document presents the following: (1) an introduction (describes the general condition of American Indian library service, the involvement of NIEA, and the project's objectives and time line); (2) the methodology of Phase I: identification of…
ERIC Educational Resources Information Center
LaHart, David E.; Allen, Rodney F.
This is the final report of a workshop in which selected teachers from Florida public schools learned about energy technology and conservation, and teaching methodology needed to incorporate energy education into existing school curriculum. Participants were teachers of science, social studies, environmental studies, and home economics. During the…
ERIC Educational Resources Information Center
Council for Cultural Cooperation, Strasbourg (France).
The final report of a project group representing the 18 member countries of the Council of Europe Council for Cultural Co-operation investigates education and cultural development of migrants. The report discusses methodology and reasons for selecting an intercultural approach to migrant education in terms of interculturalism's basic elements:…
Advanced and Hybrid Powertrains
and analysis, and to create methodologies for evaluating the true potential of proposed advanced architectures, and optimal control strategies. Finally, experimental studies are being conducted to support
European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.
Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario
2014-08-01
The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, and consequently there is a need for an alternative methodology for detecting this pathogen, particularly in food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation study of a non-proprietary Real-Time PCR-based method that can generate final results the day after sample analysis. It is based on an ISO-compatible enrichment coupled to an easy and inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, and pork meat was selected as the food model. The observed limit of detection was down to 10 CFU per 25 g of sample, with excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results of the Real-Time PCR-based method were compared with those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method represents an excellent alternative to the ISO standard: it shows equally solid performance, dramatically shortens the analytical process, and can easily be implemented routinely by Competent Authority and Food Industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
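The relative accuracy, sensitivity, and specificity figures quoted against ISO 6579:2002 follow the usual ISO 16140-style definitions for comparing an alternative method with a reference. A sketch with hypothetical agreement counts (the trial itself reported 100% on all three metrics):

```python
def method_performance(pa, na, pd, nd):
    """ISO 16140-style comparison of an alternative method against the reference:
    pa = positive by both methods, na = negative by both,
    pd = positive deviation (alt +, ref -), nd = negative deviation (alt -, ref +).
    Exact formula names vary between editions of the standard; these are the
    common textbook forms."""
    total = pa + na + pd + nd
    relative_accuracy = (pa + na) / total
    relative_sensitivity = pa / (pa + nd)   # agreement on reference positives
    relative_specificity = na / (na + pd)   # agreement on reference negatives
    return relative_accuracy, relative_sensitivity, relative_specificity

# Hypothetical counts for illustration: full agreement, as in the trial.
acc, sens, spec = method_performance(pa=60, na=60, pd=0, nd=0)
```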
Gallart-Ayala, H; Courant, F; Severe, S; Antignac, J-P; Morio, F; Abadie, J; Le Bizec, B
2013-09-24
Lipids represent an extended class of substances characterized by such variety and complexity that their unified analysis by liquid chromatography coupled to either high-resolution or tandem mass spectrometry (LC-HRMS or LC-MS/MS) is a real challenge. In the present study, a new versatile methodology combining ultra-high performance liquid chromatography with high-resolution tandem mass spectrometry (UHPLC-HRMS/MS) has been developed for a comprehensive analysis of lipids. Polarity switching and "all ion fragmentation" (AIF) were the two levers exploited to permit the detection and identification of a multi-class, multi-analyte extended range of lipids in a single run. For identification purposes, both higher-energy collision dissociation (HCD) and in-source CID (collision-induced dissociation) fragmentation were evaluated in order to obtain information about the precursor and product ions in the same spectra. This approach provides both class-specific and lipid-specific fragments, enhancing lipid identification. Finally, the developed method was applied to the differential phenotyping of serum samples collected from pet dogs with spontaneous malignant mammary tumors and from healthy controls. A biological signature associated with the presence of cancer was successfully revealed from this lipidome analysis, which will require further investigation and confirmation at a larger scale. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Yang, Hongqin; Xie, Shusen; Li, Hui; Wang, Yuhua
2009-04-01
A new concept and its methodology for studying human meridians are presented, based on rigorous scientific observation of the objective existence of human meridians from the viewpoint of biomedical optics. According to this methodology, the infrared radiant characteristics of acupuncture meridians over the human body and the optical transport properties of light propagating along the meridian are reported. This study thus confirms the existence of acupuncture meridians, sheds new light on an approach to the investigation of human meridians, and offers a new perspective on understanding potential meridian functions such as energy and information transfer and physiological regulation.
Monitoring of artificial water reservoirs in the Southern Brazilian Amazon with remote sensing data
NASA Astrophysics Data System (ADS)
Arvor, Damien; Daher, Felipe; Corpetti, Thomas; Laslier, Marianne; Dubreuil, Vincent
2016-10-01
The agricultural expansion in the Southern Brazilian Amazon has long been singled out for its severe impacts on tropical forests. But the last decade has been marked by a rapid agricultural transition which reduced pressure on forests through (i) the adoption of intensive agricultural practices and (ii) the diversification of activities. However, we suggest that this new agricultural model implies new pressures on the environment, and especially on water resources, since many artificial water reservoirs have been built to ensure crop irrigation, generate energy, farm fish, provide access to water for cattle, or simply for leisure. In this paper, we implemented a method to automatically map artificial water reservoirs based on time series of Landsat images. The method was tested in the county of Sorriso (State of Mato Grosso, Brazil), where we identified 521 water reservoirs by visual inspection of very high resolution images. 68 Landsat-8 images covering 4 scenes in 2015 were pre-classified, and a final class (Terrestrial or Aquatic) was determined for each pixel based on a Dempster-Shafer fusion approach. Results confirmed the potential of the methodology to automatically and efficiently detect water reservoirs in the study area (overall accuracy = 0.952 and Kappa index = 0.904), although the methodology underestimates the total area of water bodies because of the spatial resolution of Landsat images. In the case of Sorriso, we mapped 19.4 km2 of the 20.8 km2 of water reservoirs initially delimited by visual interpretation, i.e. we underestimated the area by 5.9%.
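The per-pixel fusion step can be sketched with Dempster's rule of combination over the two-class frame {Terrestrial, Aquatic}, with a mass reserved for ignorance (the full frame). The per-date mass assignments below are hypothetical, not the paper's actual pre-classification outputs:

```python
from functools import reduce

def ds_combine(m1, m2):
    """Dempster's rule for the frame {T, A}: each mass function maps
    'T', 'A', and 'TA' (ignorance, the whole frame) to masses summing to 1.
    Masses on conflicting singletons ({T} ∩ {A} = ∅) are renormalized away."""
    combined = {'T': 0.0, 'A': 0.0, 'TA': 0.0}
    conflict = 0.0
    for k1, v1 in m1.items():
        for k2, v2 in m2.items():
            if k1 == 'TA':                       # intersection is k2
                combined[k2] += v1 * v2
            elif k2 == 'TA' or k1 == k2:         # intersection is k1
                combined[k1] += v1 * v2
            else:                                # disjoint singletons
                conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence for one pixel from three Landsat acquisitions:
masses = [
    {'T': 0.1, 'A': 0.7, 'TA': 0.2},
    {'T': 0.2, 'A': 0.6, 'TA': 0.2},
    {'T': 0.1, 'A': 0.8, 'TA': 0.1},
]
fused = reduce(ds_combine, masses)
label = 'Aquatic' if fused['A'] > fused['T'] else 'Terrestrial'
```

Fusing across dates in this way lets consistent evidence reinforce a class while a single cloudy or ambiguous acquisition only adds mass to the ignorance set rather than flipping the label.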
Mohamed, Lagzouli; Kettani, Youssfi El; Ali, Aitounejjar; Mohamed, Elyachioui; Mohamed, Jadal
2017-01-01
Glucoamylase is among the most important enzymes in biotechnology. The present study aims to determine better conditions for growth and glucoamylase production by Candida guilliermondii and to reduce the overall cost of the medium, using a Box-Behnken design with one central point and response surface methodology. A three-level Box-Behnken factorial design was carried out to obtain the optimal medium combination of five independent variables: initial pH, soluble starch, urea (CH4N2O), yeast extract and MgSO4. Forty-one randomized media were incubated in flasks on a rotary shaker at 105 rpm for 72 h at 30°C. Biomass production was found to be pH- and starch-dependent, with maximum production at a starch concentration of 8 g L-1 and an initial pH of 6, while maximum glucoamylase production was found at an initial pH of 6.5, 4 g L-1 yeast extract and 6 g L-1 starch; yeast extract and urea were highly significant but interacted negatively. The Box-Behnken factorial design used for the analysis of treatment combinations gave a second-order polynomial regression model with R2 = 0.976 for biomass and R2 = 0.981 for glucoamylase. The final biomass and glucoamylase activity obtained were very close to the calculated values; according to the p-values (p<0.001), the predicted optimal parameters were confirmed, providing a basis for further studies on baking additives and the valorization of starch waste products.
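The second-order polynomial regression underlying a response surface of this kind can be illustrated with an ordinary least-squares fit. For brevity this sketch uses two coded factors and simulated responses on a full 3² factorial rather than the study's five-variable Box-Behnken data; the coefficients and noise level are invented:

```python
import numpy as np

def quadratic_design_matrix(X):
    """Second-order RSM model terms for two coded factors x1, x2:
    1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

rng = np.random.default_rng(0)
# Nine coded runs (e.g. pH, starch at levels -1, 0, +1).
X = np.array([[a, b] for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
true_beta = np.array([10.0, 1.0, 0.5, -0.3, -2.0, -1.5])  # invented coefficients
y = quadratic_design_matrix(X) @ true_beta + rng.normal(0, 0.05, len(X))

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
resid = y - quadratic_design_matrix(X) @ beta
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
# Negative quadratic coefficients (beta[4], beta[5]) indicate an interior
# maximum, whose stationary point estimates the optimum settings.
```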
An ontology-driven, case-based clinical decision support model for removable partial denture design
NASA Astrophysics Data System (ADS)
Chen, Qingxiao; Wu, Ji; Li, Shusen; Lyu, Peijun; Wang, Yong; Li, Miao
2016-06-01
We present initial work toward developing a clinical decision support model for the specific design of removable partial dentures (RPDs) in dentistry. We developed an ontological paradigm to represent knowledge of a patient’s oral conditions and denture component parts. During the case-based reasoning process, a cosine similarity algorithm was applied to calculate similarity values between input patients and standard ontology cases. A group of designs from the most similar cases were output as the final results. To evaluate this model, the output designs of RPDs for 104 randomly selected patients were compared with those selected by professionals. A receiver operating characteristic (ROC) curve was created by plotting true-positive rates against false-positive rates at various threshold settings, and the area under the curve (AUC-ROC) was computed. The precision at position 5 of the retrieved cases was 0.67, and at the top of the curve it was 0.96, both of which are very high. The mean average precision (MAP) was 0.61 and the normalized discounted cumulative gain (NDCG) was 0.74, both of which confirmed the efficient performance of our model. This methodology merits further research and development to match clinical applications for designing RPDs. This paper is organized as follows. After the introduction and description of the basis for the paper, the evaluation and results are presented in Section 2. Section 3 provides a discussion of the methodology and results. Section 4 describes the details of the ontology, similarity algorithm, and application.
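A minimal sketch of the retrieval core: cosine similarity between feature vectors and precision-at-k over the ranked cases. The binary vectors and case names below are hypothetical; the paper's ontology encodes far richer oral-condition features.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length numeric vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def precision_at_k(relevant, ranked, k):
    """Fraction of the top-k retrieved cases that are relevant."""
    return sum(1 for d in ranked[:k] if d in relevant) / k

# Hypothetical binary vectors encoding presence/absence of oral conditions
patient = [1, 0, 1, 1, 0]
cases = {"case_a": [1, 0, 1, 1, 0],
         "case_b": [0, 1, 0, 0, 1],
         "case_c": [1, 0, 1, 0, 0]}
ranked = sorted(cases, key=lambda c: cosine(patient, cases[c]), reverse=True)
p_at_2 = precision_at_k({"case_a", "case_c"}, ranked, 2)
```

MAP and NDCG are computed the same way from the ranked list, averaging precision over relevant positions and discounting gains logarithmically by rank.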
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynn, R.Y.S.; Bolmarcich, J.J.
The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and the OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.
Mogasale, Vittal; Mogasale, Vijayalaxmi V; Ramani, Enusa; Lee, Jung Seok; Park, Ju Yeon; Lee, Kang Sung; Wierzba, Thomas F
2016-01-29
Because the control of typhoid fever is an important public health concern in low- and middle-income countries, improving typhoid surveillance will help in planning and implementing typhoid control activities such as the deployment of new-generation Vi conjugate typhoid vaccines. We conducted a systematic literature review of longitudinal population-based blood culture-confirmed typhoid fever studies from low- and middle-income countries published from 1 January 1990 to 31 December 2013. We quantitatively summarized typhoid fever incidence rates and qualitatively reviewed study methodology that could have influenced rate estimates. We used a meta-analysis approach based on a random effects model to summarize the hospitalization rates. Twenty-two papers presented longitudinal population-based and blood culture-confirmed typhoid fever incidence estimates from 20 distinct sites in low- and middle-income countries. The reported incidence and hospitalization rates, as well as the study methodology, were heterogeneous across the sites. We elucidated how the incidence rates were underestimated in published studies. We summarized six categories of underestimation biases observed in these studies and presented potential solutions. Published longitudinal typhoid fever studies in low- and middle-income countries are geographically clustered, and the methodology employed has a potential for underestimation. Future studies should account for these limitations.
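The random-effects pooling step can be sketched with the DerSimonian-Laird estimator, a standard choice for this kind of meta-analysis (the paper does not specify its exact estimator, and the inputs below are invented study effects, e.g. log hospitalization rates, with their variances).

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effects under a random-effects model using the
    DerSimonian-Laird between-study variance (tau^2) estimator."""
    w = [1 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_star = [1 / (v + tau2) for v in variances]       # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se, tau2

pooled, se, tau2 = dersimonian_laird([1.0, 1.2, 0.8], [0.04, 0.05, 0.04])
```

When Q is below its degrees of freedom, tau² truncates to zero and the pooled estimate coincides with the fixed-effect one.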
Alternative power supply systems for remote industrial customers
NASA Astrophysics Data System (ADS)
Kharlamova, N. V.; Khalyasmaa, A. I.; Eroshenko, S. A.
2017-06-01
The paper addresses the problem of alternative power supply of remote industrial clusters with renewable electric energy generation. Following a comparison of different technologies, consideration is given to the application of wind energy. The authors present a methodology for calculating the mean expected wind generation output, based on the Weibull distribution, which provides an effective express tool for preliminary assessment of the required installed generation capacity. The case study is based on real data, including a database of meteorological information, relief characteristics, power system topology, etc. The wind generation feasibility estimation for a specific territory is followed by power flow calculations using a Monte Carlo methodology. Finally, the paper provides a set of recommendations to ensure safe and reliable power supply for the final customers and, subsequently, sustainable development of regions located far from megalopolises and industrial centres.
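A sketch of the mean expected output calculation: integrate a turbine power curve against the Weibull wind-speed density. The turbine parameters and the Weibull shape/scale values below are illustrative assumptions, not the paper's site data.

```python
import math

def weibull_pdf(v, k, c):
    """Weibull wind-speed density with shape k and scale c (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def mean_power(power_curve, k, c, dv=0.05, vmax=40.0):
    """Expected output: midpoint-rule integral of power(v) * pdf(v)."""
    v, total = dv / 2, 0.0
    while v < vmax:
        total += power_curve(v) * weibull_pdf(v, k, c) * dv
        v += dv
    return total

def turbine(v, cut_in=3.0, rated_v=12.0, cut_out=25.0, rated_p=2000.0):
    """Idealized power curve (kW): cubic ramp between cut-in and rated."""
    if v < cut_in or v > cut_out:
        return 0.0
    if v >= rated_v:
        return rated_p
    return rated_p * (v ** 3 - cut_in ** 3) / (rated_v ** 3 - cut_in ** 3)

p_mean = mean_power(turbine, k=2.0, c=7.0)  # kW, hypothetical site
capacity_factor = p_mean / 2000.0
```

Dividing the cluster's demand by `p_mean` gives a first-cut estimate of the required installed capacity before detailed power flow studies.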
Fractional-order TV-L2 model for image denoising
NASA Astrophysics Data System (ADS)
Chen, Dali; Sun, Shenshen; Zhang, Congrong; Chen, YangQuan; Xue, Dingyu
2013-10-01
This paper proposes a new fractional-order total variation (TV) denoising method, which provides a much more elegant and effective way of treating the problems of algorithm implementation, the ill-posed inverse, regularization parameter selection, and the blocky effect. Two fractional-order TV-L2 models are constructed for image denoising. The majorization-minimization (MM) algorithm is used to decompose these two complex fractional TV optimization problems into a set of linear optimization problems, which can be solved by the conjugate gradient algorithm. The final adaptive numerical procedure is given. Finally, we report experimental results showing that the proposed methodology avoids the blocky effect and achieves state-of-the-art performance. In addition, two medical image processing experiments are presented to demonstrate the validity of the proposed methodology.
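The MM decomposition can be illustrated on a 1-D integer-order TV-L2 problem: each iteration majorizes the TV term by a weighted quadratic and solves the resulting linear system. The paper uses fractional-order differences and conjugate gradients; this sketch uses first differences and a direct solver as a simplified stand-in.

```python
import numpy as np

def tv_denoise_mm(y, lam=2.0, iters=50, eps=1e-8):
    """1-D TV-L2 denoising by majorization-minimization: the TV term
    sum|Dx| is majorized by a quadratic with weights 1/|Dx_k|, so each
    iterate solves (I + lam * D^T W D) x = y."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)        # first-difference operator
    x = y.copy()
    for _ in range(iters):
        w = 1.0 / (np.abs(D @ x) + eps)   # majorizer weights
        A = np.eye(n) + lam * D.T @ (w[:, None] * D)
        x = np.linalg.solve(A, y)
    return x

# Noisy step signal: TV denoising should flatten the noise, keep the edge
y = np.concatenate([np.zeros(20), np.ones(20)])
y += 0.1 * np.random.default_rng(1).normal(size=40)
x = tv_denoise_mm(y)
```

The fractional-order variant replaces `D` with a fractional difference matrix, which is what softens the blocky (staircase) effect of integer-order TV.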
Theoretical Approaches to Political Communication.
ERIC Educational Resources Information Center
Chesebro, James W.
Political communication appears to be emerging as a theoretical and methodological academic area of research within both speech communication and political science. Five complementary approaches to political communication (Machiavellian, iconic, ritualistic, confirmational, and dramatistic) may be viewed as a series of variations which emphasize the…
Ethical and Legal Implications of the Methodological Crisis in Neuroimaging.
Kellmeyer, Philipp
2017-10-01
Currently, many scientific fields such as psychology or biomedicine face a methodological crisis concerning the reproducibility, replicability, and validity of their research. In neuroimaging, similar methodological concerns have taken hold of the field, and researchers are working frantically toward finding solutions for the methodological problems specific to neuroimaging. This article examines some ethical and legal implications of this methodological crisis in neuroimaging. With respect to ethical challenges, the article discusses the impact of flawed methods in neuroimaging research in cognitive and clinical neuroscience, particularly with respect to faulty brain-based models of human cognition, behavior, and personality. Specifically examined is whether such faulty models, when they are applied to neurological or psychiatric diseases, could put patients at risk, and whether this places special obligations on researchers using neuroimaging. In the legal domain, the actual use of neuroimaging as evidence in United States courtrooms is surveyed, followed by an examination of ways that the methodological problems may create challenges for the criminal justice system. Finally, the article reviews and promotes some promising ideas and initiatives from within the neuroimaging community for addressing the methodological problems.
Effects of surface chemistry on hot corrosion life
NASA Technical Reports Server (NTRS)
Fryxell, R. E.; Leese, G. E.
1985-01-01
The primary objective of this program is the development of a hot corrosion life prediction methodology based on a combination of laboratory test data and the evaluation of field service turbine components that show evidence of hot corrosion. The laboratory program comprises burner rig testing by TRW. A summary of results is given for two series of burner rig tests. The life prediction methodology parameters to be appraised in a final campaign of burner rig tests are outlined.
Multidisciplinary accident investigation : volume 1
DOT National Transportation Integrated Search
1976-09-01
The final report of the Multidisciplinary Accident Investigation Team of the Maryland Medical-Legal Foundation, Inc. is presented. The report describes the methodology, results, discussions, conclusions and recommendations pertaining to the investiga...
The methodology of database design in organization management systems
NASA Astrophysics Data System (ADS)
Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.
2017-01-01
The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of presenting the results of analyzing users' information needs and the rationale for the use of classifiers.
Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms
NASA Technical Reports Server (NTRS)
Kurdila, Andrew J.; Sharpley, Robert C.
1999-01-01
This paper presents a final report on Wavelet and Multiresolution Analysis for Finite Element Networking Paradigms. The focus of this research is to derive and implement: 1) wavelet-based methodologies for the compression, transmission, decoding, and visualization of three-dimensional finite element geometry and simulation data in a network environment; 2) methodologies for interactive algorithm monitoring and tracking in computational mechanics; and 3) methodologies for interactive algorithm steering for the acceleration of large-scale finite element simulations. Also included in this report are appendices describing the derivation of wavelet-based particle image velocimetry algorithms and reduced-order input-output models for nonlinear systems obtained by utilizing wavelet approximations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Columbia River System Operation Review
1995-11-01
This Appendix J of the Final Environmental Impact Statement for the Columbia River System discusses impacts on the recreational activities in the region. Major sections include the following: scope and processes; recreation in the Columbia River Basin today - by type, location, participation, user characteristics, factors which affect usage, and managing agencies; recreation analysis procedures and methodology; and alternatives and their impacts.
2018-02-15
...models and approaches are also valid using other invasive and non-invasive technologies. Finally, we illustrate and experimentally evaluate this... 2017 Project Outline: pattern formation diversity in wild microbial societies; experimental and mathematical analysis methodology; skeleton... chemotaxis, nutrient degradation, and the exchange of amino acids between cells. Using both quantitative experimental methods and several theoretical...
Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chang Jae; Han, Seung; Yun, Jae Hee
2015-07-01
Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached at which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond design basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are scarce. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. The analysis technique likewise has the demerit of producing extreme times that are not actually possible.
Thus, the establishment of a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or single components that together cover the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and the final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique.
Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
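The analysis technique reduces to a budget check: the response times allocated to each component on the critical signal path must sum to no more than the analytical response time requirement. All figures below are invented placeholders, not actual APR1400/OPR1000 values.

```python
# Hypothetical response time allocation (ms) along one instrument channel,
# from the sensor to the final actuation device
allocations = {
    "sensor": 300,
    "signal_processing": 150,
    "logic_and_trip": 100,
    "final_actuation_device": 250,
}
requirement_ms = 900  # illustrative analytical response time requirement

total = sum(allocations.values())
margin = requirement_ms - total
# The analysis demonstrates conformance when the total does not exceed
# the requirement; the remaining margin absorbs test uncertainties.
assert total <= requirement_ms, "analyzed response time exceeds requirement"
```

The test technique then verifies the same inequality with measured times, either in a single end-to-end test or by combining per-component tests that together cover the channel.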
Millennial Students' Mental Models of Information Retrieval
ERIC Educational Resources Information Center
Holman, Lucy
2009-01-01
This qualitative study examines first-year college students' online search habits in order to identify patterns in millennials' mental models of information retrieval. The study employed a combination of modified contextual inquiry and concept mapping methodologies to elicit students' mental models. The researcher confirmed previously observed…
Feature Selection and Effective Classifiers.
ERIC Educational Resources Information Center
Deogun, Jitender S.; Choubey, Suresh K.; Raghavan, Vijay V.; Sever, Hayri
1998-01-01
Develops and analyzes four algorithms for feature selection in the context of rough set methodology. Experimental results confirm the expected relationship between the time complexity of these algorithms and the classification accuracy of the resulting upper classifiers. When compared, results of upper classifiers perform better than lower…
The Holy Trinity of Methodological Rigor: A Skeptical View
ERIC Educational Resources Information Center
Coryn, Chris L. S.
2007-01-01
The author discusses validation hierarchies grounded in the tradition of quantitative research that generally consists of the criteria of validity, reliability and objectivity and compares this with similar criteria developed by the qualitative tradition, described as trustworthiness, dependability and confirmability. Although these quantitative…
Integrative Discovery Doing Science.
ERIC Educational Resources Information Center
Harry, Vickie; Belzer, William
1990-01-01
The article details a program in which gifted upper elementary grade students used videomicroscopy in a study of microscopic life in pond water. Each child produced a narrated videotape of a specific species studied. Program evaluation confirmed the motivational benefits of early opportunities with scientific instrumentation and methodology. (DB)
Osorio, Maria Teresa; Haughey, Simon A; Elliott, Christopher T; Koidis, Anastasios
2015-12-15
European Regulation 1169/2011 requires producers of foods that contain refined vegetable oils to label the oil types. A novel, rapid, staged methodology has been developed for the first time to identify common oil species in oil blends. The qualitative method combines Fourier transform infrared (FTIR) spectroscopy to profile the oils with fatty acid chromatographic analysis to confirm the composition of the oils when required. Calibration models and specific classification criteria were developed, and all data were fused into a simple decision-making system. The single-lab validation of the method demonstrated very good performance (96% correct classification, 100% specificity, 4% false positive rate). Only a small fraction of the samples needed to be confirmed, with the majority of oils identified rapidly using only the spectroscopic procedure. The results demonstrate the huge potential of the methodology for a wide range of oil authenticity work. Copyright © 2014 Elsevier Ltd. All rights reserved.
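The staged decision logic can be sketched as follows; the score threshold and oil classes are hypothetical stand-ins for the paper's calibration models and classification criteria.

```python
def classify_oil(spectral_scores, threshold=0.9):
    """Staged decision: accept the FTIR-based class when its score
    clears the threshold; otherwise flag the sample for fatty-acid
    chromatographic (GC) confirmation. Scores are hypothetical."""
    best = max(spectral_scores, key=spectral_scores.get)
    if spectral_scores[best] >= threshold:
        return best, "accepted by FTIR screening"
    return best, "send to chromatographic confirmation"

# A confident FTIR match is accepted outright...
label1, action1 = classify_oil({"sunflower": 0.95, "rapeseed": 0.03, "palm": 0.02})
# ...while an ambiguous one is routed to the confirmatory stage
label2, action2 = classify_oil({"sunflower": 0.50, "rapeseed": 0.40, "palm": 0.10})
```

The staging is what makes the method fast: the expensive chromatographic step runs only for the small fraction of ambiguous samples.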
The Long Exercise Test in Periodic Paralysis: A Bayesian Analysis.
Simmons, Daniel B; Lanning, Julie; Cleland, James C; Puwanant, Araya; Twydell, Paul T; Griggs, Robert C; Tawil, Rabi; Logigian, Eric L
2018-05-12
The long exercise test (LET) is used to assess the diagnosis of periodic paralysis (PP), but LET methodology and normal "cut-off" values vary. To determine optimal LET methodology and cut-offs, we reviewed LET data (abductor digiti minimi (ADM) motor response amplitude, area) from 55 PP patients (32 genetically definite) and 125 controls. Receiver operating characteristic (ROC) curves were constructed and area-under-the-curve (AUC) calculated to compare 1) peak-to-nadir versus baseline-to-nadir methodologies, and 2) amplitude versus area decrements. Using Bayesian principles, optimal "cut-off" decrements that achieved 95% post-test probability of PP were calculated for various pre-test probabilities (PreTPs). AUC was highest for peak-to-nadir methodology and equal for amplitude and area decrements. For PreTP ≤50%, optimal decrement cut-offs (peak-to-nadir) were >40% (amplitude) or >50% (area). For confirmation of PP, our data endorse the diagnostic utility of peak-to-nadir LET methodology using 40% amplitude or 50% area decrement cut-offs for PreTPs ≤50%. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
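The two decrement conventions compared in the study can be sketched directly; the amplitude series below is invented, and the 40% amplitude cut-off follows the paper's recommendation for pre-test probabilities ≤50%.

```python
def decrement(values, from_peak=True):
    """Percent decrement of a CMAP amplitude (or area) series:
    peak-to-nadir (default) or baseline-to-nadir."""
    ref = max(values) if from_peak else values[0]
    return 100.0 * (ref - min(values)) / ref

# Hypothetical serial ADM amplitudes (mV) recorded after long exercise
amplitudes = [8.0, 9.5, 7.0, 5.0, 4.5]
d_peak = decrement(amplitudes)                    # peak-to-nadir
d_base = decrement(amplitudes, from_peak=False)   # baseline-to-nadir
abnormal = d_peak > 40.0  # amplitude cut-off for pre-test probability <= 50%
```

Because the post-exercise peak exceeds baseline, the peak-to-nadir decrement is larger, which is consistent with the study's finding that this convention discriminates best.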
Delamater, Paul L; Shortridge, Ashton M; Messina, Joseph P
2013-08-22
Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state's Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units.
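A sketch of the 2-step scheme: K-means first produces micro-clusters, whose centroids are then agglomerated under Ward's minimum-variance criterion. The toy 2-D data stand in for the paper's patient-utilization and geographic-location features.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain K-means: returns labels and centroids."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels, centers

def ward_merge(centers, sizes, n_final):
    """Agglomerate micro-cluster centroids with Ward's criterion:
    merge the pair whose union least increases within-cluster SSE."""
    groups = [[i] for i in range(len(centers))]
    centers, sizes = list(centers), list(sizes)
    while len(groups) > n_final:
        best, cost = None, np.inf
        for i in range(len(groups)):
            for j in range(i + 1, len(groups)):
                d = sizes[i] * sizes[j] / (sizes[i] + sizes[j]) \
                    * np.sum((centers[i] - centers[j]) ** 2)
                if d < cost:
                    best, cost = (i, j), d
        i, j = best
        n = sizes[i] + sizes[j]
        centers[i] = (sizes[i] * centers[i] + sizes[j] * centers[j]) / n
        sizes[i] = n
        groups[i] += groups[j]
        del groups[j], centers[j], sizes[j]
    return groups

# Two well-separated toy "hospital" populations
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
labels, centers = kmeans(X, 6)                 # step 1: micro-clusters
sizes = [int((labels == j).sum()) for j in range(6)]
keep = [j for j in range(6) if sizes[j] > 0]   # drop empty micro-clusters
groups = ward_merge([centers[j] for j in keep],
                    [sizes[j] for j in keep], n_final=2)  # step 2: Ward
```

In the actual methodology, the final number of groups is not fixed in advance but chosen by a heuristic balancing cluster fit against the practical characteristics of the resulting Hospital Groups.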
Evaluate methodology to determine localized roughness.
DOT National Transportation Integrated Search
2016-03-01
The Texas Department of Transportation implements a smoothness specification based on inertial profile : measurements. This specification includes a localized roughness provision to locate defects on the final : surface based on measured surface prof...
Methodology to design a municipal solid waste generation and composition map: A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallardo, A., E-mail: gallardo@uji.es; Carlos, M., E-mail: mcarlos@uji.es; Peris, M., E-mail: perism@uji.es
Highlights: • To draw a waste generation and composition map of a town, many factors must be taken into account. • The proposed methodology offers two different approaches, depending on the available data, combined with geographic information systems. • The methodology has been applied to a Spanish city with success. • The methodology will be a useful tool to organize municipal solid waste management. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to organize them beforehand. Moreover, the waste generation and composition patterns may vary across the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies that deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available, and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with geographic information systems to present the final results in thematic maps that make them easier to interpret.
The proposed methodology is a useful preliminary tool to organize the MSW collection routes, including selective collection. To verify the methodology, it has been successfully applied to a Spanish town.
According to the CPLL proteome sheriffs, not all aperitifs are created equal!
Lerma-García, María Jesús; D'Amato, Alfonsina; Fasoli, Elisa; Simó-Alfonso, Ernesto Francisco; Righetti, Pier Giorgio
2014-09-01
Combinatorial peptide ligand libraries (CPLLs) have been adopted for investigating the proteome of a popular aperitif in Northern Italy, called "Amaro Branzi", stated to be an infusion of a secret herbal mixture, of which some ingredients are declared on the label, namely Angelica officinalis, Gentiana lutea and orange peel, sweetened by a final addition of honey. In order to assess the genuineness of this commercial liqueur, we have prepared extracts of the three vegetable ingredients, assessed their proteomes, and compared them to the one found in the aperitif. The amaro's proteome was identified via prior capture with CPLLs at two different pH values (2.2 and 4.8). Via mass spectrometry analysis of the recovered fractions, after elution of the captured populations in 4% boiling SDS, we could confirm the presence of the following: six proteins originating from honey, 11 from orange peels, 29 from G. lutea and 46 from A. officinalis (including shared species), plus 33 species which could not be attributed to the other secret ingredients, due to paucity of genomic data on plant proteins, for a total of 93 unique gene products (merging shared proteins). This fully confirmed the genuineness of the product. Considering that most of these species could be present in trace amounts, undetectable by conventional techniques, the CPLL methodology, due to its ability to enhance the signal of trace components up to 3 to 4 orders of magnitude, could represent a powerful tool for investigating the genuineness and natural origin of commercial beverages in order to protect consumers from adulterated products. Copyright © 2014 Elsevier B.V. All rights reserved.
Teacher Team Commitment, Teamwork and Trust: Exploring Associations
ERIC Educational Resources Information Center
Park, Sungmin; Henkin, Alan B.; Egley, Robert
2005-01-01
Purpose: To investigate relationships between teamwork, trust and teacher team commitment. Design/methodology/approach: Research has confirmed the value-added effects of organizational commitment in terms of job performance, organizational effectiveness, and employee retention. This study focused on teacher teams as the unit of analysis, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sacuta, Norm; Young, Aleana; Worth, Kyle
2015-12-22
The IEAGHG Weyburn-Midale CO₂ Monitoring and Storage Project (WMP) began in 2000 with the first four years of research that confirmed the suitability of the containment complex of the Weyburn oil field in southeastern Saskatchewan as a storage location for CO₂ injected as part of enhanced oil recovery (EOR) operations. The first half of this report covers research conducted from 2010 to 2012, under the funding of the United States Department of Energy (contract DEFE0002697), the Government of Canada, and various other governmental and industry sponsors. The work includes more in-depth analysis of various components of a measurement, monitoring and verification (MMV) program through investigation of data on site characterization and geological integrity, wellbore integrity, storage monitoring (geophysical and geochemical), and performance/risk assessment. These results then led to the development of a Best Practices Manual (BPM) providing oilfield and project operators with guidance on CO₂ storage and CO₂-EOR. In 2013, the USDOE and Government of Saskatchewan exercised an optional phase of the same project to further develop and deploy applied research tools, technologies, and methodologies to the data and research at Weyburn with the aim of assisting regulators and operators in transitioning CO₂-EOR operations into permanent storage. This work, detailed in the second half of this report, involves seven targeted research projects – evaluating the minimum dataset for confirming secure storage; additional overburden monitoring; passive seismic monitoring; history-matched modelling; developing proper wellbore design; casing corrosion evaluation; and assessment of post CO₂-injected core samples.
The results from the final and optional phases of the Weyburn-Midale Project confirm the suitability of CO₂-EOR fields for the injection of CO₂, and further, highlight the necessary MMV and follow-up monitoring required for these operations to be considered permanent storage.
Araújo, Jane A M; Esmerino, Erick A; Alvarenga, Verônica O; Cappato, Leandro P; Hora, Iracema C; Silva, Marcia Cristina; Freitas, Monica Q; Pimentel, Tatiana C; Walter, Eduardo H M; Sant'Ana, Anderson S; Cruz, Adriano G
2018-03-01
This study aimed to develop a checklist for good hygiene practices (GHP) for raw material of vegetable origin using the focus group (FG) approach (n = 4). The final checklist for the commercialization of horticultural products totaled 28 questions divided into six blocks: water supply; hygiene, health, and training; waste control; pest control; packaging and traceability; and hygiene of facilities and equipment. The FG methodology proved efficient for elaborating a participatory and objective checklist based on minimum hygiene requirements, serving as a tool for diagnosis, planning, and training in GHP for fresh vegetables, while also contributing to consumer awareness of food safety. The FG methodology provided useful information, grounded in the participants' perceptions and experience, for establishing an easily applied final checklist for GHP.
77 FR 68684 - Updating OSHA Standards Based on National Consensus Standards; Head Protection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
..., 1918, and 1926 [Docket No. OSHA-2011-0184] RIN 1218-AC65 Updating OSHA Standards Based on National Consensus Standards; Head Protection AGENCY: Occupational Safety and Health Administration (OSHA), Labor. ACTION: Final rule; confirmation of effective date. SUMMARY: OSHA is confirming the effective date of its...
Methodology and Results of Mathematical Modelling of Complex Technological Processes
NASA Astrophysics Data System (ADS)
Mokrova, Nataliya V.
2018-03-01
The methodology of system analysis allows us to draw up a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product was confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal hardening mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
Levecke, Bruno; Kaplan, Ray M; Thamsborg, Stig M; Torgerson, Paul R; Vercruysse, Jozef; Dobson, Robert J
2018-04-15
Although various studies have provided novel insights into how to best design, analyze and interpret a fecal egg count reduction test (FECRT), it is still not straightforward to provide guidance that allows improving both the standardization and the analytical performance of the FECRT across a variety of both animal and nematode species. For example, it has been suggested to recommend a minimum number of eggs to be counted under the microscope (not eggs per gram of feces), but we lack the evidence to recommend any number of eggs that would allow a reliable assessment of drug efficacy. Other aspects that need further research are the methodology of calculating uncertainty intervals (UIs; confidence intervals in case of frequentist methods and credible intervals in case of Bayesian methods) and the criteria of classifying drug efficacy into 'normal', 'suspected' and 'reduced'. The aim of this study is to provide complementary insights into the current knowledge, and to ultimately provide guidance in the development of new standardized guidelines for the FECRT. First, data were generated using a simulation in which the 'true' drug efficacy (TDE) was evaluated by the FECRT under varying scenarios of sample size, analytic sensitivity of the diagnostic technique, and level of both intensity and aggregation of egg excretion. Second, the obtained data were analyzed with the aim (i) to verify which classification criteria allow for reliable detection of reduced drug efficacy, (ii) to identify the UI methodology that yields the most reliable assessment of drug efficacy (coverage of TDE) and detection of reduced drug efficacy, and (iii) to determine the required sample size and number of eggs counted under the microscope that optimizes the detection of reduced efficacy. Our results confirm that the currently recommended criteria for classifying drug efficacy are the most appropriate. 
Additionally, the UI methodologies we tested varied in coverage and ability to detect reduced drug efficacy, thus a combination of UI methodologies is recommended to assess the uncertainty across all scenarios of drug efficacy estimates. Finally, based on our model estimates we were able to determine the required number of eggs to count for each sample size, enabling investigators to optimize the probability of correctly classifying a theoretical TDE while minimizing both financial and technical resources. Copyright © 2018 Elsevier B.V. All rights reserved.
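The core FECRT computation the study evaluates, a point estimate of drug efficacy plus an uncertainty interval, can be sketched briefly. This is a hedged illustration only: the heavy-tailed count model, the ~90% true efficacy, and the paired percentile bootstrap (just one candidate frequentist UI methodology of the kind compared above) are assumptions for demonstration, not the authors' simulation framework.

```python
# Sketch of an FECRT: percent reduction in mean fecal egg counts,
# with a paired percentile-bootstrap uncertainty interval (UI).
import random
import statistics

def fecr(pre, post):
    """Fecal egg count reduction (%) based on group mean counts."""
    return 100.0 * (1.0 - statistics.mean(post) / statistics.mean(pre))

def bootstrap_ui(pre, post, n_boot=2000, alpha=0.05, seed=7):
    """Paired percentile-bootstrap UI: resample animals, keeping each
    animal's pre/post counts together."""
    rng = random.Random(seed)
    n, reps = len(pre), []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        reps.append(fecr([pre[i] for i in idx], [post[i] for i in idx]))
    reps.sort()
    return reps[int(n_boot * alpha / 2)], reps[int(n_boot * (1 - alpha / 2)) - 1]

# Simulate 20 animals with aggregated (heavy-tailed) pre-treatment counts
# and a true efficacy of ~90%, i.e. below the usual 95% threshold.
rng = random.Random(1)
pre = [rng.paretovariate(1.5) * 100 for _ in range(20)]
post = [c * 0.10 * rng.uniform(0.5, 1.5) for c in pre]

est = fecr(pre, post)
lo, hi = bootstrap_ui(pre, post)
print(f"FECR = {est:.1f}% (95% UI {lo:.1f}-{hi:.1f})")
```

Classification against the usual criteria would then compare the estimate and the interval bounds to, for example, a 95% efficacy cutoff.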
Medicare and Medicaid: long-term care survey--HCFA. Final rule.
1988-06-17
This final rule amends the Medicare and Medicaid regulations to require that the State survey agencies use the survey methods and procedures prescribed by HCFA and forms contained in regulations. The regulations define the principles on which Medicare and Medicaid survey methodologies are based and the required elements of a skilled nursing facility (SNF) or intermediate care facility (ICF) survey. This rule is in response to a court order.
Final Report DE-FG02-07ER64416
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seymour, Joseph D.
The document provides the Final Report for DE-FG02-07ER64416 on the use of magnetic resonance (MR) methods to quantify transport in porous media impacted by biological and chemical processes. Products resulting from the research in the form of peer reviewed publications and conference presentations are presented. The research correlated numerical simulations and MR measurements to test simulation methodology. Biofilm and uranium detection by MR was demonstrated.
Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.
Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué
2015-10-01
In this paper a new methodology for the diagnosis of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using classic, inverse, and nonlinear k-law filters. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
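As a rough illustration of the k-law idea: the paper applies it to 2-D Fourier spectra of dermatologic images, whereas this simplified 1-D analogue, with an invented band-energy "spectral index", is only a hedged sketch of the principle, not the authors' method. Raising each Fourier magnitude to a power k < 1 compresses dominant peaks and makes weaker high-frequency structure (the "complex pattern") count more.

```python
# 1-D sketch of k-law nonlinear spectral filtering and a toy spectral index.
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform (fine for short demo signals)."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * f * t / n)
                for t in range(n)) for f in range(n)]

def k_law(spectrum, k=0.3):
    """Raise each magnitude to the power k while preserving phase."""
    return [abs(z) ** k * cmath.exp(1j * cmath.phase(z)) if z != 0 else 0
            for z in spectrum]

def spectral_index(signal, k=0.3, cutoff=4):
    """Toy index: fraction of k-law spectral energy at mid/high
    frequencies (irregular patterns score higher)."""
    energy = [abs(z) ** 2 for z in k_law(dft(signal), k)]
    total = sum(energy)
    band = sum(energy[cutoff:len(energy) - cutoff + 1])
    return band / total if total else 0.0

# A smooth "spot" profile vs. an irregular one with extra fine structure:
smooth = [math.sin(2 * math.pi * t / 32) for t in range(32)]
rough = [math.sin(2 * math.pi * t / 32)
         + 0.5 * math.sin(2 * math.pi * 7 * t / 32) for t in range(32)]
si_s = spectral_index(smooth)
si_r = spectral_index(rough)
print(si_s, si_r)
```

The irregular profile yields a noticeably higher index, which is the kind of separation a diagnostic index range exploits.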
Methodological Aspects of Trend Studies and Development of the HBSC Study in the Czech Republic.
Sigmund, Erik; Baďura, Petr; Sigmundová, Dagmar; Csémy, Ladislav; Kalman, Michal
2017-07-01
The aim of the study is to present the theoretical background of trend studies in general, to characterize the international Health Behaviour in School-aged Children (HBSC) study, and to describe the methodology and changes of the Czech HBSC study between 1994 and 2014. The first part describes various types of trend research studies, including their advantages and disadvantages. The second part summarizes the history of the HBSC study in an international context and particularly in the Czech Republic. The final part presents the basic methodological data from six surveys conducted in the Czech Republic between 1994 and 2014. Copyright © by the National Institute of Public Health, Prague 2017.
Remote sensing applied to agriculture: Basic principles, methodology, and applications
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Mendonca, F. J.
1981-01-01
The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; the reflectance, transmittance, and absorptance of plants; interactions of plants and soils with reflected energy; leaf morphology; and factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.
Biobehavioral Outcomes Following Psychological Interventions for Cancer Patients
Andersen, Barbara L.
2007-01-01
Psychological interventions for adult cancer patients have primarily focused on reducing stress and enhancing quality of life. However, there has been expanded focus on biobehavioral outcomes—health behaviors, compliance, biologic responses, and disease outcomes—consistent with the Biobehavioral Model of cancer stress and disease course. The author reviewed this expanded focus in quasi-experimental and experimental studies of psychological interventions, provided methodologic detail, summarized findings, and highlighted novel contributions. A final section discussed methodologic issues, research directions, and challenges for the coming decade. PMID:12090371
In many studies of human exposure, the measurement of pollutant chemicals in the environment (air, water, food, soil, etc.) is being supplemented by their additional measurement in biological media such as human breath, blood, and urine. This allows an unambiguous confirmation...
LED traffic signal management system : final report.
DOT National Transportation Integrated Search
2016-06-01
This research originated from the opportunity to develop a methodology to assess when LED (Light Emitting Diode) traffic signal modules begin to fail to meet the Institute of Transportation Engineers (ITE) performance specification for luminous inten...
MITSI project : final local evaluation report
DOT National Transportation Integrated Search
2003-01-01
The mission statement for the MITSI project was facilitating National Standards Compliance migration for NaviGAtor, conducting National Architecture mapping for MARTA and E911, and evaluating CORBA as a methodology for exchanging data. This involved ...
Validation of the ULCEAT methodology by applying it in retrospect to the Roboticbed.
Nakamura, Mio; Suzurikawa, Jun; Tsukada, Shohei; Kume, Yohei; Kawakami, Hideo; Inoue, Kaoru; Inoue, Takenobu
2015-01-01
In answer to the increasing demand for care among the oldest segment of the Japanese population, an extensive programme of life support robots is under development, advocated by the Japanese government. Roboticbed® (RB) is developed to facilitate patients in making independent transfers from and to the bed in their daily life. The bed is intended both for elderly persons and persons with a disability. The purpose of this study is to examine the validity of the user and user's life centred clinical evaluation of assistive technology (ULCEAT) methodology. The ULCEAT method was developed to support user-centred development of life support robots. By means of the ULCEAT method the target users and the use environment were re-established in an earlier study. The validity of the method is tested by re-evaluating the development of RB in retrospect. Six participants used the first prototype of RB (RB1) and eight participants used the second prototype of RB (RB2). The results indicated that the functionality was improved owing to the end-user evaluations. Therefore, we confirmed the content validity of the proposed ULCEAT method. In this study we confirmed the validity of the ULCEAT methodology by applying it in retrospect to the RB development process. This method will be used for the development of life support robots and prototype assistive technologies.
Methodological triangulation in a study of social support for siblings of children with cancer.
Murray, J S
1999-10-01
Triangulation is an approach to research that is becoming increasingly popular among nurse researchers. Five types of triangulation are used in nursing research: data, methodological, theoretical, researcher, and analytical triangulation. Methodological triangulation is an attempt to improve validity by combining various techniques in one study. In this article, an example of quantitative and qualitative triangulation is discussed to illustrate the procedures used and the results achieved. The secondary data used as an example are from a previous study that was conducted by the researcher and investigated nursing interventions used by pediatric oncology nurses to provide social support to siblings of children with cancer. Results show that methodological triangulation was beneficial in this study for three reasons. First, the careful comparison of quantitative and qualitative data added support for the social support variables under investigation. Second, the comparison showed more in-depth dimensions about pediatric oncology nurses providing social support to siblings of children with cancer. Finally, the use of methodological triangulation provided insight into revisions for the quantitative instrument.
Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe
2011-04-08
HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model the retention times of the compounds of interest. Then, the prediction accuracy and the optimal separation robustness, including an uncertainty study, were evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie within a selected acceptance range. Furthermore, the chromatograms were read automatically: peak detection and peak matching were carried out with a previously developed methodology based on independent component analysis published by B. Debrus et al. in 2009. These successful applications underscore the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
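The design-space computation described, the probability that a separation criterion lies in its acceptance range, can be illustrated with a toy Monte Carlo. Everything below (the quadratic resolution-vs-pH model, the 1.5 resolution threshold, the 0.95 quality level) is invented for illustration and is not from the paper:

```python
# Toy ICH Q8 design space: keep the operating conditions where the
# probability of meeting the acceptance criterion exceeds a quality level.
import random

def p_success(mean_res, sd=0.15, threshold=1.5, n=5000, seed=0):
    """P(minimal peak resolution >= threshold) for a condition whose
    predicted mean resolution is mean_res, via Monte Carlo."""
    rng = random.Random(seed)
    hits = sum(rng.gauss(mean_res, sd) >= threshold for _ in range(n))
    return hits / n

# Invented retention model: predicted minimal resolution as a function of pH.
conditions = {ph: 1.0 + 0.35 * ph - 0.03 * ph ** 2 for ph in range(2, 11)}

design_space = [ph for ph, r in conditions.items() if p_success(r) >= 0.95]
print(design_space)
```

In a real application the predicted means would come from the experimental-design retention models, and the simulation would propagate their full parameter uncertainty rather than a fixed Gaussian error.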
A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.
Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego
2017-09-22
Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments that simulate all domains together is the extreme difficulty of combining and synchronizing various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.
First-excited state g factor of Te 136 by the recoil in vacuum method
Stuchbery, A. E.; Allmond, J. M.; Danchev, M.; ...
2017-07-27
The g factor of the first 2+ state of radioactive 136Te with two valence protons and two valence neutrons beyond double-magic 132Sn has been measured by the recoil in vacuum (RIV) method. The lifetime of this state is an order of magnitude longer than the lifetimes of excited states recently measured by the RIV method in Sn and Te isotopes, requiring a new evaluation of the free-ion hyperfine interactions and methodology used to determine the g factor. In this paper, the calibration data are reported and the analysis procedures are described in detail. The resultant g factor has a similar magnitude to the g factors of other nuclei with an equal number of valence protons and neutrons in the major shell. However, an unexpected trend is found in the g factors of the N = 84 isotones, which decrease from 136Te to 144Nd. Finally, shell model calculations with interactions derived from the CD Bonn potential show good agreement with the g factors and E2 transition rates of 2+ states around 132Sn, confirming earlier indications that 132Sn is a good doubly magic core.
Deniaud, Aurélien; Panwar, Pankaj; Frelet-Barrand, Annie; Bernaudat, Florent; Juillan-Binard, Céline; Ebel, Christine; Rolland, Norbert; Pebay-Peyroula, Eva
2012-01-01
Background Chloroplast ATP/ADP transporters are essential to energy homeostasis in plant cells. However, their molecular mechanism remains poorly understood, primarily due to the difficulty of producing and purifying functional recombinant forms of these transporters. Methodology/Principal Findings In this work, we describe an expression and purification protocol providing good yields and efficient solubilization of NTT1 protein from Arabidopsis thaliana. By biochemical and biophysical analyses, we identified the best detergent for solubilization and purification of functional proteins, LAPAO. Purified NTT1 was found to accumulate as two independent pools of well folded, stable monomers and dimers. ATP and ADP binding properties were determined, and Pi, a co-substrate of ADP, was confirmed to be essential for nucleotide steady-state transport. Nucleotide binding studies and analysis of NTT1 mutants led us to suggest the existence of two distinct and probably inter-dependent binding sites. Finally, fusion and deletion experiments demonstrated that the C-terminus of NTT1 is not essential for multimerization, but probably plays a regulatory role, controlling the nucleotide exchange rate. Conclusions/Significance Taken together, these data provide a comprehensive molecular characterization of a chloroplast ATP/ADP transporter. PMID:22438876
Movement pattern recognition in basketball free-throw shooting.
Schmidt, Andrea
2012-04-01
The purpose of the present study was to analyze the movement patterns of free-throw shooters in basketball at different skill levels. There were two points of interest: first, to explore what information can be drawn from the movement pattern, and second, to examine the methodological possibilities of pattern analysis. To this end, several qualitative and quantitative methods were employed, and the resulting data were combined through triangulation. Using a special kind of artificial neural network (ANN) called Dynamically Controlled Networks (DyCoN), a 'complex feature' consisting of several isolated features (angle displacements and velocities of the articulations of the kinematic chain) was calculated. This 'complex feature' was displayed as a trajectory combining several neurons of the network, reflecting the evolution of the twelve angle measures over the time course of each shooting action. Further network analyses detected individual characteristics as well as movement phases. Throwing patterns were successfully classified, and the stability and variability of the realized patterns were established. The movement patterns found were clearly shaped both individually and by skill level. The triangulation confirmed the individual movement organizations. Finally, a high stability of the network methods was documented. Copyright © 2012. Published by Elsevier B.V.
Orientational order and rotational relaxation in the plastic crystal phase of tetrahedral molecules.
Rey, Rossend
2008-01-17
A methodology recently introduced to describe orientational order in liquid carbon tetrachloride is extended to the plastic crystal phase of XY4 molecules. The notion that liquid and plastic crystal phases are germane regarding orientational order is confirmed for short intermolecular distances but is seen to fail beyond them, as long-range orientational correlations are found for the simulated solid phase. It is argued that, if real, such a phenomenon may not be accessible with direct (diffraction) methods due to the high molecular symmetry. This behavior is linked to the existence of preferential orientation with respect to the fcc crystalline network defined by the centers of mass. It is found that the dominant class accounts, at most, for one-third of all configurations, with only a weak dependence on temperature. Finally, the issue of rotational relaxation is also addressed, with excellent agreement with experimental measurements. It is shown that relaxation is nonhomogeneous in the picosecond range, with a slight dispersion of decay times depending on the initial orientational class. The results reported mainly correspond to neopentane over a wide temperature range, although results for carbon tetrachloride are included as well.
The discovery and early structural studies of arachidonic acid
Martin, Sarah A.; Brash, Alan R.; Murphy, Robert C.
2016-01-01
Arachidonic acid and esterified arachidonate are ubiquitous components of every mammalian cell. This polyunsaturated fatty acid serves very important biochemical roles, including being the direct precursor of bioactive lipid mediators such as prostaglandin and leukotrienes. This 20 carbon fatty acid with four double bonds was first isolated and identified from mammalian tissues in 1909 by Percival Hartley. This was accomplished prior to the advent of chromatography or any spectroscopic methodology (MS, infrared, UV, or NMR). The name, arachidonic, was suggested in 1913 based on its relationship to the well-known arachidic acid (C20:0). It took until 1940 before the positions of the four double bonds were defined at 5,8,11,14 of the 20-carbon chain. Total synthesis was reported in 1961 and, finally, the configuration of the double bonds was confirmed as all-cis-5,8,11,14. By the 1930s, the relationship of arachidonic acid within the family of essential fatty acids helped cue an understanding of its structure and the biosynthetic pathway. Herein, we review the findings leading up to the discovery of arachidonic acid and the progress toward its complete structural elucidation. PMID:27142391
What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.
2012-12-01
A common feature of model inter-comparison efforts is that the base year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year 2005 information for different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. It then explores the difference in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources and additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, are then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation on long-term model results.
Trust, trolleys and social dilemmas: A replication study.
Bostyn, Dries H; Roets, Arne
2017-05-01
The present manuscript addresses how the perceived trustworthiness of cooperative partners in a social dilemma context is influenced by the moral judgments those partners make on Trolley-type moral dilemmas, an issue recently investigated by Everett, Pizarro, and Crockett (2016). The present research comprises 2 studies that were conducted independently, simultaneously with, and incognizant of the Everett studies. Whereas the present studies aimed at investigating the same research hypothesis, a different and more elaborate methodology was used, thus providing a conceptual replication opportunity and an extension of the Everett et al. studies. Overall, the present studies clearly confirmed the main finding of Everett et al., that deontologists are more trusted than consequentialists in social dilemma games. Study 1 replicates Everett et al.'s effect in the context of trust games. Study 2 generalizes the effect to public goods games, thus demonstrating that it is not specific to the type of social dilemma game used in Everett et al. Finally, both studies build on these results by demonstrating that the increased trust in deontologists may sometimes, but not always, be warranted: deontologists displayed increased cooperation rates, but only in the public goods game and not in trust games. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Ten years of preanalytical monitoring and control: Synthetic Balanced Score Card Indicator
López-Garrigós, Maite; Flores, Emilio; Santo-Quiles, Ana; Gutierrez, Mercedes; Lugo, Javier; Lillo, Rosa; Leiva-Salinas, Carlos
2015-01-01
Introduction Preanalytical control and monitoring continue to be an important issue for clinical laboratory professionals. The aim of the study was to evaluate a monitoring system of preanalytical errors related to samples unsuitable for analysis, based on different indicators; to compare such indicators in different phlebotomy centres; and finally to evaluate a single synthetic preanalytical indicator that may be included in the balanced scorecard management system (BSC). Materials and methods We collected individual and global preanalytical errors in haematology, coagulation, chemistry, and urine sample analysis. We also analyzed a synthetic indicator that represents the sum of all types of preanalytical errors, expressed as a sigma level. We studied the evolution of those indicators over time and compared indicator results by way of the comparison of proportions and Chi-square tests. Results There was a decrease in the number of errors over the years (P < 0.001). This pattern was confirmed in primary care patients, inpatients and outpatients. In blood samples, fewer errors occurred in outpatients, followed by inpatients. Conclusion We present a practical and effective methodology to monitor preanalytical errors due to unsuitable samples. The synthetic indicator summarizes overall preanalytical sample errors and can be used as part of the BSC management system. PMID:25672466
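Expressing a global error proportion as a sigma level, as the synthetic indicator does, follows a standard Six Sigma convention. A minimal sketch with an invented error count and the conventional 1.5-sigma long-term shift (the paper does not state its exact conversion):

```python
# Convert a defect proportion into a sigma level (Six Sigma convention).
from statistics import NormalDist

def sigma_level(errors, total, shift=1.5):
    """Sigma level for a defect proportion: the standard-normal quantile of
    the success rate, plus the conventional 1.5-sigma long-term shift."""
    p = errors / total
    return NormalDist().inv_cdf(1.0 - p) + shift

# Hypothetical example: 1250 rejected samples out of 500000 received (0.25%).
print(round(sigma_level(1250, 500000), 2))
```

A falling error count over the years, as reported, translates directly into a rising sigma level, which is what makes the single indicator convenient for a balanced scorecard.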
Sequential analysis of hydrochemical data for watershed characterization.
Thyne, Geoffrey; Güler, Cüneyt; Poeter, Eileen
2004-01-01
A methodology for characterizing the hydrogeology of watersheds using hydrochemical data, combining statistical, geochemical, and spatial techniques, is presented. Surface water and ground water base flow and spring runoff samples (180 total) from a single watershed are first classified using hierarchical cluster analysis. The statistical clusters are analyzed for spatial coherence, confirming that the clusters have a geological basis corresponding to topographic flowpaths and showing that the fractured rock aquifer behaves as an equivalent porous medium on the watershed scale. Then principal component analysis (PCA) is used to determine the sources of variation between parameters. PCA shows that the variations within the dataset are related to variations in calcium, magnesium, SO4, and HCO3, which are derived from natural weathering reactions, and pH, NO3, and chloride, which indicate anthropogenic impact. PHREEQC modeling is used to quantitatively describe the natural hydrochemical evolution for the watershed and aid in discrimination of samples that have an anthropogenic component. Finally, the seasonal changes in the water chemistry of individual sites were analyzed to better characterize the spatial variability of vertical hydraulic conductivity. The integrated result provides a method to characterize the hydrogeology of the watershed that fully utilizes traditional data.
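The first step, hierarchical cluster analysis, can be sketched for intuition. This is a generic single-linkage agglomerative implementation on invented ion-concentration vectors, not the study's data or software:

```python
# Single-linkage agglomerative clustering of per-sample ion vectors.
import math

def linkage(a, b):
    """Single linkage: minimum pairwise distance between two clusters."""
    return min(math.dist(p, q) for p in a for q in b)

def hca(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: linkage(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# Invented (Ca, Mg, SO4, HCO3) concentrations: two dilute headwater samples,
# two evolved down-gradient samples, one anthropogenically impacted outlier.
samples = [(8, 2, 5, 30), (9, 3, 6, 33),
           (40, 12, 55, 120), (44, 13, 60, 126),
           (15, 4, 10, 300)]
groups = hca(samples, 3)
print([len(g) for g in groups])
```

The recovered groups then get their geological meaning from the spatial-coherence check, as in the study: clusters that track topographic flowpaths rather than arbitrary chemistry.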
Ito, Shosuke; Kikuta, Marina; Koike, Shota; Szewczyk, Grzegorz; Sarna, Michal; Zadlo, Andrzej; Sarna, Tadeusz; Wakamatsu, Kazumasa
2016-05-01
Eumelanin photoprotects pigmented tissues from ultraviolet (UV) damage. However, UVA-induced tanning seems to result from the photooxidation of preexisting melanin and does not contribute to photoprotection. We investigated the mechanism of UVA-induced degradation of 5,6-dihydroxyindole-2-carboxylic acid (DHICA)-melanin taking advantage of its solubility in a neutral buffer and using a differential spectrophotometric method to detect subtle changes in its structure. Our methodology is suitable for examining the effects of various agents that interact with reactive oxygen species (ROS) to determine how ROS is involved in the UVA-induced oxidative modifications. The results show that UVA radiation induces the oxidation of DHICA to indole-5,6-quinone-2-carboxylic acid in eumelanin, which is then cleaved to form a photodegraded, pyrrolic moiety and finally to form free pyrrole-2,3,5-tricarboxylic acid. The possible involvement of superoxide radical and singlet oxygen in the oxidation was suggested. The generation and quenching of singlet oxygen by DHICA-melanin was confirmed by direct measurements of singlet oxygen phosphorescence. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
An animal model of tinnitus: a decade of development.
Jastreboff, P J; Sasaki, C T
1994-01-01
Although tinnitus affects approximately 9 million people in the United States, a cure remains elusive and the mechanisms of its origin are speculative. The crucial obstacle in tinnitus research has been the lack of an animal model. Over the last decade we have been creating such a model by combining a variety of methodologies, including a behavioral component, to allow for the detection of tinnitus perception. Initially, 2-deoxyglucose was used to map changes in metabolic activity after unilateral destruction of the cochlea. It was found that the initial decrease of the metabolic rate in the auditory nuclei recovered to preoperative values, which could be attributable to the development of tinnitus. The spontaneous activity of single units recorded from the inferior colliculus before and after salicylate administration revealed an increase in discharges, which might reflect the presence of salicylate-induced tinnitus. Recent data have confirmed and further elaborated on this observation, including the discovery of abnormal, epileptic-like neuronal activity. Finally, the authors have developed a behavioral model of tinnitus, tested it extensively, and used it to measure tinnitus pitch and loudness. The model is presently used for investigating hypotheses about the mechanisms of tinnitus.
Computer-Vision-Assisted Palm Rehabilitation With Supervised Learning.
Vamsikrishna, K M; Dogra, Debi Prosad; Desarkar, Maunendra Sankar
2016-05-01
Physical rehabilitation supported by computer-assisted interfaces is gaining popularity among the health-care community. In this paper, we propose a computer-vision-assisted, contactless methodology to facilitate palm and finger rehabilitation. A Leap Motion controller has been interfaced with a computing device to record parameters describing 3-D movements of the palm of a user undergoing rehabilitation. We have built an interface using the Unity3D development platform. Our interface is capable of analyzing intermediate steps of rehabilitation without the help of an expert, and it can provide online feedback to the user. Isolated gestures are classified using linear discriminant analysis (LDA) and support vector machines (SVM). Finally, a set of discrete hidden Markov models (HMM) has been used to classify gesture sequences performed during rehabilitation. Experimental validation using a large number of samples collected from healthy volunteers reveals that LDA and SVM perform similarly when applied to isolated gesture recognition. We have compared the results of HMM-based sequence classification with conditional random field (CRF)-based techniques. Our results confirm that HMM and CRF perform quite similarly when tested on gesture sequences. The proposed system can be used for home-based palm or finger rehabilitation in the absence of experts.
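As a rough, hypothetical illustration of the isolated-gesture classification step, the sketch below applies Fisher's linear discriminant (the simplest form of LDA) to two synthetic gesture classes. The feature names and values are invented and do not come from the study.

```python
import numpy as np

# Two synthetic gesture classes in a made-up 3-D feature space
# (e.g. mean palm velocity, fingertip spread, roll angle).
rng = np.random.default_rng(1)
g1 = rng.normal([0.2, 1.0, 0.1], 0.1, size=(50, 3))  # "open palm"
g2 = rng.normal([0.8, 0.3, 0.5], 0.1, size=(50, 3))  # "pinch"
X = np.vstack([g1, g2])
y = np.array([0] * 50 + [1] * 50)

# Fisher's linear discriminant: project onto the direction that best
# separates the class means relative to the within-class scatter.
m1, m2 = g1.mean(axis=0), g2.mean(axis=0)
Sw = np.cov(g1, rowvar=False) + np.cov(g2, rowvar=False)
w = np.linalg.solve(Sw, m2 - m1)          # discriminant direction
threshold = w @ (m1 + m2) / 2             # midpoint decision boundary

pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The sequence-classification step (HMM versus CRF) operates on time series of such gesture labels rather than on single feature vectors.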
Bromelain Loading and Release from a Hydrogel Formulated Using Alginate and Arabic Gum.
Ataide, Janaína Artem; Cefali, Letícia Caramori; Rebelo, Marcia de Araujo; Spir, Lívia Genovez; Tambourgi, Elias Basile; Jozala, Angela Faustino; Chaud, Marco Vinícius; Silveira, Edgar; Gu, Xiaochen; Gava Mazzola, Priscila
2017-07-01
An ideal wound dressing ensures a moist environment around the wound area and absorbs exudates from the wound surface. Topical application of bromelain to incised wounds has been shown to reprogram the wound microenvironment to promote effective tissue repair. Combining the characteristics of hydrogels and bromelain is therefore of great interest. Herein, we describe the development of a hydrogel, formulated using alginate and Arabic gum, for bromelain loading and release. The hydrogel formulation was evaluated using response surface methodology, considering the pH value and the concentration of alginate and Arabic gum. Bromelain loading and release were evaluated based on passive diffusion. Differential scanning calorimetry and Fourier transform infrared spectroscopy were performed to confirm bromelain immobilization in the hydrogel. The final hydrogel formulation had a swelling ratio of 227 % and incorporated 19 % of bromelain from a bromelain solution. Bromelain immobilization in the hydrogel was the result of hydrogen bond formation and was optimal at 4 °C after 4 h of contact. This evidence suggests that bromelain entrapment into a hydrogel is a promising strategy for the development of wound dressings that support the debridement of burns and wounds.
First-excited-state g factor of 136Te by the recoil-in-vacuum method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuchbery, A. E.; Allmond, J. M.; Danchev, M.
The g factor of the first 2+ state of radioactive 136Te, with two valence protons and two valence neutrons beyond double-magic 132Sn, has been measured by the recoil-in-vacuum (RIV) method. The lifetime of this state is an order of magnitude longer than the lifetimes of excited states recently measured by the RIV method in Sn and Te isotopes, requiring a new evaluation of the free-ion hyperfine interactions and of the methodology used to determine the g factor. In this paper, the calibration data are reported and the analysis procedures are described in detail. The resultant g factor has a similar magnitude to the g factors of other nuclei with an equal number of valence protons and neutrons in the major shell. However, an unexpected trend is found in the g factors of the N = 84 isotones, which decrease from 136Te to 144Nd. Finally, shell model calculations with interactions derived from the CD-Bonn potential show good agreement with the g factors and E2 transition rates of 2+ states around 132Sn, confirming earlier indications that 132Sn is a good doubly magic core.
Methodological triangulation: an approach to understanding data.
Bekhet, Abir K; Zauszniewski, Jaclene A
2012-01-01
To describe the use of methodological triangulation in a study of how people who had moved to retirement communities were adjusting. Methodological triangulation involves using more than one kind of method to study a phenomenon. It has been found to be beneficial in providing confirmation of findings, more comprehensive data, increased validity and enhanced understanding of studied phenomena. While many researchers have used this well-established technique, there are few published examples of its use. The authors used methodological triangulation in their study of people who had moved to retirement communities in Ohio, US. A blended qualitative and quantitative approach was used. The collected qualitative data complemented and clarified the quantitative findings by helping to identify common themes. Qualitative data also helped in understanding interventions for promoting 'pulling' factors and for overcoming 'pushing' factors of participants. The authors used focused research questions to reflect the research's purpose and four evaluative criteria--'truth value', 'applicability', 'consistency' and 'neutrality'--to ensure rigour. This paper provides an example of how methodological triangulation can be used in nursing research. It identifies challenges associated with methodological triangulation, recommends strategies for overcoming them, provides a rationale for using triangulation and explains how to maintain rigour. Methodological triangulation can be used to enhance the analysis and the interpretation of findings. As data are drawn from multiple sources, it broadens the researcher's insight into the different issues underlying the phenomena being studied.
2014-01-01
Background: In this research, the removal of natural organic matter from aqueous solutions using an advanced oxidation process (UV/H2O2) was evaluated. Response surface methodology with a Box-Behnken design matrix was employed to design the experiments and to determine the optimal conditions. The effects of various parameters such as initial concentration of H2O2 (100–180 mg/L), pH (3–11), time (10–30 min) and initial total organic carbon (TOC) concentration (4–10 mg/L) were studied. Results: Analysis of variance (ANOVA) revealed a good agreement between the experimental data and the proposed quadratic polynomial model (R2 = 0.98). Experimental results showed that TOC removal efficiency increased with increasing H2O2 concentration and time and with decreasing initial TOC concentration. Neutral and mildly acidic pH values also improved TOC removal. Accordingly, a TOC removal efficiency of 78.02% was predicted at the optimized values of the independent variables: H2O2 concentration 100 mg/L, pH 6.12, time 22.42 min and initial TOC concentration 4 mg/L. Further confirmation tests under these optimal conditions yielded 76.50% TOC removal, confirming that the model is in accordance with the experiments. In addition, TOC removal for natural water under the optimum conditions identified by response surface methodology was 62.15%. Conclusions: This study showed that response surface methodology based on the Box-Behnken design is a useful tool for optimizing the operating parameters for TOC removal using the UV/H2O2 process. PMID:24735555
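The quadratic response-surface fit at the heart of this methodology can be illustrated with a minimal sketch. The two-factor model and the underlying coefficients below are invented for illustration (the study used four factors and a Box-Behnken design).

```python
import numpy as np

# Synthetic "experiments": TOC removal (%) as a function of H2O2 dose
# and reaction time, with noise. The true response is made up.
rng = np.random.default_rng(2)
h2o2 = rng.uniform(100, 180, 40)   # mg/L
time = rng.uniform(10, 30, 40)     # min
true = 20 + 0.3 * h2o2 + 1.5 * time - 0.03 * time**2
removal = true + rng.normal(0, 1, 40)

# Design matrix for the full two-factor quadratic (RSM) model:
# b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(h2o2), h2o2, time,
                     h2o2**2, time**2, h2o2 * time])
coef, *_ = np.linalg.lstsq(A, removal, rcond=None)

fitted = A @ coef
ss_res = np.sum((removal - fitted) ** 2)
ss_tot = np.sum((removal - removal.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"quadratic model R^2 = {r2:.3f}")
```

Once fitted, the quadratic surface can be maximized (analytically or numerically) to predict the optimum operating point, which is then checked by confirmation experiments as in the study.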
45 CFR 96.92 - Termination of funding.
Code of Federal Regulations, 2014 CFR
2014-10-01
... the Community Services Block Grant Act that it will terminate present or future funding of any... the record prior to terminating funding. If a review by the Secretary of the State's final decision to... of notification by the State of its final decision to terminate funding. The Department will confirm...
45 CFR 96.92 - Termination of funding.
Code of Federal Regulations, 2012 CFR
2012-10-01
... the Community Services Block Grant Act that it will terminate present or future funding of any... the record prior to terminating funding. If a review by the Secretary of the State's final decision to... of notification by the State of its final decision to terminate funding. The Department will confirm...
45 CFR 96.92 - Termination of funding.
Code of Federal Regulations, 2011 CFR
2011-10-01
... the Community Services Block Grant Act that it will terminate present or future funding of any... the record prior to terminating funding. If a review by the Secretary of the State's final decision to... of notification by the State of its final decision to terminate funding. The Department will confirm...
45 CFR 96.92 - Termination of funding.
Code of Federal Regulations, 2013 CFR
2013-10-01
... the Community Services Block Grant Act that it will terminate present or future funding of any... the record prior to terminating funding. If a review by the Secretary of the State's final decision to... of notification by the State of its final decision to terminate funding. The Department will confirm...
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMCs
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and microelectromechanical systems (MEMS) and analyzes designs that determine stochastic properties of MEMS. For CMCs, this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin-film (bulge test) specimens. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability and suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
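The Weibull size effect mentioned above has a simple closed form: for a Weibull modulus m, the characteristic strength scales with stressed volume as sigma2 = sigma1 * (V1/V2)**(1/m), so larger specimens are statistically weaker. A toy calculation, with all values invented for illustration:

```python
# Illustrative Weibull size-effect calculation (values are assumptions,
# not measured MEMS properties).
m = 10.0          # hypothetical Weibull modulus of the thin film
sigma1 = 1.0e9    # Pa, characteristic strength of the small specimen
v_ratio = 100.0   # the large specimen has 100x the stressed volume

# Strength scaling: sigma2 = sigma1 * (V1 / V2)**(1 / m)
sigma2 = sigma1 * (1.0 / v_ratio) ** (1.0 / m)
print(f"predicted strength of the larger specimen: {sigma2 / 1e9:.2f} GPa")
```

Verifying that measured bulge-test strengths follow this volume scaling is what "confirming that MEMS strength is controlled by the Weibull distribution" amounts to in practice.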
Igras, Susan; Diakité, Mariam; Lundgren, Rebecka
2017-07-01
In West Africa, social factors influence whether couples with unmet need for family planning act on birth-spacing desires. Tékponon Jikuagou is testing a social network-based intervention to reduce social barriers by diffusing new ideas. Individuals and groups judged socially influential by their communities provide entrée to networks. A participatory social network mapping methodology was designed to identify these diffusion actors. Analysis of monitoring data, in-depth interviews, and evaluation reports assessed the methodology's acceptability to communities and staff and whether it produced valid, reliable data to identify influential individuals and groups who diffuse new ideas through their networks. Results indicated the methodology's acceptability. Communities were actively and equitably engaged. Staff appreciated its ability to yield timely, actionable information. The mapping methodology also provided valid and reliable information by enabling communities to identify highly connected and influential network actors. Consistent with social network theory, this methodology resulted in the selection of informal groups and individuals in both informal and formal positions. In-depth interview data suggest these actors were diffusing new ideas, further confirming their influence/connectivity. The participatory methodology generated insider knowledge of who has social influence, challenging commonly held assumptions. Collecting and displaying information fostered staff and community learning, laying groundwork for social change.
A quality evaluation methodology of health web-pages for non-professionals.
Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro
2004-06-01
We propose an evaluation methodology for determining the quality of healthcare web sites aimed at disseminating medical information to non-professionals. Three (macro) factors are considered for the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature, we investigated whether or not to introduce a weighting function. This methodology has been validated on a specialized information content, i.e., sore throats, a topic of large interest to target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub-factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions derived from the literature allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score to medical web sites oriented to non-professionals.
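The weighted-versus-unweighted aggregation question investigated above can be illustrated with a minimal sketch. The item scores and factor weights below are invented, not those of the study.

```python
# Item scores per macro factor (hypothetical 1-5 ratings) and a
# hypothetical weighting of the three macro factors.
scores = {
    "contents":       [4, 5, 3],
    "accountability": [2, 4],
    "usability":      [5, 4, 4, 3],
}
weights = {"contents": 0.5, "accountability": 0.3, "usability": 0.2}

# Aggregate item scores into factor means, then into an overall score
# either without weights (plain average) or with the weighting function.
factor_means = {k: sum(v) / len(v) for k, v in scores.items()}
unweighted = sum(factor_means.values()) / len(factor_means)
weighted = sum(weights[k] * factor_means[k] for k in factor_means)
print(f"unweighted: {unweighted:.2f}  weighted: {weighted:.2f}")
```

Comparing the rankings produced by the two aggregation functions across many sampled pages is essentially the comparison the study performed.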
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
Non-destructive, high-content analysis of wheat grain traits using X-ray micro computed tomography.
Hughes, Nathan; Askew, Karen; Scotson, Callum P; Williams, Kevin; Sauze, Colin; Corke, Fiona; Doonan, John H; Nibau, Candida
2017-01-01
Wheat is one of the most widely grown crops in temperate climates for food and animal feed. In order to meet the demands of the predicted population increase in an ever-changing climate, wheat production needs to increase dramatically. Spike and grain traits are critical determinants of final yield, and grain uniformity is a commercially desired trait, but their analysis is laborious and often requires destructive harvest. One of the current challenges is to develop an accurate, non-destructive method for spike and grain trait analysis capable of handling large populations. In this study we describe the development of a robust method for the accurate extraction and measurement of spike and grain morphometric parameters from images acquired by X-ray micro-computed tomography (μCT). The image analysis pipeline automatically identifies plant material of interest in μCT images, performs image analysis, and extracts morphometric data. As a proof of principle, this integrated methodology was used to analyse the spikes from a population of wheat plants subjected to high temperatures under two different water regimes. Temperature has a negative effect on spike height and grain number, with the middle of the spike being the most affected region. The data also confirmed that increased grain volume was correlated with the decrease in grain number under mild stress. Being able to quickly measure plant phenotypes in a non-destructive manner is crucial to advancing our understanding of gene function and the effects of the environment. We report on the development of an image analysis pipeline capable of accurately and reliably extracting spike and grain traits from crops without the loss of positional information. This methodology, applied here to the analysis of wheat spikes, can be readily applied to other economically important crop species.
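The core of such a pipeline, thresholding a μCT image and measuring connected grain regions, can be sketched as follows. This toy 2-D example with synthetic "grains" only illustrates the idea; it is not the authors' pipeline, and real μCT data are 3-D volumes.

```python
import numpy as np
from scipy import ndimage

# Build a synthetic 2-D slice: two bright rectangular "grains" on a
# dark background, plus a little scanner noise (all values invented).
img = np.zeros((40, 40))
img[5:12, 5:12] = 1.0      # synthetic grain 1 (7x7 pixels)
img[20:30, 22:30] = 1.0    # synthetic grain 2 (10x8 pixels)
img += np.random.default_rng(3).normal(0.0, 0.05, img.shape)

binary = img > 0.5                        # global threshold
labels, n_grains = ndimage.label(binary)  # connected-component labeling
areas = [int((labels == i).sum()) for i in range(1, n_grains + 1)]
print(f"{n_grains} grains found, areas (pixels): {sorted(areas)}")
```

Because labeling preserves where each component sits in the image, per-grain measurements retain positional information, which is what lets the study report effects along the length of the spike.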
Prior, C; Danilāne, L; Oganesyan, V S
2018-05-16
We report the first application of fully atomistic molecular dynamics (MD) simulations to the prediction of electron paramagnetic resonance (EPR) spectra of spin-labelled DNA. Models for two structurally different DNA spin probes, with either a rigid or a flexible position of the nitroxide group in the base pair, employed in previous experimental studies, have been developed. By applying the combined MD-EPR simulation methodology we aimed at the following. Firstly, to provide a test bed, against a sensitive spectroscopic technique, for the recently developed improved version of the parmbsc1 force field for MD modelling of DNA. The predicted EPR spectra show good agreement with the experimental ones available from the literature, thus confirming the accuracy of the currently employed DNA force fields. Secondly, to provide a quantitative interpretation of the motional contributions to the dynamics of spin probes in both duplex and single-strand DNA fragments and to analyse their perturbing effects on the local DNA structure. Finally, a combination of MD and EPR allowed us to test the validity of applying the Model-Free (M-F) approach, coupled with partial averaging of magnetic tensors, to the simulation of EPR spectra of DNA systems by comparing the resultant EPR spectra with those simulated directly from MD trajectories. The advantage of the M-F-based EPR simulation approach over direct propagation techniques is that it requires motional and order parameters that can be calculated from shorter MD trajectories. The reported MD-EPR methodology is transferable to the prediction and interpretation of EPR spectra of higher-order DNA structures with novel types of spin labels.
Darajeh, Negisa; Idris, Azni; Fard Masoumi, Hamid Reza; Nourani, Abolfazl; Truong, Paul; Sairi, Nor Asrina
2016-10-01
While the oil palm industry has been recognized for its contribution towards economic growth and rapid development, it has also contributed to environmental pollution due to the production of huge quantities of by-products from the oil extraction process. A phytoremediation technique (a floating Vetiver system) was used to treat Palm Oil Mill Secondary Effluent (POMSE). A batch study using 40 L treatment tanks was carried out under different conditions, and Response Surface Methodology (RSM) was applied to optimize the treatment process. A three-factor central composite design (CCD) was used to arrange the experimental variables (POMSE concentration, Vetiver plant density and time). A marked decrease in organic matter, as measured by BOD and COD (96% and 94%, respectively), was recorded over the experimental duration of 4 weeks using a density of 30 Vetiver plants. The best and lowest final BOD of 2 mg/L was obtained when using 15 Vetiver plants after 13 days for low-concentration POMSE (initial BOD = 50 mg/L). The next best result, a BOD of 32 mg/L, was obtained when using 30 Vetiver plants after 24 days for medium-concentration POMSE (initial BOD = 175 mg/L). These results confirmed the validity of the model, and the experimental values were quite close to the predicted values, implying that the empirical model derived from the RSM experimental design adequately describes the relationship between the independent variables and the response. The study showed that the Vetiver system is an effective method of treating POMSE.
Singapore Indian Eye Study-2: methodology and impact of migration on systemic and eye outcomes.
Sabanayagam, Charumathi; Yip, Wanfen; Gupta, Preeti; Mohd Abdul, Riswana Bb; Lamoureux, Ecosse; Kumari, Neelam; Cheung, Gemmy Cm; Cheung, Carol Y; Wang, Jie Jin; Cheng, Ching-Yu; Wong, Tien Yin
2017-11-01
Asian Indians are among the fastest-growing migrant groups in the world. Studies evaluating the impact of migration on disease outcomes in this population are rare. We describe the methodology of the Singapore Indian Eye Study-2 (SINDI-2), which aimed to evaluate the impact of migration status on diabetic retinopathy and other major age-related eye diseases in Asian Indians living in an urban environment. Population-based cohort study. A total of 2200 adults participated in both the baseline SINDI (2007-2009, mean age [range] = 57.8 [42.7-84.1] years) and SINDI-2 (2013-2015, 56.5 [48.4-90.2] years). Participants were classified as 'first-generation' immigrants if they were Indian residents born outside of Singapore and as 'second-generation' immigrants (59.7% in SINDI vs. 63.6% in SINDI-2) if they were born in Singapore. Response rate, participant characteristics and prevalence of systemic diseases were stratified by migration status. Of the 2914 eligible SINDI participants invited to participate, 2200 participated in SINDI-2 (a response rate of 75.2%). In both SINDI and SINDI-2, compared with first-generation immigrants, second-generation immigrants were younger, less likely to have income <1000 SGD, had lower pulse pressure, higher levels of high-density lipoprotein cholesterol, lower prevalence of hypertension and chronic kidney disease, and higher prevalence of current smoking and obesity (all P < 0.05). In short, second-generation immigrants had a lower prevalence of cardiovascular risk factors, except smoking and obesity, than first-generation immigrants. The final report will confirm whether these differences between generations are also evident for eye diseases.
Methodology to design a municipal solid waste generation and composition map: a case study.
Gallardo, A; Carlos, M; Peris, M; Colomer, F J
2014-11-01
Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to characterize these factors beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous across the city, as neither the number of inhabitants nor the economic activity is constant. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way when detailed data are available and an indirect way when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology, it has been successfully applied to a Spanish town.
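The central computation behind such a map, per-capita generation rates binned into classes that a GIS layer could shade, can be sketched as follows. The district names, figures, and class boundaries are invented for illustration.

```python
# Hypothetical per-district MSW data (all values invented).
districts = {
    "centre":      {"waste_kg_day": 12_000, "inhabitants": 9_000},
    "residential": {"waste_kg_day": 25_000, "inhabitants": 26_000},
    "industrial":  {"waste_kg_day": 8_000,  "inhabitants": 3_500},
}

def classify(rate_kg_cap_day):
    """Bin a per-capita generation rate into a map class (assumed cutoffs)."""
    if rate_kg_cap_day < 1.0:
        return "low"
    if rate_kg_cap_day < 1.5:
        return "medium"
    return "high"

# Build the attribute table a thematic GIS layer would be shaded from.
layer = {}
for name, d in districts.items():
    rate = d["waste_kg_day"] / d["inhabitants"]
    layer[name] = (round(rate, 2), classify(rate))
print(layer)
```

In the methodology above, a table like this (per district or census tract) is joined to the town's spatial units in a GIS to produce the generation and composition maps.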
Independent evaluation of the transit retrofit package safety applications : final report.
DOT National Transportation Integrated Search
2015-02-01
This report presents the methodology and results of the independent evaluation of retrofit safety packages installed on transit vehicles in the Safety Pilot Model Deployment, part of the United States Department of Transportation's Intelligent T...
Linking Aquatic Ecosystems to Human Well-Being
While ecological indicators should have relevance to people, a clear methodology to develop and evaluate this characteristic of ecological indicators is not well developed. Economists developed the concept of “Final Ecosystem Goods and Services”. Because these featur...
A methodology for conducting underwater archaeological surveys : final report.
DOT National Transportation Integrated Search
1990-01-01
Departments of transportation and other construction agencies are required to locate and conserve cultural resources located in the area of a construction project. Although most organizations have procedures for locating cultural resources on land, t...
Baeza, Fernanda L C; Caldieraro, Marco A K; Pinheiro, Diesa O; Fleck, Marcelo P
2010-06-01
To describe the translation and adaptation methodology for the Measure of Parental Style, a self-report instrument developed originally in English, following the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) guidelines, and to compare this methodology to others used for the same purposes. The ISPOR Translation and Cultural Adaptation group guidelines were followed (preparation, first forward translation, reconciliation, back translation, revision of back translation, harmonization, cognitive debriefing, revision of debriefing results, syntax and orthographic revision, final report). A careful and qualified cross-cultural translation and adaptation of an instrument contributes to ensuring that it measures what it is designed to measure across cultures. Presenting this process, in addition to its final product, allows the experience to be replicated in the adaptation of other instruments.
Using Sandelowski and Barroso's Meta-Synthesis Method in Advancing Qualitative Evidence.
Ludvigsen, Mette S; Hall, Elisabeth O C; Meyer, Gabriele; Fegran, Liv; Aagaard, Hanne; Uhrenfeldt, Lisbeth
2016-02-01
The purpose of this article was to iteratively account for and discuss the handling of methodological challenges in two qualitative research syntheses concerning patients' experiences of hospital transition. We applied Sandelowski and Barroso's guidelines for synthesizing qualitative research, and to our knowledge, this is the first time researchers discuss their methodological steps. In the process, we identified a need for prolonged discussions to determine mutual understandings of the methodology. We discussed how to identify the appropriate qualitative research literature and how to best conduct exhaustive literature searches on our target phenomena. Another finding concerned our status as third-order interpreters of participants' experiences and what this meant for synthesizing the primary findings. Finally, we discussed whether our studies could be classified as metasummaries or metasyntheses. Although we have some concerns regarding the applicability of the methodology, we conclude that following Sandelowski and Barroso's guidelines contributed to valid syntheses of our studies.
Niaksu, Olegas; Zaptorius, Jonas
2014-01-01
This paper presents a methodology suitable for creating a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased performance criteria weighting. Data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluation of the motivational system outcomes was proposed. The described methodology for calculating performance-related payment requires practical validation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step would be validation of the methodology in a healthcare facility.
Generic simulation of multi-element ladar scanner kinematics in USU LadarSIM
NASA Astrophysics Data System (ADS)
Omer, David; Call, Benjamin; Pack, Robert; Fullmer, Rees
2006-05-01
This paper presents a generic simulation model for a ladar scanner with up to three scan elements, each having a steering, stabilization and/or pattern-scanning role. Of interest is the development of algorithms that automatically generate commands to the scan elements given beam-steering objectives out of the ladar aperture, and the base motion of the sensor platform. First, a straightforward single-element body-fixed beam-steering methodology is presented. Then a unique multi-element redirective and reflective space-fixed beam-steering methodology is explained. It is shown that standard direction cosine matrix decomposition methods fail when using two orthogonal, space-fixed rotations, thus demanding the development of a new algorithm for beam steering. Finally, a related steering control methodology is presented that uses two separate optical elements mathematically combined to determine the necessary scan element commands. Limits, restrictions, and results of this methodology are presented.
Gangopadhyay, Subhrendu; McCabe, Gregory J.; Woodhouse, Connie A.
2015-01-01
In this paper, we present a methodology to use annual tree-ring chronologies and a monthly water balance model to generate annual reconstructions of water balance variables (e.g., potential evapotranspiration (PET), actual evapotranspiration (AET), snow water equivalent (SWE), soil moisture storage (SMS), and runoff (R)). The method involves resampling monthly temperature and precipitation from the instrumental record directed by variability indicated by the paleoclimate record. The generated time series of monthly temperature and precipitation are subsequently used as inputs to a monthly water balance model. The methodology is applied to the Upper Colorado River Basin, and results indicate that the methodology reliably simulates water-year runoff, maximum snow water equivalent, and seasonal soil moisture storage for the instrumental period. As a final application, the methodology is used to produce time series of PET, AET, SWE, SMS, and R for the 1404–1905 period for the Upper Colorado River Basin.
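The resampling step described in this abstract can be sketched as follows. This is a minimal illustration of the idea only (analog years drawn from the instrumental record so that their annual total matches a reconstructed annual value); the function name, data layout, and values are hypothetical, not the authors' code.

```python
def resample_monthly(recon_annual_precip, instrumental):
    """For each reconstructed annual precipitation value, pick the
    instrumental year whose annual precipitation total is closest,
    and use that year's monthly temperature and precipitation.

    instrumental: dict year -> {'precip': [12 floats], 'temp': [12 floats]}
    """
    series = []
    for target in recon_annual_precip:
        analog = min(instrumental,
                     key=lambda yr: abs(sum(instrumental[yr]['precip']) - target))
        series.append(instrumental[analog])
    return series

# toy instrumental record: two years of flat monthly values
instr = {1950: {'precip': [10.0] * 12, 'temp': [5.0] * 12},
         1951: {'precip': [20.0] * 12, 'temp': [6.0] * 12}}
# two reconstructed annual totals; each is mapped to its closest analog year
months = resample_monthly([115.0, 250.0], instr)
```

The monthly series returned would then be fed to the water balance model in place of direct paleoclimate forcing.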
Vein matching using artificial neural network in vein authentication systems
NASA Astrophysics Data System (ADS)
Noori Hoshyar, Azadeh; Sulaiman, Riza
2011-10-01
Personal identification technology for security systems is developing rapidly. Traditional authentication modes such as keys, passwords, and cards are not safe enough because they can be stolen or easily forgotten. Biometrics, as a developed technology, has been applied to a wide range of systems. According to different researchers, the vein is a good candidate among biometric traits such as fingerprint, hand geometry, voice, and DNA for authentication systems. Vein authentication systems can be designed by different methodologies. All of the methodologies include a matching stage, which is crucial for the final verification of the system. The Neural Network is an effective methodology for matching and recognizing individuals in authentication systems. Therefore, this paper explains and implements the Neural Network methodology for a finger vein authentication system. The Neural Network is trained in Matlab to match the vein features of the authentication system. The network simulation shows a matching quality of 95%, which is a good performance for authentication system matching.
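As a rough illustration of a matching stage of this kind: the paper itself trains a neural network in Matlab, so the single logistic unit below, trained by gradient descent on toy feature-difference vectors, is only a simplified stand-in, and all names and values are hypothetical.

```python
import math

def train(pairs, labels, epochs=2000, lr=0.5):
    """Fit one logistic unit: inputs are absolute differences between an
    enrolled vein feature vector and a probe vector; label 1 = same finger."""
    n = len(pairs[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(pairs, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            g = p - y                         # gradient of log-loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def matches(w, b, x):
    """Decide match/non-match for a feature-difference vector."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b > 0

# toy training data: small differences -> same finger (1), large -> different (0)
diffs = [[0.1, 0.05], [0.02, 0.1], [0.9, 0.8], [0.7, 0.95]]
labels = [1, 1, 0, 0]
w, b = train(diffs, labels)
```

In a real system the inputs would be extracted vein pattern features and the classifier a full multi-layer network, but the match/non-match decision structure is the same.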
Methodology to design a municipal solid waste pre-collection system. A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallardo, A., E-mail: gallardo@uji.es; Carlos, M., E-mail: mcarlos@uji.es; Peris, M., E-mail: perism@uji.es
Highlights: • MSW recovery starts at homes; therefore it is important to facilitate it for people. • Additionally, to optimize MSW collection, the pre-collection must be planned in advance. • A methodology to organize pre-collection considering several factors is presented. • The methodology has been verified by applying it to a Spanish middle-sized town. - Abstract: Municipal solid waste (MSW) management is an important task that local governments as well as private companies must take into account to protect human health and the environment and to preserve natural resources. To design an adequate MSW management plan, the first step consists in defining the waste generation and composition patterns of the town. As these patterns depend on several socio-economic factors, it is advisable to characterize them beforehand. Moreover, the waste generation and composition patterns may vary around the town and over time. Generally, the data are not homogeneous around the city, as the number of inhabitants is not constant, nor is the economic activity. Therefore, if all the information is shown in thematic maps, the final waste management decisions can be made more efficiently. The main aim of this paper is to present a structured methodology that allows local authorities or private companies who deal with MSW to design their own MSW management plan depending on the available data. According to these data, this paper proposes two ways of action: a direct way, when detailed data are available, and an indirect way, when there is a lack of data and it is necessary to rely on bibliographic data. In any case, the amount of information needed is considerable. This paper combines the planning methodology with Geographic Information Systems to present the final results in thematic maps that make them easier to interpret. The proposed methodology is a useful preliminary tool for organizing MSW collection routes, including selective collection. To verify the methodology, it has been successfully applied to a Spanish town.
Language barriers and qualitative nursing research: methodological considerations.
Squires, A
2008-09-01
This review of the literature synthesizes methodological recommendations for the use of translators and interpreters in cross-language qualitative research. Cross-language qualitative research involves the use of interpreters and translators to mediate a language barrier between researchers and participants. Qualitative nurse researchers successfully address language barriers between themselves and their participants when they systematically plan for how they will use interpreters and translators throughout the research process. Experienced qualitative researchers recognize that translators can generate qualitative data through translation processes and by participating in data analysis. Failure to address language barriers and the methodological challenges they present threatens the credibility, transferability, dependability and confirmability of cross-language qualitative nursing research. Through a synthesis of the cross-language qualitative methods literature, this article reviews the basics of language competence, translator and interpreter qualifications, and roles for each kind of qualitative research approach. Methodological and ethical considerations are also provided. By systematically addressing the methodological challenges cross-language research presents, nurse researchers can produce better evidence for nursing practice and policy making when working across different language groups. Findings from qualitative studies will also accurately represent the experiences of the participants without concern that the meaning was lost in translation.
A methodology for luminance map retrieval using airborne hyperspectral and photogrammetric data
NASA Astrophysics Data System (ADS)
Pipia, Luca; Alamús, Ramon; Tardà, Anna; Pérez, Fernando; Palà, Vicenç; Corbera, Jordi
2014-10-01
This paper puts forward a methodology developed at the Institut Cartogràfic i Geològic de Catalunya (ICGC) to quantify upwelling light flux using hyperspectral and photogrammetric airborne data. The work was carried out in the frame of a demonstrative study requested by the municipality of Sant Cugat del Vallès, in the vicinity of Barcelona (Spain), and aimed to envisage a new approach to assessing artificial lighting policies and actions as an alternative to field campaigns. Hyperspectral and high-resolution multispectral/panchromatic data were acquired simultaneously over urban areas. In order to avoid moonlight contributions, data were acquired during the first days of the new moon phase. Hyperspectral data were radiometrically calibrated. Then, National Center for Environmental Prediction (NCEP) atmospheric profiles were employed to estimate the actual Column Water Vapor (CWV) to be passed to ModTran5.0 for the atmospheric transmissivity τ calculation. At-the-ground radiance was finally integrated using the photopic sensitivity curve to generate a luminance map (cd m-2) of the flown area by mosaicking the different flight tracks. In an attempt to improve the spatial resolution and enhance the dynamic range of the luminance map, a sensor-fusion strategy was finally looked into. DMC photogrammetric data acquired simultaneously with the hyperspectral information were converted into at-the-ground radiance and upscaled to CASI spatial resolution. High-resolution (HR) luminance maps with enhanced dynamic range were finally generated by linearly fitting up-scaled DMC mosaics to the CASI-based luminance information. In the end, a preliminary assessment of the methodology is carried out using non-simultaneous in-situ measurements.
NASA Astrophysics Data System (ADS)
Alturki, Uthman T.
The goal of this research was to research, design, and develop a hypertext program for students who study biology. The Ecology Hypertext Program was developed using Research and Development (R&D) methodology. The purpose of this study was to place the final "product", a CD-ROM for learning biology concepts, in the hands of teachers and students to help them in the learning and teaching process. The product was created through a cycle of literature review, needs assessment, development, and a cycle of field tests and revisions. I applied the ten steps of the R&D process suggested by Borg and Gall (1989), which consisted of: (1) Literature review, (2) Needs assessment, (3) Planning, (4) Develop preliminary product, (5) Preliminary field-testing, (6) Preliminary revision, (7) Main field-testing, (8) Main revision, (9) Final field-testing, and (10) Final product revision. The literature review and needs assessment provided support and a foundation for designing the preliminary product---the Ecology Hypertext Program. Participants in the needs assessment joined a focus group discussion. They were a group of graduate students in education who suggested the importance of designing this product. For the preliminary field test, the participants were a group of high school students studying biology. They were the potential users of the product. They reviewed the preliminary product and then filled out a questionnaire. Their feedback and suggestions were used to develop and improve the product in a step called preliminary revision. The second round of field testing was the main field test, in which the participants joined a focus group discussion. They were the same group who participated in the needs assessment task. They reviewed the revised product and then provided ideas and suggestions to improve the product. Their feedback was categorized and implemented to develop the product in the main revision task.
Finally, a group of science teachers participated in this study by reviewing the product and then filling out the questionnaire. Their suggestions were used to conduct the final step in the R&D methodology, the final product revision. The primary result of this study was the Ecology Hypertext Program. It represents a modest attempt to give students an opportunity to learn through an interactive hypertext program. In addition, the R&D methodology proved an effective procedure for designing and developing new educational products and materials.
Problem Solving Activity in the Workplace and the School: The Case of Constructing Solids.
ERIC Educational Resources Information Center
Jurdak, Murad; Shahin, Iman
2001-01-01
Documents, compares, and analyzes the nature of spatial reasoning by practitioners (plumbers) in the workplace and students in the school setting while constructing solids, with given specifications, from plane surfaces. Results confirm the power of activity theory and its methodology in explaining and identifying the structural differences…
Reexamining Psychokinesis: Comment on Bosch, Steinkamp, and Boller (2006)
ERIC Educational Resources Information Center
Radin, Dean; Nelson, Roger; Dobyns, York; Houtkooper, Joop
2006-01-01
H. Bosch, F. Steinkamp, and E. Boller's (see record 2006-08436-001) review of the evidence for psychokinesis confirms many of the authors' earlier findings. The authors agree with Bosch et al. that existing studies provide statistical evidence for psychokinesis, that the evidence is generally of high methodological quality, and that effect sizes…
Performance-Based Service Quality Model: An Empirical Study on Japanese Universities
ERIC Educational Resources Information Center
Sultan, Parves; Wong, Ho
2010-01-01
Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using the Cronbach's alpha.…
Regdon, G; Bácskay, I; Kata, M; Selmeczi, B; Szikszay, M; Sánta, A; Bálint, G S
1994-05-01
Methodology and the results of the in vitro membrane diffusion and in vivo bioavailability studies are presented. The results confirm a correlation between in vitro and in vivo findings. Hydrophilic macrogol-mixture with great molecular mass can be recommended as the optimal vehicle for formulation of diazepam suppositories.
77 FR 4626 - Departmental Offices; Submission for OMB Review, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-30
... U.S.C. 3506(c)(2)(A)). The BEP intends to request approval from the Office of Management and Budget... acuity to develop methodologies for collecting the feedback. The BEP also intends to contract with... acuity tests will help either confirm or provide other perspectives on the results of BEP's information...
[Field investigations of the air pollution level of populated territories].
Vinokurov, M V
2014-01-01
The assessment and management of air quality of settlements is one of the priorities in the field of environmental protection. In the management of air quality the backbone factor is the methodology of the organization, performance and interpretation of data of field investigations. The present article is devoted to the analysis of the existing methodological approaches and practical aspects of their application in the organization and performance of field investigations with the aim to confirm the adequacy of the boundaries of the sanitary protection zone in the old industrial regions, hygienic evaluation of the data of field investigations of the air pollution level.
NASA Technical Reports Server (NTRS)
Potter, Christopher S.
2014-01-01
The Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) methodology was applied to detect changes in forest vegetation cover for areas burned by wildfires in the Sierra Nevada Mountains of California between the periods 1975-79 and 1995-99. Results for areas burned by wildfire between 1995 and 1999 confirmed the importance of regrowing forest vegetation over 17% of the combined burned areas. A notable fraction (12%) of the entire 5-km (unburned) buffer area outside the 1995-1999 fire perimeters showed decline in forest cover, with far fewer regrowing forest areas, covering only 3% of all the 1995-1999 buffer areas combined. Areas burned by wildfire between 1975 and 1979 confirmed the importance of disturbed (or declining evergreen) vegetation covering 13% of the combined 1975-1979 burned areas. Based on comparison of these results to ground-based survey data, the LEDAPS methodology should be capable of fulfilling much of the need for consistent, low-cost monitoring of changes due to climate and biological factors in western forest regrowth following stand-replacing disturbances.
Healy, Patricia; Galvin, Sandra; Williamson, Paula R; Treweek, Shaun; Whiting, Caroline; Maeso, Beccy; Bray, Christopher; Brocklehurst, Peter; Moloney, Mary Clarke; Douiri, Abdel; Gamble, Carrol; Gardner, Heidi R; Mitchell, Derick; Stewart, Derek; Jordan, Joan; O'Donnell, Martin; Clarke, Mike; Pavitt, Sue H; Guegan, Eleanor Woodford; Blatch-Jones, Amanda; Smith, Valerie; Reay, Hannah; Devane, Declan
2018-03-01
Despite the problem of inadequate recruitment to randomised trials, there is little evidence to guide researchers on decisions about how people are effectively recruited to take part in trials. The PRioRiTy study aimed to identify and prioritise important unanswered trial recruitment questions for research. The PRioRiTy study - Priority Setting Partnership (PSP) included members of the public approached to take part in a randomised trial or who have represented participants on randomised trial steering committees, health professionals and research staff with experience of recruiting to randomised trials, people who have designed, conducted, analysed or reported on randomised trials, and people with experience of randomised trials methodology. This partnership was aided by the James Lind Alliance and involved eight stages: (i) identifying a unique, relevant prioritisation area within trial methodology; (ii) establishing a steering group; (iii) identifying and engaging with partners and stakeholders; (iv) formulating an initial list of uncertainties; (v) collating the uncertainties into research questions; (vi) confirming that the questions for research are a current recruitment challenge; (vii) shortlisting questions; and (viii) final prioritisation through a face-to-face workshop. A total of 790 survey respondents yielded 1693 open-text answers to 6 questions, from which 1880 potential questions for research were identified. After merging duplicates, the number of questions was reduced to 496. Questions were combined further, and those that were submitted by fewer than 15 people and/or fewer than 6 of the 7 stakeholder groups were excluded from the next round of prioritisation, resulting in 31 unique questions for research. All 31 questions were confirmed as being unanswered after checking relevant, up-to-date research evidence. The 10 highest priority questions were ranked at a face-to-face workshop.
The number 1 ranked question was "How can randomised trials become part of routine care and best utilise current clinical care pathways?" The top 10 research questions can be viewed at www.priorityresearch.ie. The prioritised questions call for a collective focus on normalising trials as part of clinical care, enhancing communication, addressing barriers, enablers and motivators around participation, and exploring greater public involvement in the research process.
Xenon-induced power oscillations in a generic small modular reactor
NASA Astrophysics Data System (ADS)
Kitcher, Evans Damenortey
As world demand for energy continues to grow at unprecedented rates, the world energy portfolio of the future will inevitably include a nuclear energy contribution. It has been suggested that the Small Modular Reactor (SMR) could play a significant role in the spread of civilian nuclear technology to nations previously without nuclear energy. As part of the design process, the SMR design must be assessed for the threat to operations posed by xenon-induced power oscillations. In this research, a generic SMR design was analyzed with respect to just such a threat. In order to do so, a multi-physics coupling routine was developed with MCNP/MCNPX as the neutronics solver. Thermal hydraulic assessments were performed using a single channel analysis tool developed in Python. Fuel and coolant temperature profiles were implemented in the form of temperature dependent fuel cross sections generated using the SIGACE code and reactor core coolant densities. The Power Axial Offset (PAO) and Xenon Axial Offset (XAO) parameters were chosen to quantify any oscillatory behavior observed. The methodology was benchmarked against results from literature of startup tests performed at a four-loop PWR in Korea. The developed benchmark model replicated the pertinent features of the reactor within ten percent of the literature values. The results of the benchmark demonstrated that the developed methodology captured the desired phenomena accurately. Subsequently, a high fidelity SMR core model was developed and assessed. Results of the analysis revealed an inherently stable SMR design at beginning of core life and end of core life under full-power and half-power conditions. The effect of axial discretization, stochastic noise and convergence of the Monte Carlo tallies in the calculations of the PAO and XAO parameters was investigated. All were found to be quite small and the inherently stable nature of the core design with respect to xenon-induced power oscillations was confirmed. 
Finally, a preliminary investigation into excess reactivity control options for the SMR design was conducted, confirming the generally held notion that existing PWR control mechanisms can be used in iPWR SMRs with similar effectiveness. Given the desire to operate the SMR with boron-free coolant, erbium oxide integral burnable absorber fuel rods were identified as a possible replacement for the distributed absorber effect of soluble boron in the reactor coolant.
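The axial offset parameters used above to quantify oscillatory behavior are conventionally defined as the normalized difference between top-half and bottom-half core power (PAO) or xenon-135 inventory (XAO). A minimal sketch follows; the node values are illustrative, not taken from the study.

```python
def axial_offset(node_values):
    """Compute (top - bottom) / (top + bottom) for an axial distribution
    of power or xenon inventory, ordered from core bottom to core top."""
    half = len(node_values) // 2
    bottom = sum(node_values[:half])
    top = sum(node_values[half:])
    return (top - bottom) / (top + bottom)

# a top-skewed power distribution gives a positive power axial offset
pao = axial_offset([0.8, 1.0, 1.2, 1.0])
```

Tracking how this quantity evolves in time (e.g., after a control rod maneuver) is what reveals whether a xenon-induced oscillation grows or decays.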
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heaney, Mike
Statistically designed experiments can save researchers time and money by reducing the number of necessary experimental trials, while resulting in more conclusive experimental results. Surprisingly, many researchers are still not aware of this efficient and effective experimental methodology. As reported in a 2013 article from Chemical & Engineering News, there has been a resurgence of this methodology in recent years (http://cen.acs.org/articles/91/i13/Design-Experiments-Makes-Comeback.html?h=2027056365). This presentation will provide a brief introduction to statistically designed experiments. The main advantages will be reviewed along with some basic concepts such as factorial and fractional factorial designs. The recommended sequential approach to experiments will be introduced, and finally a case study will be presented to demonstrate this methodology.
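The factorial and fractional factorial designs mentioned above can be illustrated with a short sketch: a 2^3 full factorial for three two-level factors, and a 2^(3-1) half fraction built with the standard generator C = AB. This is a generic illustration of the concept, not material from the presentation.

```python
from itertools import product

def full_factorial(k):
    """All 2^k runs for k factors coded at low (-1) and high (+1) levels."""
    return list(product([-1, 1], repeat=k))

def half_fraction(k):
    """A 2^(k-1) fractional design: vary k-1 factors freely and alias the
    last factor with their interaction (e.g., C = AB for k = 3)."""
    runs = []
    for base in product([-1, 1], repeat=k - 1):
        last = 1
        for level in base:
            last *= level
        runs.append(base + (last,))
    return runs

full = full_factorial(3)   # 8 runs
half = half_fraction(3)    # 4 runs, main effects aliased with interactions
```

The half fraction halves the experimental effort at the cost of confounding some effects, which is the trade-off that sequential experimentation is designed to manage.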
Analysis of local bus markets : volume I – methodology and findings : final report.
DOT National Transportation Integrated Search
2017-07-04
Despite having an extensive network of public transit, traffic congestion and transportation-related greenhouse gas (GHG) emissions are significant concerns in New Jersey. This research hypothesizes that traffic congestion and air quality concerns in...
Elastic stress analysis of general prismatic beams : final report.
DOT National Transportation Integrated Search
1980-01-01
This study developed a numerical methodology for the elastic stress analysis of general prismatic beams. The objective was to accurately determine stresses and displacements on a cross section of a beam where the stress resultants are prescribed. App...
75 FR 35519 - Primary National Ambient Air Quality Standard for Sulfur Dioxide
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-22
... this final rule, additional areas could be classified as non-attainment. Certain States would then be... numerous locations and with a variety of methodological approaches (ISA, section 5.2; p. 5-5). It was...
Site selection for MSFC operational tests of solar heating and cooling systems
NASA Technical Reports Server (NTRS)
1978-01-01
The criteria, methodology, and sequence aspects of the site selection process are presented. This report organized the logical thought process that should be applied to the site selection process, but final decisions are highly selective.
Constructing the principles: Method and metaphysics in the progress of theoretical physics
NASA Astrophysics Data System (ADS)
Glass, Lawrence C.
This thesis presents a new framework for the philosophy of physics focused on methodological differences found in the practice of modern theoretical physics. The starting point for this investigation is the longstanding debate over scientific realism. Some philosophers have argued that it is the aim of science to produce an accurate description of the world including explanations for observable phenomena. These scientific realists hold that our best confirmed theories are approximately true and that the entities they propose actually populate the world, whether or not they have been observed. Others have argued that science achieves only frameworks for the prediction and manipulation of observable phenomena. These anti-realists argue that truth is a misleading concept when applied to empirical knowledge. Instead, focus should be on the empirical adequacy of scientific theories. This thesis argues that the fundamental distinction at issue, a division between true scientific theories and ones which are empirically adequate, is best explored in terms of methodological differences. In analogy with the realism debate, there are at least two methodological strategies. Rather than focusing on scientific theories as wholes, this thesis takes as units of analysis physical principles which are systematic empirical generalizations. The first possible strategy, the conservative, takes the assumption that the empirical adequacy of a theory in one domain serves as good evidence for such adequacy in other domains. This then motivates the application of the principle to new domains. The second strategy, the innovative, assumes that empirical adequacy in one domain does not justify the expectation of adequacy in other domains. New principles are offered as explanations in the new domain. The final part of the thesis is the application of this framework to two examples. 
In the first, Lorentz's use of the aether is reconstructed as an application of the conservative strategy with respect to the principles of Galilean relativity. The second example is a comparison between an application of the conservative strategy and TeVeS as an application of the innovative one.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-10
...-0019; Amdt. No. 1-67] RIN 2120- AK03 Removal of Category IIIa, IIIb, and IIIc Definitions; Confirmation... remove the definitions of Category IIIa, IIIb, and IIIc operations because the definitions are outdated..., entitled ``Removal of Category IIIa, IIIb, and IIIc Definitions'' (77 FR 9163). The direct final rule...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-03
...- AK03 Removal of Category IIIa, IIIb, and IIIc Definitions; Confirmation of Effective Date and Response... received on that direct final rule. In that document, the FAA proposed to remove the definitions of Category IIIa, IIIb, and IIIc operations because the definitions are outdated and no longer used for...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-19
... IRS issued final regulations under section 367 (2009 final regulations) concerning gain recognition... without the recognition of a corresponding amount of gain or income inclusion. Notice 2008-10 announced... the revised regulations would confirm that to the extent the appropriate amount of built-in gain in...
NASA Astrophysics Data System (ADS)
Francis, Olivier; Klein, Gilbert; Baumann, Henri; Dando, Nicolas; Tracey, Ray; Ullrich, Christian; Castelein, Stefaan; Hua, Hu; Kang, Wu; Chongyang, Shen; Songbo, Xuan; Hongbo, Tan; Zhengyuan, Li; Pálinkás, Vojtech; Kostelecký, Jakub; Mäkinen, Jaakko; Näränen, Jyri; Merlet, Sébastien; Farah, Tristan; Guerlin, Christine; Pereira Dos Santos, Franck; Le Moigne, Nicolas; Champollion, Cédric; Deville, Sabrina; Timmen, Ludger; Falk, Reinhard; Wilmes, Herbert; Iacovone, Domenico; Baccaro, Francesco; Germak, Alessandro; Biolcati, Emanuele; Krynski, Jan; Sekowski, Marcin; Olszak, Tomasz; Pachuta, Andrzej; Agren, Jonas; Engfeldt, Andreas; Reudink, René; Inacio, Pedro; McLaughlin, Daniel; Shannon, Geoff; Eckl, Marc; Wilkins, Tim; van Westrum, Derek; Billson, Ryan
2012-01-01
During November 2011 a EURAMET key comparison of absolute gravimeters was organized in the Underground Laboratory for Geodynamics in Walferdange, Luxemburg. The comparison assembled 22 participants coming from 16 countries and four different continents. The comparison was divided into two parts: a key comparison that included six National Metrology Institutes or Designated Institutes, and a pilot study including all participants. The global result given by the pilot study confirms that all instruments are absolutely coherent with each other. The results obtained in the key comparison confirm a good agreement between the NMI instruments. Finally, a link to ICAG-2009 shows also that the NMI gravimeters are stable in time. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. 
Results We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. Conclusions We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. PMID:28302594
Sánchez, Ariel G.; Grieb, Jan Niklas; Salazar-Albornoz, Salvador; ...
2016-09-30
The cosmological information contained in anisotropic galaxy clustering measurements can often be compressed into a small number of parameters whose posterior distribution is well described by a Gaussian. Here, we present a general methodology to combine these estimates into a single set of consensus constraints that encode the total information of the individual measurements, taking into account the full covariance between the different methods. We also illustrate this technique by applying it to combine the results obtained from different clustering analyses, including measurements of the signature of baryon acoustic oscillations and redshift-space distortions, based on a set of mock catalogues of the final SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS). Our results show that the region of the parameter space allowed by the consensus constraints is smaller than that of the individual methods, highlighting the importance of performing multiple analyses on galaxy surveys even when the measurements are highly correlated. Our paper is part of a set that analyses the final galaxy clustering data set from BOSS. The methodology presented here is used in Alam et al. to produce the final cosmological constraints from BOSS.
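Under the Gaussian approximation described above, the combination step reduces to a generalized least-squares average that uses the full joint covariance between the different methods. A minimal numpy sketch of that idea (the function name and the synthetic numbers are illustrative, not taken from the paper):

```python
import numpy as np

def consensus(estimates, joint_cov):
    """Generalized least-squares combination of m estimates of the same
    p-dimensional parameter vector, given the full (m*p x m*p) joint
    covariance including cross-covariances between methods."""
    m, p = len(estimates), len(estimates[0])
    x = np.concatenate(estimates)          # stacked estimates, shape (m*p,)
    A = np.tile(np.eye(p), (m, 1))         # maps the true vector to each copy
    Cinv = np.linalg.inv(joint_cov)
    cov = np.linalg.inv(A.T @ Cinv @ A)    # consensus covariance
    mean = cov @ A.T @ Cinv @ x            # consensus mean
    return mean, cov

# Two correlated scalar estimates of the same quantity:
mean, cov = consensus([np.array([1.0]), np.array([3.0])],
                      np.array([[1.0, 0.5],
                                [0.5, 1.0]]))
# mean[0] == 2.0; cov[0, 0] == 0.75, larger than the 0.5 that a naive
# uncorrelated average would claim.
```

The positive cross-covariance inflates the consensus variance relative to an uncorrelated average, which is exactly why accounting for the full covariance between methods matters.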
Fraser, Thomas W K; Khezri, Abdolrahman; Jusdado, Juan G H; Lewandowska-Sabat, Anna M; Henry, Theodore; Ropstad, Erik
2017-07-05
Alterations in zebrafish motility are used to identify neurotoxic compounds, but few have reported how methodology may affect results. To investigate this, we exposed embryos to bisphenol A (BPA) or tetrabromobisphenol A (TBBPA) before assessing larval motility. Embryos were maintained on a day/night cycle (DN) or in constant darkness, were reared in 96 or 24 well plates (BPA only), and behavioural tests were carried out at 96, 100, or 118 (BPA only) hours post fertilisation (hpf). We found that the prior photo-regime, larval age, and/or arena size influence behavioural outcomes in response to toxicant exposure. For example, methodology determined whether 10μM BPA induced hyperactivity, hypoactivity, or had no behavioural effect. Furthermore, the minimum effect concentration was not consistent between different methodologies. Finally, we observed a mechanism previously used to explain hyperactivity following BPA exposure does not appear to explain the hypoactivity observed following minor alterations in methodology. Therefore, we demonstrate how methodology can have notable implications on dose responses and behavioural outcomes in larval zebrafish motility following identical chemical exposures. As such, our results have significant consequences for human and environmental risk assessment. Copyright © 2017 Elsevier B.V. All rights reserved.
Using Random Forest Models to Predict Organizational Violence
NASA Technical Reports Server (NTRS)
Levine, Burton; Bobashev, Georgly
2012-01-01
We present a methodology to assess the proclivity of an organization to commit violence against nongovernment personnel. We fitted a Random Forest model using the Minority at Risk Organizational Behavior (MAROS) dataset. The MAROS data is longitudinal, so individual observations are not independent. We propose a modification to the standard Random Forest methodology to account for the violation of the independence assumption. We present the results of the model fit and an example of predicting violence for an organization; finally, we present a summary of the forest in a "meta-tree".
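One common way to handle the independence violation the authors describe is to resample at the level of the organization rather than the observation. A sketch of that idea with numpy and scikit-learn (the synthetic data, group structure, and function names are invented for illustration; the paper's actual modification may differ):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic longitudinal data: 40 organizations, 10 yearly rows each.
org = np.repeat(np.arange(40), 10)
X = rng.normal(size=(400, 5))
y = (X[:, 0] + rng.normal(scale=0.5, size=400) > 0).astype(int)

def cluster_forest(X, y, groups, n_trees=100, seed=1):
    """Random forest whose bootstrap resamples whole groups, so the
    repeated (non-independent) observations of one organization enter
    or leave a tree's training set together."""
    rs = np.random.default_rng(seed)
    uniq = np.unique(groups)
    trees = []
    for _ in range(n_trees):
        boot = rs.choice(uniq, size=len(uniq), replace=True)
        idx = np.concatenate([np.flatnonzero(groups == g) for g in boot])
        tree = DecisionTreeClassifier(max_features="sqrt",
                                      random_state=int(rs.integers(1 << 31)))
        trees.append(tree.fit(X[idx], y[idx]))
    return trees

forest = cluster_forest(X, y, org)
proba = np.mean([t.predict_proba(X)[:, 1] for t in forest], axis=0)
```

Because each tree's bootstrap keeps all yearly records of a sampled organization together, correlated rows from one group cannot inflate the apparent sample size within a single tree.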
2004-03-01
probabilistic by design. Finally, as the fragments disperse, fragment density decreases, and the probability of a fragment strike drops rapidly. Given the...Any PPE subjected to such testing needs to be exposed repeatedly to several mines in order to obtain a sufficient number of strikes. This will allow...velocity of each fragment, and the location of fragment strikes cannot be controlled precisely. This means that the same test must be repeated a
76 FR 3451 - Wage Methodology for the Temporary Non-agricultural Employment H-2B Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
...The Department of Labor (the Department or DOL) is amending its regulations governing the certification for the employment of nonimmigrant workers in temporary or seasonal non-agricultural employment. This Final Rule revises the methodology by which the Department calculates the prevailing wages to be paid to H-2B workers and United States (U.S.) workers recruited in connection with a temporary labor certification for use in petitioning the Department of Homeland Security (DHS) to employ a nonimmigrant worker in H-2B status.
A Case Study of Reverse Engineering Integrated in an Automated Design Process
NASA Astrophysics Data System (ADS)
Pescaru, R.; Kyratsis, P.; Oancea, G.
2016-11-01
This paper presents a design methodology which automates the generation of curves extracted from the point clouds obtained by digitizing physical objects. The methodology is demonstrated on a consumer-goods product, namely a footwear-type product with a complex, highly curved shape. The final result is the automated generation of wrapping curves, surfaces, and solids according to the characteristics of the customer's foot and to the preferences for the chosen model, which leads to the development of customized products.
Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures
NASA Technical Reports Server (NTRS)
Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.
1997-01-01
A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
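The ply-level check at the heart of such an analysis is conceptually simple. A minimal sketch of one of the criteria named above, the maximum strain criterion (the allowable values below are hypothetical placeholders, not from the paper):

```python
def max_strain_failure(strains, allowables):
    """Maximum strain criterion: a ply fails when any strain component
    exceeds its allowable. Both arguments are dicts keyed by component,
    e.g. 'e1' (fibre direction), 'e2' (transverse), 'g12' (in-plane shear)."""
    modes = [k for k, e in strains.items() if abs(e) > allowables[k]]
    return (len(modes) > 0), modes

# Hypothetical carbon/epoxy ply allowables (dimensionless strain):
allow = {"e1": 0.0105, "e2": 0.0058, "g12": 0.020}
failed, modes = max_strain_failure({"e1": 0.004, "e2": 0.0061, "g12": 0.012},
                                   allow)
# failed -> True, modes -> ["e2"]: a transverse (matrix) failure, which in a
# progressive failure analysis would trigger stiffness degradation of that
# ply before the next load increment.
```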
NASA Astrophysics Data System (ADS)
Deeying, J.; Asawarungsaengkul, K.; Chutima, P.
2018-01-01
This paper investigates the effect of laser solder jet bonding parameters on the solder joints in a Head Gimbal Assembly. Laser solder jet bonding uses a fiber laser to melt a solder ball in a capillary; the molten solder is then transferred to two bonding pads by nitrogen gas. Response surface methodology was used to investigate the effects of laser energy, wait time, nitrogen gas pressure, and focal position on the shear strength of the solder joints and on the change of pitch static attitude (PSA), and to establish reliable mathematical relationships between the laser soldering parameters and the desired responses. Multi-objective optimization was then conducted to determine the optimal process parameters that enhance the joint shear strength and minimize the change of PSA. A validation test confirms that the predicted values agree well with the actual values.
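The two-stage procedure (fit a second-order response surface to the experiments, then optimize a compromise between the responses) can be sketched as follows. The two coded factors, the synthetic data, and the simple weighted-sum compromise are illustrative stand-ins for the paper's actual four-factor design and optimization:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

def quad_design(x):
    """Full second-order model terms for two factors:
    1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2], axis=-1)

# Synthetic experiments in coded units (factors standing in for, e.g.,
# laser energy and focal position); responses: shear strength, |delta PSA|.
X = rng.uniform(-1, 1, size=(30, 2))
strength = 40 + 5*X[:, 0] - 6*X[:, 0]**2 - 3*X[:, 1]**2 + rng.normal(0, 0.3, 30)
psa = 1.0 + 0.8*X[:, 0]**2 + 0.5*X[:, 1] + rng.normal(0, 0.05, 30)

# Least-squares fit of the two response surfaces.
b_s, *_ = np.linalg.lstsq(quad_design(X), strength, rcond=None)
b_p, *_ = np.linalg.lstsq(quad_design(X), psa, rcond=None)

def objective(x, w=0.5):
    """Weighted compromise: maximize predicted strength, minimize PSA change."""
    d = quad_design(np.asarray(x))
    return -w * (d @ b_s) + (1 - w) * (d @ b_p)

res = minimize(objective, x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
```

In practice a desirability function usually replaces the fixed weight `w`, but the structure — quadratic surrogate models feeding a constrained optimizer — is the same.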
Hazmat transport: a methodological framework for the risk analysis of marshalling yards.
Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino
2007-08-17
A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to an extended case study. The results showed that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable role of these fixed installations in the overall risk associated with "hazmat" transportation.
NASA Astrophysics Data System (ADS)
McCrary-Dennis, Micah C. L.
Incorporating nanostructured functional constituents into polymers has become widespread in composite manufacturing processes and products. Carbon nanotubes (CNTs), with their heralded potential to enhance the properties of a carrier system, are driving many industrial and research efforts. The Displaced Foam Dispersion (DFD) methodology is a novel and effective approach to facilitating the incorporation of CNTs within fiber reinforced polymer composites (FRPCs). The methodology consists of six separate solubility phases that lead to the manufacture of CNT-FRPCs (also termed hybrid/multiscale composites). This study was primarily initiated to characterize the interaction parameters of nanomaterials (multiwall carbon nanotubes), polymers (polystyrene), and solvents (dimethyl formamide (DMF) and acetone) in the current paradigm of DFD materials manufacture. Secondly, we sought to illustrate the theoretical potential for the methodology to be used with other nanomaterial-polymer-solvent systems. Herein, the theory of Hansen solubility parameters (HSP) is employed to explain the combination parameters of the DFD constituents and to aid in the interpretation of the experimental results. The results provide quantitative values for the relative energy differences between each polymer-solvent system. Scanning Electron Microscopy (SEM) and Transmission Electron Microscopy (TEM) were used to characterize the multiwalled carbon nanotubes (MWCNTs) in each of the solubility stages; the characterization culminates with an indication of good dispersion potential in the final multiscale composite. Additionally, acetone absorption, evaporation mass loss, and retention are reported for the sorbed plasticized PS-CNT ("CNTaffy") nanocomposites, which successfully achieved loadings up through approximately 60 weight percent.
The findings indicate that as the CNT loading percentage increases, the acetone absorbency also increases, but the material's retention of acetone over time decreases. This directly influences the manufacturability of the porous polymer nanocomposite (P-PNC) in the DFD methodology. Localized interlaminar CNT enrichment was achieved at up to 60 wt.% loading within the P-PNC and verified by two-electrode electrical conductivity testing of the final multiscale composite. The electrical measurements at low nanomaterial weight percentages (approximately 0.15-2.5 wt.%) show a decreasing trend in resistivity, indicating that the material becomes increasingly conductive with increasing CNT loading. Finally, the mechanical results show evidence of toughness, increased strain to failure, and the potential for greater energy absorption.
DOT National Transportation Integrated Search
2017-06-01
This project developed a methodology to simulate and analyze roadway traffic patterns and expected penetration and timing of electric vehicles (EVs) with application directed toward the requirements for electric vehicle supply equipment (EVSE) si...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-05
... methodological explanations may be addressed to Marie L. Lihn or Peter B. Kahn, Economic and Market Analysis... not statistically significant. Rent reasonableness studies are not subject to the same constraints on...
DOT National Transportation Integrated Search
2005-07-29
This final report describes the national evaluation of the New Mexico Client Referral, Ridership, and Financial Tracking (CRRAFT) System. The evaluation methodology assessed twelve hypotheses related to the expected outcomes of CRRAFT. To assess the ...
Wind Resource and Feasibility Assessment Report for the Lummi Reservation
DOE Office of Scientific and Technical Information (OSTI.GOV)
DNV Renewables; J.C. Brennan & Associates, Inc.; Hamer Environmental L.P.
2012-08-31
This report summarizes the wind resource on the Lummi Indian Reservation (Washington State) and presents the methodology, assumptions, and final results of the wind energy development feasibility assessment, which included an assessment of biological impacts and noise impacts.
Roughness analysis of grade breaks at intersections : final report.
DOT National Transportation Integrated Search
1993-03-01
A method to analyze the roughness of grade breaks at highway intersections is proposed. Although there are a variety of instruments to physically measure the road roughness, there are no known methodologies to analyze profile roughness during the des...
Evaluation of life cycle impacts of intersection control type selection [final report].
DOT National Transportation Integrated Search
2016-05-05
The methodology in this report provides guidance for NCDOT conversions of two-way stop-controlled intersections to other intersection types to enhance the effective allocation of public funds. The findings of this project have been inc...
Evaluation of subgrade moduli for flexible pavements in Virginia : final report.
DOT National Transportation Integrated Search
1980-01-01
Advances in pavement design technology in recent years have led to more dependence on mechanistic approaches and less reliance on subjective design criteria. In Virginia, the tendency is toward a pavement design and evaluation methodology based on el...
DOT National Transportation Integrated Search
2011-08-01
The Backing Crash Countermeasures project, part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program, developed a basic methodological framework and computer-based simulation model to estimate the effectiv...
Innovative methods for calculation of freeway travel time using limited data : final report.
DOT National Transportation Integrated Search
2008-01-01
Travel time estimates created by processing simulated freeway loop detector data with the proposed method were compared with travel times reported by a VISSIM model. An improved methodology was proposed to estimate freeway corrido...
DOT National Transportation Integrated Search
1996-04-01
This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.
Addendum to final report, Optimizing traffic counting procedures.
DOT National Transportation Integrated Search
1987-01-01
The methodology described in entry 55-14 was used with 1980 data for 16 continuous count stations to determine periods that were stable throughout the year for different short counts. It was found that stable periods for short counts occurred mainly ...
MOLECULAR EPIDEMIOLOGY OF PRETERM DELIVERY: METHODOLOGY AND CHALLENGES. (R825818)
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Simulation analysis of route diversion strategies for freeway incident management : final report.
DOT National Transportation Integrated Search
1995-02-01
The purpose of this project was to investigate whether simulation models could be used as decision aids for defining traffic diversion strategies for effective incident management. A methodology was developed for using such a model to determine...
76 FR 20043 - Agency Information Collection Activities: New Collection, Comments Requested
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-11
... DEPARTMENT OF JUSTICE Federal Bureau of Investigation [OMB Number 1110-NEW] Agency Information... renewal: Final Disposition Report (R-84). The Department of Justice (DOJ), Federal Bureau of Investigation... validity of the methodology and assumptions used; [[Page 20044
Lean Six Sigma Applied to Ultrasound Guided Needle Biopsy in the Head and Neck.
Matt, Bruce H; Woodward-Hagg, Heather K; Wade, Christopher L; Butler, Penny D; Kokoska, Mimi S
2014-07-01
(1) Confirm the positive value stream of office-based ultrasound using Lean Six Sigma; (2) demonstrate how ultrasound reduces time to diagnosis, costs, patient inconvenience and travel, exposure to ionizing radiation, intravenous contrast, and laboratory tests. Case series with historical controls using chart review. Tertiary Veterans Administration Hospital (university-affiliated). Patients with a consult request or decision for ultrasound guided fine needle aspiration (USFNA) from 2006 to 2012. Process evaluation using Lean Six Sigma methodologies; years study conducted: 2006-2012; outcome measurements: type of diagnostic tests and imaging studies including CT scans with associated radiation exposure, time to preliminary and final cytopathologic diagnosis, episodes of patient travel. Value stream mapping prior to and after implementing office-based ultrasound confirmed the time from consult request or decision for USFNA to completion of the USFNA was reduced from a range of 0 to 286 days requiring a maximum of 17 steps to a range of 0 to 48 days, necessitating a maximum of only 9 steps. Office-based USFNA for evaluation of head and neck lesions reduced costs, time to diagnosis, risks and inconvenience to patients, radiation exposure, unnecessary laboratory tests, and patient complaints, while increasing staff satisfaction. In addition, office-based ultrasound also changed the clinical management of specific patients. Lean Six Sigma reduces waste and optimizes quality and accuracy in manufacturing. This is the first known application of Lean Six Sigma to office-based USFNA in the evaluation of head and neck lesions. The literature supports the value of office-based ultrasound to patients and health care systems. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2014.
Assays of homeopathic remedies in rodent behavioural and psychopathological models.
Bellavite, Paolo; Magnani, Paolo; Marzotto, Marta; Conforti, Anita
2009-10-01
The first part of this paper reviews the effects of homeopathic remedies on several models of anxiety-like behaviours developed and described in rodents. The existing literature in this field comprises some fifteen exploratory studies, often published in non-indexed and non-peer-reviewed journals. Only a few results have been confirmed by multiple laboratories, and concern Ignatia, Gelsemium, Chamomilla (in homeopathic dilutions/potencies). Nevertheless, there are some interesting results pointing to the possible efficacy of other remedies, and confirming a statistically significant effect of high dilutions of neurotrophic molecules and antibodies. In the second part of this paper we report some recent results obtained in our laboratory, testing Aconitum, Nux vomica, Belladonna, Argentum nitricum, Tabacum (all 5CH potency) and Gelsemium (5, 7, 9 and 30CH potencies) on mice using ethological models of behaviour. The test was performed using coded drugs and controls in double blind (operations and calculations). After an initial screening that showed all the tested remedies (except for Belladonna) to have some effects on the behavioural parameters (light-dark test and open-field test), but with high experimental variability, we focused our study on Gelsemium, and carried out two complete series of experiments. The results showed that Gelsemium had several effects on the exploratory behaviour of mice, which in some models were highly statistically significant (p < 0.001), in all the dilutions/dynamizations used, but with complex differences according to the experimental conditions and test performed. Finally, some methodological issues of animal research in this field of homeopathy are discussed. 
The "Gelsemium model" - encompassing experimental studies in vitro and in vivo from different laboratories and with different methods, including significant effects of its major active principle gelsemine - may play a pivotal role in investigations of other homeopathic remedies.
Parente, Joana; Pereira, Mário G; Tonini, Marj
2016-07-15
The present study focuses on the dependence of the space-time permutation scan statistics (STPSS) (1) on the input database's characteristics and (2) on the use of this methodology to assess changes in the fire regime due to different types of climate and fire management activities. Based on the very strong relationship between weather and fire incidence in Portugal, the detected clusters are interpreted in terms of the atmospheric conditions. Apart from being the country most affected by fires in the European context, Portugal meets all the conditions required to carry out this study, namely: (i) two long and comprehensive official datasets, i.e. the Portuguese Rural Fire Database (PRFD) and the National Mapping Burnt Areas (NMBA), based on ground and satellite measurements, respectively; (ii) the two types of climate (Csb in the north and Csa in the south) that characterize the Mediterranean basin regions most affected by fires also divide the mainland Portuguese area; and (iii) the national plan for the defence of the forest against fires was approved a decade ago, and it is now reasonable to assess its impacts. Results confirmed (1) the influence of the dataset's characteristics on the detected clusters, (2) the existence of two different fire regimes in the country promoted by the different types of climate, (3) the positive impacts of the fire prevention policy decisions, and (4) the ability of the STPSS to correctly identify clusters, regarding their number, location, and space-time size, in spite of eventual space and/or time splits of the datasets. Finally, the role of the weather on days when clustered fires were active was confirmed for the classes of small, medium and large fires. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Solari, Sebastián.; Egüen, Marta; Polo, María. José; Losada, Miguel A.
2017-04-01
Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
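The core loop of such an automatic selection — fit a generalized Pareto distribution (GPD) to the excesses over each candidate threshold and score the fit with the Anderson-Darling statistic — can be sketched with scipy. Choosing the minimum-A² candidate is a simplification of the paper's test-based rule, and the bootstrap layer that quantifies threshold uncertainty is omitted:

```python
import numpy as np
from scipy.stats import genpareto

def ad_statistic(excesses, c, scale):
    """Anderson-Darling A^2 for a GPD fitted to the excesses (location 0)."""
    z = np.sort(genpareto.cdf(excesses, c, loc=0, scale=scale))
    z = np.clip(z, 1e-12, 1 - 1e-12)            # guard the logarithms
    n = len(z)
    i = np.arange(1, n + 1)
    return -n - np.mean((2*i - 1) * (np.log(z) + np.log1p(-z[::-1])))

def select_threshold(x, candidates, min_excesses=30):
    """Return the (threshold, A^2) pair whose GPD fit to the excesses
    gives the smallest Anderson-Darling statistic."""
    best = None
    for u in candidates:
        exc = x[x > u] - u
        if len(exc) < min_excesses:
            continue
        c, _, scale = genpareto.fit(exc, floc=0)  # fix location at 0
        a2 = ad_statistic(exc, c, scale)
        if best is None or a2 < best[1]:
            best = (u, a2)
    return best
```

Wrapping `select_threshold` in a nonparametric bootstrap over `x` would give the distribution of selected thresholds, and hence the threshold's contribution to the uncertainty of high-return-period quantiles, in the spirit of the paper.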
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliott, Kenny B.; Joshi, Suresh M.; Walz, Joseph E.
1994-01-01
This paper describes the first experimental validation of an optimization-based integrated controls-structures design methodology for a class of flexible space structures. The Controls-Structures-Interaction (CSI) Evolutionary Model, a laboratory test bed at Langley, is redesigned based on the integrated design methodology with two different dissipative control strategies. The redesigned structure is fabricated, assembled in the laboratory, and experimentally compared with the original test structure. Design guides are proposed and used in the integrated design process to ensure that the resulting structure can be fabricated. Experimental results indicate that the integrated design requires greater than 60 percent less average control power (by thruster actuators) than the conventional control-optimized design while maintaining the required line-of-sight performance, thereby confirming the analytical findings about the superiority of the integrated design methodology. Amenability of the integrated design structure to other control strategies is considered and evaluated analytically and experimentally. This work also demonstrates the capabilities of the Langley-developed design tool CSI DESIGN which provides a unified environment for structural and control design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Disney, R.K.
1994-10-01
The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology, and the changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.
NASA Astrophysics Data System (ADS)
Abdeljabbar Kharrat, Nourhene; Plateaux, Régis; Miladi Chaabane, Mariem; Choley, Jean-Yves; Karra, Chafik; Haddar, Mohamed
2018-05-01
The present work tackles the modeling of multi-physics systems using a topological approach, proceeding with a new methodology that applies a topological modification to the structure of systems; a comparison with Magos' methodology is then made. Their common ground is the use of connectivity within systems. The comparison and analysis of the different types of modeling show the importance of the topological methodology through the integration of the topological modification into the topological structure of a multi-physics system. In order to validate this methodology, the case of a pogo stick is studied. The first step consists in generating a topological graph of the system. The connectivity step then takes into account the contact with the ground. In the last step of this research, the MGS language (Modeling of General System) is used to model the system through equations. Finally, the results are compared to those obtained with MODELICA. This proposed methodology may therefore be generalized to model multi-physics systems that can be considered as a set of local elements.
The communicative functions of final rises in Finnish intonation.
Ogden, Richard; Routarinne, Sara
2005-01-01
This paper considers the communicative function of final rises in Finnish conversational talk between pairs of teenage girls. Final rises are fairly common, occurring approximately twice a minute, predominantly on declaratives and in narrative sequences. We briefly consider the interplay between voice quality (known to be a marker of transition relevance) and rising intonation in Finnish. We argue that in narrative sequences, rising terminals manage two main interactional tasks: they provide a place for a coparticipant to mark recipiency, and they project more talk by the current speaker. Using a methodology which combines phonetic observation with conversation analysis, we demonstrate participants' orientation to these functions.
76 FR 11331 - New Animal Drugs for Minor Use and Minor Species; Confirmation of Effective Date
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 516 [Docket No. FDA-2010-N-0534] RIN 0910-AG58 New Animal Drugs for Minor Use and Minor Species; Confirmation of... direct final rule amends the regulations regarding new animal drugs for minor use and minor species (MUMS...
Xu, Stanley; Clarke, Christina L; Newcomer, Sophia R; Daley, Matthew F; Glanz, Jason M
2018-05-16
Vaccine safety studies are often electronic health record (EHR)-based observational studies. These studies often face significant methodological challenges, including confounding and misclassification of adverse events. Vaccine safety researchers use the self-controlled case series (SCCS) study design to handle confounding and employ medical chart review to ascertain cases that are identified using EHR data. However, for common adverse events, limited resources often make it impossible to adjudicate all adverse events observed in electronic data. In this paper, we considered four approaches for analyzing SCCS data with confirmation rates estimated from an internal validation sample: (1) observed cases, (2) confirmed cases only, (3) known confirmation rate, and (4) multiple imputation (MI). We conducted a simulation study to evaluate these four approaches using type I error rates, percent bias, and empirical power. Our simulation results suggest that when misclassification of adverse events is present, approaches such as observed cases, confirmed cases only, and known confirmation rate may inflate the type I error, yield biased point estimates, and affect statistical power. The multiple imputation approach accounts for the uncertainty of confirmation rates estimated from an internal validation sample and yields a proper type I error rate, a largely unbiased point estimate, a proper variance estimate, and adequate statistical power. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
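A minimal sketch of the multiple-imputation idea for a simplified one-risk-window SCCS follows. The counts, the Beta prior, and the binomial-thinning imputation model are illustrative assumptions, not the authors' exact algorithm; only the overall structure (draw a confirmation rate from the validation sample, impute true cases, combine with Rubin's rules) reflects the approach described:

```python
import numpy as np

rng = np.random.default_rng(7)

def mi_sccs(n_risk, n_ctrl, t_risk, t_ctrl, v_conf, v_total, M=200):
    """Multiple-imputation estimate of the log incidence rate ratio in a
    one-window SCCS with observed (unadjudicated) counts n_risk/n_ctrl
    over person-times t_risk/t_ctrl, when a validation subsample
    (v_conf confirmed of v_total reviewed) informs the confirmation rate.
    Returns the pooled estimate and its Rubin's-rules total variance."""
    est, var = [], []
    for _ in range(M):
        p = rng.beta(v_conf + 1, v_total - v_conf + 1)   # posterior draw
        a = rng.binomial(n_risk, p)                      # imputed true cases
        b = rng.binomial(n_ctrl, p)
        if a == 0 or b == 0:
            a, b = a + 0.5, b + 0.5                      # continuity correction
        est.append(np.log((a / t_risk) / (b / t_ctrl)))
        var.append(1 / a + 1 / b)                        # approx. Poisson variance
    est, var = np.array(est), np.array(var)
    W, B = var.mean(), est.var(ddof=1)                   # within / between
    return est.mean(), W + (1 + 1 / M) * B               # Rubin's rules

logrr, total_var = mi_sccs(n_risk=60, n_ctrl=90, t_risk=30, t_ctrl=90,
                           v_conf=40, v_total=50)
```

The between-imputation term `B` is what propagates the uncertainty in the estimated confirmation rate into the final variance, which the "known confirmation rate" approach ignores.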
Validation of Technical Recommendations. Final Report. ISSOE Managing Student Progress.
ERIC Educational Resources Information Center
Ridley, Dennis; And Others
This report is organized to provide a complete, logical presentation of the major steps taken in the third phase of research on dynamics of dissemination of the Instructional Support System for Occupational Education (ISSOE). It reports on the confirmation and validation of the Phase II Final Report on Dissemination Issues and the Phase III…
Influence of Gender and Other Factors to Final Grade
ERIC Educational Resources Information Center
Domeova, Ludmila; Jindrova, Andrea; Fejfar, Jiri
2015-01-01
The study focused on the relations between the partial evaluation and the final grade. The investigation has been done on a group of 269 students of the Czech University of Life Sciences in Prague, in the course of Mathematical Methods, who have to go through a strictly defined evaluation scheme. The results of statistical analysis confirmed that…
1994-11-01
...concrete is also presented. Finally, the model is extended to include penetration into multiple layers of different target materials. Comparisons are...
ERIC Educational Resources Information Center
Leaman, Lori Hostetler; Flanagan, Toni Michele
2013-01-01
This article draws from situated learning theory, teacher education research, and the authors' collaborative self-study to propose a teacher education pedagogy that may help to bridge the theory-into-practice gap for preservice teachers. First, we review the Interstate Teacher Assessment and Support Consortium standards to confirm the call for…
ERIC Educational Resources Information Center
Mahmoud, Ali Bassam; Khalifa, Bayan
2015-01-01
Purpose: The purpose of this paper is to confirm the factorial structure of SERVPERF based on an exploration of its dimensionality among Syrian universities' students. It also aimed at assessing the perceived service quality offered at these universities. Design/methodology/approach: A cross-sectional survey was conducted targeting students at…
ERIC Educational Resources Information Center
Miller, Dana L.
The external audit is a way of assessing the trustworthiness of a study, attesting to its dependability from a methodological standpoint and to its confirmability by reviewing the data, analysis, and interpretations and assessing whether the findings represent the data accurately. This paper discusses issues in the audit process, drawing on data…
ERIC Educational Resources Information Center
Malik, Muhammad Shaukat; Kanwal, Maria
2018-01-01
Purpose: The purpose of this paper is to investigate empirically impacts of organizational knowledge-sharing practices (KSP) on employees' job satisfaction (JS), interpersonal adaptability (IA) and learning commitment (LC). Indirect effects of KSP on JS are also confirmed through mediating factors (LC and IA). Design/methodology/approach:…
A multiresidue pesticide methodology has been studied and results for acidics are reported here with base/neutral to follow. This work studies a literature procedure as a possible general approach to many pesticides and potentially other analytes that are considered to be liquid...
Teachers' Perceptions of Employment-Related Problems: A Survey of Teachers in Two States.
ERIC Educational Resources Information Center
Cutrer, Susan S.; Daniel, Larry G.
This study was conducted to determine the degree to which a randomly selected sample of teachers in Mississippi and Louisiana (N=291) experience various types of work-related problems. It provides an opportunity to either confirm or deny the findings of previous studies, many of them limited by various methodological problems. Data were collected…
2013-01-01
Background: Although desperate need and drug counterfeiting are linked in developing countries, little research has been carried out to address this link, and there is a lack of proper tools and methodology. This study addresses the need for a new methodological approach by developing a scale to aid in understanding the demand side of drug counterfeiting in a developing country. Methods: The study presents a quantitative, non-representative survey conducted in Sudan. A face-to-face structured interview survey methodology was employed to collect the data from the general population (people in the street) in two phases: pilot (n = 100) and final survey (n = 1003). Data were analyzed by examining means, variances, squared multiple correlations, item-to-total correlations, and the results of an exploratory factor analysis and a confirmatory factor analysis. Results: As an approach to scale purification, internal consistency was examined and improved. The scale was reduced from 44 to 41 items and Cronbach's alpha improved from 0.818 to 0.862. Finally, scale items were assessed. The result was an eleven-factor solution. Convergent and discriminant validity were demonstrated. Conclusion: The results of this study indicate that the "Consumer Behavior Toward Counterfeit Drugs Scale" is a valid, reliable measure with a solid theoretical base. Ultimately, the study offers public health policymakers a valid measurement tool and, consequently, a new methodological approach with which to build a better understanding of the demand side of counterfeit drugs and to develop more effective strategies to combat the problem. PMID:24020730
Alfadl, Abubakr A; Ibrahim, Mohamed Izham b Mohamed; Hassali, Mohamed Azmi Ahmad
2013-09-11
Although desperate need and drug counterfeiting are linked in developing countries, little research has been carried out to address this link, and there is a lack of proper tools and methodology. This study addresses the need for a new methodological approach by developing a scale to aid in understanding the demand side of drug counterfeiting in a developing country. The study presents a quantitative, non-representative survey conducted in Sudan. A face-to-face structured interview survey methodology was employed to collect the data from the general population (people in the street) in two phases: pilot (n = 100) and final survey (n = 1003). Data were analyzed by examining means, variances, squared multiple correlations, item-to-total correlations, and the results of an exploratory factor analysis and a confirmatory factor analysis. As an approach to scale purification, internal consistency was examined and improved. The scale was reduced from 44 to 41 items and Cronbach's alpha improved from 0.818 to 0.862. Finally, scale items were assessed. The result was an eleven-factor solution. Convergent and discriminant validity were demonstrated. The results of this study indicate that the "Consumer Behavior Toward Counterfeit Drugs Scale" is a valid, reliable measure with a solid theoretical base. Ultimately, the study offers public health policymakers a valid measurement tool and, consequently, a new methodological approach with which to build a better understanding of the demand side of counterfeit drugs and to develop more effective strategies to combat the problem.
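The internal-consistency purification step described here (dropping items until Cronbach's alpha improves, as in the reduction from 44 to 41 items) can be sketched as below. The data and the greedy drop rule are illustrative assumptions, not the authors' procedure:

```python
def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of columns (one list of scores per item)."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]   # per-respondent totals
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

def purify(items, min_items=2):
    """Greedy purification: drop an item whenever 'alpha if item deleted' beats alpha."""
    items = list(items)
    improved = True
    while improved and len(items) > min_items:
        improved = False
        base = cronbach_alpha(items)
        for i in range(len(items)):
            reduced = items[:i] + items[i + 1:]
            if cronbach_alpha(reduced) > base:
                items, improved = reduced, True
                break
    return items, cronbach_alpha(items)

# Synthetic 4-item, 10-respondent example: three coherent items plus one noise item.
items = [
    [3, 4, 5, 2, 1, 4, 5, 3, 2, 4],
    [2, 4, 5, 3, 1, 4, 4, 3, 2, 5],
    [3, 5, 4, 2, 2, 5, 5, 2, 1, 4],
    [1, 5, 1, 5, 1, 5, 1, 5, 1, 5],   # poorly fitting item
]
kept, alpha = purify(items)
```

By construction the greedy loop only ever drops an item when doing so raises alpha, mirroring the "examined and improved" purification reported in the abstract.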
DOT National Transportation Integrated Search
2000-12-01
The current performance-related specifications (PRS) methodology has been under development by the Federal : Highway Administration (FHWA) for several years and has now reached a level at which it can be implemented by : State highway agencies. PRS f...
Development of the Bicycle Compatibility Index: A Level of Service Concept, Final Report
DOT National Transportation Integrated Search
1998-12-01
Presently, there is no methodology widely accepted by engineers, planners, or bicycle coordinators that will allow them to determine how compatible a roadway is for allowing efficient operation of both bicycles and motor vehicles. Determining how exi...
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
de Paiva, Anderson Paulo
2018-01-01
This research evaluates the influence of the Brazilian accreditation methodology on the sustainability of organizations. Critical factors for implementing accreditation were also examined, including measurement of the relationships established between these factors and organizational sustainability. The study was developed using a survey methodology applied in organizations accredited by the ONA (National Accreditation Organization); 288 responses were received from top-level managers. The quantitative data of the measurement models were analyzed with factor analysis based on principal components. The final model was evaluated using confirmatory factor analysis and structural equation modeling techniques. The results of the research are vital for defining the factors that interfere in accreditation processes, providing a better understanding for accredited organizations and for Brazilian accreditation. PMID:29599939
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, Paolo; Theiler, C.; Fasoli, A.
A methodology for plasma turbulence code validation is discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The present work extends the analysis carried out in a previous paper [P. Ricci et al., Phys. Plasmas 16, 055703 (2009)] where the validation observables were introduced. Here, it is discussed how to quantify the agreement between experiments and simulations with respect to each observable, how to define a metric to evaluate this agreement globally, and, finally, how to assess the quality of a validation procedure. The methodology is then applied to the simulation of the basic plasma physics experiment TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulation models.
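A global agreement metric of the kind described (a per-observable normalized error combined into a weighted composite) might be sketched as follows. The tanh mapping and the weighting scheme are illustrative assumptions, not the exact metric of the paper:

```python
import math

def observable_agreement(sim, exp, err_sim, err_exp):
    """Disagreement level for one observable, mapped smoothly into [0, 1]."""
    d = abs(sim - exp) / math.sqrt(err_sim ** 2 + err_exp ** 2)
    # ~0 when the difference lies within the combined uncertainty, ~1 far outside it.
    return (math.tanh(d - 1) + 1) / 2

def composite_metric(entries):
    """entries: (sim, exp, err_sim, err_exp, weight) per observable.
    Returns chi in [0, 1]; lower means better global agreement."""
    num = sum(w * observable_agreement(s, e, es, ee) for s, e, es, ee, w in entries)
    return num / sum(w for *_, w in entries)

chi_good = composite_metric([(1.0, 1.0, 0.1, 0.1, 1.0)])   # simulation matches experiment
chi_bad = composite_metric([(2.0, 1.0, 0.1, 0.1, 1.0)])    # far outside the error bars
```

In the paper's framework the weights would encode how directly each observable is measured and modeled; here they are free parameters of the sketch.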
NASA Technical Reports Server (NTRS)
Hall, Edward; Magner, James
2011-01-01
This report is provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: New ATM Requirements-Future Communications, C-Band and L-Band Communications Standard Development, and was based on direction provided by FAA project-level agreements for New ATM Requirements-Future Communications. Task 7 included two subtasks. Subtask 7-1 addressed C-band (5091- to 5150-MHz) airport surface data communications standards development, systems engineering, test bed and prototype development, and tests and demonstrations to establish operational capability for the Aeronautical Mobile Airport Communications System (AeroMACS). Subtask 7-2 focused on systems engineering and development support of the L-band digital aeronautical communications system (L-DACS). Subtask 7-1 consisted of two phases. Phase I included development of AeroMACS concepts of use, requirements, architecture, and initial high-level safety risk assessment. Phase II builds on Phase I results and is presented in two volumes. Volume I is devoted to concepts of use, system requirements, and architecture, including AeroMACS design considerations. Volume II (this document) describes an AeroMACS prototype evaluation and presents final AeroMACS recommendations. This report also describes airport categorization and channelization methodologies. The purposes of the airport categorization task were (1) to facilitate initial AeroMACS architecture designs and enable budgetary projections by creating a set of airport categories based on common airport characteristics and design objectives, and (2) to offer high-level guidance to potential AeroMACS technology and policy development sponsors and service providers. A channelization plan methodology was developed because a common global methodology is needed to assure seamless interoperability among diverse AeroMACS services potentially supplied by multiple service providers.
Cocchioni, M; Scuri, S; Morichetti, L; Petrelli, F; Grappasonni, I
2006-01-01
The article underlines the fundamental importance of protecting and promoting environmental quality for human health. The evolution of fluvial monitoring techniques is traced from chemical and bacteriological analysis to the Fluvial Functionality Index (I.F.F.). This evolution is important because it reflects a methodological and cultural maturation, a shift from an anthropocentric toward an ecocentric vision. The goal of this ecological vision is to re-establish the ecological functionality of rivers, abandoning the consumerist view of water as merely a usable resource. The importance of correct river monitoring is confirmed, although the preventive approach remains the priority.
The use of carrier RNA to enhance DNA extraction from microfluidic-based silica monoliths.
Shaw, Kirsty J; Thain, Lauren; Docker, Peter T; Dyer, Charlotte E; Greenman, John; Greenway, Gillian M; Haswell, Stephen J
2009-10-12
DNA extraction was carried out on silica-based monoliths within a microfluidic device. Solid-phase DNA extraction methodology was applied, in which the DNA binds to silica in the presence of a chaotropic salt, such as guanidine hydrochloride, and is eluted in a low ionic strength solution, such as water. The addition of poly-A carrier RNA to the chaotropic salt solution resulted in a marked increase in the effective amount of DNA that could be recovered (25 ng) compared to the absence of RNA (5 ng) using the silica-based monolith. These findings confirm that techniques utilising nucleic acid carrier molecules can enhance DNA extraction methodologies in microfluidic applications.
Design consideration of resonance inverters with electro-technological application
NASA Astrophysics Data System (ADS)
Hinov, Nikolay
2017-12-01
This study presents design considerations for resonance inverters with electro-technological applications. The methodology presented here is the result of the author's investigations and analyses of different types and operating regimes of resonance inverters. Schemes of resonant inverters without inverse diodes are considered. The first-harmonic method is used in the analysis and design; for inverters with electro-technological applications, this method gives very good accuracy without requiring a complex and heavy mathematical apparatus. The proposed methodology is easy to use and is suitable for training students in power electronics. The results are confirmed by simulation and by experiments on physical prototypes.
Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.
Stern, Cindy; Chur-Hansen, Anna
2013-02-27
This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.
Manganelli, Joe; Threatt, Anthony; Brooks, Johnell O; Healy, Stan; Merino, Jessica; Yanik, Paul; Walker, Ian; Green, Keith
2014-01-01
This article presents the results of a qualitative study that confirmed, classified, and prioritized user needs for the design of a more useful, usable, and actively assistive over-the-bed table. Manganelli et al. (2014) generated a list of 74 needs for use in developing an actively assistive over-the-bed table. This present study assesses the value and importance of those needs. Fourteen healthcare subject matter experts and eight research and design subject matter experts engaged in a participatory and iterative research and design process. A mixed methods qualitative approach used methodological triangulation to confirm the value of the findings and ratings to establish importance. Open and closed card sorts and a Delphi study were used. Data analysis methods included frequency analysis, content analysis, and a modified Kano analysis. A table demonstrating the needs that are of high importance to both groups of subject matter experts and classification of the design challenges each represents was produced. Through this process, the list of 74 needs was refined to the 37 most important need statements for both groups. Designing a more useful, usable, and actively assistive over-the-bed table is primarily about the ability to position it optimally with respect to the user for any task, as well as improving ease of use and usability. It is also important to make explicit and discuss the differences in priorities and perspectives demonstrated between research and design teams and their clients. © 2014 Vendome Group, LLC.
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted at the input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler's type II model rainfall, are no longer sufficient. Urgent clarification is needed regarding a methodology for standardized rainfall hyetographs that takes into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of a collection of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs achieved as a final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
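The classification of standardized (dimensionless) hyetographs by cluster analysis can be sketched as below. The resampling scheme and plain k-means are stand-ins for the authors' actual procedure, and the storm data are synthetic:

```python
import random

def mass_curve(hyetograph, points=10):
    """Standardize a rainfall hyetograph into a dimensionless cumulative mass curve."""
    total = sum(hyetograph)
    cum, s = [], 0.0
    for h in hyetograph:
        s += h
        cum.append(s / total)
    n = len(cum)
    # Resample cumulative fraction at `points` fixed relative time positions.
    return [cum[min(n - 1, int(round((i + 1) / points * n)) - 1)] for i in range(points)]

def kmeans(vectors, k, iters=50, seed=0):
    """Plain Lloyd k-means on equal-length vectors."""
    rng = random.Random(seed)
    centers = rng.sample(vectors, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in vectors:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centers[c])))
            groups[j].append(v)
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

front = mass_curve([5, 3, 1, 1], points=4)   # front-loaded storm
back = mass_curve([1, 1, 3, 5], points=4)    # back-loaded storm
```

A front-loaded storm accumulates most of its depth early, so its mass curve sits above a back-loaded one; clustering these curves groups storms by temporal dynamics regardless of total depth.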
Methodological issues in the study of violence against women
Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen
2007-01-01
The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113
Simundic, Ana-Maria; Nikolac, Nora; Topic, Elizabeta
2009-01-01
The aims of this article are to evaluate the methodological quality of genetic association studies on inherited thrombophilia published from 2003 to 2005, to identify the most common mistakes made by the authors of those studies, and to examine whether the overall quality of an article correlates with the quality of the journal. Articles were evaluated by 2 independent reviewers using a checklist of 16 items. A total of 58 eligible studies were identified. The average total score was 7.59 +/- 1.96. Total article score correlated only weakly with the journal impact factor (r = 0.3971; 95% confidence interval [CI], 0.1547-0.5944; P = .002). Total score did not differ across years (P = .624). It is concluded that the methodological quality of genetic association studies is not optimal and does not strongly depend on the quality of the journal. Journals should adopt methodological criteria for reporting genetic association studies, and editors should encourage authors to adhere strictly to those criteria.
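The reported confidence interval for the correlation can be reproduced with the Fisher z-transformation. The helpers below are a generic sketch (not the authors' code); with r = 0.3971 and n = 58 studies, they return the 0.1547 to 0.5944 interval quoted in the abstract:

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def fisher_ci(r, n, z_crit=1.959964):
    """95% confidence interval for a correlation via the Fisher z-transformation."""
    z = math.atanh(r)                 # transform r to an approximately normal scale
    se = 1.0 / math.sqrt(n - 3)       # standard error of z
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

lo, hi = fisher_ci(0.3971, 58)        # the 58 eligible studies
```

Because the interval excludes zero (and P = .002), the correlation is statistically significant, though modest in size.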
An enhanced methodology for spacecraft correlation activity using virtual testing tools
NASA Astrophysics Data System (ADS)
Remedia, Marcello; Aglietti, Guglielmo S.; Appolloni, Matteo; Cozzani, Alessandro; Kiley, Andrew
2017-11-01
Test planning and post-test correlation activity have been issues of growing importance in the last few decades and many methodologies have been developed to either quantify or improve the correlation between computational and experimental results. In this article the methodologies established so far are enhanced with the implementation of a recently developed procedure called Virtual Testing. In the context of fixed-base sinusoidal tests (commonly used in the space sector for correlation), there are several factors in the test campaign that affect the behaviour of the satellite and are not normally taken into account when performing analyses: different boundary conditions created by the shaker's own dynamics, non-perfect control system, signal delays etc. All these factors are the core of the Virtual Testing implementation, which will be thoroughly explained in this article and applied to the specific case of Bepi-Colombo spacecraft tested on the ESA QUAD Shaker. Correlation activity will be performed in the various stages of the process, showing important improvements observed after applying the final complete methodology.
MetaCAA: A clustering-aided methodology for efficient assembly of metagenomic datasets.
Reddy, Rachamalla Maheedhar; Mohammed, Monzoorul Haque; Mande, Sharmila S
2014-01-01
A key challenge in analyzing metagenomics data pertains to assembly of sequenced DNA fragments (i.e. reads) originating from various microbes in a given environmental sample. Several existing methodologies can assemble reads originating from a single genome. However, these methodologies cannot be applied for efficient assembly of metagenomic sequence datasets. In this study, we present MetaCAA - a clustering-aided methodology which helps in improving the quality of metagenomic sequence assembly. MetaCAA initially groups sequences constituting a given metagenome into smaller clusters. Subsequently, sequences in each cluster are independently assembled using CAP3, an existing single genome assembly program. Contigs formed in each of the clusters along with the unassembled reads are then subjected to another round of assembly for generating the final set of contigs. Validation using simulated and real-world metagenomic datasets indicates that MetaCAA aids in improving the overall quality of assembly. A software implementation of MetaCAA is available at https://metagenomics.atc.tcs.com/MetaCAA. Copyright © 2014 Elsevier Inc. All rights reserved.
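The clustering-aided idea (group metagenomic reads before assembly) can be illustrated with a toy k-mer-based grouping. This union-find sketch is not MetaCAA's algorithm; in the real pipeline each cluster is handed to CAP3 for assembly and the resulting contigs are reassembled:

```python
def kmers(seq, k=8):
    """Set of all k-mers in a read."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def cluster_reads(reads, k=8, min_shared=2):
    """Group reads that share at least `min_shared` k-mers, using union-find.
    O(n^2) pairwise comparison; a toy stand-in for a scalable clustering step."""
    parent = list(range(len(reads)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    profiles = [kmers(r, k) for r in reads]
    for i in range(len(reads)):
        for j in range(i + 1, len(reads)):
            if len(profiles[i] & profiles[j]) >= min_shared and find(i) != find(j):
                parent[find(i)] = find(j)

    clusters = {}
    for i, read in enumerate(reads):
        clusters.setdefault(find(i), []).append(read)
    return list(clusters.values())

reads = [
    "ACGTACGTACGTAAAAAAA",   # overlaps the next read
    "ACGTACGTACGTTTTTTTT",
    "GGGGCCCCGGGGCCCCGGG",   # unrelated read from a different "genome"
]
clusters = cluster_reads(reads)
```

The two overlapping reads fall into one cluster and the unrelated read into another, so each cluster can then be assembled independently, which is the core of the clustering-aided strategy.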
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castillo, H.
1982-01-01
The Government of Costa Rica has stated the need for a formal procedure for the evaluation and categorization of an environmental program. Methodological studies were prepared as the basis for the development of the general methodology by which each government or institution can adapt and implement the procedure. The methodology was established by using different techniques according to their contribution to the evaluation process, such as: Systemic Approach, Delphi, and Saaty Methods. The methodology consists of two main parts: 1) evaluation of the environmental aspects by using different techniques; 2) categorization of the environmental aspects by applying the methodology to the Costa Rican environmental affairs using questionnaire answers supplied by experts both inside and outside of the country. The second part of the research includes Appendixes in which is presented general information concerning institutions related to environmental affairs; description of the methods used; results of the current status evaluation and its scale; the final scale of categorization; and the questionnaires and a list of experts. The methodology developed in this research will have a beneficial impact on environmental concerns in Costa Rica. As a result of this research, a Commission Office of Environmental Affairs, providing links between consumers, engineers, scientists, and the Government, is recommended. Also there is significant potential use of this methodology in developed countries for a better balancing of the budgets of major research programs such as cancer, heart, and other research areas.
2016-08-05
This final rule updates the payment rates used under the prospective payment system (PPS) for skilled nursing facilities (SNFs) for fiscal year (FY) 2017. In addition, it specifies a potentially preventable readmission measure for the Skilled Nursing Facility Value-Based Purchasing Program (SNF VBP), and implements requirements for that program, including performance standards, a scoring methodology, and a review and correction process for performance information to be made public, aimed at implementing value-based purchasing for SNFs. Additionally, this final rule includes additional policies and measures in the Skilled Nursing Facility Quality Reporting Program (SNF QRP). This final rule also responds to comments on the SNF Payment Models Research (PMR) project.
2015-08-06
This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2016 as required by the statute. As required by section 1886(j)(5) of the Act, this rule includes the classification and weighting factors for the IRF PPS's case-mix groups and a description of the methodologies and data used in computing the prospective payment rates for FY 2016. This final rule also finalizes policy changes, including the adoption of an IRF-specific market basket that reflects the cost structures of only IRF providers, a 1-year phase-in of the revised wage index changes, a 3-year phase-out of the rural adjustment for certain IRFs, and revisions and updates to the quality reporting program (QRP).
A Bio-Inspired Polymeric Gradient Refractive Index (GRIN) Human Eye Lens
2012-11-19
...confirmation of the desired aspheric surface shape. Furthermore, the wavefronts of the aspheric posterior GRIN and PMMA lenses were measured and compared with a homogeneous PMMA lens of identical geometry. Finally, the anterior and posterior GRIN lenses were assembled into a bio-inspired GRIN...
McMurray, Josephine; McNeil, Heather; Lafortune, Claire; Black, Samantha; Prorok, Jeanette; Stolee, Paul
2016-01-01
To identify key dimensions of patients' experience across the rehabilitative care system and to recommend a framework to develop survey items that measure the rehabilitative care experience. Data were sourced from a literature review that searched MEDLINE (PubMed), CINAHL (Ebsco), and PsycINFO (APA PsycNET) databases from 2004 to 2014, the reference lists of the final accepted articles, and hand searches of relevant journals. Four reviewers performed the screening process on 2472 articles; 33 were included for analysis. Interrater reliability was confirmed through 2 rounds of title review and 1 round of abstract review, with an average κ score of .69. The final sample of 33 accepted articles was imported into a qualitative data analysis software application. Multiple levels of coding and a constant comparative methodology generated 6 themes. There were 502 discrete survey questions measuring patient experience, categorized using the following dimensions: rehabilitative care ecosystem, client and informal caregiver engagement, patient and health care provider relation, pain and functional status, group and individual identity, and open ended. The most common survey questions examine the care delivery ecosystem (37%), the engagement of clients and their informal caregivers (24.9%), and the quality of relations between providers and patients (21.7%). Examination of patients' functional status and management of pain accounted for 15.3% of the instruments' questions. Currently available instruments and questions that measure patients' experience in rehabilitative care are unable to assess the performance of rehabilitative delivery systems that aspire to integrate care across the continuum. However, question panels derived from our 6 key themes may measure the key concepts that define rehabilitative care and facilitate measurement of patient experience at the system level. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
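The interrater agreement statistic reported in this abstract (an average κ of .69) can be computed as below. This two-rater, nominal-label form of Cohen's kappa is a generic sketch, not the authors' exact calculation, which averaged κ across review rounds:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items (nominal labels)."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # Observed agreement: proportion of items both raters labeled identically.
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in labels)
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa discounts the agreement two raters would reach by chance, which is why it is preferred over raw percent agreement for screening decisions like include/exclude.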
Optimization of photo-Fenton process for the treatment of prednisolone.
Díez, Aida María; Ribeiro, Ana Sofia; Sanromán, Maria Angeles; Pazos, Marta
2018-03-29
Prednisolone is a widely prescribed synthetic glucocorticoid and is stated to be toxic to a number of non-target aquatic organisms. Its extensive consumption generates environmental concern due to its detection in wastewater samples at concentrations ranging from ng/L to μg/L, which calls for the application of suitable degradation processes. Regarding the available treatment options, advanced oxidation processes (AOPs) are presented as a viable alternative. In this work, a comparison in terms of pollutant removal and energetic efficiency was carried out between different AOPs: Fenton (F), photo-Fenton (UV/F), photolysis (UV), and hydrogen peroxide/photolysis (UV/H2O2). A light-emitting diode (LED) was the selected source of UV radiation. The UV/F process showed the best performance, reaching high levels of both degradation and mineralization with low energy consumption. Its optimization was conducted; the operational parameters were the iron and H2O2 concentrations and the working volume. Using the response surface methodology with the Box-Behnken design, the effects of the independent variables and their interactions on the process response were effectively evaluated. Different responses were analyzed, taking into account prednisolone removal (TOC and drug abatement) and the associated energy consumption. The obtained model showed an improvement of the UV/F process when treating smaller volumes and when adding high concentrations of H2O2 and Fe2+. The validation of this model was successfully carried out, with only 5% discrepancy between the model and the experimental results. The performance of the process with a real wastewater matrix was also tested, achieving complete mineralization and detoxification after 8 h. In addition, prednisolone degradation products were identified. Finally, the low energy consumption confirmed the viability of the process.
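The Box-Behnken design and quadratic response-surface fit described here can be sketched for three coded factors (e.g., iron concentration, H2O2 concentration, working volume). The coefficients below are synthetic placeholders, not the study's fitted model:

```python
def bbd3(center_runs=3):
    """Box-Behnken design for three factors in coded units (-1, 0, +1)."""
    pts = []
    for a in (-1, 1):
        for b in (-1, 1):
            pts += [(a, b, 0), (a, 0, b), (0, a, b)]   # 12 edge midpoints
    return pts + [(0, 0, 0)] * center_runs             # replicated center runs

def quad_features(x1, x2, x3):
    """Full quadratic model terms: intercept, linear, square, interaction."""
    return [1.0, x1, x2, x3, x1*x1, x2*x2, x3*x3, x1*x2, x1*x3, x2*x3]

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_quadratic(points, ys):
    """Least-squares fit of the quadratic model via the normal equations."""
    X = [quad_features(*p) for p in points]
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * y for row, y in zip(X, ys)) for i in range(k)]
    return solve(XtX, Xty)

true = [70.0, 6.0, 4.0, -3.0, -5.0, -2.0, -1.0, 1.5, 0.5, -0.8]  # synthetic model
design = bbd3()
ys = [sum(c * f for c, f in zip(true, quad_features(*p))) for p in design]
coeffs = fit_quadratic(design, ys)
```

With 15 runs the design supports all 10 terms of the quadratic model, which is what lets the study quantify both main effects and interactions of iron, H2O2, and volume on the removal response.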
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-10
... surrogate value for its lime input. [cir] Iceman Group alleges the Department made a clerical error by... results to use either the labor rate methodology announced on June 21, 2011, or to value lime using any...
Beginning Postsecondary Students Longitudinal Study: 1996-2001 (BPS:1996/2001) Methodology Report.
ERIC Educational Resources Information Center
Wine, Jennifer S.; Heuer, Ruth E.; Wheeless, Sara C.; Francis, Talbric L.; Franklin, Jeff W.; Dudley, Kristin M.
2002-01-01
Describes the procedures and results of the full-scale implementation of the final follow-up interview for the Beginning Postsecondary Students Longitudinal Study 1996/2001. This study follows a cohort first interviewed in 1996, in their first year of postsecondary education. (SLD)
Enacting Post-Reflexive Teacher Education
ERIC Educational Resources Information Center
Vagle, Mark D.; Monette, Rachel; Thiel, Jaye Johnson; Wester-Neal, Katie
2017-01-01
The purpose of this article is to re-conceptualize Schön's call for a phenomenology of practice--moving away from reflection and towards "post-reflexion"--by explicitly drawing on philosophical and methodological tenets of phenomenology, specifically some of Vagle's theorizing of a "post-intentional phenomenology." Finally, we…
DOT National Transportation Integrated Search
2016-06-30
The overarching objective of this research is the development of a systematic methodology of employing GPR, including instruments, subsequent data processing and interpretation that can be used regularly as part of a roadway pavement and bridge evalu...
DOT National Transportation Integrated Search
2003-11-14
Transit Tracker uses global positioning system (GPS) technology to track how far a bus is along its scheduled route. This document presents the evaluation strategies and objectives, the data collection methodologies, and the results of the evaluation...
Validation of source approval of HMA surface mix aggregate : final report.
DOT National Transportation Integrated Search
2016-04-01
The main focus of this research project was to develop methodologies for the validation of source approval of hot mix asphalt surface mix aggregate. In order to further enhance the validation process, a secondary focus was also to create a spectr...
EPA developed a methodology for estimating the health benefits of benzene reductions and has applied it in a metropolitan-scale case study of the benefits of CAA controls on benzene emissions to accompany the main 812 analysis.
DOT National Transportation Integrated Search
2011-03-01
This project addressed several aspects of the LOSPLAN software, primarily with respect to incorporating new FDOT and NCHRP research project results. In addition, some existing computational methodology aspects were refined to provide more accurat...
Calibration of Resistance Factors Needed in the LRFD Design of Drilled Shafts
DOT National Transportation Integrated Search
2010-09-01
The first report on Load and Resistance Factor Design (LRFD) calibration of driven piles in Louisiana (LTRC Final Report 449) was completed in May 2009. As a continuing effort to implement the LRFD design methodology for deep foundations in Louisia...
77 FR 73911 - Flightcrew Member Duty and Rest Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-12
... discussion of the methodology and information sources used in the rulemaking analysis, corrects reporting and... Aviation Administration (FAA), DOT. ACTION: Availability of Initial Supplemental Regulatory Impact Analysis. SUMMARY: The FAA is issuing an Initial Supplemental Regulatory Impact Analysis of its final rule amending...
This report focuses on mobile emission sources at ports, including oceangoing vessels (OGVs), harbor craft, and cargo handling equipment (CHE), as well as other land-side mobile emission sources at ports, such as locomotives and on-highway vehicles.
76 FR 57807 - Medicaid Program; Recovery Audit Contractors
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-16
... for Medicare & Medicaid Services 42 CFR Part 455 Medicaid Program; Recovery Audit Contractors; Final... 42 CFR Part 455 [CMS-6034-F] RIN 0938-AQ19 Medicaid Program; Recovery Audit Contractors AGENCY... costs of Medicaid Recovery Audit Contractors (Medicaid RACs) and the payment methodology for State...
Demonstration of TRAF-NETSIM for traffic operations management : final report.
DOT National Transportation Integrated Search
1991-08-01
The utility of the simulation package TRAF-NETSIM to the traffic engineer is assessed and demonstrated by means of a case study. The methodology employed in performing the analysis is presented in a way that will aid future users of TRAF-NETSIM. The ...
NASA Astrophysics Data System (ADS)
Eugenio Pappalardo, Salvatore; Ferrarese, Francesco; Tarolli, Paolo; Varotto, Mauro
2016-04-01
Traditional agricultural terraced landscapes embody an important cultural value worth deep investigation, both for their role in local heritage and the cultural economy and for the geo-hydrological hazard they may pose when abandoned and degraded. Moreover, traditional terraced landscapes are usually based on non-intensive agro-systems and may enhance important ecosystem services such as agro-biodiversity conservation and cultural services. Because of their unplanned genesis, mapping, quantifying, and classifying agricultural terraces at the regional scale is often difficult, since they are usually set on geomorphologically and historically complex landscapes. Traditional mapping methods therefore rely on scientific literature and local documentation, historical and cadastral sources, technical cartography, visual interpretation of aerial images, or field surveys. Their limitations and uncertainty at the regional scale stem mainly from forest cover and gaps in thematic cartography. The Veneto Region (NE Italy) presents a wide heterogeneity of agricultural terraced landscapes, mainly distributed within the hilly and Prealpine areas. Previous studies using traditional mapping methods quantified 2,688 ha of terraced areas, with the highest values in the Prealps of Lessinia (1,013 ha, within the Province of Verona) and the Brenta Valley (421 ha, within the Province of Vicenza); however, the terraced features of these case studies differ markedly in fragmentation and terrace intensity, showing dissimilar degrees of clustering: 1.7 ha per terraced area in the Province of Verona versus 1.2 ha in the Province of Vicenza. The aim of this paper is to implement automatic methodologies and compare them with traditional survey methodologies to map and assess agricultural terraces in two representative areas of the Veneto Region.
Testing different remote sensing analyses, such as LiDAR topographic survey and visual interpretation of aerial orthophotos (RGB+NIR bands), we performed a territorial analysis in the Lessinia and Brenta Valley case studies. Preliminary results show that terraced feature extraction from LiDAR data is more efficient both in identifying geometries (walls and terraced surfaces) and in quantifying features under the forest canopy; however, traditional mapping confirms its strength by cross-checking different methods and data sources such as aerial photos, visual interpretation, maps, and field surveys. Hence, the two methods compared here provide a cross-validation and allow us to better understand the complexity of this kind of landscape.
Mt-Isa, Shahrul; Hallgreen, Christine E; Wang, Nan; Callréus, Torbjörn; Genov, Georgy; Hirsch, Ian; Hobbiger, Stephen F; Hockley, Kimberley S; Luciani, Davide; Phillips, Lawrence D; Quartey, George; Sarac, Sinan B; Stoeckert, Isabelle; Tzoulaki, Ioanna; Micaleff, Alain; Ashby, Deborah
2014-07-01
The need for formal and structured approaches for benefit-risk assessment of medicines is increasing, as is the complexity of the scientific questions addressed before making decisions on the benefit-risk balance of medicines. We systematically collected, appraised and classified available benefit-risk methodologies to facilitate and inform their future use. A systematic review of publications identified benefit-risk assessment methodologies. Methodologies were appraised on their fundamental principles, features, graphical representations, assessability and accessibility. We created a taxonomy of methodologies to facilitate understanding and choice. We identified 49 methodologies, critically appraised and classified them into four categories: frameworks, metrics, estimation techniques and utility survey techniques. Eight frameworks describe qualitative steps in benefit-risk assessment and eight quantify benefit-risk balance. Nine metric indices include threshold indices to measure either benefit or risk; health indices measure quality-of-life over time; and trade-off indices integrate benefits and risks. Six estimation techniques support benefit-risk modelling and evidence synthesis. Four utility survey techniques elicit robust value preferences from relevant stakeholders to the benefit-risk decisions. Methodologies to help benefit-risk assessments of medicines are diverse and each is associated with different limitations and strengths. There is not a 'one-size-fits-all' method, and a combination of methods may be needed for each benefit-risk assessment. The taxonomy introduced herein may guide choice of adequate methodologies. Finally, we recommend 13 of 49 methodologies for further appraisal for use in the real-life benefit-risk assessment of medicines. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Al-Mousa, Amjed A.
Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the device characteristics where these films are used has generated substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterize surface nanostructures of thin films by focusing on isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as Atomic Force Microscopy (AFM) data which we are presenting here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both vertical and horizontal directions. It then proceeds by classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy technique system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. Also, we present a case study to demonstrate the effectiveness of our methodology to identify quantitatively particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. 
A comparison with other techniques, such as thresholding, watershed segmentation, and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique by experimenting with synthetic data. These results are discussed and compared along with the challenges of the two techniques.
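As a rough sketch of the fuzzy-classification idea (not the authors' Fuzzy Inference Engine), the following assigns each surface point a membership in "substrate" and "structure" classes using triangular membership functions and picks the stronger label; the height values and membership parameters are invented.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function: rises from a to b, falls from b to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical heights (nm) along one AFM scan line; the values are invented.
heights = np.array([0.1, 0.2, 1.8, 2.1, 0.3, 1.9, 0.0])

# Two fuzzy sets: "substrate" (low heights) and "structure" (raised features).
mu_substrate = trimf(heights, -1.0, 0.0, 1.0)
mu_structure = trimf(heights, 0.5, 2.0, 3.5)

# Defuzzify by taking the label with the larger membership at each point.
labels = np.where(mu_structure > mu_substrate, "structure", "substrate")
```

A real inference engine would also weigh neighbourhood context before clustering classified points into surface structures.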
Leaders' mental health at work: Empirical, methodological, and policy directions.
Barling, Julian; Cloutier, Anika
2017-07-01
While employees' mental health is the focus of considerable attention from researchers, the public, and policymakers, leaders' mental health has almost escaped attention. We start by considering several reasons for this, followed by discussions of the effects of leaders' mental health on their own leadership behaviors, the emotional toll of high-quality leadership, and interventions to enhance leaders' mental health. We offer 8 possible directions for future research on leaders' mental health. Finally, we discuss methodological obstacles encountered when investigating leaders' mental health, and policy dilemmas raised by leaders' mental health. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Hybrid experimental/analytical models of structural dynamics - Creation and use for predictions
NASA Technical Reports Server (NTRS)
Balmes, Etienne
1993-01-01
An original complete methodology for the construction of predictive models of damped structural vibrations is introduced. A consistent definition of normal and complex modes is given which leads to an original method to accurately identify non-proportionally damped normal mode models. A new method to create predictive hybrid experimental/analytical models of damped structures is introduced, and the ability of hybrid models to predict the response to system configuration changes is discussed. Finally a critical review of the overall methodology is made by application to the case of the MIT/SERC interferometer testbed.
Analytical and simulator study of advanced transport
NASA Technical Reports Server (NTRS)
Levison, W. H.; Rickard, W. W.
1982-01-01
An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft on final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied, covering a range of handling-qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.
2018-01-01
Verification and validation (V&V) is a highly challenging undertaking for Space Launch System (SLS) structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of SLS structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
From Sky to Earth: Data Science Methodology Transfer
NASA Astrophysics Data System (ADS)
Mahabal, Ashish A.; Crichton, Daniel; Djorgovski, S. G.; Law, Emily; Hughes, John S.
2017-06-01
We describe here the parallels between astronomy and earth science datasets, their analyses, and the opportunities for methodology transfer from astroinformatics to geoinformatics. Using the example of hydrology, we emphasize how metadata and ontologies are crucial in such an undertaking. Using the infrastructure being designed for EarthCube, the Virtual Observatory for the earth sciences, we discuss essential steps for better transfer of tools and techniques in the future, e.g., domain adaptation. Finally, we point out that it is never a one-way process and there is enough for astroinformatics to learn from geoinformatics as well.
Solution methods for one-dimensional viscoelastic problems
NASA Technical Reports Server (NTRS)
Stubstad, John M.; Simitses, George J.
1987-01-01
A recently developed differential methodology for the solution of one-dimensional nonlinear viscoelastic problems is presented. Using the example of an eccentrically loaded cantilever beam-column, the results from the differential formulation are compared to results generated using a previously published integral solution technique. It is shown that the results obtained from these distinct methodologies exhibit a surprisingly high degree of correlation with one another. A discussion of the various factors affecting the numerical accuracy and rate of convergence of these two procedures is also included. Finally, the influences of some "higher order" effects, such as straining along the centroidal axis, are discussed.
Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J
2015-03-15
This paper aims to contribute to developing better ways of incorporating essential human elements in decision-making processes when modelling complex socio-ecological systems. It presents a step-wise methodology for integrating the perceptions of stakeholders (qualitative) into formal simulation models (quantitative), with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision-making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.
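Step (5) of the sequence above, turning elicited decision rules into an agent-based model, might look like the following toy sketch, in which each grower agent irrigates whenever seasonal water stress exceeds a personal threshold. The `Grower` class, the thresholds, and the stress process are hypothetical illustrations, not the authors' ICTAM model.

```python
import random

class Grower:
    """Minimal agent: irrigates in a season when perceived water stress
    exceeds a personal threshold (as might be elicited from a cognitive map)."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.irrigated = 0

    def step(self, stress):
        if stress > self.threshold:
            self.irrigated += 1

random.seed(1)  # reproducible illustrative run
agents = [Grower(threshold=random.uniform(0.2, 0.8)) for _ in range(20)]

for season in range(10):
    stress = random.random()          # exogenous seasonal water stress
    for a in agents:
        a.step(stress)

total = sum(a.irrigated for a in agents)  # irrigation events across all agents
```

In a real ICTAM-style application the thresholds and rules would come from the interview and cognitive-mapping stages rather than random draws.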
Santaguida, Pasqualina; Oremus, Mark; Walker, Kathryn; Wishart, Laurie R; Siegel, Karen Lohmann; Raina, Parminder
2012-04-01
A "review of reviews" was undertaken to assess methodological issues in studies evaluating nondrug rehabilitation interventions in stroke patients. MEDLINE, CINAHL, PsycINFO, and the Cochrane Database of Systematic Reviews were searched from January 2000 to January 2008 within the stroke rehabilitation setting. Electronic searches were supplemented by reviews of reference lists and citations identified by experts. Eligible studies were systematic reviews; excluded citations were narrative reviews or reviews of reviews. Review characteristics and criteria for assessing methodological quality of primary studies within them were extracted. The search yielded 949 English-language citations. We included a final set of 38 systematic reviews. Cochrane reviews, which have a standardized methodology, were generally of higher methodological quality than non-Cochrane reviews. Most systematic reviews used standardized quality assessment criteria for primary studies, but not all were comprehensive. Reviews showed that primary studies had problems with randomization, allocation concealment, and blinding. Baseline comparability, adverse events, and co-intervention or contamination were not consistently assessed. Blinding of patients and providers was often not feasible and was not evaluated as a source of bias. The eligible systematic reviews identified important methodological flaws in the evaluated primary studies, suggesting the need for improvement of research methods and reporting. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza
2012-06-01
It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision making in structural health monitoring systems. Decision-level fusion is equally beneficial when applied to integrated health monitoring systems: several decisions at low levels of abstraction may be produced by different decision-makers, but decision-level fusion is required at the final stage of the process to provide an accurate assessment of the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. A thorough understanding of the characteristics of decision-fusion methodologies is a crucial step for the successful implementation of such systems. In this paper, we first present the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial-intelligence-based methods. The theoretical basis and characteristics of these methodologies are explained and their performances analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms, are selected and their applications extended to decision fusion. Finally, fusion algorithms are developed based on the selected fusion methods, and their performance is tested on decisions generated from synthetic data and from experimental data. Also in this paper, a modeling methodology, i.e., the cloud model, for generating synthetic decisions is presented and used. Using the cloud model, both types of uncertainty involved in real decision making, randomness and fuzziness, are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide a fair performance comparison of the selected decision-fusion algorithms.
For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are reported in this paper.
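As an illustration of the evidential approach named above, the following implements Dempster's rule of combination for two hypothetical decision-makers assessing whether a monitored structure is healthy or damaged; the mass values are invented, not taken from the paper.

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic mass assignments over frozenset
    hypotheses, renormalising by the conflicting mass."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb       # mass assigned to disjoint hypotheses
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

H, D = frozenset({"healthy"}), frozenset({"damaged"})
theta = H | D                              # the full frame of discernment

# Hypothetical decision masses from two independent monitors (invented values).
m1 = {H: 0.6, D: 0.1, theta: 0.3}
m2 = {H: 0.5, D: 0.2, theta: 0.3}
fused = dempster_combine(m1, m2)
```

Here the fused mass on "healthy" exceeds either individual assessment, reflecting how agreement between sources sharpens the combined decision.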
Prediction of adult height by Tanner-Whitehouse method in young Caucasian male athletes.
Ostojic, S M
2013-04-01
Although the accuracy of final height prediction from skeletal age has been confirmed in many studies of children treated for congenital primary hypothyroidism, short normal children, and constitutionally tall children, no study has compared predicted adult height at a young age with final stature in an athletic population. In this study, the intention was to investigate to what extent the Tanner-Whitehouse (TW) method is adequate for predicting final stature in young Caucasian male athletes. Prospective observational study. Plain radiographs of the left hand and wrist were obtained from 477 athletic children (aged 8.0 to 17.9 years) who came to the outpatient clinic between 2000 and 2011 for adult height estimation, with no orthopedic trauma suspected. Adult height was estimated from bone age according to the TW method. Height was measured both at baseline and at follow-up (at the age of 19 years). No significant difference was found between the estimated adult height (184.9 ± 9.7 cm) and final stature (185.6 ± 9.6 cm) [95% confidence interval (CI) 1.61-3.01, P = 0.55]. The correlation between estimated and final adult height was high (r = 0.96). Bland-Altman analysis confirmed that 95% of the differences between estimated adult height and final stature lie within the limits of agreement (mean ± 2 SD) (-5.84 and 4.52 cm). The TW method is thus an accurate method of predicting adult height in normal-growing athletic boys.
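The Bland-Altman limits of agreement used in the study can be computed as below; the height pairs are invented examples, not the study's data.

```python
import numpy as np

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement (mean ± 2 SD)
    between two measurement methods applied to the same subjects."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 2 * sd, bias + 2 * sd)

# Hypothetical predicted vs. final adult heights (cm); values are invented.
predicted = [182.0, 190.5, 175.2, 185.0, 178.8]
final_h = [183.1, 189.9, 176.0, 186.2, 178.5]

bias, (lo, hi) = bland_altman(predicted, final_h)
```

In the study, roughly 95% of the differences between estimated and final height fell inside the analogous interval (-5.84 to 4.52 cm).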
A Patient-Specific Anisotropic Diffusion Model for Brain Tumour Spread.
Swan, Amanda; Hillen, Thomas; Bowman, John C; Murtha, Albert D
2018-05-01
Gliomas are primary brain tumours arising from the glial cells of the nervous system. The diffuse nature of spread, coupled with proximity to critical brain structures, makes treatment a challenge. Pathological analysis confirms that the extent of glioma spread exceeds the extent of the grossly visible mass seen on conventional magnetic resonance imaging (MRI) scans. Gliomas show faster spread along white matter tracts than in grey matter, leading to irregular patterns of spread. We propose a mathematical model based on diffusion tensor imaging, an MRI technique that offers a methodology to delineate the major white matter tracts in the brain. We apply the anisotropic diffusion model of Painter and Hillen (J Theor Biol 323:25-39, 2013) to data from 10 patients with gliomas. Moreover, we compare the anisotropic model to the state-of-the-art proliferation-infiltration (PI) model of Swanson et al. (Cell Prolif 33:317-329, 2000). We find that the anisotropic model offers a slight improvement over the standard PI model. For tumours with low anisotropy, the predictions of the two models are virtually identical, but for patients whose tumours show higher anisotropy, the results differ. We also suggest using data from the contralateral hemisphere to further improve the model fit. Finally, we discuss the potential use of this model in clinical treatment planning.
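For reference, the standard PI model mentioned above combines Fickian diffusion with logistic growth: dc/dt = div(D grad c) + rho*c*(1 - c), where c is the normalised tumour cell density. A minimal one-dimensional explicit finite-difference sketch is given below; it is isotropic and uses illustrative parameter values, not the patient-fitted ones from the paper.

```python
import numpy as np

def pi_model_step(c, D, rho, dx, dt):
    """One explicit finite-difference step of the 1D Proliferation-Infiltration
    model dc/dt = D * d2c/dx2 + rho * c * (1 - c), with no-flux boundaries."""
    lap = np.zeros_like(c)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    lap[0] = 2 * (c[1] - c[0]) / dx**2        # reflecting (no-flux) boundary
    lap[-1] = 2 * (c[-2] - c[-1]) / dx**2
    return c + dt * (D * lap + rho * c * (1 - c))

# Illustrative parameters (not patient-fitted): D in mm^2/day, rho in 1/day.
x = np.linspace(0, 50, 201)                   # 50 mm of tissue
c0 = np.exp(-((x - 25) ** 2) / 4)             # initial tumour cell density
c = c0.copy()
for _ in range(500):                          # 50 days at dt = 0.1 day
    c = pi_model_step(c, D=0.1, rho=0.025, dx=x[1] - x[0], dt=0.1)
```

The anisotropic model replaces the scalar D with a spatially varying diffusion tensor built from the DTI data, so that spread is faster along white matter tracts.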
Maguire, Erin; Hong, Paul; Ritchie, Krista; Meier, Jeremy; Archibald, Karen; Chorney, Jill
2016-11-04
To describe the process involved in developing a decision aid prototype for parents considering adenotonsillectomy for their children with sleep disordered breathing. A paper-based decision aid prototype was developed using the framework proposed by the International Patient Decision Aids Standards Collaborative. The decision aid focused on two main treatment options: watchful waiting and adenotonsillectomy. Usability was assessed with parents of pediatric patients and providers through qualitative content analysis of semi-structured interviews, which included open-ended user feedback. A steering committee composed of key stakeholders was assembled. A needs assessment was then performed, which confirmed the need for a decision support tool. A decision aid prototype was developed and modified based on semi-structured qualitative interviews and a scoping literature review. The prototype provided information on the condition, the risks and benefits of treatments, and values clarification. The prototype underwent three cycles of accessibility, feasibility, and comprehensibility testing, incorporating feedback from all stakeholders to develop the final decision aid prototype. A standardized, iterative methodology was used to develop a decision aid prototype for parents considering adenotonsillectomy for their children with sleep disordered breathing. The decision aid prototype appeared feasible, acceptable, and comprehensible, and may serve as an effective means of improving shared decision-making.
Introducing CGOLS: The Cholla Galactic Outflow Simulation Suite
NASA Astrophysics Data System (ADS)
Schneider, Evan E.; Robertson, Brant E.
2018-06-01
We present the Cholla Galactic OutfLow Simulations (CGOLS) suite, a set of extremely high resolution global simulations of isolated disk galaxies designed to clarify the nature of multiphase structure in galactic winds. Using the GPU-based code Cholla, we achieve unprecedented resolution in these simulations, modeling galaxies over a 20 kpc region at a constant resolution of 5 pc. The simulations include a feedback model designed to test the effects of different mass- and energy-loading factors on galactic outflows over kiloparsec scales. In addition to describing the simulation methodology in detail, we also present the results from an adiabatic simulation that tests the frequently adopted analytic galactic wind model of Chevalier & Clegg. Our results indicate that the Chevalier & Clegg model is a good fit to nuclear starburst winds in the nonradiative region of parameter space. Finally, we investigate the role of resolution and convergence in large-scale simulations of multiphase galactic winds. While our largest-scale simulations show convergence of observable features like soft X-ray emission, our tests demonstrate that simulations of this kind with resolutions greater than 10 pc are not yet converged, confirming the need for extreme resolution in order to study the structure of winds and their effects on the circumgalactic medium.
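The mass- and energy-loading factors tested in these simulations set the speed of the resulting hot wind: for a steady energy-driven wind, the terminal velocity is v_inf = sqrt(2 * Edot / Mdot). The sketch below uses common fiducial literature values for the supernova energy and mass input per unit star formation rate; these constants are assumptions for illustration, not numbers from the CGOLS paper.

```python
import math

# Fiducial inputs per (Msun/yr) of star formation, common literature values:
E_SN_RATE = 3e41     # erg/s of supernova energy injected
MDOT_UNIT = 6.3e25   # g/s corresponding to 1 Msun/yr of mass input

def terminal_velocity_kms(alpha, beta):
    """Terminal wind speed in km/s for energy-loading alpha and mass-loading
    beta, from v_inf = sqrt(2 * Edot / Mdot) (cgs internally)."""
    edot = alpha * E_SN_RATE          # erg/s
    mdot = beta * MDOT_UNIT           # g/s
    return math.sqrt(2.0 * edot / mdot) / 1e5   # cm/s -> km/s

v = terminal_velocity_kms(alpha=1.0, beta=1.0)
```

With alpha = beta = 1 this gives a wind of order 1000 km/s; raising the mass loading beta slows and mass-loads the wind, which is exactly the regime where multiphase structure becomes interesting.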
Diagnostic approach in cases with suspected work-related asthma
2013-01-01
Background: Work-related asthma (WRA) is a major cause of respiratory disease in modern societies. The diagnosis, and consequently an opportunity for prevention, is often missed in practice. Methods: Based on recent studies and systematic reviews of the literature, methods for detection of WRA and identification of specific causes of allergic WRA are discussed. Results and Conclusions: All workers should be asked whether symptoms improve on days away from work or on holidays. Positive answers should lead to further investigation. Spirometry and non-specific bronchial responsiveness should be measured, but carefully performed and validly analysed serial peak expiratory flow or forced expiratory volume in one second (FEV1) measurements are more specific and confirm occupational asthma in about 82% of those still exposed to the causative agent. Skin prick testing or specific immunoglobulin E assays are useful to document allergy to high-molecular-weight allergens. Specific inhalational challenge tests come closest to a gold-standard test, but lack standardisation, availability and sensitivity. Supervised workplace challenges can be used when specific challenges are unavailable or the results non-diagnostic, but the methodology lacks standardisation. Finally, if the diagnosis remains unclear, a follow-up with serial measurements of FEV1 and non-specific bronchial hyperresponsiveness should detect those likely to develop permanent impairment from their occupational exposures. PMID:23768266
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caldecott, Ralph; Kamarulzaman, Dayangku N. S.; Kirrane, James P.
The concept of momentum exchange when applied to space tethers for propulsion is well established, and a considerable body of literature now exists on the on-orbit modelling, the dynamics, and the control of a large range of tether system applications. The authors consider here a new application for the Motorised Momentum Exchange Tether (MMET) by highlighting three key stages of development leading to a conceptualisation that can subsequently be developed into a technology for Active Debris Removal. The paper starts with a study of the on-orbit mechanics of a full-sized motorised tether, in which it is shown that a laden and therefore highly mass-asymmetrical tether can still be forced to spin, and certainly to librate, thereby confirming its possible usefulness for active debris removal (ADR). The second part of the paper concentrates on modelling the centripetal deployment of a symmetrical MMET in order to initialise it for debris removal operations, and the third and final part provides an entry into scale modelling for low-cost mission design and testing. It is shown that the motorised momentum exchange tether offers a potential solution to the removal of large pieces of orbital debris, and that dynamic methodologies can be implemented in order to optimise the emergent design.
Moral judgment reloaded: a moral dilemma validation study
Christensen, Julia F.; Flexas, Albert; Calabrese, Margareta; Gut, Nadine K.; Gomila, Antoni
2014-01-01
We propose a revised set of moral dilemmas for studies on moral judgment. First, we selected a total of 46 moral dilemmas available in the literature and fine-tuned them in terms of four conceptual factors (Personal Force, Benefit Recipient, Evitability, and Intention) and methodological aspects of the dilemma formulation (word count, expression style, question formats) that have been shown to influence moral judgment. Second, we obtained normative codings of arousal and valence for each dilemma, showing that emotional arousal in response to moral dilemmas depends crucially on the factors Personal Force, Benefit Recipient, and Intentionality. Third, we validated the dilemma set, confirming that people's moral judgment is sensitive to all four conceptual factors, and to their interactions. Results are discussed in the context of this field of research, also outlining the relevance of our reaction time (RT) effects for the Dual Process account of moral judgment. Finally, we suggest tentative theoretical avenues for future testing, particularly stressing the importance of the factor Intentionality in moral judgment. Additionally, due to the importance of cross-cultural studies in the quest for universals in human moral cognition, we provide the new set of dilemmas in six languages (English, French, German, Spanish, Catalan, and Danish). The norming values provided here refer to the Spanish dilemma set. PMID:25071621
Peterson, Candida C.; Wellman, Henry M.
2011-01-01
We examined deaf and hearing children’s progression of steps in theory-of-mind (ToM) development including their understanding of social pretending. Ninety-three children (33 deaf; 60 hearing) aged 3 to 13 years were tested on a set of six closely-matched ToM tasks. Results showed that deaf children were delayed substantially behind hearing children in understanding pretending, false belief and other ToM concepts, in line with their delayed uptake of social pretend play. By using a scaling methodology, we confirmed previous evidence of a consistent five-step developmental progression for both groups. Moreover, by including social pretence understanding, both deaf and hearing children’s ToM sequences were shown to extend reliably to six sequential developmental steps. Finally and focally, even though both groups’ sequences were six steps long, the placement of pretence relative to other ToM milestones varied with hearing status. Deaf children understood social pretending at an earlier step in the ToM sequence than hearing children, albeit at a later chronological age. Theoretically, the findings are relevant to questions about how universal developmental progressions come together along with culturally-distinctive inputs and biological factors (such as hearing loss) to set the pace for ToM development. PMID:19998533
Ding, Lin; Liu, Jun-Lin; Hassan, Waseem; Wang, Lu-Lu; Yan, Fang-Rong; Shang, Jing
2014-01-01
To investigate a possible methodology for exploiting herbal medicine and designing polytherapy for the treatment of non-alcoholic fatty liver disease (NAFLD), we have made use of Cichorium glandulosum Boiss et Huet (CG), a traditional Chinese herbal medicine that has been proven to be effective in treating hepatic diseases. Here, we report that the extract of CG effectively reduced lipid accumulation under conditions of lipid overloading in vivo and in vitro (in a rat high-fat diet model and a hepG2 cell model of free fatty acid treatment). CG extract also protected hepatocytes from injury and inflammation to aid its lipid-lowering properties (in a rat high-fat diet model and a L02 cell model of acetaminophen treatment). Serum chemistry analysis accompanied by in vitro drug screening confirmed that CG-4, CG-10 and CG-14 are the lipo-effective components of CG. Western blotting analysis revealed that these components can regulate key lipid targets at the molecular level, including CD36, FATP5 and PPAR-α, and thus the lipid oxidation and lipid absorption pathways. Finally, we adopted the experimental design and statistical method to calculate the best combination proportion (CG-4: CG-10: CG-14 = 2.065: 1.782: 2.153) to optimize its therapeutic effect. PMID:24797163
Peterson, Candida C; Wellman, Henry M
2009-06-01
We examined deaf and hearing children's progression of steps in theory of mind (ToM) development including their understanding of social pretending. Ninety-three children (33 deaf; 60 hearing) aged 3-13 years were tested on a set of six closely matched ToM tasks. Results showed that deaf children were delayed substantially behind hearing children in understanding pretending, false belief (FB) and other ToM concepts, in line with their delayed uptake of social pretend (SP) play. By using a scaling methodology, we confirmed previous evidence of a consistent five-step developmental progression for both groups. Moreover, by including social pretence understanding, both deaf and hearing children's ToM sequences were shown to extend reliably to six sequential developmental steps. Finally and focally, even though both groups' sequences were six steps long, the placement of pretence relative to other ToM milestones varied with hearing status. Deaf children understood social pretending at an earlier step in the ToM sequence than hearing children, albeit at a later chronological age. Theoretically, the findings are relevant to questions about how universal developmental progressions come together along with culturally distinctive inputs and biological factors (such as hearing loss) to set the pace for ToM development.
Mate choice and human stature: homogamy as a unified framework for understanding mating preferences.
Courtiol, Alexandre; Raymond, Michel; Godelle, Bernard; Ferdy, Jean-Baptiste
2010-08-01
Assortative mating for human height has long attracted interest in evolutionary biology, and the phenomenon has been demonstrated in numerous human populations. It is often argued that mating preferences generate this pattern, but other processes can also induce trait correlations between mates. Here, we present a methodology tailored to quantify continuous preferences based on choice experiments between pairs of stimuli. In particular, it is possible to explore determinants of interindividual variations in preferences, such as the height of the chooser. We collected data from a sample of 200 individuals from France. Measurements obtained show that the perception of attractiveness depends on both the height of the stimuli and the stature of the individual who judged them. Therefore, this study demonstrates that homogamy is present at the level of preferences for both sexes. We also show that measurements of the function describing this homogamy are concordant with several distinct mating rules proposed in the literature. In addition, the quantitative approach introduced here yields metrics that can be used to compare groups of individuals. In particular, our results reveal an important disagreement between sexes regarding height preferences in the context of mutual mate choice. Finally, both women and men prefer individuals who are significantly taller than average. All major findings are confirmed by a reanalysis of previously published data.
The mechanics of motorised momentum exchange tethers when applied to active debris removal from LEO
NASA Astrophysics Data System (ADS)
Caldecott, Ralph; Kamarulzaman, Dayangku N. S.; Kirrane, James P.; Cartmell, Matthew P.; Ganilova, Olga A.
2014-12-01
The concept of momentum exchange when applied to space tethers for propulsion is well established, and a considerable body of literature now exists on the on-orbit modelling, the dynamics, and also the control of a large range of tether system applications. The authors consider here a new application for the Motorised Momentum Exchange Tether by highlighting three key stages of development leading to a conceptualisation that can subsequently be developed into a technology for Active Debris Removal. The paper starts with a study of the on-orbit mechanics of a full sized motorised tether in which it is shown that a laden and therefore highly mass-asymmetrical tether can still be forced to spin, and certainly to librate, thereby confirming its possible usefulness for active debris removal (ADR). The second part of the paper concentrates on the modelling of the centripetal deployment of a symmetrical MMET in order to get it initialized for debris removal operations, and the third and final part of the paper provides an entry into scale modelling for low cost mission design and testing. It is shown that the motorised momentum exchange tether offers a potential solution to the removal of large pieces of orbital debris, and that dynamic methodologies can be implemented in order to optimise the emergent design.
Environmental and water sustainability of milk production in Northeast Spain.
Noya, I; González-García, S; Berzosa, J; Baucells, F; Feijoo, G; Moreira, M T
2018-03-01
This study focuses on the assessment of the environmental profile of a milk farm, representative of the dairy sector in Northeast Spain, from a cradle-to-gate perspective. The Life Cycle Assessment (LCA) principles established by ISO standards together with the carbon footprint guidelines proposed by the International Dairy Federation (IDF) were followed. The environmental results showed two critical contributing factors: the production of the livestock feed (e.g., alfalfa) and the on-farm emissions from farming activities, with contributions higher than 50% in most impact categories. A comparison with other LCA studies was carried out, which confirmed the consistency of these results with the values reported in the literature for dairy systems from several countries. Additionally, the Water Footprint (WF) values were also estimated according to the Water Footprint Network (WFN) methodology to reveal that feed and fodder production also had a predominant influence on the global WF impacts, with contributions of 99%. Green WF was responsible for remarkable environmental burdens (around 88%) due to the impacts associated with the cultivation stage. Finally, the substitution of alfalfa by other alternative protein sources in animal diets was also proposed and analysed, given its relevance as one of the main contributors to the impacts of livestock feed. Copyright © 2017 Elsevier B.V. All rights reserved.
Towards an Adolescent Friendly Methodology: Accessing the Authentic through Collective Reflection
ERIC Educational Resources Information Center
Keeffe, Mary; Andrews, Dorothy
2015-01-01
The re-emergence of student voice presents a challenge to schools and researchers to become more responsive to the voice of adolescents in education and in research. However, the poor articulation of the nature of student voice to date is confirmation of the complex and important nature of the personal advocacy and human agency that is involved in…
News and views in Histochemistry and Cell Biology.
Asan, Esther; Drenckhahn, Detlev
2004-12-01
Advances in histochemical methodology and ingenious applications of novel and improved methods continue to confirm the standing of morphological means and approaches in research efforts, and contribute significantly to increasing our knowledge about structures and functions in all areas of the life sciences from cell biology to pathology. Reports published during recent months documenting this progress are summarized in the present review.
[Failure mode effect analysis applied to preparation of intravenous cytostatics].
Santos-Rubio, M D; Marín-Gil, R; Muñoz-de la Corte, R; Velázquez-López, M D; Gil-Navarro, M V; Bautista-Paloma, F J
2016-01-01
To proactively identify risks in the preparation of intravenous cytostatic drugs, and to prioritise and establish measures to improve safety procedures. Failure Mode Effect Analysis methodology was used. A multidisciplinary team identified potential failure modes of the procedure through a brainstorming session. The impact associated with each failure mode was assessed with the Risk Priority Number (RPN), which involves three variables: occurrence, severity, and detectability. Improvement measures were established for all identified failure modes, with those with RPN>100 considered critical. The final RPN (theoretical) that would result from the proposed measures was also calculated and the process was redesigned. A total of 34 failure modes were identified. The initial accumulated RPN was 3022 (range: 3-252), and after recommended actions the final RPN was 1292 (range: 3-189). RPN scores >100 were obtained in 13 failure modes; only the dispensing sub-process was free of critical points (RPN>100). A final reduction of RPN>50% was achieved in 9 failure modes. This prospective risk analysis methodology allows the weaknesses of the procedure to be prioritised, optimises the use of resources, and substantially improves the safety of the preparation of cytostatic drugs through the introduction of double checking and intermediate product labelling. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
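The RPN arithmetic described in this abstract is straightforward to automate. A minimal sketch in Python, using invented failure modes and scores (the study's actual data are not reproduced here); RPN is the product of occurrence, severity, and detectability, and modes scoring above 100 are flagged as critical, as in the study:

```python
# Hypothetical failure modes for IV cytostatic preparation; the names and
# O/S/D scores below are illustrative, not the study's data.
failure_modes = [
    {"name": "wrong diluent selected",       "O": 4, "S": 9,  "D": 5},
    {"name": "label mismatch at dispensing", "O": 2, "S": 7,  "D": 2},
    {"name": "dose miscalculation",          "O": 3, "S": 10, "D": 6},
]

# RPN = Occurrence x Severity x Detectability
for fm in failure_modes:
    fm["RPN"] = fm["O"] * fm["S"] * fm["D"]

# Failure modes with RPN > 100 are treated as critical and ranked first.
critical = sorted((fm for fm in failure_modes if fm["RPN"] > 100),
                  key=lambda fm: fm["RPN"], reverse=True)
for fm in critical:
    print(fm["name"], fm["RPN"])
```

Recomputing RPN after the proposed improvement measures (lower occurrence or better detectability) gives the theoretical final RPN the study reports.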
Optimization of Fat-Reduced Puff Pastry Using Response Surface Methodology.
Silow, Christoph; Zannini, Emanuele; Axel, Claudia; Belz, Markus C E; Arendt, Elke K
2017-02-22
Puff pastry is a high-fat bakery product with fat playing a key role, both during the production process and in the final pastry. In this study, response surface methodology (RSM) was successfully used to evaluate puff pastry quality for the development of a fat-reduced version. The technological parameters modified included the level of roll-in fat, the number of fat layers (50-200) and the final thickness (1.0-3.5 mm) of the laminated dough. Quality characteristics of puff pastry were measured using the Texture Analyzer with an attached Extended Craft Knife (ECK) and Multiple Puncture Probe (MPP), the VolScan and the C-Cell imaging system. The number of fat layers and final dough thickness, in combination with the amount of roll-in fat, had a significant impact on the internal and external structural quality parameters. With technological changes alone, a fat-reduced (≥30%) puff pastry was developed. The qualities of fat-reduced puff pastries were comparable to conventional full-fat (33 wt %) products. A sensory acceptance test revealed no significant differences in taste of fatness or 'liking of mouthfeel'. Additionally, the fat-reduced puff pastry showed a significant (p < 0.05) positive correlation with 'liking of flavor' and overall acceptance by the assessors.
Optimization of Fat-Reduced Puff Pastry Using Response Surface Methodology
Silow, Christoph; Zannini, Emanuele; Axel, Claudia; Belz, Markus C. E.; Arendt, Elke K.
2017-01-01
Puff pastry is a high-fat bakery product with fat playing a key role, both during the production process and in the final pastry. In this study, response surface methodology (RSM) was successfully used to evaluate puff pastry quality for the development of a fat-reduced version. The technological parameters modified included the level of roll-in fat, the number of fat layers (50–200) and the final thickness (1.0–3.5 mm) of the laminated dough. Quality characteristics of puff pastry were measured using the Texture Analyzer with an attached Extended Craft Knife (ECK) and Multiple Puncture Probe (MPP), the VolScan and the C-Cell imaging system. The number of fat layers and final dough thickness, in combination with the amount of roll-in fat, had a significant impact on the internal and external structural quality parameters. With technological changes alone, a fat-reduced (≥30%) puff pastry was developed. The qualities of fat-reduced puff pastries were comparable to conventional full-fat (33 wt %) products. A sensory acceptance test revealed no significant differences in taste of fatness or ‘liking of mouthfeel’. Additionally, the fat-reduced puff pastry showed a significant (p < 0.05) positive correlation with ‘liking of flavor’ and overall acceptance by the assessors. PMID:28231095
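The core of the RSM approach is fitting a second-order model of quality to the design factors and then searching the design space for the optimum. The sketch below illustrates only that search step, assuming a hypothetical already-fitted quadratic response surface; the coefficient values and the optimum they encode are invented for illustration, not taken from the study:

```python
def quality(fat_pct, layers, thickness_mm):
    """Hypothetical fitted second-order (quadratic) response surface:
    quality peaks at an invented optimum of 22% fat, 120 layers, 2.0 mm."""
    return (100
            - 0.8 * (fat_pct - 22) ** 2
            - 0.001 * (layers - 120) ** 2
            - 8 * (thickness_mm - 2.0) ** 2)

# Exhaustive search over the experimental ranges reported in the abstract.
best = max(
    ((f, L, t) for f in range(15, 34)                      # roll-in fat, wt %
               for L in range(50, 201, 10)                 # fat layers, 50-200
               for t in (1.0, 1.5, 2.0, 2.5, 3.0, 3.5)),   # thickness, mm
    key=lambda p: quality(*p),
)
print(best)
```

In practice the quadratic coefficients would come from least-squares regression on the measured design points rather than being chosen by hand.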
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid
2017-03-16
Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. 
We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. ©Richard Harte, Liam Glynn, Alejandro Rodríguez-Molinero, Paul MA Baker, Thomas Scharf, Leo R Quinlan, Gearóid ÓLaighin. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 16.03.2017.
ERIC Educational Resources Information Center
United Nations Educational, Scientific, and Cultural Organization, Santiago (Chile). Regional Office for Education in Latin America and the Caribbean.
The final report of a conference concerning adult basic education within the framework of REDALF relates to innovative projects in adult education in nine countries. A wide spectrum of issues related to adult basic education, curriculum, methodology, evaluation, and research is analyzed in the context of educational planning. Among the…
Sectorial analysis of nanotechnology companies in Argentina
NASA Astrophysics Data System (ADS)
Foladori, Guillermo; Lau, Edgar Záyago; Carroza, Tomás; Appelbaum, Richard P.; Villa, Liliana; Robles-Belmont, Eduardo
2017-06-01
In this paper, we identify 37 companies that produce nano-enabled products in Argentina. We locate the products of these firms in terms of both their economic sector and position in a value chain. The research was done through a four-step methodology. Firstly, an inventory of firms was created. Secondly, the firms were classified by their economic sector, following the United Nations economic classification. Thirdly, the firms were located within a simple nanotechnology value chain. Finally, the products were classified according to their final destination, being either means of production or final consumer products. The results show that healthcare, cosmetics, and medicine is the most represented sector along the value chain, followed by electronics.
Methodologies used to estimate and forecast vehicle miles traveled (VMT) : final report.
DOT National Transportation Integrated Search
2016-07-01
Vehicle miles traveled (VMT) is a measure used in transportation planning for a variety of purposes. It measures the amount of travel for all vehicles in a geographic region over a given period of time, typically a one-year period. VMT is calculated ...
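The abstract is truncated before describing the calculation, but a common traffic-count-based VMT estimate (assumed here, not necessarily the report's method) multiplies annual average daily traffic (AADT) by segment length and the number of days, summed over all road segments; the counts below are invented for illustration:

```python
# Hypothetical road segments with annual average daily traffic (AADT)
# and centerline length in miles; values are illustrative only.
segments = [
    {"aadt": 12000, "miles": 3.0},
    {"aadt": 4500,  "miles": 10.0},
]

# Annual VMT = sum over segments of AADT x length x 365 days.
annual_vmt = sum(s["aadt"] * s["miles"] * 365 for s in segments)
print(annual_vmt)
```

Forecasting then typically applies a growth rate or a travel-demand model to these base-year estimates.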
Validation of source approval of HMA surface mix aggregate using spectrometer : final report.
DOT National Transportation Integrated Search
2016-04-01
The main focus of this research project was to develop methodologies for the validation of source approval of hot : mix asphalt surface mix aggregate. In order to further enhance the validation process, a secondary focus was also to : create a spectr...
CHARACTERIZATION OF EXPOSURES TO ALDEHYDES IN THREE US URBAN AREAS: 1. METHODOLOGY. (R824834C006)
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Evaluation of freeway motorist assist program : final report, September 30, 2009.
DOT National Transportation Integrated Search
2010-02-01
This evaluation of the Motorist Assist (MA) program in St. Louis estimated that MA has an annual benefit-cost ratio (B/C) of 38.25:1 using 2009 dollars. This estimate was based on nationally accepted AASHTO methodology and was based on 1082 secondary...
DOT National Transportation Integrated Search
2009-08-01
Asphalt mixtures designed using modern conventional methods, whether Marshall or Superpave methodologies, fail to address the cracking performance of these mixtures. Research previously conducted at the University of Florida for the Florida Departmen...
2006-09-22
This final rule adopts the substance of the April 15, 2004 tentative interim amendment (TIA) 00-1 (101), Alcohol Based Hand Rub Solutions, an amendment to the 2000 edition of the Life Safety Code, published by the National Fire Protection Association (NFPA). This amendment allows certain health care facilities to place alcohol-based hand rub dispensers in egress corridors under specified conditions. This final rule also requires that nursing facilities at least install battery-operated single station smoke alarms in resident rooms and common areas if they are not fully sprinklered or they do not have system-based smoke detectors in those areas. Finally, this final rule confirms as final the provisions of the March 25, 2005 interim final rule with changes and responds to public comments on that rule.
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Davidson, John B.
1998-01-01
A multi-input, multi-output control law design methodology, named "CRAFT", is presented. CRAFT stands for the design objectives addressed, namely, Control power, Robustness, Agility, and Flying Qualities Tradeoffs. The methodology makes use of control law design metrics from each of the four design objective areas. It combines eigenspace assignment, which allows for direct specification of eigenvalues and eigenvectors, with a graphical approach for representing the metrics that captures numerous design goals in one composite illustration. Sensitivity of the metrics to eigenspace choice is clearly displayed, enabling the designer to assess the cost of design tradeoffs. This approach enhances the designer's ability to make informed design tradeoffs and to reach effective final designs. An example of the CRAFT methodology applied to an advanced experimental fighter and discussion of associated design issues are provided.
Cassini's Test Methodology for Flight Software Verification and Operations
NASA Technical Reports Server (NTRS)
Wang, Eric; Brown, Jay
2007-01-01
The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).
Zaslavsky, Oleg; Cochrane, Barbara B; Herting, Jerald R; Thompson, Hilaire J; Woods, Nancy F; Lacroix, Andrea
2014-02-01
Despite the variety of available analytic methods, longitudinal research in nursing has been dominated by use of a variable-centered analytic approach. The purpose of this article is to present the utility of person-centered methodology using a large cohort of American women 65 and older enrolled in the Women's Health Initiative Clinical Trial (N = 19,891). Four distinct trajectories of energy/fatigue scores were identified. Levels of fatigue were closely linked to age, socio-demographic factors, comorbidities, health behaviors, and poor sleep quality. These findings were consistent regardless of the methodological framework. Finally, we demonstrated that energy/fatigue levels predicted future hospitalization in non-disabled elderly. Person-centered methods provide unique opportunities to explore and statistically model the effects of longitudinal heterogeneity within a population. © 2013 Wiley Periodicals, Inc.
Measuring political polarization: Twitter shows the two sides of Venezuela
NASA Astrophysics Data System (ADS)
Morales, A. J.; Borondo, J.; Losada, J. C.; Benito, R. M.
2015-03-01
We say that a population is perfectly polarized when divided into two groups of the same size and opposite opinions. In this paper, we propose a methodology to study and measure the emergence of polarization from social interactions. We begin by proposing a model to estimate opinions in which a minority of influential individuals propagate their opinions through a social network. The result of the model is an opinion probability density function. Next, we propose an index to quantify the extent to which the resulting distribution is polarized. Finally, we apply the proposed methodology to a Twitter conversation about the late Venezuelan president, Hugo Chávez, finding a good agreement between our results and offline data. Hence, we show that our methodology can detect different degrees of polarization, depending on the structure of the network.
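A simple index in the spirit of the paper's definition can be sketched as follows. This is a hedged simplification, not the authors' exact estimator: opinions are taken as values in [-1, 1], and polarization is high when the two opinion camps are equal in size and their centers sit far apart:

```python
def polarization(opinions):
    """Toy polarization index: 1.0 for two equal-sized groups at opposite
    poles of [-1, 1], 0.0 when only one camp exists. Illustrative only."""
    neg = [x for x in opinions if x < 0]
    pos = [x for x in opinions if x >= 0]
    if not neg or not pos:
        return 0.0
    size_imbalance = abs(len(pos) - len(neg)) / len(opinions)
    pole_distance = abs(sum(pos) / len(pos) - sum(neg) / len(neg))
    # Max pole_distance is 2, so divide by 2 to normalise to [0, 1].
    return (1 - size_imbalance) * pole_distance / 2

print(polarization([-1.0, -1.0, 1.0, 1.0]))
```

Two equal groups at the poles give the maximum value 1.0; a one-sided conversation gives 0.0.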
Liu, Xiaozhen; Jin, Gan; Qian, Jiacheng; Yang, Hongjian; Tang, Hongchao; Meng, Xuli; Li, Yongfeng
2018-04-23
This study aimed to screen sensitive biomarkers for the efficacy evaluation of neoadjuvant chemotherapy in breast cancer. In this study, Illumina digital gene expression sequencing technology was applied and differentially expressed genes (DEGs) between patients presenting pathological complete response (pCR) and non-pathological complete response (NpCR) were identified. Further, Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway enrichment analyses were then performed. The genes in significantly enriched pathways were finally quantified by quantitative real-time PCR (qRT-PCR) to confirm that they were differentially expressed. Additionally, GSE23988 from the Gene Expression Omnibus database was used as the validation dataset to confirm the DEGs. After removing the low-quality reads, 715 DEGs were finally detected. After mapping to KEGG pathways, 10 DEGs belonging to the ubiquitin proteasome pathway (HECTD3, PSMB10, UBD, UBE2C, and UBE2S) and cytokine-cytokine receptor interactions (CCL2, CCR1, CXCL10, CXCL11, and IL2RG) were selected for further analysis. These 10 genes were finally quantified by qRT-PCR to confirm that they were differentially expressed (the log2 fold changes of selected genes were -5.34, 7.81, 6.88, 5.74, 3.11, 19.58, 8.73, 8.88, 7.42, and 34.61 for HECTD3, PSMB10, UBD, UBE2C, UBE2S, CCL2, CCR1, CXCL10, CXCL11, and IL2RG, respectively). Moreover, 53 common genes were confirmed by the validation dataset, including downregulated UBE2C and UBE2S. Our results suggested that these 10 genes belonging to these two pathways might be useful as sensitive biomarkers for the efficacy evaluation of neoadjuvant chemotherapy in breast cancer.
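The log2 fold changes reported above compare expression between the two response groups. As a minimal sketch of that calculation (with invented expression values, not the study's data), the fold change is the ratio of group means on a log2 scale, so +1 means doubled and -1 means halved:

```python
import math

def log2_fold_change(group_a, group_b):
    """log2 of the ratio of mean expression in group_a vs group_b.
    Positive = upregulated in group_a; negative = downregulated."""
    mean_a = sum(group_a) / len(group_a)
    mean_b = sum(group_b) / len(group_b)
    return math.log2(mean_a / mean_b)

# Illustrative expression values for one gene in pCR vs NpCR samples.
lfc = log2_fold_change([80.0, 120.0], [20.0, 30.0])  # means 100 vs 25
print(lfc)
```

Real pipelines additionally normalise library sizes and test each gene's change for statistical significance before calling it a DEG.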
Biomimetic novel nanoporous niobium oxide coating for orthopaedic applications
NASA Astrophysics Data System (ADS)
Pauline, S. Anne; Rajendran, N.
2014-01-01
Niobium oxide was synthesized by sol-gel methodology and a crystalline, nanoporous and adherent coating of Nb2O5 was deposited on 316L SS using the spin coating technique and heat treatment. The synthesis conditions were optimized to obtain a nanoporous morphology. The coating was characterized using attenuated total reflectance-infrared spectroscopy (ATR-IR), X-ray diffraction analysis (XRD), scanning electron microscopy (SEM) and energy dispersive X-ray analysis (EDX), atomic force microscopy (AFM) and transmission electron microscopy (TEM), and the formation of a crystalline Nb2O5 coating with nanoporous morphology was confirmed. Mechanical studies confirmed that the coating adheres strongly to the substrate and possesses a high hardness value. Contact angle analysis showed increased hydrophilicity for the coated substrate. The in vitro bioactivity test confirmed that the Nb2O5 coating with nanoporous morphology facilitated the growth of hydroxyapatite (HAp). This was further supported by the solution analysis test, where increased uptake of calcium and phosphorous ions from simulated body fluid (SBF) was observed. Electrochemical evaluation showed that the crystalline coating is insulative and protective in nature and offered excellent corrosion protection to 316L SS. Thus, this study confirmed that the nanoporous crystalline Nb2O5 coating conferred bioactivity and enhanced corrosion resistance on 316L SS.
Essential methodological considerations when using grounded theory.
Achora, Susan; Matua, Gerald Amandu
2016-07-01
To suggest important methodological considerations when using grounded theory. A research method widely used in nursing research is grounded theory, at the centre of which is theory construction. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.
Risk assessment for physical and cyber attacks on critical infrastructures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Bryan J.; Sholander, Peter E.; Phelan, James M.
2005-08-01
Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies. Existing risk assessment methodologies consider physical security and cyber security separately. As such, they do not accurately model attacks that involve defeating both physical protection and cyber protection elements (e.g., hackers turning off alarm systems prior to forced entry). This paper presents a risk assessment methodology that accounts for both physical and cyber security. It also preserves the traditional security paradigm of detect, delay and respond, while accounting for the possibility that a facility may be able to recover from or mitigate the results of a successful attack before serious consequences occur. The methodology provides a means for ranking those assets most at risk from malevolent attacks. Because the methodology is automated, the analyst can also play 'what if' with mitigation measures to gain a better understanding of how to best expend resources towards securing the facilities. It is simple enough to be applied to large infrastructure facilities without developing highly complicated models. Finally, it is applicable to facilities with extensive security as well as those that are less well-protected.
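The ranking idea in the abstract above can be illustrated with a toy model. This is a minimal sketch, not the paper's actual formulation: it assumes an asset's risk is its consequence times the success probability of the attacker's best combined physical/cyber path, with independent protection layers; all names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AttackPath:
    # Probability that the physical / cyber protection elements
    # each stop the attack (assumed independent in this sketch).
    p_physical_stop: float
    p_cyber_stop: float

    def p_success(self) -> float:
        # The attack succeeds only if it defeats BOTH layers,
        # e.g. disabling an alarm system before forced entry.
        return (1 - self.p_physical_stop) * (1 - self.p_cyber_stop)

def rank_assets(assets: dict[str, tuple[float, list[AttackPath]]]) -> list[tuple[str, float]]:
    """assets maps name -> (consequence, candidate attack paths).
    Risk = consequence x success probability of the attacker's best path;
    returns assets sorted from most to least at risk."""
    risks = {name: cons * max(p.p_success() for p in paths)
             for name, (cons, paths) in assets.items()}
    return sorted(risks.items(), key=lambda kv: kv[1], reverse=True)
```

A 'what if' analysis then amounts to re-running `rank_assets` with a mitigation measure's improved stop probabilities and comparing the rankings.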
Bas, Esra
2014-07-01
In this paper, an integrated methodology for Quality Function Deployment (QFD) and a 0-1 knapsack model is proposed for occupational safety and health as a systems thinking approach. The House of Quality (HoQ) in QFD methodology is a systematic tool to consider the inter-relationships between two factors. In this paper, three HoQs are used to consider the interrelationships between tasks and hazards, hazards and events, and events and preventive/protective measures. The final priority weights of events are defined by considering their project-specific preliminary weights, probability of occurrence, and effects on the victim and the company. The priority weights of the preventive/protective measures obtained in the last HoQ are fed into a 0-1 knapsack model for the investment decision. Then, the selected preventive/protective measures can be adapted to the task design. The proposed step-by-step methodology can be applied to any stage of a project to design the workplace for occupational safety and health, and continuous improvement for safety is endorsed by the closed loop characteristic of the integrated methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
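The final investment step described above is a standard 0-1 knapsack. A minimal dynamic-programming sketch follows; the function name, integer costs, and example data are illustrative assumptions, not taken from the paper.

```python
def select_measures(weights, costs, budget):
    """0-1 knapsack: maximize the total priority weight of the chosen
    preventive/protective measures subject to a budget (integer costs)."""
    n = len(weights)
    # dp[b] = best total priority weight achievable with budget b
    dp = [0.0] * (budget + 1)
    keep = [[False] * (budget + 1) for _ in range(n)]
    for i in range(n):
        # Iterate budget downwards so each measure is used at most once
        for b in range(budget, costs[i] - 1, -1):
            cand = dp[b - costs[i]] + weights[i]
            if cand > dp[b]:
                dp[b] = cand
                keep[i][b] = True
    # Backtrack to recover the indices of the selected measures
    chosen, b = [], budget
    for i in range(n - 1, -1, -1):
        if keep[i][b]:
            chosen.append(i)
            b -= costs[i]
    return dp[budget], sorted(chosen)
```

Here the weights would be the priority weights of the preventive/protective measures from the last HoQ, and the selected measures would then be adapted to the task design.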
2012-07-31
Report Type: Final Technical Report. Proposal Number: 54518EL. Agreement Number... 06_01_2012 (Accepted). ...differences between species, and to show the response of spores to vacuum and the response of cultured cells to heat. The results of this work confirmed that observed spectroscopic features...
OARE STS-78 (LMS-1) Final Report
NASA Technical Reports Server (NTRS)
Rice, James E.
1996-01-01
The report is organized into sections representing the phases of work performed in analyzing the STS-78 (LMS-1) results. Section 1 briefly outlines the Orbital Acceleration Research Experiment (OARE) system features, coordinates, and measurement parameters. Section 2 describes the results from STS-78. The mission description, data calibration, and representative data obtained on STS-78 are presented. Also, the anomalous performance of OARE on STS-78 is discussed. Finally, Section 3 presents a discussion of accuracy achieved and achievable with OARE. Appendix A discusses the data processing methodology in detail.
Blais, Jules M.; Rosen, Michael R.; Smol, John P.
2015-01-01
Newly produced, as well as some so-called legacy contaminants, continue to be released into the environment at an accelerated rate. Given the general lack of integrated, direct monitoring programs, the use of natural archival records of contaminants will almost certainly continue to increase. We conclude this volume with a short chapter highlighting some of our final thoughts, with a focus on a call to action to develop and apply methodologies to assess the fidelity of the archival record.
Novel horizontal and vertical integrated bioethics curriculum for medical courses.
D'Souza, Russell F; Mathew, Mary; D'Souza, Derek S J; Palatty, Princy
2018-02-28
Studies conducted by the University of Haifa, Israel, in 2001, evaluating the effectiveness of bioethics teaching in medical colleges, suggested that there was a significant lack of translation into clinical care. Analysis also revealed ineffectiveness of the teaching methodology used, a lack of longitudinal integration of bioethics into the undergraduate medical curriculum, and limited exposure to the technology of decision making when confronting ethical dilemmas. A modern, novel bioethics curriculum and an innovative methodology for teaching bioethics in the medical course were developed by the UNESCO Chair in Bioethics, Haifa. The horizontal (subject-wise) curriculum was vertically integrated seamlessly through the entire course. An innovative bioethics teaching methodology was employed to implement the curriculum. This new curriculum was piloted in a few medical colleges in India from 2011 to 2015 and the outcomes were evaluated. The evaluation confirmed gains over the earlier identified translation gap, with high student acceptability and satisfaction. This integrated curriculum is now formally implemented in India's Health Science Universities, which are affiliated with over 200 medical schools. This article offers insights from the evaluated novel integrated bioethics curriculum and the innovative bioethics teaching methodology used in the pilot program.
Wada, Yoshinao; Dell, Anne; Haslam, Stuart M; Tissot, Bérangère; Canis, Kévin; Azadi, Parastoo; Bäckström, Malin; Costello, Catherine E; Hansson, Gunnar C; Hiki, Yoshiyuki; Ishihara, Mayumi; Ito, Hiromi; Kakehi, Kazuaki; Karlsson, Niclas; Hayes, Catherine E; Kato, Koichi; Kawasaki, Nana; Khoo, Kay-Hooi; Kobayashi, Kunihiko; Kolarich, Daniel; Kondo, Akihiro; Lebrilla, Carlito; Nakano, Miyako; Narimatsu, Hisashi; Novak, Jan; Novotny, Milos V; Ohno, Erina; Packer, Nicolle H; Palaima, Elizabeth; Renfrow, Matthew B; Tajiri, Michiko; Thomsson, Kristina A; Yagi, Hirokazu; Yu, Shin-Yi; Taniguchi, Naoyuki
2010-04-01
The Human Proteome Organisation Human Disease Glycomics/Proteome Initiative recently coordinated a multi-institutional study that evaluated methodologies that are widely used for defining the N-glycan content in glycoproteins. The study convincingly endorsed mass spectrometry as the technique of choice for glycomic profiling in the discovery phase of diagnostic research. The present study reports the extension of the Human Disease Glycomics/Proteome Initiative's activities to an assessment of the methodologies currently used for O-glycan analysis. Three samples of IgA1 isolated from the serum of patients with multiple myeloma were distributed to 15 laboratories worldwide for O-glycomics analysis. A variety of mass spectrometric and chromatographic procedures representative of current methodologies were used. Similar to the previous N-glycan study, the results convincingly confirmed the pre-eminent performance of MS for O-glycan profiling. Two general strategies were found to give the most reliable data, namely direct MS analysis of mixtures of permethylated reduced glycans in the positive ion mode and analysis of native reduced glycans in the negative ion mode using LC-MS approaches. In addition, mass spectrometric methodologies to analyze O-glycopeptides were also successful.
Architectural Methodology Report
NASA Technical Reports Server (NTRS)
Dhas, Chris
2000-01-01
The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kinds of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); and Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.
Alvarez-Nemegyei, José; Buenfil-Rello, Fátima Annai; Pacheco-Pantoja, Elda Leonor
2016-01-01
Reports regarding the association between body composition and inflammatory activity in rheumatoid arthritis (RA) have consistently yielded contradictory results. To perform a systematic review on the association between overweight/obesity and inflammatory activity in RA. FAST approach: article search (Medline, EBSCO, Cochrane Library), followed by abstract retrieval, full-text review and blinded assessment of methodological quality for final inclusion. Because of marked heterogeneity in statistical approach and RA activity assessment method, a meta-analysis could not be done. Results are presented as a qualitative synthesis. One hundred and nineteen reports were found, 16 of which qualified for full-text review. Eleven studies (8,147 patients; n range: 37-5,161) passed the methodological quality filter and were finally included. Interobserver agreement for the methodological quality score (ICC: 0.93; 95% CI: 0.82-0.98; P<.001) and the inclusion/rejection decision (κ = 1.00; P<.001) was excellent. In all reports body composition was assessed by BMI; however, marked heterogeneity was found in the method used for RA activity assessment. A significant association between BMI and RA activity was found in the 6 reports with the larger mean sample size: 1,274 (range: 140-5,161). On the other hand, this association was not found in the 5 studies with the lower mean sample size: 100 (range: 7-150). Modulation of RA clinical status by body fat mass is suggested, because a significant association was found between BMI and inflammatory activity in those reports with a trend toward higher statistical power. The relationship between body composition and clinical activity in RA needs to be addressed in further studies with higher methodological quality. Copyright © 2015 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
van Mil, Anke C C M; Greyling, Arno; Zock, Peter L; Geleijnse, Johanna M; Hopman, Maria T; Mensink, Ronald P; Reesink, Koen D; Green, Daniel J; Ghiadoni, Lorenzo; Thijssen, Dick H
2016-09-01
Brachial artery flow-mediated dilation (FMD) is a popular technique to examine endothelial function in humans. Identifying volunteer and methodological factors related to variation in FMD is important to improve measurement accuracy and applicability. Volunteer-related and methodology-related parameters were collected in 672 volunteers from eight affiliated centres worldwide who underwent repeated measures of FMD. All centres adopted contemporary expert-consensus guidelines for FMD assessment. After calculating the coefficient of variation (%) of the FMD for each individual, we constructed quartiles (n = 168 per quartile). Based on two regression models (volunteer-related factors and methodology-related factors), statistically significant components of these two models were added to a final regression model (calculated as β-coefficient and R). This allowed us to identify factors that independently contributed to the variation in FMD%. Median coefficient of variation was 17.5%, with healthy volunteers demonstrating a coefficient of variation of 9.3%. Regression models revealed age (β = 0.248, P < 0.001), hypertension (β = 0.104, P < 0.001), dyslipidemia (β = 0.331, P < 0.001), time between measurements (β = 0.318, P < 0.001), lab experience (β = -0.133, P < 0.001) and baseline FMD% (β = 0.082, P < 0.05) as contributors to the coefficient of variation. After including all significant factors in the final model, we found that time between measurements, hypertension, baseline FMD% and lab experience with FMD independently predicted brachial artery variability (total R = 0.202). Although FMD% showed good reproducibility, larger variation was observed in conditions with longer time between measurements, hypertension, less experience and lower baseline FMD%. Accounting for these factors may improve FMD% variability.
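The per-individual variability measure used above can be sketched as follows. This assumes the usual definition CV% = 100 × SD / mean over an individual's repeated FMD% measurements, and an even four-way split for the quartile grouping; both details are assumptions, not taken from the study.

```python
import statistics

def cv_percent(repeats: list[float]) -> float:
    """Within-subject coefficient of variation (%) across an
    individual's repeated FMD% measurements: 100 * SD / mean."""
    mean = statistics.fmean(repeats)
    sd = statistics.stdev(repeats)  # sample SD (n - 1 denominator)
    return 100.0 * sd / mean

def quartiles(cvs: list[float]) -> list[list[float]]:
    """Split sorted CVs into four equal-size groups
    (assumes len(cvs) is divisible by 4, as in 672 = 4 x 168)."""
    s = sorted(cvs)
    q = len(s) // 4
    return [s[i * q:(i + 1) * q] for i in range(4)]
```

Candidate volunteer-related and methodology-related predictors of each individual's CV would then be entered into the regression models described in the abstract.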
Ergonomics program management in Tucuruí Hydropower Plant using TPM methodology.
Santos, R M; Sassi, A C; Sá, B M; Miguez, S A; Pardauil, A A
2012-01-01
This paper presents the benefits achieved in ergonomics process management with the use of the TPM (Total Productive Maintenance) methodology at Tucuruí Hydropower Plant. The methodology is aligned with corporate guidelines and with the company's Strategic Planning, and is represented in the TPM Pillars, including the Health Pillar, in which the ergonomics process is embedded. The ergonomic actions produced a 12% reduction in the absenteeism rate due to musculoskeletal disorders and resolved 77.0% of ergonomic non-conformities, which favored a 44.8% rise in the Organizational Climate index and improved the overall performance of the company. Awards confirmed the success of the work: the Award for TPM Excellence in 2001, the Award for Excellence in Consistent TPM Commitment in 2009 and, more recently, the Special Award for TPM Achievement in 2010. The determination of senior management and workers, allied with the involvement and dynamism of the Pillars, has assured the success of this management practice at Tucuruí Hydropower Plant.
Armour, Carl L.; Taylor, Jonathan G.
1991-01-01
This paper summarizes results of a survey conducted in 1988 of 57 U.S. Fish and Wildlife Service field offices. The purpose was to document opinions of biologists experienced in applying the Instream Flow Incremental Methodology (IFIM). Responses were received from 35 offices where 616 IFIM applications were reported. The existence of six monitoring studies designed to evaluate the adequacy of flows provided at sites was confirmed. The two principal categories reported as stumbling blocks to the successful application of IFIM were beliefs that the methodology is technically too simplistic or that it is too complex to apply. Recommendations receiving the highest scores for future initiatives to enhance IFIM use were (1) training and workshops for field biologists; and (2) improving suitability index (SI) curves and computer models, and evaluating the relationship of weighted usable area (WUA) to fish responses. The authors concur that emphasis for research should be on addressing technical concerns about SI curves and WUA.
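The weighted usable area mentioned above is commonly computed in PHABSIM-style IFIM applications as a sum over stream cells of cell area times a composite suitability. A sketch under that assumption (the survey itself does not give the formula, and the multiplicative composite is one of several aggregation conventions in use):

```python
def weighted_usable_area(cells):
    """Weighted usable area (WUA) for one discharge, assuming the
    common multiplicative composite suitability: the product of the
    suitability index (SI) values for depth, velocity, and substrate.
    `cells` is a list of (area, si_depth, si_velocity, si_substrate),
    with each SI in [0, 1]."""
    return sum(area * sd * sv * ss for area, sd, sv, ss in cells)
```

Repeating this at several simulated discharges yields the WUA-versus-flow curve whose relationship to actual fish responses the survey respondents wanted evaluated.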
NASA Astrophysics Data System (ADS)
Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.
2017-09-01
Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.
Extracting the sovereigns’ CDS market hierarchy: A correlation-filtering approach
NASA Astrophysics Data System (ADS)
León, Carlos; Leiton, Karen; Pérez, Jhonatan
2014-12-01
This paper employs correlation-into-distance mapping techniques and a minimal spanning tree-based correlation-filtering methodology on 36 sovereign CDS spread time-series in order to identify the sovereigns’ informational hierarchy. The resulting hierarchy (i) concurs with sovereigns’ eigenvector centrality; (ii) confirms the importance of geographical and credit rating clustering; (iii) identifies Russia, Turkey and Brazil as regional benchmarks; (iv) reveals the idiosyncratic nature of Japan and United States; (v) confirms that a small set of common factors affects the system; (vi) suggests that lower-medium grade rated sovereigns are the most influential, but also the most prone to contagion; and (vii) suggests the existence of a “Latin American common factor”.
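The correlation-into-distance mapping and MST filtering described above follow a well-known recipe: Mantegna's distance d = sqrt(2(1 − ρ)) followed by a minimal spanning tree over the resulting distance matrix. A small sketch, with Prim's algorithm and the three-series example purely illustrative:

```python
import math

def corr_to_dist(rho: float) -> float:
    # Mantegna mapping: perfectly correlated series sit at distance 0,
    # perfectly anti-correlated at distance 2.
    return math.sqrt(2.0 * (1.0 - rho))

def mst_edges(corr):
    """Prim's algorithm on the distance matrix implied by a symmetric
    correlation matrix; returns the (n - 1) tree edges, i.e. the
    correlation-filtered hierarchy."""
    n = len(corr)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # Cheapest edge crossing from the tree to an unvisited node
        best = min(((corr_to_dist(corr[i][j]), i, j)
                    for i in in_tree for j in range(n) if j not in in_tree),
                   key=lambda t: t[0])
        edges.append((best[1], best[2]))
        in_tree.add(best[2])
    return edges
```

Applied to 36 sovereign CDS spread series, node degree and centrality in the resulting tree would identify regional benchmarks of the kind the paper reports.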
Distributive Education Competency-Based Curriculum Models by Occupational Clusters. Final Report.
ERIC Educational Resources Information Center
Davis, Rodney E.; Husted, Stewart W.
To meet the needs of distributive education teachers and students, a project was initiated to develop competency-based curriculum models for marketing and distributive education clusters. The models which were developed incorporate competencies, materials and resources, teaching methodologies/learning activities, and evaluative criteria for the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-10
... final recommendation documents, and participating in workgroups on specific topics or methods. A... literature and in the methods of evidence review; 2. Understanding and experience in the application of... based on their expertise in methodological issues such as medical decisionmaking, clinical epidemiology...
DOT National Transportation Integrated Search
2008-12-08
The objective of this research is to develop a general methodological framework for planning and evaluating the effectiveness of highway reconstruction strategies on the system's performance measures, in particular safety, mobility, and the tot...
A Study of the Inter-Organizational Behavior in Consortia. Final Report.
ERIC Educational Resources Information Center
Silverman, Robert J.
In an attempt to formulate hypotheses and administrative guidelines for voluntary consortia in higher education, a heuristic framework was devised through which behavioral patterns of consortia member organizations and their representatives could be ascertained. The rationale, the framework, and the methodology of the study are first discussed.…
Investigating Relationships among Quality Dimensions in Higher Education
ERIC Educational Resources Information Center
Ardi, Romadhani; Hidayatno, Akhmad; Zagloel, Teuku Yuri M.
2012-01-01
Purpose: This study aims to assess the relationships among quality dimensions in higher education (HE) and to determine the effect of each quality dimension on students' satisfaction. Design/methodology/approach: A questionnaire was developed and distributed to 270 final year students of an engineering faculty in an Indonesian state university.…
Polychlorinated dibenzo-p-dioxins (PCDDs), dibenzofurans (PCDFs), and biphenyls (PCBs) are persistent contaminants found widely in the environment. Several of these compounds bioaccumulate in the tissues of fish, birds, and mammals and have been shown to cause mortality and adver...
50 CFR 86.124 - What are the Comprehensive National Assessment products?
Code of Federal Regulations, 2013 CFR
2013-10-01
...: (a) A single report, including the following information: (1) A national summary of all the..., background, methodology, results, and findings. (6) Information on the following: (i) Boater trends, such as.... (b) Summary report abstracting important information from the final national report. And (c) A key...
50 CFR 86.124 - What are the Comprehensive National Assessment products?
Code of Federal Regulations, 2014 CFR
2014-10-01
...: (a) A single report, including the following information: (1) A national summary of all the..., background, methodology, results, and findings. (6) Information on the following: (i) Boater trends, such as.... (b) Summary report abstracting important information from the final national report. And (c) A key...
42 CFR 422.260 - Appeals of quality bonus payment determinations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... overall star rating. (ii) The reconsideration official's decision is final and binding unless a request... the star ratings (including the calculation of the overall star ratings); cut-off points for determining measure thresholds; the set of measures included in the star rating system; and the methodology...
42 CFR 422.260 - Appeals of quality bonus payment determinations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... overall star rating. (ii) The reconsideration official's decision is final and binding unless a request... the star ratings (including the calculation of the overall star ratings); cut-off points for determining measure thresholds; the set of measures included in the star rating system; and the methodology...
42 CFR 422.260 - Appeals of quality bonus payment determinations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... overall star rating. (ii) The reconsideration official's decision is final and binding unless a request... the star ratings (including the calculation of the overall star ratings); cut-off points for determining measure thresholds; the set of measures included in the star rating system; and the methodology...
42 CFR 422.260 - Appeals of quality bonus payment determinations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... overall star rating. (ii) The reconsideration official's decision is final and binding unless a request... the star ratings (including the calculation of the overall star ratings); cut-off points for determining measure thresholds; the set of measures included in the star rating system; and the methodology...
Project D. E. A. F. Final Report.
ERIC Educational Resources Information Center
Hairston, Ernest E.
Project D.E.A.F., its introduction and background, purpose, clientele, methodology, and activities, are described. Project D.E.A.F. was established by an expansion grant from the Rehabilitation Service Administration of the Department of Health, Education and Welfare, in cooperation with the Ohio Bureau of Vocational Rehabilitation for the purpose…
Educational Planning for Utilization of Space Shuttle (ED-PLUSS). Final Research Report.
ERIC Educational Resources Information Center
Engle, Harry A.; Christensen, David L.
Possible educational uses of the proposed space-shuttle program of the National Aeronautics and Space Administration are outlined. Potential users of information developed by the project are identified and their characteristics analyzed. Other space-education programs operated by NASA are detailed. Proposals for a methodology for expanding…
DOT National Transportation Integrated Search
2004-01-01
The steady growth of commercial truck travel has led to an increasing demand for truck parking spaces at public rest areas and private truck stops on interstate highways in Virginia. This study developed a methodology to determine the supply and dema...
ERIC Educational Resources Information Center
Bhattacharyya, Ena; Patil, Arun; Sargunan, Rajeswary Appacutty
2010-01-01
Engineering communication studies indicate the importance of oral presentations as an indispensable component of workplace oral communication activities; however, since there is limited literature regarding stakeholder perceptions of effective presentation skills and attributes in technical oral presentations or final year engineering project…
Personal Accountability in Education: Measure Development and Validation
ERIC Educational Resources Information Center
Rosenblatt, Zehava
2017-01-01
Purpose: The purpose of this paper, three-study research project, is to establish and validate a two-dimensional scale to measure teachers' and school administrators' accountability disposition. Design/methodology/approach: The scale items were developed in focus groups, and the final measure was tested on various samples of Israeli teachers and…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-12
... that can demonstrate impact through other methodological approaches such as a quasi-experimental design... definition of ``relevant outcome.'' Lastly, quasi-experimental designs are already included in the definition... paragraph (b) of this definition, provided they are rigorous and comparable across schools. (b) For non...
DOT National Transportation Integrated Search
2017-09-29
Past research efforts have used a wide variety of methodological approaches to analyze pavement performance indicators, pavement rehabilitation treatments, and pavement service life. Using big data informatics methods, the intent of this study is to ...
Automatic mathematical modeling for real time simulation program (AI application)
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1989-01-01
A methodology is described for automatic mathematical modeling and generating simulation models. The major objective was to create a user friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and finally, to document the model automatically.
ERIC Educational Resources Information Center
Mora-Whitehurst, Rina
2013-01-01
This article focuses on elementary principals as instructional leaders, as well as public school initiatives and educational accountability in the United States. It presents the methodology, instrumentation, measures of academic achievement in Florida, data collection, and processing procedures. Finally, it presents data analysis, results of the…
Final Report of Charles E. Strickland.
ERIC Educational Resources Information Center
Strickland, Charles E.
This report is a brief account of the author's 1968-69 postdoctoral fellowship activities at Harvard, studying the bearing of social science literature and methodology on the history of education, particularly with reference to historical patterns of socialization. Specific products to which reference is made include two papers: "Childhood in…
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Differential V-Q Ability: Twenty Years Later
ERIC Educational Resources Information Center
McCarthy, S. Viterbo
1975-01-01
The initial portion of this paper addresses itself to some of the methodological concerns associated with Verbal-Quantitative (V-Q) research. The second section focuses on studies using differential V-Q ability as an independent variable. The final section focuses on reasearch using V-Q ability measures as dependent variables. (Author/BJG)
Students' Entrepreneurial Self-Efficacy: Does the Teaching Method Matter?
ERIC Educational Resources Information Center
Abaho, Ernest; Olomi, Donath R.; Urassa, Goodluck Charles
2015-01-01
Purpose: The purpose of this paper is to examine the various entrepreneurship teaching methods in Uganda and how these methods relate to entrepreneurial self-efficacy (ESE). Design/methodology/approach: A sample of 522 final year students from selected universities and study programs was surveyed using self-reported questionnaires. Findings: There…
ERIC Educational Resources Information Center
System Development Corp., Santa Monica, CA.
A national data program for the marine environment is recommended. Volume 2 includes: (1) objectives, scope, and methodology; (2) summary of the technical development plan; (3) agency development plans - Great Lakes and coastal development and (4) marine data network development plans. (Author)
Assessing System Architectures: The Canonical Decomposition Fuzzy Comparative Methodology
2011-01-01
me. Thank you to my sisters, Vanessa and Valerie, for their support and for putting up with me while we were growing up. Finally and most... Antenna Handbook: Theory, Applications, and Design. New York: Van Nostrand Reinhold. Maier, M. W. and E. Rechtin. 2002. The Art of Systems
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Cooperative Catalog Conversion Study. Final Report.
ERIC Educational Resources Information Center
Peat, Marwick, Mitchell and Co., Washington, DC.
Cost estimates provided by cataloging vendors during January 1981 are analyzed to identify the costs of catalog conversion options and alternatives to the card catalog for six Minnesota regional library systems. Following an executive summary of the study is a discussion of its background, scope, objectives, data gathering methodology, and…
Developing a Methodology for Designing Systems of Instruction.
ERIC Educational Resources Information Center
Carpenter, Polly
This report presents a description of a process for instructional system design, identification of the steps in the design process, and determination of their sequence and interrelationships. As currently envisioned, several interrelated steps must be taken, five of which provide the inputs to the final design process. There are analysis of…
2013-10-03
: In the fiscal year (FY) 2014 inpatient prospective payment systems (IPPS)/long-term care hospital (LTCH) PPS final rule, we established the methodology for determining the amount of uncompensated care payments made to hospitals eligible for the disproportionate share hospital (DSH) payment adjustment in FY 2014 and a process for making interim and final payments. This interim final rule with comment period revises certain operational considerations for hospitals with Medicare cost reporting periods that span more than one Federal fiscal year and also makes changes to the data that will be used in the uncompensated care payment calculation in order to ensure that data from Indian Health Service (IHS) hospitals are included in Factor 1 and Factor 3 of that calculation.
An empirical study using permutation-based resampling in meta-regression
2012-01-01
Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality, which may not hold in small samples. Creating a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large-sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large-sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large-sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical in 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
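The permutation approach described in this abstract can be sketched in a few lines: the covariate is repeatedly shuffled across trials and the meta-regression slope recomputed, building an empirical null distribution from which the P value is taken. The sketch below is illustrative only, not the authors' implementation; the effect sizes, variances, and Jadad scores are hypothetical data invented for the example.

```python
import numpy as np

def wls_slope(y, x, w):
    """Weighted least-squares slope of y on x with weights w
    (inverse-variance weights, as in fixed-effect meta-regression)."""
    xm = np.average(x, weights=w)
    ym = np.average(y, weights=w)
    return np.sum(w * (x - xm) * (y - ym)) / np.sum(w * (x - xm) ** 2)

def permutation_p_value(effects, covariate, variances, n_perm=10_000, seed=0):
    """Two-sided permutation P value for a single meta-regression covariate.

    The covariate (e.g. a methodological-quality score) is shuffled across
    trials; the slope is recomputed for each permutation and compared with
    the observed slope.
    """
    rng = np.random.default_rng(seed)
    w = 1.0 / np.asarray(variances)
    observed = wls_slope(effects, covariate, w)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(covariate)
        if abs(wls_slope(effects, perm, w)) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one correction avoids P = 0

# Hypothetical data: 8 trials with standardized mean differences,
# within-trial variances, and Jadad quality scores.
effects = np.array([0.30, 0.45, 0.10, 0.60, 0.25, 0.50, 0.05, 0.40])
variances = np.array([0.04, 0.06, 0.05, 0.08, 0.03, 0.07, 0.05, 0.06])
jadad = np.array([2, 3, 1, 5, 2, 4, 1, 3], dtype=float)

p = permutation_p_value(effects, jadad, variances)
print(f"permutation P = {p:.3f}")
```

With so few trials the permutation P value is typically larger than the asymptotic one, which is the behaviour the study reports for most of its final models.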
Nikolić, Branka; Ćurković, Aleksandar; Dikić, Svetlana Dragojević; Mitrović, Ana; Kuzmanović, Igor; Arandjelović, Aleksandra; Stanković, Goran
2015-07-01
Gestational trophoblastic neoplasm (GTN), in this case choriocarcinoma in coexistence with primary cervical adenocarcinoma, is a rare event that is not easy to diagnose. Choriocarcinoma is a malignant form of GTN but is curable if metastases do not appear early and spread fast. We present choriocarcinoma in coexistence with primary cervical adenocarcinoma in a 48-year-old patient who underwent radical hysterectomy because of confirmed cervical carcinoma (Dg: Carcinoma portionis vaginalis uteri, FIGO st. IB1). Histological findings confirmed cervical choriocarcinoma with extensive vascular invasion and apoptosis, but GTN choriocarcinoma was finally confirmed after immunohistochemical examination. The preoperative serum human chorionic gonadotropin (beta hCG) level remained unknown. This patient did not have any pregnancy-like symptoms before the operation. The first beta hCG monitoring was done two months after the operation and was negative. Based on the final diagnosis, the Consilium for Malignant Diseases decided that this patient needed serum hCG monitoring as well as treatment with chemotherapy for high-risk GTN and subsequent irradiation for the adenocarcinoma. With early and proper diagnosis, nonmetastatic choriocarcinoma of nongestational origin in coexistence with cervical carcinoma is curable and can have a good prognosis.
Navigating the grounded theory terrain. Part 1.
Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John
2011-01-01
The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.
NASA Astrophysics Data System (ADS)
Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.
2012-03-01
Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as smooth contours centered at peaks of stored thermal energy, termed Regions of Interest (ROIs). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks that deals with the difficulties caused by non-uniform heating, which affects low spatial frequencies and hinders the detection of relevant points in the image. The proposed methodology uses multi-resolution analysis and is robust to low ROI contrast and non-uniform heating. It includes local correlation, Gaussian scale analysis, and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; a Gaussian window is used because thermal behavior is well modeled by smooth Gaussian contours. The Gaussian scale is used to analyze details in the image through multi-resolution analysis, avoiding low contrast, non-uniform heating, and the need to select the Gaussian window size. Finally, local edge detection provides a good estimate of the ROI boundaries. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that performs as well as or better than other dedicated algorithms proposed in the state of the art.
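The local-correlation step of such a pipeline can be illustrated with a minimal sketch: a zero-mean Gaussian window is correlated with every image patch, so that smooth Gaussian-like hot spots score highly while a slowly varying background (non-uniform heating) scores near zero. This is not the authors' code; the synthetic thermogram, window size, and sigma are invented for the example, and the multi-resolution and edge-detection stages are omitted.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """2-D Gaussian window of the given size and standard deviation."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))

def local_correlation(image, kernel):
    """Normalized local correlation between the image and a Gaussian window.

    High values flag patches whose shape resembles the smooth Gaussian
    contour used to model a stored-thermal-energy peak (an ROI candidate).
    Mean subtraction of each patch cancels locally linear background trends,
    such as a slow heating gradient.
    """
    k = kernel - kernel.mean()
    k /= np.linalg.norm(k)
    h, w = kernel.shape
    H, W = image.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + h, j:j + w].astype(float)
            patch = patch - patch.mean()
            norm = np.linalg.norm(patch)
            out[i, j] = (patch * k).sum() / norm if norm > 0 else 0.0
    return out

# Synthetic thermogram: a slow gradient mimicking non-uniform heating
# plus one Gaussian "hot spot" centred at row 12, column 28.
H, W = 40, 40
yy, xx = np.mgrid[0:H, 0:W]
background = 0.01 * xx
hotspot = np.exp(-((xx - 28)**2 + (yy - 12)**2) / (2 * 3.0**2))
image = background + hotspot

corr = local_correlation(image, gaussian_kernel(11, 3.0))
i, j = np.unravel_index(np.argmax(corr), corr.shape)
print("ROI centre (row, col):", i + 5, j + 5)  # offset by the window half-size
```

The correlation peak lands on the hot spot rather than on the gradient, which is the property the abstract attributes to this step.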
Carfora, Anna; Campobasso, Carlo Pietro; Cassandro, Paola; Petrella, Raffaella; Borriello, Renata
2018-05-09
A recent update of the Italian Road Traffic Law (RTL 41/2016) established severe penal sanctions for drivers who, while driving under the influence of alcohol (DUI) or drugs (DUID), are involved in a road accident that results in death or injuries. A study was carried out to assess trends in the consumption of alcohol, illicit drugs, or pharmaceuticals among injured drivers suspected of DUI or DUID from 2009 to 2016 in the region of Campania (Italy). Confirmation toxicological analyses were performed on 780 blood samples and 1017 urine samples collected from 1797 injured drivers. These drivers all tested positive for alcohol or drug use through immunoassay screening applied at hospital emergency units, and their biological samples were transferred to the Forensic Reference Laboratory (FRL) for confirmation analysis. The GC/HS-FID methodology was used to test Blood Alcohol Concentration (BAC). Qualitative and quantitative analyses for drugs were performed using GC/MS or LC-MS/MS methodology. A BAC >0.5 g/L was confirmed in 91.5% of drivers suspected of DUI and in 93% of those suspected of DUID, respectively. In DUI cases, the results show an increasing incidence of road accidents involving drivers with a BAC above 1.5 g/L, while at concentrations above 0.8 g/L alcohol and drugs are both used. Among the suspected DUID cases, the intake of alcohol in association with drugs consistently increased over time, and positive blood results were confirmed for multiple drugs (20%) or cannabis and cocaine alone (18%), followed by benzodiazepines (6%) and methadone (3.5%), respectively. The majority of injured drivers suspected of DUID (1017 cases) did not authorize blood sampling; therefore, only urine was analyzed, showing prevalent use of cannabis, followed by multiple drugs > cocaine > benzodiazepines > opiates. Among the 1797 drivers suspected at screening of DUI or DUID, 15.4% of cases (64 blood and 213 urine samples) were not confirmed by GC/HS, GC/MS, or LC-MS/MS analysis.
In forensic toxicological investigations, it is mandatory to satisfy the best quality standards, which is not achievable if immunochemical screening is performed on urine alone. Therefore, only confirmed positive results for alcohol or drugs in blood samples can represent conclusive evidence of DUI- or DUID-related offences. The protocols currently applied in Italy for the assessment of DUI or DUID crimes need improvement, and confirmation analysis on blood should be considered mandatory.
On the hunt for the gene of perspective taking: pitfalls in methodology.
Miklósi, Adám; Topál, József
2011-12-01
In this commentary, we evaluate how the experiment by Udell, Dorey, and Wynne (Learning & Behavior, in press) controlled for environmental factors and argue that their conclusion is not supported. In particular, we emphasise that comparative studies on dogs and wolves need to ensure that both species experienced the same rearing history, are comparable in age, and have the same experience with the testing conditions. We also argue that the use of shelter dogs does not control for genetic effects on social behaviour. Finally, we propose a synergetic model to account for both genetic and environmental effects on interspecific social behaviour in dogs and wolves.
Methodology of shell structure reinforcement layout optimization
NASA Astrophysics Data System (ADS)
Szafrański, Tomasz; Małachowski, Jerzy; Damaziak, Krzysztof
2018-01-01
This paper presents the optimization of a reinforced shell diffuser intended for a small wind turbine (rated power of 3 kW). The diffuser structure consists of multiple reinforcements and a metal skin. This kind of structure is suitable for optimization in terms of the selection of reinforcement density, stringer cross-sections, sheet thickness, etc. The optimization approach aims to reduce the amount of work required between the optimization process and the final product design. The proposed methodology applies a genetic algorithm to generate the optimal reinforcement layout. The obtained results are the basis for modifying the existing Small Wind Turbine (SWT) design.
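A genetic algorithm for a reinforcement layout of this kind can be sketched as a binary optimization: each gene states whether a stringer is present at one of N candidate positions, and the fitness trades off mass (fewer stringers) against a crude stiffness proxy (enough stringers, without long unreinforced gaps). Everything below, including the number of positions, the penalty weights, and the GA settings, is a toy illustration, not the study's actual model.

```python
import random

N_POSITIONS = 16      # candidate stringer positions around the shell
MIN_STRINGERS = 5     # stiffness proxy: at least this many stringers
POP_SIZE = 40
GENERATIONS = 60

def max_gap(layout):
    """Largest circular spacing between consecutive stringers."""
    idx = [i for i, g in enumerate(layout) if g]
    if len(idx) < 2:
        return N_POSITIONS
    gaps = [(idx[(k + 1) % len(idx)] - idx[k]) % N_POSITIONS
            for k in range(len(idx))]
    return max(gaps)

def fitness(layout):
    """Higher is better: penalize infeasible layouts, then minimize
    mass (stringer count) and the largest unreinforced gap."""
    n = sum(layout)
    if n < MIN_STRINGERS:
        return -1000.0 + n            # heavy penalty for infeasible layouts
    return -(n + 2.0 * max_gap(layout))

def evolve(seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_POSITIONS)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        def pick():                    # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < POP_SIZE:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, N_POSITIONS)   # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(N_POSITIONS):          # bit-flip mutation
                if rng.random() < 0.02:
                    child[i] = 1 - child[i]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print("best layout:", best, "fitness:", fitness(best))
```

In the study, each candidate layout would instead be scored by a structural analysis of the shell; the GA machinery (selection, crossover, mutation) stays the same, only the fitness function changes.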