A New Methodology for Systematic Exploitation of Technology Databases.
ERIC Educational Resources Information Center
Bedecarrax, Chantal; Huot, Charles
1994-01-01
Presents the theoretical aspects of a data analysis methodology that can help transform sequential raw data from a database into useful information, using the statistical analysis of patents as an example. Topics discussed include relational analysis and a technology watch approach. (Contains 17 references.) (LRW)
A design methodology for nonlinear systems containing parameter uncertainty
NASA Technical Reports Server (NTRS)
Young, G. E.; Auslander, D. M.
1983-01-01
In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of the j adjustable parameter values that maximizes the probability that the performance indices simultaneously satisfy the design criteria, despite the uncertainty due to the k nonadjustable parameters.
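The abstract describes the approach only at a high level; the sketch below illustrates the general idea in Python, searching over j adjustable parameters while Monte Carlo sampling the k nonadjustable ones. The performance test, parameter bounds, uncertainty range, and adaptation rule are hypothetical placeholders, not the report's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def meets_criteria(x_adj, x_nonadj):
    # Hypothetical performance test: True if all design criteria are
    # satisfied for this combination of parameters (placeholder index).
    perf = np.sum(x_adj**2) - np.sum(x_nonadj)
    return perf < 1.0

def success_probability(x_adj, n_samples=500):
    # Estimate P(criteria met) by sampling the k nonadjustable parameters.
    hits = 0
    for _ in range(n_samples):
        x_nonadj = rng.uniform(-0.5, 0.5, size=2)  # assumed uncertainty range
        hits += meets_criteria(x_adj, x_nonadj)
    return hits / n_samples

# Adaptive random search: shrink or grow the search radius around the best point.
best_x, best_p, radius = rng.uniform(-1, 1, size=3), 0.0, 1.0
for it in range(200):
    cand = best_x + rng.normal(scale=radius, size=best_x.size)
    p = success_probability(cand)
    if p > best_p:
        best_x, best_p = cand, p          # accept improvement
        radius = min(1.0, radius * 1.2)   # widen search after success
    else:
        radius *= 0.98                    # gradually focus the search
print(best_x, best_p)
```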
Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment
NASA Technical Reports Server (NTRS)
Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.
2009-01-01
An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.
ICCE/ICCAI 2000 Full & Short Papers (Methodologies).
ERIC Educational Resources Information Center
2000
This document contains the full text of the following full and short papers on methodologies from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Methodology for Learning Pattern Analysis from Web Logs by Interpreting Web Page Contents" (Chih-Kai Chang and…
Biviano, Marilyn B.; Wagner, Lorie A.; Sullivan, Daniel E.
1999-01-01
Materials consumption estimates, such as apparent consumption of raw materials, can be important indicators of sustainability. Apparent consumption of raw materials does not account for material contained in manufactured products that are imported or exported and may thus under- or over-estimate total consumption of materials in the domestic economy. This report demonstrates a methodology to measure the amount of materials contained in net imports (imports minus exports), using lead as an example. The analysis presents illustrations of differences between apparent and total consumption of lead and distributes these differences into individual lead-consuming sectors.
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold: a rule-based methodology and a more conventional software-based analysis. The rule-based analysis is done with DNV GL's software POSEIDON and the conventional package-based analysis with the ANSYS structural module. Both methods have been applied to analyze mechanical properties of the model such as total deformation, stress-strain distribution, von Mises stress, and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.
Gosetti, Fabio; Chiuminatto, Ugo; Mazzucco, Eleonora; Mastroianni, Rita; Marengo, Emilio
2015-01-15
The study investigates the sunlight photodegradation process of carminic acid, a natural red colourant used in beverages. For this purpose, both carminic acid aqueous standard solutions and sixteen different commercial beverages, ten containing carminic acid and six containing E120 dye, were subjected to photoirradiation. The results show different patterns of degradation, not only between the standard solutions and the beverages, but also from beverage to beverage. Due to the different beverage recipes, unpredictable reactions take place between the dye and the other ingredients. To identify the dye degradation products in a very complex scenario, a methodology was used, based on the combined use of principal component analysis with discriminant analysis and ultra-high-performance liquid chromatography coupled with tandem high resolution mass spectrometry. The methodology is unaffected by beverage composition and allows the degradation products of carminic acid dye to be identified for each beverage. Copyright © 2014 Elsevier Ltd. All rights reserved.
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,
1975-12-01
Adaptive Single Exponential Smoothing; Choosing the Smoothing Constant ... methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data, a ... Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and
Development of Methodologies Evaluating Emissions from Metal-Containing Explosives and Propellants
Experiments were performed to develop methodologies that will allow determination of pollutant emission factors for gases and particles produced by...micrometer, 16 by weight). Although not included here, the analysis methods described will be directly applicable to the study of pyrotechnics.
Proposed Objective Odor Control Test Methodology for Waste Containment
NASA Technical Reports Server (NTRS)
Vos, Gordon
2010-01-01
The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentrations quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented detectable smell threshold for humans of 0.025 ppm, and a limit of quantitation of 15 ppb.
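As a rough illustration of how such a quantitative test could be scored (the containment-factor metric below is an assumption, patterned on respirator fit-factor calculations rather than taken from the proposal; the concentrations are invented):

```python
# Hypothetical scoring of an odor-containment test with isoamyl acetate (IAA).
IAA_ODOR_THRESHOLD_PPM = 0.025     # documented human detection threshold
LIMIT_OF_QUANTITATION_PPM = 0.015  # 15 ppb quantitation limit cited above

def containment_factor(ppm_inside_bag, ppm_outside_bag):
    """Ratio of challenge concentration to leaked concentration,
    analogous to a respirator fit factor (assumed metric)."""
    measured = max(ppm_outside_bag, LIMIT_OF_QUANTITATION_PPM)
    return ppm_inside_bag / measured

cf = containment_factor(ppm_inside_bag=100.0, ppm_outside_bag=0.02)
detectable = 0.02 > IAA_ODOR_THRESHOLD_PPM   # would the leaked level be smelled?
print(cf, detectable)
```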
Dowall, Stuart D; Graham, Victoria A; Tipton, Thomas R W; Hewson, Roger
2009-08-31
Work with highly pathogenic material mandates the use of biological containment facilities, involving microbiological safety cabinets and specialist laboratory engineering structures typified by containment level 3 (CL3) and CL4 laboratories. Consequences of working in high containment are the practical difficulties associated with containing the specialist assays and equipment often essential for experimental analyses. In an era of increased interest in biodefence pathogens and emerging diseases, immunological analysis has developed rapidly alongside traditional techniques in virology and molecular biology. For example, in order to maximise the use of small sample volumes, multiplexing has become a more popular and widespread approach to quantify multiple analytes simultaneously, such as cytokines and chemokines. The Luminex microsphere system allows for the detection of many cytokines and chemokines in a single sample, but the detection method, which relies on aligned lasers and fluidics, means that samples often have to be analysed in low containment facilities. In order to perform cytokine analysis on materials from high containment (CL3 and CL4 laboratories), we have developed an appropriate inactivation methodology applied after the staining steps which, although it results in a reduction of median fluorescent intensity, produces statistically comparable outcomes when judged against non-inactivated samples. This methodology thus extends the use of Luminex technology to material that contains highly pathogenic biological agents.
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Development of a Practical Methodology for Elastic-Plastic and Fully Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Lee, Y. -D.; Russell, D. A.; Orient, G. E.
1999-01-01
A practical engineering methodology has been developed to analyze and predict fatigue crack growth rates under elastic-plastic and fully plastic conditions. The methodology employs the closure-corrected effective range of the J-integral, ΔJ_eff, as the governing parameter. The methodology contains original and literature J and ΔJ solutions for specific geometries, along with general methods for estimating J for other geometries and other loading conditions, including combined mechanical loading and combined primary and secondary loading. The methodology also contains specific practical algorithms that translate a J solution into a prediction of fatigue crack growth rate or life, including methods for determining crack opening levels, crack instability conditions, and material properties. A critical core subset of the J solutions and the practical algorithms has been implemented into independent elastic-plastic NASGRO modules. All components of the entire methodology, including the NASGRO modules, have been verified through analysis and experiment, and limits of applicability have been identified.
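For context, elastic-plastic fatigue crack growth rates are commonly correlated with the effective J-integral range through a power law of the general form below; the specific correlations and constants used in the NASGRO modules are not given in this abstract, so this is only the generic relation, where C and m are fitted material constants and ΔJ_eff is evaluated over the portion of the load cycle above the crack opening level.

```latex
\frac{da}{dN} = C \,\bigl(\Delta J_{\mathrm{eff}}\bigr)^{m}
```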
GuidosToolbox: universal digital image object analysis
Peter Vogt; Kurt Riitters
2017-01-01
The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...
Methodological Issues in Meta-Analyzing Standard Deviations: Comment on Bond and DePaulo (2008)
ERIC Educational Resources Information Center
Pigott, Therese D.; Wu, Meng-Jia
2008-01-01
In this comment on C. F. Bond and B. M. DePaulo, the authors raise methodological concerns about the approach used to analyze the data. The authors suggest further refinement of the procedures used, and they compare the approach taken by Bond and DePaulo with standard methods for meta-analysis. (Contains 1 table and 2 figures.)
Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Joseph Daniel; Anderson, Robert Stephen
Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.
A Narrative in Search of a Methodology.
Treloar, Anna; Stone, Teresa Elizabeth; McMillan, Margaret; Flakus, Kirstin
2015-07-01
Research papers present us with the summaries of scholars' work; what we readers do not see are the struggles behind the decision to choose one methodology over another. A student's mental health portfolio contained a narrative that led to an exploration of the most appropriate methodology for a projected study of clinical anecdotes told by nurses who work in mental health settings to undergraduates and new recruits about mental health nursing. This paper describes the process of struggle, beginning with the student's account, before posing a number of questions needing answers before the choice of the most appropriate methodology. We argue, after discussing the case for the use of literary analysis, discourse analysis, symbolic interactionism, hermeneutics, and narrative research, that case study research is the methodology of choice. Case study is frequently used in educational research and is sufficiently flexible to allow for an exploration of the phenomenon. © 2014 Wiley Periodicals, Inc.
System analysis through bond graph modeling
NASA Astrophysics Data System (ADS)
McBride, Robert Thomas
2005-07-01
Modeling and simulation form an integral role in the engineering design process. An accurate mathematical description of a system provides the design engineer the flexibility to perform trade studies quickly and accurately to expedite the design process. Most often, the mathematical model of the system contains components of different engineering disciplines. A modeling methodology that can handle these types of systems might be used in an indirect fashion to extract added information from the model. This research examines the ability of a modeling methodology to provide added insight into system analysis and design. The modeling methodology used is bond graph modeling. An investigation into the creation of a bond graph model using the Lagrangian of the system is provided. Upon creation of the bond graph, system analysis is performed. To aid in the system analysis, an object-oriented approach to bond graph modeling is introduced. A framework is provided to simulate the bond graph directly. Through object-oriented simulation of a bond graph, the information contained within the bond graph can be exploited to create a measurement of system efficiency. A definition of system efficiency is given. This measurement of efficiency is used in the design of different controllers of varying architectures. Optimal control of a missile autopilot is discussed within the framework of the calculated system efficiency.
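A minimal illustration of the kind of object-oriented bond graph representation described above; the classes, numbers, and the efficiency definition are illustrative assumptions patterned on the abstract, not the dissertation's actual framework.

```python
class Bond:
    """A power bond carrying an effort-flow pair; power = effort * flow."""
    def __init__(self, effort=0.0, flow=0.0):
        self.effort, self.flow = effort, flow
    @property
    def power(self):
        return self.effort * self.flow

class Resistor:
    """R element: dissipative, effort = R * flow."""
    def __init__(self, R):
        self.R = R
    def effort(self, flow):
        return self.R * flow

# Example: a source driving a load through a dissipative element.
source, load = Bond(effort=10.0, flow=2.0), Bond()
r = Resistor(R=1.5)
load.flow = source.flow                         # series (1-junction) constraint
load.effort = source.effort - r.effort(load.flow)
efficiency = load.power / source.power          # one possible efficiency measure
print(f"delivered {load.power:.1f} W of {source.power:.1f} W, eff = {efficiency:.2f}")
```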
ERIC Educational Resources Information Center
2003
The Communication Theory & Methodology Division of the proceedings contains the following 14 papers: "Interaction As a Unit of Analysis for Interactive Media Research: A Conceptualization" (Joo-Hyun Lee and Hairong Li); "Towards a Network Approach of Human Action: Theoretical Concepts and Empirical Observations in Media…
ERIC Educational Resources Information Center
Gilpatrick, Eleanor
This report contains the results of a pilot test which represents the first complete field test of methodological work begun in October 1967 under a Federal grant for the purpose of job analysis in the health services. This 4-year Health Services Mobility Study permitted basic research, field testing, practical application, and policy involvement…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-01
This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)
Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S
2015-03-02
A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox®). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
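A hedged sketch of how EC50 values like those reported could be estimated from luminescence readings, assuming a standard log-logistic dose-response model; the concentrations and inhibition fractions below are invented, and the actual data reduction used with the SIA system is not described in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def inhibition(conc, ec50, hill):
    """Fraction of V. fischeri luminescence inhibited at a given concentration."""
    return conc**hill / (ec50**hill + conc**hill)

# Hypothetical ionic-liquid concentrations (mg/L) and measured inhibition fractions.
conc = np.array([1, 5, 10, 50, 100, 500], dtype=float)
inhib = np.array([0.05, 0.18, 0.33, 0.71, 0.84, 0.97])

(ec50, hill), _ = curve_fit(inhibition, conc, inhib, p0=[20.0, 1.0])
print(f"EC50 ≈ {ec50:.1f} mg/L, Hill slope ≈ {hill:.2f}")
```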
DESIGN ANALYSIS FOR THE NAVAL SNF WASTE PACKAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
T.L. Mitchell
2000-05-31
The purpose of this analysis is to demonstrate the design of the naval spent nuclear fuel (SNF) waste package (WP) using the Waste Package Department's (WPD) design methodologies and processes described in the ''Waste Package Design Methodology Report'' (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000b). The calculations that support the design of the naval SNF WP will be discussed; however, only a sub-set of such analyses will be presented and shall be limited to those identified in the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The objective of this analysis is to describe the naval SNF WP design method and to show that the design of the naval SNF WP complies with the ''Naval Spent Nuclear Fuel Disposal Container System Description Document'' (CRWMS M&O 1999a) and Interface Control Document (ICD) criteria for Site Recommendation. Additional criteria for the design of the naval SNF WP have been outlined in Section 6.2 of the ''Waste Package Design Sensitivity Report'' (CRWMS M&O 2000c). The scope of this analysis is restricted to the design of the naval long WP containing one naval long SNF canister. This WP is representative of the WPs that will contain both naval short SNF and naval long SNF canisters. The following items are included in the scope of this analysis: (1) Providing a general description of the applicable design criteria; (2) Describing the design methodology to be used; (3) Presenting the design of the naval SNF waste package; and (4) Showing compliance with all applicable design criteria. The intended use of this analysis is to support Site Recommendation reports and assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the technical product development plan (TPDP) ''Design Analysis for the Naval SNF Waste Package'' (CRWMS M&O 2000a).
Yang, Heejung; Kim, Hyun Woo; Kwon, Yong Soo; Kim, Ho Kyong; Sung, Sang Hyun
2017-09-01
Anthocyanins are potent antioxidant agents that protect against many degenerative diseases; however, they are unstable because they are vulnerable to external stimuli including temperature, pH and light. This vulnerability hinders the quality control of anthocyanin-containing berries using classical high-performance liquid chromatography (HPLC) analytical methodologies based on UV or MS chromatograms. To develop an alternative approach for the quality assessment and discrimination of anthocyanin-containing berries, we used MS spectral data acquired in a short analytical time rather than UV or MS chromatograms. Mixtures of anthocyanins were separated from other components in a short gradient time (5 min) due to their higher polarity, and the representative MS spectrum was acquired from the MS chromatogram corresponding to the mixture of anthocyanins. The chemometric data from the representative MS spectra contained reliable information for the identification and relative quantification of anthocyanins in berries with good precision and accuracy. This fast and simple methodology, which consists of a simple sample preparation method and short gradient analysis, could be applied to reliably discriminate the species and geographical origins of different anthocyanin-containing berries. These features make the technique useful for the food industry. Copyright © 2017 John Wiley & Sons, Ltd.
Higher Education Value Added Using Multiple Outcomes
ERIC Educational Resources Information Center
Milla, Joniada; Martín, Ernesto San; Van Bellegem, Sébastien
2016-01-01
In this article we develop a methodology for the joint value added analysis of multiple outcomes that takes into account the inherent correlation between them. This is especially crucial in the analysis of higher education institutions. We use a unique Colombian database on universities, which contains scores in five domains tested in a…
ERIC Educational Resources Information Center
Jones, Earl I., Ed.
This five-section symposium report includes 22 papers assessing the state-of-the-art in occupational research. Section 1, Occupational Analysis, Structure, and Methods, contains four papers that discuss: the Air Force Occupational Research project, methodologies in job analysis, evaluation, structures and requirements, career development,…
30 CFR 780.21 - Hydrologic information.
Code of Federal Regulations, 2010 CFR
2010-07-01
... contain information on water availability and alternative water sources, including the suitability of...) flooding or streamflow alteration; (D) ground water and surface water availability; and (E) other... Hydrologic information. (a) Sampling and analysis methodology. All water-quality analyses performed to meet...
Analysis of effects of impurities intentionally incorporated into silicon
NASA Technical Reports Server (NTRS)
Uno, F.
1977-01-01
A methodology was developed and implemented to allow silicon samples containing intentionally incorporated impurities to be fabricated into finished solar cells under carefully controlled conditions. The electrical and spectral properties were then measured for each group processed.
2002-12-01
influence? C. METHODOLOGY The methodology for this thesis will be a qualitative analysis of topical scholarly texts, government policy, personal ... other elements of collective security. The best example of collective defense language is contained in Article 5 of the Washington Treaty which ... histrionics, has been essentially powerless to halt the enlargement of NATO. All of the actions taken by Russia during the Kosovo crisis in 1999
Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D
NASA Astrophysics Data System (ADS)
Bales, Ben; Pollock, Tresa; Petzold, Linda
2017-06-01
Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and subsequent analysis process. The downside is that computing micrograph segmentations with data from morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
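As an illustration of the segmentation-free idea, the histogram of oriented gradients can be computed directly from a raw micrograph with scikit-image; the file names, cell sizes, and comparison metric below are placeholders, not the parameters used in the paper.

```python
import numpy as np
from skimage import io, color
from skimage.feature import hog

def hog_descriptor(path):
    """Global HOG feature vector of a micrograph, no segmentation required."""
    img = io.imread(path)
    if img.ndim == 3:
        img = color.rgb2gray(img)
    return hog(img, orientations=9, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2), block_norm="L2-Hys", feature_vector=True)

# Compare precipitate morphology in two micrographs via descriptor distance
# (assumes both micrographs have the same dimensions).
f1 = hog_descriptor("micrograph_a.png")   # hypothetical file names
f2 = hog_descriptor("micrograph_b.png")
print(np.linalg.norm(f1 - f2))
```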
Amezquita-Sanchez, Juan P; Adeli, Anahita; Adeli, Hojjat
2016-05-15
Mild cognitive impairment (MCI) is a cognitive disorder characterized by memory impairment greater than expected for age. A new methodology is presented to identify MCI patients during a working memory task using MEG signals. The methodology consists of four steps: In step 1, the complete ensemble empirical mode decomposition (CEEMD) is used to decompose the MEG signal into a set of adaptive sub-bands according to its contained frequency information. In step 2, a nonlinear dynamics measure based on permutation entropy (PE) analysis is employed to analyze the sub-bands and detect features to be used for MCI detection. In step 3, an analysis of variance (ANOVA) is used for feature selection. In step 4, the enhanced probabilistic neural network (EPNN) classifier is applied to the selected features to distinguish between MCI and healthy patients. The usefulness and effectiveness of the proposed methodology are validated using the sensed MEG data obtained experimentally from 18 MCI and 19 control patients. Copyright © 2016 Elsevier B.V. All rights reserved.
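A compact implementation of the permutation entropy measure used in step 2 is sketched below; the embedding order, delay, and test signal are arbitrary choices for illustration, and the CEEMD decomposition and EPNN classifier would require additional, specialized code.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D signal (Bandt-Pompe ordinal patterns)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Build embedding vectors and map each to its ordinal pattern.
    patterns = np.array([np.argsort(x[i:i + order * delay:delay]) for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    return pe / np.log2(factorial(order)) if normalize else pe

# Example: entropy of a noisy sinusoid (stand-in for one MEG sub-band).
t = np.linspace(0, 2, 500)
signal = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print(permutation_entropy(signal, order=3, delay=1))
```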
Development of a Methodology for Incorporating FESWMS-2DH Results
DOT National Transportation Integrated Search
2000-05-01
This study presents the analysis of a complex flow system that contains two roadways with multiple openings: US Highway 75 and the Southeast Kansas Corridor. Typical analyses of floodplains at such sites involve the use of the one-dimensional backwat...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
Partnering for functional genomics research conference: Abstracts of poster presentations
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-06-01
This reports contains abstracts of poster presentations presented at the Functional Genomics Research Conference held April 16--17, 1998 in Oak Ridge, Tennessee. Attention is focused on the following areas: mouse mutagenesis and genomics; phenotype screening; gene expression analysis; DNA analysis technology development; bioinformatics; comparative analyses of mouse, human, and yeast sequences; and pilot projects to evaluate methodologies.
Synthesis of deoxyribonucleotidyl(3'5')arabinonucleosides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, S.H.; Ainsworth, C.F.; Bell, C.L.
Two different synthetic routes using phosphotriester methodology have been utilized to prepare deoxyribonucleotidyl(3'-5')arabinonucleosides containing 9-β-D-arabinofuranosyladenine (ara-A, vidarabine) and 1-β-D-arabinofuranosylcytosine (ara-C, cytarabine) at the 3'-terminus in amounts and purity (greater than 95%) suitable for NMR analysis.
ERIC Educational Resources Information Center
American Association for Health, Physical Education, and Recreation, Washington, DC.
This report contains articles on research in kinesiology, the study of the principles of mechanics and anatomy in relation to human movement. Research on sequential timing, somatotype methodology, and linear measurement with cinematographical analysis are presented in the first section. Studies of the hip extensor muscles, kinetic energy, and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-10
.... BOEM requests comments from states, local governments, native groups, Tribes, the oil and gas industry... Analysis for the OCS 5-Year Program 2012-2017: Theory and Methodology (BOEM 050-2011), a paper containing a...
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
Versatile fluid-mixing device for cell and tissue microgravity research applications.
Wilfinger, W W; Baker, C S; Kunze, E L; Phillips, A T; Hammerstedt, R H
1996-01-01
Microgravity life-science research requires hardware that can be easily adapted to a variety of experimental designs and working environments. The Biomodule is a patented, computer-controlled fluid-mixing device that can accommodate these diverse requirements. A typical shuttle payload contains eight Biomodules with a total of 64 samples, a sealed containment vessel, and a NASA refrigeration-incubation module. Each Biomodule contains eight gas-permeable Silastic T tubes that are partitioned into three fluid-filled compartments. The fluids can be mixed at any user-specified time. Multiple investigators and complex experimental designs can be easily accommodated with the hardware. During flight, the Biomodules are sealed in a vessel that provides two levels of containment (liquids and gas) and a stable, investigator-controlled experimental environment that includes regulated temperature, internal pressure, humidity, and gas composition. A cell microencapsulation methodology has also been developed to streamline launch-site sample manipulation and accelerate postflight analysis through the use of fluorescence-activated cell sorting. The Biomodule flight hardware and analytical cell encapsulation methodology are ideally suited for temporal, qualitative, or quantitative life-science investigations.
Quantitative on-line analysis of sulfur compounds in complex hydrocarbon matrices.
Djokic, Marko R; Ristic, Nenad D; Olahova, Natalia; Marin, Guy B; Van Geem, Kevin M
2017-08-04
An improved method for on-line measurement of sulfur-containing compounds in complex matrices is presented. The on-line system consists of a specifically designed sampling system connected to a comprehensive two-dimensional gas chromatograph (GC×GC) equipped with two capillary columns (Rtx®-1 PONA × SGE BPX50), a flame ionization detector (FID) and a sulfur chemiluminescence detector (SCD). The result is an unprecedented sensitivity down to the ppm level (1 ppm-w) for various sulfur-containing compounds in very complex hydrocarbon matrices. In addition to the GC×GC-SCD, low-molecular-weight sulfur-containing compounds such as hydrogen sulfide (H2S) and carbonyl sulfide (COS) can be analyzed using the thermal conductivity detector of a so-called refinery gas analyzer (RGA). The methodology was extensively tested on a continuous-flow pilot plant for steam cracking, in which quantification of sulfur-containing compounds in the reactor effluent was carried out using 3-chlorothiophene as internal standard. The GC×GC-FID/-SCD settings were optimized for ppm analysis of sulfur compounds in olefin-rich (ethylene- and propylene-rich) hydrocarbon matrices produced by steam cracking of petroleum feedstocks. Besides being used primarily for analysis of the hydrocarbon matrix, the FID of the GC×GC-FID/-SCD set-up serves to double-check the amount of added sulfur internal standard, which is crucial for proper quantification of sulfur compounds. When vacuum gas oil containing 780 ppm-w of elemental sulfur in the form of benzothiophenes and dibenzothiophenes is subjected to steam cracking, the sulfur balance is closed: 75% of the sulfur contained in the feed is converted to hydrogen sulfide, 13% to alkyl homologues of thiophene, and the remaining 12% is present in the form of alkyl homologues of benzothiophenes. The methodology can be applied to many other conversion processes which use sulfur-containing feeds such as hydrocracking, catalytic cracking, kerogen evolution, bio-waste pyrolysis, supercritical water treatment, etc. Copyright © 2017 Elsevier B.V. All rights reserved.
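A minimal sketch of internal-standard quantification of the kind described, plus the quoted sulfur balance as arithmetic; the peak areas, internal-standard concentration, and response factor are invented for illustration, since the actual calibration of the GC×GC-SCD is not detailed in the abstract.

```python
# Internal-standard quantification: C_analyte = (A_analyte / A_IS) * C_IS / RF
def analyte_concentration(area_analyte, area_is, conc_is_ppm, response_factor=1.0):
    return (area_analyte / area_is) * conc_is_ppm / response_factor

c_thiophene = analyte_concentration(area_analyte=4.2e5, area_is=1.0e6,
                                    conc_is_ppm=50.0)  # 3-chlorothiophene as IS
print(f"thiophene ≈ {c_thiophene:.1f} ppm-w")

# Sulfur balance for the vacuum gas oil example quoted above (780 ppm-w S in feed).
feed_sulfur_ppm = 780.0
split = {"H2S": 0.75, "alkyl thiophenes": 0.13, "alkyl benzothiophenes": 0.12}
for product, fraction in split.items():
    print(product, feed_sulfur_ppm * fraction, "ppm-w of feed sulfur")
```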
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2014-03-01
A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis - the unity of formal logic and of rational dialectics; (b) it does not contain correct definitions of "movement," "direction" and "vector"; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of "physical vector," and, therefore, it has no natural-scientific meaning; (d) operations on "physical vectors" and the vector calculus propositions relating to the "physical vectors" are contrary to formal logic.
ERIC Educational Resources Information Center
Whittenburg, John A.; Tice, Pamela Paradis; Baker, Gail L.; Lemmey, Dorothy E.
2001-01-01
Presents a methodological critique of the 1998 meta-analysis of child sexual abuse outcomes by Rind et al. By restricting a supposedly broad meta-analysis to only some of the population in question, the conclusions they drew regarding this complex topic, primarily that adult-child sex is not necessarily harmful, are invalid. (Contains 33…
ERIC Educational Resources Information Center
Bashaw, W. L., Ed.; Findley, Warren G., Ed.
This volume contains the five major addresses and subsequent discussion from the Symposium on the General Linear Models Approach to the Analysis of Experimental Data in Educational Research, which was held in 1967 in Athens, Georgia. The symposium was designed to produce systematic information, including new methodology, for dissemination to the…
ERIC Educational Resources Information Center
Swail, Watson Scott; Cabrera, Alberto F.; Lee, Chul; Williams, Adriane
2005-01-01
The third component of our three-part series focuses on students who attained a bachelor's degree and what it took to get there. We used multiple regression analysis to determine the factors that seemed to matter on the pathway to the BA. The appendix of this report provides methodological details to this analysis. (Contains 2 footnotes and 7…
Depression Prevention Research: Design, Implementation, and Analysis of Randomized Trials.
ERIC Educational Resources Information Center
Munoz, Ricardo F.; And Others
This document contains three papers concerned with prevention intervention research, a new area of depression research which has shown great promise for contributing new knowledge to the understanding of depression. The first paper, "Clinical Trials vs. Prevention Trials: Methodological Issues in Depression Research" (Ricardo F. Munoz), emphasizes…
Some Statistical Properties of Tonality, 1650-1900
ERIC Educational Resources Information Center
White, Christopher Wm.
2013-01-01
This dissertation investigates the statistical properties present within corpora of common practice music, involving a data set of more than 8,000 works spanning from 1650 to 1900, and focusing specifically on the properties of the chord progressions contained therein. In the first chapter, methodologies concerning corpus analysis are presented…
DLA Systems Modernization Methodology: Logical Analysis and Design Procedures
1990-07-01
Information Requirement would have little meaning and thus would lose its value. ... 1.1.3 Input Products; 1.1.3.1 Enterprise Model Objective List; 1.1.3.2 ... at the same time, the attribute is said to be multi-valued. For example, an E-R model may contain information on the languages an employee speaks ... Relationship model is examined in detail to ensure that each data group contains attributes whose values are absolutely determined by their respective
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
Families with Noncompliant Children: Applications of the Systemic Model.
ERIC Educational Resources Information Center
Neilans, Thomas H.; And Others
This paper describes the application of a systems approach model to assessing families with a labeled noncompliant child. The first section describes and comments on the applied methodology for the model. The second section describes the classification of 61 families containing a child labeled by the family as noncompliant. An analysis of data…
Manpower and Personnel and Logistics Analysis in the Palletized Load System (PLS)
1994-10-01
models used algorithms sanctioned by AR 570-2 and were written in Microsoft Excel version 4.0 software. HARDMAN comparative methodology (HCM) formed the ... 357281300) contained forty-eight vehicles. The medium truck company using the 5-ton tractor with an M871 trailer (SRC 33721LI00) contained sixty
Risk ranking of LANL nuclear material storage containers for repackaging prioritization.
Smith, Paul H; Jordan, Hans; Hoffman, Jenifer A; Eller, P Gary; Balkey, Simon
2007-05-01
Safe handling and storage of nuclear material at U.S. Department of Energy facilities relies on the use of robust containers to prevent container breaches and subsequent worker contamination and uptake. The U.S. Department of Energy has no uniform requirements for packaging and storage of nuclear materials other than those declared excess and packaged to DOE-STD-3013-2000. This report describes a methodology for prioritizing a large inventory of nuclear material containers so that the highest risk containers are repackaged first. The methodology utilizes expert judgment to assign respirable fractions and reactivity factors to accountable levels of nuclear material at Los Alamos National Laboratory. A relative risk factor is assigned to each nuclear material container based on a calculated dose to a worker due to a failed container barrier and a calculated probability of container failure based on material reactivity and container age. This risk-based methodology is being applied at LANL to repackage the highest risk materials first and, thus, accelerate the reduction of risk to nuclear material handlers.
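The abstract does not give the scoring formula; the multiplicative form below is an assumption, included only to show how respirable fraction, reactivity, and container age might combine into a single relative risk factor for ranking repackaging priority.

```python
def relative_risk(material_grams, respirable_fraction, reactivity_factor,
                  container_age_years, age_scale=20.0):
    """Hypothetical ranking score: a consequence term (worker dose surrogate)
    times a failure likelihood that grows with reactivity and container age."""
    consequence = material_grams * respirable_fraction
    failure_likelihood = reactivity_factor * min(1.0, container_age_years / age_scale)
    return consequence * failure_likelihood

inventory = [
    {"id": "C-001", "g": 500, "rf": 0.001, "react": 0.8, "age": 25},
    {"id": "C-002", "g": 200, "rf": 0.050, "react": 0.2, "age": 10},
]
ranked = sorted(inventory,
                key=lambda c: relative_risk(c["g"], c["rf"], c["react"], c["age"]),
                reverse=True)
print([c["id"] for c in ranked])   # repackage the highest-risk containers first
```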
NASA Technical Reports Server (NTRS)
Akse, J. R.; Thompson, J. O.; Sauer, R. L.; Atwater, J. E.
1998-01-01
Flow injection analysis instrumentation and methodology for the determination of ammonia and ammonium ions in aqueous solution are described. Using in-line solid phase basification beds containing crystalline media, the speciation of ammoniacal nitrogen is shifted toward the un-ionized form, which diffuses in the gas phase across a hydrophobic microporous hollow fiber membrane into a pure-water-containing analytical stream. The two streams flow in a countercurrent configuration on opposite sides of the membrane. The neutral pH of the analytical stream promotes the formation of ammonium cations, which are detected using specific conductance. The methodology provides a lower limit of detection of 10 μg/L and a dynamic concentration range spanning three orders of magnitude using a 315-μL sample injection volume. Using immobilized urease to enzymatically promote the hydrolysis of urea to produce ammonia and carbon dioxide, the technique has been extended to the determination of urea.
Stakeholder analysis methodologies resource book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babiuch, W.M.; Farhar, B.C.
1994-03-01
Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.
Non-destructive fraud detection in rosehip oil by MIR spectroscopy and chemometrics.
Santana, Felipe Bachion de; Gontijo, Lucas Caixeta; Mitsutake, Hery; Mazivila, Sarmento Júnior; Souza, Leticia Maria de; Borges Neto, Waldomiro
2016-10-15
Rosehip oil (Rosa eglanteria L.) is an important oil in the food, pharmaceutical and cosmetic industries. However, due to its high added value, it is liable to adulteration with other cheaper or lower quality oils. With this perspective, this work provides a new simple, fast and accurate methodology using mid-infrared (MIR) spectroscopy and partial least squares discriminant analysis (PLS-DA) as a means to discriminate authentic rosehip oil from adulterated rosehip oil containing soybean, corn and sunflower oils in different proportions. The model showed excellent sensitivity and specificity with 100% correct classification. Therefore, the developed methodology is a viable alternative for use in the laboratory and industry for standard quality analysis of rosehip oil since it is fast, accurate and non-destructive. Copyright © 2016 Elsevier Ltd. All rights reserved.
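A minimal PLS-DA sketch of the kind of discrimination described, using scikit-learn's PLS regression with a 0/1 class code and a 0.5 decision threshold; the spectra are synthetic stand-ins and the number of latent variables is arbitrary, so this only illustrates the technique rather than reproducing the published model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# X: MIR spectra (rows = samples, columns = wavenumbers); y: 1 = adulterated oil.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 300))
X[30:] += 0.5                      # crude spectral shift for the adulterated class
y = np.repeat([0, 1], 30)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=1)
plsda = PLSRegression(n_components=3).fit(X_tr, y_tr)
y_pred = (plsda.predict(X_te).ravel() > 0.5).astype(int)   # PLS-DA decision rule
print("accuracy:", (y_pred == y_te).mean())
```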
A method for the design of transonic flexible wings
NASA Technical Reports Server (NTRS)
Smith, Leigh Ann; Campbell, Richard L.
1990-01-01
Methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.
Underwater Sediment Sampling Research
2017-01-01
resolved through further experimentation. ... Chemical Oceanographer, and In Situ Chemical Analysis Subject Matter Expert (SME). 2 LABORATORY TEST SET UP: The experimental research and laboratory ... methodology involved using a fluorescence oil sensor (Turner Designs Cyclops-7) to measure the TPH contained in the interstitial waters (i.e., pore
ERIC Educational Resources Information Center
Aitken, Joan E.
A study categorized self-perceptions of subjects regarding their feelings about initial communication interaction. Using Q-Technique, a total of 138 subjects, mostly students at a midsized, midwestern, urban university enrolled in interpersonal communication courses, were studied through the use of two structured Q-sorts containing statements…
Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills
ERIC Educational Resources Information Center
Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko
2012-01-01
Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…
Occupations in the Hotel Tourist Sector within the European Community. A Comparative Analysis.
ERIC Educational Resources Information Center
Peroni, Giovanni; Guerra, Duccio
This report contains a directory of job profiles in the tourist/hotel sector that is based on seven national monographs. It provides an instrument for comparing factors that characterize practitioners working in the sector in Germany, Spain, France, Greece, Italy, Portugal, and the United Kingdom. A methodological note discusses study objectives,…
ERIC Educational Resources Information Center
Tao, Fumiyo; And Others
This volume contains technical and supporting materials that supplement Volume I, which describes upward mobility programs for disadvantaged and dislocated workers in the service sector. Appendix A is a detailed description of the project methodology, including data collection methods and information on data compilation, processing, and analysis.…
Online Courses Assessment through Measuring and Archetyping of Usage Data
ERIC Educational Resources Information Center
Kazanidis, Ioannis; Theodosiou, Theodosios; Petasakis, Ioannis; Valsamidis, Stavros
2016-01-01
Database files and additional log files of Learning Management Systems (LMSs) contain an enormous volume of data which usually remain unexploited. A new methodology is proposed in order to analyse these data both on the level of both the courses and the learners. Specifically, "regression analysis" is proposed as a first step in the…
NASA Technical Reports Server (NTRS)
Young, G.
1982-01-01
A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.
U.S. Army Research Laboratory (ARL) XPairIt Simulator for Peptide Docking and Analysis
2014-07-01
results from a case study, docking a short peptide to a small protein. For this test we choose the 1RXZ system from the Protein Data Bank, which ... core of XPairIt, which additionally contains many data management and organization options, analysis tools, and custom simulation methodology. Two
Supporting Air and Space Expeditionary Forces: Analysis of Combat Support Basing Options
2004-01-01
Brooke et al., 2003. For more information on Set Covering models, see Daskin, 1995. ... Transportation Model. A detailed ...
DESIGN ANALYSIS FOR THE DEFENSE HIGH-LEVEL WASTE DISPOSAL CONTAINER
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. Radulesscu; J.S. Tang
The purpose of the ''Design Analysis for the Defense High-Level Waste Disposal Container'' analysis is to technically define the defense high-level waste (DHLW) disposal container/waste package using the Waste Package Department's (WPD) design methods, as documented in ''Waste Package Design Methodology Report'' (CRWMS M&O [Civilian Radioactive Waste Management System Management and Operating Contractor] 2000a). The DHLW disposal container is intended for disposal of commercial high-level waste (HLW) and DHLW (including immobilized plutonium waste forms), placed within disposable canisters. The U.S. Department of Energy (DOE)-managed spent nuclear fuel (SNF) in disposable canisters may also be placed in a DHLW disposal container along with HLW forms. The objective of this analysis is to demonstrate that the DHLW disposal container/waste package satisfies the project requirements, as embodied in the Defense High Level Waste Disposal Container System Description Document (SDD) (CRWMS M&O 1999a), and additional criteria, as identified in the Waste Package Design Sensitivity Report (CRWMS M&O 2000b, Table 4). The analysis briefly describes the analytical methods appropriate for the design of the DHLW disposal container/waste package, and summarizes the results of the calculations that illustrate the analytical methods. However, the analysis is limited to the calculations selected for the DHLW disposal container in support of the Site Recommendation (SR) (CRWMS M&O 2000b, Section 7). The scope of this analysis is restricted to the design of the codisposal waste package of the Savannah River Site (SRS) DHLW glass canisters and the Training, Research, Isotopes General Atomics (TRIGA) SNF loaded in a short 18-in.-outer diameter (OD) DOE standardized SNF canister. This waste package is representative of the waste packages that consist of the DHLW disposal container, the DHLW/HLW glass canisters, and the DOE-managed SNF in disposable canisters. The intended use of this analysis is to support Site Recommendation reports and to assist in the development of WPD drawings. Activities described in this analysis were conducted in accordance with the Development Plan ''Design Analysis for the Defense High-Level Waste Disposal Container'' (CRWMS M&O 2000c) with no deviations from the plan.
Clark, Renee M; Besterfield-Sacre, Mary E
2009-03-01
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
Entropy Filtered Density Function for Large Eddy Simulation of Turbulent Reacting Flows
NASA Astrophysics Data System (ADS)
Safari, Mehdi
Analysis of local entropy generation is an effective means to optimize the performance of energy and combustion systems by minimizing the irreversibilities in transport processes. Large eddy simulation (LES) is employed to describe entropy transport and generation in turbulent reacting flows. The entropy transport equation in LES contains several unclosed terms. These are the subgrid scale (SGS) entropy flux and entropy generation caused by irreversible processes: heat conduction, mass diffusion, chemical reaction and viscous dissipation. The SGS effects are taken into account using a novel methodology based on the filtered density function (FDF). This methodology, entitled entropy FDF (En-FDF), is developed and utilized in the form of joint entropy-velocity-scalar-turbulent frequency FDF and the marginal scalar-entropy FDF, both of which contain the chemical reaction effects in a closed form. The former constitutes the most comprehensive form of the En-FDF and provides closure for all the unclosed filtered moments. This methodology is applied for LES of a turbulent shear layer involving transport of passive scalars. Predictions show favorable agreements with the data generated by direct numerical simulation (DNS) of the same layer. The marginal En-FDF accounts for entropy generation effects as well as scalar and entropy statistics. This methodology is applied to a turbulent nonpremixed jet flame (Sandia Flame D) and predictions are validated against experimental data. In both flows, sources of irreversibility are predicted and analyzed.
Teresa E. Jordan
2015-10-22
The files included in this submission contain all data pertinent to the methods and results of this task’s output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in zip file). Shapefile and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.
Analysis of launch site processing effectiveness for the Space Shuttle 26R payload
NASA Technical Reports Server (NTRS)
Flores, Carlos A.; Heuser, Robert E.; Pepper, Richard E., Jr.; Smith, Anthony M.
1991-01-01
A trend analysis study has been performed on problem reports recorded during the Space Shuttle 26R payload's processing cycle at NASA-Kennedy, using the defect-flow analysis (DFA) methodology; DFA gives attention to the characteristics of the problem-report 'population' as a whole. It is established that the problem reports contain data that distract attention from pressing problems, and that fully 60 percent of such reports were caused during processing at NASA-Kennedy. The second major cause of problem reports was design defects.
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
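As a rough illustration of the particle swarm component only (the Trust-Tech and consensus stages are not reproduced here), the following minimal Python sketch implements a standard global-best PSO on a multimodal benchmark; the function names, parameter values, and the Rastrigin test function are illustrative choices, not taken from the paper.

import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO; returns the best position and value found."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T           # bounds: one (lo, hi) pair per dimension
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))     # particle positions
    v = np.zeros_like(x)                                  # particle velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()                  # global best position
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Example: Rastrigin function, a common multimodal benchmark with many local minima.
rastrigin = lambda p: 10 * p.size + np.sum(p**2 - 10 * np.cos(2 * np.pi * p))
best_x, best_val = pso_minimize(rastrigin, bounds=[(-5.12, 5.12)] * 5)
print(best_x, best_val)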
Craig, Hugh; Berretta, Regina; Moscato, Pablo
2016-01-01
In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into the community detection problem in a graph by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
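A simplified sketch of the same pipeline idea, under stated assumptions: pairwise Jensen-Shannon distances on row-normalized word frequencies, a k-nearest-neighbour proximity graph, and a generic modularity-based community detection routine standing in for the paper's iMA-Net memetic algorithm. The word-frequency matrix is synthetic and the parameter values are arbitrary.

import numpy as np
import networkx as nx
from scipy.spatial.distance import jensenshannon
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)
# Toy stand-in for the word-frequency matrix: 20 "documents" x 500 "words".
counts = rng.integers(0, 50, size=(20, 500)).astype(float)
probs = counts / counts.sum(axis=1, keepdims=True)       # normalise rows to distributions

# Pairwise Jensen-Shannon distance matrix.
n = probs.shape[0]
d = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        d[i, j] = d[j, i] = jensenshannon(probs[i], probs[j])

# Proximity graph: connect each document to its k nearest neighbours.
k = 4
G = nx.Graph()
G.add_nodes_from(range(n))
for i in range(n):
    for j in np.argsort(d[i])[1:k + 1]:                   # skip self (distance 0)
        G.add_edge(i, int(j), weight=1.0 - d[i, j])       # similarity as edge weight

# Clusters = communities that maximise modularity (stand-in for iMA-Net).
clusters = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in clusters])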
A Survey on Security Isolation of Virtualization, Containers, and Unikernels
2017-05-01
... this report are not to be construed as an official Department of the Army position unless so designated by other authorized documents. Citation of ... characteristics is necessary to understand the potential threats. Each of these technologies contains subtle differences in the methodology and software architecture to provide secure isolation between guests. All 3 of these ...
Full-Envelope Launch Abort System Performance Analysis Methodology
NASA Technical Reports Server (NTRS)
Aubuchon, Vanessa V.
2014-01-01
The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
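A toy Monte Carlo sketch of the distinction described above, with entirely hypothetical numbers and a made-up abort_margin function: the first loop evaluates success rates only at a few discrete abort initiation times (the standard approach), while the second disperses the initiation time along with the other parameters and bins the results, exposing any low-performance region between the discrete conditions.

import numpy as np

rng = np.random.default_rng(42)
n = 10_000

def abort_margin(t_abort, thrust_factor, wind):
    """Hypothetical performance metric: positive means a successful abort."""
    return 1.0 + 0.3 * thrust_factor - 0.02 * wind - 0.04 * (t_abort - 60.0) ** 2 / 100.0

thrust = rng.normal(1.0, 0.05, n)          # dispersed LAS parameters (illustrative)
wind = rng.normal(10.0, 5.0, n)

# Standard approach: abort initiation time held at a few discrete conditions.
for t_fixed in (30.0, 60.0, 90.0):
    ok = abort_margin(t_fixed, thrust, wind) > 0
    print(f"t_abort = {t_fixed:5.1f} s  success rate = {ok.mean():.3f}")

# Full-envelope approach: abort initiation time dispersed with everything else,
# exposing performance "pinch-points" between the discrete conditions.
t_abort = rng.uniform(0.0, 120.0, n)
margin = abort_margin(t_abort, thrust, wind)
bins = np.linspace(0.0, 120.0, 13)
idx = np.digitize(t_abort, bins) - 1
rates = [(margin[idx == b] > 0).mean() for b in range(12)]
print("worst bin success rate:", min(rates))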
Trace Chemical Analysis Methodology
1980-04-01
... oxidation of nitrite-containing species. Calibration studies were then made in preparation for the analysis of unknown samples of nitrate in urine and ... the procedure for nitrate determination was made on two types of samples: human urine, and drinking water from a city water supply. Five samples of ... Nitrate concentrations (NO3, ppm, with standard deviations): Drinking water: 1.29 ±0.04, 1.20 ±0.09, 1.29 ±0.14, 1.15 ±0.08, 0.88 ±0.07; Human urine: ...
NASA Astrophysics Data System (ADS)
Shultheis, C. F.
1985-02-01
This technical report describes an analysis of the performance allocations for a satellite link, focusing specifically on a single-hop 7 to 8 GHz link of the Defense Satellite Communications System (DSCS). The analysis is performed for three primary reasons: (1) to reevaluate link power margin requirements for DSCS links based on digital signalling; (2) to analyze the implications of satellite availability and error rate allocations contained in proposed MIL-STD-188-323, system design and engineering standards for long haul digital transmission system performance; and (3) to standardize a methodology for determination of rain-related propagation constraints. The aforementioned methodology is then used to calculate the link margin requirements of typical DSCS binary/quaternary phase shift keying (BPSK/QPSK) links at 7 to 8 GHz for several different Earth terminal locations.
Methodological considerations in cost of illness studies on Alzheimer disease
2012-01-01
Cost-of-illness studies (COI) can identify and measure all the costs of a particular disease, including the direct, indirect and intangible dimensions. They are intended to provide estimates about the economic impact of costly diseases. Alzheimer disease (AD) is a relevant example to review cost of illness studies because of its costliness. The aim of this study was to review relevant published cost studies of AD to analyze the method used and to identify which dimension had to be improved from a methodological perspective. First, we described the key points of cost study methodology. Secondly, cost studies relating to AD were systematically reviewed, focusing on an analysis of the different methods used. The methodological choices of the studies were analysed using an analytical grid which contains the main methodological items of COI studies. Seventeen articles were retained. Depending on the studies, annual total costs per patient vary from $2,935 to $52,954. The methods, data sources, and estimated cost categories in each study varied widely. The review showed that cost studies adopted different approaches to estimate costs of AD, reflecting a lack of consensus on the methodology of cost studies. To increase its credibility, closer agreement among researchers on the methodological principles of cost studies would be desirable. PMID:22963680
Innovation design of medical equipment based on TRIZ.
Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo
2015-01-01
Medical equipment is closely related to personal health and safety, and this can be of concern to the equipment user. Furthermore, there is much competition among medical equipment manufacturers. Innovative design is the key to success for those enterprises. The design of medical equipment usually covers vastly different domains of knowledge. The application of modern design methodology in medical equipment and technology invention is an urgent requirement. TRIZ (a Russian abbreviation of what can be translated as 'theory of inventive problem solving') originated in Russia and contains problem-solving methods developed from patent analysis around the world, including the Conflict Matrix, Substance-Field Analysis, Standard Solutions, and Effects. TRIZ is an inventive methodology for problem solving. As an engineering example, an infusion system is analyzed and re-designed using TRIZ. The innovative idea generated liberates the caregiver from having to watch the infusion bag. The research in this paper shows the process of applying TRIZ to medical device inventions. It is demonstrated that TRIZ is an inventive methodology for problem solving and can be used widely in medical device development.
NASA Technical Reports Server (NTRS)
Dhas, Chris
2000-01-01
NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. CNS previously developed a report which applied the methodology to three space Internet-based communications scenarios for future missions. CNS conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario that involved unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a tradeoff analysis on the selected scenario. The analysis examines the performance characteristics of the various protocols and architectures. The tradeoff analysis incorporates the results of a CNS-developed analytical model that examined performance parameters.
Pervez, Zeeshan; Ahmad, Mahmood; Khattak, Asad Masood; Lee, Sungyoung; Chung, Tae Choong
2016-01-01
Privacy-aware search of outsourced data ensures relevant data access in the untrusted domain of a public cloud service provider. A subscriber of a public cloud storage service can determine the presence or absence of a particular keyword by submitting a search query in the form of a trapdoor. However, these trapdoor-based search queries are limited in functionality and cannot be used to identify secure outsourced data which contains semantically equivalent information. In addition, trapdoor-based methodologies are confined to pre-defined trapdoors and prevent subscribers from searching outsourced data with arbitrarily defined search criteria. To solve the problem of relevant data access, we have proposed an index-based privacy-aware search methodology that ensures semantic retrieval of data from an untrusted domain. This method ensures oblivious execution of a search query and leverages authorized subscribers to model conjunctive search queries without relying on predefined trapdoors. A security analysis of our proposed methodology shows that, in a conspired attack, unauthorized subscribers and untrusted cloud service providers cannot deduce any information that can lead to the potential loss of data privacy. A computational time analysis on commodity hardware demonstrates that our proposed methodology requires moderate computational resources to model a privacy-aware search query and for its oblivious evaluation on a cloud service provider.
NASA Astrophysics Data System (ADS)
Abdullah, U. N. N.; Handroos, H.
2017-09-01
Introduction: This paper presents a study of sense-of-control parameters intended to address the lack of direct motion feeling in the remote operated container crane station (ROCCS) joystick interface. The investigation of these parameters is important for developing the engineering parameters related to the sense-of-control goal in the next design process. Methodology: Structured interviews and observations were conducted to obtain user experience data from thirteen remote container crane operators at two international terminals. Then, interview analysis, task analysis, activity analysis and time line analysis were conducted to compare and contrast the results from the interviews and observations. Results: Four experience parameters were identified to support the sense-of-control goal in the later design improvement of the ROCCS joystick interface. The significance of difficulty of control, unsynchronized movements, facilitation of control, and decision making in unexpected situations as parameters for the sense-of-control goal was validated by feedback from operators as well as by the analysis. Contribution: This study provides feedback directly from end users towards developing a sustainable control interface for the ROCCS in specific and remote operated off-road vehicles in general.
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 1
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base and has also identified some problem areas which require methodology modifications. This is Volume 1, an Executive Summary. Volume 2 contains Appendices A (Aerothermal Comparisons) and B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots), and Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 3
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. Volume 2 contains Appendix A (Aerothermal Comparisons), and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). This is Volume 3, containing Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 2
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data was reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. This is Volume 2, containing Appendix A (Aerothermal Comparisons), and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). Volume 3 contains Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
NASA Technical Reports Server (NTRS)
Binienda, Wieslaw K.; Sancaktar, Erol; Roberts, Gary D. (Technical Monitor)
2002-01-01
An effective design methodology was established for composite jet engine containment structures. The methodology included the development of full- and reduced-size prototypes and FEA models of the containment structure, experimental and numerical examination of the modes of failure due to a turbine blade-out event, identification of materials and design candidates for future industrial applications, and the design and building of prototypes for testing and evaluation purposes.
Analysis report for 241-BY-104 Auger samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, M.A.
1994-11-10
This report describes the analysis of the surface crust samples taken from single-shell tank (SST) BY-104, suspected of containing ferrocyanide wastes. This sampling and analysis will assist in ascertaining whether there is any hazard due to combustion (burning) or explosion of these solid wastes. These characteristics are important to future efforts to characterize the salt and sludge in this type of waste tank. This report will outline the methodology and detail the results of analyses performed during the characterization of this material. All analyses were performed by Westinghouse Hanford Company at the 222-S laboratory unless stated otherwise.
Analytical optimal pulse shapes obtained with the aid of genetic algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerrero, Rubén D., E-mail: rdguerrerom@unal.edu.co; Arango, Carlos A.; Reyes, Andrés
2015-09-28
We propose a methodology to design optimal pulses for achieving quantum optimal control on molecular systems. Our approach constrains pulse shapes to linear combinations of a fixed number of experimentally relevant pulse functions. Quantum optimal control is obtained by maximizing a multi-target fitness function using genetic algorithms. As a first application of the methodology, we generated an optimal pulse that successfully maximized the yield on a selected dissociation channel of a diatomic molecule. Our pulse is obtained as a linear combination of linearly chirped pulse functions. Data recorded along the evolution of the genetic algorithm contained important information regarding the interplay between radiative and diabatic processes. We performed a principal component analysis on these data to retrieve the most relevant processes along the optimal path. Our proposed methodology could be useful for performing quantum optimal control on more complex systems by employing a wider variety of pulse shape functions.
NASA Astrophysics Data System (ADS)
Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan
2017-09-01
Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems is mainly dependent on the selection of the most relevant features. This becomes harder when the dataset contains missing values for the different features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced-dimensional feature subset, and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using Probabilistic Principal Component Analysis (PPCA). PPCA extracts projection vectors which contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to radial basis function (RBF) kernel-based Support Vector Machines (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over the three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to the existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
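A hedged sketch of a pipeline in the same spirit, with synthetic data in place of the UCI heart datasets and scikit-learn's PCA standing in for PPCA (missing-value handling is not shown): a simple parallel-analysis step selects the number of components by comparing eigenvalues against column-permuted data, and the reduced features feed an RBF-kernel SVM.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=13, n_informative=6, random_state=0)

# Parallel analysis: keep components whose eigenvalues exceed those of shuffled data.
Xs = StandardScaler().fit_transform(X)
real_eig = np.linalg.eigvalsh(np.cov(Xs.T))[::-1]
perm_eig = np.zeros_like(real_eig)
for _ in range(50):
    Xp = np.column_stack([rng.permutation(col) for col in Xs.T])
    perm_eig += np.linalg.eigvalsh(np.cov(Xp.T))[::-1]
perm_eig /= 50
n_comp = max(1, int(np.sum(real_eig > perm_eig)))

# Reduced-dimension features fed to an RBF-kernel SVM classifier.
clf = make_pipeline(StandardScaler(), PCA(n_components=n_comp), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("components kept:", n_comp)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())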
Detection and Processing Techniques of FECG Signal for Fetal Monitoring
2009-01-01
Fetal electrocardiogram (FECG) signal contains potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis is its use in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advanced methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms for FECG signal detection and analysis to provide efficient and effective ways of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper also discusses some of the hardware implementations that use electrical signals for monitoring the fetal heart rate. This paper opens up a path for researchers, physicians, and end users to gain an excellent understanding of the FECG signal and its analysis procedures for fetal heart rate monitoring systems. PMID:19495912
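As a generic illustration of one common pre-processing step mentioned in such surveys (not any specific algorithm reviewed here), the following sketch band-pass filters a synthetic ECG-like trace and detects beat peaks; the sampling rate, cut-off frequencies, and beat timing are assumed values.

import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 500.0                                    # sampling rate in Hz (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
rng = np.random.default_rng(0)

# Synthetic "ECG-like" trace: narrow pulses every 0.42 s (~143 beats/min) plus drift and noise.
beat_times = np.arange(0.2, 10.0, 0.42)
signal = sum(np.exp(-0.5 * ((t - tb) / 0.01) ** 2) for tb in beat_times)
signal += 0.2 * np.sin(2 * np.pi * 0.3 * t) + 0.05 * rng.standard_normal(t.size)

# Band-pass filter to suppress baseline wander and high-frequency noise (illustrative cut-offs).
b, a = butter(3, [5.0, 45.0], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, signal)

# Peak detection with a refractory period appropriate for a fast (fetal-range) heart rate.
peaks, _ = find_peaks(filtered, height=0.5 * filtered.max(), distance=int(0.25 * fs))
print(f"detected {len(peaks)} beats, approx. {60 * len(peaks) / t[-1]:.0f} beats per minute")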
NASA Technical Reports Server (NTRS)
Walters, Robert; Summers, Geoffrey P.; Warner, Jeffrey H.; Messenger, Scott; Lorentzen, Justin R.; Morton, Thomas; Taylor, Stephen J.; Evans, Hugh; Heynderickx, Daniel; Lei, Fan
2007-01-01
This paper presents a method for using the SPENVIS on-line computational suite to implement the displacement damage dose (D(sub d)) methodology for calculating end-of-life (EOL) solar cell performance for a specific space mission. This paper builds on our previous work that has validated the D(sub d) methodology against both measured space data [1,2] and calculations performed using the equivalent fluence methodology developed by NASA JPL [3]. For several years, the space solar community has considered general implementation of the D(sub d) method, but no computer program exists to enable this implementation. In a collaborative effort, NRL, NASA and OAI have produced the Solar Array Verification and Analysis Tool (SAVANT) under NASA funding, but this program has not progressed beyond the beta-stage [4]. The SPENVIS suite with the Multi Layered Shielding Simulation Software (MULASSIS) contains all of the necessary components to implement the Dd methodology in a format complementary to that of SAVANT [5]. NRL is currently working with ESA and BIRA to include the Dd method of solar cell EOL calculations as an integral part of SPENVIS. This paper describes how this can be accomplished.
The effects of videotape modeling on staff acquisition of functional analysis methodology.
Moore, James W; Fisher, Wayne W
2007-01-01
Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape.
CERT tribal internship program. Final intern report: David Conrad, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-09-01
The intern's report contains a Master's thesis entitled ''An implementation analysis of the US Department of Energy's American Indian policy as part of its environmental restoration and waste management mission.'' This thesis examines the implementation of a working relationship between the Nez Perce Tribe and the US Department of Energy's Office of Environmental Restoration and Waste Management at the Hanford reservation. It examines the relationship using a qualitative methodology and three generations of policy analysis literature to gain a clear understanding of the potential for successful implementation.
Carbon dioxide fluid-flow modeling and injectivity calculations
Burke, Lauri
2011-01-01
These results were used to classify subsurface formations into three permeability classifications for the probabilistic calculations of storage efficiency and containment risk of the U.S. Geological Survey geologic carbon sequestration assessment methodology. This methodology is currently in use to determine the total carbon dioxide containment capacity of the onshore and State waters areas of the United States.
Luster measurements of lips treated with lipstick formulations.
Yadav, Santosh; Issa, Nevine; Streuli, David; McMullen, Roger; Fares, Hani
2011-01-01
In this study, digital photography in combination with image analysis was used to measure the luster of several lipstick formulations containing varying amounts and types of polymers. A weighed amount of lipstick was applied to a mannequin's lips and the mannequin was illuminated by a uniform beam of a white light source. Digital images of the mannequin were captured with a high-resolution camera and the images were analyzed using image analysis software. Luster analysis was performed using Stamm (L(Stamm)) and Reich-Robbins (L(R-R)) luster parameters. Statistical analysis was performed on each luster parameter (L(Stamm) and L(R-R)), peak height, and peak width. Peak heights for lipstick formulations containing 11% and 5% VP/eicosene copolymer were statistically different from those of the control. The L(Stamm) and L(R-R) parameters for the treatment containing 11% VP/eicosene copolymer were statistically different from those of the control. Based on the results obtained in this study, we are able to determine whether a polymer is a good pigment dispersant and contributes to visually detected shine of a lipstick upon application. The methodology presented in this paper could serve as a tool for investigators to screen their ingredients for shine in lipstick formulations.
Deepak, V; Kalishwaralal, K; Ramkumarpandian, S; Babu, S Venkatesh; Senthilkumar, S R; Sangiliyandi, G
2008-11-01
Response surface methodology and a central composite rotary design (CCRD) were employed to optimize a fermentation medium for the production of Nattokinase by Bacillus subtilis at pH 7.5. The four variables involved in this study were Glucose, Peptone, CaCl2, and MgSO4. The statistical analysis of the results showed that, in the range studied, only peptone had a significant effect on Nattokinase production. The optimized medium containing (%) Glucose: 1, Peptone: 5.5, MgSO4: 0.2 and CaCl2: 0.5 resulted in a 2-fold increase in Nattokinase production (3194.25 U/ml) compared to the initial level (1599.09 U/ml) after 10 h of fermentation. Nattokinase production was checked by means of fibrinolytic activity.
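A minimal sketch of the response-surface idea on a two-factor central composite design (the study itself used four factors); the design points, the simulated response, and the fitted quadratic are all illustrative, not the paper's data.

import numpy as np
from itertools import product

# Two-factor central composite design in coded units.
alpha = np.sqrt(2)
factorial = np.array(list(product([-1, 1], repeat=2)), dtype=float)
axial = np.array([[a, 0] for a in (-alpha, alpha)] + [[0, a] for a in (-alpha, alpha)])
center = np.zeros((5, 2))
X = np.vstack([factorial, axial, center])

# Hypothetical enzyme-activity response with noise (stand-in for measured Nattokinase activity).
rng = np.random.default_rng(3)
y = 3000 - 150 * (X[:, 0] - 0.4) ** 2 - 220 * (X[:, 1] - 0.7) ** 2 + 30 * X[:, 0] * X[:, 1]
y = y + rng.normal(0, 20, size=y.size)

# Fit the full second-order response surface by least squares.
def design_matrix(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
b0, b1, b2, b12, b11, b22 = coef

# Stationary (optimal) point of the fitted quadratic: solve the gradient equal to zero.
B = np.array([[2 * b11, b12], [b12, 2 * b22]])
x_opt = np.linalg.solve(B, [-b1, -b2])
print("coded optimum:", x_opt)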
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.
2012-08-01
Phase change materials (PCM) represent a potential technology to reduce peak loads and HVAC energy consumption in residential buildings. This paper summarizes NREL efforts to obtain accurate energy simulations when PCMs are modeled in residential buildings: the overall methodology to verify and validate the Conduction Finite Difference (CondFD) and PCM algorithms in EnergyPlus is presented in this study. It also shows preliminary results of three residential building enclosure technologies containing PCM: PCM-enhanced insulation, PCM-impregnated drywall and thin PCM layers. The results are compared based on predicted peak reduction and energy savings using two algorithms in EnergyPlus: the PCM and Conduction Finite Difference (CondFD) algorithms.
Reference Model 6 (RM6): Oscillating Wave Energy Converter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Diana L; Smith, Chris; Jenne, Dale Scott
This report is an addendum to SAND2013-9040: Methodology for Design and Economic Analysis of Marine Energy Conversion (MEC) Technologies. This report describes an Oscillating Water Column Wave Energy Converter reference model design in a complementary manner to Reference Models 1-4 contained in the above report. In this report, a conceptual design for an Oscillating Water Column Wave Energy Converter (WEC) device appropriate for the modeled reference resource site was identified, and a detailed backward bent duct buoy (BBDB) device design was developed using a combination of numerical modeling tools and scaled physical models. Our team used the methodology in SAND2013-9040 for the economic analysis that included costs for designing, manufacturing, deploying, and operating commercial-scale MEC arrays, up to 100 devices. The methodology was applied to identify key cost drivers and to estimate levelized cost of energy (LCOE) for this RM6 Oscillating Water Column device in dollars per kilowatt-hour ($/kWh). Although many costs were difficult to estimate at this time due to the lack of operational experience, the main contribution of this work was to disseminate a detailed set of methodologies and models that allow for an initial cost analysis of this emerging technology. This project is sponsored by the U.S. Department of Energy's (DOE) Wind and Water Power Technologies Program Office (WWPTO), within the Office of Energy Efficiency & Renewable Energy (EERE). Sandia National Laboratories, the lead in this effort, collaborated with partners from National Laboratories, industry, and universities to design and test this reference model.
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Krosel, S. M.; Bruton, W. M.
1982-01-01
A systematic, computer-aided, self-documenting methodology for developing hybrid computer simulations of turbofan engines is presented. The methodology presented here makes use of a host program that can run on a large digital computer and a machine-dependent target (hybrid) program. The host program performs all the calculations and data manipulations that are needed to transform user-supplied engine design information to a form suitable for the hybrid computer. The host program also trims the self-contained engine model to match specified design-point information. Part I contains a general discussion of the methodology, describes a test case, and presents comparisons between hybrid simulation and specified engine performance data. Part II, a companion document, contains documentation, in the form of computer printouts, for the test case.
A hierarchical clustering methodology for the estimation of toxicity.
Martin, Todd M; Harten, Paul; Venkatapathy, Raghuraman; Das, Shashikala; Young, Douglas M
2008-01-01
A quantitative structure-activity relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural similarity is defined in terms of 2-D physicochemical descriptors (such as connectivity and E-state indices). A genetic algorithm-based technique is used to generate statistically valid QSAR models for each cluster (using the pool of descriptors described above). The toxicity for a given query compound is estimated using the weighted average of the predictions from the closest cluster from each step in the hierarchical clustering, assuming that the compound is within the domain of applicability of the cluster. The hierarchical clustering methodology was tested using a Tetrahymena pyriformis acute toxicity data set containing 644 chemicals in the training set and with two prediction sets containing 339 and 110 chemicals. The results from the hierarchical clustering methodology were compared to the results from several different QSAR methodologies.
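A simplified sketch of the prediction scheme described above, under several stated substitutions: random numbers stand in for the descriptor matrix and toxicity endpoint, ordinary least-squares regression stands in for the genetic-algorithm-selected QSAR models, the closest-cluster predictions at a few hierarchy levels are averaged with equal weights, and no applicability-domain check is performed.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
X = rng.normal(size=(120, 8))                              # stand-in 2-D descriptor matrix
y = X @ rng.normal(size=8) + 0.3 * rng.normal(size=120)    # stand-in toxicity endpoint

# Ward's method divides the training set into structurally similar clusters.
Z = linkage(X, method="ward")

query = rng.normal(size=8)
preds = []
for n_clusters in (2, 4, 8):                               # a few steps of the hierarchy
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")
    centroids = np.array([X[labels == c].mean(axis=0) for c in np.unique(labels)])
    closest = np.unique(labels)[np.argmin(np.linalg.norm(centroids - query, axis=1))]
    members = labels == closest
    model = LinearRegression().fit(X[members], y[members])  # stand-in for the GA-selected QSAR model
    preds.append(model.predict(query[None, :])[0])

# Final estimate: (equal-)weighted average of the closest-cluster predictions.
print("predicted endpoint:", np.mean(preds))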
Statistical Time Series Models of Pilot Control with Applications to Instrument Discrimination
NASA Technical Reports Server (NTRS)
Altschul, R. E.; Nagel, P. M.; Oliver, F.
1984-01-01
This report contains a general description of the methodology used to obtain the transfer function models and to verify model fidelity; frequency-domain plots of the modeled transfer functions; numerical results obtained from an analysis of poles and zeroes derived from z-plane to s-plane conversions of the transfer functions; and the results of a study on the sequential introduction of other variables, both exogenous and endogenous, into the loop.
Preloaded joint analysis methodology for space flight systems
NASA Technical Reports Server (NTRS)
Chambers, Jeffrey A.
1995-01-01
This report contains a compilation of some of the most basic equations governing simple preloaded joint systems and discusses the more common modes of failure associated with such hardware. It is intended to provide the mechanical designer with the tools necessary for designing a basic bolted joint. Although the information presented is intended to aid in the engineering of space flight structures, the fundamentals are equally applicable to other forms of mechanical design.
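Two of the textbook relations such a report typically covers can be sketched numerically; the nut factor, preload, loads, and safety factor below are illustrative assumptions, not values from the report.

# Minimal sketch of two textbook preloaded-joint relations (illustrative values, not from the report):
#   installation torque:         T = K * d * F_p                    (K = nut factor)
#   separation margin of safety: MS = F_p / (SF * (1 - n) * P) - 1  (n = fraction of external load taken by the bolt)
K = 0.2            # nut factor (typical dry-thread value, assumed)
d = 0.00635        # bolt nominal diameter in meters (1/4 in)
F_p = 8000.0       # target preload, N
T = K * d * F_p
print(f"installation torque ~ {T:.1f} N*m")

P = 3000.0         # external tensile load per bolt, N (assumed)
n = 0.3            # joint stiffness factor: fraction of P carried by the bolt (assumed)
SF = 1.2           # separation safety factor
MS_separation = F_p / (SF * (1 - n) * P) - 1   # preload must exceed the load shed to the joint members
print(f"separation margin of safety ~ {MS_separation:.2f}")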
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable, and with the methodology used in this study these interactions can be quantitatively analyzed; and reasonably good approximate prediction equations can be developed using the methodology described here.
Methodology for Designing Fault-Protection Software
NASA Technical Reports Server (NTRS)
Barltrop, Kevin; Levison, Jeffrey; Kan, Edwin
2006-01-01
A document describes a methodology for designing fault-protection (FP) software for autonomous spacecraft. The methodology embodies and extends established engineering practices in the technical discipline of Fault Detection, Diagnosis, Mitigation, and Recovery, and has been successfully implemented in the Deep Impact Spacecraft, a NASA Discovery mission. Based on established concepts of Fault Monitors and Responses, this FP methodology extends the notion of Opinion, Symptom, Alarm (aka Fault), and Response with numerous new notions, sub-notions, software constructs, and logic and timing gates. For example, a Monitor generates a RawOpinion, which graduates into an Opinion, categorized as no-opinion, acceptable, or unacceptable. RaiseSymptom, ForceSymptom, and ClearSymptom govern the establishment of a Symptom and its mapping to an Alarm (aka Fault). Local Response is distinguished from FP System Response. A 1-to-n and n-to-1 mapping is established among Monitors, Symptoms, and Responses. Responses are categorized by device versus by function. Responses operate in tiers, where the early tiers attempt to resolve the Fault in a localized, step-by-step fashion, relegating more system-level response to later tier(s). Recovery actions are gated by epoch recovery timing, enabling strategy, urgency, MaxRetry gate, hardware availability, hazardous versus ordinary fault, and many other priority gates. This methodology is systematic, logical, and uses multiple linked tables, parameter files, and recovery command sequences. The credibility of the FP design is proven via a fault-tree analysis "top-down" approach and a functional fault-modes-and-effects analysis via a "bottom-up" approach. Via this process, the mitigation and recovery strategy(s) per Fault Containment Region scope (width versus depth) the FP architecture.
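An illustrative data-structure sketch of the Monitor-to-Symptom-to-Response chain and tiered responses described above; the class names, fields, and triggering rule are hypothetical simplifications, not the actual flight-software design.

from dataclasses import dataclass, field

@dataclass
class Response:
    name: str
    tier: int                      # early tiers act locally, later tiers act system-wide

@dataclass
class Symptom:
    name: str
    persistence: int               # unacceptable opinions needed before the symptom is raised
    responses: list = field(default_factory=list)   # 1-to-n and n-to-1 mappings are allowed

def evaluate(monitor_opinions, symptoms):
    """Raise symptoms whose unacceptable-opinion count meets persistence, then pick a response."""
    triggered = []
    for s in symptoms:
        if monitor_opinions.get(s.name, 0) >= s.persistence:
            # Attempt the lowest-tier (most local) response first.
            triggered.append(min(s.responses, key=lambda r: r.tier))
    return triggered

reboot = Response("reboot_device", tier=1)
safe_mode = Response("enter_safe_mode", tier=3)
symptoms = [Symptom("star_tracker_dropout", persistence=3, responses=[reboot, safe_mode])]
print([r.name for r in evaluate({"star_tracker_dropout": 4}, symptoms)])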
Proposed solution methodology for the dynamically coupled nonlinear geared rotor mechanics equations
NASA Technical Reports Server (NTRS)
Mitchell, L. D.; David, J. W.
1983-01-01
The equations which describe the three-dimensional motion of an unbalanced rigid disk in a shaft system are nonlinear and contain dynamic-coupling terms. Traditionally, investigators have used an order analysis to justify ignoring the nonlinear terms in the equations of motion, producing a set of linear equations. This paper will show that, when gears are included in such a rotor system, the nonlinear dynamic-coupling terms are potentially as large as the linear terms. Because of this, one must attempt to solve the nonlinear rotor mechanics equations. A solution methodology is investigated to obtain approximate steady-state solutions to these equations. As an example of the use of the technique, a simpler set of equations is solved and the results compared to numerical simulations. These equations represent the forced, steady-state response of a spring-supported pendulum. These equations were chosen because they contain the type of nonlinear terms found in the dynamically-coupled nonlinear rotor equations. The numerical simulations indicate this method is reasonably accurate even when the nonlinearities are large.
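As a small numerical counterpart to the example, the sketch below integrates a toy forced, damped pendulum (not the paper's exact spring-supported pendulum equations) long enough for transients to decay and then reads off the steady-state amplitude, the kind of reference result an approximate analytical solution would be compared against.

import numpy as np
from scipy.integrate import solve_ivp

# Toy forced, damped pendulum: theta'' + c*theta' + w0^2 * sin(theta) = F * cos(W * t)
c, w0, F, W = 0.15, 1.0, 0.6, 0.8

def rhs(t, y):
    theta, dtheta = y
    return [dtheta, -c * dtheta - w0**2 * np.sin(theta) + F * np.cos(W * t)]

# Integrate long enough for transients to die out, then measure the steady-state amplitude.
sol = solve_ivp(rhs, (0.0, 400.0), [0.0, 0.0], max_step=0.05, dense_output=True)
t_ss = np.linspace(300.0, 400.0, 5000)            # late-time window = steady state
theta_ss = sol.sol(t_ss)[0]
print("steady-state amplitude ~", 0.5 * (theta_ss.max() - theta_ss.min()))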
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
NASA Astrophysics Data System (ADS)
Crevelin, Eduardo J.; Salami, Fernanda H.; Alves, Marcela N. R.; De Martinis, Bruno S.; Crotti, Antônio E. M.; Moraes, Luiz A. B.
2016-05-01
Amphetamine-type stimulants (ATS) are among illicit stimulant drugs that are most often used worldwide. A major challenge is to develop a fast and efficient methodology involving minimal sample preparation to analyze ATS in biological fluids. In this study, a urine pool solution containing amphetamine, methamphetamine, ephedrine, sibutramine, and fenfluramine at concentrations ranging from 0.5 pg/mL to 100 ng/mL was prepared and analyzed by atmospheric solids analysis probe tandem mass spectrometry (ASAP-MS/MS) and multiple reaction monitoring (MRM). A urine sample and saliva collected from a volunteer contributor (V1) were also analyzed. The limit of detection of the tested compounds ranged between 0.002 and 0.4 ng/mL in urine samples; the signal-to-noise ratio was 5. These results demonstrated that the ASAP-MS/MS methodology is applicable for the fast detection of ATS in urine samples with great sensitivity and specificity, without the need for cleanup, preconcentration, or chromatographic separation. Thus ASAP-MS/MS could potentially be used in clinical and forensic toxicology applications.
A novel integrated assessment methodology of urban water reuse.
Listowski, A; Ngo, H H; Guo, W S; Vigneswaran, S
2011-01-01
Wastewater is no longer considered a waste product, and water reuse needs to play a stronger part in securing urban water supply. Although treatment technologies for water reclamation have significantly improved, a question that deserves further analysis is how the selection of a particular wastewater treatment technology relates to performance and sustainability. The proposed assessment model integrates: (i) technology, characterised by selected quantity and quality performance parameters; (ii) productivity, efficiency and reliability criteria; (iii) quantitative performance indicators; (iv) development of an evaluation model. The challenges related to the hierarchy and selection of performance indicators have been resolved through the case study analysis. The goal of this study is to validate a new assessment methodology in relation to performance of the microfiltration (MF) technology, a key element of the treatment process. Specific performance data and measurements were obtained at specific Control and Data Acquisition Points (CP) to satisfy the input-output inventory in relation to water resources, products, material flows, energy requirements, chemicals use, etc. The performance assessment process contains analysis and the necessary linking across important parametric functions, leading to reliable outcomes and results.
Arbitrary Steady-State Solutions with the K-epsilon Model
NASA Technical Reports Server (NTRS)
Rumsey, Christopher L.; Pettersson Reif, B. A.; Gatski, Thomas B.
2006-01-01
Widely-used forms of the K-epsilon turbulence model are shown to yield arbitrary steady-state converged solutions that are highly dependent on numerical considerations such as initial conditions and solution procedure. These solutions contain pseudo-laminar regions of varying size. By applying a nullcline analysis to the equation set, it is possible to clearly demonstrate the reasons for the anomalous behavior. In summary, the degenerate solution acts as a stable fixed point under certain conditions, causing the numerical method to converge there. The analysis also suggests a methodology for preventing the anomalous behavior in steady-state computations.
Stream habitat analysis using the instream flow incremental methodology
Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim
1998-01-01
This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It is intended to serve as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete and comprehensive description of IFIM in existence today. It should also serve as an official published guide to IFIM, counteracting the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, because it describes IFIM as it is envisioned by its developers. The document is aimed at the decisionmakers responsible for the management and allocation of natural resources, providing them an overview, and at those who design and implement studies to inform the decisionmakers. There should be enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Some of the chapters deal with the basic organization of IFIM and the procedural sequence of applying IFIM, starting with problem identification, then study planning and implementation, and problem resolution.
Methodologies for Crawler Based Web Surveys.
ERIC Educational Resources Information Center
Thelwall, Mike
2002-01-01
Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62 references.)
Convergent close coupling versus the generalized Sturmian function approach: Wave-function analysis
NASA Astrophysics Data System (ADS)
Ambrosio, M.; Mitnik, D. M.; Gasaneo, G.; Randazzo, J. M.; Kadyrov, A. S.; Fursa, D. V.; Bray, I.
2015-11-01
We compare the physical information contained in the Temkin-Poet (TP) scattering wave function representing electron-impact ionization of hydrogen, calculated by the convergent close-coupling (CCC) and generalized Sturmian function (GSF) methodologies. The idea is to show that the ionization cross section can be extracted from the wave functions themselves. Using two different procedures based on hyperspherical Sturmian functions we show that the transition amplitudes contained in both GSF and CCC scattering functions lead to similar single-differential cross sections. The single-continuum channels were also a subject of the present studies, and we show that the elastic and excitation amplitudes are essentially the same as well.
A normative price for a manufactured product: The SAMICS methodology. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Chamberlain, R. G.
1979-01-01
This executive summary of the Solar Array Manufacturing Industry Costing Standards (SAMICS) report contains a discussion of capabilities and limitations, a non-technical overview of the methodology, and a description of the input data which must be collected. It also describes the activities that were and are being taken to ensure the validity of the results, and contains an up-to-date bibliography of related documents.
González, Martín Maximino León
2009-10-01
With the purpose of analyzing the determinant-based health strategic planning model implemented in the municipality of Campo Bom, Rio Grande do Sul State, an observational, qualitative study was conducted, based on documental analysis and on an evaluation of new process technologies in local health administration. This study contains an analysis of the methodological coherence and applicability of this model, based on a review of the elaborated plans. The plans presented in the Campo Bom case show the possibility of integrating and applying, at the local level, a health strategic planning model oriented to the new health concepts, considering elements of different theoretical developments that enable a response to the most common local needs and situations. Evolutionary stages of health planning were identified, and integrative elements of the model and limitations of its application were analyzed, pointing to the need to support further study and development in the field.
Critical Protection Item classification for a waste processing facility at Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ades, M.J.; Garrett, R.J.
1993-10-01
This paper describes the methodology for Critical Protection Item (CPI) classification and its application to the Structures, Systems and Components (SSC) of a waste processing facility at the Savannah River Site (SRS). The WSRC methodology for CPI classification includes the evaluation of the radiological and non-radiological consequences resulting from postulated accidents at the waste processing facility and comparison of these consequences with allowable limits. The types of accidents considered include explosions and fire in the facility and postulated accidents due to natural phenomena, including earthquakes, tornadoes, and high velocity straight winds. The radiological analysis results indicate that CPIs are not required at the waste processing facility to mitigate the consequences of radiological release. The non-radiological analysis, however, shows that the Waste Storage Tank (WST) and the dike spill containment structures around the formic acid tanks in the cold chemical feed area and waste treatment area of the facility should be identified as CPIs. Accident mitigation options are provided and discussed.
Deep Borehole Emplacement Mode Hazard Analysis Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David
This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report is concentrated more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]
Scanning Angle Raman spectroscopy in polymer thin film characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Vy H.T.
The focus of this thesis is the application of Raman spectroscopy for the characterization of thin polymer films. Chapter 1 provides background information and motivation, including the fundamentals of Raman spectroscopy for chemical analysis, scanning angle Raman scattering, and scanning angle Raman scattering for applications in thin polymer film characterization. Chapter 2 presents a published manuscript that focuses on the application of scanning angle Raman spectroscopy for the analysis of submicron thin films, with a description of the methodology for measuring the film thickness and the location of an interface between two polymer layers. Chapter 3 provides an outlook and future directions for the work outlined in this thesis. Appendix A contains a published manuscript that outlines the use of Raman spectroscopy to aid in the synthesis of heterogeneous catalytic systems. Appendices B and C contain published manuscripts that set a foundation for the work presented in Chapter 2.
1980-08-01
2. METHODOLOGY. The first step required in this study was to characterize the prone protected posture. Basically, a man in the prone posture differs ... reduction in the presented area of target personnel. Reference 6 contains a concise discussion of the methodology used to generate the shielding functions.
Methodology for Evaluating Encapsulated Beneficial Uses of Coal Combustion Residuals
The primary purpose of this document is to present an evaluation methodology developed by the EPA for making determinations about environmental releases from encapsulated products containing coal combustion residuals.
Riley, Richard D; Ensor, Joie; Jackson, Dan; Burke, Danielle L
2017-01-01
Many meta-analysis models contain multiple parameters, for example due to multiple outcomes, multiple treatments or multiple regression coefficients. In particular, meta-regression models may contain multiple study-level covariates, and one-stage individual participant data meta-analysis models may contain multiple patient-level covariates and interactions. Here, we propose how to derive percentage study weights for such situations, in order to reveal the (otherwise hidden) contribution of each study toward the parameter estimates of interest. We assume that studies are independent, and utilise a decomposition of Fisher's information matrix to decompose the total variance matrix of parameter estimates into study-specific contributions, from which percentage weights are derived. This approach generalises how percentage weights are calculated in a traditional, single parameter meta-analysis model. Application is made to one- and two-stage individual participant data meta-analyses, meta-regression and network (multivariate) meta-analysis of multiple treatments. These reveal percentage study weights toward clinically important estimates, such as summary treatment effects and treatment-covariate interactions, and are especially useful when some studies are potential outliers or at high risk of bias. We also derive percentage study weights toward methodologically interesting measures, such as the magnitude of ecological bias (difference between within-study and across-study associations) and the amount of inconsistency (difference between direct and indirect evidence in a network meta-analysis).
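To make the decomposition idea concrete, the following is a minimal numerical sketch, assuming independent studies whose Fisher information matrices simply add; the covariance values, the two-parameter setup, and the weight definition used here (diagonal of V I_i V relative to the total variance) are illustrative readings of the approach, not the authors' exact derivation:

```python
import numpy as np

# Hypothetical within-study covariance matrices for 3 independent studies,
# each estimating 2 parameters (e.g., a treatment effect and an interaction).
study_cov = [
    np.array([[0.04, 0.01], [0.01, 0.09]]),
    np.array([[0.02, 0.00], [0.00, 0.05]]),
    np.array([[0.06, 0.02], [0.02, 0.12]]),
]

# Fisher information contributed by each study (inverse of its covariance).
infos = [np.linalg.inv(S) for S in study_cov]

# Total information and total variance of the pooled parameter estimates.
total_info = sum(infos)
total_var = np.linalg.inv(total_info)

# Decompose the total variance into study-specific contributions
# V_i = V @ I_i @ V, which sum exactly to the total variance matrix.
contributions = [total_var @ I_i @ total_var for I_i in infos]

# Percentage weight of each study toward each parameter (diagonal elements).
for i, C in enumerate(contributions, start=1):
    weights = 100 * np.diag(C) / np.diag(total_var)
    print(f"Study {i}: weights per parameter = {np.round(weights, 1)} %")
```

By construction the study contributions add up to the total variance, so the printed percentages sum to 100 for each parameter, which is the property that generalizes single-parameter percentage weights.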
DOE Office of Scientific and Technical Information (OSTI.GOV)
Espinosa-Paredes, Gilberto; Prieto-Guerrero, Alfonso; Nunez-Carrera, Alejandro
This paper introduces a wavelet-based method to analyze instability events in a boiling water reactor (BWR) during transient phenomena. The methodology to analyze BWR signals includes the following: (a) the short-time Fourier transform (STFT) analysis, (b) decomposition using the continuous wavelet transform (CWT), and (c) application of multiresolution analysis (MRA) using the discrete wavelet transform (DWT). STFT analysis permits the study, in time, of the spectral content of analyzed signals. The CWT provides information about ruptures, discontinuities, and fractal behavior. To detect these important features in the signal, a mother wavelet has to be chosen and applied at several scales to obtain optimum results. MRA allows fast implementation of the DWT. Features like important frequencies, discontinuities, and transients can be detected with analysis at different levels of detail coefficients. The STFT was used to provide a comparison between a classic method and the wavelet-based method. The damping ratio, which is an important stability parameter, was calculated as a function of time. The transient behavior can be detected by analyzing the maximum contained in detail coefficients at different levels in the signal decomposition. This method allows analysis of both stationary signals and highly nonstationary signals in the timescale plane. This methodology has been tested with the benchmark power instability event of Laguna Verde nuclear power plant (NPP) Unit 1, which is a BWR-5 NPP.
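As a rough illustration of steps (a)-(c), the sketch below applies SciPy's STFT and the PyWavelets CWT/DWT routines to a synthetic growing oscillation standing in for a BWR flux signal; the sampling rate, wavelet choices, and decomposition level are assumptions, not the parameters used for the Laguna Verde analysis:

```python
import numpy as np
from scipy.signal import stft
import pywt

# Synthetic stand-in for a BWR neutron-flux signal: a growing oscillation
# (~0.5 Hz density-wave-like mode) plus noise, sampled at 10 Hz.
fs = 10.0
t = np.arange(0, 300, 1 / fs)
signal = np.exp(0.005 * t) * np.sin(2 * np.pi * 0.5 * t) + 0.2 * np.random.randn(t.size)

# (a) Short-time Fourier transform: spectral content as a function of time.
f, tau, Z = stft(signal, fs=fs, nperseg=256)
dominant = f[np.abs(Z).argmax(axis=0)]          # dominant frequency per time slice
print("dominant STFT frequency (Hz), first slices:", np.round(dominant[:5], 2))

# (b) Continuous wavelet transform: localises ruptures and discontinuities.
scales = np.arange(1, 64)
cwt_coeffs, cwt_freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1 / fs)
print("CWT coefficient matrix shape:", cwt_coeffs.shape)

# (c) Multiresolution analysis via the discrete wavelet transform: detail
# coefficients at each level flag transients at different scales.
coeffs = pywt.wavedec(signal, 'db4', level=5)
for d, level in zip(coeffs[1:], range(5, 0, -1)):
    print(f"Level {level} detail: max |coefficient| = {np.max(np.abs(d)):.3f}")
```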
Fish, Wayne W
2007-02-21
Natural sources of carotenoids for nutraceutical use are desired by the food industry as a result of the increased production of convenience and other highly processed foods. As new physiological roles are discovered for some of the minor carotenoids that are found in only small amounts in present sources, the need for discovery of new sources will amplify. Thus, a method is needed that will effectively and gently concentrate carotenoids from potential new sources for subsequent identification and analysis. A procedure is presented by which carotenoid-containing tissue chromoplasts can be extracted and subsequently concentrated by precipitation, all in an aqueous milieu. The chromoplasts are extracted and solubilized with 0.3% sodium dodecyl sulfate (SDS) in water. The addition of a nominally equal volume of acetonitrile to the chromoplasts in SDS immediately precipitates the chromoplasts out of solution with generally >90% recovery. Carotenoids contained in the concentrated, still-intact chromoplasts can then be solubilized by organic solvent extraction for subsequent analysis. This methodology offers a means to effectively and gently concentrate carotenoids from fruit tissues where yields are often low (e.g., yellow watermelon).
Ashengroph, Morahem; Ababaf, Sajad
2014-12-01
Microbial caffeine removal is a green solution for treatment of caffeinated products and agro-industrial effluents. We directed this investigation toward optimizing a bio-decaffeination process with growing cultures of Pseudomonas pseudoalcaligenes through Taguchi methodology, a structured statistical approach that can lower variation in a process through Design of Experiments (DOE). Five parameters, i.e., initial fructose, tryptone, Zn(+2) ion and caffeine concentrations, as well as incubation time, were selected, and an L16 orthogonal array was applied to design experiments with four 4-level factors and one 3-level factor (4(4) × 1(3)). Data analysis was performed using the statistical analysis of variance (ANOVA) method. Furthermore, the optimal conditions were determined by combining the optimal levels of the significant factors and verified by a confirming experiment. Measurement of residual caffeine concentration in the reaction mixture was performed using high-performance liquid chromatography (HPLC). Use of Taguchi methodology for optimization of design parameters resulted in about 86.14% reduction of caffeine in 48 h of incubation when 5 g/l fructose, 3 mM Zn(+2) ion and 4.5 g/l caffeine are present in the designed media. Under the optimized conditions, the yield of degradation of caffeine (4.5 g/l) by the native strain Pseudomonas pseudoalcaligenes TPS8 increased from 15.8% to 86.14%, which is 5.4-fold higher than the normal yield. According to the experimental results, Taguchi methodology provides a powerful approach for identifying the parameters favorable to caffeine removal using strain TPS8, and suggests that the approach also has potential application with similar strains to improve the yield of caffeine removal from caffeine-containing solutions.
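A minimal sketch of the main-effect analysis behind an orthogonal-array experiment is shown below; it uses a toy two-level L4 design and invented removal values rather than the paper's mixed-level L16 array and measured data:

```python
import numpy as np
import pandas as pd

# Toy stand-in for an orthogonal-array experiment: 4 runs of a 3-factor,
# 2-level L4 design (the study used a mixed-level L16 array); the response
# is percent caffeine removal.  All numbers are illustrative only.
design = pd.DataFrame({
    "fructose": [1, 1, 2, 2],
    "zinc":     [1, 2, 1, 2],
    "time":     [1, 2, 2, 1],
    "removal":  [42.0, 61.0, 55.0, 80.0],
})

# Main effect of each factor = difference between mean responses at its levels.
for factor in ["fructose", "zinc", "time"]:
    level_means = design.groupby(factor)["removal"].mean()
    print(factor, "level means:", level_means.to_dict(),
          "effect:", round(level_means.max() - level_means.min(), 1))

# Larger-is-better signal-to-noise ratio per run (a standard Taguchi criterion).
design["SN"] = -10 * np.log10(1 / design["removal"] ** 2)
print(design[["removal", "SN"]])
```

In a full analysis the factor effects would then be screened by ANOVA and the best level of each significant factor combined into the predicted optimum, which is what the confirming experiment checks.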
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donnelly, H.; Fullwood, R.; Glancy, J.
This is the second volume of a two-volume report on the VISA method for evaluating safeguards at fixed-site facilities. This volume contains appendices that support the description of the VISA concept and the initial working version of the method, VISA-1, presented in Volume I. The information is separated into four appendices, each describing details of one of the four analysis modules that comprise the analysis sections of the method. The first appendix discusses Path Analysis methodology, applies it to a Model Fuel Facility, and describes the computer codes that are being used. Introductory material on Path Analysis is given in Chapter 3.2.1 and Chapter 4.2.1 of Volume I. The second appendix deals with Detection Analysis, specifically the schemes used in VISA-1 for classifying adversaries and the methods proposed for evaluating individual detection mechanisms in order to build the data base required for detection analysis. Examples of evaluations of identity-access systems, SNM portal monitors, and intrusion devices are provided. The third appendix describes the Containment Analysis overt-segment path ranking, the Monte Carlo engagement model, the network simulation code, the delay mechanism data base, and the results of a sensitivity analysis. The last appendix presents general equations used in Interruption Analysis for combining covert-overt segments and compares them with equations given in Volume I, Chapter 3.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Applications of artificial intelligence V; Proceedings of the Meeting, Orlando, FL, May 18-20, 1987
NASA Technical Reports Server (NTRS)
Gilmore, John F. (Editor)
1987-01-01
The papers contained in this volume focus on current trends in applications of artificial intelligence. Topics discussed include expert systems, image understanding, artificial intelligence tools, knowledge-based systems, heuristic systems, manufacturing applications, and image analysis. Papers are presented on expert system issues in automated, autonomous space vehicle rendezvous; traditional versus rule-based programming techniques; applications to the control of optional flight information; methodology for evaluating knowledge-based systems; and real-time advisory system for airborne early warning.
A global audit of the status and trends of Arctic and Northern Hemisphere goose populations
Schmutz, Joel A.; Fox, Anthony D.; Leafloor, James O.
2018-01-01
This report attempts to review the abundance, status and distribution of natural wild goose populations in the northern hemisphere. The report comprises three parts that 1) summarise key findings from the study and the methodology and analysis applied; 2) contain the individual accounts for each of the 68 populations included in this report; and 3) provide the datasets compiled for this study which will be made accessible on the Arctic Biodiversity Data Service.
Chicago Area Transportation Study (CATS): Methodological Overview
DOT National Transportation Integrated Search
1994-04-01
This report contains a methodological discussion of the Chicago Area Transportation Study (CATS) 1990 Household Travel Survey. It was prepared to assist those who are working with the Household Travel Survey database. This report concentrates o...
Inquiry and Cultural Responsive Teaching in General Music
ERIC Educational Resources Information Center
Hayes, Christine Cozzens
2013-01-01
Inquiry-based learning is shown as an effective methodology to reach diverse student populations. It aligns with the National Center for Culturally Responsive Educational Systems and their methodology of culturally responsive teaching. (Contains 2 resources.)
Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno
2018-05-28
Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load, given that the imaged tissue is in a pre-stressed and pre-strained state. Neglecting this prestressed state in solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress), which in turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, both having their inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single-shot methodology to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, hinder broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and a porcine biventricular myocardium model. The computational power and efficiency of the methodology were shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
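For orientation, the sketch below illustrates the common iterative alternative to the direct IE approach: a fixed-point "backward displacement" style loop that pulls the reference geometry back until a forward solve reproduces the imaged shape. The forward_solve function is a placeholder standing in for a real FEA call, and nothing here reflects the authors' modular implementation:

```python
import numpy as np

def forward_solve(reference_nodes, pressure):
    """Placeholder for a finite-element forward solve: given an assumed
    unloaded geometry, return the deformed nodal coordinates under load.
    In practice this would be a call to an FEA code."""
    # Purely illustrative deformation: a pressure-dependent radial inflation.
    return reference_nodes * (1.0 + 0.05 * pressure)

def backward_displacement(imaged_nodes, pressure, tol=1e-8, max_iter=50):
    """Fixed-point iteration for the unloaded geometry X such that
    forward_solve(X, pressure) reproduces the imaged (loaded) geometry."""
    X = imaged_nodes.copy()                       # initial guess: imaged geometry
    for _ in range(max_iter):
        x = forward_solve(X, pressure)            # deformed shape of current guess
        residual = x - imaged_nodes
        if np.linalg.norm(residual) < tol:
            break
        X = X - residual                          # pull nodes back by the mismatch
    return X

imaged = np.array([[10.0, 0.0], [0.0, 10.0], [-10.0, 0.0]])   # toy "imaged" nodes (mm)
unloaded = backward_displacement(imaged, pressure=1.0)
print(np.round(unloaded, 3))
```

The appeal of the direct IE route described in the abstract is precisely that it avoids this repeated forward solving, at the cost of more intrusive formulation work inside the solver.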
Novel analytical methods to assess the chemical and physical properties of liposomes.
Kothalawala, Nuwan; Mudalige, Thilak K; Sisco, Patrick; Linder, Sean W
2018-08-01
Liposomes are used in commercial pharmaceutical formulations (PFs) and dietary supplements (DSs) as a carrier vehicle to protect the active ingredient from degradation and to increase the half-life of the injectable. Even as the commercialization of liposomal products has rapidly increased, characterization methodologies to evaluate the physical and chemical properties of liposomal products have not been well established. Herein we develop rapid methodologies to evaluate chemical and selected physical properties of liposomal formulations. Chemical properties of liposomes are determined by their lipid composition. The lipid composition is evaluated by first screening the lipids present in the sample using HPLC-ELSD, followed by HPLC-MS/MS analysis with high mass accuracy (<5 ppm), fragmentation patterns, and searches of lipid structure databases. Physical properties such as particle size and size distribution were investigated using Tunable Resistive Pulse Sensing (TRPS). The developed methods were used to analyze commercially available PFs and DSs. The PFs contained the distinct numbers of lipids indicated by the manufacturer, whereas the DSs were more complex, containing a large number of lipids belonging to different sub-classes. Commercially available liposomes had particles with wide size distributions based on size measurements performed by TRPS. The high mass accuracy, together with identification of lipids using multiple fragment ions, helped to accurately identify the lipids and differentiate them from other lipophilic molecules. The developed analytical methodologies were successfully adapted to measure the physiochemical properties of commercial liposomes. Copyright © 2018. Published by Elsevier B.V.
A genomic regulatory network for development
NASA Technical Reports Server (NTRS)
Davidson, Eric H.; Rast, Jonathan P.; Oliveri, Paola; Ransick, Andrew; Calestani, Cristina; Yuh, Chiou-Hwa; Minokawa, Takuya; Amore, Gabriele; Hinman, Veronica; Arenas-Mena, Cesar;
2002-01-01
Development of the body plan is controlled by large networks of regulatory genes. A gene regulatory network that controls the specification of endoderm and mesoderm in the sea urchin embryo is summarized here. The network was derived from large-scale perturbation analyses, in combination with computational methodologies, genomic data, cis-regulatory analysis, and molecular embryology. The network contains over 40 genes at present, and each node can be directly verified at the DNA sequence level by cis-regulatory analysis. Its architecture reveals specific and general aspects of development, such as how given cells generate their ordained fates in the embryo and why the process moves inexorably forward in developmental time.
Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique
2009-04-01
A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This scoring system may also be used to appraise the methodological quality of qualitative, quantitative and mixed methods components of mixed methods research.
Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.
The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. PNNL also supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model that was built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS and as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.
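A toy accounting sketch of the kind of savings calculation such a model performs is given below; the interactive-effect factor, energy quantities, and price are assumed values for illustration and do not represent the BEAMS algorithms:

```python
# Minimal accounting sketch (assumed structure, not the BEAMS algorithms):
# lighting savings reduce internal heat gains, so cooling energy drops and
# heating energy rises; an "interactive factor" captures the net HVAC effect.
baseline_lighting_kwh = 120_000
efficient_lighting_kwh = 80_000
lighting_savings_kwh = baseline_lighting_kwh - efficient_lighting_kwh

hvac_interactive_factor = 0.12        # assumed net HVAC credit per lighting kWh saved
total_savings_kwh = lighting_savings_kwh * (1 + hvac_interactive_factor)

price_per_kwh = 0.11                  # $/kWh, illustrative
print(f"Site energy savings: {total_savings_kwh:,.0f} kWh "
      f"(~${total_savings_kwh * price_per_kwh:,.0f}/yr)")
```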
Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A
2018-01-01
This study compares conventional grab sampling to incremental sampling methodology (ISM) to characterize metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
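The flavor of such a resampling comparison can be sketched as follows, with a synthetic lognormal "site" standing in for the field data; the sample sizes and distribution parameters are assumptions, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical site population of Pb concentrations (mg/kg): lognormal with
# a long right tail, mimicking positively skewed small-arms-range data.
population = rng.lognormal(mean=4.0, sigma=1.2, size=100_000)

def grab_mean(n_grabs=10):
    """Mean of n discrete grab samples."""
    return rng.choice(population, n_grabs).mean()

def ism_mean(n_increments=50, n_replicates=3):
    """Mean of replicate incremental samples, each compositing many increments."""
    reps = [rng.choice(population, n_increments).mean() for _ in range(n_replicates)]
    return np.mean(reps)

grab_means = np.array([grab_mean() for _ in range(2000)])
ism_means = np.array([ism_mean() for _ in range(2000)])

for label, m in [("grab", grab_means), ("ISM", ism_means)]:
    print(f"{label}: mean of means = {m.mean():.0f}, "
          f"SD of means = {m.std():.0f}, 95th pct = {np.percentile(m, 95):.0f}")
```

Compositing many increments averages out the hot spots, so the ISM means cluster much more tightly than the grab means, which is the behavior the study reports.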
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Progress reports are presented for the following fuels research projects: development of analytical methodology for analysis of heavy crudes; and thermochemistry and thermophysical properties of organic nitrogen and diheteroatom-containing compounds. Some of the accomplishments are: topical reports summarizing GC/MS methodology for determination of amines in petroleum and catalytic cracking behavior of compound types in Wilmington 650 °F+ resid were completed; density measurements between 320 K and 550 K were completed for 8-methylquinoline; high-temperature heat capacities and the critical temperature (near 800 K) of 8-methylquinoline were determined; vapor-pressure measurements were completed for 2,6-dimethylpyridine; and a series of enthalpy-of-combustion measurements was completed for 1,10-phenanthroline, phenazine, 2-methylquinoline, and 8-methylquinoline.
NASA Astrophysics Data System (ADS)
Nebot, Àngela; Mugica, Francisco
2012-10-01
Fuzzy inductive reasoning (FIR) is a modelling and simulation methodology derived from the General Systems Problem Solver. It compares favourably with other soft computing methodologies, such as neural networks, genetic or neuro-fuzzy systems, and with hard computing methodologies, such as AR, ARIMA, or NARMAX, when it is used to predict future behaviour of different kinds of systems. This paper contains an overview of the FIR methodology, its historical background, and its evolution.
Okahashi, Nobuyuki; Kohno, Susumu; Kitajima, Shunsuke; Matsuda, Fumio; Takahashi, Chiaki; Shimizu, Hiroshi
2015-12-01
Studying metabolic directions and flow rates in cultured mammalian cells can provide key information for understanding metabolic function in the fields of cancer research, drug discovery, stem cell biology, and antibody production. In this work, metabolic engineering methodologies including medium component analysis, (13)C-labeling experiments, and computer-aided simulation analysis were applied to characterize the metabolic phenotype of soft tissue sarcoma cells derived from p53-null mice. Cells were cultured in medium containing [1-(13)C] glutamine to assess the level of reductive glutamine metabolism via the reverse reaction of isocitrate dehydrogenase (IDH). The specific uptake and production rates of glucose, organic acids, and the 20 amino acids were determined by time-course analysis of cultured media. Gas chromatography-mass spectrometry analysis of the (13)C-labeling of citrate, succinate, fumarate, malate, and aspartate confirmed an isotopically steady state of the cultured cells. After removing the effect of naturally occurring isotopes, the direction of the IDH reaction was determined by computer-aided analysis. The results validated that metabolic engineering methodologies are applicable to soft tissue sarcoma cells derived from p53-null mice, and also demonstrated that reductive glutamine metabolism is active in p53-null soft tissue sarcoma cells under normoxia. Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Artistic image analysis using graph-based learning approaches.
Carneiro, Gustavo
2013-08-01
We introduce a new methodology for the problem of artistic image analysis, which among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing the similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation that is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to a more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.
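A generic label-propagation random walk on a small similarity graph is sketched below for orientation; the graph weights and the propagation scheme are illustrative and simpler than the inverted label propagation formulation proposed in the paper:

```python
import numpy as np

# Toy similarity graph over 6 images: entry (i, j) combines appearance and
# annotation similarity (values are illustrative).
W = np.array([
    [0, .9, .1, 0, 0, 0],
    [.9, 0, .2, .1, 0, 0],
    [.1, .2, 0, .8, .1, 0],
    [0, .1, .8, 0, .7, .1],
    [0, 0, .1, .7, 0, .9],
    [0, 0, 0, .1, .9, 0],
], dtype=float)

# Row-normalised transition matrix for the random walk.
P = W / W.sum(axis=1, keepdims=True)

# Two labelled seed images (visual class 0 and class 1); the rest unlabelled.
Y = np.zeros((6, 2))
Y[0, 0] = 1.0
Y[5, 1] = 1.0

# Standard label propagation: F <- alpha * P @ F + (1 - alpha) * Y.
alpha, F = 0.8, Y.copy()
for _ in range(100):
    F = alpha * P @ F + (1 - alpha) * Y

print(np.round(F / F.sum(axis=1, keepdims=True), 2))   # per-image class scores
```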
EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.
Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D
2012-01-01
Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (e.g., sequence conservation, orthology, synteny ...) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.
APPLICATION OF A GEOGRAPHIC INFORMATION SYSTEM FOR A CONTAINMENT SYSTEM LEAK DETECTION
The use of physical and hydraulic containment systems for the isolation of contaminated ground water associated with hazardous waste sites has increased during the last decade. Existing methodologies for monitoring and evaluating leakage from hazardous waste containment systems ...
Anelone, Anet J N; Spurgeon, Sarah K
2017-02-01
It is demonstrated that the reachability paradigm from variable structure control theory is a suitable framework to monitor and predict the progression of the human immunodeficiency virus (HIV) infection following initiation of antiretroviral therapy (ART). A manifold is selected which characterises the infection-free steady-state. A model of HIV infection together with an associated reachability analysis is used to formulate a dynamical condition for the containment of HIV infection on the manifold. This condition is tested using data from two different HIV clinical trials which contain measurements of the CD4+ T cell count and HIV load in the peripheral blood collected from HIV infected individuals for the six month period following initiation of ART. The biological rates of the model are estimated using the multi-point identification method and data points collected in the initial period of the trial. Using the parameter estimates and the numerical solutions of the model, the predictions of the reachability analysis are shown to be consistent with the clinical diagnosis at the conclusion of the trial. The methodology captures the dynamical characteristics of eventual successful, failed and marginal outcomes. The findings evidence that the reachability analysis is an appropriate tool to monitor and develop personalised antiretroviral treatment.
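To give a flavor of the underlying dynamics, the sketch below simulates a textbook three-compartment HIV model under therapy and checks whether infected cells and virus decay toward the infection-free state over a six-month window; the model structure, parameter values, and the simple decay check are stand-ins, not the paper's model or its reachability condition:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Basic target-cell (T), infected-cell (I), virus (V) model under ART with
# efficacy eps; all rates are textbook-style placeholders, not fitted values.
lam, d, beta, delta, p, c, eps = 10.0, 0.01, 2.4e-5, 0.5, 100.0, 3.0, 0.85

def hiv(t, y):
    T, I, V = y
    dT = lam - d * T - (1 - eps) * beta * T * V
    dI = (1 - eps) * beta * T * V - delta * I
    dV = p * I - c * V
    return [dT, dI, dV]

sol = solve_ivp(hiv, (0, 180), [500.0, 10.0, 5.0e4],
                t_eval=np.linspace(0, 180, 181))

# Crude surrogate for "containment on the infection-free manifold":
# infected cells and virus decaying toward zero by the end of the 6-month window.
print(f"I(180) = {sol.y[1, -1]:.2e}, V(180) = {sol.y[2, -1]:.2e}")
```

With the assumed therapy efficacy the effective reproduction number falls below one, so both infected-cell and viral compartments decay, which is the qualitative behaviour a successful-outcome trajectory would show.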
Kamilari, Eleni; Farsalinos, Konstantinos; Poulas, Konstantinos; Kontoyannis, Christos G; Orkoula, Malvina G
2018-06-01
Electronic cigarettes are considered healthier alternatives to conventional cigarettes containing tobacco. They produce vapor through heating of the refill liquids (e-liquids), which consist of propylene glycol, vegetable glycerin, nicotine (in various concentrations), water and flavoring agents. Heavy metals may enter the refill liquid during production, posing a risk to consumers' health due to their toxicity. The objective of the present study was the development of a methodology for the detection and quantitative analysis of cadmium (Cd), lead (Pb), nickel (Ni), copper (Cu), arsenic (As) and chromium (Cr), employing Total Reflection X-Ray Fluorescence Spectroscopy (TXRF) as an alternative technique to the ICP-MS or ICP-OES commonly used for this type of analysis. TXRF was chosen due to its advantages, which include short analysis time, promptness, simultaneous multi-element analysis capability, minimal sample preparation, and low purchase and operational cost. The proposed methodology was applied to a large number of commercially available electronic cigarette liquids, as well as their constituents, in order to evaluate their safety. TXRF may be a valuable tool for probing heavy metals in electronic cigarette refill liquids, serving to protect human health. Copyright © 2018 Elsevier Ltd. All rights reserved.
Arribas, Alberto Sánchez; Martínez-Fernández, Marta; Moreno, Mónica; Bermejo, Esperanza; Zapardiel, Antonio; Chicharro, Manuel
2014-06-01
A method was developed for the simultaneous detection of eight polyphenols (t-resveratrol, (+)-catechin, quercetin and p-coumaric, caffeic, sinapic, ferulic, and gallic acids) by CZE with electrochemical detection. Separation of these polyphenols was achieved within 25 min using a 200 mM borate buffer (pH 9.4) containing 10% methanol as separation electrolyte. Amperometric detection of polyphenols was carried out with a glassy carbon electrode (GCE) modified with a multiwalled carbon nanotubes (CNT) layer obtained from a dispersion of CNT in polyethylenimine. The excellent electrochemical properties of this modified electrode allowed the detection and quantification of the selected polyphenols in white wines without any pretreatment step, showing remarkable signal stability despite the presence of potential fouling substances in wine. The electrophoretic profiles of white wines, obtained using this methodology, have proven to be useful for the classification of these wines by means of chemometric multivariate techniques. Principal component analysis and discriminant analysis allowed accurate classification of wine samples on the basis of their grape varietal (verdejo and airén) using the information contained in selected zones of the electropherogram. The utility of the proposed CZE methodology based on the electrochemical response of CNT-modified electrodes appears to be promising in the field of wine industry and it is expected to be successfully extended to classification of a wider range of wines made of other grape varietals. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
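A minimal sketch of the chemometric step (PCA for exploration, then a discriminant classifier assessed by cross-validation) is given below using scikit-learn on synthetic peak-area data; the feature set and class structure are invented for illustration, not taken from the wine electropherograms:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Stand-in data: 40 wines x 8 polyphenol peak areas; verdejo (class 0) and
# airén (class 1) differ slightly in a few peaks.  Values are synthetic.
X = rng.normal(size=(40, 8))
y = np.repeat([0, 1], 20)
X[y == 1, :3] += 0.9                     # simulate varietal differences

# PCA for an unsupervised look at the structure ...
scores = PCA(n_components=2).fit_transform(X)

# ... then discriminant analysis for supervised classification, cross-validated.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
print(f"First two PC scores shape: {scores.shape}; LDA CV accuracy ~ {acc:.2f}")
```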
Improved Atmospheric Soundings and Error Estimates from Analysis of AIRS/AMSU Data
NASA Technical Reports Server (NTRS)
Susskind, Joel
2007-01-01
The AIRS Science Team Version 5.0 retrieval algorithm became operational at the Goddard DAAC in July 2007, generating near real-time products from analysis of AIRS/AMSU sounding data. This algorithm contains many significant theoretical advances over the AIRS Science Team Version 4.0 retrieval algorithm used previously. Three very significant developments of Version 5 are: 1) the development and implementation of an improved Radiative Transfer Algorithm (RTA) which allows for accurate treatment of non-Local Thermodynamic Equilibrium (non-LTE) effects on shortwave sounding channels; 2) the development of methodology to obtain very accurate case-by-case product error estimates which are in turn used for quality control; and 3) the development of an accurate AIRS-only cloud clearing and retrieval system. These theoretical improvements taken together enabled a new methodology to be developed which further improves soundings in partially cloudy conditions, without the need for microwave observations in the cloud clearing step as has been done previously. In this methodology, longwave CO2 channel observations in the spectral region 700 cm-1 to 750 cm-1 are used exclusively for cloud clearing purposes, while shortwave CO2 channels in the spectral region 2195 cm-1 to 2395 cm-1 are used for temperature sounding purposes. The new methodology for improved error estimates and their use in quality control is described briefly and results are shown indicative of their accuracy. Results are also shown of forecast impact experiments assimilating AIRS Version 5.0 retrieval products in the Goddard GEOS-5 Data Assimilation System using different quality control thresholds.
This document contains general comments on the original Indicators methodology, the toxicity weighting, the chronic ecological indicator and other issues. OPPT's responses and proposed changes are also discussed.
Biologically-inspired data decorrelation for hyper-spectral imaging
NASA Astrophysics Data System (ADS)
Picon, Artzai; Ghita, Ovidiu; Rodriguez-Vaamonde, Sergio; Iriondo, Pedro Ma; Whelan, Paul F.
2011-12-01
Hyper-spectral data allows the construction of more robust statistical models to sample the material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of the hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), or band selection methods require complex and subjective training procedures and in addition the compressed spectral information is not directly related to the physical (spectral) characteristics associated with the analyzed materials. The major objective of this article is to introduce and evaluate a new data decorrelation methodology using an approach that closely emulates the human vision. The proposed data decorrelation scheme has been employed to optimally minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
NASA Astrophysics Data System (ADS)
Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui
2016-07-01
Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises the collection of drill core data, karst cave stochastic model generation, SLIDE simulation, and bisection method optimization. Borehole investigations are performed, and the statistical result shows that the length of the karst cave fits a negative exponential distribution model, but the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and acceptance-rejection method are used to reproduce the length of the karst cave and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst cave on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
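The two sampling schemes mentioned can be sketched as follows; the mean cave length, the assumed empirical density for the carbonatite, and the envelope constant are placeholders rather than values fitted from the borehole data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Karst cave lengths: negative exponential (inverse transform method).
mean_cave_len = 2.5                                  # m, illustrative
u = rng.uniform(size=1000)
cave_lengths = -mean_cave_len * np.log(1.0 - u)      # F^-1(u) for Exp(1/mean)

# Carbonatite lengths: no standard distribution fits, so sample an assumed
# empirical density f(x) on [0, 20] m by acceptance-rejection against a
# uniform envelope.
def f(x):                                            # assumed empirical density shape
    return 0.08 * np.exp(-((x - 6.0) ** 2) / 18.0) + 0.02

x_max, f_max = 20.0, 0.11                            # f_max bounds f(x) on [0, x_max]
carbonatite = []
while len(carbonatite) < 1000:
    x, v = rng.uniform(0, x_max), rng.uniform(0, f_max)
    if v <= f(x):                                    # accept with probability f(x)/f_max
        carbonatite.append(x)

print(f"cave mean = {cave_lengths.mean():.2f} m, "
      f"carbonatite mean = {np.mean(carbonatite):.2f} m")
```

Generated lengths like these would then be assembled into stochastic cave realizations and passed to the limit-equilibrium slope runs.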
Nelson, Jon P
2014-01-01
Precise estimates of price elasticities are important for alcohol tax policy. Using meta-analysis, this paper corrects average beer elasticities for heterogeneity, dependence, and publication selection bias. A sample of 191 estimates is obtained from 114 primary studies. Simple and weighted means are reported. Dependence is addressed by restricting number of estimates per study, author-restricted samples, and author-specific variables. Publication bias is addressed using funnel graph, trim-and-fill, and Egger's intercept model. Heterogeneity and selection bias are examined jointly in meta-regressions containing moderator variables for econometric methodology, primary data, and precision of estimates. Results for fixed- and random-effects regressions are reported. Country-specific effects and sample time periods are unimportant, but several methodology variables help explain the dispersion of estimates. In models that correct for selection bias and heterogeneity, the average beer price elasticity is about -0.20, which is less elastic by 50% compared to values commonly used in alcohol tax policy simulations. Copyright © 2013 Elsevier B.V. All rights reserved.
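For readers unfamiliar with the publication-bias step, a minimal sketch of Egger's intercept test is given below on simulated elasticity estimates; the data are synthetic and this simple regression is only one of the corrections applied in the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic stand-in for 191 beer price-elasticity estimates with their SEs.
se = rng.uniform(0.03, 0.30, size=191)
effects = rng.normal(-0.20, se)          # true mean near -0.20, no bias injected

# Egger's intercept test: regress the standard normal deviate (effect / SE)
# on precision (1 / SE); an intercept far from zero signals funnel asymmetry.
snd, precision = effects / se, 1.0 / se
slope, intercept, r, p_slope, stderr = stats.linregress(precision, snd)

print(f"Egger intercept = {intercept:.3f} (near 0 suggests little small-study asymmetry)")
print(f"slope = {slope:.3f} (interpretable as a bias-adjusted summary elasticity)")
```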
Ares, Florencia; Arrarte, Eloísa; De León, Tania; Ares, Gastón; Gámbaro, Adriana
2012-10-01
Sensory characteristics play a key role in determining consumers' acceptance of functional foods. In this context, the aim of the present work was to apply a combination of sensory and consumer methodologies to the development of chocolate milk desserts enriched with resistant starch. Chocolate milk desserts containing modified waxy maize starch were formulated with six different concentrations of two types of resistant starch (which are part of insoluble dietary fiber). The desserts were evaluated by trained assessors using Quantitative Descriptive Analysis. Moreover, consumers scored their overall liking and willingness to purchase and answered an open-ended question. Resistant starch caused significant changes in the sensory characteristics of the desserts and a significant decrease in consumers' overall liking and willingness to purchase. Consumer data were analyzed by applying survival analysis to the overall liking scores, considering the risk of consumers liking and being willing to purchase the functional products less than their regular counterparts. The proposed methodologies proved to be useful for developing functional foods that take consumers' perception into account, which could increase their success in the market.
PCP METHODOLOGY FOR DETERMINING DOSE RATES FOR SMALL GRAM QUANTITIES IN SHIPPING PACKAGINGS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nathan, S.
The Small Gram Quantity (SGQ) concept is based on the understanding that small amounts of hazardous materials, in this case radioactive materials, are significantly less hazardous than large amounts of the same materials. This study describes a methodology designed to estimate an SGQ for several neutron- and gamma-emitting isotopes that can be shipped in a package compliant with 10 CFR Part 71 external radiation level limits. These regulations require that packaging for the shipment of radioactive materials perform, under both normal and accident conditions, the essential functions of material containment and subcriticality, and maintain external radiation levels within regulatory limits. 10 CFR 71.33(b)(1), (2), and (3) state that radioactive and fissile materials must be identified and that their maximum quantity and chemical and physical forms be included in an application. Furthermore, the U.S. Federal Regulations require that an application contain an evaluation demonstrating that the package (i.e., the packaging and its contents) satisfies the external radiation standards for all packages (10 CFR 71.31(2), 71.35(a), and 71.47). By placing the contents in a He leak-tight containment vessel, and limiting the mass to ensure subcriticality, the first two essential functions are readily met. Some isotopes emit sufficiently strong photon radiation that small amounts of material can yield a large external dose rate. Quantifying the dose rate for a proposed content is a challenging issue for the SGQ approach. It is essential to quantify external radiation levels from several common gamma and neutron sources that can be safely placed in a specific packaging, to ensure compliance with federal regulations. The Packaging Certification Program (PCP) Methodology for Determining Dose Rate for Small Gram Quantities in Shipping Packagings described in this report provides bounding mass limits for a set of proposed SGQ isotopes. Methodology calculations were performed to estimate external radiation levels for the 9977 shipping package using the MCNP radiation transport code to develop a set of response multipliers (Green's functions) for 'dose per particle' for each neutron and photon spectral group. The source spectrum for each isotope, generated using the ORIGEN-S and RASTA computer codes, was folded with the response multipliers to generate the dose rate per gram of each isotope in the 9977 shipping package and its associated shielded containers. The maximum amount of a single isotope that could be shipped within the regulatory limits contained in 10 CFR 71.47 for dose rate at the surface of the package is determined. If a package contains a mixture of isotopes, the acceptability for shipment can be determined by a sum-of-fractions approach. Furthermore, the results of this analysis can be easily extended to additional radioisotopes by simply evaluating the neutron and/or photon spectra of those isotopes and folding the spectral data with the Green's functions provided.
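The folding and sum-of-fractions logic can be sketched in a few lines; the response multipliers, spectra, and isotope names below are placeholders, not values from the PCP analysis, and only the 200 mrem/h non-exclusive-use surface limit is taken from 10 CFR 71.47:

```python
import numpy as np

# Illustrative "dose per source particle" response multipliers (Green's
# functions) for 4 photon energy groups at the package surface, and a
# per-gram source spectrum for two hypothetical isotopes.
response_mrem_per_particle = np.array([1e-10, 3e-10, 8e-10, 2e-9])

spectra_particles_per_g = {
    "isotope_A": np.array([1e8, 5e7, 1e7, 1e6]),
    "isotope_B": np.array([2e7, 2e7, 2e7, 5e6]),
}

dose_limit_mrem_h = 200.0            # 10 CFR 71.47 surface dose-rate limit

# Fold each spectrum with the response to get dose rate per gram, then the
# bounding single-isotope mass; a mixture is checked by a sum of fractions.
dose_per_g = {k: float(s @ response_mrem_per_particle)
              for k, s in spectra_particles_per_g.items()}
mass_limit_g = {k: dose_limit_mrem_h / d for k, d in dose_per_g.items()}

contents_g = {"isotope_A": 100.0, "isotope_B": 500.0}
fraction_sum = sum(contents_g[k] / mass_limit_g[k] for k in contents_g)
print(mass_limit_g, f"sum of fractions = {fraction_sum:.2f} (must be <= 1)")
```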
Jacobo-Velázquez, D A; Ramos-Parra, P A; Hernández-Brenes, C
2010-08-01
High hydrostatic pressure (HHP) pasteurized and refrigerated avocado and mango pulps contain lower microbial counts and thus are safer and acceptable for human consumption for a longer period of time, when compared to fresh unprocessed pulps. However, during their commercial shelf life, changes in their sensory characteristics take place and eventually produce the rejection of these products by consumers. Therefore, in the present study, the use of sensory evaluation was proposed for the shelf-life determinations of HHP-processed avocado and mango pulps. The study focused on evaluating the feasibility of applying survival analysis methodology to the data generated by consumers in order to determine the sensory shelf lives of both HHP-treated pulps of avocado and mango. Survival analysis proved to be an effective methodology for the estimation of the sensory shelf life of avocado and mango pulps processed with HHP, with potential application for other pressurized products. Practical Application: At present, HHP processing is one of the most effective alternatives for the commercial nonthermal pasteurization of fresh tropical fruits. HHP processing improves the microbial stability of the fruit pulps significantly; however, the products continue to deteriorate during their refrigerated storage mainly due to the action of residual detrimental enzymes. This article proposes the application of survival analysis methodology for the determination of the sensory shelf life of HHP-treated avocado and mango pulps. Results demonstrated that the procedure appears to be simple and practical for the sensory shelf-life determination of HHP-treated foods when their main mode of failure is not caused by increases in microbiological counts that can affect human health.
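A minimal sketch of the survival-analysis step, using the lifelines Kaplan-Meier estimator on hypothetical consumer rejection times, is shown below; published sensory shelf-life methodology typically treats such data as interval-censored, which this simplified example ignores, and the numbers are invented:

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Hypothetical storage times (days) at which each consumer first rejected the
# HHP-treated pulp; 0 = still accepted at the last evaluated time (censored).
days =     np.array([14, 21, 21, 28, 28, 28, 35, 35, 42, 42, 42, 42])
rejected = np.array([ 1,  1,  1,  1,  1,  0,  1,  0,  1,  0,  0,  0])

kmf = KaplanMeierFitter()
kmf.fit(durations=days, event_observed=rejected)

# Sensory shelf life can then be read off as the storage time at which a chosen
# proportion of consumers (e.g., 25% or 50%) is predicted to reject the product.
print(kmf.median_survival_time_)
print(kmf.survival_function_)
```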
Methodologies and Methods for User Behavioral Research.
ERIC Educational Resources Information Center
Wang, Peiling
1999-01-01
Discusses methodological issues in empirical studies of information-related behavior in six specific research areas: information needs and uses; information seeking; relevance judgment; online searching (including online public access catalog, online database, and the Web); human-system interactions; and reference transactions. (Contains 191…
Standard methodologies for virus research in Apis mellifera
USDA-ARS?s Scientific Manuscript database
The international research network COLOSS (Prevention of honey bee COlony LOSSes) was established to coordinate efforts towards improving the health of the western honey bee at the global level. The COLOSS BEEBOOK contains a collection of chapters intended to standardize methodologies for monitoring ...
Standard methodologies for Nosema apis and N. ceranae research
USDA-ARS?s Scientific Manuscript database
The international research network COLOSS (Prevention of honey bee COlony LOSSes) was established to coordinate efforts towards improving the health of the western honey bee at the global level. The COLOSS BEEBOOK contains a collection of chapters intended to standardize methodologies for monitoring ...
1986-01-01
... by sensors in the test cell and sampled, digitized, averaged, and calibrated by the facility computer system. The data included flowrates calculated ... before the next test could be started; this required about 2 minutes. Section 6.4, Combat Damage Testing: Appendix C contains calculations and analysis ... were comparable (Figure 7-5). Agent quantities required per MIL-E-22285 were again calculated using the equations noted in paragraph 7.1.1.
Divergent synthesis and identification of the cellular targets of deoxyelephantopins
NASA Astrophysics Data System (ADS)
Lagoutte, Roman; Serba, Christelle; Abegg, Daniel; Hoch, Dominic G.; Adibekian, Alexander; Winssinger, Nicolas
2016-08-01
Herbal extracts containing sesquiterpene lactones have been extensively used in traditional medicine and are known to be rich in α,β-unsaturated functionalities that can covalently engage target proteins. Here we report synthetic methodologies to access analogues of deoxyelephantopin, a sesquiterpene lactone with anticancer properties. Using alkyne-tagged cellular probes and quantitative proteomics analysis, we identified several cellular targets of deoxyelephantopin. We further demonstrate that deoxyelephantopin antagonizes PPARγ activity in situ via covalent engagement of a cysteine residue in the zinc-finger motif of this nuclear receptor.
Automated determination of dust particles trajectories in the coma of comet 67P
NASA Astrophysics Data System (ADS)
Marín-Yaseli de la Parra, J.; Küppers, M.; Perez Lopez, F.; Besse, S.; Moissl, R.
2017-09-01
During the more than two years Rosetta spent at comet 67P, it took thousands of images that contain individual dust particles. To arrive at statistics of the dust properties, automatic image analysis is required. We present a new methodology for fast dust identification using a star mask reference system for matching a set of images automatically. The main goal is to derive particle size distributions and to determine whether traces of the size distribution of primordial pebbles are still present in today's cometary dust [1].
From Patient Discharge Summaries to an Ontology for Psychiatry.
Richard, Marion; Aimé, Xavier; Jaulent, Marie-Christine; Krebs, Marie-Odile; Charlet, Jean
2017-01-01
Psychiatry aims at detecting symptoms, providing diagnoses and treating mental disorders. We developed ONTOPSYCHIA, an ontology for psychiatry in three modules: social and environmental factors of mental disorders, mental disorders, and treatments. The use of ONTOPSYCHIA, associated with dedicated tools, will facilitate semantic research in Patient Discharge Summaries (PDS). To develop the first module of the ontology we propose a PDS text analysis in order to make psychiatry concepts explicit. We decided to set aside classifications during the construction of the module, to focus only on the information contained in PDS (bottom-up approach), and to return to domain classifications solely for the enrichment phase (top-down approach). We then focused our work on the development of the LOVMI methodology (Les Ontologies Validées par Méthode Interactive - Ontologies Validated by Interactive Method), which aims to provide a methodological framework to validate the structure and the semantics of an ontology.
NASA Astrophysics Data System (ADS)
Grasel, Fábio dos Santos; Ferrão, Marco Flôres; Wolf, Carlos Rodolfo
2016-01-01
Tannins are polyphenolic compounds of complex structure formed by secondary metabolism in several plants. These polyphenolic compounds have different applications, such as drugs, anti-corrosion agents, flocculants, and tanning agents. This study analyses six different types of polyphenolic extracts by Fourier transform infrared spectroscopy (FTIR) combined with multivariate analysis. Through both principal component analysis (PCA) and hierarchical cluster analysis (HCA), we observed well-defined separation between condensed (quebracho and black wattle) and hydrolysable (valonea, chestnut, myrobalan, and tara) tannins. For hydrolysable tannins, it was also possible to observe the formation of two different subgroups, one between samples of chestnut and valonea and the other between samples of tara and myrobalan. Among all samples analysed, the chestnut and valonea extracts showed the greatest similarity, indicating that these extracts have similar chemical compositions and structures and, therefore, similar properties.
DOT National Transportation Integrated Search
2010-02-01
This project developed a methodology to couple a new pollutant dispersion model with a traffic assignment process to contain air pollution while maximizing mobility. The overall objective of the air quality modeling part of the project is to deve...
Detection of Cyanotoxins During Potable Water Treatment
USDA-ARS?s Scientific Manuscript database
In 2007, the U.S. EPA listed three cyanobacterial toxins on the CCL3 (Contaminant Candidate List 3) priority list for potable drinking waters. This paper describes all methodologies used for detection of these toxins, and assesses each on a cost/benefit basis. Methodologies for microcystin, cylindrospermopsin, and a...
2016-06-01
... characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency ... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA ... rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira
FY 1998 Proposed Rail Improvement Program Supplement
DOT National Transportation Integrated Search
1997-01-01
This FY 1998 Proposed Rail Improvement Program Supplement contains those rail plan amendments which have been published subsequent to the FY 1997 Proposed Rail Improvement program supplement. This document also contains the benefit/cost methodology u...
Consistency and accuracy of indexing systematic review articles and meta-analyses in medline.
Wilczynski, Nancy L; Haynes, R Brian
2009-09-01
Systematic review articles support the advance of science and translation of research evidence into healthcare practice. Inaccurate retrieval from medline could limit access to reviews. To determine the quality of indexing systematic reviews and meta-analyses in medline. The Clinical Hedges Database, containing the results of a hand search of 161 journals, was used to test medline indexing terms for their ability to retrieve systematic reviews that met predefined methodologic criteria (labelled as 'pass' review articles) and reviews that reported a meta-analysis. The Clinical Hedges Database contained 49 028 articles; 753 were 'pass' review articles (552 with a meta-analysis). In total 758 review articles (independent of whether they passed) reported a meta-analysis. The search strategy that retrieved the highest number of 'pass' systematic reviews achieved a sensitivity of 97.1%. The publication type 'meta analysis' had a false positive rate of 5.6% (95% CI 3.9 to 7.6), and false negative rate of 0.31% (95% CI 0.26 to 0.36) for retrieving systematic reviews that reported a meta-analysis. Inaccuracies in indexing systematic reviews and meta-analyses in medline can be partly overcome by a 5-term search strategy. Introducing a publication type for systematic reviews of the literature could improve retrieval performance.
Capillary isotachophoresis for the analysis of ionic liquid entities.
Markowska, Aleksandra; Stepnowski, Piotr
2010-07-01
Simple, selective and sensitive isotachophoretic methods for the analysis of ionic liquid (IL) entities were developed in this study. A leading electrolyte containing 10 mM L-histidine + 10 mM histidine hydrochloride and a terminating electrolyte containing 5 mM glutamic acid + 5 mM L-histidine were selected to separate nitrate(V), chlorate(V), hexafluorophosphate, dicyanimide, trifluoromethanesulfonate, phosphate(V) and bis(trifluoromethanesulfonyl)imide in anionic mode. In contrast, seven short-chain alkylimidazolium, alkylpyrrolidinium, alkylpyridinium and non-chromophoric tetraalkylammonium and tetraalkylphosphonium IL cations were separated with 10 mM potassium hydroxide + 10 mM acetic acid as the leading electrolyte and 10 mM beta-alanine + 10 mM acetate as the terminating electrolyte. Both methods were optimized and validated with good analytical performance parameters. The LOD was about 3-5 microM, and the repeatability lay in the range of 1.06-5.59%. These methods were evaluated for their applicability to the analysis of soil samples and freshwater contaminated with ILs. Given the absence to date of reports on the determination of non-chromophoric IL cations, this study delivers for the first time a universal method enabling the analysis of these species. Moreover, as there is still a significant lack of methodologies for IL anion analysis, the obtained results offer an interesting alternative in that respect.
Training Methodology. Part 3: Instructional Methods and Techniques; an Annotated Bibliography.
ERIC Educational Resources Information Center
National Inst. of Mental Health (DHEW), Bethesda, MD.
One of a series of bibliographies within a larger series on mental health inservice training and training methodology, this publication contains 346 abstracts, annotations, and other recent selected references (largely 1960-68) on apprenticeship, coaching, programmed instruction, correspondence study, lectures, group discussion, meetings,…
Harmonizing Automatic Test System Assets, Drivers, and Control Methodologies
1999-07-18
[Recovered fragments from the source: a table listing organizations and their principal areas of interest to automatic test systems (ATS), e.g., the 1394 Trade Association (Firewire), defining a high-speed bus protocol, and the Active Group, accelerating ActiveX; followed by a truncated passage dated 17 July 1999 noting that a component is a diagonal matrix containing scaling values such that, when the three…]
Francis, Andrew J; Resendiz, Marino J E
2017-07-28
Solid-phase synthesis has been used to obtain canonical and modified polymers of nucleic acids, specifically of DNA or RNA, which has made it a popular methodology for applications in various fields and for different research purposes. The procedure described herein focuses on the synthesis, purification, and characterization of dodecamers of RNA 5'-[CUA CGG AAU CAU]-3' containing zero, one, or two modifications located at the C2'-O-position. The probes are based on 2-thiophenylmethyl groups, incorporated into RNA nucleotides via standard organic synthesis and introduced into the corresponding oligonucleotides via their respective phosphoramidites. This report makes use of phosphoramidite chemistry with the four canonical nucleobases (Uridine (U), Cytosine (C), Guanosine (G), Adenosine (A)), as well as 2-thiophenylmethyl functionalized nucleotides modified at the 2'-O-position; however, the methodology is amenable to a large variety of modifications that have been developed over the years. The oligonucleotides were synthesized on a controlled-pore glass (CPG) support followed by cleavage from the resin and deprotection under standard conditions, i.e., a mixture of ammonia and methylamine (AMA) followed by hydrogen fluoride/triethylamine/N-methylpyrrolidinone. The corresponding oligonucleotides were purified via polyacrylamide electrophoresis (20% denaturing) followed by elution, desalting, and isolation via reversed-phase chromatography (Sep-pak, C18-column). Quantification and structural parameters were assessed via ultraviolet-visible (UV-vis) and circular dichroism (CD) photometric analysis, respectively. This report aims to serve as a resource and guide for beginner and expert researchers interested in embarking on work in this field. It is expected to serve as a work-in-progress as new technologies and methodologies are developed. The description of the methodologies and techniques within this document corresponds to a DNA/RNA synthesizer (refurbished and purchased in 2013) that uses phosphoramidite chemistry.
In-patient costs of agitation and containment in a mental health catchment area.
Serrano-Blanco, Antoni; Rubio-Valera, Maria; Aznar-Lou, Ignacio; Baladón Higuera, Luisa; Gibert, Karina; Gracia Canales, Alfredo; Kaskens, Lisette; Ortiz, José Miguel; Salvador-Carulla, Luis
2017-06-06
Studies on the cost of agitation and containment interventions are scarce, and their results remain inconclusive. We aimed to calculate the economic consequences of agitation events in an in-patient psychiatric facility providing care for an urban catchment area. A mixed approach combining secondary analysis of clinical databases, surveys and expert knowledge was used to model the 2013 direct costs of agitation and containment events for adult inpatients with mental disorders in an area of 640,572 adult inhabitants in South Barcelona (Spain). To calculate costs, a seven-step methodology with a novel definition of agitation was used along with a staff survey, a database of containment events, and data on aggressive incidents. A micro-costing analysis of specific containment interventions was used to estimate both prevalence and direct costs from the healthcare provider perspective, by means of a mixed approach with a probabilistic model evaluated on real data. Due to the complex interaction of the multivariate covariances, a sensitivity analysis was conducted to obtain empirical bounds on the variability. During 2013, 918 patients were admitted to the Acute Inpatient Unit. Of these, 52.8% were men, with a mean age of 44.6 years (SD = 15.5), 74.4% were compulsory admissions, 40.1% were diagnosed with schizophrenia or non-affective psychosis, with a mean length of stay of 24.6 days (SD = 16.9). The annual estimate of total agitation events was 508. The cost of containment interventions ranges from 282€ at the lowest level of agitation to 822€ when verbal containment plus seclusion and restraint have to be used. The annual total cost of agitation was 280,535€, representing 6.87% of the total costs of acute hospitalisation in the local area. Agitation events are frequent and costly. Strategies to reduce their number and severity should be implemented to reduce costs to the Health System and alleviate patient suffering.
Analysis of methods. [information systems evolution environment
NASA Technical Reports Server (NTRS)
Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.
1991-01-01
Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.
Valuation effects of health cost containment measures.
Strange, M L; Ezzell, J R
2000-01-01
This study reports the findings of research into the valuation effects of health cost containment activities by publicly traded corporations. The motivation for this study was employers' increasing cost of providing health care insurance to their employees and employers' efforts to contain those costs. A 1990 survey of corporate health benefits indicated that these costs represented 25 percent of employers' net earnings and would rise further by the year 2000 if no action were taken to reduce them. Health cost containment programs implemented by firms should be seen by shareholders as a wealth-maximizing effort and, as such, should be reflected in share price. This study employed standard event study methodology, where the event is a media announcement or report regarding an attempt by a firm to contain the costs of providing health insurance and other health related benefits to employees. It examined abnormal returns on a number of event days and for a number of event intervals. Of the daily and interval returns that are at least significant at the 10 percent level, virtually all are negative. Cross-sectional analysis shows that the abnormal returns are related negatively to a unionization variable.
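As a rough illustration of the standard event-study methodology the abstract relies on, the sketch below fits a market model over an estimation window and then computes abnormal and cumulative abnormal returns over an event window. All data here are synthetic, and the window lengths and market-model form are assumptions rather than the study's specification.

```python
# Minimal sketch of an event-study calculation: fit a market model over an estimation
# window, then compute abnormal returns (AR) and cumulative abnormal returns (CAR)
# over an event window. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 250)                     # daily market returns
stock = 0.0002 + 1.1 * market + rng.normal(0, 0.01, 250)   # a stock tracking the market

est, event = slice(0, 200), slice(200, 221)                # estimation and event windows
beta, alpha = np.polyfit(market[est], stock[est], 1)       # market-model regression

expected = alpha + beta * market[event]
abnormal = stock[event] - expected                         # AR_t
car = abnormal.cumsum()                                    # CAR over the event window
t_stat = abnormal.mean() / (abnormal.std(ddof=1) / np.sqrt(abnormal.size))
print(f"mean AR={abnormal.mean():.5f}  CAR={car[-1]:.5f}  t={t_stat:.2f}")
```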
Evaluation of soil water stable isotope analysis by H2O(liquid)-H2O(vapor) equilibration method
NASA Astrophysics Data System (ADS)
Gralher, Benjamin; Stumpp, Christine
2014-05-01
Environmental tracers like stable isotopes of water (δ18O, δ2H) have proven to be valuable tools to study water flow and transport processes in soils. Recently, a new technique for soil water isotope analysis has been developed that employs a vapor phase in isothermal equilibrium with the liquid phase of interest. This has increased the potential application of water stable isotopes in unsaturated zone studies as it supersedes the laborious extraction of soil water. However, uncertainties of analysis and influencing factors need to be considered. Therefore, the objective of this study was to evaluate different methodologies of analysing stable isotopes in soil water in order to reduce measurement uncertainty. The methodologies included different preparation procedures of soil cores for equilibration of vapor and soil water as well as raw data correction. Two different inflatable sample containers (freezer bags, bags containing a metal layer) and equilibration atmospheres (N2, dry air) were tested. The results showed that uncertainties for δ18O were higher than for δ2H, which could not be attributed to any specific detail of the processing routine. In particular, soil samples with high contents of organic matter showed an apparent isotope enrichment which is indicative of fractionation due to evaporation. However, comparison of water samples obtained from suction cups with the local meteoric water line indicated negligible fractionation processes in the investigated soils. Therefore, a method was developed to correct the raw data, reducing the uncertainties of the analysis. We conclude that the evaluated method is advantageous over traditional methods regarding simplicity, resource requirements and sample throughput, but careful consideration needs to be given to sample handling and data processing. Thus, stable isotopes of water remain a good tool to determine water flow and transport processes in the unsaturated zone.
The comparison of various approach to evaluation erosion risks and design control erosion measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present, a single methodology is used in the Czech Republic to compute and compare erosion risks; it also includes a method for designing erosion control measures. The basis of this methodology is the Universal Soil Loss Equation (USLE) and its result, the long-term average annual rate of erosion (G), and it is the tool used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show, however, that much of the trouble and damage comes from local erosion episodes. The extent of these events and their impact depend on local precipitation, the current crop growth stage, and soil conditions. Such erosion events can damage agricultural land, municipal property and hydraulic structures even at locations that appear to be in good condition from the point of view of the long-term average annual rate of erosion. An alternative way to compute and compare erosion risks is an episode-based approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events have been recorded. The study area is a simple agricultural parcel without any barriers that could strongly influence water flow and sediment transport. The computation of erosion risks (for all methodologies) was based on laboratory analysis of soil samples collected in the study area. Results of the USLE and MUSLE methodologies and results from the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the places with the highest soil erosion are compared and discussed. A further part presents differences in the designed erosion control measures when the design is based on different methodologies. The results show how the computed erosion risks vary with the methodology used. These differences can open a discussion about the different approaches to computing and evaluating erosion risks in areas of different importance.
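For reference, the USLE at the core of the long-term methodology is a simple multiplicative model, A = R * K * LS * C * P. The sketch below computes it with placeholder factor values; none of the numbers come from the paper, and the Czech methodology's specific parameterization is not reproduced here.

```python
# Minimal sketch of the USLE long-term average annual soil loss: A = R * K * LS * C * P.
# Parameter values below are placeholders, not values from the Czech methodology.
def usle_soil_loss(R, K, LS, C, P):
    """Long-term average annual soil loss A (t/ha/yr) from the USLE factors:
    R rainfall erosivity, K soil erodibility, LS slope length/steepness,
    C cover-management, P support practice."""
    return R * K * LS * C * P

A = usle_soil_loss(R=40.0, K=0.35, LS=1.8, C=0.25, P=1.0)
print(f"Estimated average annual soil loss: {A:.2f} t/ha/yr")
```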
O'Brien, Daniel Tumminelli; Montgomery, Barrett W
2015-03-01
Much research has focused on physical disorder in urban neighborhoods as evidence that the community does not maintain local norms and spaces. Little attention has been paid to the opposite: indicators of proactive investment in the neighborhood's upkeep. This manuscript presents a methodology that translates a database of approved building permits into an ecometric of investment by community members, establishing basic content, criteria for reliability, and construct validity. A database from Boston, MA contained 150,493 permits spanning 2.5 years, each record including the property to be modified, permit type, and date issued. Investment was operationalized as the proportion of properties in a census block group that underwent an addition or renovation, excluding larger developments involving the demolition or construction of a building. The reliability analysis found that robust measures could be generated every 6 months, and that longitudinal analysis could differentiate between trajectories across neighborhoods. The validity analysis supported two hypotheses: investment was best predicted by homeownership and median income; and maintained an independent relationship with measures of physical disorder despite controlling for demographics, implying that it captures the other end of a spectrum of neighborhood maintenance. Possible uses for the measure in research and policy are discussed.
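A minimal sketch of how such an investment ecometric could be computed from a permit table follows, assuming hypothetical column names (block_group, property_id, permit_type) and toy data; the actual Boston database schema is not described in the abstract.

```python
# Minimal sketch of the investment ecometric: the proportion of properties in each
# census block group that underwent an addition or renovation in a time window.
import pandas as pd

permits = pd.DataFrame({
    "block_group": ["A", "A", "B", "B", "B"],
    "property_id": [1, 2, 10, 11, 11],
    "permit_type": ["renovation", "new_construction", "addition", "renovation", "demolition"],
})
properties_per_bg = pd.Series({"A": 40, "B": 55})  # denominator: all parcels in each block group

# Keep only additions/renovations, count unique properties, divide by parcel counts.
invest = (permits[permits["permit_type"].isin(["addition", "renovation"])]
          .groupby("block_group")["property_id"].nunique())
investment_rate = (invest / properties_per_bg).fillna(0.0)
print(investment_rate)
```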
DOE Office of Scientific and Technical Information (OSTI.GOV)
Denys, R.M.; Martin, J.T.
1995-02-01
Modern pipeline standards contain alternative methodologies for determining the acceptable defect size in pipeline welds. Through the use of fracture mechanics and plastic collapse assessments, the mechanical and toughness properties of the defective region relate to the applied stress at the defect and the defect geometry. The assumptions made in these methodologies are not always representative of the situation occurring in pipeline girth welds. To determine the effect of the various input parameters on acceptable defect size, the Welding Supervisory Committee of the American Gas Association commenced in 1990, in collaboration with the Laboratorium Soete of the University of Gent, Belgium, a series of small scale (Charpy V impact and CTOD) and large scale (fatigue pre-cracked wide plate) tests. All the experimental investigations were intended to evaluate the effects of weld metal mis-match, temperature, defect size, defect type, defect interaction, pipe wall thickness and yield to tensile ratio on girth weld fracture behaviour. The aim of this report was to determine how weld metal yield strength overmatching or undermatching influences girth weld defect size prediction. A further analysis was conducted using the newly revised PD6493:1991 to provide a critical analysis with the objective of explaining the behaviour of the wide plate tests.
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES. A thesis by Amanda Donnelly. The work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios and…
A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.
In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. Studied in this paper was the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers and eigenvalue calculations.
Progressive Fracture of Composite Structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon
2001-01-01
This report includes the results of research in which the COmposite Durability STRuctural ANalysis (CODSTRAN) computational simulation capabilities were augmented and applied to various structures for demonstration of the new features and for verification. The first chapter of this report provides an introduction to the computational simulation, or virtual laboratory, approach for the assessment of damage and fracture progression characteristics in composite structures. The second chapter outlines the details of the overall methodology used, including the failure criteria and the incremental/iterative loading procedure with the definitions of damage, fracture, and equilibrium states. The subsequent chapters each contain an augmented feature of the code and/or demonstration examples. All but one of the presented examples contain laminated composite structures with various fiber/matrix constituents. For each structure simulated, damage initiation and progression mechanisms are identified and the structural damage tolerance is quantified at various degradation stages. Many chapters contain simulations of defective and defect-free structures to evaluate the effects of existing defects on structural durability.
The effect of docetaxel on developing oedema in patients with breast cancer: a systematic review.
Hugenholtz-Wamsteker, W; Robbeson, C; Nijs, J; Hoelen, W; Meeus, M
2016-03-01
Docetaxel is extensively used in chemotherapy for the treatment of breast cancer. Little attention has been given to oedema as a possible side effect of docetaxel-containing therapies. Until now, no review was conducted to evaluate docetaxel-containing therapies versus docetaxel-free therapies on the magnitude of the risk of developing oedema. In this systematic review, we investigated the risk of developing oedema in patients being treated for breast cancer with or without docetaxel. In this systematic literature review, we searched PubMed and Web of Knowledge for studies on breast cancer patients treated with chemotherapy containing docetaxel. We included clinical trials comparing docetaxel versus docetaxel-free chemotherapy. Oedema had to be reported and measured as a key outcome or an adverse effect. Methodological checklists were used to assess the risk of bias within the selected studies. Seven randomised clinical trials were included. Six trials were of moderate methodological quality. All trials showed an increased rate of oedema in the docetaxel-treatment arm. The trial of weakest methodological quality reported the highest incidence of oedema. The results moderately suggest that adjuvant chemotherapy containing docetaxel is related to a significantly increased risk of developing oedema, compared with docetaxel-free chemotherapy. © 2014 John Wiley & Sons Ltd.
Fardin-Kia, Ali Reza; Delmonte, Pierluigi; Kramer, John K G; Jahreis, Gerhard; Kuhnt, Katrin; Santercole, Viviana; Rader, Jeanne I
2013-12-01
The fatty acids contained in marine oils or products are traditionally analyzed by gas chromatography using capillary columns coated with polyethylene glycol phases. Recent reports indicate that 100 % cyanopropyl siloxane phases should also be used when the analyzed samples contain trans fatty acids. We investigated the separation of the fatty acid methyl esters prepared from menhaden oil using the more polar SLB-IL111 (200 m × 0.25 mm) ionic liquid capillary column and the chromatographic conditions previously optimized for the separation of the complex mixture of fatty acid methyl esters prepared from milk fat. Identifications of fatty acids were achieved by applying Ag(+)-HPLC fractionation and GC-TOF/MS analysis in CI(+) mode with isobutane as the ionization reagent. Calculation of equivalent chain lengths confirmed the assignment of double bond positions. This methodology allowed the identification of 125 fatty acids in menhaden oil, including isoprenoid and furanoid fatty acids, and the novel 7-methyl-6-hexadecenoic and 7-methyl-6-octadecenoic fatty acids. The chromatographic conditions applied in this study showed the potential of separating in a single 90-min analysis, among others, the short chain and trans fatty acids contained in dairy products, and the polyunsaturated fatty acids contained in marine products.
Discourse Analysis and the Study of Educational Leadership
ERIC Educational Resources Information Center
Anderson, Gary; Mungal, Angus Shiva
2015-01-01
Purpose: The purpose of this paper is to provide an overview of the current and past work using discourse analysis in the field of educational administration and of discourse analysis as a methodology. Design/Methodology/Approach: Authors reviewed research in educational leadership that uses discourse analysis as a methodology. Findings: While…
76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
DEPARTMENT OF EDUCATION. Federal Need Analysis Methodology for the 2012-2013 Award Year. AGENCY: Federal Student Aid, Department of Education. ACTION: Notice of revision of the Federal Need Analysis… ; 84.268; 84.379]. Federal Need Analysis Methodology for the 2012-2013 award year; Federal Pell Grant…
NASA Technical Reports Server (NTRS)
Oishi, Meeko; Tomlin, Claire; Degani, Asaf
2003-01-01
Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
Rizo-Decelis, L D; Pardo-Igúzquiza, E; Andreo, B
2017-12-15
In order to process and evaluate the available water quality data and fully exploit monitoring results (e.g. characterize regional patterns, optimize monitoring networks, infer conditions at unmonitored locations, etc.), it is crucial to develop improved and efficient methodologies. Accordingly, the estimation of water quality along fluvial ecosystems is a frequent task in environmental studies. In this work, a particular case of this problem is examined, namely, the estimation of water quality along the main stem of a large basin (where most anthropic activity takes place), from observational data measured along this river channel. We adapted topological kriging to this case, where each watershed contains all the watersheds of the upstream observed data (a "nested support effect"). The data analysis was additionally extended by taking into account the upstream distance to the closest contamination hotspot as an external drift. We propose choosing the best estimation method by cross-validation. This methodological approach to spatial variability modeling may be used for optimizing the water quality monitoring of a given watercourse. The methodology presented is applied to 28 water quality variables measured along the Santiago River in Western Mexico. Copyright © 2017 Elsevier B.V. All rights reserved.
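The cross-validation step used to choose among estimation methods can be illustrated with a leave-one-out comparison. In the sketch below, two off-the-shelf regressors stand in for the competing kriging variants (topological kriging with and without the distance-to-hotspot drift), and the 28-station data are synthetic; this shows only the model-selection logic, not the kriging itself.

```python
# Minimal sketch of leave-one-out cross-validation for choosing a spatial estimator.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(0, 100, size=(28, 2))                        # e.g., distance along river, distance to hotspot
y = 0.05 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 1, 28)    # a water quality variable

loo = LeaveOneOut()
for name, model in [("stand-in for kriging", KNeighborsRegressor(n_neighbors=3)),
                    ("stand-in for kriging + drift", LinearRegression())]:
    mse = -cross_val_score(model, X, y, cv=loo, scoring="neg_mean_squared_error").mean()
    print(f"{name}: LOO cross-validation MSE = {mse:.3f}")
```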
Singh, Pankaj Kumar; Negi, Arvind; Gupta, Pawan Kumar; Chauhan, Monika; Kumar, Raj
2016-08-01
Toxicity is a common drawback of newly designed chemotherapeutic agents. With the exception of pharmacophore-induced toxicity (lack of selectivity at higher concentrations of a drug), the toxicity due to chemotherapeutic agents is based on the toxicophore moiety present in the drug. To date, methodologies implemented to determine toxicophores may be broadly classified into biological, bioanalytical and computational approaches. The biological approach involves analysis of bioactivated metabolites, whereas the computational approach involves QSAR-based methods, mapping techniques, an inverse docking technique and a few toxicophore identification/estimation tools. Being one of the major steps in the drug discovery process, toxicophore identification has proven to be an essential screening step in drug design and development. This paper is the first of its kind to attempt to cover and compare the different methodologies employed in predicting and determining toxicophores, with an emphasis on their scope and limitations. Such information may prove vital in the appropriate selection of methodology and can be used as a screening technology by researchers to discover the toxicophoric potential of their designed and synthesized moieties. Additionally, it can be utilized in the manipulation of molecules containing toxicophores in such a manner that their toxicities might be eliminated or removed.
NASA Astrophysics Data System (ADS)
Ayoobi, Iman; Tangestani, Majid H.
2017-10-01
This study investigates the effect of spatial subsets of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) L1B visible-near infrared and shortwave-infrared (VNIR-SWIR) data on matched filtering results in the central part of the Kerman magmatic arc, where abundant porphyry copper deposits exist. The matched filtering (MF) procedure was run separately at sites containing hydrothermal minerals such as sericite, kaolinite, chlorite, and jarosite to map the abundances of these minerals on spatial subsets containing 100, 75, 50, and 25 percent of the original scene. Results were evaluated by comparing the matched filtering scores with the mineral abundances obtained by semi-quantitative XRD analysis of corresponding field samples. It was concluded that the MF method should be applied to the whole scene prior to any data subsetting.
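Matched filtering scores each pixel spectrum against a target mineral spectrum after whitening by the scene's background mean and covariance, which is why the statistics of the spatial subset matter. A minimal numpy sketch of the standard MF score is given below; the band count and the "sericite-like" target spectrum are synthetic assumptions, not ASTER data.

```python
# Minimal sketch of a matched filter (MF) score for each pixel of a spectral image.
import numpy as np

def matched_filter_scores(pixels, target):
    """pixels: (n_pixels, n_bands); target: (n_bands,). Returns one MF score per pixel."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.pinv(cov)              # pseudo-inverse guards against a singular covariance
    d = target - mu
    w = cov_inv @ d / (d @ cov_inv @ d)        # matched-filter weight vector
    return (pixels - mu) @ w                   # score near 1 for pixels matching the target

rng = np.random.default_rng(2)
scene = rng.normal(0.2, 0.02, size=(5000, 9))  # 9 synthetic VNIR-SWIR bands
sericite_like = scene.mean(axis=0) + 0.05      # hypothetical target spectrum
print(matched_filter_scores(scene, sericite_like)[:5])
```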
Grasel, Fábio dos Santos; Ferrão, Marco Flôres; Wolf, Carlos Rodolfo
2016-01-15
Tannins are polyphenolic compounds of complex structures formed by secondary metabolism in several plants. These polyphenolic compounds have different applications, such as drugs, anti-corrosion agents, flocculants, and tanning agents. This study analyses six different types of polyphenolic extracts by Fourier transform infrared spectroscopy (FTIR) combined with multivariate analysis. Through both principal component analysis (PCA) and hierarchical cluster analysis (HCA), we observed a well-defined separation between condensed (quebracho and black wattle) and hydrolysable (valonea, chestnut, myrobalan, and tara) tannins. For hydrolysable tannins, it was also possible to observe the formation of two different subgroups, between samples of chestnut and valonea and between samples of tara and myrobalan. Among all samples analysed, the chestnut and valonea showed the greatest similarity, indicating that these extracts contain equivalent chemical compositions and structures and, therefore, similar properties. Copyright © 2015 Elsevier B.V. All rights reserved.
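A minimal sketch of the PCA/HCA chemometric workflow on FTIR spectra follows, using synthetic spectra in place of the measured tannin extracts; preprocessing choices (baseline correction, normalization) and the Ward linkage are assumptions rather than the paper's exact settings.

```python
# Minimal sketch: PCA scores plus hierarchical clustering on FTIR-like spectra.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
condensed = rng.normal(0.6, 0.05, size=(6, 400))      # e.g., quebracho and black wattle replicates
hydrolysable = rng.normal(0.3, 0.05, size=(6, 400))   # e.g., chestnut, valonea, myrobalan, tara
spectra = np.vstack([condensed, hydrolysable])

scores = PCA(n_components=2).fit_transform(spectra)   # PC1/PC2 scores for the score plot
tree = linkage(spectra, method="ward")                 # HCA dendrogram structure
clusters = fcluster(tree, t=2, criterion="maxclust")   # cut the dendrogram into two groups
print(scores[:, 0].round(2), clusters)
```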
Study of intensification zones in a rectangular acoustic cavity
NASA Technical Reports Server (NTRS)
Peretti, Linda F.; Dowell, Earl H.
1992-01-01
The interior acoustic field of a rectangular acoustic cavity, which is excited by the structural vibration of one of its walls, or a portion of the wall, has been studied. Particularly, the spatial variations of sound pressure levels from the peak levels at the boundaries (intensification zones) to the uniform interior are considered. Analytical expressions, which describe the intensification zones, are obtained using the methodology of asymptotic modal analysis. These results agree well with results computed by a discrete summation over all of the modes. The intensification zones were also modeled as a set of oblique waves incident upon a surface. The result for a rigid surface agrees with the asymptotic modal analysis result. In the presence of an absorptive surface, the character of the intensification zone is dramatically changed. The behavior of the acoustic field near an absorptive wall is described by an expression containing the rigid wall result plus additional terms containing impedance information. The important parameter in the intensification zone analysis is the bandwidth to center frequency ratio. The effect of bandwidth is separated from that of center frequency by expanding the expression about the center frequency wave number. The contribution from the bandwidth is second order in bandwidth to center frequency ratio.
Methodologies for reducing truck turn time at marine container terminals.
DOT National Transportation Integrated Search
2005-05-01
One of the prominent issues container terminal operators in the US are seeking to address is how to effectively reduce truck turn time. Historically, truck turn time has received very little attention from terminal operators because port congesti...
ERIC Educational Resources Information Center
Gauthier, Benoit; And Others
1997-01-01
Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)
Foreign Language Methodology Conference Workshop Reports, 1976.
ERIC Educational Resources Information Center
Carranza, Jose M., Ed.; Whitmer, Robert L., Ed.
This collection, resulting from a workshop on language teaching methodology, contains the following papers: (1) "The Role of Culture in Foreign Language Learning," by N. Brooks; (2) "Guidelines and Ideas to Boost the Enrollment in Foreign Language Courses," by L.F. Gonzalez-Cruz; (3) "Cooking in the Classroom," by K. Boykin; (4) "Performance Based…
NASA Technical Reports Server (NTRS)
Hargrove, William T.
1991-01-01
This methodology is used to determine inspection procedures and intervals for components contained within tank mounted air compressor systems (TMAC) and base mounted air compressor systems (BMAC). These systems are included in the Pressure Vessel and System Recertification inventory at GSFC.
Meijster, Tim; van Duuren-Stuurman, Birgit; Heederik, Dick; Houba, Remko; Koningsveld, Ernst; Warren, Nicholas; Tielemans, Erik
2011-10-01
Use of cost-benefit analysis in occupational health increases insight into the intervention strategy that maximises the cost-benefit ratio. This study presents a methodological framework identifying the most important elements of a cost-benefit analysis for occupational health settings. One of the main aims of the methodology is to evaluate cost-benefit ratios for different stakeholders (employers, employees and society). The developed methodology was applied to two intervention strategies focused on reducing respiratory diseases. A cost-benefit framework was developed and used to set up a calculation spreadsheet containing the inputs and algorithms required to calculate the costs and benefits for all cost elements. Inputs from a large variety of sources were used to calculate total costs, total benefits, net costs and the benefit-to-cost ratio for both intervention scenarios. Implementation of a covenant intervention program resulted in a net benefit of €16 848 546 over 20 years for a population of 10 000 workers. Implementation was cost-effective for all stakeholders. For a health surveillance scenario, total benefits resulting from a decreased disease burden were estimated to be €44 659 352. The costs of the interventions could not be calculated. This study provides important insights for developing effective intervention strategies in the field of occupational medicine. Use of a model-based approach enables investigation of those parameters most likely to affect the effectiveness and costs of interventions for work-related diseases. Our case study highlights the importance of considering different perspectives (of employers, society and employees) in assessing and sharing the costs and benefits of interventions.
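The stakeholder-specific bookkeeping at the heart of the framework reduces to computing net benefits and benefit-to-cost ratios per perspective. The sketch below shows that structure with placeholder figures; the euro amounts are invented for illustration and are not the study's inputs.

```python
# Minimal sketch of stakeholder-specific cost-benefit bookkeeping: net benefit and
# benefit-to-cost ratio computed separately for each perspective (numbers are placeholders).
costs = {"employer": 1_200_000, "employee": 150_000, "society": 800_000}          # intervention costs (EUR)
benefits = {"employer": 6_000_000, "employee": 4_000_000, "society": 6_800_000}   # avoided disease burden (EUR)

for stakeholder in costs:
    net = benefits[stakeholder] - costs[stakeholder]
    ratio = benefits[stakeholder] / costs[stakeholder]
    print(f"{stakeholder}: net benefit = {net:,} EUR, benefit-to-cost ratio = {ratio:.1f}")
```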
A comprehensive methodology for the multidimensional and synchronic data collecting in soundscape.
Kogan, Pablo; Turra, Bruno; Arenas, Jorge P; Hinalaf, María
2017-02-15
The soundscape paradigm comprises complex living systems in which individuals interact moment-by-moment with one another and with the physical environment. Real environments provide promising conditions to reveal deep soundscape behavior, including the multiple components involved and their interrelations as a whole. However, measuring and analyzing the numerous simultaneous variables of soundscape represents a challenge that is not completely understood. This work proposes and applies a comprehensive methodology for multidimensional and synchronic data collection in soundscape. The soundscape variables were organized into three main entities: experienced environment, acoustic environment, and extra-acoustic environment, containing, in turn, subgroups of variables called components. The variables contained in these components were acquired through synchronic field techniques that include surveys, acoustic measurements, audio recordings, photography, and video. The proposed methodology was tested, optimized, and applied in diverse open environments, including squares, parks, fountains, university campuses, streets, and pedestrian areas. The systematization of this comprehensive methodology provides a framework for soundscape research, support for urban and environmental management, and a preliminary procedure for standardization in soundscape data collection. Copyright © 2016 Elsevier B.V. All rights reserved.
Self-Contained Automated Methodology for Optimal Flow Control
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, Roy A.; Erlebacherl, Gordon; Hussaini, M. Yousuff
1997-01-01
This paper describes a self-contained, automated methodology for active flow control which couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields and controls (e.g., actuators), may be determined. The problem of boundary layer instability suppression through wave cancellation is used as the initial validation case to test the methodology. Here, the objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc. The present methodology has been extended to three dimensions and may potentially be applied to separation control, re-laminarization, and turbulence control applications using one to many sensors and actuators.
An Analysis of Software Design Methodologies
1979-08-01
[Recovered fragments from an OCR-damaged passage:] ...with "and" and "or" symbols, and with explicit indications of iteration. For example, Figure 5a (from Bell et al., 1977) contains a structure chart with logical "and" ("&") and "or" ("+") symbols. Figure 5b illustrates Jackson's (1977) approach, in which asterisks ("*") are... is suggested. Such a data flow graph is illustrated in Figure 14. In this case, "T" is the TRANSACTION CENTER and the "(D" symbol is used to indi...
Soft fruit traceability in food matrices using real-time PCR.
Palmieri, Luisa; Bozza, Elisa; Giongo, Lara
2009-02-01
Food product authentication provides a means of monitoring and identifying products for consumer protection and regulatory compliance. There is a scarcity of analytical methods for confirming the identity of fruit pulp in products containing Soft Fruit. In the present work we have developed a very sensitive qualitative and quantitative method to determine the presence of berry DNA in different food matrices. To our knowledge, this is the first study that shows the applicability, to Soft Fruit traceability, of melting curve analysis and multiplexed fluorescent probes in a Real-Time PCR platform. This methodology aims to protect the consumer from label misrepresentation.
Investigating System Dependability Modeling Using AADL
NASA Technical Reports Server (NTRS)
Hall, Brendan; Driscoll, Kevin R.; Madl, Gabor
2013-01-01
This report describes Architecture Analysis & Design Language (AADL) models for a diverse set of fault-tolerant, embedded data networks and describes the methods and tools used to create these models. It also includes error models per the AADL Error Annex. Some networks were modeled using Error Detection Isolation Containment Types (EDICT). This report gives a brief description of each of the networks, a description of its modeling, the model itself, and evaluations of the tools used for creating the models. The methodology includes a naming convention that supports a systematic way to enumerate all of the potential failure modes.
NASA Technical Reports Server (NTRS)
Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.
2003-01-01
Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
Using a Realist Research Methodology in Policy Analysis
ERIC Educational Resources Information Center
Lourie, Megan; Rata, Elizabeth
2017-01-01
The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…
Precision-Guided Munitions Effects Representation
2017-01-03
...Center for Army Analysis (CAA) by the TRADOC Analysis Center, Monterey (TRAC-MTRY). The focus of the research is to improve the current methodology... [The remainder of the extracted text is table-of-contents residue listing a Methodology section, a Timeline, a MATLAB Code appendix, and Damage assessment material.]
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
A Review of Citation Analysis Methodologies for Collection Management
ERIC Educational Resources Information Center
Hoffmann, Kristin; Doucette, Lise
2012-01-01
While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Strangeness Production in Jets with ALICE at the LHC
NASA Astrophysics Data System (ADS)
Smith, Chrismond; Harton, Austin; Garcia, Edmundo; Alice Collaboration
2016-03-01
The study of strange particle production is an important tool for understanding the properties of the hot and dense QCD medium created in heavy-ion collisions at ultra-relativistic energies. The study of strange particles in these collisions provides information on parton fragmentation, a fundamental QCD process. While measurements at low and intermediate pT are already in progress at the LHC, the study of high-momentum observables is equally important for a complete understanding of the QCD matter; this can be achieved by studying jet interactions. We propose the measurement of the characteristics of jets containing strange particles. Starting with proton-proton collisions, we have calculated the inclusive jet pT spectra and the spectra for jets containing strange particles (K-short or lambda), and we are extending this analysis to lead-lead collisions. In this talk the ALICE experiment will be described, and the methodology used for the data analysis and the available results will be discussed. This material is based upon work supported by the National Science Foundation under Grants PHY-1305280 and PHY-1407051.
Chemical synthesis and NMR characterization of structured polyunsaturated triacylglycerols.
Fauconnot, Laëtitia; Robert, Fabien; Villard, Renaud; Dionisi, Fabiola
2006-02-01
The chemical synthesis of pure triacylglycerol (TAG) regioisomers that contain long chain polyunsaturated fatty acids, such as arachidonic acid (AA) or docosahexaenoic acid (DHA), and saturated fatty acids, such as lauric acid (La) or palmitic acid (P), at defined positions is described. A single-step methodology using (benzotriazol-1-yloxy)-tripyrrolidinophosphonium hexafluorophosphate (PyBOP), an activator of the carboxyl group commonly used in peptide synthesis and occasionally used in carboxylic acid esterification, has been developed for structured TAG synthesis. Identification of the fatty acyl chains for each TAG species was confirmed by atmospheric pressure chemical ionisation mass spectrometry (APCI-MS), and fatty acid positional distribution was determined from (1)H and (13)C NMR spectra. The general procedures described can be applied to a large variety of substrates and were used for the production of specific triacylglycerols of defined molecular structures with high regioisomeric purity. The combination of MS and NMR was shown to be an efficient tool for structural analysis of TAG. In particular, some NMR signals were demonstrated to be regioisomer specific, allowing rapid positional analysis of LC-PUFA-containing TAG.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Felicione, F. S.
2006-01-23
The potential for generation of gases in transuranic (TRU) waste by microbial activity, chemical interactions, corrosion, and radiolysis was addressed in the Argonne National Laboratory-West (ANL-West) Gas-Generation Experiments (GGE). Data was collected over several years by simulating the conditions in the Waste Isolation Pilot Plant (WIPP) after the eventual intrusion of brine into the repository. Fourteen test containers with various actual TRU waste immersed in representative brine were inoculated with WIPP-relevant microbes, pressurized with inert gases, and kept in an inert-atmosphere environment for several years to provide estimates of the gas-generation rates that will be used in computer models for future WIPP Performance Assessments. Modest temperature variations occurred during the long-term ANL-West experiments. Although the experiment temperatures always remained well within the experiment specifications, the small temperature variation was observed to affect the test container pressure far more than had been anticipated. In fact, the pressure variations were so large, and seemingly erratic, that it was impossible to discern whether the data was even valid and whether the long-term pressure trend was increasing, decreasing, or constant. The result was that no useful estimates of gas-generation rates could be deduced from the pressure data. Several initial attempts were made to quantify the pressure fluctuations by relating these to the measured temperature variation, but none was successful. The work reported here carefully analyzed the pressure measurements to determine if these were valid or erroneous data. It was found that a thorough consideration of the physical phenomena that were occurring can, in conjunction with suitable gas laws, account quite accurately for the pressure changes that were observed. Failure of the earlier attempts to validate the data was traced to the omission of several phenomena, the most important being the variation in the headspace volume caused by thermal expansion and contraction within the brine and waste. A further effort was directed at recovering useful results from the voluminous archived pressure data. An analytic methodology to do this was developed. This methodology was applied to each archived pressure measurement to nullify temperature and other effects to yield an adjusted pressure, from which gas-generation rates could be calculated. A review of the adjusted-pressure data indicated that generated-gas concentrations among these containers after approximately 3.25 years of test operation ranged from zero to over 17,000 ppm by volume. Four test containers experienced significant gas generation. All test containers that showed evidence of significant gas generation contained carbon-steel in the waste, indicating that corrosion was the predominant source of gas generation.
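The kind of correction described, normalizing each measured pressure for gas temperature and for the headspace volume change caused by thermal expansion of the brine and waste, can be sketched with the ideal gas law as below. The container volume, brine volume, and expansion coefficient are assumed illustrative values; this is only an illustration of the physics, not the report's exact analytic methodology.

```python
# Minimal sketch: normalize a measured headspace pressure to a reference temperature,
# accounting for gas temperature and for the headspace volume change caused by thermal
# expansion of the brine/waste. Geometry and expansion coefficient are assumptions.
def adjusted_pressure(p_meas, T, T_ref=298.15, V_container=8.0, V_brine_ref=5.0, beta=4.5e-4):
    """p_meas in kPa, temperatures in K, volumes in L, beta = brine volumetric
    expansion coefficient (1/K). Returns pressure normalized to T_ref."""
    V_brine = V_brine_ref * (1.0 + beta * (T - T_ref))   # brine expands/contracts with T
    V_head = V_container - V_brine                        # headspace shrinks as brine expands
    V_head_ref = V_container - V_brine_ref
    # Ideal gas law: the same amount of gas at (T_ref, V_head_ref) would show this pressure.
    return p_meas * (T_ref / T) * (V_head / V_head_ref)

print(adjusted_pressure(p_meas=152.0, T=300.4))  # a single illustrative measurement
```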
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
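For the parametric layer, the variance-based measure in question is the Sobol index. The sketch below estimates first-order indices for a toy three-parameter model with a Saltelli-style Monte Carlo scheme; it illustrates only single-layer parameter sensitivity, not the hierarchical scenario/model/parameter grouping or the Bayesian network framework described above.

```python
# Minimal sketch of a variance-based (Sobol) first-order sensitivity estimator for a
# toy model with three uncertain inputs, using a Saltelli-style sampling scheme.
import numpy as np

def model(x):
    """Toy model standing in for a groundwater reactive transport simulator."""
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

rng = np.random.default_rng(4)
N, d = 20_000, 3
A, B = rng.uniform(0, 1, (N, d)), rng.uniform(0, 1, (N, d))
fA, fB = model(A), model(B)
var_total = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                                   # swap column i between the two sample matrices
    S_i = np.mean(fB * (model(ABi) - fA)) / var_total     # Saltelli-style first-order estimator
    print(f"first-order Sobol index S_{i + 1} = {S_i:.3f}")
```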
Transportation networks : data, analysis, methodology development and visualization.
DOT National Transportation Integrated Search
2007-12-29
This project provides data compilation, analysis methodology and visualization methodology for the current network data assets of the Alabama Department of Transportation (ALDOT). This study finds that ALDOT is faced with a considerable number of...
NASA Astrophysics Data System (ADS)
Aleva, D.; McCracken, J.
This paper will overview a Cognitive Task Analysis (CTA) of the tasks accomplished by space operators in the Combat Operations Division (COD) of the Joint Space Operations Center (JSpOC). The methodology used to collect data will be presented. The work was performed in support of the AFRL Space Situation Awareness Fusion Intelligent Research Environment (SAFIRE) effort. SAFIRE is a multi-directorate program led by Air Force Research Laboratory (AFRL), Space Vehicles Directorate (AFRL/RV) and supporting Future Long Term Challenge 2.6.5. It is designed to address research areas identified from completion of a Core Process 3 effort for Joint Space Operations Center (JSpOC). The report is intended to be a resource for those developing capability in support of SAFIRE, the Joint Functional Component Command (JFCC) Space Integrated Prototype (JSIP) User-Defined Operating Picture (UDOP), and other related projects. The report is under distribution restriction; our purpose here is to expose its existence to a wider audience so that qualified individuals may access it. The report contains descriptions of the organization, its most salient products, tools, and cognitive tasks. Tasks reported are derived from the data collected and presented at multiple levels of abstraction. Recommendations for leveraging the findings of the report are presented. The report contains a number of appendices that amplify the methodology, provide background or context support, and includes references in support of cognitive task methodology. In a broad sense, the CTA is intended to be the foundation for relevant, usable capability in support of space warfighters. It presents, at an unclassified level, introductory material to familiarize inquirers with the work of the COD; this is embedded in a description of the broader context of the other divisions of the JSpOC. It does NOT provide guidance for the development of Tactics, Techniques, and Procedures (TT&Ps) in the development of JSpOC processes. However, the TT&Ps are a part of the structure of work, and are, therefore, a factor in developing future capability. The authors gratefully acknowledge the cooperation and assistance from the warfighters at the JSpOC as well as the personnel of the JSpOC Capabilities Integration Office (JCIO). Their input to the process created the value of this effort.
Development of weight and cost estimates for lifting surfaces with active controls
NASA Technical Reports Server (NTRS)
Anderson, R. D.; Flora, C. C.; Nelson, R. M.; Raymond, E. T.; Vincent, J. H.
1976-01-01
Equations and methodology were developed for estimating the weight and cost incrementals due to active controls added to the wing and horizontal tail of a subsonic transport airplane. The methods are sufficiently generalized to be suitable for preliminary design. Supporting methodology and input specifications for the weight and cost equations are provided. The weight and cost equations are structured to be flexible in terms of the active control technology (ACT) flight control system specification. In order to present a self-contained package, methodology is also presented for generating ACT flight control system characteristics for the weight and cost equations. Use of the methodology is illustrated.
A PROBABILISTIC METHOD FOR ESTIMATING MONITORING POINT DENSITY FOR CONTAINMENT SYSTEM LEAK DETECTION
The use of physical and hydraulic containment systems for the isolation of contaminated ground water and aquifer materials associated with hazardous waste sites has increased during the last decade. The existing methodologies for monitoring and evaluating leakage from hazardous w...
Petasis, Doros T; Hendrich, Michael P
2015-01-01
Electron paramagnetic resonance (EPR) spectroscopy has long been a primary method for characterization of paramagnetic centers in materials and biological complexes. Transition metals in biological complexes have valence d-orbitals that largely define the chemistry of the metal centers. EPR spectra are distinctive for metal type, oxidation state, protein environment, substrates, and inhibitors. The study of many metal centers in proteins, enzymes, and biomimetic complexes has led to the development of a systematic methodology for quantitative interpretation of EPR spectra from a wide array of metal containing complexes. The methodology is now contained in the computer program SpinCount. SpinCount allows simulation of EPR spectra from any sample containing multiple species composed of one or two metals in any spin state. The simulations are quantitative, thus allowing determination of all species concentrations in a sample directly from spectra. This chapter will focus on applications to transition metals in biological systems using EPR spectra from multiple microwave frequencies and modes. © 2015 Elsevier Inc. All rights reserved.
Miles, Christopher O; Kilcoyne, Jane; McCarron, Pearse; Giddings, Sabrina D; Waaler, Thor; Rundberget, Thomas; Samdal, Ingunn A; Løvberg, Kjersti E
2018-03-21
Azaspiracids belong to a family of more than 50 polyether toxins originating from marine dinoflagellates such as Azadinium spinosum. All of the azaspiracids reported thus far contain a 21,22-dihydroxy group. Boric acid gel can bind selectively to compounds containing vic-diols or α-hydroxycarboxylic acids via formation of reversible boronate complexes. Here we report use of the gel to selectively capture and release azaspiracids from extracts of blue mussels. Analysis of the extracts and fractions by liquid chromatography-tandem mass spectrometry (LC-MS) showed that this procedure resulted in an excellent cleanup of the azaspiracids in the extract. Analysis by enzyme-linked immunosorbent assay (ELISA) and LC-MS indicated that most azaspiracid analogues were recovered in good yield by this procedure. The capacity of boric acid gel for azaspiracids was at least 50 μg/g, making this procedure suitable for use in the early stages of preparative purification of azaspiracids. In addition to its potential for concentration of dilute samples, the extensive cleanup provided by boric acid gel fractionation of azaspiracids in mussel samples almost eliminated matrix effects during subsequent LC-MS and could be expected to reduce matrix effects during ELISA analysis. The method may therefore prove useful for quantitative analysis of azaspiracids as part of monitoring programs. Although LC-MS data showed that okadaic acid analogues also bound to the gel, this was much less efficient than for azaspiracids under the conditions used. The boric acid gel methodology is potentially applicable to other important groups of natural toxins containing diols including ciguatoxins, palytoxins, pectenotoxins, tetrodotoxin, trichothecenes, and toxin glycosides.
The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale
ERIC Educational Resources Information Center
Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine
2013-01-01
Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…
ERIC Educational Resources Information Center
Lenk, Hans
This document contains nine essays on the sociology and social psychology of team dynamics, including methodological and epistemological issues involved in such study. Essay titles are: (1) Conflict and Achievement in Top Athletic Teams--Sociometric Structures of Racing Eight Oar Crews; (2) Top Performance Despite Internal Conflict--An Antithesis…
The methodology of semantic analysis for extracting physical effects
NASA Astrophysics Data System (ADS)
Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.
2017-01-01
The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. In this paper, semantic patterns are described to extract structural physical information in the form of physical effects. A new text analysis algorithm is also described.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-20
... DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2014-15 Award Year-- Federal Pell Grant, Federal Perkins Loan, Federal Work-Study, Federal Supplemental Educational Opportunity... announces the annual updates to the tables used in the statutory Federal Need Analysis Methodology that...
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of the CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
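The specific combination described in the abstract (perturbation method, response surface, Edgeworth series) is not reproduced here; as a rough, hypothetical illustration of the reliability-and-sensitivity step that such a methodology wraps an optimizer around, the sketch below computes a mean-value first-order reliability index and variance-based sensitivities for an invented limit-state function. All variables, values, and the limit state itself are assumptions for illustration only.

```python
import numpy as np

def limit_state(x):
    """Hypothetical limit state, g > 0 means safe: strength minus thermo-mechanical load."""
    strength, load, temp_factor = x
    return strength - load * temp_factor

mu = np.array([900.0, 500.0, 1.2])        # means of the random inputs (MPa, MPa, -)
sigma = np.array([60.0, 45.0, 0.08])      # standard deviations

# mean-value first-order second-moment (FOSM) approximation via finite-difference perturbation
g0 = limit_state(mu)
grad = np.array([(limit_state(mu + h * np.eye(3)[i]) - g0) / h
                 for i, h in enumerate(1e-4 * sigma)])
g_std = np.sqrt(np.sum((grad * sigma) ** 2))
beta = g0 / g_std                               # reliability index
sensitivity = (grad * sigma) ** 2 / g_std ** 2  # relative variance contributions
print(f"beta = {beta:.2f}",
      dict(zip(["strength", "load", "temp_factor"], sensitivity.round(3))))
```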
Imprecise (fuzzy) information in geostatistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardossy, A.; Bogardi, I.; Kelly, W.E.
1988-05-01
A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing a broader use of geostatistics has been the insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.
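The fuzzy kriging procedure itself is not given in the abstract; as a minimal sketch of the crisp ordinary-kriging core it extends (membership functions would be carried alongside each estimate in the fuzzy variant), the following code krige a single point from scattered "hard" data with an exponential variogram. The data, variogram parameters, and units are hypothetical.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=50.0, nugget=0.0):
    """Exponential semivariogram model."""
    return nugget + sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, x0, sill=1.0, rng=50.0, nugget=0.0):
    """Estimate z at location x0 from scattered data (xy, z); returns (estimate, variance)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, sill, rng, nugget)
    A[n, :n] = A[:n, n] = 1.0                      # unbiasedness constraint
    b = np.append(exp_variogram(np.linalg.norm(xy - x0, axis=1), sill, rng, nugget), 1.0)
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]
    return w @ z, w @ b[:n] + mu                   # estimate and kriging (estimation) variance

# toy data: 20 "hard" measurements on a 100 m x 100 m liner (values are hypothetical)
rng_ = np.random.default_rng(0)
pts = rng_.uniform(0, 100, size=(20, 2))
vals = rng_.normal(8.0, 1.5, size=20)              # e.g. a log-permeability index
print(ordinary_kriging(pts, vals, np.array([50.0, 50.0]), sill=np.var(vals), rng=30.0))
```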
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durbin, Samuel G.; Luna, Robert Earl
Assessing the risk to the public and the environment from a release of radioactive material produced by accidental or purposeful forces/environments is an important aspect of the regulatory process in many facets of the nuclear industry. The transport and storage of radioactive materials is of particular concern to the public, especially with regard to potential sabotage acts that might be undertaken by terror groups to cause injuries, panic, and/or economic consequences to a nation. For many such postulated attacks, no breach in the robust cask or storage module containment is expected to occur. However, there exists evidence that some hypothetical attack modes can penetrate and cause a release of radioactive material. This report is intended as an unclassified overview of the methodology for release estimation as well as a guide to useful resource data from unclassified sources and relevant analysis methods for the estimation process.
Osorio, Maria Teresa; Haughey, Simon A; Elliott, Christopher T; Koidis, Anastasios
2015-12-15
European Regulation 1169/2011 requires producers of foods that contain refined vegetable oils to label the oil types. A novel rapid and staged methodology has been developed for the first time to identify common oil species in oil blends. The qualitative method consists of a combination of Fourier Transform Infrared (FTIR) spectroscopy to profile the oils and fatty acid chromatographic analysis to confirm the composition of the oils when required. Calibration models and specific classification criteria were developed and all data were fused into a simple decision-making system. The single-lab validation of the method demonstrated very good performance (96% correct classification, 100% specificity, 4% false positive rate). Only a small fraction of the samples needed to be confirmed, with the majority of oils identified rapidly using only the spectroscopic procedure. The results demonstrate the huge potential of the methodology for a wide range of oil authenticity work. Copyright © 2014 Elsevier Ltd. All rights reserved.
Determination of polycyclic aromatic hydrocarbons in kerosene and bio-kerosene soot.
Andrade-Eiroa, Auréa; Leroy, Valérie; Dagaut, Philippe; Bedjanian, Yuri
2010-03-01
Here we report a new, efficient and reliable analytical methodology for sensitive and selective quantification of Polycyclic Aromatic Hydrocarbons (PAHs) in soot samples. The methodology developed is based on ultrasonic extraction of the soot-bound PAHs into small volumes of acetonitrile, purification of the extracts through C(18) Solid Phase Extraction (SPE) cartridges and analysis by Reverse Phase Liquid Chromatography (RPLC) with UV and fluorimetric detection. For the first time, we report the convenience of adapting the SPE procedure to the nature of the soot samples. As a matter of fact, extracts containing a high percentage of nonpolar material should be cleaned with acetone, whereas extracts poor in nonpolar compounds can be efficiently cleaned with methanol. The method was satisfactorily applied to kerosene and bio-kerosene soot from atmospheric open diffusion flames (pool fires) and premixed flames, achieving quantification and detection limits in the ng mg(-1) soot range and recoveries of about 90% for most of the PAHs studied. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Huang, Chi-Te; Tsai, Chia-Hsun; Tsou, Hsin-Yeh; Huang, Yaw-Bin; Tsai, Yi-Hung; Wu, Pao-Chu
2011-01-01
Response surface methodology (RSM) was used to develop and optimize the mesomorphic phase formulation for a meloxicam transdermal dosage form. A mixture design was applied to prepare formulations which consisted of three independent variables including oleic acid (X(1)), distilled water (X(2)) and ethanol (X(3)). The flux and lag time (LT) were selected as dependent variables. The results showed that using mesomorphic phases as vehicles can significantly increase the flux and shorten the LT of the drug. The analysis of variance showed that the permeation parameters of meloxicam from the formulations were significantly influenced by the independent variables and their interactions. The X(3) (ethanol) had the greatest potential influence on the flux and LT, followed by X(1) and X(2). A new formulation was prepared according to the independent levels provided by RSM. The observed responses were in close agreement with the predicted values, demonstrating that RSM could be successfully used to optimize mesomorphic phase formulations.
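The abstract does not give the fitted model or the data; as a rough sketch of how a three-component mixture design is typically turned into a response surface, the code below fits a Scheffé quadratic model to invented flux values by least squares. The proportions, responses, and query point are hypothetical stand-ins, not the paper's data.

```python
import numpy as np

def scheffe_quadratic(X):
    """Design matrix for a 3-component Scheffe quadratic mixture model."""
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# hypothetical mixture proportions (oleic acid, water, ethanol) and measured flux
X = np.array([[0.6, 0.2, 0.2], [0.2, 0.6, 0.2], [0.2, 0.2, 0.6],
              [0.4, 0.4, 0.2], [0.4, 0.2, 0.4], [0.2, 0.4, 0.4],
              [1 / 3, 1 / 3, 1 / 3]])
flux = np.array([12.1, 8.4, 19.7, 10.2, 16.5, 14.8, 13.9])   # invented responses

beta, *_ = np.linalg.lstsq(scheffe_quadratic(X), flux, rcond=None)
predict = lambda x: float(scheffe_quadratic(np.atleast_2d(x)) @ beta)
print("coefficients:", beta.round(2))
print("predicted flux at 30/20/50:", round(predict([0.3, 0.2, 0.5]), 2))
```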
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
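The exact forms of the revised Taguchi and modified inverted normal loss functions used in the paper are not spelled out in the abstract. As a hedged stand-in, the sketch below evaluates a basic inverted normal loss function, which is zero at the target and saturates at a maximum loss as the deviation grows, for a hypothetical process-deviation scenario; all parameter values are invented.

```python
import numpy as np

def inverted_normal_loss(y, target, max_loss, shape):
    """Inverted normal loss function (INLF): zero loss at the target,
    saturating at max_loss as the process deviation grows."""
    return max_loss * (1.0 - np.exp(-((y - target) ** 2) / (2.0 * shape ** 2)))

# hypothetical scenario: reactor temperature deviations mapped to economic loss (USD)
temps = np.array([348.0, 352.0, 360.0, 375.0])     # K, observed process values
loss = inverted_normal_loss(temps, target=350.0, max_loss=2.5e5, shape=8.0)
print(dict(zip(temps, loss.round(0))))

# the methodology integrates the losses of the different loss types for a scenario
total_scenario_loss = loss.sum()
print(f"total scenario loss: {total_scenario_loss:,.0f} USD")
```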
NASA Astrophysics Data System (ADS)
Diaz, Aaron A.; Burghard, Brion J.; Skorpik, James R.; Shepard, Chester L.; Samuel, Todd J.; Pappas, Richard A.
2003-07-01
The Pacific Northwest National Laboratory (PNNL) has developed a portable, battery-operated, handheld ultrasonic device that provides non-invasive container interrogation and material identification capabilities. The technique governing how the acoustic inspection device (AID) functions involves measurements of ultrasonic pulses (0.1 to 5 MHz) that are launched into a container or material. The return echoes from these pulses are analyzed in terms of time-of-flight and frequency content to extract physical property measurements (the acoustic velocity and attenuation coefficient) of the material under test. The AID performs an automated analysis of the return echoes to identify the material, and detect contraband in the form of submerged packages and concealed compartments in liquid-filled containers and solid-form commodities. An inspector can quickly interrogate outwardly innocuous commodity items such as shipping barrels, tanker trucks, and metal ingots. The AID can interrogate container sizes ranging from approximately 6 inches in diameter to over 96 inches in diameter and allows the inspector to sort liquid and material types into groups of like and unlike, a powerful method for discovering corrupted materials or mis-marked containers co-mingled in large shipments. This manuscript describes the functionality, capabilities and measurement methodology of the technology as it relates to homeland security applications.
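The AID's internal algorithms are not described in the abstract; the following sketch only illustrates the basic pulse-echo arithmetic behind the two quoted physical properties, sound speed from round-trip time-of-flight and an attenuation coefficient from the amplitude ratio of successive echoes. The container dimensions, echo times, and amplitudes are hypothetical, and wall and interface losses are ignored.

```python
import math

def acoustic_properties(diameter_m, echo_dt_s, amp1, amp2):
    """Pulse-echo estimate of sound speed and attenuation in a liquid-filled
    container of known inner diameter (single transducer, two successive echoes)."""
    velocity = 2.0 * diameter_m / echo_dt_s                    # round trip = 2 * diameter
    attenuation = math.log(amp1 / amp2) / (2.0 * diameter_m)   # Np/m, ignores wall losses
    return velocity, attenuation

# hypothetical 55-gallon drum (0.572 m inner diameter) holding a water-like liquid
v, a = acoustic_properties(0.572, echo_dt_s=7.7e-4, amp1=1.0, amp2=0.62)
print(f"velocity = {v:.0f} m/s, attenuation = {a:.2f} Np/m")
# material identification would compare (v, a) against a library of known commodities
```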
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tortorelli, J.P.
1995-08-01
A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994 on the topic of risk assessment on medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.
Thermodynamics of concentrated solid solution alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Michael C.; Zhang, C.; Gao, P.
2017-10-12
This study reviews the three main approaches for predicting the formation of concentrated solid solution alloys (CSSA) and for modeling their thermodynamic properties, in particular, utilizing the methodologies of empirical thermo-physical parameters, CALPHAD method, and first-principles calculations combined with hybrid Monte Carlo/Molecular Dynamics (MC/MD) simulations. In order to speed up CSSA development, a variety of empirical parameters based on Hume-Rothery rules have been developed. Herein, these parameters have been systematically and critically evaluated for their efficiency in predicting solid solution formation. The phase stability of representative CSSA systems is then illustrated from the perspectives of phase diagrams and nucleation driving force plots of the σ phase using CALPHAD method. The temperature-dependent total entropies of the FCC, BCC, HCP, and σ phases in equimolar compositions of various systems are presented next, followed by the thermodynamic properties of mixing of the BCC phase in Al-containing and Ti-containing refractory metal systems. First-principles calculations on model FCC, BCC and HCP CSSA reveal the presence of both positive and negative vibrational entropies of mixing, while the calculated electronic entropies of mixing are negligible. Temperature dependent configurational entropy is determined from the atomic structures obtained from MC/MD simulations. Current status and challenges in using these methodologies as they pertain to thermodynamic property analysis and CSSA design are discussed.
Financing pharmaceuticals in transition economies.
Kanavos, P
1999-06-01
This paper (a) provides a methodological taxonomy of pricing, financing, reimbursement, and cost containment methodologies for pharmaceuticals; (b) analyzes complex agency relationships and the health versus industrial policy tradeoff; (c) pinpoints financing measures to balance the safety and effectiveness of medicines and their affordability by publicly funded systems in transition; and (d) highlights viable options for policy-makers for the financing of pharmaceuticals in transition. Three categories of measures and their implications for pharmaceutical cost-containment policy are analyzed: supply-side measures, targeting manufacturers; proxy demand-side measures, targeting physicians and pharmacists; and demand-side measures, targeting patients. In pursuing supply-side measures, we explore free pricing for pharmaceuticals, direct price controls, cost-plus and cost pricing, average pricing and international price comparisons, profit control, reference pricing, the introduction of a fourth hurdle, positive and negative lists, and other price control measures. The analysis of proxy demand-side measures includes budgets for physicians, generic policies, practice guidelines, monitoring the authorizing behavior of physicians, and disease management schemes. The analysis of demand-side measures covers the effectiveness of patient co-payments, the impact of allowing products over-the-counter, and health promotion programs. Global policies should operate simultaneously on the supply, proxy demand, and demand sides. Policy-making needs continuous long-term planning. The importation of policies into a transition economy may require extensive and expensive adaptation, and/or lead to sub-optimal policy outcomes.
Okumoto, Sakiko; Versaw, Wayne
2017-10-01
Nitrogen and phosphorus are macronutrients indispensable for plant growth. The acquisition and reallocation of both elements require a multitude of dedicated transporters that specifically recognize inorganic and organic forms of nitrogen and phosphorus. Although many transporters have been discovered through elegant screening processes and sequence homology, many remain uncharacterized for their functions in planta. Genetically encoded sensors for nitrogen and phosphorus molecules offer a unique opportunity for studying transport mechanisms that were previously inaccessible. In the past few years, sensors for some of the key nitrogen molecules became available, and many improvements have been made for existing sensors for phosphorus molecules. Methodologies for detailed in vivo analysis also improved. We summarize the recent improvements in genetically encoded sensors for nitrogen and phosphorus molecules, and the discoveries made by using such sensors. Copyright © 2017. Published by Elsevier Ltd.
Energy use in the marine transportation industry. Task III. Efficiency improvements. Draft report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-06-02
Research and development areas that hold promise for maritime energy conservation are identified and evaluated. The methodology used is discussed in Chapter II. The technology base of the commercial marine transportation industry relating to energy usage is made up of: main propulsion plants, propulsors, hydrodynamics, vessel operations, and fuels. Fifteen specific program areas in the first four generic technologies are identified and are evaluated. An economic and energy impact analysis and technological risk assessment was performed on the specific program areas and the results are summarized in Chapter III. The first five appendices address the generic technologies. The sixth appendix contains the baseline operating and cost parameters against which the 15 program areas were evaluated, and the last appendix contains sample printouts of the MTEM model used to evaluate the energy consumption and economic impacts associated with the candidate technology areas. (MCW)
Using Knowledge Fusion to Analyze Avian Influenza H5N1 in East and Southeast Asia
Ge, Erjia; Haining, Robert; Li, Chi Pang; Yu, Zuguo; Waye, Miu Yee; Chu, Ka Hou; Leung, Yee
2012-01-01
Highly pathogenic avian influenza (HPAI) H5N1, a disease associated with high rates of mortality in infected human populations, poses a serious threat to public health in many parts of the world. This article reports findings from a study aimed at improving our understanding of the spatial pattern of the highly pathogenic avian influenza, H5N1, risk in East-Southeast Asia where the disease is both persistent and devastating. Though many disciplines have made important contributions to our understanding of H5N1, it remains a challenge to integrate knowledge from different disciplines. This study applies genetic analysis that identifies the evolution of the H5N1 virus in space and time, epidemiological analysis that determines socio-ecological factors associated with H5N1 occurrence, and statistical analysis that identifies outbreak clusters, and then applies a methodology to formally integrate the findings of the three sets of methodologies. The present study is novel in two respects. First, it makes an initial attempt to use genetic sequences and space-time data to create a space-time phylogenetic tree to estimate and map the virus' ability to spread. Second, by integrating the results we are able to generate insights into the space-time occurrence and spread of H5N1 that we believe have a higher level of corroboration than is possible when analysis is based on only one methodology. Our research identifies links between the occurrence of H5N1 by area and a set of socio-ecological factors including altitude, population density, poultry density, and the shortest path distances to inland water, coastlines, migration routes, railways, and roads. This study seeks to lay a solid foundation for the interdisciplinary study of this and other influenza outbreaks. It will provide substantive information for containing H5N1 outbreaks. PMID:22615729
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
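The PFA statistical structure itself (including the updating with test and flight experience) is documented in the report, not in this abstract. As a rough, hypothetical illustration of the inner step it builds on, the sketch below propagates parameter uncertainty through an invented S-N-style fatigue-life model by Monte Carlo sampling to estimate a failure probability; all distributions, values, and the life model are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# hypothetical fatigue failure mode: life model N_f = A * stress**(-m)
# epistemic uncertainty on the model parameters, aleatory scatter on applied stress
A = rng.lognormal(mean=np.log(1e14), sigma=0.3, size=N)   # life-model coefficient
m = rng.normal(3.0, 0.1, size=N)                          # S-N exponent
stress = rng.normal(450.0, 30.0, size=N)                  # MPa, per-cycle stress amplitude
service_cycles = 5.0e4                                    # required service life

cycles_to_failure = A * stress ** (-m)
p_fail = np.mean(cycles_to_failure < service_cycles)
print(f"estimated failure probability: {p_fail:.2e}")
# the PFA methodology would then modify this distribution using test/flight experience
```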
The application of the electrodynamic separator in minerals beneficiation
NASA Astrophysics Data System (ADS)
Skowron, M.; Syrek, P.; Surowiak, A.
2017-05-01
The aim of the presented paper is to elaborate a methodology for upgrading natural minerals, using a chalcocite and bornite sample as an example. The results were obtained by means of a laboratory drum separator. This device operates according to the properties of the materials, in this case electrical conductivity. The study contains an analysis of the forces occurring inside the electrodynamic separator chamber that act on particles of various electrical properties. Both the potential and the electric field strength distributions were calculated for a set of separator setpoints. The theoretical analysis informed the choice of separator parameters and hence also influenced the empirical results. Next, the authors conducted empirical research on chalcocite and bornite beneficiation by means of electrodynamic separation. The results of this process are shown graphically in the form of upgrading curves of chalcocite with respect to elemental copper and lead.
Cuthbertson, Daniel; Piljac-Žegarac, Jasenka; Lange, Bernd Markus
2011-01-01
Herein we report on an improved method for the microscale extraction of huperzine A (HupA), an acetylcholinesterase-inhibiting alkaloid, from as little as 3 mg of tissue homogenate from the clubmoss Huperzia squarrosa (G. Forst.) Trevis with 99.95% recovery. We also validated a novel UHPLC-QTOF-MS method for the high-throughput analysis of H. squarrosa extracts in only 6 min, which, in combination with the very low limit of detection (20 pg on column) and the wide linear range for quantification (20 to 10,000 pg on column), allows for highly efficient screening of extracts containing varying amounts of HupA. Utilization of this methodology has the potential to conserve valuable plant resources. PMID:22275140
[Analysis of the implementation of Nursing Assistance Systematization in a rehabilitation unit].
Neves, Rinaldo de Souza; Shimizu, Helena Eri
2010-01-01
This study seeks to analyze the execution of the stages of Nursing Assistance Systematization through an exploratory, qualitative and retrospective approach. The retrospective analysis was carried out using 25 medical records containing 25 history reports, 12 diagnosis reports, 100 prescriptions and 100 nursing evolution reports. The results demonstrated the many difficulties the nurses faced in making Nursing Assistance Systematization operational. Although all Nursing Assistance Systematization stages were accomplished - history, diagnosis, prescription, evolution and nursing - a higher frequency of completion was verified for prescription- and history-related forms and a lower frequency for evolution- and diagnosis-related forms. In short, Nursing Assistance Systematization procedures are still fragmented, showing the need to reorganize this care methodology and, above all, to invest in continuous nursing training to improve the quality of patient care.
Mediation analysis: a retrospective snapshot of practice and more recent directions.
Gelfand, Lois A; Mensinger, Janell L; Tenhave, Thomas
2009-04-01
R. Baron and D. A. Kenny's (1986) paper introducing mediation analysis has been cited over 9,000 times, but concerns have been expressed about how this method is used. The authors review past and recent methodological literature and make recommendations for how to address 3 main issues: association, temporal order, and the no omitted variables assumption. The authors briefly visit the topics of reliability and the confirmatory-exploratory distinction. In addition, to provide a sense of the extent to which the earlier literature had been absorbed into practice, the authors examined a sample of 50 articles from 2002 citing R. Baron and D. A. Kenny and containing at least 1 mediation analysis via ordinary least squares regression. A substantial proportion of these articles included problematic reporting; as of 2002, there appeared to be room for improvement in conducting such mediation analyses. Future literature reviews will demonstrate the extent to which the situation has improved.
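For readers unfamiliar with the procedure the reviewed articles used, the sketch below shows the classic three ordinary-least-squares regressions of a Baron-and-Kenny-style mediation analysis on simulated data (the data and coefficients are invented; significance testing of the indirect effect, which the methodological literature discusses at length, is not shown).

```python
import numpy as np

def ols(y, *cols):
    """Ordinary least squares with an intercept; returns the coefficients after the intercept."""
    X = np.column_stack([np.ones(len(y)), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

rng = np.random.default_rng(2)
n = 300
x = rng.normal(size=n)                        # predictor
m = 0.5 * x + rng.normal(size=n)              # mediator (simulated, hypothetical data)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # outcome

c_total, = ols(y, x)          # step 1: X -> Y (total effect c)
a, = ols(m, x)                # step 2: X -> M (path a)
c_prime, b = ols(y, x, m)     # step 3: X + M -> Y (direct effect c' and path b)
print(f"c={c_total:.3f}  a={a:.3f}  b={b:.3f}  c'={c_prime:.3f}  indirect a*b={a*b:.3f}")
```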
Teresa E. Jordan
2016-08-18
*These files add to and replace same-named files found within Submission 559 (https://gdr.openei.org/submissions/559)* The files included in this submission contain all data pertinent to the methods and results of a cohesive multi-state analysis of all known potential geothermal reservoirs in sedimentary rocks in the Appalachian Basin region, ranked by their potential favorability. Favorability is quantified using three metrics: Reservoir Productivity Index for water; Reservoir Productivity Index; Reservoir Flow Capacity. The metrics are explained in the Reservoirs Methodology Memo (included in zip file). The product represents a minimum spatial extent of potential sedimentary rock geothermal reservoirs. Only natural porosity and permeability were analyzed. Shapefile and images of the spatial distributions of these reservoir quality metrics and of the uncertainty on these metrics are included as well. UPDATE: Accompanying geologic reservoirs data may be found at: https://gdr.openei.org/submissions/881 (linked below).
Methods for simulation-based analysis of fluid-structure interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barone, Matthew Franklin; Payne, Jeffrey L.
2005-10-01
Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
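As a minimal sketch of the POD step that the report's ROM approach is built on, the code below extracts POD modes from a synthetic snapshot matrix via the singular value decomposition and projects the snapshots onto the retained basis. The snapshot data are random stand-ins for CFD output, and the Galerkin projection of the governing equations themselves is not shown.

```python
import numpy as np

rng = np.random.default_rng(3)
n_dof, n_snap = 500, 40
# synthetic snapshot matrix: a few coherent structures plus noise (stand-in for CFD output)
modes_true = rng.normal(size=(n_dof, 3))
amps = rng.normal(size=(3, n_snap))
snapshots = modes_true @ amps + 0.01 * rng.normal(size=(n_dof, n_snap))

mean_field = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)

energy = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(energy, 0.99) + 1)                 # retain 99% of snapshot energy
pod_basis = U[:, :r]                                       # POD modes
reduced_coords = pod_basis.T @ (snapshots - mean_field)    # projection onto the reduced basis
print(f"retained {r} modes capturing {energy[r - 1]:.4f} of the energy")
```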
Hazardous-waste analysis plan for LLNL operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, R.S.
The Lawrence Livermore National Laboratory is involved in many facets of research ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from producing waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan will address the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.
Moral deliberation and nursing ethics cases: elements of a methodological proposal.
Schneider, Dulcinéia Ghizoni; Ramos, Flávia Regina Souza
2012-11-01
A qualitative study with an exploratory, descriptive and documentary design was conducted with the objective of identifying the elements to constitute a method for the analysis of accusations of and proceedings for professional ethics infringements. The method is based on underlying elements identified inductively during analysis of professional ethics hearings judged by and filed in the archives of the Regional Nursing Board of Santa Catarina, Brazil, between 1999 and 2007. The strategies developed were based on the results of an analysis of the findings of fact (occurrences/infractions, causes and outcomes) contained in the records of 128 professional ethics hearings and on the structural elements (statements, rules and practices) identified in five example professional ethics cases. The strategies suggested for evaluating accusations of ethics infringements and the procedures involved in deliberating on ethics hearings constitute a generic proposal that will require adaptation to the context of specific professional ethics accusations.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-08
... analysis, survey methodology, geospatial analysis, econometrics, cognitive psychology, and computer science... following disciplines: demography, economics, geography, psychology, statistics, survey methodology, social... expertise in such areas as demography, economics, geography, psychology, statistics, survey methodology...
Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most of these studies choose only one of the two methods in their risk management methodology. On the other hand, combining the two methods reduces the drawbacks of each method when implemented separately. This paper aims to combine the methodologies of FMEA and FTA in assessing risk. A case study in a metal company illustrates how this methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process. Further, those internal risks should be mitigated based on their level of risk.
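The abstract does not give the case-study numbers; as a rough sketch of the two calculations such a combined methodology rests on, the code below ranks invented failure modes by their FMEA risk priority number and then evaluates the top-event probability of a small fault tree with independent basic events. The failure modes, ratings, probabilities, and gate structure are all hypothetical.

```python
# FMEA: risk priority number = severity x occurrence x detection (1-10 scales)
failure_modes = {
    "furnace overheat":   dict(S=9, O=3, D=4),
    "mold misalignment":  dict(S=6, O=5, D=3),
    "coolant pump stall": dict(S=8, O=2, D=6),
}
rpn = {name: v["S"] * v["O"] * v["D"] for name, v in failure_modes.items()}
print(sorted(rpn.items(), key=lambda kv: -kv[1]))   # highest-priority risks first

# FTA: top event = (furnace overheat AND coolant pump stall) OR mold misalignment
p = {"furnace overheat": 0.02, "mold misalignment": 0.01, "coolant pump stall": 0.05}
p_and = p["furnace overheat"] * p["coolant pump stall"]       # AND gate, independent events
p_top = 1.0 - (1.0 - p_and) * (1.0 - p["mold misalignment"])  # OR gate
print(f"top-event probability: {p_top:.4f}")
```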
Borri, Marco; Schmidt, Maria A; Powell, Ceri; Koh, Dow-Mu; Riddell, Angela M; Partridge, Mike; Bhide, Shreerang A; Nutting, Christopher M; Harrington, Kevin J; Newbold, Katie L; Leach, Martin O
2015-01-01
To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes.
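The paper's specific clustering algorithm and validation index are not named in the abstract; as a generic stand-in for that workflow, the sketch below runs k-means on simulated per-voxel multi-parametric features for several candidate k values and uses the silhouette score as a simple cluster-validation criterion. The feature names, values, and cluster structure are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(4)
# hypothetical per-voxel feature vectors: [Ktrans, ve, ADC] pooled over lesions/timepoints
voxels = np.vstack([
    rng.normal([0.30, 0.45, 0.9e-3], [0.05, 0.05, 0.1e-3], size=(400, 3)),
    rng.normal([0.10, 0.25, 1.4e-3], [0.05, 0.05, 0.1e-3], size=(400, 3)),
])
z = (voxels - voxels.mean(0)) / voxels.std(0)      # standardise the features

scores = {}
for k in (2, 3, 4):                                # candidate numbers of clusters
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(z)
    scores[k] = silhouette_score(z, labels)        # simple cluster-validation index
best_k = max(scores, key=scores.get)
print(scores, "-> chosen k =", best_k)
```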
Higgins, A; Barnett, J; Meads, C; Singh, J; Longworth, L
2014-12-01
To systematically review the existing literature on the value associated with convenience in health care delivery, independent of health outcomes, and to try to estimate the likely magnitude of any value found. A systematic search was conducted for previously published studies that reported preferences for convenience-related aspects of health care delivery in a manner that was consistent with either cost-utility analysis or cost-benefit analysis. Data were analyzed in terms of the methodologies used, the aspects of convenience considered, and the values reported. Literature searches generated 4715 records. Following a review of abstracts or full-text articles, 27 were selected for inclusion. Twenty-six studies reported some evidence of convenience-related process utility, in the form of either a positive utility or a positive willingness to pay. The aspects of convenience valued most often were mode of administration (n = 11) and location of treatment (n = 6). The most common valuation methodology was a discrete-choice experiment containing a cost component (n = 15). A preference for convenience-related process utility exists, independent of health outcomes. Given the diverse methodologies used to calculate it, and the range of aspects being valued, however, it is difficult to assess how large such a preference might be, or how it may be effectively incorporated into an economic evaluation. Increased consistency in reporting these preferences is required to assess these issues more accurately. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
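The review does not report a single set of coefficients; as a brief illustration of how a discrete-choice experiment with a cost attribute yields a monetary value for convenience, the sketch below converts hypothetical conditional-logit coefficients into marginal willingness-to-pay figures. All coefficient values and attribute names are invented.

```python
# hypothetical conditional-logit coefficients from a discrete-choice experiment
beta_cost = -0.040          # per GBP of out-of-pocket cost
beta_home_admin = 0.65      # oral administration at home vs. clinic infusion
beta_local_clinic = 0.32    # treatment at a local clinic vs. regional hospital

# marginal willingness to pay = -(attribute coefficient) / (cost coefficient)
for name, b in [("home administration", beta_home_admin),
                ("local clinic location", beta_local_clinic)]:
    print(f"WTP for {name}: GBP {-b / beta_cost:.2f}")
```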
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the Proceedings contains the following 20 papers: "Information Sufficiency and Risk Communication" (Robert J. Griffin, Kurt Neuwirth, and Sharon Dunwoody); "The Therapeutic Application of Television: An Experimental Study" (Charles Kingsley); "A Path Model Examining the…
Wild, Verina; Carina, Fourie; Frouzakis, Regula; Clarinval, Caroline; Fässler, Margrit; Elger, Bernice; Gächter, Thomas; Leu, Agnes; Spirig, Rebecca; Kleinknecht, Michael; Radovanovic, Dragana; Mouton Dorey, Corine; Burnand, Bernard; Vader, John-Paul; Januel, Jean-Marie; Biller-Andorno, Nikola; The IDoC Group
2015-01-01
The starting point of the interdisciplinary project "Assessing the impact of diagnosis related groups (DRGs) on patient care and professional practice" (IDoC) was the lack of a systematic ethical assessment for the introduction of cost containment measures in healthcare. Our aim was to contribute to the methodological and empirical basis of such an assessment. Five sub-groups conducted separate but related research within the fields of biomedical ethics, law, nursing sciences and health services, applying a number of complementary methodological approaches. The individual research projects were framed within an overall ethical matrix. Workshops and bilateral meetings were held to identify and elaborate joint research themes. Four common, ethically relevant themes emerged in the results of the studies across sub-groups: (1.) the quality and safety of patient care, (2.) the state of professional practice of physicians and nurses, (3.) changes in incentives structure, (4.) vulnerable groups and access to healthcare services. Furthermore, much-needed data for future comparative research has been collected and some early insights into the potential impact of DRGs are outlined. Based on the joint results we developed preliminary recommendations related to conceptual analysis, methodological refinement, monitoring and implementation.
4-Nonylphenol (NP) in food-contact materials: analytical methodology and occurrence.
Fernandes, A R; Rose, M; Charlton, C
2008-03-01
Nonylphenol is a recognized environmental contaminant, but it is unclear whether its occurrence in food arises only through environmental pathways or also during the processing or packaging of food, as there are reports that indicate that materials in contact with food such as rubber products and polyvinylchloride wraps can contain nonylphenol. A review of the literature has highlighted the scarcity of robust analytical methodology or data on the occurrence of nonylphenol in packaging materials. This paper describes a methodology for the determination of nonylphenol in a variety of packaging materials, which includes plastics, paper and rubber. The method uses either Soxhlet extraction or dissolution followed by solvent extraction (depending on the material type), followed by purification using adsorption chromatography. Procedures were internally standardized using 13C-labelled nonylphenol and the analytes were measured by gas chromatography-mass spectrometry. The method is validated and data relating to quality parameters such as limits of detection, recovery, precision and linearity of measurement are provided. Analysis of a range of 25 food-contact materials found nonylphenol at concentrations of 64-287 microg g(-1) in some polystyrene and polyvinylchloride samples. Far lower concentrations (<0.03-1.4 microg g(-1)) were detected in the other materials. It is possible that occurrence at the higher levels has the potential for migration to food.
NASA Astrophysics Data System (ADS)
Romero, Jonathan; Posada, Edwin; Flores-Moreno, Roberto; Reyes, Andrés
2012-08-01
In this work we propose an extended propagator theory for electrons and other types of quantum particles. This new approach has been implemented in the LOWDIN package and applied to sample calculations of atomic and small molecular systems to determine its accuracy and performance. As a first application of the method we have studied the nuclear quantum effects on electron ionization energies. We have observed that ionization energies of atoms are similar to those obtained with the electron propagator approach. However, for molecular systems containing hydrogen atoms there are improvements in the quality of the results with the inclusion of nuclear quantum effects. An energy term analysis has allowed us to conclude that nuclear quantum effects are important for zero order energies whereas propagator results correct the electron and electron-nuclear correlation terms. Results presented for a series of n-alkanes have revealed the potential of this method for the accurate calculation of ionization energies of a wide variety of molecular systems containing hydrogen nuclei. The proposed methodology will also be applicable to exotic molecular systems containing positrons or muons.
Peptide Array X-Linking (PAX): A New Peptide-Protein Identification Approach
Okada, Hirokazu; Uezu, Akiyoshi; Soderblom, Erik J.; Moseley, M. Arthur; Gertler, Frank B.; Soderling, Scott H.
2012-01-01
Many protein interaction domains bind short peptides based on canonical sequence consensus motifs. Here we report the development of a peptide array-based proteomics tool to identify proteins directly interacting with ligand peptides from cell lysates. Array-formatted bait peptides containing an amino acid-derived cross-linker are photo-induced to crosslink with interacting proteins from lysates of interest. Indirect associations are removed by high stringency washes under denaturing conditions. Covalently trapped proteins are subsequently identified by LC-MS/MS and screened by cluster analysis and domain scanning. We apply this methodology to peptides with different proline-containing consensus sequences and show successful identifications from brain lysates of known and novel proteins containing polyproline motif-binding domains such as EH, EVH1, SH3, WW domains. These results suggest the capacity of arrayed peptide ligands to capture and subsequently identify proteins by mass spectrometry is relatively broad and robust. Additionally, the approach is rapid and applicable to cell or tissue fractions from any source, making the approach a flexible tool for initial protein-protein interaction discovery. PMID:22606326
NASA Technical Reports Server (NTRS)
Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.
2006-01-01
Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.
Advanced metrology by offline SEM data processing
NASA Astrophysics Data System (ADS)
Lakcher, Amine; Schneider, Loïc.; Le-Gratiet, Bertrand; Ducoté, Julien; Farys, Vincent; Besacier, Maxime
2017-06-01
Today's technology nodes contain more and more complex designs, bringing increasing challenges to chip manufacturing process steps. Efficient metrology is necessary to assess the process variability of these complex patterns and thus extract relevant data to generate process-aware design rules and to improve OPC models. Today, process variability is mostly addressed through the analysis of in-line monitoring features, which are often designed to support robust measurements and as a consequence are not always very representative of critical design rules. CD-SEM is the main CD metrology technique used in the chip manufacturing process, but it is challenged when it comes to measuring metrics like tip-to-tip, tip-to-line, area, or necking in high quantity and with robustness. CD-SEM images contain a lot of information that is not always used in metrology. Suppliers have provided tools that allow engineers to extract the SEM contours of their features and to convert them into a GDS. A contour can be seen as the signature of the shape, as it contains all the dimensional data. Thus the methodology is to use the CD-SEM to take high-quality images, then generate SEM contours and create a database out of them. Contours are used to feed an offline metrology tool that processes them to extract different metrics. It was shown in two previous papers that it is possible to perform complex measurements on hotspots at different process steps (lithography, etch, copper CMP) by using SEM contours with an in-house offline metrology tool. In the current paper, the methodology presented previously is expanded to improve its robustness and combined with the use of phylogeny to classify the SEM images according to their geometrical proximities.
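The in-house offline metrology tool is not described in detail in the abstract; as a small illustration of the kind of measurement that contour data enables, the sketch below approximates a tip-to-tip gap as the minimum point-to-point distance between two densely sampled contours. The contour shapes and dimensions are synthetic and hypothetical.

```python
import numpy as np

def min_gap(contour_a, contour_b):
    """Approximate tip-to-tip gap as the minimum point-to-point distance between
    two densely sampled SEM contours (arrays of (x, y) coordinates in nm)."""
    d = np.linalg.norm(contour_a[:, None, :] - contour_b[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    return d[i, j], contour_a[i], contour_b[j]

# hypothetical contours of two facing line ends extracted from a CD-SEM image
t = np.linspace(-np.pi / 2, np.pi / 2, 200)
tip_left = np.column_stack([-20 + 10 * np.cos(t), 12 * np.sin(t)])    # nm
tip_right = np.column_stack([22 - 10 * np.cos(t), 12 * np.sin(t)])

gap, pa, pb = min_gap(tip_left, tip_right)
print(f"tip-to-tip = {gap:.1f} nm between {pa.round(1)} and {pb.round(1)}")
```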
Final Report Inspection of Aged/Degraded Containments Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naus, Dan J; Ellingwood, B R; Oland, C Barry
2005-09-01
The Inspection of Aged/Degraded Containments Program had primary objectives of (1) understanding the significant factors relating corrosion occurrence, efficacy of inspection, and structural capacity reduction of steel containments and liners of reinforced concrete containments; (2) providing the United States Nuclear Regulatory Commission (USNRC) reviewers a means of establishing current structural capacity margins or estimating future residual structural capacity margins for steel containments, and concrete containments as limited by liner integrity; (3) providing recommendations, as appropriate, on information to be requested of licensees for guidance that could be utilized by USNRC reviewers in assessing the seriousness of reported incidences of containment degradation; and (4) providing technical assistance to the USNRC (as requested) related to concrete technology. Primary program accomplishments have included development of a degradation assessment methodology; reviews of techniques and methods for inspection and repair of containment metallic pressure boundaries; evaluation of high-frequency acoustic imaging, magnetostrictive sensor, electromagnetic acoustic transducer, and multimode guided plate wave technologies for inspection of inaccessible regions of containment metallic pressure boundaries; development of a continuum damage mechanics-based approach for structural deterioration; establishment of a methodology for reliability-based condition assessments of steel containments and liners; and fragility assessments of steel containments with localized corrosion. In addition, data and information assembled under this program has been transferred to the technical community through review meetings and briefings, national and international conference participation, technical committee involvement, and publications of reports and journal articles. Appendix A provides a listing of program reports, papers, and publications; and Appendix B contains a listing of program-related presentations.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Analysis and methodology for aeronautical systems technology program planning
NASA Technical Reports Server (NTRS)
White, M. J.; Gershkoff, I.; Lamkin, S.
1983-01-01
A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
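A minimal sketch of the rank-ordering calculation described above, assuming each concept is reduced to a single benefit figure and a single cost figure (field names and values are illustrative, not those of the NASA program):

```python
# Rank candidate system concepts by benefit-to-cost ratio and report the
# cumulative benefit-to-cost ratio in the preferred order of implementation.
def rank_concepts(concepts):
    """concepts: list of dicts with 'name', 'benefit' and 'cost' in the same monetary units."""
    ordered = sorted(concepts, key=lambda c: c["benefit"] / c["cost"], reverse=True)
    cum_benefit = cum_cost = 0.0
    table = []
    for c in ordered:
        cum_benefit += c["benefit"]
        cum_cost += c["cost"]
        table.append((c["name"], c["benefit"] / c["cost"], cum_benefit / cum_cost))
    return table

# hypothetical concepts, in millions of dollars
for name, bc, cum_bc in rank_concepts([{"name": "datalink", "benefit": 120.0, "cost": 30.0},
                                       {"name": "wingtip devices", "benefit": 80.0, "cost": 40.0}]):
    print(f"{name}: B/C = {bc:.2f}, cumulative B/C = {cum_bc:.2f}")
```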
Investigation of culvert hydraulics related to juvenile fish passage. Final research report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, M.E.; Downs, R.C.
1996-01-01
Culverts often create barriers to the upstream migration of juvenile fish. The objective of this study was to determine the hydraulic characteristics of culverts under different flow conditions. Methods of predicting flow profiles were developed by both Chiu and Mountjoy, and the two equations were compared to experimental results. An area of flow corresponding to a predetermined allowable velocity can be calculated using the Mountjoy equation; this can then be used in the design of culverts as a fish-passage guideline. The report contains a summary of background information, the experimental methodology, the results of the experimental tests, and an analysis of both the Chiu and Mountjoy equations.
GPFA-AB_Phase1ReservoirTask2DataUpload
Teresa E. Jordan
2015-10-22
This submission to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin. The files included in this zip file contain all data pertinent to the methods and results of this task's output, which is a cohesive multi-state map of all known potential geothermal reservoirs in our region, ranked by their potential favorability. Favorability is quantified using a new metric, the Reservoir Productivity Index, as explained in the Reservoirs Methodology Memo (included in the zip file). Shapefiles and images of the Reservoir Productivity and Reservoir Uncertainty are included as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1976-07-01
The memorandum details the survey design and methodology employed in connection with a research effort that examined the role of individuals' attitudes and perceptions in deciding whether or not to carpool. The study was based upon a survey of commuters in 3 major urban areas and has resulted in a sizeable body of new data on respondents' socio-economic and worktrip characteristics, travel perceptions, and travel preferences. The memorandum includes a copy of the survey instrument. An overview of the findings, conclusions, and recommendations of this research is contained in the Summary Report, PB-261825, also available through NTIS.
Soft Fruit Traceability in Food Matrices using Real-Time PCR
Palmieri, Luisa; Bozza, Elisa; Giongo, Lara
2009-01-01
Food product authentication provides a means of monitoring and identifying products for consumer protection and regulatory compliance. There is a scarcity of analytical methods for confirming the identity of fruit pulp in products containing Soft Fruit. In the present work we have developed a highly sensitive qualitative and quantitative method to determine the presence of berry DNA in different food matrices. To our knowledge, this is the first study that shows the applicability of melting curve analysis and multiplexed fluorescent probes, in a Real-Time PCR platform, to Soft Fruit traceability. This methodology aims to protect the consumer from label misrepresentation. PMID:22253987
Set this house on fire: the self-analysis of Raymond Carver.
Tutter, Adele
2011-10-01
The convergence of features of Raymond Carver's short-story oeuvre and of psychoanalytic methodology suggests that Carver's writing served as the fulcrum and focus of a self-analytic experience. Within this model, his stories function as container and mirror of myriad aspects of the writer's self. Tracing the developmental arc of the contextual meanings of one motif--fire--through six stories and their ur-texts demonstrates gains comparable to certain analytic goals, including enhanced integration, accountability, and self-awareness. Over time, Carver's narratives of rage, impotence, and despair give way to a new story: of mourning, forgiveness, and the rekindling of hope.
A New Methodology for Turbulence Modelers Using DNS Database Analysis
NASA Technical Reports Server (NTRS)
Parneix, S.; Durbin, P.
1996-01-01
Many industrial applications in such fields as aeronautical, mechanical, thermal, and environmental engineering involve complex turbulent flows containing global separations and subsequent reattachment zones. Accurate prediction of these phenomena is very important because separations influence the whole fluid flow and may have an even bigger impact on surface heat transfer. In particular, reattaching flows are known to be responsible for large local variations of the wall heat transfer coefficient as well as for modifying the overall heat transfer. For incompressible, non-buoyant situations, the fluid mechanics have to be accurately predicted in order to obtain a good resolution of the temperature field.
Brain imaging registry for neurologic diagnosis and research
NASA Astrophysics Data System (ADS)
Hoo, Kent S., Jr.; Wong, Stephen T. C.; Knowlton, Robert C.; Young, Geoffrey S.; Walker, John; Cao, Xinhua; Dillon, William P.; Hawkins, Randall A.; Laxer, Kenneth D.
2002-05-01
The purpose of this paper is to demonstrate the importance of building a brain imaging registry (BIR) on top of existing medical information systems, including the Picture Archiving and Communication System (PACS) environment. We describe the design framework for a cluster of data marts whose purpose is to provide clinicians and researchers efficient access to a large volume of raw and processed patient images and associated data originating from multiple operational systems over time and spread across different hospital departments and laboratories. The framework is designed using object-oriented analysis and design methodology. The BIR data marts each contain complete image and textual data relating to patients with a particular disease.
DOE Office of Scientific and Technical Information (OSTI.GOV)
James Francfort; Kevin Morrow; Dimitri Hochard
2007-02-01
This report documents efforts to develop a computer tool for modeling the economic payback of comparative airport ground support equipment (GSE) propelled by either electric motors or gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user's manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
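The kind of arithmetic such a payback tool performs can be sketched as follows; every parameter name and number here is a placeholder rather than a value from the report:

```python
# Simple payback and avoided-emissions estimates for replacing an engine-driven
# GSE unit with an electric one. Placeholder inputs only.
def simple_payback_years(price_premium, annual_fuel_cost, annual_electricity_cost,
                         annual_maintenance_savings=0.0):
    """Years to recover the purchase-price premium of an electric GSE unit."""
    annual_savings = (annual_fuel_cost - annual_electricity_cost) + annual_maintenance_savings
    return float("inf") if annual_savings <= 0 else price_premium / annual_savings

def avoided_tailpipe_emissions(annual_fuel_litres, emission_factor_kg_per_litre):
    """Tailpipe emissions (kg/yr) avoided by retiring the internal combustion unit."""
    return annual_fuel_litres * emission_factor_kg_per_litre

print(simple_payback_years(25000, 9000, 2500, 1500))   # ~3.1 years with these toy numbers
print(avoided_tailpipe_emissions(12000, 2.7))          # kg/yr for a toy emission factor
```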
NASA Astrophysics Data System (ADS)
Alperovich, Leonid; Averbuch, Amir; Eppelbaum, Lev; Zheludev, Valery
2013-04-01
Karst areas occupy about 14% of the world's land. Karst terranes of different origins create difficult conditions for construction, industrial activity and tourism, and are a source of heightened danger for the environment. Mapping of karst (sinkhole) hazards will clearly be one of the most significant problems of engineering geophysics in the 21st century. Taking into account the complexity of geological media, unfavourable environments and the known ambiguity of geophysical data analysis, examination with a single geophysical method may be insufficient. Wavelet methodology as a whole has a significant impact on cardinal problems of geophysical signal processing, such as denoising of signals, enhancement of signals, distinguishing signals with closely related characteristics, and integrated analysis of different geophysical fields (satellite, airborne, earth-surface or underground observations). We developed a three-phase approach to the integrated geophysical localization of subsurface karst (the same approach could be used for subsequent monitoring of karst dynamics). The first phase consists of modeling to compute the various geophysical effects that characterize karst phenomena. The second phase develops the signal processing approaches for analyzing profile or areal geophysical observations. Finally, the third phase integrates these methods to create a new method for the combined interpretation of different geophysical data. Our combined geophysical analysis builds on modern developments in wavelet techniques for signal and image processing. The integrated methodology of geophysical field examination will enable recognition of karst terranes even at a small signal-to-noise ratio in complex geological environments. For analyzing the geophysical data, we used a technique based on an algorithm that characterizes a geophysical image by a limited number of parameters. This set of parameters serves as a signature of the image and is used to discriminate images containing a karst cavity (K) from images not containing karst (N). The constructed algorithm consists of the following main phases: (a) collection of the database, (b) characterization of geophysical images, and (c) dimensionality reduction. Each image is then characterized by the histogram of coherency directions. As a result of the previous steps we obtain two sets, K and N, of signature vectors for images from sections containing karst cavities and non-karst subsurface, respectively.
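A hedged sketch of the signature idea (bin count, dimensionality, and classifier are assumptions, not the authors' settings): each image is summarised by a histogram of gradient (coherency) directions, the histograms are reduced with PCA, and a classifier is trained on the K and N sets.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

def direction_histogram(image, bins=36):
    """Signature of a 2-D geophysical image: histogram of local orientation angles."""
    gy, gx = np.gradient(image.astype(float))
    angles = np.arctan2(gy, gx)                       # local orientation of the field
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi),
                           weights=np.hypot(gx, gy))  # weight by gradient magnitude
    return hist / (hist.sum() + 1e-12)

def train_karst_classifier(images_K, images_N, n_components=10):
    """images_K / images_N: lists of 2-D arrays with and without karst cavities."""
    X = np.array([direction_histogram(im) for im in images_K + images_N])
    y = np.array([1] * len(images_K) + [0] * len(images_N))   # 1 = contains karst
    # n_components must not exceed the number of training images
    model = make_pipeline(PCA(n_components=n_components), KNeighborsClassifier(n_neighbors=3))
    return model.fit(X, y)
```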
Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.
DOT National Transportation Integrated Search
1979-09-01
This last volume includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (its engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. IBERDROLA has an in-house design and licensing reload methodology available that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analyses included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the TLFW transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes Retran Model is included as an example of this process. (authors)
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Solid lubrication design methodology, phase 2
NASA Technical Reports Server (NTRS)
Pallini, R. A.; Wedeven, L. D.; Ragen, M. A.; Aggarwal, B. B.
1986-01-01
The high-temperature performance of solid-lubricated rolling elements was evaluated with a specially designed traction (friction) test apparatus. Graphite lubricants containing three additives (silver, phosphate glass, and zinc orthophosphate) were evaluated from room temperature to 540 C. Two hard coats were also evaluated. The evaluation of these lubricants, using a burnishing method of application, shows a reasonable transfer of lubricant and wear protection for short-duration testing except in the 200 C temperature range. The graphite lubricants containing silver and zinc orthophosphate additives were more effective than the phosphate glass material over the test conditions examined. Traction coefficients ranged from a low of 0.07 to a high of 0.6. By curve fitting the traction data, empirical equations for the slope and maximum traction coefficient as a function of contact pressure (P), rolling speed (U), and temperature (T) can be developed for each lubricant. A solid lubricant traction model was incorporated into an advanced bearing analysis code (SHABERTH). For comparison purposes, preliminary heat generation calculations were made for both oil- and solid-lubricated bearing operation. A preliminary analysis indicated significantly higher heat generation for a solid-lubricated ball bearing in a deep groove configuration. An analysis of a cylindrical roller bearing configuration showed the potential for a low-friction solid-lubricated bearing.
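The abstract does not give the functional form of those empirical equations, so the sketch below assumes a simple power-law surrogate for the maximum traction coefficient and fits it to toy data; the form, the data, and the fitted coefficients are all illustrative.

```python
# Fit mu_max = a * P**b * U**c * exp(d*T) to measured traction data.
import numpy as np
from scipy.optimize import curve_fit

def mu_max_model(X, a, b, c, d):
    P, U, T = X
    return a * P**b * U**c * np.exp(d * T)

# toy data: contact pressure (GPa), rolling speed (m/s), temperature (C) -> traction coefficient
P = np.array([1.0, 1.5, 2.0, 1.0, 1.5, 2.0])
U = np.array([5.0, 5.0, 5.0, 10.0, 10.0, 10.0])
T = np.array([25.0, 200.0, 540.0, 25.0, 200.0, 540.0])
mu = np.array([0.30, 0.22, 0.15, 0.25, 0.18, 0.12])

params, _ = curve_fit(mu_max_model, (P, U, T), mu, p0=(0.3, -0.5, -0.1, -0.001))
print(dict(zip("abcd", params)))
```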
Rizal, Datu; Tani, Shinichi; Nishiyama, Kimitoshi; Suzuki, Kazuhiko
2006-10-11
In this paper, a novel methodology for batch plant safety and reliability analysis is proposed using a dynamic simulator. A batch process involving several safety objects (e.g. sensors, controllers, valves, etc.) is activated during the operational stage. The performance of the safety objects is evaluated by dynamic simulation and a fault propagation model is generated. Using the fault propagation model, an improved fault tree analysis (FTA) method using switching signal mode (SSM) is developed for estimating the probability of failures. Time-dependent failures can be treated as unavailability of safety objects that can cause accidents in a plant. Finally, the ranking of safety objects is formulated as a performance index (PI) that can be estimated using importance measures. The PI gives the prioritization of safety objects that should be investigated in the plant's safety improvement program. The output of this method can be used to set optimal policy for safety object improvement and maintenance. The dynamic simulator was constructed using Visual Modeler (VM, the plant simulator developed by Omega Simulation Corp., Japan). A case study focuses on a loss of containment (LOC) incident in a polyvinyl chloride (PVC) batch process, which consumes the hazardous material vinyl chloride monomer (VCM).
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
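One common way to aggregate several experts' judgments into a single distribution is a weighted mixture (linear opinion pool); the sketch below illustrates that step only, under the assumption that each expert's judgment is available as a set of samples, and is not the documented ten-phase procedure.

```python
import numpy as np

def aggregate_experts(samples_per_expert, weights=None, n_draws=10_000, seed=0):
    """samples_per_expert: list of 1-D arrays, each drawn from one expert's distribution."""
    rng = np.random.default_rng(seed)
    k = len(samples_per_expert)
    weights = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)
    weights = weights / weights.sum()
    # pick an expert for each draw according to its (calibration) weight, then resample from it
    chosen = rng.choice(k, size=n_draws, p=weights)
    return np.array([rng.choice(samples_per_expert[i]) for i in chosen])

# two hypothetical experts on an engine parameter, one weighted more heavily after calibration
pooled = aggregate_experts([np.random.normal(452.0, 3.0, 5000),
                            np.random.normal(448.0, 6.0, 5000)], weights=[0.7, 0.3])
print(pooled.mean(), np.percentile(pooled, [5, 95]))
```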
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
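At its core the approach propagates parameter uncertainty through a deterministic failure model to obtain a failure probability for a given failure mode; the Monte Carlo sketch below illustrates that idea with a placeholder S-N style fatigue model and made-up distributions, not the documented PFA models.

```python
import numpy as np

def failure_probability(n_samples=100_000, service_cycles=5.0e4, seed=1):
    """Fraction of sampled parameter sets whose predicted life falls short of the service life."""
    rng = np.random.default_rng(seed)
    stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n_samples)          # MPa
    fatigue_coeff = rng.lognormal(mean=np.log(2.0e12), sigma=0.30, size=n_samples)  # model uncertainty
    exponent = rng.normal(3.0, 0.1, size=n_samples)
    cycles_to_failure = fatigue_coeff / stress**exponent     # simple S-N style life model
    return np.mean(cycles_to_failure < service_cycles)

print(failure_probability())   # estimated probability of the fatigue failure mode
```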
Mokel, Melissa Jennifer; Shellman, Juliette M
2013-01-01
Many instruments used to measure religious involvement often (a) contain unclear, poorly developed constructs; (b) lack methodological rigor in scale development; and (c) contain language and content culturally incongruent with the religious experiences of diverse ethnic groups. The primary aims of this review were to (a) synthesize the research on instruments designed to measure religious involvement, (b) evaluate the methodological quality of instruments that measure religious involvement, and (c) examine these instruments for conceptual congruency with African American religious involvement. An updated integrative research review method guided the process (Whittemore & Knafl, 2005). In total, 152 articles were reviewed and 23 articles retrieved. Only 3 retained instruments were developed under methodologically rigorous conditions. All 3 instruments were congruent with a conceptual model of African American religious involvement. The Fetzer Multidimensional Measure of Religious Involvement and Spirituality (FMMRS; Idler et al., 2003) was found to have favorable characteristics. Further examination and psychometric testing is warranted to determine its acceptability, readability, and cultural sensitivity in an African American population.
A self-contained, automated methodology for optimal flow control validated for transition delay
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, R. A.; Erlebacher, Gordon; Hussaini, M. Yousuff
1995-01-01
This paper describes a self-contained, automated methodology for flow control along with a validation of the methodology for the problem of boundary layer instability suppression. The objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow, e.g., Blasius boundary layer. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The present approach couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields, and control, e.g., actuators, may be determined. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc.
Hexographic Method of Complex Town-Planning Terrain Estimate
NASA Astrophysics Data System (ADS)
Khudyakov, A. Ju
2017-11-01
The article deals with the vital problem of complex town-planning analysis based on the "hexographic" graphic-analytic method, compares it with conventional terrain estimate methods, and contains examples of the method's application. It describes the author's procedure for estimating restrictions and building a mathematical model that reflects not only conventional town-planning restrictions, but also social and aesthetic aspects of the analyzed territory. The method allows one to quickly get an idea of a territory's potential, and an unlimited number of estimation factors can be used. The method can be used for the integrated assessment of urban areas, and it is also possible to use it for preliminary evaluation of a territory's commercial attractiveness in the preparation of investment projects. The technique produces simple, informative graphics whose interpretation is straightforward for experts; a definite advantage is that the results can also be readily grasped by non-specialists. Thus, it is possible to build a dialogue between professionals and the public on a new level, allowing the interests of various parties to be taken into account. At the moment, the method is used as a tool for the preparation of integrated urban development projects at the Department of Architecture in the Federal State Autonomous Educational Institution of Higher Education "South Ural State University (National Research University)", FSAEIHE SUSU (NRU). The methodology is included in a course of lectures as material on architectural and urban design for architecture students. The same methodology was successfully tested in the preparation of business strategies for the development of some territories in the Chelyabinsk region. This publication is the first in a series of planned activities developing and describing the methodology of hexographical analysis in urban and architectural practice. It is also planned to create a software product that allows one to automate the process of site assessment on the basis of the methodology.
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
The user's manual for the rocket combustor interactive design (ROCCID) computer program is presented. The program, written in Fortran 77, provides a standardized methodology using state-of-the-art codes and procedures for the analysis of a liquid rocket engine combustor's steady-state combustion performance and combustion stability. ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like-doublet or unlike-triplet, showerhead, shear coaxial, and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID, and the properties of other propellants can easily be added. The analysis model in ROCCID can account for the influence of acoustic cavities, Helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady-state performance and combustion stability of this design are evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
A Karnaugh map based approach towards systemic reviews and meta-analysis.
Hassan, Abdul Wahab; Hassan, Ahmad Kamal
2016-01-01
Meta-analyses and systemic reviews have long helped us draw conclusions from numerous parallel or conflicting studies. Existing studies are presented in tabulated forms which contain appropriate information for specific cases yet are difficult to visualize. On meta-analysis of data, this can lead to absorption and subsumption errors, which in turn carry an undesirable potential for consecutive misunderstandings in social and operational methodologies. The purpose of this study is to investigate an alternative forum for meta-data presentation that relies on humans' strong pictorial perception capability. Analysis of big data is assumed to be a complex and daunting task often reserved for the computational powers of machines, yet there exist mapping tools which can analyze such data in a hand-held manner. Data analysis on such a scale can benefit from the use of statistical tools like Karnaugh maps, where all studies can be put together in a graph-based mapping. Such a formulation can lead to more control in observing patterns in the research community and to further analysis of uncertainty and reliability metrics. We present a methodological process of converting a well-established study in health care to its equivalent binary representation, followed by furnishing the values onto a Karnaugh map. The data used for the studies presented herein are from Burns et al (J Publ Health 34(1):138-148, 2011), consisting of retrospectively collected data sets from various studies on clinical coding data accuracy. Using a customized filtration process, a total of 25 studies were selected for review with no, partial, or complete knowledge of six independent variables, thus forming 64 independent cells on a Karnaugh map. The study concluded that this pictorial graphing, as expected, helped simplify the overview of meta-analysis and systemic reviews.
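The mapping from a reviewed study to a Karnaugh-map cell can be sketched as follows, assuming each study has been reduced to six binary indicators (one per independent variable); Gray-code ordering of rows and columns preserves the K-map adjacency property.

```python
def gray_code(n_bits):
    """Gray-code sequence of all n-bit values, e.g. gray_code(3) = [0, 1, 3, 2, 6, 7, 5, 4]."""
    return [i ^ (i >> 1) for i in range(1 << n_bits)]

def kmap_cell(indicators):
    """indicators: six 0/1 flags for a study; returns its (row, col) on an 8x8 Karnaugh map."""
    assert len(indicators) == 6
    row_val = int("".join(map(str, indicators[:3])), 2)   # first three variables index the row
    col_val = int("".join(map(str, indicators[3:])), 2)   # last three variables index the column
    return gray_code(3).index(row_val), gray_code(3).index(col_val)

print(kmap_cell([1, 0, 1, 0, 1, 1]))   # one study -> one of the 64 cells
```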
Representation of scientific methodology in secondary science textbooks
NASA Astrophysics Data System (ADS)
Binns, Ian C.
The purpose of this investigation was to assess the representation of scientific methodology in secondary science textbooks. More specifically, this study looked at how textbooks introduced scientific methodology and to what degree the examples from the rest of the textbook, the investigations, and the images were consistent with the text's description of scientific methodology, if at all. The sample included eight secondary science textbooks from two publishers, McGraw-Hill/Glencoe and Harcourt/Holt, Rinehart & Winston. Data consisted of all student text and teacher text that referred to scientific methodology. Second, all investigations in the textbooks were analyzed. Finally, any images that depicted scientists working were also collected and analyzed. The text analysis and activity analysis used the ethnographic content analysis approach developed by Altheide (1996). The rubrics used for the text analysis and activity analysis were initially guided by the Benchmarks (AAAS, 1993), the NSES (NRC, 1996), and the nature of science literature. Preliminary analyses helped to refine each of the rubrics and grounded them in the data. Image analysis used stereotypes identified in the DAST literature. Findings indicated that all eight textbooks presented mixed views of scientific methodology in their initial descriptions. Five textbooks placed more emphasis on the traditional view and three placed more emphasis on the broad view. Results also revealed that the initial descriptions, examples, investigations, and images all emphasized the broad view for Glencoe Biology and the traditional view for Chemistry: Matter and Change. The initial descriptions, examples, investigations, and images in the other six textbooks were not consistent. Overall, the textbook with the most appropriate depiction of scientific methodology was Glencoe Biology and the textbook with the least appropriate depiction of scientific methodology was Physics: Principles and Problems. These findings suggest that compared to earlier investigations, textbooks have begun to improve in how they represent scientific methodology. However, there is still much room for improvement. Future research needs to consider how textbooks impact teachers' and students' understandings of scientific methodology.
Variable thickness transient ground-water flow model. Volume 3. Program listings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reisenauer, A.E.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of three volumes of the description of the VTT (Variable Thickness Transient) Groundwater Hydrologic Model: second level (intermediate complexity) two-dimensional saturated groundwater flow.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, T.L.
1993-01-01
This report discusses probabilistic fracture mechanics (PFM) analysis, which is a major element of the comprehensive probabilistic methodology endorsed by the NRC for evaluation of the integrity of Pressurized Water Reactor (PWR) pressure vessels subjected to pressurized-thermal-shock (PTS) transients. It is anticipated that there will be an increasing need for an improved and validated PTS PFM code which is accepted by the NRC and utilities, as more plants approach the PTS screening criteria and are required to perform plant-specific analyses. The NRC-funded Heavy Section Steel Technology (HSST) Program at Oak Ridge National Laboratories is currently developing the FAVOR (Fracture Analysis of Vessels: Oak Ridge) PTS PFM code, which is intended to meet this need. The FAVOR code incorporates the most important features of both OCA-P and VISA-II and contains some new capabilities such as PFM global modeling methodology; the capability to approximate the effects of thermal streaming on circumferential flaws located inside a plume region created by fluid and thermal stratification; a library of stress intensity factor influence coefficients, generated by the NQA-1 certified ABAQUS computer code, for an adequate range of two- and three-dimensional inside surface flaws; the flexibility to generate a variety of output reports; and user friendliness.
Fernández-Fernández, Mario; Rodríguez-González, Pablo; García Alonso, J Ignacio
2016-10-01
We have developed a novel, rapid and easy calculation procedure for Mass Isotopomer Distribution Analysis based on multiple linear regression which allows the simultaneous calculation of the precursor pool enrichment and the fraction of newly synthesized labelled proteins (fractional synthesis) using linear algebra. To test this approach, we used the peptide RGGGLK as a model tryptic peptide containing three subunits of glycine. We selected glycine labelled at two ¹³C atoms (¹³C₂-glycine) as the labelled amino acid to demonstrate that spectral overlap is not a problem in the proposed methodology. The developed methodology was tested first in vitro by changing the precursor pool enrichment from 10 to 40% ¹³C₂-glycine. Secondly, a simulated in vivo synthesis of proteins was designed by combining the natural abundance RGGGLK peptide and 10 or 20% ¹³C₂-glycine at 1:1, 1:3 and 3:1 ratios. Precursor pool enrichments and fractional synthesis values were calculated with satisfactory precision and accuracy using a simple spreadsheet. This novel approach can provide a relatively rapid and easy means to measure protein turnover based on stable isotope tracers. Copyright © 2016 John Wiley & Sons, Ltd.
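A much-simplified sketch of the underlying linear-algebra idea (not the authors' exact formulation): treat the measured mass isotopomer distribution of the peptide as a linear mixture of basis patterns, one for the pre-existing (unlabelled) peptide and one per candidate precursor enrichment, and solve for the mixing fractions by non-negative least squares. The binomial incorporation model and the crude natural-abundance stand-in are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import nnls
from scipy.stats import binom

def labelled_basis(enrichment, n_gly=3, n_isotopomers=4):
    """M+0, M+2, M+4, ... pattern if each of n_gly glycines carries 13C2 with probability 'enrichment'."""
    return np.array([binom.pmf(k, n_gly, enrichment) for k in range(n_isotopomers)])

def fit_mida(measured, candidate_enrichments=(0.1, 0.2, 0.3, 0.4)):
    measured = np.asarray(measured, float)
    natural = np.zeros_like(measured); natural[0] = 1.0    # crude stand-in for the unlabelled pattern
    basis = np.column_stack([natural] + [labelled_basis(p, n_isotopomers=len(measured))
                                         for p in candidate_enrichments])
    coeffs, _ = nnls(basis, measured)                      # non-negative mixing fractions
    fractional_synthesis = coeffs[1:].sum() / (coeffs.sum() + 1e-12)
    best_enrichment = candidate_enrichments[int(np.argmax(coeffs[1:]))]
    return fractional_synthesis, best_enrichment
```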
Lüthy, Monique; Wheldon, Mary C; Haji-Cheteh, Chehasnah; Atobe, Masakazu; Bond, Paul S; O'Brien, Peter; Hubbard, Roderick E; Fairlamb, Ian J S
2015-06-01
Synthetic routes to six 3-D scaffolds containing piperazine, pyrrolidine and piperidine cores have been developed. The synthetic methodology focused on the use of N-Boc α-lithiation-trapping chemistry. Notably, suitably protected and/or functionalised medicinal chemistry building blocks were synthesised via concise, connective methodology. This represents a rare example of lead-oriented synthesis. A virtual library of 190 compounds was then enumerated from the six scaffolds. Of these, 92 compounds (48%) fit the lead-like criteria of: (i) -1 ⩽ AlogP ⩽ 3; (ii) 14 ⩽ number of heavy atoms ⩽ 26; (iii) total polar surface area ⩾ 50 Å². The 3-D shapes of the 190 compounds were analysed using a triangular plot of normalised principal moments of inertia (PMI). From this, 46 compounds were identified which had lead-like properties and possessed 3-D shapes in under-represented areas of pharmaceutical space. Thus, the PMI analysis of the 190-member virtual library showed that, whilst the scaffolds may appear on paper to be 3-D in shape, only 24% of the compounds actually had 3-D structures in the more interesting areas of 3-D drug space. Copyright © 2015 Elsevier Ltd. All rights reserved.
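The PMI plot referred to above places each conformer on a rod-disc-sphere triangle using its two normalised principal moments of inertia; a minimal sketch of that computation, assuming atomic coordinates and masses are available for a 3-D conformer, is:

```python
import numpy as np

def normalised_pmi(coords, masses):
    """coords: (N, 3) Cartesian coordinates; masses: (N,). Returns (I1/I3, I2/I3)."""
    coords = coords - np.average(coords, axis=0, weights=masses)   # move to centre-of-mass frame
    x, y, z = coords.T
    Ixx = np.sum(masses * (y**2 + z**2)); Iyy = np.sum(masses * (x**2 + z**2))
    Izz = np.sum(masses * (x**2 + y**2))
    Ixy = -np.sum(masses * x * y); Ixz = -np.sum(masses * x * z); Iyz = -np.sum(masses * y * z)
    inertia = np.array([[Ixx, Ixy, Ixz], [Ixy, Iyy, Iyz], [Ixz, Iyz, Izz]])
    I1, I2, I3 = np.sort(np.linalg.eigvalsh(inertia))
    return I1 / I3, I2 / I3    # (0, 1) = rod, (0.5, 0.5) = disc, (1, 1) = sphere
```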
Meta-STEPP: subpopulation treatment effect pattern plot for individual patient data meta-analysis.
Wang, Xin Victoria; Cole, Bernard; Bonetti, Marco; Gelber, Richard D
2016-09-20
We have developed a method, called Meta-STEPP (subpopulation treatment effect pattern plot for meta-analysis), to explore treatment effect heterogeneity across covariate values in the meta-analysis setting for time-to-event data when the covariate of interest is continuous. Meta-STEPP forms overlapping subpopulations from individual patient data containing similar numbers of events with increasing covariate values, estimates subpopulation treatment effects using standard fixed-effects meta-analysis methodology, displays the estimated subpopulation treatment effect as a function of the covariate values, and provides a statistical test to detect possibly complex treatment-covariate interactions. Simulation studies show that this test has adequate type-I error rate recovery as well as power when reasonable window sizes are chosen. When applied to eight breast cancer trials, Meta-STEPP suggests that chemotherapy is less effective for tumors with high estrogen receptor expression compared with those with low expression. Copyright © 2016 John Wiley & Sons, Ltd.
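A rough sketch of the windowing and pooling steps, under assumptions: patients are ordered by the covariate, overlapping subpopulations are cut from that ordering, and per-trial effect estimates within each subpopulation are pooled by inverse-variance fixed-effects meta-analysis (how the per-trial log hazard ratios are obtained is outside this sketch).

```python
import numpy as np

def fixed_effects_pool(effects, variances):
    """Inverse-variance weighted pooled estimate and its variance."""
    w = 1.0 / np.asarray(variances, float)
    pooled = np.sum(w * np.asarray(effects, float)) / np.sum(w)
    return pooled, 1.0 / np.sum(w)

def overlapping_windows(covariate, window=200, step=100):
    """Boolean masks for overlapping subpopulations ordered by covariate value."""
    covariate = np.asarray(covariate)
    order = np.argsort(covariate)
    masks = []
    for start in range(0, max(1, len(covariate) - window + 1), step):
        mask = np.zeros(len(covariate), bool)
        mask[order[start:start + window]] = True
        masks.append(mask)
    return masks
```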
From LIDAR Scanning to 3d FEM Analysis for Complex Surface and Underground Excavations
NASA Astrophysics Data System (ADS)
Chun, K.; Kemeny, J.
2017-12-01
Light detection and ranging (LIDAR) is a prevalent remote-sensing technology applied in the geological fields due to its high precision and ease of use. One of the major applications is to use the detailed geometrical information of underground structures as a basis for generating three-dimensional numerical models that can be used in FEM analysis. To date, however, straightforward techniques for reconstructing numerical models from scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach integrating LIDAR scanning with finite element numerical analysis, specifically converting LIDAR 3D point clouds of objects containing complex surface geometry into a finite element model. This methodology has been applied to the Kartchner Caverns in Arizona for stability analysis. Numerical simulations were performed using the finite element code ABAQUS. The results indicate that the LIDAR-based workflow is effective and provides a reference for other similar engineering projects in practice.
Comparison between two methodologies for urban drainage decision aid.
Moura, P M; Baptista, M B; Barraud, S
2006-01-01
The objective of the present work is to compare two methodologies based on multicriteria analysis for the evaluation of stormwater systems. The first methodology was developed in Brazil and is based on performance-cost analysis, the second one is ELECTRE III. Both methodologies were applied to a case study. Sensitivity and robustness analyses were then carried out. These analyses demonstrate that both methodologies have equivalent results, and present low sensitivity and high robustness. These results prove that the Brazilian methodology is consistent and can be used safely in order to select a good solution or a small set of good solutions that could be compared with more detailed methods afterwards.
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the Proceedings contains the following 17 papers: "Extra! Extra! Read All About It: Attention and Memory for Deviant and Imagistic Headlines" (Jennifer Borse and others); "Refining a Uses and Gratification Scale for Television Viewing" (Jennifer Greer, Cyndi Frisby, and David…
ERIC Educational Resources Information Center
2002
The Theory and Methodology Division of the proceedings contains the following 16 papers: "The Deep Audit as an Epistemology for the Watchdog: Computer-assisted Reporting and Investigative Journalism" (John E. Newhagen); "Race and Class in 1980s Hollywood" (Chris Jordan); "The Impact of Website Campaigning on Traditional…
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the proceedings contains the following 12 selected papers: "Innovativeness and Perceptions of Faculty Innovation Champions on the Diffusion of World Wide Web Course Features" (Patrick J. Sutherland); "A Communication 'Mr. Fit'? Living with No Significant Difference" (Fiona…
Rita C.L.B. Rodrigues; William R. Kenealy; Diane Dietrich; Thomas W. Jeffries
2012-01-01
Response surface methodology (RSM), based on a 2² full factorial design, evaluated the moisture effects in recovering xylose by diethyloxalate (DEO) hydrolysis. Experiments were carried out in laboratory reactors (10 mL glass ampoules) containing corn stover (0.5 g) properly ground. The ampoules were kept at 160 °C for 90 min. Both DEO...
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the Proceedings contains the following 14 papers: "Press Releases and the 'Bscore': New Statistical Measurement Explored" (Lee Bollinger); "A Systematic Approach to Analyzing the Structure of News Texts" (Michael Schmierbach); "Setting the Proximity Frame: Distance as an…
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The communication theory and methodology section of the Proceedings contains the following 20 papers: "Political Adwatches and the Third-Person Effect" (Ekaterina Ognianova and others); "Understanding Adopters of Audio Information Services" (Kimberly A. Neuendorf and others); "A Principal-Agent Approach to the Study of…
ERIC Educational Resources Information Center
Smith, Peter; Dalton, Jennifer; Henry, John
2005-01-01
This document was produced by the author(s) based on their research for the Australian report, "Accommodating Learning Styles: Relevance and Good Practice in Vocational Education and Training," and contains three parts. Part 1, Research Methodology and Findings (Peter Smith and Jennifer Dalton), contains: (1) Research Questions; (2)…
Communication via Chalkboard and on Paper. TLA-100.00 (F.U.F.).
ERIC Educational Resources Information Center
Hardison, Margaret J.
The purpose of this cluster of learning modules is to increase the teacher intern's understanding and skills with regard to his role as a communicator. The cluster contains three modules: (a) objectives for teaching handwriting, (b) methodology of manuscript writing, and (c) practice teaching of manuscript handwriting. Each module contains a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Que Hee, S.S.; Peace, B.; Clark, C.S.
Efficient sampling methods to recover lead-containing house dust and hand dust have been developed so that sufficient lead is collected for analysis and to ensure that correlational analyses linking these two parameters to blood lead are not dependent on the efficiency of sampling. Precise collection of loose house dust from a 1-unit area (484 cm²) with a Tygon or stainless steel sampling tube connected to a portable sampling pump (1.2 to 2.5 liters/min) required repetitive sampling (three times). The Tygon tube sampling technique for loose house dust <177 µm in diameter was around 72% efficient with respect to dust weight and lead collection. A representative house dust contained 81% of its total weight in this fraction. A single handwipe for applied loose hand dust was not acceptably efficient or precise, and at least three wipes were necessary to achieve recoveries of >80% of the lead applied. House dusts of different particle sizes <246 µm adhered equally well to hands. Analysis of lead-containing material usually required at least three digestions/decantations using hot plate or microwave techniques to allow at least 90% of the lead to be recovered. It was recommended that other investigators validate their handwiping, house dust sampling, and digestion techniques to facilitate comparison of results across studies. The final methodology for the Cincinnati longitudinal study was three sampling passes for surface dust using a stainless steel sampling tube; three microwave digestions/decantations for analysis of dust and paint; and three wipes with handwipes, with one digestion/decantation for the analysis of six handwipes together.
Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis
ERIC Educational Resources Information Center
Kover, Sara T.; Atwood, Amy K.
2013-01-01
This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.
Oliver, Penelope; Cicerale, Sara; Pang, Edwin; Keast, Russell
2018-04-01
Temporal dominance of sensations (TDS) is a rapid descriptive method that offers a different magnitude of information to traditional descriptive analysis methodologies. This methodology considers the dynamic nature of eating, assessing sensory perception of foods as it changes throughout the eating event. Limited research has applied the TDS methodology to strawberries and subsequently validated the results against Quantitative Descriptive Analysis (QDA™). The aim of this research is to compare the TDS methodology using an untrained consumer panel with the results obtained via QDA™ with a trained sensory panel. The trained panelists (n = 12, minimum 60 hr each panelist) were provided with six strawberry samples (three cultivars at two maturation levels) and applied QDA™ techniques to profile each strawberry sample. Untrained consumers (n = 103) were provided with six strawberry samples (three cultivars at two maturation levels) and used the TDS methodology to assess the dominant sensations for each sample as they change over time. Results revealed moderately comparable product configurations produced via TDS in comparison to QDA™ (RV coefficient = 0.559), as well as similar application of the sweet attribute (correlation coefficient of 0.895 at first bite). The TDS methodology, however, was not in agreement with the QDA™ methodology regarding more complex flavor terms. These findings support the notion that the lack of training on the definition of terms, together with the methodology's requirement to ignore all attributes other than the dominant one, provides a different magnitude of information than the QDA™ methodology. A comparison of TDS to traditional descriptive analysis indicates that TDS provides additional information to QDA™ regarding the lingering component of eating, whereas the QDA™ results provide more precise detail regarding singular attributes. Therefore, the TDS methodology has an application in industry when it is important to understand the lingering profile of products; however, it should not be employed as a replacement for traditional descriptive analysis methods. © 2018 Institute of Food Technologists®.
D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco
2016-02-01
Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It supports benchmarking and variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
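The standard-cost logic described above can be illustrated with a toy calculation (all figures and stream names are made up): the full cost of each stream is its standard unit cost times the actual collected quantity, and the variance against the booked cost flags off-standard performance.

```python
def full_cost(standard_cost_per_tonne, actual_tonnes):
    """Standard collection cost per waste stream = standard unit cost x actual quantity."""
    return {stream: standard_cost_per_tonne[stream] * actual_tonnes[stream]
            for stream in standard_cost_per_tonne}

def cost_variance(standard_costs, actual_costs):
    """Positive variance = actual spending above the standard (off-standard performance)."""
    return {stream: actual_costs[stream] - standard_costs[stream] for stream in standard_costs}

standard = full_cost({"paper": 95.0, "glass": 70.0, "residual": 120.0},
                     {"paper": 1200, "glass": 800, "residual": 5000})
print(cost_variance(standard, {"paper": 118000, "glass": 54000, "residual": 612000}))
```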
Impact of uncertainty on modeling and testing
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Brown, Kendall K.
1995-01-01
A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses the specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relevant to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
De Luca, Michele; Restuccia, Donatella; Clodoveo, Maria Lisa; Puoci, Francesco; Ragno, Gaetano
2016-07-01
Chemometric discrimination of extra virgin olive oils (EVOO) from whole and stoned olive pastes was carried out by using Fourier transform infrared (FTIR) data and partial least squares-discriminant analysis (PLS1-DA) approach. Four Italian commercial EVOO brands, all in both whole and stoned version, were considered in this study. The adopted chemometric methodologies were able to describe the different chemical features in phenolic and volatile compounds contained in the two types of oil by using unspecific IR spectral information. Principal component analysis (PCA) was employed in cluster analysis to capture data patterns and to highlight differences between technological processes and EVOO brands. The PLS1-DA algorithm was used as supervised discriminant analysis to identify the different oil extraction procedures. Discriminant analysis was extended to the evaluation of possible adulteration by addition of aliquots of oil from whole paste to the most valuable oil from stoned olives. The statistical parameters from external validation of all the PLS models were very satisfactory, with low root mean square error of prediction (RMSEP) and relative error (RE%). Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Mohan, Gyan
1969-01-01
Presents a systematization of the mathematical formulae in thermodynamics. From the set of thermodynamic variables, four equations are derived which contain the total mathematical jargon of thermodynamics. (LC)
Major Upgrades to the AIRS Version-6 Water Vapor Profile Methodology
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John; Iredell, Lena
2015-01-01
This research is a continuation of part of what was shown at the last AIRS Science Team Meeting and the AIRS 2015 NetMeeting. AIRS Version 6 was finalized in late 2012 and is now operational. Version 6 contained many significant improvements in retrieval methodology compared to Version 5. Version 6 retrieval methodology used for the water vapor profile q(p) and ozone profile O3(p) retrievals is basically unchanged from Version 5, or even from Version 4. Subsequent research has made significant improvements in both water vapor and O3 profiles compared to Version 6.
Manual Therapy in the Treatment of Idiopathic Scoliosis. Analysis of Current Knowledge.
Czaprowski, Dariusz
2016-10-28
Apart from the recommended specific physiotherapy, the treatment of idiopathic scoliosis (IS) also incorporates non-specific manual therapy (NMT). The aim of this paper is to assess the efficacy of NMT (manual therapy, chiropractic, osteopathy) used in the treatment of children and adolescents with IS. The study analysed systematic reviews (Analysis 1) and other recent scientific publications (Analysis 2). Analysis 1 encompassed papers on the use of NMT in patients with IS. Works concerning specific physiotherapy (SP) or bracing (B) and other types of scoliosis were excluded from the analysis. Inclusion criteria for Analysis 2 were: treatment with NMT; subjects aged 10-18 years with IS. The following types of papers were excluded: works analysing NMT combined with SP or B, reports concerning adult patients, analyses of single cases and publications included in Analysis 1. Analysis 1: six systematic reviews contained 6 papers on the efficacy of NMT in the treatment of IS. The results of these studies are contradictory, ranging from Cobb angle reduction to no treatment effects whatsoever. The papers analysed are characterised by poor methodological quality: small group sizes, incomplete descriptions of the study groups, no follow-up and no control groups. Analysis 2: in total, 217 papers were found. None of them met the criteria set for the analysis. 1. Few papers verifying the efficacy of manual therapy, chiropractic and osteopathy in the treatment of idiopathic scoliosis have been published to date. 2. The majority are experimental studies with poor methodology or observational case studies. 3. At present, the efficacy of non-specific manual therapy in the treatment of patients with idiopathic scoliosis cannot be reliably evaluated. 4. It is necessary to conduct further research based on appropriate methods (prospective, randomised, controlled studies) in order to reliably assess the usefulness of non-specific manual therapy in the treatment of idiopathic scoliosis.
When can social media lead financial markets?
Zheludev, Ilya; Smith, Robert; Aste, Tomaso
2014-02-27
Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.
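The core quantitative step, testing whether sentiment carries ex-ante information about price moves, can be illustrated with a small hedged sketch. The simulated series, the tercile discretisation and the permutation test below are assumptions for illustration only, not the paper's estimator.

```python
# Sketch: mutual information between today's sentiment and tomorrow's price direction,
# with a permutation-based significance check. All series are simulated placeholders.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)
price = 100.0 + np.cumsum(rng.normal(size=501))
sentiment = rng.normal(size=500)              # daily sentiment score, aligned with price[:-1]

direction = (np.diff(price) > 0).astype(int)  # next-day up/down move

bins = np.quantile(sentiment, [1/3, 2/3])     # discretise sentiment into terciles
sent_disc = np.digitize(sentiment, bins)
mi = mutual_info_score(sent_disc, direction)  # in nats

null = [mutual_info_score(rng.permutation(sent_disc), direction) for _ in range(200)]
print("MI:", round(mi, 4), "permutation p-value ~", np.mean(np.array(null) >= mi))
```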
When Can Social Media Lead Financial Markets?
NASA Astrophysics Data System (ADS)
Zheludev, Ilya; Smith, Robert; Aste, Tomaso
2014-02-01
Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.
When Can Social Media Lead Financial Markets?
Zheludev, Ilya; Smith, Robert; Aste, Tomaso
2014-01-01
Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes. PMID:24572909
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
NASA Astrophysics Data System (ADS)
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates, in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of its data analysis objectivity, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
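The counting step of such a semi-automated analysis can be sketched as follows; the segmented precipitate map and the scan step are hypothetical, and the real workflow derives the map from the precession diffraction patterns rather than from random noise.

```python
# Hedged sketch of precipitate counting and number-density estimation from a
# segmented (boolean) precipitate map; the map and pixel size are placeholders.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
precip_mask = rng.random((512, 512)) > 0.995          # hypothetical segmented map
precip_mask = ndimage.binary_dilation(precip_mask)    # grow seeds into small regions

labels, n_precipitates = ndimage.label(precip_mask)   # connected precipitate regions
pixel_size_nm = 1.5                                   # assumed scan step (nm/pixel)
area_nm2 = precip_mask.size * pixel_size_nm ** 2
print("count:", n_precipitates, "number density (1/nm^2):", n_precipitates / area_nm2)

# Per-precipitate areas (in pixels) for size statistics
sizes = ndimage.sum(precip_mask, labels, index=np.arange(1, n_precipitates + 1))
```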
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
1992-05-15
The laboratory heterogeneity of the lupus anticoagulant (LA) was investigated in a multicentre study using a panel of 78 plasma samples diagnosed as containing a LA. Consecutive samples were collected by 12 participants using various screening tests, and sent to 7 laboratories which performed one or more clotting assays among the following: activated partial thromboplastin time (APTT), dilute Russell viper venom time, kaolin clotting time (KCT), dilute tissue thromboplastin time (dTTI) and a platelet neutralization test. For APTT and dTTI, 10 versions of these tests including standard and mixing procedures were carried out. They varied by reagents, phospholipid concentration or methodology. Cut-off times were determined for each test by comparing the results of the panel to those of a control population. When the data of all clotting assays were pooled, 70 of the 78 selected plasmas were considered to contain LA, 15 of them having a low-titer inhibitor. Sensitivity, defined as the proportion of positive results among LA-containing plasmas, varied from 62 to 100% and was positively related to responsiveness (defined as the mean ratio of clotting time to cut-off time). Laboratory heterogeneity of LA-containing plasma was illustrated by a star symbol plot analysis. Different populations of samples, with LA preferentially recognized by one assay (or group of assays) irrespective of the overall sensitivity of this assay, were identified. Multiple component analysis demonstrated the heterogeneity of low-titer inhibitors, which complicates their recognition in routine laboratory investigation.
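The per-assay quantities used in the study, a cut-off derived from a control population, sensitivity as the proportion of positives among LA-containing plasmas, and responsiveness as the mean ratio of clotting time to cut-off, can be computed as in the hedged sketch below; the clotting times and the 97.5th-percentile cut-off rule are illustrative assumptions.

```python
# Sketch of cut-off, sensitivity and responsiveness for one clotting assay.
# Clotting times are simulated; the percentile rule for the cut-off is an assumption.
import numpy as np

rng = np.random.default_rng(3)
controls = rng.normal(36.0, 2.0, size=60)      # hypothetical control-population times (s)
la_plasmas = rng.normal(48.0, 8.0, size=70)    # hypothetical LA-containing plasma times (s)

cutoff = np.percentile(controls, 97.5)             # cut-off from the control population
sensitivity = np.mean(la_plasmas > cutoff)         # positives among LA-containing plasmas
responsiveness = np.mean(la_plasmas / cutoff)      # mean ratio of clotting time to cut-off

print(f"cut-off {cutoff:.1f} s, sensitivity {sensitivity:.0%}, responsiveness {responsiveness:.2f}")
```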
Gathright, Molly M; Thrush, Carol; Guise, J Benjamin; Krain, Lewis; Clardy, James
2016-04-01
In order to better understand the professional development of medical students during their psychiatry clerkship, this study identifies common themes and characteristics of students' critical incident narratives which are designed to capture a recount of clerkship experiences they perceived as meaningful. A total of 205 narratives submitted by psychiatry clerkship students in 2010-2011 were subjected to a thematic analysis using a methodological approach and adaptation of categories derived from prior similar research. Descriptive content analysis was also carried out to assess the valence of the narrative content, characters involved, and whether there was evidence that the experience changed students' perspectives in some way. Narratives contained a variety of positive (19%) and negative content (24%) and many contained a hybrid of both (57%). The most common theme (29%) concerned issues of respect and disrespect in patient, clinical, and coworker interactions. In general, the majority (68%) of students' meaningful experience narratives reflected a change in their perspective (e.g., I learned that...). Narratives containing positive and hybrid content were associated with a change in students' perspective (χ(2) = 10.61, df = 2, p < 0.005). Medical students are keenly aware of the learning environment. Positive and hybrid critical incident narratives were associated with a stated change in their beliefs, attitudes, or behaviors due to the experience. Understanding the events that are meaningful to students can also provide rich feedback to medical educators regarding the ways in which students perceive clinical learning environments and how to best foster their professional development.
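The association reported above (χ² = 10.61, df = 2, p < 0.005) is a standard chi-square test on a contingency table of narrative valence versus perspective change; the sketch below shows the computation on a hypothetical table, not the study's counts.

```python
# Chi-square test of independence for narrative valence vs. perspective change.
# The counts below are placeholders, not the counts from the 205 narratives.
import numpy as np
from scipy.stats import chi2_contingency

#                  changed  unchanged
table = np.array([[32,  7],    # positive narratives
                  [24, 25],    # negative narratives
                  [93, 24]])   # hybrid narratives
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```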
Acoustic Enrichment of Extracellular Vesicles from Biological Fluids.
Ku, Anson; Lim, Hooi Ching; Evander, Mikael; Lilja, Hans; Laurell, Thomas; Scheding, Stefan; Ceder, Yvonne
2018-06-11
Extracellular vesicles (EVs) have emerged as a rich source of biomarkers providing diagnostic and prognostic information in diseases such as cancer. Large-scale investigations into the contents of EVs in clinical cohorts are warranted, but a major obstacle is the lack of a rapid, reproducible, efficient, and low-cost methodology to enrich EVs. Here, we demonstrate the applicability of an automated acoustic-based technique to enrich EVs, termed acoustic trapping. Using this technology, we have successfully enriched EVs from cell culture conditioned media and urine and blood plasma from healthy volunteers. The acoustically trapped samples contained EVs ranging from exosomes to microvesicles in size and contained detectable levels of intravesicular microRNAs. Importantly, this method showed high reproducibility and yielded sufficient quantities of vesicles for downstream analysis. The enrichment could be obtained from a sample volume of 300 μL or less, an equivalent to 30 min of enrichment time, depending on the sensitivity of downstream analysis. Taken together, acoustic trapping provides a rapid, automated, low-volume compatible, and robust method to enrich EVs from biofluids. Thus, it may serve as a novel tool for EV enrichment from large number of samples in a clinical setting with minimum sample preparation.
Li, Guo-Sheng; Wei, Xian-Yong
2017-01-01
Elucidating the chemical composition of biooil is essential for evaluating the process of lignocellulosic biomass (LCBM) conversion and its upgrading, and for suggesting proper value-added utilization such as producing fuel and feedstock for fine chemicals. Although the main components of LCBM are cellulose, hemicelluloses, and lignin, the chemicals derived from LCBM differ significantly due to the various feedstocks and methods used for the decomposition. Biooil, produced from pyrolysis of LCBM, contains hundreds of organic chemicals of various classes. This review covers the methodologies used for the componential analysis of biooil, including pretreatments and instrumental analysis techniques. The use of chromatographic and spectrometric methods is highlighted, covering conventional techniques such as gas chromatography, high performance liquid chromatography, Fourier transform infrared spectroscopy, nuclear magnetic resonance, and mass spectrometry. The combination of preseparation methods and instrumental technologies is a robust pathway for the detailed componential characterization of biooil. The organic species in biooils can be classified into alkanes, alkenes, alkynes, benzene-ring containing hydrocarbons, ethers, alcohols, phenols, aldehydes, ketones, esters, carboxylic acids, and other heteroatomic organic compounds. The recent development of high resolution mass spectrometry and multidimensional hyphenated chromatographic and spectrometric techniques has considerably advanced the elucidation of biooil composition. PMID:29387086
Reference Model 5 (RM5): Oscillating Surge Wave Energy Converter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. H.; Jenne, D. S.; Thresher, R.
This report is an addendum to SAND2013-9040: Methodology for Design and Economic Analysis of Marine Energy Conversion (MEC) Technologies. This report describes an Oscillating Surge Wave Energy Converter (OSWEC) reference model design in a complementary manner to Reference Models 1-4 contained in the above report. A conceptual design for a taut-moored oscillating surge wave energy converter was developed. The design had an annual average electrical power of 108 kilowatts (kW), a rated power of 360 kW, and an intended deployment at water depths between 50 m and 100 m. The study includes structural analysis, power output estimation, a hydraulic power conversion chain system, and mooring designs. The results were used to estimate device capital cost and annual operation and maintenance costs. The device performance and costs were used for the economic analysis, following the methodology presented in SAND2013-9040, which included costs for designing, manufacturing, deploying, and operating commercial-scale MEC arrays of up to 100 devices. The levelized cost of energy estimated for the Reference Model 5 OSWEC, presented in this report, was for a single device and arrays of 10, 50, and 100 units, and it enabled the economic analysis to account for cost reductions associated with economies of scale. The baseline commercial levelized cost of energy estimate for the Reference Model 5 device in an array comprised of 10 units is $1.44/kilowatt-hour (kWh), and the value drops to approximately $0.69/kWh for an array of 100 units.
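A highly simplified levelized-cost calculation of the kind summarised above is sketched below. The fixed charge rate and the per-device capital and O&M costs are placeholder assumptions, not the SAND2013-9040 cost model; only the 108 kW annual-average power is taken from the report.

```python
# Hedged fixed-charge-rate LCOE sketch; costs and the charge rate are placeholders.
def lcoe(capex, opex_per_year, annual_kwh, fixed_charge_rate=0.108):
    """Levelized cost of energy in $/kWh (simple annualisation, no array effects)."""
    return (capex * fixed_charge_rate + opex_per_year) / annual_kwh

annual_kwh = 108 * 8760  # 108 kW annual-average electrical power, continuous-equivalent
for n_units, capex, opex in [(10, 9.0e6, 4.0e5), (100, 4.5e6, 1.7e5)]:  # assumed per-device costs
    print(f"{n_units:>3} units: {lcoe(capex, opex, annual_kwh):.2f} $/kWh")
```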
NASA Technical Reports Server (NTRS)
Cragg, Clinton H.; Bowman, Howard; Wilson, John E.
2011-01-01
The NASA Engineering and Safety Center (NESC) was requested to provide computational modeling to support the establishment of a safe separation distance surrounding the Kennedy Space Center (KSC) Vehicle Assembly Building (VAB). The two major objectives of the study were 1) to establish a methodology based on thermal flux to determine safe separation distances from the KSC VAB with large numbers of solid propellant boosters containing hazard division 1.3 classification propellants, in case of inadvertent ignition; and 2) to apply this methodology to the consideration of housing eight 5-segment solid propellant boosters in the VAB. The results of the study are contained in this report.
Raut, Savita V; Yadav, Dinkar M
2018-03-28
This paper presents an fMRI signal analysis methodology using geometric mean curve decomposition (GMCD) and a mutual information-based voxel selection framework. Previously, fMRI signal analysis has been conducted using the empirical mean curve decomposition (EMCD) model and voxel selection on the raw fMRI signal. The former methodology loses frequency content, while the latter suffers from signal redundancy. Both challenges are addressed by our methodology, in which the frequency content is retained by decomposing the raw fMRI signal using the geometric mean rather than the arithmetic mean, and the voxels are selected from the EMCD signal using GMCD components rather than the raw fMRI signal. The proposed methodologies are adopted for predicting the neural response. Experiments are conducted on the openly available fMRI data of six subjects, and comparisons are made with existing decomposition models and voxel selection frameworks. Subsequently, the effects of the number of selected voxels and of the selection constraints are analyzed. The comparative results and the analysis demonstrate the superiority and reliability of the proposed methodology.
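The voxel selection half of such a framework can be illustrated with a hedged sketch: rank voxels by the mutual information between their features and a target label and keep the top-k. The data, labels and the scikit-learn estimator below are illustrative assumptions, not the GMCD pipeline.

```python
# Sketch of mutual-information-based voxel selection on simulated fMRI data.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(4)
n_scans, n_voxels = 120, 2000
X = rng.normal(size=(n_scans, n_voxels))        # hypothetical voxel features per scan
y = rng.integers(0, 2, size=n_scans)            # hypothetical stimulus/condition labels

mi = mutual_info_classif(X, y, random_state=0)  # MI of each voxel with the labels
top_k = 200
selected = np.argsort(mi)[-top_k:]              # indices of the most informative voxels
X_sel = X[:, selected]
print("selected voxel matrix:", X_sel.shape)
```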
Improved Surface Parameter Retrievals using AIRS/AMSU Data
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John
2008-01-01
The AIRS Science Team Version 5.0 retrieval algorithm became operational at the Goddard DAAC in July 2007, generating near real-time products from analysis of AIRS/AMSU sounding data. This algorithm contains many significant theoretical advances over the AIRS Science Team Version 4.0 retrieval algorithm used previously. Two very significant developments of Version 5 are: 1) the development and implementation of an improved Radiative Transfer Algorithm (RTA) which allows for accurate treatment of non-Local Thermodynamic Equilibrium (non-LTE) effects on shortwave sounding channels; and 2) the development of methodology to obtain very accurate case-by-case product error estimates which are in turn used for quality control. These theoretical improvements taken together enabled a new methodology to be developed which further improves soundings in partially cloudy conditions. In this methodology, longwave CO2 channel observations in the spectral region 700 cm(exp -1) to 750 cm(exp -1) are used exclusively for cloud clearing purposes, while shortwave CO2 channels in the spectral region 2195 cm(exp -1) to 2395 cm(exp -1) are used for temperature sounding purposes. This allows for accurate temperature soundings under more difficult cloud conditions. This paper further improves on the methodology used in Version 5 to derive surface skin temperature and surface spectral emissivity from AIRS/AMSU observations. Now, following the approach used to improve tropospheric temperature profiles, surface skin temperature is also derived using only shortwave window channels. This produces improved surface parameters, both day and night, compared to what was obtained in Version 5. These in turn result in improved boundary layer temperatures and retrieved total O3 burden.
Tamper-indicating barcode and method
Cummings, Eric B.; Even, Jr., William R.; Simmons, Blake A.; Dentinger, Paul Michael
2005-03-22
A novel tamper-indicating barcode methodology is disclosed that allows for detection of alteration to the barcode. The tamper-indicating methodology makes use of a tamper-indicating means that may be comprised of a particulate indicator, an optical indicator, a deformable substrate, and/or may be an integrated aspect of the barcode itself. This tamper-indicating information provides greater security for the contents of containers sealed with the tamper-indicating barcodes.
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the proceedings contains the following 18 papers: "The Continuing Question of Motivation in the Knowledge Gap Hypothesis" (Tom Weir); "Memory Decay and the Agenda-Setting Effect: An Examination of Three News Media" (Wayne Wanta and Melissa J. Roy); "Open, Closed, or Both:…
Risk assessment methodology applied to counter IED research & development portfolio prioritization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shevitz, Daniel W; O'Brien, David A; Zerkle, David K
2009-01-01
In an effort to protect the United States from the ever increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T), has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like some "risk based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it, compared to all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model using an Excel spreadsheet containing weights of the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models combined with LANL software called INFTree for evidential reasoning.
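The aggregation idea, combining weighted subjective merit scores into a single ranking, can be sketched very simply; the criteria, weights and scores below are invented for illustration and the sketch does not reproduce ELM or the INFTree evidential-reasoning engine.

```python
# Minimal weighted-sum ranking of hypothetical "quad charts" across assumed criteria.
criteria = {"threat_relevance": 0.40, "technical_maturity": 0.25,
            "cost": 0.15, "schedule": 0.20}        # assumed weights (sum to 1)

quad_charts = {                                    # hypothetical 0-10 merit scores
    "standoff detector A": {"threat_relevance": 8, "technical_maturity": 5, "cost": 6, "schedule": 7},
    "blast mitigation B":  {"threat_relevance": 6, "technical_maturity": 8, "cost": 7, "schedule": 5},
    "jammer upgrade C":    {"threat_relevance": 7, "technical_maturity": 6, "cost": 4, "schedule": 8},
}

scores = {name: sum(criteria[c] * s for c, s in merits.items())
          for name, merits in quad_charts.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.2f}  {name}")
```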
Shaw, Kathryn E; Charlton, Jesse M; Perry, Christina K L; de Vries, Courtney M; Redekopp, Matthew J; White, Jordan A; Hunt, Michael A
2018-02-01
The effect of shoe-worn insoles on biomechanical variables in people with medial knee osteoarthritis has been studied extensively. The majority of research has focused specifically on the effect of lateral wedge insoles at the knee. The aim of this systematic review and meta-analysis was to summarise the known effects of different shoe-worn insoles on all biomechanical variables during level walking in this patient population to date. Four electronic databases were searched to identify studies containing biomechanical data using shoe-worn insole devices in the knee osteoarthritis population. Methodological quality was assessed and a random effects meta-analysis was performed on biomechanical variables reported in three or more studies for each insole. Twenty-seven studies of moderate-to-high methodological quality were included in this review. The primary findings were consistent reductions in the knee adduction moment with lateral wedge insoles, although increases in ankle eversion with these insoles were also found. Lateral wedge insoles produce small reductions in knee adduction angles and external moments, and moderate increases in ankle eversion. The addition of an arch support to a lateral wedge minimises ankle eversion change, and also minimises adduction moment reductions. The paucity of available data on other insole types and other biomechanical outcomes presents an opportunity for future research. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
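The pooling step behind such a random-effects meta-analysis can be sketched with a DerSimonian-Laird estimator; the per-study effect sizes and variances below are placeholders rather than values extracted from the 27 included studies.

```python
# DerSimonian-Laird random-effects pooling of hypothetical per-study mean differences.
import numpy as np

effects = np.array([-0.28, -0.15, -0.35, -0.10, -0.22])    # hypothetical effect sizes
variances = np.array([0.010, 0.020, 0.015, 0.030, 0.012])  # hypothetical within-study variances

w = 1.0 / variances                                   # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - fixed) ** 2)                # Cochran's heterogeneity statistic
df = len(effects) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # between-study variance

w_re = 1.0 / (variances + tau2)                       # random-effects weights
pooled = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled effect {pooled:.3f} (95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f}), tau2 = {tau2:.4f}")
```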
Borri, Marco; Schmidt, Maria A.; Powell, Ceri; Koh, Dow-Mu; Riddell, Angela M.; Partridge, Mike; Bhide, Shreerang A.; Nutting, Christopher M.; Harrington, Kevin J.; Newbold, Katie L.; Leach, Martin O.
2015-01-01
Purpose To describe a methodology, based on cluster analysis, to partition multi-parametric functional imaging data into groups (or clusters) of similar functional characteristics, with the aim of characterizing functional heterogeneity within head and neck tumour volumes. To evaluate the performance of the proposed approach on a set of longitudinal MRI data, analysing the evolution of the obtained sub-sets with treatment. Material and Methods The cluster analysis workflow was applied to a combination of dynamic contrast-enhanced and diffusion-weighted imaging MRI data from a cohort of squamous cell carcinoma of the head and neck patients. Cumulative distributions of voxels, containing pre and post-treatment data and including both primary tumours and lymph nodes, were partitioned into k clusters (k = 2, 3 or 4). Principal component analysis and cluster validation were employed to investigate data composition and to independently determine the optimal number of clusters. The evolution of the resulting sub-regions with induction chemotherapy treatment was assessed relative to the number of clusters. Results The clustering algorithm was able to separate clusters which significantly reduced in voxel number following induction chemotherapy from clusters with a non-significant reduction. Partitioning with the optimal number of clusters (k = 4), determined with cluster validation, produced the best separation between reducing and non-reducing clusters. Conclusion The proposed methodology was able to identify tumour sub-regions with distinct functional properties, independently separating clusters which were affected differently by treatment. This work demonstrates that unsupervised cluster analysis, with no prior knowledge of the data, can be employed to provide a multi-parametric characterization of functional heterogeneity within tumour volumes. PMID:26398888
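The partitioning step can be illustrated with a hedged sketch: standardise voxel-wise functional parameters, run k-means for several candidate k, and score each partition with a validity index such as the silhouette. The parameter names and the simulated voxels are assumptions, not the study's DCE/DWI data.

```python
# k-means partitioning of simulated voxel-wise parameters with silhouette validation.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(5)
voxels = rng.normal(size=(3000, 3))     # e.g. [Ktrans, ve, ADC] per voxel (hypothetical)
X = StandardScaler().fit_transform(voxels)

for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k, "clusters, silhouette =", round(silhouette_score(X, labels), 3))
```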
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynn, R.Y.S.; Bolmarcich, J.J.
The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dombroski, M; Melius, C; Edmunds, T
2008-09-24
This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models and enables a broader array of stochastic analyses of model runs to be conducted because of those computational advantages. However, it has only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, elderly, and stay at home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social distancing countermeasures. Analysis from the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally, (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people, or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak. There are several limitations of the methodology that should be explored in future work, including validating the model against reliable historical disease data, improving contact rates, spread methods, and disease parameters through discussions with epidemiological experts, and incorporating realistic behavioral assumptions.
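A toy stochastic outbreak model, far simpler than MESA, reproduces the same qualitative spread of outcomes noted above (early burn-out, a contained epidemic, or a large wave) simply by chance across repeated runs. The chain-binomial SIR below and its parameters are illustrative assumptions only.

```python
# Chain-binomial SIR sketch: repeated stochastic runs give a spread of final sizes.
import numpy as np

def run_outbreak(n=1_000_000, beta=0.3, gamma=0.2, i0=5, days=365, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    s, i, r = n - i0, i0, 0
    for _ in range(days):
        p_inf = 1.0 - np.exp(-beta * i / n)            # per-susceptible daily infection prob.
        new_i = rng.binomial(s, p_inf)
        new_r = rng.binomial(i, 1.0 - np.exp(-gamma))  # daily recovery prob.
        s, i, r = s - new_i, i + new_i - new_r, r + new_r
        if i == 0:
            break
    return r                                           # final outbreak size

rng = np.random.default_rng(6)
finals = sorted(run_outbreak(rng=rng) for _ in range(20))
print("final sizes over 20 runs:", finals)
```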
ABGs in Agriculture. Volume Two. Appendices. ACTION Evaluation.
ERIC Educational Resources Information Center
ACTION, Washington, DC.
Appendixes to a study of the effectiveness of Peace Corps volunteers in agriculture who are AB generalists (individuals with a bachelor of arts degree in English, liberal arts, or social science) are contained in this document. Section 1 contains a glossary of terms used in the study. Section 2 describes the study's methodology and includes the…
Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael
2017-09-01
The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
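The error-rate idea can be illustrated by forward simulation from an assumed two-component normal mixture of true zone diameters plus methodological noise; the mixture weights, means, measurement SD, breakpoint and ZMU width below are placeholders, not the fitted values for E. coli.

```python
# Simulation sketch: categorisation error rate around a clinical breakpoint (CBP)
# and the effect of a zone of methodological uncertainty (ZMU). Parameters are assumed.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
resistant = rng.random(n) < 0.3                       # assumed mixture weight
true_d = np.where(resistant, rng.normal(12, 2, n), rng.normal(24, 3, n))  # true diameters (mm)
observed = true_d + rng.normal(0, 1.2, n)             # assumed methodological SD (QC strain)

cbp = 18.0                                            # hypothetical susceptible breakpoint (mm)
error = (true_d >= cbp) != (observed >= cbp)          # classified on the wrong side
print("categorisation error rate:", round(error.mean(), 4))

zmu = (observed >= cbp - 2) & (observed < cbp + 2)    # +/- 2 mm zone of uncertainty
print("error rate outside ZMU:", round(error[~zmu].mean(), 4), "| isolates in ZMU:", round(zmu.mean(), 3))
```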
Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.
ERIC Educational Resources Information Center
Bertrand, Jane T.; And Others
1989-01-01
An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)
Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology
NASA Technical Reports Server (NTRS)
Knight, Norman F.
1998-01-01
The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and development of an accurate, efficient analysis, design, and optimization tool for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses of complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.
2003-06-01
[Extraction residue: fragments of a thesis figure (Figure 20, iteration-two class diagram) relating a Tech OASIS export script, an import filter, a data processing method, MS Excel, and a VBA macro, together with stray text on data access and SQL; no abstract text is recoverable.]
Synthesis of 5-iodo-1,2,3-triazole-containing macrocycles using copper flow reactor technology.
Bogdan, Andrew R; James, Keith
2011-08-05
A new macrocyclization strategy to synthesize 12- to 31-membered 5-iodo-1,2,3-triazole-containing macrocycles is described. The macrocycles have been generated using a simple and efficient copper-catalyzed cycloaddition in flow under environmentally friendly conditions. This methodology also permits the facile, regioselective synthesis of 1,4,5-trisubstituted-1,2,3-triazole-containing macrocycles using palladium-catalyzed cross-coupling reactions. © 2011 American Chemical Society
Reading Alien Landscapes: Thick versus Thin Descriptions in Archaeoastronomy
NASA Astrophysics Data System (ADS)
Malville, J. McKim
2015-05-01
This paper reviews the nature of "thick descriptions" promoted by Clifford Geertz and explores the application of this methodology to archaeoastronomy. The approach aims to describe and explain human behavior in the realms of the sacred and secular. Thick description emphasizes the emic signification of social action; an etic analysis would be viewed as thin. A useful application of this methodology is to consider astronomical events contained in the archaeological record as signifiers of deeper meaning and purpose within the culture. By following the string of signification one can delve deeply into the culture and attempt to explain behavior associated with ancient astronomy. Another element of thick descriptions is the use of redundancy as a test for thoroughness. An archaeoastronomical phenomenon that appears to be unique and idiosyncratic may mean that the investigator has not searched the archaeological record sufficiently thoroughly or needs to alter the basic interpretation. Examples from India and Peru are discussed in which the interpretation of astronomical phenomena could lead to misrepresentations of meaning and function if only a thin description is attempted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tortorelli, J.P.
A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994, on the topic of risk assessment on medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains presentation material and a transcript of the workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report.
Hema, G S; Joshy, C G; Shyni, K; Chatterjee, Niladri S; Ninan, George; Mathew, Suseela
2017-02-01
The study optimized the hydrolysis conditions for the production of fish collagen peptides from the skin of Malabar grouper (Epinephelus malabaricus) using response surface methodology. The hydrolysis was done with the enzymes pepsin, papain and protease from bovine pancreas. The effects of the process parameters, viz. pH, temperature, enzyme-substrate ratio and hydrolysis time, of the three different enzymes on the degree of hydrolysis were investigated. The optimum response of degree of hydrolysis was estimated to be 10, 20 and 28%, respectively, for pepsin, papain and protease. The functional properties of the product developed were analysed, which showed changes in the properties from proteins to peptides. SDS-PAGE combined with the MALDI TOF method was successfully applied to determine the molecular weight distribution of the hydrolysate. The electrophoretic pattern indicated that the molecular weights of the peptides formed due to hydrolysis were nearly 2 kDa. MALDI TOF spectral analysis showed that the developed hydrolysate contains peptides with molecular weights in the range below 2 kDa.
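The response-surface step, fitting a second-order model of degree of hydrolysis against the process factors and locating the optimum, is sketched below. The design points, factor ranges and the simulated response are placeholders, not the study's measurements.

```python
# Second-order response-surface fit and grid search for the optimum (simulated data).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(8)
# Factors: pH, temperature (C), enzyme-substrate ratio (%), hydrolysis time (h)
X = rng.uniform([1.5, 30.0, 0.5, 1.0], [3.5, 50.0, 2.5, 5.0], size=(30, 4))
dh = 20 - 2*(X[:, 0] - 2.5)**2 - 0.02*(X[:, 1] - 40)**2 + 1.5*X[:, 2] + rng.normal(0, 0.5, 30)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, dh)

grids = [np.linspace(lo, hi, 15) for lo, hi in [(1.5, 3.5), (30, 50), (0.5, 2.5), (1, 5)]]
candidates = np.array(np.meshgrid(*grids)).reshape(4, -1).T
best = candidates[np.argmax(model.predict(candidates))]
print("predicted optimum (pH, T, E/S, time):", np.round(best, 2))
```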
Methodologies for Removing/Desorbing and Transporting Particles from Surfaces to Instrumentation
NASA Astrophysics Data System (ADS)
Miller, Carla J.; Cespedes, Ernesto R.
2012-12-01
Explosive trace detection (ETD) continues to be a key technology supporting the fight against terrorist bombing threats. Very selective and sensitive ETD instruments have been developed to detect explosive threats concealed on personnel, in vehicles, in luggage, and in cargo containers, as well as for forensic analysis (e.g. post blast inspection, bomb-maker identification, etc.) in a broad range of homeland security, law enforcement, and military applications. A number of recent studies have highlighted the fact that significant improvements in ETD systems' capabilities will be achieved, not by increasing the selectivity/sensitivity of the sensors, but by improved techniques for particle/vapor sampling, pre-concentration, and transport to the sensors. This review article represents a compilation of studies focused on characterizing the adhesive properties of explosive particles, the methodologies for removing/desorbing these particles from a range of surfaces, and approaches for transporting them to the instrument. The objectives of this review are to summarize fundamental work in explosive particle characterization, to describe experimental work performed in harvesting and transport of these particles, and to highlight those approaches that indicate high potential for improving ETD capabilities.
Macro-economic assessment of flood risk in Italy under current and future climate
NASA Astrophysics Data System (ADS)
Carrera, Lorenzo; Koks, Elco; Mysiak, Jaroslav; Aerts, Jeroen; Standardi, Gabriele
2014-05-01
This paper explores an integrated methodology for assessing direct and indirect costs of fluvial flooding to estimate current and future fluvial flood risk in Italy. Our methodology combines a Geographic Information System spatial approach with a general economic equilibrium approach using a downscaled, modified version of a Computable General Equilibrium model at NUTS2 scale. Given the level of uncertainty in the behavior of disaster-affected economies, the simulation considers a wide range of business recovery periods. We calculate expected annual losses for each NUTS2 region, and exceedance probability curves to determine probable maximum losses. Given a certain acceptable level of risk, we describe the conditions of flood protection and business recovery periods under which losses are contained within this limit. Because of the difference between direct costs, which are an overestimation of stock losses, and indirect costs, which represent the macro-economic effects, our results have different policy meanings. While the former is relevant for post-disaster recovery, the latter is more relevant for public policy issues, particularly for cost-benefit analysis and resilience assessment.
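The two risk metrics named above can be computed from a loss-exceedance curve as in this hedged sketch; the return periods and loss values are invented for illustration and are not the paper's results.

```python
# Expected annual loss (area under the loss-exceedance curve) and a probable
# maximum loss at an accepted 1-in-100-year level, from hypothetical loss data.
import numpy as np

return_periods = np.array([2, 5, 10, 25, 50, 100, 250])   # years
losses = np.array([0.05, 0.2, 0.5, 1.2, 2.0, 3.1, 4.5])   # hypothetical losses (billion EUR)
exceed_prob = 1.0 / return_periods

order = np.argsort(exceed_prob)                            # ascending exceedance probability
p, L = exceed_prob[order], losses[order]
eal = np.sum(np.diff(p) * (L[1:] + L[:-1]) / 2)            # trapezoidal integration
pml_100 = np.interp(1 / 100, p, L)                         # loss at the 100-year level

print(f"expected annual loss ~ {eal:.3f} billion EUR/yr; 100-yr probable maximum loss = {pml_100} billion EUR")
```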
Martinello, Tiago; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Taqueda, Maria Elena Santos; Consiglieri, Vladi O
2006-09-28
The poor flowability and bad compressibility characteristics of paracetamol are well known. As a result, the production of paracetamol tablets is almost exclusively by wet granulation, a disadvantageous method when compared to direct compression. The development of a new tablet formulation is still based on a large number of experiments and often relies merely on the experience of the analyst. The purpose of this study was to apply design of experiments (DOE) methodology to the development and optimization of tablet formulations containing high amounts of paracetamol (more than 70%) and manufactured by direct compression. Nineteen formulations, screened by DOE methodology, were produced with different proportions of Microcel 102, Kollydon VA 64, Flowlac, Kollydon CL 30, PEG 4000, Aerosil, and magnesium stearate. Tablet properties, except friability, were in accordance with the USP 28th ed. requirements. These results were used to generate plots for optimization, mainly for friability. The physical-chemical data found for the optimized formulation were very close to those from the regression analysis, demonstrating that the mixture design is a valuable tool for the research and development of new formulations.
Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika
2017-02-01
Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.
A knowledge base of the chemical compounds of intermediary metabolism.
Karp, P D
1992-08-01
This paper describes a publicly available knowledge base of the chemical compounds involved in intermediary metabolism. We consider the motivations for constructing a knowledge base of metabolic compounds, the methodology by which it was constructed, and the information that it currently contains. Currently the knowledge base describes 981 compounds, listing for each: synonyms for its name, a systematic name, CAS registry number, chemical formula, molecular weight, chemical structure and two-dimensional display coordinates for the structure. The Compound Knowledge Base (CompoundKB) illustrates several methodological principles that should guide the development of biological knowledge bases. I argue that biological datasets should be made available in multiple representations to increase their accessibility to end users, and I present multiple representations of the CompoundKB (knowledge base, relational database and ASN.1 representations). I also analyze the general characteristics of these representations to provide an understanding of their relative advantages and disadvantages. Another principle is that the error rate of biological databases should be estimated and documented; this analysis is performed for the CompoundKB.
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John M.; Iredell, Lena; Keita, Fricky
2009-01-01
This paper describes the AIRS Science Team Version 5 retrieval algorithm in terms of its three most significant improvements over the methodology used in the AIRS Science Team Version 4 retrieval algorithm. Improved physics in Version 5 allows for use of AIRS clear column radiances in the entire 4.3 micron CO2 absorption band in the retrieval of temperature profiles T(p) during both day and night. Tropospheric sounding 15 micron CO2 observations are now used primarily in the generation of clear column radiances R(sub i) for all channels. This new approach allows for the generation of more accurate values of R(sub i) and T(p) under most cloud conditions. Secondly, Version 5 contains a new methodology to provide accurate case-by-case error estimates for retrieved geophysical parameters and for channel-by-channel clear column radiances. Thresholds of these error estimates are used in a new approach for Quality Control. Finally, Version 5 also contains for the first time an approach to provide AIRS soundings in partially cloudy conditions that does not require use of any microwave data. This new AIRS-only sounding methodology, referred to as AIRS Version 5 AO, was developed as a backup to AIRS Version 5 should the AMSU-A instrument fail. Results are shown comparing the relative performance of AIRS Version 4, Version 5, and Version 5 AO for the single day, January 25, 2003. The Goddard DISC is now generating and distributing products derived using the AIRS Science Team Version 5 retrieval algorithm. This paper also describes the Quality Control flags contained in the DISC AIRS/AMSU retrieval products and their intended use for scientific research purposes.
TRAC Innovative Visualization Techniques
2016-11-14
Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and
PseudoBase: a database with RNA pseudoknots.
van Batenburg, F H; Gultyaev, A P; Pleij, C W; Ng, J; Oliehoek, J
2000-01-01
PseudoBase is a database containing structural, functional and sequence data related to RNA pseudoknots. It can be reached at http://wwwbio.LeidenUniv.nl/~Batenburg/PKB.html. This page will direct the user to a retrieval page from where a particular pseudoknot can be chosen, or to a submission page which enables the user to add pseudoknot information to the database or to an informative page that elaborates on the various aspects of the database. For each pseudoknot, 12 items are stored, e.g. the nucleotides of the region that contains the pseudoknot, the stem positions of the pseudoknot, the EMBL accession number of the sequence that contains this pseudoknot and the support that can be given regarding the reliability of the pseudoknot. Access is via a small number of steps, using 16 different categories. The development process was done by applying the evolutionary methodology for software development rather than by applying the methodology of the classical waterfall model or the more modern spiral model.
Structural Optimization Methodology for Rotating Disks of Aircraft Engines
NASA Technical Reports Server (NTRS)
Armand, Sasan C.
1995-01-01
In support of the preliminary evaluation of various engine technologies, a methodology has been developed for structurally designing the rotating disks of an aircraft engine. The structural design methodology, along with a previously derived methodology for predicting low-cycle fatigue life, was implemented in a computer program. An interface computer program was also developed that gathers the required data from a flowpath analysis program (WATE) being used at NASA Lewis. The computer program developed for this study requires minimum interaction with the user, thus allowing engineers with varying backgrounds in aeropropulsion to successfully execute it. The stress analysis portion of the methodology and the computer program were verified by employing the finite element analysis method. The 10th-stage, high-pressure-compressor disk of the Energy Efficient Engine Program (E3) engine was used to verify the stress analysis; the differences between the stresses and displacements obtained from the computer program developed for this study and from the finite element analysis were all below 3 percent for the problem solved. The computer program developed for this study was employed to structurally optimize the rotating disks of the E3 high-pressure compressor. The rotating disks designed by the computer program in this study were approximately 26 percent lighter than calculated from the E3 drawings. The methodology is presented herein.
Methodology, status and plans for development and assessment of Cathare code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bestion, D.; Barre, F.; Faydide, B.
1997-07-01
This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and the assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and the validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. In addition, physical improvements are required in the field of low pressure transients and in the 3-D modeling.
Ngo, Tuan Anh; Lu, Zhi; Carneiro, Gustavo
2017-01-01
We introduce a new methodology that combines deep learning and level set methods for the automated segmentation of the left ventricle of the heart from cardiac cine magnetic resonance (MR) data. This combination is relevant for segmentation problems where the visual object of interest presents large shape and appearance variations, but the annotated training set is small, which is the case for various medical image analysis applications, including the one considered in this paper. In particular, level set methods are based on shape and appearance terms that use small training sets, but present limitations for modelling the visual object variations. Deep learning methods can model such variations using relatively small amounts of annotated training data, but they often need to be regularised to produce good generalisation. Therefore, the combination of these methods brings together the advantages of both approaches, producing a methodology that needs small training sets and produces accurate segmentation results. We test our methodology on the MICCAI 2009 left ventricle segmentation challenge database (containing 15 sequences for training, 15 for validation and 15 for testing), where our approach achieves the most accurate results in the semi-automated problem and state-of-the-art results for the fully automated challenge. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
The harmful chemistry behind "krokodil": Street-like synthesis and product analysis.
Alves, Emanuele Amorim; Soares, José Xavier; Afonso, Carlos Manuel; Grund, Jean-Paul C; Agonia, Ana Sofia; Cravo, Sara Manuela; Netto, Annibal Duarte Pereira; Carvalho, Félix; Dinis-Oliveira, Ricardo Jorge
2015-12-01
"Krokodil" is the street name for a drug, which has been attracting media and researchers attention due to its increasing spread and extreme toxicity. "Krokodil" is a homemade injectable mixture being used as a cheap substitute for heroin. Its use begun in Russia and Ukraine, but it is being spread throughout other countries. The starting materials for "krokodil" synthesis are tablets containing codeine, caustic soda, gasoline, hydrochloric acid, iodine from disinfectants and red phosphorus from matchboxes, all of which are easily available in a retail market or drugstores. The resulting product is a light brown liquid that is injected without previous purification. Herein, we aimed to understand the chemistry behind "krokodil" synthesis by mimicking the steps followed by people who use this drug. The successful synthesis was assessed by the presence of desomorphine and other two morphinans. An analytical gas chromatography-electron impact/mass spectrometry (GC-EI/MS) methodology for quantification of desomorphine and codeine was also developed and validated. The methodologies presented herein provide a representative synthesis of "krokodil" street samples and the application of an effective analytical methodology for desomorphine quantification, which was the major morphinan found. Further studies are required in order to find other hypothetical by-products in "krokodil" since these may help to explain signs and symptoms presented by abusers. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Quantitative x-ray diffraction mineralogy of Los Angeles basin core samples
Hein, James R.; McIntyre, Brandie R.; Edwards, Brian D.; Lakota, Orion I.
2006-01-01
This report contains X-ray diffraction (XRD) analysis of mineralogy for 81 sediment samples from cores taken from three drill holes in the Los Angeles Basin in 2000-2001. We analyzed 26 samples from Pier F core, 29 from Pier C core, and 26 from the Webster core. These three sites provide an offshore-onshore record across the Southern California coastal zone. This report is designed to be a data repository; these data will be used in further studies, including geochemical modeling as part of the CABRILLO project. Summary tables quantify the major mineral groups, whereas detailed mineralogy is presented in three appendices. The rationale, methodology, and techniques are described in the following paper.
Azari-Anpar, M; Soltani Tehrani, N; Aghajani, N; Khomeiri, M
2017-01-01
In this study, the effects of Qodume shahri (Lepidium perfoliatum) and cress (Lepidium sativum) gums on the rheological properties of ice cream were investigated. The gums were added to the ice cream formulation, and different quality attributes including pH, acidity, melting characteristics, viscosity, overrun, texture analysis and sensory evaluation were determined. Results showed that ice cream formulations containing both gums had improved overrun, melting rate, first dripping time, viscosity, hardness and adhesiveness. Gum concentrations beyond the 0.2% level had a negative effect on the gumminess and chewiness of the ice cream. Overall, addition of both gums improved the quality attributes and textural properties of the ice cream.
Pindelska, Edyta; Szeleszczuk, Lukasz; Pisklak, Dariusz Maciej; Mazurek, Andrzej; Kolodziejski, Waclaw
2015-01-01
Clopidogrel hydrogensulfate (HSCL) is an antiplatelet agent, one of top-selling drugs in the world. In this paper, we have described a rapid and convenient method of verification which polymorph of HSCL is present in its final solid dosage form. Our methodology based on solid-state NMR spectroscopy and ab initio gauge-including projector-augmented wave calculations of NMR shielding constants is appropriate for currently available commercial solid dosage forms of HSCL. Furthermore, such structural characterization can assist with the development of new pharmaceutical products containing HSCL and also be useful in the identification of counterfeit drugs. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
Design and Verification Guidelines for Vibroacoustic and Transient Environments
NASA Technical Reports Server (NTRS)
1986-01-01
Design and verification guidelines for vibroacoustic and transient environments contain many basic methods that are common throughout the aerospace industry. However, there are some significant differences in methodology between NASA/MSFC and others - both government agencies and contractors. The purpose of this document is to provide the general guidelines used by the Component Analysis Branch, ED23, at MSFC, for the application of the vibroacoustic and transient technology to all launch vehicle and payload components and payload components and experiments managed by NASA/MSFC. This document is intended as a tool to be utilized by the MSFC program management and their contractors as a guide for the design and verification of flight hardware.
Land-Cover Trends of the Southern California Mountains Ecoregion
Soulard, Christopher E.; Raumann, Christian G.; Wilson, Tamara S.
2007-01-01
This report presents an assessment of land-use and land-cover (LU/LC) change in the Southern California Mountains ecoregion for the period 1973-2001. The Southern California Mountains is one of 84 Level-III ecoregions as defined by the U.S. Environmental Protection Agency (EPA). Ecoregions have served as a spatial framework for environmental resource management, denoting areas that contain a geographically distinct assemblage of biotic and abiotic phenomena including geology, physiography, vegetation, climate, soils, land use, wildlife, and hydrology. The established Land Cover Trends methodology generates estimates of change for ecoregions using a probability sampling approach and change-detection analysis of thematic land-cover images derived from Landsat satellite imagery.
NASA Astrophysics Data System (ADS)
Zhang, Hongjuan; Kurtz, Wolfgang; Kollet, Stefan; Vereecken, Harry; Franssen, Harrie-Jan Hendricks
2018-01-01
The linkage between root zone soil moisture and groundwater is either neglected or simplified in most land surface models. The fully-coupled subsurface-land surface model TerrSysMP including variably saturated groundwater dynamics is used in this work. We test and compare five data assimilation methodologies for assimilating groundwater level data via the ensemble Kalman filter (EnKF) to improve root zone soil moisture estimation with TerrSysMP. Groundwater level data are assimilated in the form of pressure head or soil moisture (set equal to porosity in the saturated zone) to update state vectors. In the five assimilation methodologies, the state vector contains either (i) pressure head, or (ii) log-transformed pressure head, or (iii) soil moisture, or (iv) pressure head for the saturated zone only, or (v) a combination of pressure head and soil moisture, pressure head for the saturated zone and soil moisture for the unsaturated zone. These methodologies are evaluated in synthetic experiments which are performed for different climate conditions, soil types and plant functional types to simulate various root zone soil moisture distributions and groundwater levels. The results demonstrate that EnKF cannot properly handle strongly skewed pressure distributions which are caused by extreme negative pressure heads in the unsaturated zone during dry periods. This problem can only be alleviated by methodology (iii), (iv) and (v). The last approach gives the best results and avoids unphysical updates related to strongly skewed pressure heads in the unsaturated zone. If groundwater level data are assimilated by methodology (iii), EnKF fails to update the state vector containing the soil moisture values if for (almost) all the realizations the observation does not bring significant new information. Synthetic experiments for the joint assimilation of groundwater levels and surface soil moisture support methodology (v) and show great potential for improving the representation of root zone soil moisture.
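The core of each assimilation methodology above is an ensemble Kalman filter update of a state vector against observed groundwater levels. The following is a minimal, hedged sketch of a stochastic EnKF update with perturbed observations; the state layout (pressure head in saturated cells, soil moisture in unsaturated cells, as in methodology (v)), the observation operator, ensemble size and noise level are illustrative and are not the TerrSysMP implementation.

```python
import numpy as np

def enkf_update(X, H, y, obs_err_std, rng=np.random.default_rng(0)):
    """Stochastic EnKF update with perturbed observations.

    X : (n_state, n_ens) ensemble of state vectors (e.g. pressure head in
        saturated cells and soil moisture in unsaturated cells).
    H : (n_obs, n_state) observation operator (e.g. selecting the cell that
        corresponds to the observed groundwater level).
    y : (n_obs,) observed values.
    """
    n_obs, n_ens = H.shape[0], X.shape[1]
    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm                                    # ensemble anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)
    P_yy = HA @ HA.T / (n_ens - 1) + np.eye(n_obs) * obs_err_std**2
    P_xy = A @ HA.T / (n_ens - 1)
    K = P_xy @ np.linalg.inv(P_yy)                # Kalman gain
    Y = y[:, None] + rng.normal(0.0, obs_err_std, size=(n_obs, n_ens))
    return X + K @ (Y - HX)                       # updated ensemble

# Toy example: 5 state variables, 64 members, one groundwater-level observation.
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 64))
H = np.zeros((1, 5)); H[0, 4] = 1.0               # observe the deepest (saturated) cell
X_upd = enkf_update(X, H, y=np.array([0.5]), obs_err_std=0.05)
```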
NASA Astrophysics Data System (ADS)
Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.
2012-03-01
Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as smooth contours centered at peaks of stored thermal energy, termed Regions of Interest (ROI). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks based on multi-resolution analysis, which is robust to low ROI contrast and to non-uniform heating. Non-uniform heating affects low spatial frequencies and hinders the detection of relevant points in the image. The proposed methodology includes local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; a Gaussian window is used because thermal behavior is well modeled by Gaussian smooth contours. The Gaussian scale is then used to analyze details in the image through multi-resolution analysis, mitigating low contrast and non-uniform heating and avoiding the need to select the Gaussian window size. Finally, local edge detection provides a good estimation of the boundaries of the ROI. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that performs as well as or better than other dedicated algorithms proposed in the state of the art.
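A minimal sketch of the interest-point step described above: correlating the thermogram with a Gaussian window and keeping local maxima as candidate ROI centres. The function name, the use of Gaussian smoothing as a surrogate for correlation with a Gaussian window, and the thresholds are assumptions for illustration; the published method additionally applies Gaussian scale analysis and local edge detection.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def roi_candidates(thermal_img, sigma=3.0, rel_thresh=0.6):
    """Return (row, col) candidates where the image locally resembles a warm
    Gaussian blob of scale `sigma` (interest points related to ROIs)."""
    img = (thermal_img - thermal_img.mean()) / (thermal_img.std() + 1e-12)
    # Smoothing with a Gaussian kernel acts here as correlation with a Gaussian window.
    response = gaussian_filter(img, sigma)
    peaks = response == maximum_filter(response, size=int(6 * sigma) + 1)
    strong = response > rel_thresh * response.max()
    return np.argwhere(peaks & strong)

# Toy thermogram: noisy background plus one warm Gaussian spot.
yy, xx = np.mgrid[0:128, 0:128]
img = np.exp(-((xx - 40) ** 2 + (yy - 80) ** 2) / (2 * 5.0 ** 2)) + 0.05 * np.random.rand(128, 128)
print(roi_candidates(img))
```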
Seismic behavior of a low-rise horizontal cylindrical tank
NASA Astrophysics Data System (ADS)
Fiore, Alessandra; Rago, Carlo; Vanzi, Ivo; Greco, Rita; Briseghella, Bruno
2018-05-01
Cylindrical storage tanks are widely used for various types of liquids, including hazardous contents, thus requiring suitable and careful design for seismic actions. The study herein presented deals with the dynamic analysis of a ground-based horizontal cylindrical tank containing butane and with its safety verification. The analyses are based on a detailed finite element (FE) model; a simplified one-degree-of-freedom idealization is also set up and used for verification of the FE results. Particular attention is paid to sloshing and asynchronous seismic input effects. Sloshing effects are investigated according to the current literature state of the art. An efficient methodology based on an "impulsive-convective" decomposition of the container-fluid motion is adopted for the calculation of the seismic force. The effects of asynchronous ground motion are studied by suitable pseudo-static analyses. Comparison between seismic action effects, obtained with and without consideration of sloshing and asynchronous seismic input, shows a rather important influence of these conditions on the final results.
Quantification of prebiotics in commercial infant formulas.
Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia
2016-03-01
Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formula and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.
Diagnosis of skin cancer using image processing
NASA Astrophysics Data System (ADS)
Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué; Coronel-Beltrán, Ángel
2014-10-01
In this paper a methodology for classifying skin cancer in images of dermatologic spots, based on spectral analysis using the K-law Fourier non-linear technique, is presented. The image is segmented and binarized to build the function that contains the interest area. The image is divided into its RGB channels to obtain the spectral properties of each channel. The green channel contains the most information and is therefore always chosen. This information is multiplied point by point by a binary mask, and a Fourier transform written in non-linear form is applied to the result. Where the real part of this spectrum is positive, the spectral density takes unit values; otherwise it is zero. Finally, the ratio of the sum of the unit values of the spectral density to the sum of the values of the binary mask is calculated. This ratio is called the spectral index. When the calculated value lies within the spectral index range, three types of cancer can be detected; values outside this range correspond to benign lesions.
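A minimal sketch of the spectral index computation described above. The exact form of the K-law non-linear filter (here, amplitude raised to a power k with the original phase retained) and the value of k are our assumptions; the masking, sign test on the real part, and ratio to the mask sum follow the description in the abstract.

```python
import numpy as np

def spectral_index(green_channel, mask, k=0.3):
    """Spectral index of a lesion: fraction of mask energy for which the
    K-law-filtered spectrum has a positive real part.

    green_channel : 2-D float array (green plane of the RGB image).
    mask          : 2-D binary array delimiting the interest area.
    k             : exponent of the K-law non-linear filter (illustrative value).
    """
    roi = green_channel * mask                       # point-to-point multiplication
    F = np.fft.fft2(roi)
    F_k = np.abs(F) ** k * np.exp(1j * np.angle(F))  # K-law: |F|^k with original phase
    density = (F_k.real > 0).astype(float)           # unit values where Re > 0
    return density.sum() / mask.sum()                # spectral index

# Toy usage with a synthetic 64x64 image and a circular mask.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
mask = (((xx - 32) ** 2 + (yy - 32) ** 2) < 20 ** 2).astype(float)
print(round(spectral_index(img, mask), 3))
```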
Pugajeva, Iveta; Rozentale, Irina; Viksna, Arturs; Bartkiene, Elena; Bartkevics, Vadims
2016-12-01
A selective methodology employing a tandem quadrupole mass spectrometer coupled to a gas chromatograph with a headspace autosampler (HS-GC-MS/MS) was developed in this study. Application of the elaborated procedure resulted in a limit of detection of 0.021 μg kg⁻¹ and a limit of quantification of 0.071 μg kg⁻¹. The mean recoveries during in-house validation ranged from 89% to 109%, and coefficients of variation for repeatability ranged from 4% to 11%. The proposed analytical method was applied to monitoring the furan content of 30 commercial baby food samples available on the Latvian retail market. The level of furan found in these samples varied from 0.45 to 81.9 μg kg⁻¹, indicating that infants whose sole diet comprises baby food sold in jars and cans are constantly exposed to furan. Samples containing vegetables and meat had higher levels of furan than those containing only fruits. Copyright © 2016 Elsevier Ltd. All rights reserved.
Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis
NASA Astrophysics Data System (ADS)
Litjens, Geert; Sánchez, Clara I.; Timofeeva, Nadya; Hermsen, Meyke; Nagtegaal, Iris; Kovacs, Iringo; Hulsbergen-van de Kaa, Christina; Bult, Peter; van Ginneken, Bram; van der Laak, Jeroen
2016-05-01
Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30-40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging.
Analysis of pressure distortion testing
NASA Technical Reports Server (NTRS)
Koch, K. E.; Rees, R. L.
1976-01-01
The development of a distortion methodology, method D, was documented, and its application to steady state and unsteady data was demonstrated. Three methodologies based upon DIDENT, a NASA-LeRC distortion methodology based upon the parallel compressor model, were investigated by applying them to a set of steady state data. The best formulation was then applied to an independent data set. The good correlation achieved with this data set showed that method E, one of the above methodologies, is a viable concept. Unsteady data were analyzed by using the method E methodology. This analysis pointed out that the method E sensitivities are functions of pressure defect level as well as corrected speed and pattern.
Schroll, Jeppe Bennekou; Abdel-Sattar, Maher; Bero, Lisa
2015-01-01
To compare the accessibility, comprehensiveness, and usefulness of data available from the European Medicines Agency (EMA) and the Food and Drug Administration (FDA) drug reports. This is a cross-sectional study. All new molecular drugs approved between January 1, 2011 and December 31, 2012 from the FDA and EMA Web sites were eligible. We included 27 drug reports. Most were searchable, but the FDA table of contents did not match the file's page numbers. Several FDA documents must be searched compared with a single EMA document, but the FDA reports contain more summary data on harms. Detailed information about harms was reported for 93% of the FDA reports (25 of 27 reports) and 26% of the EMA reports (7 of 27 reports). The reports contained information about trial methodology but did not include trial registry IDs or investigator names. All reports but one contained sufficient information to be used in a meta-analysis. Detailed data on efficacy and harms are available at the two agencies. The FDA has more summary data on harms, but the documents are harder to navigate. Published by Elsevier Inc.
Modeling energy/economy interactions for conservation and renewable energy-policy analysis
NASA Astrophysics Data System (ADS)
Groncki, P. J.
Energy policy and the implications for policy analysis and the methodological tools are discussed. The evolution of one methodological approach and the combined modeling system of the component models, their evolution in response to changing analytic needs, and the development of the integrated framework are reported. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.
Speed Accuracy Tradeoffs in Human Speech Production
2017-05-01
for considering Fitts’ law in the domain of speech production is elucidated. Methodological challenges in applying Fitts-style analysis are addressed ... order to assess whether articulatory kinematics conform to Fitts’ law. A second, associated goal is to address the methodological challenges inherent in ... performing Fitts-style analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor
Tularosa Basin Play Fairway Analysis: Methodology Flow Charts
Adam Brandt
2015-11-15
These images show the comprehensive methodology used to create a Play Fairway Analysis exploring the geothermal resource potential of the Tularosa Basin, New Mexico. The deterministic methodology originated in the petroleum industry but was custom-modified to function as a knowledge-based geothermal exploration tool. The stochastic PFA flow chart uses weights of evidence and is data-driven.
Regional Shelter Analysis Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, Michael B.; Dennison, Deborah; Kane, Jave
2015-08-01
The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country-specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
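A minimal sketch of the core combination step: weighting building protection factors by the fraction of the population in each location to obtain a population-mean dose and an effective protection factor. The building categories, protection factors, occupancy fractions and unshielded dose are invented for illustration and are not values from the report.

```python
# Illustrative population-weighted protection calculation for one region.
# Protection factors (PF) and occupancy fractions are made-up numbers.
buildings = {
    # building type: (protection factor, fraction of population inside at night)
    "wood frame house":      (3.0,  0.55),
    "masonry apartment":     (10.0, 0.30),
    "large office/basement": (40.0, 0.10),
    "outdoors":              (1.0,  0.05),
}

outdoor_dose = 5.0  # arbitrary unshielded fallout dose (Gy) for the scenario

# Dose to a person in a given location = outdoor dose / protection factor.
mean_dose = sum(frac * outdoor_dose / pf for pf, frac in buildings.values())
effective_pf = outdoor_dose / mean_dose
print(f"population-mean dose: {mean_dose:.2f} Gy, effective PF: {effective_pf:.1f}")
```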
Fazio, Simone; Garraín, Daniel; Mathieux, Fabrice; De la Rúa, Cristina; Recchioni, Marco; Lechón, Yolanda
2015-01-01
Under the framework of the European Platform on Life Cycle Assessment, the European Reference Life-Cycle Database (ELCD - developed by the Joint Research Centre of the European Commission), provides core Life Cycle Inventory (LCI) data from front-running EU-level business associations and other sources. The ELCD contains energy-related data on power and fuels. This study describes the methods to be used for the quality analysis of energy data for European markets (available in third-party LC databases and from authoritative sources) that are, or could be, used in the context of the ELCD. The methodology was developed and tested on the energy datasets most relevant for the EU context, derived from GaBi (the reference database used to derive datasets for the ELCD), Ecoinvent, E3 and Gemis. The criteria for the database selection were based on the availability of EU-related data, the inclusion of comprehensive datasets on energy products and services, and the general approval of the LCA community. The proposed approach was based on the quality indicators developed within the International Reference Life Cycle Data System (ILCD) Handbook, further refined to facilitate their use in the analysis of energy systems. The overall Data Quality Rating (DQR) of the energy datasets can be calculated by summing up the quality rating (ranging from 1 to 5, where 1 represents very good, and 5 very poor quality) of each of the quality criteria indicators, divided by the total number of indicators considered. The quality of each dataset can be estimated for each indicator, and then compared with the different databases/sources. The results can be used to highlight the weaknesses of each dataset and can be used to guide further improvements to enhance the data quality with regard to the established criteria. This paper describes the application of the methodology to two exemplary datasets, in order to show the potential of the methodological approach. The analysis helps LCA practitioners to evaluate the usefulness of the ELCD datasets for their purposes, and dataset developers and reviewers to derive information that will help improve the overall DQR of databases.
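The DQR arithmetic described above reduces to a simple mean of the per-indicator scores. The sketch below illustrates it; the indicator names echo ILCD-style quality criteria, and both the names and the scores are illustrative rather than taken from the study.

```python
# Overall Data Quality Rating (DQR): mean of the per-indicator quality scores,
# each ranging from 1 (very good) to 5 (very poor). Names and scores are illustrative.
scores = {
    "technological representativeness": 2,
    "geographical representativeness": 1,
    "time representativeness": 3,
    "completeness": 2,
    "precision/uncertainty": 3,
    "methodological appropriateness": 2,
}
dqr = sum(scores.values()) / len(scores)
print(f"DQR = {dqr:.2f}")   # lower is better
```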
NASA Technical Reports Server (NTRS)
Muss, J. A.; Nguyen, T. V.; Johnson, C. W.
1991-01-01
The appendices A-K to the user's manual for the rocket combustor interactive design (ROCCID) computer program are presented. This includes installation instructions, flow charts, subroutine model documentation, and sample output files. The ROCCID program, written in Fortran 77, provides a standardized methodology using state of the art codes and procedures for the analysis of a liquid rocket engine combustor's steady state combustion performance and combustion stability. The ROCCID is currently capable of analyzing mixed element injector patterns containing impinging like doublet or unlike triplet, showerhead, shear coaxial and swirl coaxial elements as long as only one element type exists in each injector core, baffle, or barrier zone. Real propellant properties of oxygen, hydrogen, methane, propane, and RP-1 are included in ROCCID. The properties of other propellants can be easily added. The analysis models in ROCCID can account for the influences of acoustic cavities, helmholtz resonators, and radial thrust chamber baffles on combustion stability. ROCCID also contains the logic to interactively create a combustor design which meets input performance and stability goals. A preliminary design results from the application of historical correlations to the input design requirements. The steady state performance and combustion stability of this design is evaluated using the analysis models, and ROCCID guides the user as to the design changes required to satisfy the user's performance and stability goals, including the design of stability aids. Output from ROCCID includes a formatted input file for the standardized JANNAF engine performance prediction procedure.
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
2017-04-30
practices in latent variable theory, it is not surprising that effective measurement programs present methodological typing and considering of experimental ... 3.3 Methodology ... Revised Enterprise Modeling Methodology ... Conclusions
Mondello, Luigi; Casilli, Alessandro; Tranchida, Peter Quinto; Lo Presti, Maria; Dugo, Paola; Dugo, Giovanni
2007-11-01
The present research is focused on the development of a comprehensive two-dimensional gas chromatography-rapid scanning quadrupole mass spectrometric (GC x GC-qMS) methodology for the analysis of trace-amount pesticides contained in a complex real-world sample. Reliable peak assignment was carried out by using a recently developed, dedicated pesticide MS library (for comprehensive GC analysis), characterized by a twin-filter search procedure, the first based on a minimum degree of spectral similarity and the second on the interactive use of linear retention indices (LRI). The library was constructed by subjecting mixtures of commonly used pesticides to GC x GC-qMS analysis and then deriving their pure mass spectra and LRI values. In order to verify the effectiveness of the approach, a pesticide-contaminated red grapefruit extract was analysed. The certainty of peak assignment was attained by exploiting both the enhanced separation power of dual-oven GC x GC and the highly effective search procedure.
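A minimal sketch of the twin-filter search logic described above: a library hit must pass both a spectral similarity threshold and a linear retention index (LRI) window. The cosine similarity metric, threshold and window width are assumptions for illustration, not the published library settings.

```python
import numpy as np

def twin_filter_search(query_spectrum, query_lri, library,
                       min_similarity=0.85, lri_window=10.0):
    """Return library hits passing both filters: spectral similarity above a
    threshold and an LRI within a tolerance window. `library` is a list of
    dicts with 'name', 'spectrum' (m/z-aligned intensity vector) and 'lri'."""
    hits = []
    q = np.asarray(query_spectrum, dtype=float)
    for entry in library:
        s = np.asarray(entry["spectrum"], dtype=float)
        sim = np.dot(q, s) / (np.linalg.norm(q) * np.linalg.norm(s) + 1e-12)  # cosine match
        if sim >= min_similarity and abs(entry["lri"] - query_lri) <= lri_window:
            hits.append((entry["name"], round(float(sim), 3)))
    return sorted(hits, key=lambda h: -h[1])

# Toy library with two pesticides on an aligned 5-channel "spectrum".
library = [
    {"name": "pesticide A", "spectrum": [10, 0, 5, 80, 30], "lri": 1712.0},
    {"name": "pesticide B", "spectrum": [60, 20, 0, 5, 15], "lri": 1988.0},
]
print(twin_filter_search([12, 1, 6, 75, 28], query_lri=1715.0, library=library))
```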
Using cluster analysis for medical resource decision making.
Dilts, D; Khamalah, J; Plotkin, A
1995-01-01
Escalating costs of health care delivery have in the recent past often made the health care industry investigate, adapt, and apply those management techniques relating to budgeting, resource control, and forecasting that have long been used in the manufacturing sector. A strategy that has contributed much in this direction is the definition and classification of a hospital's output into "products" or groups of patients that impose similar resource or cost demands on the hospital. Existing classification schemes have frequently employed cluster analysis in generating these groupings. Unfortunately, the myriad articles and books on clustering and classification contain few formalized selection methodologies for choosing a technique for solving a particular problem, hence they often leave the novice investigator at a loss. This paper reviews the literature on clustering, particularly as it has been applied in the medical resource-utilization domain, addresses the critical choices facing an investigator in the medical field using cluster analysis, and offers suggestions (using the example of clustering low-vision patients) for how such choices can be made.
Multilevel sparse functional principal component analysis.
Di, Chongzhi; Crainiceanu, Ciprian M; Jank, Wolfgang S
2014-01-29
We consider analysis of sparsely sampled multilevel functional data, where the basic observational unit is a function and data have a natural hierarchy of basic units. An example is when functions are recorded at multiple visits for each subject. Multilevel functional principal component analysis (MFPCA; Di et al. 2009) was proposed for such data when functions are densely recorded. Here we consider the case when functions are sparsely sampled and may contain only a few observations per function. We exploit the multilevel structure of covariance operators and achieve data reduction by principal component decompositions at both between and within subject levels. We address inherent methodological differences in the sparse sampling context to: 1) estimate the covariance operators; 2) estimate the functional principal component scores; 3) predict the underlying curves. Through simulations the proposed method is able to discover dominating modes of variations and reconstruct underlying curves well even in sparse settings. Our approach is illustrated by two applications, the Sleep Heart Health Study and eBay auctions.
Sutherland, Devon J; Stearman, G Kim; Wells, Martha J M
2003-01-01
The transport and fate of pesticides applied to ornamental plant nursery crops are not well documented. Methodology for analysis of soil and water runoff samples concomitantly containing the herbicides simazine (1-chloro-4,6-bis(ethylamino)-s-triazine) and 2,4-D ((2,4-dichlorophenoxy)acetic acid) was developed in this research to investigate the potential for runoff and leaching from ornamental nursery plots. Solid-phase extraction was used prior to analysis by gas chromatography and liquid chromatography. Chromatographic results were compared with determination by enzyme-linked immunoassay analysis. The significant analytical contributions of this research include (1) the development of a scheme using chromatographic mode sequencing for the fractionation of simazine and 2,4-D, (2) optimization of the homogeneous derivatization of 2,4-D using the methylating agent boron trifluoride in methanol as an alternative to in situ generation of diazomethane, and (3) the practical application of these techniques to field samples.
Relevant principal component analysis applied to the characterisation of Portuguese heather honey.
Martins, Rui C; Lopes, Victor V; Valentão, Patrícia; Carvalho, João C M F; Isabel, Paulo; Amaral, Maria T; Batista, Maria T; Andrade, Paula B; Silva, Branca M
2008-01-01
The main purpose of this study was the characterisation of 'Serra da Lousã' heather honey by using novel statistical methodology, relevant principal component analysis, in order to assess the correlations between production year, locality and composition. Herein, we also report its chemical composition in terms of sugars, glycerol and ethanol, and physicochemical parameters. Sugars profiles from 'Serra da Lousã' heather and 'Terra Quente de Trás-os-Montes' lavender honeys were compared and allowed the discrimination: 'Serra da Lousã' honeys do not contain sucrose, generally exhibit lower contents of turanose, trehalose and maltose and higher contents of fructose and glucose. Different localities from 'Serra da Lousã' provided groups of samples with high and low glycerol contents. Glycerol and ethanol contents were revealed to be independent of the sugars profiles. These data and statistical models can be very useful in the comparison and detection of adulterations during the quality control analysis of 'Serra da Lousã' honey.
Testing and analysis of flat and curved panels with multiple cracks
NASA Technical Reports Server (NTRS)
Broek, David; Jeong, David Y.; Thomson, Douglas
1994-01-01
An experimental and analytical investigation of multiple cracking in various types of test specimens is described in this paper. The testing phase is comprised of a flat unstiffened panel series and curved stiffened and unstiffened panel series. The test specimens contained various configurations for initial damage. Static loading was applied to these specimens until ultimate failure, while loads and crack propagation were recorded. This data provides the basis for developing and validating methodologies for predicting linkup of multiple cracks, progression to failure, and overall residual strength. The results from twelve flat coupon and ten full scale curved panel tests are presented. In addition, an engineering analysis procedure was developed to predict multiple crack linkup. Reasonable agreement was found between predictions and actual test results for linkup and residual strength for both flat and curved panels. The results indicate that an engineering analysis approach has the potential to quantitatively assess the effect of multiple cracks in the arrest capability of an aircraft fuselage structure.
Unified methodology for airport pavement analysis and design. Vol. 1, state of the art
DOT National Transportation Integrated Search
1991-06-01
This report presents an assessment of the state of the art of airport pavement analysis : and design. The objective is to identify those areas in current airport pavement : analysis methodology that need to be substantially improved from the perspect...
DOT National Transportation Integrated Search
2006-11-01
This report discusses data acquisition and analysis for grade crossing risk analysis at the proposed San Joaquin High-Speed Rail Corridor in San Joaquin, California, and documents the data acquisition and analysis methodologies used to collect and an...
Liquid Crystal Droplet-Based Amplification of Microvesicles that are Shed by Mammalian Cells
Tan, Lie Na; Wiepz, Gregory J.; Miller, Daniel S.; Shusta, Eric V.; Abbott, Nicholas L.
2014-01-01
Membrane-derived microvesicles (MVs) shed by cells are being investigated for their role in intercellular communication and as potential biomarkers of disease, but facile and sensitive methods for their analysis do not exist. Here we demonstrate new principles for analysis of MVs that use micrometer-sized droplets of liquid crystals (LCs) to amplify MVs that are selectively captured via antibody-mediated interactions. The influence of the MVs on the micrometer-sized LC droplets is shown to be readily quantified via use of flow cytometry. The methodology was developed using MVs shed by epidermoid carcinoma A431 cells that contain epidermal growth factor receptor (EGFR) as an important and representative example of MVs containing signaling proteins that play a central role in cancer. The LC droplets were found to be sensitive to 10⁶ MVs containing EGFR (relative to controls using isotype control antibody) and to possess a dynamic range of response across several orders of magnitude. Because the 100 nm-sized MVs captured via EGFR generate an optical response in the micrometer-sized LC droplets that can be readily detected by flow cytometry in light scattering mode, the approach possesses significant advantages over direct detection of MVs by flow cytometry. The LC droplets are also substantially more sensitive than techniques such as immunoblotting because the lipid component of the MVs serves to amplify the antibody-mediated capture of the target proteins in the MVs. Other merits of the approach are defined and discussed in the paper. PMID:24667742
Fernandez-Ricaud, Luciano; Kourtchenko, Olga; Zackrisson, Martin; Warringer, Jonas; Blomberg, Anders
2016-06-23
Phenomics is a field in functional genomics that records variation in organismal phenotypes in the genetic, epigenetic or environmental context at a massive scale. For microbes, the key phenotype is the growth in population size because it contains information that is directly linked to fitness. Due to technical innovations and extensive automation our capacity to record complex and dynamic microbial growth data is rapidly outpacing our capacity to dissect and visualize this data and extract the fitness components it contains, hampering progress in all fields of microbiology. To automate visualization, analysis and exploration of complex and highly resolved microbial growth data as well as standardized extraction of the fitness components it contains, we developed the software PRECOG (PREsentation and Characterization Of Growth-data). PRECOG allows the user to quality control, interact with and evaluate microbial growth data with ease, speed and accuracy, also in cases of non-standard growth dynamics. Quality indices filter high- from low-quality growth experiments, reducing false positives. The pre-processing filters in PRECOG are computationally inexpensive and yet functionally comparable to more complex neural network procedures. We provide examples where data calibration, project design and feature extraction methodologies have a clear impact on the estimated growth traits, emphasising the need for proper standardization in data analysis. PRECOG is a tool that streamlines growth data pre-processing, phenotypic trait extraction, visualization, distribution and the creation of vast and informative phenomics databases.
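Extracting fitness components from a population growth curve is the kind of step PRECOG standardizes. The sketch below is a generic textbook extraction of maximum growth rate, lag time and yield from a log-transformed curve, assuming evenly sampled optical-density-like data; it is not PRECOG's actual algorithm.

```python
import numpy as np

def growth_traits(t, log_pop):
    """Simple fitness components from a log-transformed growth curve:
    maximum slope (growth rate), lag time (where the tangent at the steepest
    point meets the starting level) and yield (final minus initial level)."""
    slopes = np.gradient(log_pop, t)
    i = int(np.argmax(slopes))
    rate = slopes[i]
    lag = t[i] - (log_pop[i] - log_pop[0]) / rate
    yield_ = log_pop[-1] - log_pop[0]
    return {"rate": rate, "lag": lag, "yield": yield_}

# Toy curve: logistic growth sampled every 20 minutes for 24 h.
t = np.linspace(0, 24, 73)                        # hours
log_pop = np.log(0.05 + 0.95 / (1 + np.exp(-(t - 8) / 1.5)))
print({k: round(v, 2) for k, v in growth_traits(t, log_pop).items()})
```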
Smieszek, Tomas W.; Granato, Gregory E.
2000-01-01
Spatial data are important for interpretation of water-quality information on a regional or national scale. Geographic information systems (GIS) facilitate interpretation and integration of spatial data. The geographic information and data compiled for the conterminous United States during the National Highway Runoff Water-Quality Data and Methodology Synthesis project is described in this document, which also includes information on the structure, file types, and the geographic information in the data files. This 'geodata' directory contains two subdirectories, labeled 'gisdata' and 'gisimage.' The 'gisdata' directory contains ArcInfo coverages, ArcInfo export files, shapefiles (used in ArcView), Spatial Data Transfer Standard Topological Vector Profile format files, and meta files in subdirectories organized by file type. The 'gisimage' directory contains the GIS data in common image-file formats. The spatial geodata includes two rain-zone region maps and a map of national ecosystems originally published by the U.S. Environmental Protection Agency; regional estimates of mean annual streamflow, and water hardness published by the Federal Highway Administration; and mean monthly temperature, mean annual precipitation, and mean monthly snowfall modified from data published by the National Climatic Data Center and made available to the public by the Oregon Climate Service at Oregon State University. These GIS files were compiled for qualitative spatial analysis of available data on a national and(or) regional scale and therefore should be considered as qualitative representations, not precise geographic location information.
NASA Astrophysics Data System (ADS)
Athiyamaan, V.; Mohan Ganesh, G.
2017-11-01
Self-Compacting Concrete is one of the special concretes that has the ability to flow and consolidate under its own weight, completely filling the formwork even in the presence of dense reinforcement, whilst maintaining its homogeneity throughout the formwork without any need for vibration. Researchers all over the world are developing high-performance concrete by adding various fibers and admixtures in different proportions. Different kinds of fibers, such as glass, steel, carbon, polypropylene and aramid fibers, improve concrete properties such as tensile strength, fatigue characteristics, durability, shrinkage, impact and erosion resistance, and serviceability [6]. The work includes a fundamental study of fiber-reinforced self-compacting concrete with admixtures, its rheological and mechanical properties, and an overview of design methodology and statistical approaches to optimizing concrete performance. The study is organised into seven basic chapters: introduction; study of material properties and review of self-compacting concrete; overview of fiber-reinforced self-compacting concrete containing admixtures; review of design and analysis of experiments (a statistical approach); summary of existing works on FRSCC and statistical modeling; literature review; and conclusion. Knowledge of recent studies on polymer-based binder materials (fly ash, metakaolin, GGBS, etc.), fiber-reinforced concrete and SCC is essential for effective research on fiber-reinforced self-compacting concrete containing admixtures. The key aim of the study is to identify the research gap and to gain a complete understanding of polymer-based self-compacting fiber-reinforced concrete.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
Oshiyama, Natália F; Bassani, Rosana A; D'Ottaviano, Itala M L; Bassani, José W M
2012-04-01
As technology evolves, the role of medical equipment in the healthcare system, as well as technology management, becomes more important. Although the existence of large databases containing management information is currently common, extracting useful information from them is still difficult. A useful tool for identification of frequently failing equipment, which increases maintenance cost and downtime, would be the classification according to the corrective maintenance data. Nevertheless, establishment of classes may create inconsistencies, since an item may be close to two classes by the same extent. Paraconsistent logic might help solve this problem, as it allows the existence of inconsistent (contradictory) information without trivialization. In this paper, a methodology for medical equipment classification based on the ABC analysis of corrective maintenance data is presented, and complemented with a paraconsistent annotated logic analysis, which may enable the decision maker to take into consideration alerts created by the identification of inconsistencies and indeterminacies in the classification.
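A minimal sketch of the ABC classification step on corrective maintenance data: rank items by cost and assign classes by cumulative share. The 80%/95% cut-offs are the conventional values and the cost figures are invented; the paraconsistent refinement of borderline (inconsistent) cases is not shown.

```python
def abc_classify(maintenance_cost, cutoffs=(0.80, 0.95)):
    """Classify equipment items as A, B or C by cumulative share of total
    corrective-maintenance cost (conventional 80/95% cut-offs)."""
    total = sum(maintenance_cost.values())
    ranked = sorted(maintenance_cost.items(), key=lambda kv: -kv[1])
    classes, cumulative = {}, 0.0
    for item, cost in ranked:
        cumulative += cost / total
        classes[item] = "A" if cumulative <= cutoffs[0] else \
                        "B" if cumulative <= cutoffs[1] else "C"
    return classes

# Illustrative annual corrective-maintenance costs per equipment item.
costs = {"ventilator": 5200, "infusion pump": 3100, "monitor": 900,
         "ECG": 450, "scale": 120}
print(abc_classify(costs))
```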
NASA TSRV essential flight control system requirements via object oriented analysis
NASA Technical Reports Server (NTRS)
Duffy, Keith S.; Hoza, Bradley J.
1992-01-01
The objective was to analyze the baseline flight control system of the Transport Systems Research Vehicle (TSRV) and to develop a system specification that offers high visibility of the essential system requirements in order to facilitate the future development of alternate, more advanced software architectures. The flight control system is defined to be the baseline software for the TSRV research flight deck, including all navigation, guidance, and control functions, and primary pilot displays. The Object Oriented Analysis (OOA) methodology developed is used to develop a system requirement definition. The scope of the requirements definition contained herein is limited to a portion of the Flight Management/Flight Control computer functionality. The development of a partial system requirements definition is documented, and includes a discussion of the tasks required to increase the scope of the requirements definition and recommendations for follow-on research.
Multidimensional stock network analysis: An Escoufier's RV coefficient approach
NASA Astrophysics Data System (ADS)
Lee, Gan Siew; Djauhari, Maman A.
2013-09-01
The current practice of stock network analysis is based on the assumption that the time series of closing prices can represent the behaviour of each stock. This assumption leads to the use of the minimal spanning tree (MST) and the sub-dominant ultrametric (SDU) as indispensable tools to filter the economic information contained in the network. Recently, researchers have attempted to represent a stock not only as a univariate time series of closing prices but as a bivariate time series of closing price and volume. In that case, they developed the so-called multidimensional MST to filter the important economic information. However, in this paper, we show that their approach is applicable only to that bivariate time series. This leads us to introduce a new methodology for constructing the MST where each stock is represented by a multivariate time series. An example from the Malaysian stock exchange is presented and discussed to illustrate the advantages of the method.
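A hedged sketch of the idea behind the title: measure the similarity of two multivariate stock series with Escoufier's RV coefficient, convert it to a distance, and build the MST. The distance transform (1 - RV) and the toy data are illustrative choices, not necessarily those of the paper.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def rv_coefficient(X, Y):
    """Escoufier's RV coefficient between two multivariate series
    (rows = time points, columns = variables, e.g. closing price and volume)."""
    X = X - X.mean(axis=0); Y = Y - Y.mean(axis=0)
    Sxy = X.T @ Y; Sxx = X.T @ X; Syy = Y.T @ Y
    return np.trace(Sxy @ Sxy.T) / np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))

def mst_from_series(series):
    """Build an MST where each stock is a multivariate time series; the edge
    weight is a distance derived from the RV coefficient (here, 1 - RV)."""
    n = len(series)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = 1.0 - rv_coefficient(series[i], series[j])
    return minimum_spanning_tree(D).toarray()

# Toy example: three "stocks", each a (time x 2) array of price and volume.
rng = np.random.default_rng(0)
stocks = [rng.normal(size=(100, 2)) for _ in range(3)]
print(mst_from_series(stocks))
```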
Qualitative Importance Measures of Systems Components - A New Approach and Its Applications
NASA Astrophysics Data System (ADS)
Chybowski, Leszek; Gawdzińska, Katarzyna; Wiśnicki, Bogusz
2016-12-01
The paper presents an improved methodology for analysing the qualitative importance of components in the functional and reliability structures of a system. We present basic importance measures, i.e. Birnbaum's structural measure, the order of the smallest minimal cut set, the repetition count of an i-th event in the Fault Tree, and the streams measure. A subsystem of circulation pumps and fuel heaters in the main engine fuel supply system of a container vessel illustrates the qualitative importance analysis. We constructed a functional model and a Fault Tree, which we analysed using qualitative measures. Additionally, we compared the calculated measures and introduced corrected measures as a tool for improving the analysis. We proposed scaled measures and a common measure taking into account the location of the component in the reliability and functional structures. Finally, we proposed an area where the measures could be applied.
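For reference, Birnbaum's structural measure of a component is the fraction of states of the remaining components for which that component is critical to system operation. The sketch below computes it by enumeration for a toy series-parallel structure; the structure function is illustrative and is not the actual pump/heater subsystem analysed in the paper.

```python
from itertools import product

def birnbaum_structural(phi, n):
    """Birnbaum's structural importance of each of n components of a coherent
    system with structure function phi(state_tuple) -> 0/1: the fraction of
    states of the other components for which the component is critical."""
    importance = []
    for i in range(n):
        critical = 0
        for others in product((0, 1), repeat=n - 1):
            up = others[:i] + (1,) + others[i:]
            down = others[:i] + (0,) + others[i:]
            critical += phi(up) - phi(down)
        importance.append(critical / 2 ** (n - 1))
    return importance

# Toy structure: component 1 in series with a parallel pair (components 2 and 3).
phi = lambda x: x[0] * (1 - (1 - x[1]) * (1 - x[2]))
print(birnbaum_structural(phi, 3))   # [0.75, 0.25, 0.25]: component 1 dominates
```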
Alves, Mateus Feitosa; Ferreira, Larissa Adilis Maria Paiva; Gadelha, Francisco Allysson Assis Ferreira; Ferreira, Laércia Karla Diega Paiva; Felix, Mayara Barbalho; Scotti, Marcus Tullius; Scotti, Luciana; de Oliveira, Kardilândia Mendes; Dos Santos, Sócrates Golzio; Diniz, Margareth de Fátima Formiga Melo
2017-12-04
The ethanolic extract of the leaves of Cissampelos sympodialis showed great pharmacological potential, with anti-inflammatory and immunomodulatory activities; however, it also showed some toxicological effects. Therefore, this study aims to verify the toxicological potential of alkaloids of the genus Cissampelos through in silico methodologies, to develop an LC-MS/MS method to verify the presence of alkaloids in the infusion, and to evaluate the toxicity of the infusion of the leaves of C. sympodialis when inhaled by Swiss mice. The in silico results showed that alkaloid 93 presented high toxicological potential, along with the products of its metabolism. LC-MS/MS results showed that the infusion of the leaves of this plant contained the alkaloids warifteine and methylwarifteine. Finally, the in vivo toxicological analysis of the C. sympodialis infusion, covering biochemistry, organ weights and histological analysis, showed that the infusion of C. sympodialis leaves presents low toxicity.
Analysis of Alternatives for Risk Assessment Methodologies and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.
The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Creasy, John T
2015-05-12
This project has the objective to reduce and/or eliminate the use of HEU in commerce. Steps in the process include developing a target testing methodology that is bounding for all Mo-99 target irradiators, establishing a maximum target LEU-foil mass, developing a LEU-foil target qualification document, developing a bounding target failure analysis methodology (failure in reactor containment), optimizing safety vs. economics (goal is to manufacture a safe, but relatively inexpensive target to offset the inherent economic disadvantage of using LEU in place of HEU), and developing target material specifications and manufacturing QC test criteria. The slide presentation is organized under the following topics: Objective, Process Overview, Background, Team Structure, Key Achievements, Experiment and Activity Descriptions, and Conclusions. The High Density Target project has demonstrated: approx. 50 targets irradiated through domestic and international partners; proof of concept for two front end processing methods; fabrication of uranium foils for target manufacture; quality control procedures and steps for manufacture; multiple target assembly techniques; multiple target disassembly devices; welding of targets; thermal, hydraulic, and mechanical modeling; robust target assembly parametric studies; and target qualification analysis for insertion into very high flux environment. The High Density Target project has tested and proven several technologies that will benefit current and future Mo-99 producers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abagnale, Carmelina, E-mail: c.abagnale@unina.it; Cardone, Massimo, E-mail: massimo.cardone@unina.it; Iodice, Paolo, E-mail: paolo.iodice@unina.it
2015-07-15
This paper describes the methodologies used to appraise the power requirements and environmental performance of an electrically assisted bicycle under real driving conditions, also covering regulations and technical and scientific aspects. For this purpose, an on-road test program of an electrically assisted bicycle was executed in the urban area of Naples on different test tracks, so that a general assessment of its driving behavior under several driving conditions was performed. The power requirements in different typical riding situations were estimated by a procedure based on the experimental kinematic parameters that characterize the driving dynamics collected during the real-life applications. An environmental analysis was also performed, with a methodology that takes into account the environmental assessment of a moped by measuring the experimental moped exhaust emissions of the regulated pollutants. In addition, starting from the results acquired during the different test samples, the effect of the electric traction offered by this pedelec on driving comfort was evaluated for different riding situations. - Highlights: • The power requirements of a pedelec in typical riding conditions were identified. • The estimated electricity consumption for battery recharging was defined. • An environmental valuation of the tested pedelec and of a moped was performed. • Emissions that could be saved utilizing a pedelec instead of a moped were derived.
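A minimal sketch of estimating instantaneous power demand from kinematic data using the standard road-load equation (rolling resistance + aerodynamic drag + grade + inertia). The rider-plus-bike mass, rolling and drag coefficients, and the toy speed trace are typical assumptions for illustration, not values measured in the study.

```python
import numpy as np

def required_power(v, a, grade=0.0, mass=95.0, crr=0.006, cda=0.5, rho=1.2, g=9.81):
    """Road-load power (W) at speed v (m/s) and acceleration a (m/s^2):
    rolling resistance + aerodynamic drag + grade (small-angle) + inertia."""
    force = mass * g * (crr + grade) + 0.5 * rho * cda * v ** 2 + mass * a
    return force * v

# Toy trace: accelerate from rest to 20 km/h in 8 s on flat ground, then cruise.
t = np.arange(0, 30, 0.5)
v = np.clip(t * (20 / 3.6) / 8, 0, 20 / 3.6)
a = np.gradient(v, t)
p = required_power(v, a)
print(f"peak power {p.max():.0f} W, cruising power {p[-1]:.0f} W")
```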
Li, Guoliang; Cui, Yanyan; You, Jinmao; Zhao, Xianen; Sun, Zhiwei; Xia, Lian; Suo, Yourui; Wang, Xiao
2011-04-01
Analysis of trace amino acids (AA) in physiological fluids has received increasing attention because these compounds provide fundamental and important information for medical, biological, and clinical research. A more accurate method for the determination of these compounds is therefore highly desirable and valuable. In the present study, we developed a selective and sensitive method for trace AA determination in biological samples using 2-[2-(7H-dibenzo[a,g]carbazol-7-yl)-ethoxy] ethyl chloroformate (DBCEC) as the labeling reagent, with HPLC-FLD-MS/MS. Response surface methodology (RSM) was first employed to optimize the derivatization reaction between DBCEC and AA. Compared with a traditional single-factor design, RSM reduced labor, time and reagent consumption. Complete derivatization can be achieved within 6.3 min at room temperature. In conjunction with a gradient elution, baseline resolution of 20 AA comprising acidic, neutral, and basic AA was achieved on a reversed-phase Hypersil BDS C(18) column. This method showed excellent reproducibility and correlation coefficients, and offered detection limits of 0.19-1.17 fmol/μL. The developed method was successfully applied to the determination of AA in human serum. The sensitivity and prognostic value of serum AA for liver diseases are also discussed.
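A hedged sketch of the response-surface idea used for the derivatization optimization: fit a full quadratic model to responses measured on a two-factor central-composite-style design and locate the stationary point. The factors, design points and yield values are invented for illustration and are not the experimental settings of the study.

```python
import numpy as np

# Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 to coded factor
# settings (e.g. reagent amount and reaction time) and find the optimum.
X_design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                     [-1.4, 0], [1.4, 0], [0, -1.4], [0, 1.4], [0, 0]])
y = np.array([61, 70, 66, 72, 60, 73, 64, 69, 78])    # derivatization yield (%), invented

x1, x2 = X_design[:, 0], X_design[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b = np.linalg.lstsq(A, y, rcond=None)[0]              # least-squares coefficients

# Stationary point: solve grad(y) = 0 for the fitted quadratic surface.
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
grad = np.array([b[1], b[2]])
x_opt = np.linalg.solve(H, -grad)
print("coded optimum:", np.round(x_opt, 2))
```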
Bjornsson, Christopher S; Lin, Gang; Al-Kofahi, Yousef; Narayanaswamy, Arunachalam; Smith, Karen L; Shain, William; Roysam, Badrinath
2009-01-01
Brain structural complexity has confounded prior efforts to extract quantitative image-based measurements. We present a systematic ‘divide and conquer’ methodology for analyzing three-dimensional (3D) multi-parameter images of brain tissue to delineate and classify key structures, and compute quantitative associations among them. To demonstrate the method, thick (~100 μm) slices of rat brain tissue were labeled using 3 – 5 fluorescent signals, and imaged using spectral confocal microscopy and unmixing algorithms. Automated 3D segmentation and tracing algorithms were used to delineate cell nuclei, vasculature, and cell processes. From these segmentations, a set of 23 intrinsic and 8 associative image-based measurements was computed for each cell. These features were used to classify astrocytes, microglia, neurons, and endothelial cells. Associations among cells and between cells and vasculature were computed and represented as graphical networks to enable further analysis. The automated results were validated using a graphical interface that permits investigator inspection and corrective editing of each cell in 3D. Nuclear counting accuracy was >89%, and cell classification accuracy ranged from 81–92% depending on cell type. We present a software system named FARSIGHT implementing our methodology. Its output is a detailed XML file containing measurements that may be used for diverse quantitative hypothesis-driven and exploratory studies of the central nervous system. PMID:18294697
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, S.K.; Cole, C.R.; Bond, F.W.
1979-12-01
The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document describes the FE3DGW (Finite Element, Three-Dimensional Groundwater) hydrologic model, a third-level (high-complexity), three-dimensional, finite element approach (Galerkin formulation) for saturated groundwater flow.
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The design process proceeded from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design using a fastening system is proposed. Furthermore, three materials, ABS, polycarbonate, and high-density PE, were considered to determine the environmental impact by category. A sustainability analysis was conducted using the software SolidWorks. As a result, high-density PE gives the lowest values in the environmental impact categories, with a large maximum stress value.
Compound-specific stable isotope analysis of nitrogen-containing intact polar lipids.
Svensson, Elisabeth; Schouten, Stefan; Stam, Axel; Middelburg, Jack J; Sinninghe Damsté, Jaap S
2015-12-15
Compound-specific isotope analysis (CSIA) of nitrogen in amino acids has proven a valuable tool in many fields (e.g. ecology). Several intact polar lipids (IPLs) also contain nitrogen, and their nitrogen isotope ratios have the potential to elucidate food-web interactions or metabolic pathways. Here we have developed novel methodology for the determination of δ(15)N values of nitrogen-containing headgroups of IPLs using gas chromatography coupled with isotope-ratio mass spectrometry. Intact polar lipids with nitrogen-containing headgroups were hydrolyzed and the resulting compounds were derivatized by (1) acetylation with pivaloyl chloride for compounds with amine and hydroxyl groups or (2) esterification using acidified 2-propanol followed by acetylation with pivaloyl chloride for compounds with both carboxyl and amine groups. The δ(15)N values of the derivatives were subsequently determined using gas chromatography/combustion/isotope-ratio mass spectrometry. Intact polar lipids with ethanolamine and amino acid headgroups, such as phosphatidylethanolamine and phosphatidylserine, were successfully released from the IPLs and derivatized. Using commercially available pure compounds it was established that δ(15)N values of ethanolamine and glycine were not statistically different from the offline-determined values. Application of the technique to microbial cultures and a microbial mat showed that the method works well for the release and derivatization of the headgroup of phosphatidylethanolamine, a common IPL in bacteria. A method to enable CSIA of nitrogen of selected IPLs has been developed. The method is suitable for measuring natural stable nitrogen isotope ratios in microbial lipids, in particular phosphatidylethanolamine, and will be especially useful for tracing the fate of nitrogen in deliberate tracer experiments. Copyright © 2015 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nathan, S.; Loftin, B.; Abramczyk, G.
The Small Gram Quantity (SGQ) concept is based on the understanding that small amounts of hazardous materials, in this case radioactive materials (RAM), are significantly less hazardous than large amounts of the same materials. This paper describes a methodology designed to estimate an SGQ for several neutron and gamma emitting isotopes that can be shipped in a package compliant with 10 CFR Part 71 external radiation level limits regulations. These regulations require packaging for the shipment of radioactive materials, under both normal and accident conditions, to perform the essential functions of material containment, subcriticality, and maintain external radiation levels within the specified limits. By placing the contents in a helium leak-tight containment vessel, and limiting the mass to ensure subcriticality, the first two essential functions are readily met. Some isotopes emit sufficiently strong photon radiation that small amounts of material can yield a large dose rate outside the package. Quantifying the dose rate for a proposed content is a challenging issue for the SGQ approach. It is essential to quantify external radiation levels from several common gamma and neutron sources that can be safely placed in a specific packaging, to ensure compliance with federal regulations. The Packaging Certification Program (PCP) Methodology for Determining Dose Rate for Small Gram Quantities in Shipping Packagings provides bounding shielding calculations that define mass limits compliant with 10 CFR 71.47 for a set of proposed SGQ isotopes. The approach is based on energy superposition with dose response calculated for a set of spectral groups for a baseline physical packaging configuration. The methodology includes using the MCNP radiation transport code to evaluate a family of neutron and photon spectral groups using the 9977 shipping package and its associated shielded containers as the base case. This results in a set of multipliers for 'dose per particle' for each spectral group. For a given isotope, the source spectrum is folded with the response for each group. The summed contribution from all isotopes determines the total dose from the RAM in the container.
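A minimal sketch of the energy-superposition (folding) step described above: the external dose rate per gram of an isotope is the sum over spectral groups of the emission rate into that group times the precomputed dose-per-particle multiplier, and a bounding mass follows from the regulatory limit. All group responses and emission rates below are invented for illustration; only the folding arithmetic is of interest.

```python
# Energy superposition: dose rate = sum over groups of
#   (particles/s emitted into group) x (dose per particle for that group).
# Numbers are illustrative placeholders, not values from the PCP methodology.
dose_per_particle = {            # mSv/h per (particle/s), from baseline transport runs
    "gamma 0.1-0.5 MeV": 2.0e-12,
    "gamma 0.5-1.5 MeV": 8.0e-12,
    "neutron":           5.0e-11,
}
spectrum = {                     # emissions per second per gram of the isotope
    "gamma 0.1-0.5 MeV": 4.0e8,
    "gamma 0.5-1.5 MeV": 1.5e8,
    "neutron":           2.0e4,
}

dose_rate_per_gram = sum(dose_per_particle[g] * spectrum[g] for g in spectrum)
limit = 2.0                      # mSv/h external radiation level limit used for the bound
max_grams = limit / dose_rate_per_gram
print(f"{dose_rate_per_gram:.2e} mSv/h per gram -> bounding mass ~ {max_grams:.0f} g")
```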
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
QESA: Quarantine Extraterrestrial Sample Analysis Methodology
NASA Astrophysics Data System (ADS)
Simionovici, A.; Lemelle, L.; Beck, P.; Fihman, F.; Tucoulou, R.; Kiryukhina, K.; Courtade, F.; Viso, M.
2018-04-01
Our nondestructive, nm-scale hyperspectral analysis methodology, combining X-ray, Raman, and IR probes in BSL4 quarantine, renders our patented mini-sample holder ideal for detecting extraterrestrial life. Our Stardust and Archean results validate it.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
Dara, Ajay; Sangamwar, Abhay T.
2014-01-01
Background: In a search for effective anticancer therapy, the R&D units of leading universities and institutes disclose numerous technologies in the form of patent documents. The article presents a comparative anticancer patent landscape and technology assessment of the Council of Scientific and Industrial Research (CSIR), India's largest R&D organisation, against the top twenty international publicly funded universities and institutes from eight different countries. Methodology/Principal Findings: The methodology includes quantitative and qualitative assessment based on bibliometric parameters and manual technology categorisation to understand changing patent trends and recent novel technologies. The analysis covered 25,254 patent documents from 1993 to 2013 and reported insights into the latest anticancer technologies and targets through categorisation studies at the level of drug discovery, development, and treatment and diagnosis. The article reports a technology correlation matrix of twelve secondary-class technologies against 34 tertiary sub-class research areas to identify the leading technologies and the scope of future research through whitespace analysis. In addition, the results also address target analysis, leading inventors, assignees, collaboration networks, geographical distribution, patent trend analysis, citation maps and technology assessment with respect to international patent classification systems such as CPC, IPC and CPI codes. Conclusions/Significance: The results suggest peptide technology as the dominant research area, followed by gene therapy, vaccines and medical preparations containing organic compounds. The Indian CSIR ranked seventh among the top 20 universities. Globally, anticancer research was focused in the areas of genetics and immunology, whereas the Indian CSIR reported more patents related to plant extracts and organic preparations. The article provides a glimpse of the two-decade anticancer scenario with respect to top public funded universities worldwide. PMID:25083710
Quantitative comparison of the in situ microbial communities in different biomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, D.C.; Ringelberg, D.B.; Palmer, R.J.
1995-12-31
A system to define microbial communities in different biomes requires the application of non-traditional methodology. Classical microbiological methods have severe limitations for the analysis of environmental samples. Pure-culture isolation, biochemical testing, and/or enumeration by direct microscopic counting are not well suited for the estimation of total biomass or the assessment of community composition within environmental samples. Such methods provide little insight into the in situ phenotypic activity of the extant microbiota since these techniques are dependent on microbial growth and thus select against many environmental microorganisms which are non-culturable under a wide range of conditions. It has been repeatedly documented in the literature that viable counts or direct counts of bacteria attached to sediment grains are difficult to quantify and may grossly underestimate the extent of the existing community. The traditional tests provide little indication of the in situ nutritional status or of evidence of toxicity within the microbial community. A more recent development, the MIDI Microbial Identification System, measures free and ester-linked fatty acids from isolated microorganisms. Bacterial isolates are identified by comparing their fatty acid profiles to the MIDI database, which contains over 8000 entries. The application of the MIDI system to the analysis of environmental samples, however, has significant drawbacks. The MIDI system was developed to identify clinical microorganisms and requires their isolation and culture on trypticase soy agar at 27 °C. Since many isolates are unable to grow under these restrictive growth conditions, the system does not lend itself to identification of some environmental organisms. A more applicable methodology for environmental microbial analysis is based on the liquid extraction and separation of microbial lipids from environmental samples, followed by quantitative analysis using gas chromatography/...
Opinion: Clarifying Two Controversies about Information Mapping's Method.
ERIC Educational Resources Information Center
Horn, Robert E.
1992-01-01
Describes Information Mapping, a methodology for the analysis, organization, sequencing, and presentation of information and explains three major parts of the method: (1) content analysis, (2) project life-cycle synthesis and integration of the content analysis, and (3) sequencing and formatting. Major criticisms of the methodology are addressed.…
van Dieten, H. E M; Bos, I.; van Tulder, M. W; Lems, W.; Dijkmans, B.; Boers, M.
2000-01-01
A systematic review on the cost effectiveness of prophylactic treatments of non-steroidal anti-inflammatory drug (NSAID) induced gastropathy in patients with osteoarthritis or rheumatoid arthritis was conducted. Two reviewers conducted the literature search and the review. Both full and partial economic evaluations published in English, Dutch, or German were included. The criteria list published in the textbook of Drummond was used to determine the quality of the economic evaluations. The methodological quality of three randomised controlled trials (RCTs) in which the economic evaluations obtained probability estimates of NSAID induced gastropathy and adverse events was assessed by a list of internal validity criteria. The conclusions were based on a rating system consisting of four levels of evidence. Ten economic evaluations were included; three were based on RCTs. All evaluations studied misoprostol as prophylactic treatment: in one evaluation misoprostol was studied as a fixed component in a combination with diclofenac (Arthrotec). All economic evaluations comprised analytical studies containing a decision tree. The three trials were of high methodological quality. Nine economic evaluations were considered high quality and one economic evaluation was considered of low methodological quality. There is strong evidence (level "A") that the use of misoprostol for the prevention of NSAID induced gastropathy is cost effective, and limited evidence (level "C") that the use of Arthrotec is cost effective. Although the levels of evidence used in this review are arbitrary, it is believed that a qualitative analysis is useful: quantitative analyses in this field are hampered by the heterogeneity of economic evaluations. Existing criteria to evaluate the methodological quality of economic evaluations may need refinement for use in systematic reviews. PMID:11005773
Transitioning Domain Analysis: An Industry Experience.
1996-06-01
...an academic and industry partnership took feature-oriented domain analysis (FODA) from a methodology that is still being defined to a well-documented... to pilot the use of the Software Engineering Institute (SEI) domain analysis methodology known as feature-oriented domain analysis (FODA). Supported...
NASA Astrophysics Data System (ADS)
Sasaki, Syota; Yamada, Tadashi; Yamada, Tomohito J.
2014-05-01
We aim to propose a kinematic-based methodology, similar to runoff analysis, for readily understandable radiological protection. A merit of this methodology is that it produces sufficiently accurate effective doses from basic analysis. The great earthquake struck the northeast of Japan on March 11, 2011. The electrical systems needed to control the Fukushima Daiichi nuclear power plant were completely destroyed by the tsunamis that followed. From the damaged reactor containment vessels, a quantity of radioactive isotopes leaked and was dispersed in the vicinity of the plant. Radiological internal exposure caused by ingestion of food containing radioactive isotopes has become an issue of great interest to the public, and has caused excessive anxiety because of a deficiency of fundamental knowledge concerning radioactivity. Concentrations of radioactivity in the human body and internal exposure have been studied extensively. Previous radiological studies, for example those by the International Commission on Radiological Protection (ICRP), employ large-scale computational simulations that include the actual mechanisms of metabolism in the human body. While computational simulation is the standard method for calculating exposure doses among radiology specialists, these methods, although exact, are too sophisticated for non-specialists to grasp the whole picture. In this study, the human body is treated as a vessel. The number of radioactive atoms in the human body can be described by an equation of continuity, which is the only governing equation. Half-life, the period of time required for the amount of a substance to decrease by half, is the only parameter needed to calculate the number of radioactive isotopes in the human body. Since half-life depends only on the nuclide, there are no arbitrary parameters. It is known that the number of radioactive isotopes decreases exponentially by radioactive decay (physical outflow). It is also known that radioactive isotopes decrease exponentially by excretion (biological outflow). The total outflow is the sum of the physical and biological outflows. As a result, the number of radioactive atoms in the human body also decreases exponentially. The half-life can be determined from the outflow flux by definition. The intensity of radioactivity is linear with respect to the number of radioactive atoms, so the two are analytically equivalent. Total internal exposure can be calculated as the time integral of the intensity of radioactivity. The energy absorbed by the human body per radioactive decay and the effective dose are calculated with the aid of Fermi's theory of beta decay and special relativity. The effective doses calculated by the present method agree closely with those of an ICRP study. The present method shows that the standard limit for radioactive cesium in general foods enforced in Japan, 100 Bq/kg, is overly conservative. If we eat food containing cesium-137 at 100 Bq/kg at a rate of 1 kg/d for 50 years, we receive an effective dose less than that from natural background exposure. Similarly, it is shown that no medically or statistically significant health damage would result from ingestion of rice harvested from a paddy field with current (January 2014) levels of deposited radioactive cesium.
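As a rough illustration of the 'vessel' balance described above, the sketch below combines the physical and biological outflows into an effective half-life and cross-checks the resulting annual dose against a published ingestion dose coefficient; the biological half-life and intake rate are assumed values, and the dose coefficient is only an approximate cross-check, not the paper's Fermi-theory calculation.

```python
import math

# Illustrative parameter values; not the authors' inputs.
T_phys_d = 30.1 * 365.25        # physical half-life of Cs-137, days
T_bio_d  = 70.0                 # assumed adult biological half-life, days
lam_phys = math.log(2) / T_phys_d
lam_bio  = math.log(2) / T_bio_d
lam_eff  = lam_phys + lam_bio   # total outflow = physical + biological outflow
T_eff_d  = math.log(2) / lam_eff

intake_Bq_per_day = 100.0       # 1 kg/day of food at 100 Bq/kg
# dN/dt = intake - lam_eff * N  ->  equilibrium body burden:
N_eq_Bq = intake_Bq_per_day / lam_eff

# Rough annual effective dose using a published ingestion dose coefficient
# (about 1.3e-8 Sv per Bq ingested for Cs-137, adults) as a cross-check.
dose_Sv_per_year = intake_Bq_per_day * 365.25 * 1.3e-8
print(f"effective half-life ~ {T_eff_d:.1f} d, equilibrium burden ~ {N_eq_Bq:.0f} Bq")
print(f"annual effective dose ~ {dose_Sv_per_year*1000:.2f} mSv (vs ~2 mSv natural background)")
```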
Metal ions in neurology and psychiatry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gabay, S.; Harris, J.; Ho, B.T.
1985-01-01
This book consists of five sections, each containing several papers. The section titles are: CNS Development and Aging, Clinically Related Aspects of Trace Elements, Neurochemical Aspects, Neurotoxicity and Neuropathology, and Methodology and Application.
Multi-frequency data analysis in AFM by wavelet transform
NASA Astrophysics Data System (ADS)
Pukhova, V.; Ferrini, G.
2017-10-01
Interacting cantilevers in AFM experiments generate non-stationary, multi-frequency signals consisting of numerous excited flexural and torsional modes and their harmonics. The analysis of such signals is challenging, requiring special methodological approaches and a powerful mathematical apparatus. The most common approach to signal analysis is Fourier transform (FT) analysis. However, FT gives accurate spectra only for stationary signals; for signals whose spectral content changes over time, it provides only an averaged spectrum. Hence, for non-stationary and rapidly varying signals, such as those from interacting cantilevers, a method that shows the spectral evolution in time is needed. One of the most powerful techniques, allowing a detailed time-frequency representation of signals, is the wavelet transform. It is a method of analysis that represents the energy associated with the signal at a particular frequency and time, providing correlation between the spectral and temporal features of the signal, unlike FT. This is particularly important in AFM experiments because signal nonlinearities contain valuable information about tip-sample interactions and consequently surface properties. The present work aims to show the advantages of the wavelet transform in comparison with FT, using force curve analysis in dynamic force spectroscopy as an example.
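A minimal sketch of the time-frequency idea, assuming the PyWavelets package and a synthetic two-tone signal rather than a real cantilever deflection trace:

```python
import numpy as np
import pywt  # PyWavelets, assumed available

fs = 1.0e5                        # sampling rate, Hz (assumed)
t = np.arange(0, 0.02, 1 / fs)
# Spectral content changes halfway through the record:
sig = np.where(t < 0.01, np.sin(2*np.pi*5e3*t), np.sin(2*np.pi*1.2e4*t))

scales = np.arange(1, 64)
coefs, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1/fs)
# |coefs[i, j]| is the energy-related magnitude at frequency freqs[i] and time t[j];
# unlike a plain FFT of the whole record, the change at t = 0.01 s stays visible.
print(coefs.shape, freqs.min(), freqs.max())
```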
Extracting archaeal populations from iron oxidizing systems
NASA Astrophysics Data System (ADS)
Whitmore, L. M.; Hutchison, J.; Chrisler, W.; Jay, Z.; Moran, J.; Inskeep, W.; Kreuzer, H.
2013-12-01
Unique environments in Yellowstone National Park offer exceptional conditions for studying microorganisms in extreme and constrained systems. However, samples from some extreme systems often contain inorganic components that pose complications during microbial and molecular analysis. Several archaeal species are found in acidic, geothermal ferric-oxyhydroxide mats; these species have been shown to adhere to mineral surfaces in flocculated colonies. For optimal microbial analysis (microscopy, flow cytometry, genomic extractions, proteomic analysis, stable isotope analysis, and others), improved techniques are needed to better facilitate cell detachment and separation from mineral surfaces. These techniques must preserve cell structure while simultaneously minimizing organic carryover into downstream analyses. Several methods have been developed for removing sediments from mixed prokaryotic populations, including ultracentrifugation, Nycodenz gradients, sucrose cushions, and cell straining. In this study we conduct a comparative analysis of mechanisms used to detach archaeal cell populations from the mineral interface. Specifically, we evaluated mechanical and chemical approaches for cell separation and homogenization. Methods were compared using confocal microscopy, flow cytometry analyses, and real-time PCR detection. The methodology and approaches identified will be used to optimize biomass collection from environmental specimens or isolates grown with solid phases.
Alabbadi, Ibrahim; Crealey, Grainne; Scott, Michael; Baird, Simon; Trouton, Tom; Mairs, Jill; McElnay, James
2006-01-01
System of Objectified Judgement Analysis (SOJA) is a structured approach to the selection of drugs for formulary inclusion. However, while SOJA is a very important advance in drug selection for formulary purposes, it is hospital based and can only be applied to one indication at a time. In SOJA, cost has been given a primary role in the selection process as it has been included as a selection criterion from the start. Cost may therefore drive the selection of a particular drug product at the expense of other basic criteria such as safety or efficacy. The aims of this study were to use a modified SOJA approach in the selection of ACE inhibitors (ACEIs) for use in a joint formulary that bridges primary and secondary care within a health board in Northern Ireland, and to investigate the potential impact of the joint formulary on prescribing costs of ACEIs in that health board. The modified SOJA approach involved four phases in sequence: an evidence-based pharmacotherapeutic evaluation of all available ACEI drug entities, a separate safety/risk assessment analysis of products containing agents that exceeded the pharmacotherapeutic threshold, a budget-impact analysis and, finally, the selection of product lines. A comprehensive literature review and expert panel judgement informed the selection of criteria (and their relative weighting) for the pharmacotherapeutic evaluation. The resultant criteria/scoring system was circulated (in questionnaire format) to prescribers and stakeholders for comment. Based on statistical analysis of the latter survey results, the final scoring system was developed. Drug entities that exceeded the evidence threshold were sequentially entered into the second and third phases of the process. Five drug entities (of 11 currently available in the UK) exceeded the evidence threshold, and 22 of 26 submitted product lines containing these drug entities satisfied the safety/risk assessment criteria. Three product lines, each containing a different drug entity, were selected for formulary inclusion after the budget impact analysis was performed. The estimated potential annual cost savings for ACEIs (based on estimated annual usage in defined daily doses) for this particular health board were 42%. The modified SOJA approach has a significant contribution to make in containing the costs of ACEIs. Applying modified SOJA as a practical method for all indications will allow the development of a unified formulary that bridges secondary and primary care.
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
Q-Sample Construction: A Critical Step for a Q-Methodological Study.
Paige, Jane B; Morin, Karen H
2016-01-01
Q-sample construction is a critical step in Q-methodological studies. Prior to conducting Q-studies, researchers start with a population of opinion statements (the concourse) on a particular topic of interest, from which a sample is drawn. These sampled statements are known as the Q-sample. Although literature exists on methodological processes for conducting Q-methodological studies, limited guidance exists on the practical steps to reduce the population of statements to a Q-sample. A case exemplar illustrates the steps to construct a Q-sample in preparation for a study that explored the perspectives nurse educators and nursing students hold about simulation design. Experts in simulation and Q-methodology evaluated the Q-sample for readability, clarity, and representativeness of the opinions contained within the concourse. The Q-sample was piloted, and feedback resulted in statement refinement. Researchers, especially those undertaking Q-method studies for the first time, may benefit from the practical considerations for constructing a Q-sample offered in this article. © The Author(s) 2014.
2008-07-23
This final rule applies to the Temporary Assistance for Needy Families (TANF) program and requires States, the District of Columbia and the Territories (hereinafter referred to as the "States") to use the "benefiting program" cost allocation methodology in U.S. Office of Management and Budget (OMB) Circular A-87 (2 CFR part 225). It is the judgment and determination of HHS/ACF that the "benefiting program" cost allocation methodology is the appropriate methodology for the proper use of Federal TANF funds. The Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996 gave federally-recognized Tribes the opportunity to operate their own Tribal TANF programs. Federally-recognized Indian tribes operating approved Tribal TANF programs have always followed the "benefiting program" cost allocation methodology in accordance with OMB Circular A-87 (2 CFR part 225) and the applicable regulatory provisions at 45 CFR 286.45(c) and (d). This final rule contains no substantive changes to the proposed rule published on September 27, 2006.
Methodologies for Reservoir Characterization Using Fluid Inclusion Gas Chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dilley, Lorie M.
2015-04-13
The purpose of this project was to: 1) evaluate the relationship between geothermal fluid processes and the compositions of the fluid inclusion gases trapped in the reservoir rocks; and 2) develop methodologies for interpreting fluid inclusion gas data in terms of the chemical, thermal and hydrological properties of geothermal reservoirs. Phase 1 of this project was designed to: 1) model the effects of boiling, condensation, conductive cooling and mixing on selected gaseous species, using fluid compositions obtained from geothermal wells; 2) evaluate, using quantitative analyses provided by New Mexico Tech (NMT), how these processes are recorded by fluid inclusions trapped in individual crystals; and 3) determine if the results obtained on individual crystals can be applied to the bulk fluid inclusion analyses determined by Fluid Inclusion Technology (FIT). Our initial studies, however, suggested that numerical modeling of the data would be premature. We observed that the gas compositions, determined on bulk and individual samples, were not the same as those discharged by the geothermal wells. Gases discharged from geothermal wells are CO2-rich and contain low concentrations of light gases (i.e. H2, He, N, Ar, CH4). In contrast, many of our samples displayed enrichments in these light gases. Efforts were initiated to evaluate the reasons for the observed gas distributions. As a first step, we examined the potential importance of different reservoir processes using a variety of commonly employed gas ratios (e.g. Giggenbach plots). The second technical target was the development of interpretational methodologies. We have developed methodologies for the interpretation of fluid inclusion gas data based on the results of Phase 1, geologic interpretation of fluid inclusion data, and integration of the data. These methodologies can be used in conjunction with the relevant geological and hydrological information to create fluid models of the system. The hope is that the methodologies developed will allow bulk fluid inclusion gas analysis to be a useful tool for estimating relative temperatures, identifying the sources and origins of the geothermal fluids, and developing conceptual models that can be used to help target areas of enhanced permeability.
NASA Astrophysics Data System (ADS)
Mallepudi, Sri Abhishikth; Calix, Ricardo A.; Knapp, Gerald M.
2011-02-01
In recent years there has been a rapid increase in the size of video and image databases. Effective searching and retrieving of images from these databases is a significant current research area. In particular, there is a growing interest in query capabilities based on semantic image features such as objects, locations, and materials, known as content-based image retrieval. This study investigated mechanisms for identifying materials present in an image. These capabilities provide additional information impacting conditional probabilities about images (e.g. objects made of steel are more likely to be buildings). These capabilities are useful in Building Information Modeling (BIM) and in automatic enrichment of images. I2T methodologies are a way to enrich an image by generating text descriptions based on image analysis. In this work, a learning model is trained to detect certain materials in images. To train the model, an image dataset was constructed containing single material images of bricks, cloth, grass, sand, stones, and wood. For generalization purposes, an additional set of 50 images containing multiple materials (some not used in training) was constructed. Two different supervised learning classification models were investigated: a single multi-class SVM classifier, and multiple binary SVM classifiers (one per material). Image features included Gabor filter parameters for texture, and color histogram data for RGB components. All classification accuracy scores using the SVM-based method were above 85%. The second model helped in gathering more information from the images since it assigned multiple classes to the images. A framework for the I2T methodology is presented.
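The per-material classifier idea can be sketched roughly as follows; the specific Gabor frequencies, histogram bins, and SVM kernel are assumptions for illustration, not the exact features or parameters used in the study.

```python
import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def features(img_rgb):
    """Texture (Gabor) + color (RGB histogram) features for one uint8 RGB image."""
    gray = img_rgb.mean(axis=2)
    feats = []
    for freq in (0.1, 0.2, 0.4):              # a few Gabor frequencies for texture
        real, _ = gabor(gray, frequency=freq)
        feats += [real.mean(), real.var()]
    for ch in range(3):                       # per-channel color histograms
        hist, _ = np.histogram(img_rgb[..., ch], bins=8, range=(0, 255))
        feats += list(hist / hist.sum())
    return np.array(feats)

def train_binary_detectors(images, labels, materials):
    """One binary SVM per material (e.g. 'brick', 'wood'), as in the multi-label variant."""
    X = np.stack([features(im) for im in images])
    return {m: SVC(kernel="rbf").fit(X, [m in lab for lab in labels]) for m in materials}
```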
Current evidence of percutaneous nucleoplasty for the cervical herniated disk: a systematic review.
Wullems, Jorgen A; Halim, Willy; van der Weegen, Walter
2014-07-01
Although percutaneous cervical nucleoplasty (PCN) has been shown to be both safe and effective, its application is still debated. PCN applied in disk herniation has not been systematically reviewed before, resulting in a limited insight into its effectiveness and safety, and the quality of available evidence. Therefore, we systematically reviewed the evidence on the efficacy and safety of PCN in patients with a (contained) herniated disk. MEDLINE, EMBASE, and the Cochrane Library (Central Register of Controlled Trials) were searched for randomized controlled trials (RCTs) and nonrandomized studies using the following keywords: "Nucleoplasty," "Cervical," "Hernia," "Herniation," "Prolapse," "Protrusion," "Intervertebral disk," and "Percutaneous disk decompression." First, all articles were appraised for methodological quality, and then, RCTs were graded for the level of evidence according a best-evidence synthesis, because a meta-analysis was not possible. Finally, the RCTs' applicability and clinical relevance also was assessed. Of 75 identified abstracts, 10 full-text articles were included (3 RCTs and 7 nonrandomized studies). These studies represented a total of 1021 patients: 823 patients (≥ 892 disks) were treated by PCN. All studies showed low methodological quality, except for two. The level of evidence of the RCTs was graded as moderate, with low to moderate applicability and clinical relevance. All included studies showed PCN to be an effective and safe procedure in the treatment of (contained) herniated disks at short-, mid-, and long-term follow-up. However, the level of evidence is moderate and shows only low to moderate applicability and clinical relevance. © 2013 World Institute of Pain.
Analysis of on-line clinical laboratory manuals and practical recommendations.
Beckwith, Bruce; Schwartz, Robert; Pantanowitz, Liron
2004-04-01
On-line clinical laboratory manuals are a valuable resource for medical professionals. To our knowledge, no recommendations currently exist for their content or design. To analyze publicly accessible on-line clinical laboratory manuals and to propose guidelines for their content. We conducted an Internet search for clinical laboratory manuals written in English with individual test listings. Four individual test listings in each manual were evaluated for 16 data elements, including sample requirements, test methodology, units of measure, reference range, and critical values. Web sites were also evaluated for supplementary information and search functions. We identified 48 on-line laboratory manuals, including 24 academic or community hospital laboratories and 24 commercial or reference laboratories. All manuals had search engines and/or test indices. No single manual contained all 16 data elements evaluated. An average of 8.9 (56%) elements were present (range, 4-14). Basic sample requirements (specimen and volume needed) were the elements most commonly present (98% of manuals). The frequency of the remaining data elements varied from 10% to 90%. On-line clinical laboratory manuals originate from both hospital and commercial laboratories. While most manuals were user-friendly and contained adequate specimen-collection information, other important elements, such as reference ranges, were frequently absent. To ensure that clinical laboratory manuals are of maximal utility, we propose the following 13 data elements be included in individual test listings: test name, synonyms, test description, test methodology, sample requirements, volume requirements, collection guidelines, transport guidelines, units of measure, reference range, critical values, test availability, and date of latest revision.
Rat sperm motility analysis: methodologic considerations
The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...
An economic analysis methodology for project evaluation and programming.
DOT National Transportation Integrated Search
2013-08-01
Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key : quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
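A toy sketch of probabilistic branching over an ordered set of events, in the spirit of OBEST's likelihood-weighted scenarios; the event names and probabilities are invented and are not taken from the runway-incursion model.

```python
import random

def one_scenario(rng):
    """Walk one branching path; later events only matter if earlier ones occur."""
    path = []
    for event, p_occur in [("clearance_misheard", 0.02),
                           ("readback_not_caught", 0.30),
                           ("crossing_conflict", 0.10)]:
        occurred = rng.random() < p_occur
        path.append((event, occurred))
        if not occurred:
            break
    return tuple(path)

rng = random.Random(0)
counts = {}
N = 100_000
for _ in range(N):
    s = one_scenario(rng)
    counts[s] = counts.get(s, 0) + 1
# Each distinct path is a scenario with an estimated likelihood:
for s, c in sorted(counts.items(), key=lambda kv: -kv[1])[:3]:
    print(c / N, s)
```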
Bidwell, L Cinnamon; Mueller, Raeghan; YorkWilliams, Sophie L; Hagerty, Sarah; Bryan, Angela D; Hutchison, Kent E
2018-01-01
Background: The development of novel cannabis research methods that are compatible with current federal regulations is imperative to conduct studies of the effects of legal market cannabis. There is very little research on higher-strength, higher Δ9-tetrahydrocannabinol (THC) cannabis, which has become increasingly available since legalization. Research on strains containing cannabidiol (CBD), a second primary, but nonpsychotomimetic, cannabinoid, is very limited. Materials and Methods: Using a novel observational methodology, regular cannabis users were asked to use one of two legal market cannabis strains that they purchased from a local dispensary: one strain containing 8% THC and 16% CBD (THC+CBD) and one containing 17% THC but no CBD (THC). After using their suggested cannabis strain as they typically would for a 3-day period, participants returned to the laboratory immediately after their final use. Measures included a blood draw to measure cannabinoid blood levels and circulating cytokines, self-reported subjective drug effects, and verbal recall memory. Results: Analysis of CBD/THC concentration levels in the blood following the 3-day strain manipulation suggests that all but one participant (n = 23/24) followed instructions and used their assigned strain. Individuals in the THC group (n = 11) smoked no more than their usual amount, and participants who used the THC+CBD strain (n = 12) smoked more than their reported usual amount but did not have significantly different THC+metabolite blood levels from the THC group. The THC+CBD strain was also associated with less desire to smoke, lower levels of subjective drug effects, and lower levels of circulating cytokines (TNF-α, IL-6, and IL-1β) immediately after use. Conclusions: Initial results support the feasibility of this novel observational methodology involving brief manipulation of strain use. Preliminary findings indicate that participants may self-titrate cannabis use based on cannabinoid concentration and that the THC+CBD strain was associated with lower levels of cannabis craving, subjective intoxication, and circulating cytokines.
Lewis, Cara C; Scott, Kelli; Marriott, Brigid R
2018-05-16
Tailored implementation approaches are touted as more likely to support the integration of evidence-based practices. However, to our knowledge, few methodologies for tailoring implementations exist. This manuscript will apply a model-driven, mixed methods approach to a needs assessment to identify the determinants of practice, and pilot a modified conjoint analysis method to generate an implementation blueprint using a case example of a cognitive behavioral therapy (CBT) implementation in a youth residential center. Our proposed methodology contains five steps to address two goals: (1) identify the determinants of practice and (2) select and match implementation strategies to address the identified determinants (focusing on barriers). Participants in the case example included mental health therapists and operations staff in two programs of Wolverine Human Services. For step 1, the needs assessment, they completed surveys (clinician N = 10; operations staff N = 58; other N = 7) and participated in focus groups (clinician N = 15; operations staff N = 38) guided by the domains of the Framework for Diffusion [1]. For step 2, the research team conducted mixed methods analyses following the QUAN + QUAL structure for the purpose of convergence and expansion in a connecting process, revealing 76 unique barriers. Step 3 consisted of a modified conjoint analysis. For step 3a, agency administrators prioritized the identified barriers according to feasibility and importance. For step 3b, strategies were selected from a published compilation and rated for feasibility and likelihood of impacting CBT fidelity. For step 4, sociometric surveys informed implementation team member selection and a meeting was held to identify officers and clarify goals and responsibilities. For step 5, blueprints for each of pre-implementation, implementation, and sustainment phases were generated. Forty-five unique strategies were prioritized across the 5 years and three phases representing all nine categories. Our novel methodology offers a relatively low burden collaborative approach to generating a plan for implementation that leverages advances in implementation science including measurement, models, strategy compilations, and methods from other fields.
Methodologies for launcher-payload coupled dynamic analysis
NASA Astrophysics Data System (ADS)
Fransen, S. H. J. A.
2012-06-01
An important step in the design and verification process of spacecraft structures is the coupled dynamic analysis with the launch vehicle in the low-frequency domain, also referred to as coupled loads analysis (CLA). The objective of such analyses is the computation of the dynamic environment of the spacecraft (payload) in terms of interface accelerations, interface forces, center of gravity (CoG) accelerations as well as the internal state of stress. In order to perform an efficient, fast and accurate launcher-payload coupled dynamic analysis, various methodologies have been applied and developed. The methods are related to substructuring techniques, data recovery techniques, the effects of prestress and fluids and time integration problems. The aim of this paper was to give an overview of these methodologies and to show why, how and where these techniques can be used in the process of launcher-payload coupled dynamic analysis. In addition, it will be shown how these methodologies fit together in a library of procedures which can be used with the MSC.Nastran™ solution sequences.
Automatic tree parameter extraction by a Mobile LiDAR System in an urban context.
Herrero-Huerta, Mónica; Lindenbergh, Roderik; Rodríguez-Gonzálvez, Pablo
2018-01-01
In an urban context, tree data are used in city planning, in locating hazardous trees and in environmental monitoring. This study focuses on developing an innovative methodology to automatically estimate the most relevant individual structural parameters of urban trees sampled by a Mobile LiDAR System at city level. These parameters include the Diameter at Breast Height (DBH), which was estimated by circle fitting of the points belonging to different height bins using RANSAC. In the case of non-circular trees, DBH is calculated by the maximum distance between extreme points. Tree sizes were extracted through a connectivity analysis. Crown Base Height, defined as the length until the bottom of the live crown, was calculated by voxelization techniques. For estimating Canopy Volume, procedures of mesh generation and α-shape methods were implemented. Also, tree location coordinates were obtained by means of Principal Component Analysis. The workflow has been validated on 29 trees of different species sampling a stretch of road 750 m long in Delft (The Netherlands) and tested on a larger dataset containing 58 individual trees. The validation was done against field measurements. DBH parameter had a correlation R2 value of 0.92 for the height bin of 20 cm which provided the best results. Moreover, the influence of the number of points used for DBH estimation, considering different height bins, was investigated. The assessment of the other inventory parameters yield correlation coefficients higher than 0.91. The quality of the results confirms the feasibility of the proposed methodology, providing scalability to a comprehensive analysis of urban trees.
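For illustration, a minimal RANSAC circle fit over the (x, y) points of one height bin might look like the sketch below; the tolerance and iteration count are assumed values, and this is not the authors' implementation.

```python
import numpy as np

def circle_from_3pts(p1, p2, p3):
    """Circle (cx, cy, r) through three non-collinear points."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2 * (ax*(by-cy) + bx*(cy-ay) + cx*(ay-by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2+ay**2)*(by-cy) + (bx**2+by**2)*(cy-ay) + (cx**2+cy**2)*(ay-by)) / d
    uy = ((ax**2+ay**2)*(cx-bx) + (bx**2+by**2)*(ax-cx) + (cx**2+cy**2)*(bx-ax)) / d
    return ux, uy, np.hypot(ax-ux, ay-uy)

def ransac_dbh(points, n_iter=500, tol=0.02, seed=0):
    """DBH (fitted diameter, metres) from an (N, 2) array of stem points in one height bin."""
    rng = np.random.default_rng(seed)
    best, best_inliers = None, 0
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        fit = circle_from_3pts(*sample)
        if fit is None:
            continue
        cx, cy, r = fit
        inliers = np.sum(np.abs(np.hypot(points[:, 0]-cx, points[:, 1]-cy) - r) < tol)
        if inliers > best_inliers:
            best, best_inliers = fit, inliers
    return 2 * best[2] if best else None
```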
Developing Army Leaders through Increased Rigor in Professional Military Training and Education
2017-06-09
Research Methodology: An applied, exploratory, qualitative research methodology via a structured and focused case study comparison was... Finally, it will discuss how the methodology will be conducted to make... development models; it serves as the base data for case study comparison. Research Methodology and Data Analysis: A qualitative research...
García-Pérez, M A
2001-11-01
This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.
NASA Astrophysics Data System (ADS)
Wray, Richard B.
1991-12-01
A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.
Methodology for object-oriented real-time systems analysis and design: Software engineering
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1991-01-01
Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis and design through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
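A hedged sketch of what a 'real-time systems-analysis object' could look like as a concurrent entity with states and state-transition rules; the class and the valve example are illustrative inventions, not part of the methodology described above.

```python
class AnalysisObject:
    """An entity whose time-behavior is a set of states plus state-transition rules."""
    def __init__(self, name, initial, transitions):
        # transitions: {(state, event): next_state}
        self.name, self.state, self.transitions = name, initial, transitions

    def handle(self, event):
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

valve = AnalysisObject("valve",
                       initial="closed",
                       transitions={("closed", "open_cmd"): "opening",
                                    ("opening", "limit_switch"): "open",
                                    ("open", "close_cmd"): "closed"})
for ev in ["open_cmd", "limit_switch", "close_cmd"]:
    print(ev, "->", valve.handle(ev))
```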
78 FR 76657 - Notice of Action
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-18
... and PPI Handbook of Methods. DATES: The transition to the FD-ID system will occur with the release of... regular PPI release. That Web page also contains detailed methodological information for the FD-ID...
Aziz, Alfred
2009-01-01
The glycemic index (GI) is an experimental system that classifies carbohydrates (CHO) and CHO-containing foods according to their blood glucose-raising potential. It is based on the glycemic response following the ingestion of a test food containing a defined amount of available CHO relative to that of an equi-carbohydrate portion of either white bread or glucose. The concept has been extended to mixed meals and whole diets, where the GI of the meal/diet is expressed as the weighted average of the GI of each food, based on the percentage of the total meal/diet CHO provided by each food. Over the last few decades, a substantial number of epidemiological and interventional studies have reported beneficial associations/effects of lower GI diets across a wide spectrum of pathophysiological conditions, including diabetes, cardiovascular disease, obesity, and certain forms of cancer. This has prompted proponents of the GI to recommend its use for dietary planning and labeling purposes. However, the currently recommended GI methodology is not well standardized and has several flaws, which brings into question the strength of evidence attributed to the health effects of low-GI diets. This review focuses exclusively on the methodological aspects of the GI, how they might impact the interpretation of data related to the purported health benefits of low-GI diets, and the considerations for the use of the GI in food labeling. In addition, alternative systems for classifying the glycemic effects of CHO-containing foods are briefly discussed.
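The weighted-average calculation for a mixed meal can be shown with a short worked example; the foods and GI values below are illustrative rather than reference data.

```python
# Meal GI = sum over foods of (food GI * food's share of total available CHO).
items = [
    # (food, available CHO in grams, GI relative to glucose) -- illustrative values
    ("white rice", 50.0, 73),
    ("lentils",    20.0, 32),
    ("apple",      15.0, 36),
]
total_cho = sum(cho for _, cho, _ in items)
meal_gi = sum(gi * cho / total_cho for _, cho, gi in items)
print(f"meal GI ~ {meal_gi:.0f}")   # each food weighted by its share of total CHO
```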
Sasaki, Joni Y; Kim, Heejung S
2011-08-01
Religion helps people maintain a sense of control, particularly secondary control (acceptance of and adjustment to difficult situations), and contributes to strengthening social relationships in a religious community. However, little is known about how culture may influence these effects. The current research examined the interaction of culture and religion on secondary control and social affiliation, comparing people from individualistic cultures (e.g., European Americans), who tend to be more motivated toward personal agency, and people from collectivistic cultures (e.g., East Asians), who tend to be more motivated to maintain social relationships. In Study 1, an analysis of online church mission statements showed that U.S. websites contained more themes of secondary control than did Korean websites, whereas Korean websites contained more themes of social affiliation than did U.S. websites. Study 2 showed that experimental priming of religion led to acts of secondary control for European Americans but not Asian Americans. Using daily diary methodology, Study 3 showed that religious coping predicted more secondary control for European Americans but not Koreans, and religious coping predicted more social affiliation for both Koreans and European Americans. These findings suggest the importance of understanding sociocultural moderators of the effects of religion.
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distributions. Thus, this approach makes the methodology particularly useful in cases where the available data for quantification of hazardous events probabilities are scarce or nonexistent, there is dependence among events, or when nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
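As a toy illustration of chaining conditional probabilities through a small discrete network (the hybrid, continuous-variable aspects of the methodology are not shown), the sketch below uses invented events and probabilities unrelated to the FSRU study.

```python
# Tiny discrete fragment: corrosion -> leak -> fire, plus a diagnostic query.
p_corrosion = 0.10
p_leak_given_corrosion = {True: 0.05, False: 0.001}
p_ignition_given_leak = 0.02

# Marginalize corrosion out to get P(leak):
p_leak = sum(p_leak_given_corrosion[c] * p
             for c, p in [(True, p_corrosion), (False, 1 - p_corrosion)])
p_fire = p_leak * p_ignition_given_leak              # chain rule over the network

# Diagnostic (Bayes) query: probability corrosion was present given a leak occurred.
p_corrosion_given_leak = p_leak_given_corrosion[True] * p_corrosion / p_leak
print(f"P(leak)={p_leak:.4f}  P(fire)={p_fire:.2e}  P(corrosion|leak)={p_corrosion_given_leak:.2f}")
```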
Lean methodology: supporting battlefield medical fitness by cutting process waste.
Huggins, Elaine J
2010-01-01
Healthcare has long looked at decreasing risk in communication and patient care processes. Increasing simplicity in communication and patient care processes is a newer concept contained in Lean methodology. Lean is a strategy for achieving improvement in performance through the elimination of steps that use resources without contributing to customer value. This is known as cutting waste, or non-value-added steps. This article outlines how the use of Lean improved a key process that supports battlefield medical fitness.
U.S. Heat Demand by Sector for Potential Application of Direct Use Geothermal
Katherine Young
2016-06-23
This dataset includes heat demand for potential application of direct use geothermal broken down into 4 sectors: agricultural, commercial, manufacturing and residential. The data for each sector are organized by county, were disaggregated specifically to assess the market demand for geothermal direct use, and were derived using methodologies customized for each sector based on the availability of data and other sector-specific factors. This dataset also includes a paper containing a full explanation of the methodologies used.
ERIC Educational Resources Information Center
Burstein, Leigh
Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…
A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children
ERIC Educational Resources Information Center
Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.
2012-01-01
Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…
Highway User Benefit Analysis System Research Project #128
DOT National Transportation Integrated Search
2000-10-01
In this research, a methodology for estimating road user costs of various competing alternatives was developed. Also, software was developed to calculate the road user cost, perform economic analysis and update cost tables. The methodology is based o...
Crash Simulation and Animation: 'A New Approach for Traffic Safety Analysis'
DOT National Transportation Integrated Search
2001-02-01
This research's objective is to present a methodology to supplement the conventional traffic safety analysis techniques. This methodology aims at using computer simulation to animate and visualize crash occurrence at high-risk locations. This methodol...
Asbestos Utilization Costs on the Example of Functioning Landfill of Hazardous Waste
NASA Astrophysics Data System (ADS)
Polek, Daria
2017-12-01
Asbestos is a commercial term for a group of fibrous minerals found in nature. Products containing asbestos fibres are, in accordance with national and EU legislation, prohibited from production and required to be removed. In Poland, the asbestos removal process started with the adaptation of EU law through the Council of Ministers' national asbestos abatement programme for the years 2009-2032. The purpose of the dissertation was to analyse the costs associated with asbestos disposal: the costs of collection, transport and disposal of the waste. The methodology consisted of obtaining information on the raw materials needed to produce asbestos sheets. The analysis allowed us to determine the asbestos removal cost and to include state subsidies in the calculations.
Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong
2013-01-01
Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence-detected analytical ultracentrifugation is one approach that can offer significant insight into aggregate formation and kinetics. While this technique has traditionally been used with purified proteins, substantial information can now be collected from studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and for setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.
Modification and Validation of Conceptual Design Aerodynamic Prediction Method HASC95 With VTXCHN
NASA Technical Reports Server (NTRS)
Albright, Alan E.; Dixon, Charles J.; Hegedus, Martin C.
1996-01-01
A conceptual/preliminary design level subsonic aerodynamic prediction code HASC (High Angle of Attack Stability and Control) has been improved in several areas, validated, and documented. The improved code includes improved methodologies for increased accuracy and robustness, and simplified input/output files. An engineering method called VTXCHN (Vortex Chine) for predicting nose vortex shedding from circular and non-circular forebodies with sharp chine edges has been improved and integrated into the HASC code. This report contains a summary of modifications, a description of the code, a user's guide, and validation of HASC. Appendices include discussion of a new HASC utility code, listings of sample input and output files, and a discussion of the application of HASC to buffet analysis.
Characterization of nutraceuticals and functional foods by innovative HPLC methods.
Corradini, Claudio; Galanti, Roberta; Nicoletti, Isabella
2002-04-01
In recent years there has been growing interest in foods and food ingredients that may provide health benefits. Foods, as well as food ingredients, containing health-preserving components are not considered conventional foods but can be defined as functional foods. To characterise such foods, as well as nutraceuticals, specific, highly sensitive and reproducible analytical methodologies are needed. In light of this, we set out to develop innovative HPLC methods employing reversed-phase narrow-bore columns and high-performance anion-exchange chromatographic methods coupled with pulsed amperometric detection (HPAEC-PAD), which are specific for carbohydrate analysis. The developed methods were applied to the separation and quantification of citrus flavonoids and to characterize fructooligosaccharides (FOS) and fructans added to functional foods and nutraceuticals.
MTF evaluation of white pixel sensors
NASA Astrophysics Data System (ADS)
Lindner, Albrecht; Atanassov, Kalin; Luo, Jiafu; Goma, Sergio
2015-01-01
We present a methodology to compare image sensors with traditional Bayer RGB layouts to sensors with alternative layouts containing white pixels. We focused on the sensors' resolving powers, which we measured in the form of a modulation transfer function for variations in both luma and chroma channels. We present the design of the test chart, the acquisition of images, the image analysis, and an interpretation of the results. We demonstrate the approach using the example of two sensors that differ only in their color filter arrays. We confirmed that the sensor with white pixels and the corresponding demosaicing result in a higher resolving power in the luma channel, but a lower resolving power in the chroma channels, when compared to the traditional Bayer sensor.
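As a rough sketch of the modulation-based measurement idea (not the paper's chart design, demosaicing, or luma/chroma pipeline), the code below estimates a modulation transfer function by measuring the contrast of a blurred sinusoidal target at several spatial frequencies and normalizing to the lowest frequency; the blur kernel and frequencies are assumptions.

```python
import numpy as np

def modulation(signal):
    """Michelson contrast of a 1-D intensity profile."""
    return (signal.max() - signal.min()) / (signal.max() + signal.min())

def simulated_capture(freq_cyc_per_px, n_px=256, blur_sigma_px=1.2):
    """Render a sinusoidal target and blur it to mimic an optics/sensor PSF."""
    x = np.arange(n_px)
    target = 0.5 + 0.4 * np.sin(2 * np.pi * freq_cyc_per_px * x)
    # Gaussian blur by direct convolution (the kernel is an assumption).
    k = np.arange(-10, 11)
    kernel = np.exp(-0.5 * (k / blur_sigma_px) ** 2)
    kernel /= kernel.sum()
    return np.convolve(target, kernel, mode="same")[20:-20]

freqs = np.array([0.02, 0.05, 0.10, 0.15, 0.20, 0.25])  # cycles per pixel
m = np.array([modulation(simulated_capture(f)) for f in freqs])
mtf = m / m[0]  # normalize to the lowest measured frequency
for f, v in zip(freqs, mtf):
    print(f"{f:.2f} cyc/px  MTF = {v:.2f}")
```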
The phonetics of talk in interaction--introduction to the special issue.
Ogden, Richard
2012-03-01
This overview paper provides an introduction to work on naturally-occurring speech data, combining techniques of conversation analysis with techniques and methods from phonetics. The paper describes the development of the field, highlighting current challenges and progress in interdisciplinary work. It considers the role of quantification and its relationship to a qualitative methodology. It presents the conversation analytic notion of sequence as a version of context, and argues that sequences of talk constrain relevant phonetic design, and so provide one account for variability in naturally occurring speech. The paper also describes the manipulation of speech and language on many levels simultaneously. All of these themes occur and are explored in more detail in the papers contained in this special issue.
Gil Llario, M D; Vicent Catalá, Consuelo
2009-02-01
Comparative analysis of the efficacy of a playful-narrative programme to teach mathematics at the pre-school level. In this paper, the effectiveness of a programme comprising several components that are meant to consolidate mathematical concepts and abilities at the pre-school level is analyzed. The instructional methodology of this programme is compared to other methodologies. One hundred 5- to 6-year-old children made up the sample, which was distributed across the following conditions: (1) traditional methodology; (2) methodology with perceptual and manipulative components; and (3) methodology with language and playful components. Mathematical competence was assessed with the Mathematical Criterial Pre-school Test and the quantitative-numeric concepts subtest of the BADyG. Participants were evaluated before and after the academic course during which they followed one of these methodologies. The results show that the programme with language and playful components is more effective than the traditional methodology (p<.001) and also more effective than the perceptual and manipulative methodology (p<.001). Implications of the results for instructional practices are analyzed.
Tasneem, Asba; Aberle, Laura; Ananth, Hari; Chakraborty, Swati; Chiswell, Karen; McCourt, Brian J.; Pietrobon, Ricardo
2012-01-01
Background: The ClinicalTrials.gov registry provides information regarding characteristics of past, current, and planned clinical studies to patients, clinicians, and researchers; in addition, registry data are available for bulk download. However, issues related to data structure, nomenclature, and changes in data collection over time present challenges to the aggregate analysis and interpretation of these data in general and to the analysis of trials according to clinical specialty in particular. Improving usability of these data could enhance the utility of ClinicalTrials.gov as a research resource. Methods/Principal Results: The purpose of our project was twofold. First, we sought to extend the usability of ClinicalTrials.gov for research purposes by developing a database for aggregate analysis of ClinicalTrials.gov (AACT) that contains data from the 96,346 clinical trials registered as of September 27, 2010. Second, we developed and validated a methodology for annotating studies by clinical specialty, using a custom taxonomy employing Medical Subject Heading (MeSH) terms applied by an NLM algorithm, as well as MeSH terms and other disease condition terms provided by study sponsors. Clinical specialists reviewed and annotated MeSH and non-MeSH disease condition terms, and an algorithm was created to classify studies into clinical specialties based on both MeSH and non-MeSH annotations. False positives and false negatives were evaluated by comparing algorithmic classification with manual classification for three specialties. Conclusions/Significance: The resulting AACT database features study design attributes parsed into discrete fields, integrated metadata, and an integrated MeSH thesaurus, and is available for download as Oracle extracts (.dmp file and text format). This publicly-accessible dataset will facilitate analysis of studies and permit detailed characterization and analysis of the U.S. clinical trials enterprise as a whole. In addition, the methodology we present for creating specialty datasets may facilitate other efforts to analyze studies by specialty groups. PMID:22438982
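A hypothetical, much-simplified sketch of the specialty-annotation idea described above: reviewed MeSH and non-MeSH condition terms are mapped to specialties, and each study is labeled by the specialties its terms hit. The term lists, study records, and function names below are invented for illustration and are not the AACT taxonomy or algorithm.

```python
# Hypothetical term-to-specialty map standing in for specialist-reviewed annotations.
SPECIALTY_TERMS = {
    "cardiology": {"myocardial infarction", "heart failure", "atrial fibrillation"},
    "oncology": {"breast neoplasms", "lung neoplasms", "leukemia"},
    "mental health": {"depressive disorder", "schizophrenia", "anxiety disorders"},
}

def classify_study(condition_terms):
    """Return every specialty whose reviewed term list matches a study term."""
    terms = {t.strip().lower() for t in condition_terms}
    return sorted(spec for spec, vocab in SPECIALTY_TERMS.items() if terms & vocab)

# Toy study records standing in for parsed registry entries.
studies = {
    "NCT00000001": ["Heart Failure", "Diabetes Mellitus"],
    "NCT00000002": ["Breast Neoplasms"],
    "NCT00000003": ["Insomnia"],  # no match -> unclassified
}

for nct_id, conditions in studies.items():
    labels = classify_study(conditions) or ["unclassified"]
    print(nct_id, "->", ", ".join(labels))
```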
Tasneem, Asba; Aberle, Laura; Ananth, Hari; Chakraborty, Swati; Chiswell, Karen; McCourt, Brian J; Pietrobon, Ricardo
2012-01-01
The ClinicalTrials.gov registry provides information regarding characteristics of past, current, and planned clinical studies to patients, clinicians, and researchers; in addition, registry data are available for bulk download. However, issues related to data structure, nomenclature, and changes in data collection over time present challenges to the aggregate analysis and interpretation of these data in general and to the analysis of trials according to clinical specialty in particular. Improving usability of these data could enhance the utility of ClinicalTrials.gov as a research resource. The purpose of our project was twofold. First, we sought to extend the usability of ClinicalTrials.gov for research purposes by developing a database for aggregate analysis of ClinicalTrials.gov (AACT) that contains data from the 96,346 clinical trials registered as of September 27, 2010. Second, we developed and validated a methodology for annotating studies by clinical specialty, using a custom taxonomy employing Medical Subject Heading (MeSH) terms applied by an NLM algorithm, as well as MeSH terms and other disease condition terms provided by study sponsors. Clinical specialists reviewed and annotated MeSH and non-MeSH disease condition terms, and an algorithm was created to classify studies into clinical specialties based on both MeSH and non-MeSH annotations. False positives and false negatives were evaluated by comparing algorithmic classification with manual classification for three specialties. The resulting AACT database features study design attributes parsed into discrete fields, integrated metadata, and an integrated MeSH thesaurus, and is available for download as Oracle extracts (.dmp file and text format). This publicly-accessible dataset will facilitate analysis of studies and permit detailed characterization and analysis of the U.S. clinical trials enterprise as a whole. In addition, the methodology we present for creating specialty datasets may facilitate other efforts to analyze studies by specialty groups.
Conjoint analysis: using a market-based research model for healthcare decision making.
Mele, Nancy L
2008-01-01
Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. To describe the conjoint analysis methodology and explore value-added applications in nursing research. Conjoint analysis methodology is described, using examples from the healthcare and business literature, and personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
Functional Modification of Thioether Groups in Peptides, Polypeptides, and Proteins.
Deming, Timothy J
2017-03-15
Recent developments in the modification of methionine and other thioether-containing residues in peptides, polypeptides, and proteins are reviewed. Properties and potential applications of the resulting functionalized products are also discussed. While much of this work is focused on natural Met residues, modifications at other side-chain residues have also emerged as new thioether-containing amino acids have been incorporated into peptidic materials. Functional modification of thioether-containing amino acids has many advantages and is a complementary methodology to the widely utilized methods for modification at cysteine residues.
Methodologies for the Statistical Analysis of Memory Response to Radiation
NASA Astrophysics Data System (ADS)
Bosser, Alexandre L.; Gupta, Viyas; Tsiligiannis, Georgios; Frost, Christopher D.; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigné, Frédéric; Virtanen, Ari; Wrobel, Frédéric; Dilillo, Luigi
2016-08-01
Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].
Global-local methodologies and their application to nonlinear analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1989-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Sanders
2006-09-01
Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi-group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies for determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation.
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paff, S. W; Doody, S.
2003-02-25
This paper discusses the challenges associated with creating a data management system for waste tracking at the Advanced Mixed Waste Treatment Plant (AMWTP) at the Idaho National Engineering and Environmental Laboratory (INEEL). The waste tracking system combines data from plant automation systems and decision points. The primary purpose of the system is to provide information to enable the plant operators and engineers to assess the risks associated with each container and determine the best method of treating it. It is also used to track the transuranic (TRU) waste containers as they move throughout the various processes at the plant. And finally, the goal of the system is to support paperless shipments of the waste to the Waste Isolation Pilot Plant (WIPP). This paper describes the approach, methodologies, the underlying design of the database, and the challenges of creating the Data Management System (DMS) prior to completion of design and construction of a major plant. The system was built utilizing an Oracle database platform and Oracle Forms 6i in client-server mode. The underlying data architecture is container-centric, with separate tables and objects for each type of analysis used to characterize the waste, including real-time radiography (RTR), non-destructive assay (NDA), head-space gas sampling and analysis (HSGS), visual examination (VE) and coring. The use of separate tables facilitated the construction of automatic interfaces with the analysis instruments that enabled direct data capture. Movements are tracked using a location system describing each waste container's current location and a history table tracking the container's movement history. The movement system is designed to interface both with radio-frequency bar-code devices and the plant's integrated control system (ICS). Collections of containers or information, such as batches, were created across the various types of analyses, which enabled a single, cohesive approach to be developed for verification and validation activities. The DMS includes general system functions, including task lists, electronic signature, non-conformance reports and message systems, that cut vertically across the remaining subsystems. Oracle's security features were utilized to ensure that only authorized users were allowed to log in, and to restrict access to system functionality according to user role.
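A minimal, hypothetical sketch of the container-centric layout described above, with separate per-analysis tables plus a current-location field and a movement history table; it uses SQLite purely for illustration, and all table and column names are invented rather than taken from the AMWTP DMS (which was built on Oracle).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE containers (
    container_id TEXT PRIMARY KEY,
    waste_type   TEXT,
    current_loc  TEXT
);
-- One table per characterization method, keyed to the container.
CREATE TABLE rtr_results (
    container_id TEXT REFERENCES containers(container_id),
    exam_date    TEXT,
    prohibited_items_found INTEGER
);
CREATE TABLE nda_results (
    container_id TEXT REFERENCES containers(container_id),
    assay_date   TEXT,
    pu_grams     REAL
);
-- Movement history, appended each time the container changes location.
CREATE TABLE movements (
    container_id TEXT REFERENCES containers(container_id),
    moved_at     TEXT,
    from_loc     TEXT,
    to_loc       TEXT
);
""")

conn.execute("INSERT INTO containers VALUES ('TRU-0001', 'debris', 'RECEIVING')")
conn.execute("INSERT INTO nda_results VALUES ('TRU-0001', '2003-01-15', 12.4)")
conn.execute("INSERT INTO movements VALUES ('TRU-0001', '2003-01-16', 'RECEIVING', 'RTR-BAY')")
conn.execute("UPDATE containers SET current_loc = 'RTR-BAY' WHERE container_id = 'TRU-0001'")

for row in conn.execute("""
    SELECT c.container_id, c.current_loc, n.pu_grams
    FROM containers c LEFT JOIN nda_results n USING (container_id)
"""):
    print(row)
```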
Azaripour, Adriano; Lagerweij, Tonny; Scharfbillig, Christina; Jadczak, Anna Elisabeth; Willershausen, Brita; Van Noorden, Cornelis J F
2016-08-01
For 3-dimensional (3D) imaging of a tissue, 3 methodological steps are essential and their successful application depends on specific characteristics of the type of tissue. The steps are (1) clearing of the opaque tissue to render it transparent for microscopy, (2) fluorescence labeling of the tissue and (3) 3D imaging. In the past decades, new methodologies were introduced for the clearing steps with their specific advantages and disadvantages. Most clearing techniques have been applied to the central nervous system and other organs that contain relatively low amounts of connective tissue including extracellular matrix. However, tissues that contain large amounts of extracellular matrix such as dermis in skin or gingiva are difficult to clear. The present survey lists methodologies that are available for clearing of tissues for 3D imaging. We report here that the BABB method using a mixture of benzyl alcohol and benzyl benzoate and iDISCO using dibenzylether (DBE) are the most successful methods for clearing connective tissue-rich gingiva and dermis of skin for 3D histochemistry and imaging of fluorescence using light-sheet microscopy. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.
Yeşiller, Gülden; Sezgintürk, Mustafa Kemal
2015-11-10
In this research, a novel enzyme activity analysis methodology is introduced as a new perspective for this area. The activity of elastase, a digestive enzyme mostly found in the digestive system of vertebrates, was determined by an electrochemical device composed of carbon nanotubes and a second enzyme, glucose oxidase, which was used as a signal-generator enzyme. In this novel methodology, a complex bioactive layer was constructed by using carbon nanotubes, glucose oxidase and a supporting protein, gelatin, on a solid, conductive substrate. The activity of elastase was determined by monitoring the hydrolysis rate of the bioactive layer by elastase. As a result of this hydrolysis, glucose oxidase was dissociated from the bioactive layer, and the electrochemical signal due to glucose oxidase decreased. The progressive elastase-catalyzed digestion of the bioactive layer containing glucose oxidase decreased the layer's enzymatic efficiency, resulting in a decrease of the glucose oxidation current as a function of the enzyme activity. The ratio of the decrease was correlated to the elastase activity level. In this study, optimization experiments of bioactive components and characterization of the resulting new electrochemical device were carried out. A linear calibration range from 0.0303 U/mL to 0.0729 U/mL of elastase was reported. Real sample analyses were also carried out with the new electrochemical device. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Ross, Linda
2003-01-01
Recent work with automotive e-commerce clients led to the development of a performance analysis methodology called the Seven Performance Drivers, including: standards, incentives, capacity, knowledge and skill, measurement, feedback, and analysis. This methodology has been highly effective in introducing and implementing performance improvement.…
77 FR 1454 - Request for Nominations of Members To Serve on the Census Scientific Advisory Committee
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-10
..., statistical analysis, survey methodology, geospatial analysis, econometrics, cognitive psychology, and... following disciplines: Demography, economics, geography, psychology, statistics, survey methodology, social... technical expertise in such areas as demography, economics, geography, psychology, statistics, survey...
Stochastic response surface methodology: A study in the human health area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa
2015-03-10
In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application to survival analysis in the breast cancer context is implemented with R software.
Roadway safety analysis methodology for Utah : final report.
DOT National Transportation Integrated Search
2016-12-01
This research focuses on the creation of a three-part Roadway Safety Analysis methodology that applies and automates the cumulative work of recently-completed roadway safety research. The first part is to prepare the roadway and crash data for analys...
Health economic assessment: a methodological primer.
Simoens, Steven
2009-12-01
This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.
Health Economic Assessment: A Methodological Primer
Simoens, Steven
2009-01-01
This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments. PMID:20049237
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
State solar initiatives. Volume 2: A review
NASA Astrophysics Data System (ADS)
Koontz, R.; Neuendorffer, J.; Green, B.; Myring, G.; Myring, L.; Perwin, E.; Gordon, N.; Small, D.; Poster, B.
1981-09-01
Background material supporting the solar energy recommendations and conclusions is provided. The volume contains the research methodology, the results of a computer program on state and federal tax credits, state energy goals, program lists, and energy and demographic factors.
Reagent Selection Methodology for a Novel Explosives Detection Platform
Warner, Marvin
2018-02-14
This video describes research being conducted by Dr. Marvin Warner, a research scientist at Pacific Northwest National Laboratory, on using individual pieces of antibodies to set up a chemical reaction that gives off light simply by mixing reagents with a sample that contains an explosive molecule. This technology would help detect whether explosives are present using just a handheld system or container.
NASA Astrophysics Data System (ADS)
Tene, Yair; Tene, Noam; Tene, G.
1993-08-01
An interactive data fusion methodology combining video, audio, and nonlinear structural dynamic analysis, for potential application in forensic engineering, is presented. The methodology was developed and successfully demonstrated in the analysis of a heavy transportable bridge collapse during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracks and rupture of high-performance structural materials. A videotape recording from a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology resulted in extracting relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.
NET: a new framework for the vectorization and examination of network data.
Lasser, Jana; Katifori, Eleni
2017-01-01
The analysis of complex networks, both in general and as pertaining to real biological systems, has been the focus of intense scientific attention in the past and present. In this paper we introduce two tools that provide fast and efficient means for the processing and quantification of biological networks like Drosophila tracheoles or leaf venation patterns: the Network Extraction Tool (NET) to extract data and the Graph-edit-GUI (GeGUI) to visualize and modify networks. NET is especially designed for high-throughput semi-automated analysis of biological datasets containing digital images of networks. The framework starts with the segmentation of the image and then proceeds to vectorization using methodologies from optical character recognition. After a series of steps to clean and improve the quality of the extracted data, the framework produces a graph in which the network is represented only by its nodes and neighborhood relations. The final output contains information about the adjacency matrix of the graph, the width of the edges and the positions of the nodes in space. NET also provides tools for statistical analysis of the network properties, such as the number of nodes or total network length. Other, more complex metrics can be calculated by importing the vectorized network into specialized network analysis packages. GeGUI is designed to facilitate manual correction of non-planar networks, as these may contain artifacts or spurious junctions due to branches crossing each other. It is tailored for, but not limited to, the processing of networks from microscopy images of Drosophila tracheoles. The networks extracted by NET closely approximate the network depicted in the original image. NET is fast, yields reproducible results and is able to capture the full geometry of the network, including curved branches. Additionally, GeGUI allows easy handling and visualization of the networks.
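A small sketch of how the final NET output described above (nodes with positions, edges with widths, an adjacency matrix, total network length) can be represented in a general-purpose graph library; the toy network below is hand-made, not extracted from an image, and NET's own data format is not assumed.

```python
import math
import networkx as nx

# Toy vectorized network: node positions in image coordinates (pixels).
positions = {0: (0.0, 0.0), 1: (10.0, 0.0), 2: (10.0, 8.0), 3: (18.0, 8.0)}

G = nx.Graph()
for node, xy in positions.items():
    G.add_node(node, pos=xy)

# Edge widths (px) as NET would report them; lengths from node geometry.
for u, v, width in [(0, 1, 2.5), (1, 2, 1.8), (2, 3, 1.1)]:
    length = math.dist(positions[u], positions[v])
    G.add_edge(u, v, width=width, length=length)

total_length = sum(d["length"] for _, _, d in G.edges(data=True))
print("nodes:", G.number_of_nodes())
print("total network length (px):", round(total_length, 2))
print("adjacency matrix:\n", nx.to_numpy_array(G, weight=None))
```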
Irei, Satoshi
2016-01-01
Molecular marker analysis of environmental samples often requires time-consuming preseparation steps. Here, analysis of low-volatility nonpolar molecular markers (5-6 ring polycyclic aromatic hydrocarbons or PAHs, hopanoids, and n-alkanes) without the preseparation procedure is presented. Analysis of artificial sample extracts was conducted directly by gas chromatography-mass spectrometry (GC-MS). After every sample injection, a standard mixture was also analyzed to correct for the variation in instrumental sensitivity caused by the unfavorable matrix contained in the extract. The method was further validated for the PAHs using the NIST standard reference materials (SRMs) and then applied to airborne particulate matter samples. Tests with the SRMs showed that overall our methodology was validated with an uncertainty of ~30%. The measurement results of airborne particulate matter (PM) filter samples showed a strong correlation between the PAHs, implying contributions from the same emission source. Analysis of size-segregated PM filter samples showed that their size distributions peaked in the PM smaller than 0.4 μm aerodynamic diameter. The observations were consistent with our expectation of their possible sources. Thus, the method was found to be useful for molecular marker studies. PMID:27127511
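A schematic of the bracketing-standard correction described above: the response of a standard mixture run after each sample is used to rescale analyte responses for matrix-induced sensitivity drift before quantification. The compound names, peak areas, and calibration slope below are invented for illustration.

```python
# Calibration slopes from neat standards (peak area per ng injected), hypothetical.
calibration_slope = {"benzo[a]pyrene": 1250.0, "coronene": 980.0}

# Reference response of the standard mixture measured under clean conditions.
std_reference_area = 50_000.0

def corrected_amount(compound, sample_area, std_area_after_sample):
    """Correct for matrix-induced sensitivity drift, then quantify."""
    drift = std_area_after_sample / std_reference_area   # e.g. 0.8 = 20% suppression
    corrected_area = sample_area / drift
    return corrected_area / calibration_slope[compound]  # ng injected

amount = corrected_amount("benzo[a]pyrene", sample_area=31_000.0,
                          std_area_after_sample=41_500.0)
print(f"benzo[a]pyrene: {amount:.1f} ng")
```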
Direct Bio-printing with Heterogeneous Topology Design.
Ahsan, Amm Nazmul; Xie, Ruinan; Khoda, Bashir
2017-01-01
Bio-additive manufacturing is a promising tool for fabricating porous scaffold structures to expedite tissue regeneration processes. Unlike most traditional bulk-material objects, the microstructures of tissues and organs are mostly highly anisotropic, heterogeneous, and porous in nature. However, modelling the internal heterogeneity of tissue/organ structures in the traditional CAD environment is difficult and oftentimes inaccurate. Besides, the de facto STL conversion of bio-models introduces loss of information and piles up more errors in each subsequent step (build orientation, slicing, tool-path planning) of the bio-printing process plan. We propose a topology-based scaffold design methodology to accurately represent the heterogeneous internal architecture of tissues/organs. An image analysis technique is used that digitizes the topology information contained in medical images of tissues/organs. A weighted topology reconstruction algorithm is implemented to represent the heterogeneity with parametric functions. The parametric functions are then used to map the spatial material distribution. The generated information is transferred directly to the 3D bio-printer, and the heterogeneous porous tissue scaffold structure is manufactured without an STL file. The proposed methodology is implemented to verify the effectiveness of the approach, and the designed example structure is bio-fabricated with a deposition-based bio-additive manufacturing system.
Shafi, Mohammad Shoaib; Faisal, Tayyaba; Naseem, Sajida; Javed, Sajida
2018-03-01
To evaluate understanding of biostatistics among postgraduate medical trainees before and after a biostatistics workshop. Quasi-experimental study. Regional Centre, Islamabad, College of Physicians and Surgeons Pakistan, from March to September 2017. Two hundred and seventy postgraduate trainees were enrolled after taking informed consent. A structured questionnaire containing 21 multiple choice questions regarding understanding and application of biostatistics was given to all participants on the first and the last day of the workshop, and pre- and post-workshop responses were compared with the McNemar test of significance. SPSS version 21 was used for data analysis, with a p-value <0.05 as the significance level. The response rate was 100%. Among these participants, 81 (30%) were male and 189 (70%) were female; mean age was 28.5 ± 2.5 years. One hundred and twenty-five (46%) postgraduate trainees were from Islamabad. Most of the doctors were in the first year (37%) and second year (57%) of their training. With total correct answers of 42.9% (pre-workshop) and 57% (post-workshop), the p-value was <0.001. Understanding regarding the application of biostatistics in research among postgraduate trainees improved significantly and immediately after teaching biostatistics in a research methodology workshop.
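The paired pre/post comparison of correct and incorrect answers is the setting for the McNemar test; a minimal sketch using statsmodels is shown below, with an invented 2x2 table rather than the study's data.

```python
from statsmodels.stats.contingency_tables import mcnemar

# Paired pre/post results for one question, hypothetical counts:
# rows = pre-workshop (correct, incorrect), cols = post-workshop (correct, incorrect)
table = [[80, 15],
         [95, 80]]

# Chi-square version with continuity correction; exact=True would use the binomial test.
result = mcnemar(table, exact=False, correction=True)
print(f"McNemar statistic = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```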
Wallace, Nathan D; Ceguerra, Anna V; Breen, Andrew J; Ringer, Simon P
2018-06-01
Atom probe tomography is a powerful microscopy technique capable of reconstructing the 3D position and chemical identity of millions of atoms within engineering materials, at the atomic level. Crystallographic information contained within the data is particularly valuable for the purposes of reconstruction calibration and grain boundary analysis. Typically, analysing these data is a manual, time-consuming and error-prone process. In many cases, the crystallographic signal is so weak that it is difficult to detect at all. In this study, a new automated signal processing methodology is demonstrated. We use the affine properties of the detector coordinate space, or 'detector stack', as the basis for our calculations. The methodological framework and the visualisation tools are shown to be superior to the standard method of crystallographic pole visualisation directly from field evaporation images, and there is no requirement for iterations between a full real-space initial tomographic reconstruction and the detector stack. The mapping approaches are demonstrated for aluminium, tungsten, magnesium and molybdenum. Implications for reconstruction calibration, accuracy of crystallographic measurements, reliability and repeatability are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Caloz, Misael; Kafrouni, Marilyne; Leturgie, Quentin; Corde, Stéphanie; Downes, Simon; Lehmann, Joerg; Thwaites, David
2015-01-01
There are few reported intercomparisons or audits of combinations of advanced radiotherapy methods, particularly for 4D treatments. As part of an evaluation of the implementation of advanced radiotherapy technology, a phantom and associated methods, initially developed for in-house commissioning and QA of 4D lung treatments, has been developed further with the aim of using it for end-to-end dose intercomparison of 4D treatment planning and delivery. The respiratory thorax phantom can house moving inserts with variable speed (breathing rate) and motion amplitude. In one set-up mode it contains a small ion chamber for point dose measurements, or alternatively it can hold strips of radiochromic film to measure dose distributions. Initial pilot and feasibility measurements have been carried out in one hospital to thoroughly test the methods and procedures before using it more widely across a range of hospitals and treatment systems. Overall, the results show good agreement between measured and calculated doses and distributions, supporting the use of the phantom and methodology for multi-centre intercomparisons. However, before wider use, refinements of the method and analysis are currently underway particularly for the film measurements.
Niu, Sheng; Zheng, Lijuan; Khan, Abdul Qayyum; Feng, Guang; Zeng, Heping
2018-03-01
A fast and sensitive analysis of trace-level heavy metals in aqueous solution was realized by using an improved laser-induced breakdown spectroscopy (LIBS) methodology. Solutions containing the heavy metal elements Ni, Cr, and Cd were concentrated in a laser-pretreated area (25 × 20 mm²) of a polished aluminum target surface, wherein pretreated grooves enabled homogeneous distribution of the metallic solutions in the well-defined area, and laser ablation of the aluminum target produced unique plasma excitation of the various metallic ions. For 1-mL solutions deposited, we obtained an analytical precision of about 7% relative standard deviation (RSD), and limits of detection (LODs) of 22, 19, and 184 μg/L for Ni, Cr, and Cd, respectively. Moreover, the laser-pretreated metallic microstructure allowed more solution to be deposited with the help of a hot plate, which improved the LODs to the sub-μg/L level for Cr and Ni and to the μg/L level for Cd, with about 20 mL of solution engaged in the enrichment processes. The applicability of the proposed methodology was validated on certified reference materials and real river water. Copyright © 2017 Elsevier B.V. All rights reserved.
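Figures of merit like those quoted above are commonly derived from a linear calibration curve, with the LOD taken as three times the blank standard deviation divided by the slope and precision as the relative standard deviation of replicates; the sketch below shows that arithmetic on invented intensity data, not the paper's spectra.

```python
import numpy as np

# Hypothetical calibration: Cr concentration (ug/L) vs background-corrected line intensity.
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
intensity = np.array([12.0, 610.0, 1190.0, 2405.0, 4820.0])

slope, intercept = np.polyfit(conc, intensity, 1)

# Standard deviation of repeated blank measurements (hypothetical).
blank_sd = 75.0
lod = 3.0 * blank_sd / slope

# Precision as relative standard deviation of replicate measurements of one solution.
replicates = np.array([1150.0, 1220.0, 1185.0, 1090.0, 1240.0])
rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()

print(f"slope = {slope:.2f} counts per ug/L, LOD = {lod:.1f} ug/L, RSD = {rsd:.1f}%")
```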
Locci, Antonio Mario; Cincotti, Alberto; Todde, Sara; Orrù, Roberto; Cao, Giacomo
2010-01-01
A novel methodology is proposed for investigating the effect of the pulsed electric current during the spark plasma sintering (SPS) of electrically conductive powders without potential misinterpretation of experimental results. First, ensemble configurations (geometry, size and material of the powder sample, die, plunger and spacers) are identified where the electric current is forced to flow only through either the sample or the die, so that the sample is heated either through the Joule effect or by thermal conduction, respectively. These ensemble configurations are selected using a recently proposed mathematical model of an SPS apparatus, which, once suitably modified, makes it possible to carry out detailed electrical and thermal analysis. Next, SPS experiments are conducted using the ensemble configurations theoretically identified. Using aluminum powders as a case study, we find that the temporal profiles of sample shrinkage, which indicate densification behavior, as well as the final density of the sample are clearly different when the electric current flows only through the sample or through the die containing it, whereas the temperature cycle and mechanical load are the same in both cases. PMID:27877354
Measurement-based analysis of error latency. [in computer operating system
NASA Technical Reports Server (NTRS)
Chillarege, Ram; Iyer, Ravishankar K.
1987-01-01
This paper demonstrates a practical methodology for the study of error latency under a real workload. The method is illustrated with sampled data on the physical memory activity, gathered by hardware instrumentation on a VAX 11/780 during the normal workload cycle of the installation. These data are used to simulate fault occurrence and to reconstruct the error discovery process in the system. The technique provides a means to study the system under different workloads and for multiple days. An approach to determine the percentage of undiscovered errors is also developed and a verification of the entire methodology is performed. This study finds that the mean error latency, in the memory containing the operating system, varies by a factor of 10 to 1 (in hours) between the low and high workloads. It is found that of all errors occurring within a day, 70 percent are detected in the same day, 82 percent within the following day, and 91 percent within the third day. The increase in failure rate due to latency is not so much a function of remaining errors but is dependent on whether or not there is a latent error.
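A toy version of the latency reconstruction idea: given sampled access times for a memory region, a fault injected at a random instant is "discovered" at the first subsequent access, and the latency is the gap between the two. The synthetic trace and thresholds below are assumptions, not the VAX 11/780 measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
HOURS = 72.0

# Synthetic access times (hours) for one memory region over three days.
accesses = np.sort(rng.uniform(0, HOURS, size=400))

def latency_of_fault(fault_time, access_times):
    """Hours from fault occurrence to the first access that could detect it."""
    later = access_times[access_times >= fault_time]
    return later[0] - fault_time if later.size else np.inf  # inf = never discovered

faults = rng.uniform(0, HOURS, size=5_000)
latencies = np.array([latency_of_fault(t, accesses) for t in faults])
finite = latencies[np.isfinite(latencies)]

print(f"mean latency = {finite.mean():.2f} h")
for day in (1, 2, 3):
    frac = np.mean(latencies <= 24.0 * day)
    print(f"discovered within {day} day(s): {100 * frac:.1f}%")
```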
Groundwater pollution risk assessment. Application to different carbonate aquifers in south Spain
NASA Astrophysics Data System (ADS)
Jimenez Madrid, A.; Martinez Navarrete, C.; Carrasco Cantos, F.
2009-04-01
Water protection has been considered one of the most important environmental goals in European politics since the 2000/60/EC Water Framework Directive came into force in 2000, and more specifically since 2006 with the 2006/118/EC Directive on groundwater protection. As one of the necessary requirements for tackling groundwater protection, a pollution risk assessment has been made through the analysis of both the existing map of hazardous human activities and the intrinsic aquifer vulnerability map, by applying the methodologies proposed by COST Action 620 to an experimental study site in southern Spain containing different carbonate aquifers, which supply 8 towns ranging from 2000 to 2500 inhabitants. In order to generate both maps it was necessary to make a field inventory over a 1:10000 topographic base map, followed by Geographic Information System (GIS) processing. The resulting maps show a clear spatial distribution of both pollution risk and intrinsic vulnerability of the carbonate aquifers studied. As a final result, a map of the intensity of groundwater pollution risk is presented, representing an important basis for the development of a proper methodology for the protection of groundwater resources intended for human consumption. Keywords: hazard, vulnerability, risk, GIS, protection.
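A minimal sketch, in the spirit of the COST Action 620 approach, of combining a hazard map with an intrinsic vulnerability map cell by cell to obtain risk-intensity classes; the rasters and class thresholds below are invented, not the study-site data.

```python
import numpy as np

# Toy rasters on the same grid: hazard index (0-1) and intrinsic vulnerability (0-1).
hazard = np.array([[0.1, 0.4, 0.8, 0.9],
                   [0.2, 0.5, 0.7, 0.6],
                   [0.0, 0.3, 0.4, 0.2],
                   [0.1, 0.2, 0.3, 0.1]])
vulnerability = np.array([[0.9, 0.8, 0.7, 0.9],
                          [0.5, 0.6, 0.8, 0.7],
                          [0.3, 0.4, 0.6, 0.5],
                          [0.2, 0.3, 0.4, 0.3]])

# Simple multiplicative combination into a risk-intensity index.
risk_intensity = hazard * vulnerability

# Classify into three classes with hypothetical thresholds.
classes = np.digitize(risk_intensity, bins=[0.15, 0.45])  # 0=low, 1=moderate, 2=high
labels = np.array(["low", "moderate", "high"])
print(labels[classes])
```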
Molinos-Senante, M; Garrido-Baserba, M; Reif, R; Hernández-Sancho, F; Poch, M
2012-06-15
The preliminary design and economic assessment of small wastewater treatment plants (less than 2000 population equivalent) are issues of particular interest, since wastewaters from most of these agglomerations are not yet covered. This work aims to assess nine different technology set-ups for secondary treatment in such facilities, embracing both economic and environmental parameters. The main novelty of this work is the combination of an innovative environmental decision support system (EDSS) with a pioneering approach based on the inclusion of the environmental benefits derived from wastewater treatment. The integration of methodologies based on cost-benefit analysis tools with the vast amount of knowledge on treatment technologies contained in the EDSS was applied in nine scenarios comprising different wastewater characteristics and reuse options. Hence, a useful economic feasibility indicator is obtained for each technology, including internal and external costs and, for the first time, the benefits associated with the environmental damage avoided. This new methodology proved to be crucial for supporting the decision process, contributing to improving the sustainability of new treatment facilities, and allows the selection of the most feasible technologies from a wide set of possibilities. Copyright © 2012 Elsevier B.V. All rights reserved.
Fluvial sediment fingerprinting: literature review and annotated bibliography
Williamson, Joyce E.; Haj, Adel E.; Stamm, John F.; Valder, Joshua F.; Prautzch, Vicki L.
2014-01-01
The U.S. Geological Survey has evaluated and adopted various field methods for collecting real-time sediment and nutrient data. These methods have proven to be valuable representations of sediment and nutrient concentrations and loads but are not able to accurately identify specific source areas. Recently, more advanced data collection and analysis techniques have been evaluated that show promise in identifying specific source areas. Application of field methods could include studies of sources of fluvial sediment, otherwise referred to as sediment “fingerprinting.” The identification of sediment sources is important, in part, because knowing the primary sediment source areas in watersheds ensures that best management practices are incorporated in areas that maximize reductions in sediment loadings. This report provides a literature review and annotated bibliography of existing methodologies applied in the field of fluvial sediment fingerprinting. This literature review provides a bibliography of publications where sediment fingerprinting methods have been used; however, this report is not intended to provide an exhaustive listing. Selected publications were categorized by methodology with some additional summary information. The information contained in the summary may help researchers select methods better suited to their particular study or study area, and identify methods in need of more testing and application.
Payload training methodology study
NASA Technical Reports Server (NTRS)
1990-01-01
The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-23
... the large break loss-of-coolant accident (LOCA) analysis methodology with a reference to WCAP-16009-P... required by 10 CFR 50.91(a), the licensee has provided its analysis of the issue of no significant hazards... Section 5.6.5 to incorporate a new large break LOCA analysis methodology. Specifically, the proposed...
Treatment of Farm Families under Need Analysis for Student Aid. Final Report.
ERIC Educational Resources Information Center
National Computer Systems, Inc., Arlington, VA.
In response to Congressional request, this report compares the treatment of student financial aid applicants from farm families and non-farm families under two need-analysis formulae. Both the need-analysis methodology for Pell Grants and the Congressional Methodology (CM) for other federal aid calculate ability to pay as a function of income and…
ERIC Educational Resources Information Center
Mukan, Nataliya; Kravets, Svitlana
2015-01-01
In the article the methodology of comparative analysis of public school teachers' continuing professional development (CPD) in Great Britain, Canada and the USA has been presented. The main objectives are defined as theoretical analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research;…
Security Quality Requirements Engineering (SQUARE) Methodology
2005-11-01
such as Joint Application Development and the Accelerated Requirements Method [Wood 89, Hubbard 99] • Soft Systems Methodology [Checkland 89] ... investigated were misuse cases [Jacobson 92], Soft Systems Methodology (SSM) [Checkland 89], Quality Function Deployment (QFD) [QFD 05], Controlled ... html (2005). [Checkland 89] Checkland, Peter. Soft Systems Methodology. Rational Analysis for a Problematic World. New York, NY: John Wiley & Sons
Analysis of Additive Manufacturing for Sustainment of Naval Aviation Systems
2017-09-01
selection methodology to query the aviation spare-parts inventory for identification of additive manufacturing candidates. The methodology organizes...a component selection methodology to query the aviation spare-parts inventory for identification of additive manufacturing candidates. The... methodology organizes the resultant data using a top-down approach that aligns technical feasibility with programmatic objectives. Finally, a discrete event
ERIC Educational Resources Information Center
Seymour, Sharon
1991-01-01
Review of research methodologies used in studies of online public access catalog (OPAC) users finds that a variety of research methodologies--e.g., surveys, transaction log analysis, interviews--have been used with varying degrees of expertise. It is concluded that poor research methodology resulting from limited training and resources limits the…
The RAAF Logistics Study. Volume 4,
1986-10-01
Use of Issue-Based Root Definitions; Application of Soft Systems Methodology to Information Systems Analysis; Conclusion; List of Abbreviations ... Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. 5. The soft systems methodology was developed to tackle ... the soft systems methodology has many advantages which recommend it to this type of study area, it does not model the time evolution of a system
Aircraft optimization by a system approach: Achievements and trends
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1992-01-01
Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as enabling technology for practical implementation of the methodology.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1986-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
Hoskin, Jordan D; Miyatani, Masae; Craven, B Catharine
2017-03-30
Carotid intima-media thickness (cIMT) may be used increasingly as a cardiovascular disease (CVD) screening tool in individuals with spinal cord injury (SCI), as other routine invasive diagnostic tests are often unfeasible. However, variation in cIMT acquisition and analysis methods is an issue in the current published literature. The growth of the field is dependent on quality cIMT acquisition and analysis to ensure accurate reporting of CVD risk. The purpose of this study is to evaluate the quality of the reported methodology used to collect cIMT values in SCI. Data from 12 studies, which measured cIMT in individuals with SCI, were identified from the Medline, Embase and CINAHL databases. The quality of the reported methodologies was scored based on adherence to cIMT methodological guidelines abstracted from two consensus papers. Five studies were scored as 'moderate quality' in methodological reporting, having specified 9 to 11 of the 15 quality reporting criteria. The remaining seven studies were scored as 'low quality', having reported fewer than 9 of the 15 quality reporting criteria. No study had methodological reporting that was scored as 'high quality'. The overall reporting of quality methodology was poor in the published SCI literature. A greater adherence to current methodological guidelines is needed to advance the field of cIMT in SCI. Further research is necessary to refine cIMT acquisition and analysis guidelines to aid authors designing research and journals in screening manuscripts for publication.
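The binning rule described above (counting how many of the 15 reporting criteria a study meets) is easy to make explicit; in the sketch below the criterion names and study checklists are placeholders, and the 'high quality' cutoff is an assumption, since the review reports only that no study reached it.

```python
# Hypothetical checklist results: criterion name -> met?
studies = {
    "Study A": {"probe frequency stated": True, "ROI length stated": True,
                "end-diastolic gating": False, "reader blinding": True},
    "Study B": {"probe frequency stated": False, "ROI length stated": True,
                "end-diastolic gating": False, "reader blinding": False},
}

def quality_category(n_met, high_cutoff=12, moderate_cutoff=9):
    """Bin by criteria met; the high cutoff is an assumption (the review states
    moderate = 9-11 of 15, low = fewer than 9, and no study scored 'high')."""
    if n_met >= high_cutoff:
        return "high"
    if n_met >= moderate_cutoff:
        return "moderate"
    return "low"

for name, checklist in studies.items():
    met = sum(checklist.values())
    print(f"{name}: {met} criteria met -> {quality_category(met)} (toy checklist)")
```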
Methodology for assessing the effectiveness of access management techniques : executive summary.
DOT National Transportation Integrated Search
1998-09-14
A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis (2) establish the mea...
Martínez de Alba, Angel Emilio; Sägesser, Rudolf; Tabler, Martin; Tsagris, Mina
2003-01-01
For the identification of RNA-binding proteins that specifically interact with potato spindle tuber viroid (PSTVd), we subjected a tomato cDNA expression library prepared from viroid-infected leaves to an RNA ligand screening procedure. We repeatedly identified cDNA clones that expressed a protein of 602 amino acids. The protein contains a bromodomain and was termed viroid RNA-binding protein 1 (VIRP1). The specificity of interaction of VIRP1 with viroid RNA was studied by different methodologies, which included Northwestern blotting, plaque lift, and electrophoretic mobility shift assays. VIRP1 interacted strongly and specifically with monomeric and oligomeric PSTVd positive-strand RNA transcripts. Other RNAs, for example, U1 RNA, did not bind to VIRP1. Further, we could immunoprecipitate complexes from infected tomato leaves that contained VIRP1 and viroid RNA in vivo. Analysis of the protein sequence revealed that VIRP1 is a member of a newly identified family of transcriptional regulators associated with chromatin remodeling. VIRP1 is the first member of this family of proteins, for which a specific RNA-binding activity is shown. A possible role of VIRP1 in viroid replication and in RNA mediated chromatin remodeling is discussed. PMID:12915580