Sample records for field test methodology

  1. 75 FR 62403 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-08

    ... Project: 2011-2014 National Survey on Drug Use and Health: Methodological Field Tests (OMB No. 0930-0290..., SAMHSA received a three-year renewal of its generic clearance for methodological field tests. This will be a request for another renewal of the generic approval to continue methodological tests over the...

  2. 75 FR 78720 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-16

    .... Proposed Project: 2011-2014 National Survey on Drug Use and Health: Methodological Field Tests (OMB No..., SAMHSA received a 3-year renewal of its generic clearance for methodological field tests. This will be a request for another renewal of the generic approval to continue methodological tests over the next 3 years...

  3. Field Test of the Methodology for Succession Planning for Technical Experts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cain, Ronald A.; Kirk, Bernadette Lugue; Agreda, Carla L.

    This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, hereafter referred to as “core competencies”. The methodology has been field tested by interviewing selected retiring subject matter experts (SMEs).

  4. Single Event Test Methodologies and System Error Rate Analysis for Triple Modular Redundant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Allen, Gregory; Edmonds, Larry D.; Swift, Gary; Carmichael, Carl; Tseng, Chen Wei; Heldt, Kevin; Anderson, Scott Arlo; Coe, Michael

    2010-01-01

    We present a test methodology for estimating system error rates of Field Programmable Gate Arrays (FPGAs) mitigated with Triple Modular Redundancy (TMR). The test methodology is founded in a mathematical model, which is also presented. Accelerator data from a 90 nm Xilinx Military/Aerospace grade FPGA are shown to fit the model. Fault injection (FI) results are discussed and related to the test data. Design implementation and the corresponding impact of multiple bit upset (MBU) are also discussed.

  5. Integrated vehicle-based safety systems heavy truck field operational test, methodology and results report.

    DOT National Transportation Integrated Search

    2010-12-01

    "This document presents the methodology and results from the heavy-truck field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michiga...

  6. Integrated vehicle-based safety systems light-vehicle field operational test, methodology and results report.

    DOT National Transportation Integrated Search

    2010-12-01

    "This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...

  7. Suggestions for Job and Curriculum Ladders in Health Center Ambulatory Care: A Pilot Test of the Health Services Mobility Study Methodology.

    ERIC Educational Resources Information Center

    Gilpatrick, Eleanor

    This report contains the results of a pilot test which represents the first complete field test of methodological work begun in October 1967 under a Federal grant for the purpose of job analysis in the health services. This 4-year Health Services Mobility Study permitted basic research, field testing, practical application, and policy involvement…

  8. Electric Propulsion Test and Evaluation Methodologies for Plasma in the Environments of Space and Testing (EP TEMPEST)

    DTIC Science & Technology

    2016-04-14

    ... Path 1: magnetized electron transport impeded across magnetic field lines; transport via electron-particle collisions. Path 2: electron ... T&E (higher pressure, metallic walls) impacts stability, performance, plume properties, and thruster lifetime ... Development of T&E methodologies: current-voltage-magnetic field (I-V-B) mapping, facility interaction studies, background pressure, plasma-wall ...

  9. Overdiagnosis across medical disciplines: a scoping review

    PubMed Central

    de Groot, Joris A H; Reitsma, Johannes B; Moons, Karel G M; Hooft, Lotty; Naaktgeboren, Christiana A

    2017-01-01

    Objective To provide insight into how and in what clinical fields overdiagnosis is studied and give directions for further applied and methodological research. Design Scoping review. Data sources Medline up to August 2017. Study selection All English studies on humans, in which overdiagnosis was discussed as a dominant theme. Data extraction Studies were assessed on clinical field, study aim (ie, methodological or non-methodological), article type (eg, primary study, review), the type and role of diagnostic test(s) studied and the context in which these studies discussed overdiagnosis. Results From 4896 studies, 1851 were included for analysis. Half of all studies on overdiagnosis were performed in the field of oncology (50%). Other prevalent clinical fields included mental disorders, infectious diseases and cardiovascular diseases, accounting for 9%, 8% and 6% of studies, respectively. Overdiagnosis was addressed from a methodological perspective in 20% of studies. Primary studies were the most common article type (58%). The diagnostic tests most commonly studied were imaging tests (32%), although these were predominantly seen in oncology and cardiovascular disease (84%). Diagnostic tests were studied in a screening setting in 43% of all studies, but in as many as 75% of oncological studies. The context in which studies addressed overdiagnosis related most frequently to its estimation, accounting for 53%. Methodology on overdiagnosis estimation and definition provided a source for extensive discussion. Other contexts of discussion included definition of disease, overdiagnosis communication, trends in increasing disease prevalence, drivers and consequences of overdiagnosis, incidental findings and genomics. Conclusions Overdiagnosis is discussed across virtually all clinical fields and in different contexts. The variability in characteristics between studies and the lack of consensus on overdiagnosis definition indicate the need for a uniform typology to improve coherence and comparability of studies on overdiagnosis. PMID:29284720

  10. Mechanistic evaluation of test data from LTPP flexible pavement test sections, Vol. I

    DOT National Transportation Integrated Search

    1996-01-01

    This report summarizes the process and lessons learned from the Standardized Travel Time Surveys and Field Test project. The field tests of travel time data collection were conducted in Boston, Seattle, and Lexington in 1993. The methodologies tested...

  11. A life prediction methodology for encapsulated solar cells

    NASA Technical Reports Server (NTRS)

    Coulbert, C. D.

    1978-01-01

    This paper presents an approach to the development of a life prediction methodology for encapsulated solar cells that are intended to operate for twenty years or more in a terrestrial environment. Such a methodology, or solar cell life prediction model, requires the development of quantitative intermediate relationships between local environmental stress parameters and the basic chemical mechanisms of encapsulant aging leading to solar cell failures. The use of accelerated/abbreviated testing to develop these intermediate relationships and to reveal failure modes is discussed. Current field and demonstration tests of solar cell arrays and the present laboratory tests used to qualify solar module designs provide very little data applicable to predicting the long-term performance of encapsulated solar cells. An approach to enhancing the value of such field tests to provide data for life prediction is described.

  12. Geomagnetic main field modeling using magnetohydrodynamic constraints

    NASA Technical Reports Server (NTRS)

    Estes, R. H.

    1985-01-01

    The influence of physical constraints that may be approximately satisfied by the Earth's liquid core on models of the geomagnetic main field and its secular variation is investigated. A previous report describes the methodology used to incorporate nonlinear equations of constraint into the main field model. The application of that methodology to the GSFC 12/83 field model, to test the frozen-flux hypothesis and to assess the usefulness of incorporating magnetohydrodynamic constraints for obtaining improved geomagnetic field models, is described.

  13. Education Longitudinal Study of 2002 (ELS:2002/12) Third Follow-up Field Test Report. Working Paper Series. NCES 2012-03

    ERIC Educational Resources Information Center

    Ingels, Steven J.; Pratt, Daniel J.; Jewell, Donna M.; Mattox, Tiffany; Dalton, Ben; Rosen, Jeffrey; Lauff, Erich; Hill, Jason

    2012-01-01

    This report describes the methodologies and results of the third follow-up Education Longitudinal Study of 2002 (ELS:2002/12) field test which was conducted in the summer of 2011. The field test report is divided into six chapters: (1) Introduction; (2) Field Test Survey Design and Preparation; (3) Data Collection Procedures and Results; (4) Field…

  14. Policy Forum: Studying Eyewitness Investigations in the Field

    PubMed Central

    Dawes, Robyn; Jacoby, Larry L.; Kahneman, Daniel; Lempert, Richard; Roediger, Henry L.; Rosenthal, Robert

    2007-01-01

    This article considers methodological issues arising from recent efforts to provide field tests of eyewitness identification procedures. We focus in particular on a field study (Mecklenburg 2006) that examined the “double blind, sequential” technique, and consider the implications of an acknowledged methodological confound in the study. We explain why the confound has severe consequences for assessing the real-world implications of this study. PMID:17610149

  15. Overdiagnosis across medical disciplines: a scoping review.

    PubMed

    Jenniskens, Kevin; de Groot, Joris A H; Reitsma, Johannes B; Moons, Karel G M; Hooft, Lotty; Naaktgeboren, Christiana A

    2017-12-27

    To provide insight into how and in what clinical fields overdiagnosis is studied and give directions for further applied and methodological research. Scoping review. Medline up to August 2017. All English studies on humans, in which overdiagnosis was discussed as a dominant theme. Studies were assessed on clinical field, study aim (ie, methodological or non-methodological), article type (eg, primary study, review), the type and role of diagnostic test(s) studied and the context in which these studies discussed overdiagnosis. From 4896 studies, 1851 were included for analysis. Half of all studies on overdiagnosis were performed in the field of oncology (50%). Other prevalent clinical fields included mental disorders, infectious diseases and cardiovascular diseases, accounting for 9%, 8% and 6% of studies, respectively. Overdiagnosis was addressed from a methodological perspective in 20% of studies. Primary studies were the most common article type (58%). The diagnostic tests most commonly studied were imaging tests (32%), although these were predominantly seen in oncology and cardiovascular disease (84%). Diagnostic tests were studied in a screening setting in 43% of all studies, but in as many as 75% of oncological studies. The context in which studies addressed overdiagnosis related most frequently to its estimation, accounting for 53%. Methodology on overdiagnosis estimation and definition provided a source for extensive discussion. Other contexts of discussion included definition of disease, overdiagnosis communication, trends in increasing disease prevalence, drivers and consequences of overdiagnosis, incidental findings and genomics. Overdiagnosis is discussed across virtually all clinical fields and in different contexts. The variability in characteristics between studies and the lack of consensus on overdiagnosis definition indicate the need for a uniform typology to improve coherence and comparability of studies on overdiagnosis. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Methodology to Assess No Touch Audit Software Using Field Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jie; Braun, James E.; Langner, M. Rois

    The research presented in this report builds upon these previous efforts and proposes a set of tests to assess no touch audit tools using real utility bill and on-site data. The proposed assessment methodology explicitly investigates the behavior of the monthly energy end uses with respect to outdoor temperature, i.e., the building energy signature, to help understand the tool's disaggregation accuracy. The project team collaborated with Field Diagnostic Services, Inc. (FDSI) to identify appropriate test sites for the evaluation.
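
    The "building energy signature" mentioned above can be made concrete with a small example. The following is a minimal sketch, assuming a standard three-parameter cooling change-point model fit with SciPy; the monthly data and parameter values are illustrative, not taken from the report.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def three_param_cooling(t_out, base, slope, balance):
        """3-parameter cooling change-point model: a flat base load below
        the balance temperature, growing linearly above it."""
        return base + slope * np.maximum(t_out - balance, 0.0)

    # Illustrative monthly data: mean outdoor temperature (C), energy use (kWh).
    t_out = np.array([2.0, 4.0, 9.0, 14.0, 19.0, 24.0, 27.0, 26.0, 21.0, 15.0, 8.0, 3.0])
    energy = np.array([510, 505, 500, 515, 560, 640, 700, 685, 590, 520, 505, 500])

    params, _ = curve_fit(three_param_cooling, t_out, energy, p0=[500.0, 20.0, 18.0])
    base, slope, balance = params
    print(f"base load ~ {base:.0f} kWh/month, cooling slope ~ {slope:.1f} kWh/C, "
          f"balance point ~ {balance:.1f} C")
    ```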

  17. Evaluation of the Field Test of Project Information Packages: Volume III--Resource Cost Analysis.

    ERIC Educational Resources Information Center

    Al-Salam, Nabeel; And Others

    The third of three volumes evaluating the first year field test of the Project Information Packages (PIPs) provides a cost analysis study as a key element in the total evaluation. The resource approach to cost analysis is explained and the specific resource methodology used in the main cost analysis of the 19 PIP field-test projects detailed. The…

  18. Effects of surface chemistry on hot corrosion life

    NASA Technical Reports Server (NTRS)

    Fryxell, R. E.; Leese, G. E.

    1985-01-01

    This program has as its primary objective the development of a hot corrosion life prediction methodology based on a combination of laboratory test data and evaluation of field service turbine components that show evidence of hot corrosion. The laboratory program comprises burner rig testing by TRW. A summary of results is given for two series of burner rig tests. The life prediction methodology parameters to be appraised in a final campaign of burner rig tests are outlined.

  19. Paradigms of Evaluation in Natural Language Processing: Field Linguistics for Glass Box Testing

    ERIC Educational Resources Information Center

    Cohen, Kevin Bretonnel

    2010-01-01

    Although software testing has been well-studied in computer science, it has received little attention in natural language processing. Nonetheless, a fully developed methodology for glass box evaluation and testing of language processing applications already exists in the field methods of descriptive linguistics. This work lays out a number of…

  20. Development of the methodology of exhaust emissions measurement under RDE (Real Driving Emissions) conditions for non-road mobile machinery (NRMM) vehicles

    NASA Astrophysics Data System (ADS)

    Merkisz, J.; Lijewski, P.; Fuc, P.; Siedlecki, M.; Ziolkowski, A.

    2016-09-01

    The paper analyzes the exhaust emissions from farm vehicles based on research performed under field conditions (RDE) according to the NTE procedure. This analysis has shown that it is hard to meet the NTE requirements under field conditions (engine operation in the NTE zone for at least 30 seconds). Due to the very high variability of the engine conditions, the share of valid NTE windows in the field test is small throughout the entire test. For this reason, a modification of the measurement and exhaust emissions calculation methodology has been proposed for farm vehicles of the NRMM group. A test has been developed composed of the following phases: a trip to the operation site (paved roads) and field operations (including U-turns and maneuvering). The range of the operation time share in individual test phases has been determined. A change in the method of calculating the real exhaust emissions has also been implemented in relation to the NTE procedure.
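
    The 30-second NTE window requirement described above is easy to state algorithmically. Below is a minimal sketch of counting valid NTE windows in a 1 Hz engine trace; the simulated zone-membership flags, function name, and threshold handling are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    def count_valid_nte_windows(in_nte_zone, min_duration_s=30, sample_rate_hz=1):
        """Count contiguous runs where the engine stays in the NTE zone for
        at least min_duration_s; in_nte_zone is a boolean time series."""
        min_samples = min_duration_s * sample_rate_hz
        windows, run = 0, 0
        for flag in in_nte_zone:
            run = run + 1 if flag else 0
            if run == min_samples:   # count each qualifying run exactly once
                windows += 1
        return windows

    # Illustrative 1 Hz trace: the engine enters and leaves the NTE zone often,
    # so few runs reach the 30 s threshold -- the situation the paper describes.
    rng = np.random.default_rng(0)
    trace = rng.random(3600) < 0.7          # 70% of samples inside the zone
    print(count_valid_nte_windows(trace))
    ```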

  1. Novel optoelectronic methodology for testing of MOEMS

    NASA Astrophysics Data System (ADS)

    Pryputniewicz, Ryszard J.; Furlong, Cosme

    2003-01-01

    Continued demands for delivery of high performance micro-optoelectromechanical systems (MOEMS) place unprecedented requirements on methods used in their development and operation. Metrology is a major and inseparable part of these methods. Optoelectronic methodology is an essential field of metrology. Due to its scalability, optoelectronic methodology is particularly suitable for testing of MOEMS, where measurements must be made with ever-increasing accuracy and precision. This was particularly evident during the last few years, characterized by miniaturization of devices, when requirements for measurements rapidly increased as emerging technologies introduced new products, especially optical MEMS. In this paper, a novel optoelectronic methodology for testing of MOEMS is described and its applications are illustrated with representative examples. These examples demonstrate the capability to measure submicron deformations of various components of the micromirror device under operating conditions, and show the viability of the optoelectronic methodology for testing of MOEMS.

  2. Testing flat plate photovoltaic modules for terrestrial environment

    NASA Technical Reports Server (NTRS)

    Hoffman, A. R.; Arnett, J. C.; Ross, R. G., Jr.

    1979-01-01

    New qualification tests have been developed for flat plate photovoltaic modules. Temperature cycling, cyclic pressure load, and humidity exposure are especially useful for detecting design and fabrication deficiencies. There is a positive correlation between many of the observed field effects, such as power loss, and qualification-test-induced degradation. The status of research efforts on the development of test methodology for field-related problems is reviewed.

  3. Towards standardized testing methodologies for optical properties of components in concentrating solar thermal power plants

    NASA Astrophysics Data System (ADS)

    Sallaberry, Fabienne; Fernández-García, Aránzazu; Lüpfert, Eckhard; Morales, Angel; Vicente, Gema San; Sutter, Florian

    2017-06-01

    Precise knowledge of the optical properties of the components used in the solar field of concentrating solar thermal power plants is essential to ensuring their optimum power production. These properties are measured and evaluated by different techniques and equipment, in laboratory conditions and/or in the field. Standards for such measurements and an international consensus on the appropriate techniques are in preparation. The reference materials used as standards for the calibration of the equipment are under discussion. This paper summarizes current testing methodologies and guidelines for the characterization of the optical properties of solar mirrors and absorbers.

  4. SMART empirical approaches for predicting field performance of PV modules from results of reliability tests

    NASA Astrophysics Data System (ADS)

    Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata

    2016-09-01

    Gaining an understanding of degradation mechanisms and their characterization is critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, the Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely, Damp Heat and Thermal Cycling. The method is based on the design of an accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix, a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data become available. While the demonstration of the method in this work is for thin film flexible PV modules, the framework and methodology can be adapted to other PV products.
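
    The abstract does not give the form of the acceleration factor models, so the sketch below uses two standard literature forms often applied to the same two tests: the Peck humidity model for damp heat and the Coffin-Manson relation for thermal cycling. All exponents, the activation energy, and the test/field conditions are illustrative assumptions, not values from the paper.

    ```python
    import math

    def peck_af(rh_test, rh_field, t_test_k, t_field_k, n=2.7, ea_ev=0.79):
        """Peck humidity-model acceleration factor (chamber vs. field).
        n and Ea are illustrative literature values, not from the paper."""
        k_b = 8.617e-5  # Boltzmann constant, eV/K
        humidity_term = (rh_test / rh_field) ** n
        arrhenius_term = math.exp(ea_ev / k_b * (1.0 / t_field_k - 1.0 / t_test_k))
        return humidity_term * arrhenius_term

    def coffin_manson_af(dt_test, dt_field, m=2.0):
        """Coffin-Manson acceleration factor for thermal-cycling fatigue."""
        return (dt_test / dt_field) ** m

    # 85C/85%RH damp heat vs. an assumed 25C/60%RH field condition.
    print(f"damp-heat AF ~ {peck_af(85, 60, 358.15, 298.15):.0f}")
    # -40..+85C chamber cycle vs. an assumed 30C daily field swing.
    print(f"thermal-cycling AF ~ {coffin_manson_af(125, 30):.0f}")
    ```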

  5. Advances in Educational and Psychological Testing: Theory and Applications. Evaluation in Education and Human Services Series.

    ERIC Educational Resources Information Center

    Hambleton, Ronald K., Ed.; Zaal, Jac N., Ed.

    The 14 chapters of this book focus on the technical advances, advances in applied settings, and emerging topics in the testing field. Part 1 discusses methodological advances, Part 2 considers developments in applied settings, and Part 3 reviews emerging topics in the field of testing. Part 1 papers include: (1) "Advances in…

  6. Policy Implications for Continuous Employment Decisions of High School Principals: An Alternative Methodological Approach for Using High-Stakes Testing Outcomes

    ERIC Educational Resources Information Center

    Young, I. Phillip; Fawcett, Paul

    2013-01-01

    Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions for principals. These models are reviewed, and specific flaws are noted if these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specifically addressed are…

  7. A Methodology for Surface Soil Moisture and Vegetation Optical Depth Retrieval Using the Microwave Polarization Difference Index

    NASA Technical Reports Server (NTRS)

    Owe, Manfred; deJeu, Richard; Walker, Jeffrey; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    A methodology for retrieving surface soil moisture and vegetation optical depth from satellite microwave radiometer data is presented. The procedure is tested with historical 6.6 GHz brightness temperature observations from the Scanning Multichannel Microwave Radiometer over several test sites in Illinois. Results using only nighttime data are presented at this time, due to the greater stability of nighttime surface temperature estimation. The methodology uses a radiative transfer model to solve for surface soil moisture and vegetation optical depth simultaneously using a non-linear iterative optimization procedure. It assumes known constant values for the scattering albedo and roughness. Surface temperature is derived by a procedure using high frequency vertically polarized brightness temperatures. The methodology does not require any field observations of soil moisture or canopy biophysical properties for calibration purposes and is totally independent of wavelength. Results compare well with field observations of soil moisture and satellite-derived vegetation index data from optical sensors.
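
    A minimal sketch of the simultaneous two-parameter retrieval described above, assuming a simplified tau-omega radiative transfer model with a fixed scattering albedo and a nonlinear least-squares solver. The model form, the H/V emissivity offset, and all numbers are illustrative stand-ins, not the paper's actual formulation.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def tau_omega_tb(emissivity, tau, t_surf, omega=0.05, theta_deg=50.0):
        """Simplified tau-omega brightness temperature for one polarization;
        omega (scattering albedo) is held fixed, as in the abstract."""
        gamma = np.exp(-tau / np.cos(np.radians(theta_deg)))   # canopy transmissivity
        tb_soil = emissivity * t_surf * gamma
        tb_veg = (1.0 - omega) * t_surf * (1.0 - gamma) * (1.0 + (1.0 - emissivity) * gamma)
        return tb_soil + tb_veg

    t_surf = 290.0                       # surface temperature, K (derived separately)
    observed = np.array([255.0, 270.0])  # H- and V-pol brightness temperatures, K

    def residuals(x):
        e_h, tau = x
        e_v = min(e_h + 0.08, 0.99)      # crude H/V emissivity offset, illustrative only
        model = np.array([tau_omega_tb(e_h, tau, t_surf),
                          tau_omega_tb(e_v, tau, t_surf)])
        return model - observed

    sol = least_squares(residuals, x0=[0.8, 0.3], bounds=([0.5, 0.0], [0.99, 2.0]))
    print(f"retrieved emissivity(H) ~ {sol.x[0]:.2f}, optical depth ~ {sol.x[1]:.2f}")
    ```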

  8. NEPP Update of Independent Single Event Upset Field Programmable Gate Array Testing

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; Label, Kenneth; Campola, Michael; Pellish, Jonathan

    2017-01-01

    This presentation provides a NASA Electronic Parts and Packaging (NEPP) Program update of independent Single Event Upset (SEU) Field Programmable Gate Array (FPGA) testing including FPGA test guidelines, Microsemi RTG4 heavy-ion results, Xilinx Kintex-UltraScale heavy-ion results, Xilinx UltraScale+ single event effect (SEE) test plans, development of a new methodology for characterizing SEU system response, and NEPP involvement with FPGA security and trust.

  9. The Leeb Hardness Test for Rock: An Updated Methodology and UCS Correlation

    NASA Astrophysics Data System (ADS)

    Corkum, A. G.; Asiri, Y.; El Naggar, H.; Kinakin, D.

    2018-03-01

    The Leeb hardness test (LHT, with test value L_D) is a rebound hardness test, originally developed for metals, that has been correlated with the unconfined compressive strength (test value σ_c) of rock by several authors. The tests can be carried out rapidly, conveniently and nondestructively on core and block samples or on rock outcrops. This makes the relatively small LHT device convenient for field tests. The present study compiles test data from literature sources and presents new laboratory testing carried out by the authors to develop a substantially expanded database with wide-ranging rock types. In addition, the number of impacts that should be averaged to comprise a "test result" was revisited, along with the issue of test specimen size. A correlation between L_D and σ_c for various rock types is provided, along with a recommended testing methodology. The accuracy of correlated σ_c estimates was assessed, and reasonable correlations were observed between L_D and σ_c. The study findings show that the LHT can be useful particularly for field estimation of σ_c and offers a significant improvement over the conventional field estimation methods outlined by the ISRM (e.g., hammer blows). The test is rapid and simple, with relatively low equipment costs, and provides a reasonably accurate estimate of σ_c.
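
    As a rough illustration of the kind of L_D-σ_c correlation the study describes, the sketch below fits a power law to made-up (L_D, σ_c) pairs and applies it to the mean of repeated impacts at one test spot. The functional form and all values are assumptions, not the paper's published correlation.

    ```python
    import numpy as np

    # Made-up (Leeb hardness L_D, UCS sigma_c in MPa) pairs, standing in for
    # the compiled multi-rock-type database described in the paper.
    l_d = np.array([350, 420, 480, 540, 610, 680, 740, 800])
    ucs = np.array([18, 30, 45, 60, 95, 130, 170, 220])

    # Fit a power law sigma_c = a * L_D**b via linear regression in log-log space.
    b, log_a = np.polyfit(np.log(l_d), np.log(ucs), 1)
    a = np.exp(log_a)
    print(f"sigma_c ~ {a:.2e} * L_D^{b:.2f}")

    # Field-style estimate: average repeated impacts at one spot, then convert.
    readings = np.array([585, 602, 596, 610, 589, 600, 594, 607, 599, 603])
    l_d_mean = readings.mean()
    print(f"estimated UCS ~ {a * l_d_mean**b:.0f} MPa from mean L_D = {l_d_mean:.0f}")
    ```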

  10. Testing the methodology for dosimetry audit of heterogeneity corrections and small MLC-shaped fields: Results of IAEA multi-center studies

    PubMed Central

    Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S.; Thwaites, David I.; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar

    2016-01-01

    The International Atomic Energy Agency (IAEA) has a long tradition of supporting development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995, assisting national external audit groups in developing national audit programs. The CRP ‘Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques’ was conducted in 2009–2012 as an extension of previously developed audit programs. Material and methods. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments, such as IMRT. The project involved development of a new solid slab phantom with heterogeneities containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. Results. The results of multi-center testing of the methodology for two steps of dosimetry audit show that the design of the audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% of TLD results in heterogeneity situations obtained in the study were within 3%, and all results within 5%, agreement with the TPS-predicted doses. In contrast, only 64% of small beam profiles were within 3 mm agreement between the TPS-calculated and film-measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Discussion. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved. Using the experience of these studies, the participants could incorporate the auditing procedures in their national programs. PMID:26934916
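
    The agreement statistics quoted in the results (97% of TLD readings within 3%, all within 5% of TPS doses) reduce to a few lines of arithmetic. A minimal sketch follows, with invented dose values; scoring the tolerance as a relative deviation is an assumption about how such audits are tallied.

    ```python
    import numpy as np

    def agreement_fraction(measured, predicted, tolerance):
        """Fraction of measurements whose relative deviation from the TPS
        prediction is within the given tolerance (e.g. 0.03 for 3%)."""
        rel_dev = np.abs(measured - predicted) / predicted
        return float(np.mean(rel_dev <= tolerance))

    # Illustrative TLD doses (Gy) vs. TPS-predicted doses at the same points.
    tps = np.array([2.00, 1.95, 2.10, 1.80, 2.05, 1.90, 2.00, 1.85])
    tld = np.array([2.03, 1.93, 2.08, 1.86, 2.04, 1.92, 1.97, 1.84])

    print(f"within 3%: {agreement_fraction(tld, tps, 0.03):.0%}")
    print(f"within 5%: {agreement_fraction(tld, tps, 0.05):.0%}")
    ```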

  11. Testing the methodology for dosimetry audit of heterogeneity corrections and small MLC-shaped fields: Results of IAEA multi-center studies.

    PubMed

    Izewska, Joanna; Wesolowska, Paulina; Azangwe, Godfrey; Followill, David S; Thwaites, David I; Arib, Mehenna; Stefanic, Amalia; Viegas, Claudio; Suming, Luo; Ekendahl, Daniela; Bulski, Wojciech; Georg, Dietmar

    2016-07-01

    The International Atomic Energy Agency (IAEA) has a long tradition of supporting development of methodologies for national networks providing quality audits in radiotherapy. A series of co-ordinated research projects (CRPs) has been conducted by the IAEA since 1995, assisting national external audit groups in developing national audit programs. The CRP 'Development of Quality Audits for Radiotherapy Dosimetry for Complex Treatment Techniques' was conducted in 2009-2012 as an extension of previously developed audit programs. The CRP work described in this paper focused on developing and testing two steps of dosimetry audit: verification of heterogeneity corrections, and treatment planning system (TPS) modeling of small MLC fields, which are important for the initial stages of complex radiation treatments, such as IMRT. The project involved development of a new solid slab phantom with heterogeneities containing special measurement inserts for thermoluminescent dosimeters (TLD) and radiochromic films. The phantom and the audit methodology have been developed at the IAEA and tested in multi-center studies involving the CRP participants. The results of multi-center testing of the methodology for two steps of dosimetry audit show that the design of the audit procedures is adequate and the methodology is feasible for meeting the audit objectives. A total of 97% of TLD results in heterogeneity situations obtained in the study were within 3%, and all results within 5%, agreement with the TPS-predicted doses. In contrast, only 64% of small beam profiles were within 3 mm agreement between the TPS-calculated and film-measured doses. Film dosimetry results have highlighted some limitations in TPS modeling of small beam profiles in the direction of MLC leaf movements. Through multi-center testing, any challenges or difficulties in the proposed audit methodology were identified, and the methodology improved. Using the experience of these studies, the participants could incorporate the auditing procedures in their national programs.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities, coupled with innovative deployment, processing, and analysis methodologies, to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design and utilization, in full 3D finite difference modeling, and in statistical characterization of geological heterogeneity. Such capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capability to detect and locate in-tunnel explosions for mine safety and other applications.

  13. Estimating breeding proportions and testing hypotheses about costs of reproduction with capture-recapture data

    USGS Publications Warehouse

    Nichols, James D.; Hines, James E.; Pollock, Kenneth H.; Hinz, Robert L.; Link, William A.

    1994-01-01

    The proportion of animals in a population that breeds is an important determinant of population growth rate. Usual estimates of this quantity from field sampling data assume that the probability of appearing in the capture or count statistic is the same for animals that do and do not breed. A similar assumption is required by most existing methods used to test ecologically interesting hypotheses about reproductive costs using field sampling data. However, in many field sampling situations breeding and nonbreeding animals are likely to exhibit different probabilities of being seen or caught. In this paper, we propose the use of multistate capture-recapture models for these estimation and testing problems. This methodology permits a formal test of the hypothesis of equal capture/sighting probabilities for breeding and nonbreeding individuals. Two estimators of breeding proportion (and associated standard errors) are presented, one for the case of equal capture probabilities and one for the case of unequal capture probabilities. The multistate modeling framework also yields formal tests of hypotheses about reproductive costs to future reproduction or survival or both fitness components. The general methodology is illustrated using capture-recapture data on female meadow voles, Microtus pennsylvanicus. Resulting estimates of the proportion of reproductively active females showed strong seasonal variation, as expected, with low breeding proportions in midwinter. We found no evidence of reproductive costs extracted in subsequent survival or reproduction. We believe that this methodological framework has wide application to problems in animal ecology concerning breeding proportions and phenotypic reproductive costs.
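
    The core correction the multistate approach formalizes can be shown in miniature: scale the observed counts of breeders and non-breeders by their respective capture probabilities before forming the proportion. The sketch below is a deliberately simplified stand-in for the paper's multistate capture-recapture estimators; the counts and probabilities are invented.

    ```python
    def breeding_proportion(n_breeders_seen, n_nonbreeders_seen, p_breeder, p_nonbreeder):
        """Detection-corrected breeding proportion: observed counts are scaled
        by their capture probabilities before forming the ratio. With equal
        probabilities this reduces to the raw observed fraction."""
        breeders = n_breeders_seen / p_breeder
        nonbreeders = n_nonbreeders_seen / p_nonbreeder
        return breeders / (breeders + nonbreeders)

    # Raw counts: 60 breeders and 40 non-breeders captured.
    print(breeding_proportion(60, 40, 0.5, 0.5))   # equal detectability -> 0.60
    print(breeding_proportion(60, 40, 0.8, 0.4))   # breeders easier to catch -> ~0.43
    ```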

  14. Noise measurements of highway pavements in Texas.

    DOT National Transportation Integrated Search

    2009-10-01

    This report presents the results of noise testing performed on Texas pavements between May of 2006 and the summer of 2008. Two field test methodologies were used: roadside noise measurement with SPL meters and on-vehicle sound intensity measuremen...

  15. Methodological Capacity within the Field of "Educational Technology" Research: An Initial Investigation

    ERIC Educational Resources Information Center

    Bulfin, Scott; Henderson, Michael; Johnson, Nicola F.; Selwyn, Neil

    2014-01-01

    The academic study of educational technology is often characterised by critics as methodologically limited. In order to test this assumption, the present paper reports on data collected from a survey of 462 "research active" academic researchers working in the broad areas of educational technology and educational media. The paper…

  16. A methodology based on insecticide impregnated filter paper for monitoring resistance to deltamethrin in Triatoma infestans field populations.

    PubMed

    Remón, C; Lobbia, P; Zerba, E; Mougabure-Cueto, G

    2017-12-01

    The domiciliary presence of Triatoma infestans (Klug) (Hemiptera: Reduviidae) after control interventions has been reported in recent years. Toxicological studies showed high levels of resistance to pyrethroids, suggesting resistance as one of the main causes of deficient control. The aim of the present study was to develop a protocol to test resistance to deltamethrin in T. infestans collected from the field using a discriminating concentration. To evaluate field insects, the effect of age (early vs. late) and nutritional state (starved vs. fed) on the deltamethrin susceptibility of each developmental stage was studied. Topical application and insecticide-impregnated paper bioassays were used. Using the impregnated paper, susceptibility to deltamethrin was not affected by the age of the stadium or the nutritional state, and varied with post-exposure time and with developmental stage. A discriminating concentration of deltamethrin (0.36% w/v) impregnated in filter paper was established for all developmental stages. Finally, the methodology and the discriminating concentration were evaluated in the laboratory, showing high sensitivity in the discrimination of resistance. The present study developed a methodology of exposure to insecticide-impregnated papers and proposes a protocol for testing T. infestans field populations with the aim of detecting the early evolution of resistance to deltamethrin. © 2017 The Royal Entomological Society.

  17. 47 CFR 15.717 - TVBDs that rely on spectrum sensing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... under this section must demonstrate with an extremely high degree of confidence that they will not cause... § 0.459 of this chapter. This public notice will include proposed test procedures and methodologies. (ii) The Commission will conduct laboratory and field tests of the pre-production device. This testing...

  18. Enhanced CAX Architecture, Design and Methodology - SPHINX (Architecture, definition et methodologie ameliorees des exercices assistes par ordinateur (CAX) - SPHINX)

    DTIC Science & Technology

    2016-08-01

    ... Enhanced CAX Architecture, Design and Methodology – SPHINX (Architecture, définition et méthodologie améliorées des exercices assistés par ordinateur (CAX) – SPHINX), STO Technical Report TR-MSG-106 ... transition, application and field-testing, experimentation and a range of related scientific activities that include systems engineering, operational ...

  19. Assessment of change in knowledge about research methods among delegates attending research methodology workshop.

    PubMed

    Shrivastava, Manisha; Shah, Nehal; Navaid, Seema

    2018-01-01

    In an era of evidence-based medicine, research is an essential part of the medical profession, whether clinical or academic. A research methodology workshop is intended to help participants who are new to the research field as well as those already doing empirical research. The present study was conducted to assess the changes in knowledge of the participants of a research methodology workshop through a structured questionnaire. With administrative and ethical approval, a four-day research methodology workshop was planned. The participants completed a structured questionnaire (pre-test) containing 20 multiple choice questions (Q1-Q20) related to the topics to be covered in the workshop before its commencement, and a similar post-test questionnaire after its completion. The mean values of pre- and post-test scores were calculated and the results were analyzed and compared. Of the total 153 delegates, 45 (29%) were males and 108 (71%) were females. 92 (60%) participants consented to fill the pre-test questionnaire and 68 (44%) filled the post-test questionnaire. The mean pre-test and post-test scores at 95% confidence interval were 7.62 (SD ±3.220) and 9.66 (SD ±2.477), respectively. The difference was found to be significant using a paired-sample t-test (P < 0.003). There was an increase in the knowledge of the delegates after attending the research methodology workshop. Participatory research methodology workshops are a good method of imparting knowledge, though the long-term effects need to be evaluated.
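
    A minimal sketch of the paired pre/post comparison reported above, using scipy.stats.ttest_rel. Since the per-delegate scores are not published, the paired scores here are simulated to roughly match the reported means and spreads.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Simulated paired scores (0-20 scale) for 68 delegates who completed both
    # tests; rough stand-ins for the study's unpublished per-delegate data.
    pre = np.clip(rng.normal(7.6, 3.2, 68).round(), 0, 20)
    post = np.clip((pre + rng.normal(2.0, 2.0, 68)).round(), 0, 20)

    t_stat, p_value = stats.ttest_rel(post, pre)
    print(f"pre mean {pre.mean():.2f}, post mean {post.mean():.2f}, "
          f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```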

  20. Development of interactive hypermedia software for high school biology: A research and development study

    NASA Astrophysics Data System (ADS)

    Alturki, Uthman T.

    The goal of this research was to research, design, and develop a hypertext program for students who study biology. The Ecology Hypertext Program was developed using Research and Development (R&D) methodology. The purpose of this study was to place the final "product", a CD-ROM for learning biology concepts, in the hands of teachers and students to help them in the learning and teaching process. The product was created through a cycle of literature review, needs assessment, development, and a cycle of field tests and revisions. I applied the ten steps of the R&D process suggested by Borg and Gall (1989), which consisted of: (1) literature review, (2) needs assessment, (3) planning, (4) development of the preliminary product, (5) preliminary field-testing, (6) preliminary revision, (7) main field-testing, (8) main revision, (9) final field-testing, and (10) final product revision. The literature review and needs assessment provided support and a foundation for designing the preliminary product, the Ecology Hypertext Program. Participants in the needs assessment joined a focus group discussion; they were a group of graduate students in education who affirmed the importance of designing this product. For the preliminary field test, the participants were a group of high school students studying biology, the potential users of the product. They reviewed the preliminary product and then filled out a questionnaire. Their feedback and suggestions were used to develop and improve the product in a step called preliminary revision. The second round of field testing was the main field test, in which the participants joined a focus group discussion. They were the same group who participated in the needs assessment task. They reviewed the revised product and then provided ideas and suggestions to improve it. Their feedback was categorized and implemented to develop the product in the main revision task. Finally, a group of science teachers participated in this study by reviewing the product and then filling out the questionnaire. Their suggestions were used to conduct the final step in the R&D methodology, the final product revision. The primary result of this study was the Ecology Hypertext Program. It represents a modest attempt to give students an opportunity to learn through an interactive hypertext program. In addition, the R&D methodology proved an ideal procedure for designing and developing new educational products and materials.

  1. Quality reporting of carotid intima-media thickness methodology; Current state of the science in the field of spinal cord injury.

    PubMed

    Hoskin, Jordan D; Miyatani, Masae; Craven, B Catharine

    2017-03-30

    Carotid intima-media thickness (cIMT) may be used increasingly as a cardiovascular disease (CVD) screening tool in individuals with spinal cord injury (SCI), as other routine invasive diagnostic tests are often unfeasible. However, variation in cIMT acquisition and analysis methods is an issue in the current published literature. The growth of the field depends on quality cIMT acquisition and analysis to ensure accurate reporting of CVD risk. The purpose of this study is to evaluate the quality of the reported methodology used to collect cIMT values in SCI. Data from 12 studies that measured cIMT in individuals with SCI were identified from the Medline, Embase and CINAHL databases. The quality of the reported methodologies was scored based on adherence to cIMT methodological guidelines abstracted from two consensus papers. Five studies were scored as 'moderate quality' in methodological reporting, having satisfied 9 to 11 of the 15 quality reporting criteria. The remaining seven studies were scored as 'low quality', having reported fewer than 9 of the 15 quality reporting criteria. No study had methodological reporting that was scored as 'high quality'. Overall, the reporting of quality methodology was poor in the published SCI literature. Greater adherence to current methodological guidelines is needed to advance the field of cIMT in SCI. Further research is necessary to refine cIMT acquisition and analysis guidelines to aid authors designing research and journals screening manuscripts for publication.

  2. Blade Testing Trends (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desmond, M.

    2014-08-01

    As an invited guest speaker, Michael Desmond presented on NREL's NWTC structural testing methods and capabilities at the 2014 Sandia Blade Workshop held on August 26-28, 2014 in Albuquerque, NM. Although dynamometer and field testing capabilities were mentioned, the presentation focused primarily on wind turbine blade testing, including descriptions and capabilities for accredited certification testing, historical methodology and technology deployment, and current research and development activities.

  3. Advanced biosensing methodologies developed for evaluating performance quality and safety of emerging biophotonics technologies and medical devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ilev, Ilko K.; Walker, Bennett; Calhoun, William; Hassan, Moinuddin

    2016-03-01

    Biophotonics is an emerging field in modern biomedical technology that has opened up new horizons for the transfer of state-of-the-art techniques from the areas of lasers, fiber optics and biomedical optics to the life sciences and medicine. This field continues to expand rapidly, with advanced developments across the entire spectrum of biomedical applications, ranging from fundamental "bench" laboratory studies to clinical patient "bedside" diagnostics and therapeutics. However, in order to translate these technologies to clinical device applications, the scientific and industrial community and the FDA face the requirement for a thorough evaluation and review of laser radiation safety and efficacy concerns. In many cases, however, the review process is complicated due to the lack of effective means and standard test methods to precisely analyze the safety and effectiveness of some of the newly developed biophotonics techniques and devices. There is, therefore, an immediate public health need for new test protocols, guidance documents and standard test methods to precisely evaluate the fundamental characteristics, performance quality and safety of these technologies and devices. Here, we will overview our recent developments of novel test methodologies for the safety and efficacy evaluation of some emerging biophotonics technologies and medical devices. These methodologies are based on integrating the advanced features of state-of-the-art optical sensor technologies and approaches such as high-resolution fiber-optic sensing, confocal and optical coherence tomography imaging, and infrared spectroscopy. The presentation will also illustrate some methodologies developed and implemented for testing intraocular lens implants, biochemical contamination of medical devices, ultrahigh-resolution nanoscopy, and femtosecond laser therapeutics.

  4. Probe compensation in cylindrical near-field scanning: A novel simulation methodology

    NASA Technical Reports Server (NTRS)

    Hussein, Ziad A.; Rahmat-Samii, Yahya

    1993-01-01

    Probe pattern compensation is essential in near-field scanning geometry, where there is a great need to know far-field patterns accurately over a wide angular range. This paper focuses on a novel formulation and computer simulation to determine the precise need for, and effect of, probe compensation in cylindrical near-field scanning. The methodology is applied to a linear test array antenna and the NASA scatterometer radar antenna. The formulation is based on representing the probe by its equivalent tangential magnetic currents. The interaction between the probe equivalent aperture currents and the test antenna fields is obtained through the application of a reciprocity theorem. This allows us to obtain the probe vector output pickup integral, which is proportional to the amplitude and phase of the electric field induced in the probe aperture with respect to its position relative to the test antenna. The integral is evaluated for each probe position at the required sampling points on a cylindrical near-field surface enclosing the antenna. The use of a hypothetical circular-aperture probe with a different radius permits us to derive closed-form expressions for its far-field radiation patterns. These results, together with the probe vector output pickup, allow us to perform computer-simulated synthetic measurements. The far-field patterns of the test antenna are formulated based on cylindrical wave expansions of both the probe and test antenna fields. In the limit as the probe radius becomes very small, the probe vector output is the direct response of the near-field at a point, and no probe compensation is needed. Useful results are generated comparing the far-field patterns of the test antenna constructed from the simulated near-field, with and without probe pattern compensation, against the exact results. These results are important since they clearly illustrate the angular range over which probe compensation is needed. It has been found that a probe with an aperture radius of 0.25λ, 0.5λ, or 1λ needs little probe compensation, if any, near the test antenna main beam. In addition, a probe with low directivity may provide a better signal-to-noise ratio than a highly directive one. This is evident in test antenna patterns without probe compensation at wide angles.

  5. Development and Field Test of an Audit Tool and Tracer Methodology for Clinician Assessment of Quality in End-of-Life Care.

    PubMed

    Bookbinder, Marilyn; Hugodot, Amandine; Freeman, Katherine; Homel, Peter; Santiago, Elisabeth; Riggs, Alexa; Gavin, Maggie; Chu, Alice; Brady, Ellen; Lesage, Pauline; Portenoy, Russell K

    2018-02-01

    Quality improvement in end-of-life care generally acquires data from charts or caregivers. "Tracer" methodology, which assesses real-time information from multiple sources, may provide complementary information. The objective of this study was to develop a valid brief audit tool that can guide assessment and rate care when used in a clinician tracer to evaluate the quality of care for the dying patient. To identify items for a brief audit tool, 248 items were created to evaluate overall quality, quality in specific content areas (e.g., symptom management), and specific practices. Collected into three instruments, these items were used to interview professional caregivers and evaluate the charts of hospitalized patients who died. Evidence that this information could be validly captured using a small number of items was obtained through factor analyses, canonical correlations, and group comparisons. A nurse manager field tested tracer methodology using candidate items to evaluate the care provided to other patients who died. The survey of 145 deaths provided chart data and data from 445 interviews (26 physicians, 108 nurses, 18 social workers, and nine chaplains). The analyses yielded evidence of construct validity for a small number of items, demonstrating significant correlations between these items and content areas identified as latent variables in factor analyses. Criterion validity was suggested by significant differences in the ratings on these items between the palliative care unit and other units. The field test evaluated 127 deaths, demonstrated the feasibility of tracer methodology, and informed reworking of the candidate items into the 14-item Tracer EoLC v1. The Tracer EoLC v1 can be used with tracer methodology to guide the assessment and rate the quality of end-of-life care. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  6. The methodological quality of diagnostic test accuracy studies for musculoskeletal conditions can be improved.

    PubMed

    Henschke, Nicholas; Keuerleber, Julia; Ferreira, Manuela; Maher, Christopher G; Verhagen, Arianne P

    2014-04-01

    To provide an overview of reporting and methodological quality in diagnostic test accuracy (DTA) studies in the musculoskeletal field and evaluate the use of the QUality Assessment of Diagnostic Accuracy Studies (QUADAS) checklist. A literature review identified all systematic reviews that evaluated the accuracy of clinical tests to diagnose musculoskeletal conditions and used the QUADAS checklist. Two authors screened all identified reviews and extracted data on the target condition, index tests, reference standard, included studies, and QUADAS items. A descriptive analysis of the QUADAS checklist was performed, along with Rasch analysis to examine the construct validity and internal reliability. A total of 19 systematic reviews were included, which provided data on individual items of the QUADAS checklist for 392 DTA studies. In the musculoskeletal field, uninterpretable or intermediate test results are commonly not reported, with 175 (45%) studies scoring "no" to this item. The proportion of studies fulfilling certain items varied from 22% (item 11) to 91% (item 3). The interrater reliability of the QUADAS checklist was good and Rasch analysis showed excellent construct validity and internal consistency. This overview identified areas where the reporting and performance of diagnostic studies within the musculoskeletal field can be improved. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. 75 FR 8646 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-25

    ... methodological research on questionnaire design and evaluation (split-ballot field tests, respondent debriefings... ASEC design. A secondary purpose is to compare estimates from the CPS and ACS test panels. Evaluations... and ACS production data, and to determine whether particular survey design features of the CPS ASEC...

  8. A Field-Based Aquatic Life Benchmark for Conductivity in Central Appalachian Streams (2010) (External Review Draft)

    EPA Science Inventory

    This report adapts the standard U.S. EPA methodology for deriving ambient water quality criteria. Rather than use toxicity test results, the adaptation uses field data to determine the loss of 5% of genera from streams. The method is applied to derive effect benchmarks for disso...

  9. Development of field-deployable instrumentation based on “antigen–antibody” reactions for detection of hemorrhagic disease in ruminants

    USDA-ARS?s Scientific Manuscript database

    Development of field-deployable methodology utilizing antigen–antibody reactions and the surface plasmon resonance (SPR) effect to provide a rapid diagnostic test for recognition of bluetongue virus (BTV) and epizootic hemorrhagic disease virus (EHDV) in wild and domestic ruminants is reported. ...

  10. Testing Mars-inspired operational strategies for semi-autonomous rovers on the Moon: The GeoHeuristic Operational Strategies Test in New Mexico.

    PubMed

    Yingst, R Aileen; Cohen, B A; Crumpler, L; Schmidt, M E; Schrader, C M

    2011-01-01

    We tested the science operational strategy used for the Mars Exploration Rover (MER) mission on Mars to determine its suitability for conducting remote geology on the Moon, via a field test at Cerro de Santa Clara, New Mexico. This region contains volcanic and sedimentary products from a variety of provenances, mimicking the variety that might be found at a lunar site such as South Pole-Aitken Basin. At each site a Science Team broke down observational "days" into a sequence of observations of features and targets of interest. The number, timing, and sequence of observations were chosen to mimic those used by the MERs when traversing. Images simulating high-resolution stereo and hand lens-scale images were taken using a professional SLR digital camera; multispectral and XRD data were acquired from samples to mimic the availability of geochemical data. A separate Tiger Team followed the Science Team and examined each site using traditional terrestrial field methods, facilitating comparison between what was revealed by human versus rover-inspired methods. We conclude from this field test that MER-inspired methodology is not conducive to utilizing all acquired data in a timely manner for the case of any lunar architecture that involves the acquisition of rover data in near real-time. We additionally conclude that a methodology similar to that used for MER can be adapted for use on the Moon if mission goals are focused on reconnaissance. If the goal is to locate and identify a specific feature or material, such as water ice, a different methodology will likely be needed.

  11. Testing Mars-inspired operational strategies for semi-autonomous rovers on the Moon: The GeoHeuristic Operational Strategies Test in New Mexico

    PubMed Central

    Yingst, R. Aileen; Cohen, B. A.; Crumpler, L.; Schmidt, M. E.; Schrader, C. M.

    2017-01-01

    Background We tested the science operational strategy used for the Mars Exploration Rover (MER) mission on Mars to determine its suitability for conducting remote geology on the Moon, via a field test at Cerro de Santa Clara, New Mexico. This region contains volcanic and sedimentary products from a variety of provenances, mimicking the variety that might be found at a lunar site such as South Pole-Aitken Basin. Method At each site a Science Team broke down observational “days” into a sequence of observations of features and targets of interest. The number, timing, and sequence of observations were chosen to mimic those used by the MERs when traversing. Images simulating high-resolution stereo and hand lens-scale images were taken using a professional SLR digital camera; multispectral and XRD data were acquired from samples to mimic the availability of geochemical data. A separate Tiger Team followed the Science Team and examined each site using traditional terrestrial field methods, facilitating comparison between what was revealed by human versus rover-inspired methods. Lessons Learned We conclude from this field test that MER-inspired methodology is not conducive to utilizing all acquired data in a timely manner for the case of any lunar architecture that involves the acquisition of rover data in near real-time. We additionally conclude that a methodology similar to that used for MER can be adapted for use on the Moon if mission goals are focused on reconnaissance. If the goal is to locate and identify a specific feature or material, such as water ice, a different methodology will likely be needed. PMID:29309066

  12. To Develop and Test Improved Procedures for the Development and Distribution of Quality Individualized Mediated Instructional Materials in Vocational Education. Final Report.

    ERIC Educational Resources Information Center

    State Fair Community Coll., Sedalia, MO.

    Five objectives are reported for a project to develop and test effective procedures for designing, field testing, reproducing, and disseminating individualized mediated instructional materials: (1) improvement of teacher input, (2) development of individualized instruction modules, (3) development of methodology for evaluating the effectiveness of…

  13. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  14. Towards Intelligent Interpretation of Low Strain Pile Integrity Testing Results Using Machine Learning Techniques.

    PubMed

    Cui, De-Mi; Yan, Weizhong; Wang, Xiao-Quan; Lu, Lie-Min

    2017-10-25

    Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed by experienced experts only. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can complete interpretation of all the piles and deliver the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening the LSPIT's turnaround time, are of great business value and in great demand. Motivated by this need, in this paper, we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist the experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts' interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology's effectiveness using the LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction.
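
    The screening step can be sketched in Python: a few hand-crafted features are extracted from each reflectogram and a classifier flags suspected piles for manual review. The synthetic signals, the feature set, and the model choice below are illustrative assumptions, not the CARI implementation.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        def make_reflectogram(defective, n=512):
            # Toy velocity trace: impact pulse plus toe echo; a defect
            # adds an intermediate reflection between the two.
            t = np.arange(n)
            sig = np.exp(-(t - 30) ** 2 / 50.0)            # impact pulse
            sig += 0.4 * np.exp(-(t - 400) ** 2 / 80.0)    # toe reflection
            if defective:
                sig += 0.3 * np.exp(-(t - 180) ** 2 / 60.0)
            return sig + 0.02 * rng.standard_normal(n)

        def features(sig):
            # Simple descriptors of the region between impact and toe echo
            mid = sig[60:360]
            return [mid.max(), mid.var(), np.abs(np.diff(mid)).sum()]

        X = [features(make_reflectogram(bool(lab))) for lab in (0, 1) for _ in range(200)]
        y = [lab for lab in (0, 1) for _ in range(200)]
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

        # Screening: only piles whose defect probability exceeds the
        # threshold go to the expert for in-depth interpretation.
        p = clf.predict_proba([features(make_reflectogram(True))])[0, 1]
        print("suspected" if p > 0.5 else "pass", round(p, 2))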

  15. Transmission line relay mis-operation detection based on time-synchronized field data

    DOE PAGES

    Esmaeilian, Ahad; Popovic, Tomo; Kezunovic, Mladen

    2015-05-04

    In this paper, a real-time tool to detect transmission line relay mis-operation is implemented. The tool uses time-synchronized measurements obtained from both ends of the line during disturbances. The proposed fault analysis tool comes into the picture only after the protective device has operated and tripped the line. The proposed methodology is able not only to detect, classify, and locate transmission line faults, but also to accurately confirm whether the line was tripped due to a mis-operation of protective relays. The analysis report includes either a detailed description of the fault type and location or detection of relay mis-operation. As such, it can be a source of very useful information to support system restoration. The focus of the paper is on the implementation requirements that allow practical application of the methodology, which is illustrated using field data obtained from a real power system. Testing and validation are done using field data recorded by digital fault recorders and protective relays. The test data included several hundred event records corresponding to both relay mis-operations and actual faults. The discussion of results addresses various challenges encountered during the implementation and validation of the presented methodology.
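
    The core two-ended calculation can be sketched with the textbook synchronized-phasor fault-location relation; this is a generic illustration, not the paper's full detection, classification, and mis-operation logic, and all phasor values are invented.

        # Two-ended fault location on a short line (series impedance only).
        # Both ends see the same fault-point voltage:
        #   V_S - m*Z*I_S = V_R - (1 - m)*Z*I_R   =>  solve for m in [0, 1]
        import numpy as np

        Z = 8.0 + 80.0j                                   # line impedance (ohm)
        V_S, I_S = 64e3 * np.exp(1j * 0.10), 900 * np.exp(-1j * 0.60)
        V_R, I_R = 61e3 * np.exp(1j * 0.05), 500 * np.exp(1j * 2.40)

        m = ((V_S - V_R + Z * I_R) / (Z * (I_S + I_R))).real

        if 0.0 <= m <= 1.0:
            print(f"fault at {m:.1%} of line length from S")
        else:
            # No consistent in-line fault: a hint that the trip may have
            # been a relay mis-operation rather than a genuine fault.
            print("no in-line fault -> possible mis-operation")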

  16. A methodology to enhance electromagnetic compatibility in joint military operations

    NASA Astrophysics Data System (ADS)

    Buckellew, William R.

    The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.

  17. Development of a Standardized Methodology for the Use of COSI-Corr Sub-Pixel Image Correlation to Determine Surface Deformation Patterns in Large Magnitude Earthquakes.

    NASA Astrophysics Data System (ADS)

    Milliner, C. W. D.; Dolan, J. F.; Hollingsworth, J.; Leprince, S.; Ayoub, F.

    2014-12-01

    Coseismic surface deformation is typically measured in the field by geologists and with a range of geophysical methods such as InSAR, LiDAR and GPS. Current methods, however, either fail to capture the near-field coseismic surface deformation pattern where vital information is needed, or lack pre-event data. We develop a standardized and reproducible methodology to fully constrain the surface, near-field, coseismic deformation pattern in high resolution using aerial photography. We apply our methodology using the program COSI-corr to successfully cross-correlate pairs of aerial, optical imagery before and after the 1992, Mw 7.3 Landers and 1999, Mw 7.1 Hector Mine earthquakes. This technique allows measurement of the coseismic slip distribution and magnitude and width of off-fault deformation with sub-pixel precision. This technique can be applied in a cost-effective manner for recent and historic earthquakes using archive aerial imagery. We also use synthetic tests to constrain and correct for the bias imposed on the result due to use of a sliding window during correlation. Correcting for artificial smearing of the tectonic signal allows us to robustly measure the fault zone width along a surface rupture. Furthermore, the synthetic tests have constrained for the first time the measurement precision and accuracy of estimated fault displacements and fault-zone width. Our methodology provides the unique ability to robustly understand the kinematics of surface faulting while at the same time accounting for both off-fault deformation and the measurement biases that typically complicate such data. For both earthquakes we find that our displacement measurements derived from cross-correlation are systematically larger than the field displacement measurements, indicating the presence of off-fault deformation. We show that the Landers and Hector Mine earthquakes accommodated 46% and 38% of displacement away from the primary rupture as off-fault deformation, over a mean deformation width of 183 m and 133 m, respectively. We envisage that correlation results derived from our methodology will provide vital data for near-field deformation patterns and will be of significant use for constraining inversion solutions for fault slip at depth.
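
    The underlying sub-pixel measurement can be illustrated with an open-source analogue of a frequency-domain correlator (COSI-Corr itself is a dedicated package); the image pair and offset here are synthetic.

        import numpy as np
        from scipy.ndimage import shift as nd_shift
        from skimage.registration import phase_cross_correlation

        rng = np.random.default_rng(1)
        pre = rng.random((128, 128))                  # pre-event patch
        post = nd_shift(pre, (0.37, -1.25), order=3)  # known sub-pixel offset

        # Shift required to register `post` onto `pre`, to 1/100 pixel
        offset, error, _ = phase_cross_correlation(pre, post, upsample_factor=100)
        print(offset)   # ~ [-0.37, 1.25]

    In a real workflow this correlation runs in a sliding window over orthorectified pre- and post-event images; the window size trades spatial resolution against the smearing of the tectonic signal that the synthetic tests above quantify and correct.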

  18. Mining and Reclamation Cooperative Education Program. Progress Report.

    ERIC Educational Resources Information Center

    Barnett, Carl D.

    The exemplary project was the cooperative effort of two schools in the western Kentucky coal fields to field test a program in mining and reclamation technology. Covering the first year of the project, the report describes the problem and scope of the study, the objectives pursued, the methodology, and the results obtained. The goal of the project…

  19. SEURAT: Safety Evaluation Ultimately Replacing Animal Testing – Recommendations for future research in the field of predictive toxicology

    EPA Science Inventory

    The development of non-animal methodology to evaluate the potential for a chemical to cause systemic toxicity is one of the grand challenges of modern science. The European research programme SEURAT is active in this field and will conclude its first phase, SEURAT-1, in December ...

  20. Full-field modal analysis during base motion excitation using high-speed 3D digital image correlation

    NASA Astrophysics Data System (ADS)

    Molina-Viedma, Ángel J.; López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2017-10-01

    In recent years, many efforts have been made to exploit full-field measurement optical techniques for modal identification. Three-dimensional digital image correlation using high-speed cameras has been extensively employed for this purpose. Modal identification algorithms are applied to process the frequency response functions (FRF), which relate the displacement response of the structure to the excitation force. However, one of the most common tests for modal analysis involves the base motion excitation of a structural element instead of force excitation. In this case, the relationship between response and excitation is typically based on displacements, which are known as transmissibility functions. In this study, a methodology for experimental modal analysis using high-speed 3D digital image correlation and base motion excitation tests is proposed. In particular, a cantilever beam was excited from its base with a random signal, using a clamped-edge joint. Full-field transmissibility functions were obtained through the beam and converted into FRF for proper identification, considering a single degree-of-freedom theoretical conversion. Subsequently, modal identification was performed using a circle-fit approach. The proposed methodology facilitates the management of the typically large number of data points involved in the DIC measurement during modal identification. Moreover, it was possible to determine the natural frequencies, damping ratios and full-field mode shapes without requiring any additional tests. Finally, the results were experimentally validated by comparing them with those obtained by employing traditional accelerometers, analytical models and finite element method analyses. The comparison was performed by using the quantitative indicator modal assurance criterion. The results showed a high level of correspondence, consolidating the proposed experimental methodology.
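
    For a single degree of freedom under base excitation, a standard conversion from transmissibility T(w) = X/Y to an FRF is H(w) = (1 - T(w))/w^2, which relates relative displacement to base acceleration; the sketch below applies it to a synthetic SDOF system and is an assumption about the form of conversion, not the authors' code.

        import numpy as np

        m, c, k = 0.2, 1.5, 8.0e4                # mass, damping, stiffness (SI)
        w = np.linspace(2 * np.pi * 5, 2 * np.pi * 300, 4000)
        den = k - m * w**2 + 1j * c * w

        T = (k + 1j * c * w) / den               # transmissibility X/Y
        H = (1.0 - T) / w**2                     # FRF: rel. displacement / base accel.

        f_n = w[np.argmax(np.abs(H))] / (2 * np.pi)   # peak-picked natural frequency
        print(f"f_n = {f_n:.1f} Hz (exact {np.sqrt(k / m) / (2 * np.pi):.1f} Hz)")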

  1. TACCDAS Testbed Human Factors Evaluation Methodology,

    DTIC Science & Technology

    1980-03-01

    TEST METHOD: development of performance criteria; test participant identification; control of... Major milestones involved in the evaluation process leading up to the evaluation of the complete testbed in the field are identified. Test methods and... inevitably will be different in several ways from the intended system as foreseen by the system designers. The system users provide insights into these

  2. Economic evaluation of medical tests at the early phases of development: a systematic review of empirical studies.

    PubMed

    Frempong, Samuel N; Sutton, Andrew J; Davenport, Clare; Barton, Pelham

    2018-02-01

    There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas Covered: A systematic review to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) analysis to provide information on the value of future research are often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances, but at the early development stage rather than the concept stage. Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e., VOI analysis, headroom analysis) should be better utilized. Concerted efforts are needed to develop rigorous methodology in this growing field to maximize the value and quality of such analyses.
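
    The VOI quantity the authors find underused reduces, in its simplest form, to the expected value of perfect information (EVPI); a Monte Carlo sketch with invented net-benefit distributions for a test-adoption decision:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Net monetary benefit per strategy under parameter uncertainty
        nb = np.column_stack([
            rng.normal(10_000, 1_500, n),   # current practice
            rng.normal(10_400, 3_000, n),   # adopt the new test
        ])

        ev_decide_now = nb.mean(axis=0).max()    # pick best strategy on averages
        ev_perfect = nb.max(axis=1).mean()       # pick after uncertainty resolves
        print(f"EVPI = {ev_perfect - ev_decide_now:.0f} per decision")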

  3. Ancient DNA studies: new perspectives on old samples

    PubMed Central

    2012-01-01

    In spite of past controversies, the field of ancient DNA is now a reliable research area, owing to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples to study the processes of evolution and to test models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example, human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611

  4. Rubble masonry response under cyclic actions: The experience of L’Aquila city (Italy)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonti, Roberta, E-mail: roberta.fonti@tum.de; Barthel, Rainer, E-mail: r.barthel@lrz.tu-muenchen.de; Formisano, Antonio, E-mail: antoform@unina.it

    2015-12-31

    Several methods of analysis are available in engineering practice to study old masonry constructions. Two commonly used approaches in the field of seismic engineering are global and local analyses. Despite several years of research in this field, the various methodologies suffer from a lack of comprehensive experimental validation. This is mainly due to the difficulty in simulating the many different kinds of masonry and, accordingly, the non-linear response under horizontal actions. This issue can be addressed by examining the local response of isolated panels under monotonic and/or alternate actions. Different testing methodologies are commonly used to identify the local response of old masonry. These range from simplified pull-out tests to sophisticated in-plane monotonic tests. However, there is a lack of both knowledge and critical comparison between experimental validations and numerical simulations. This is mainly due to the difficulties in implementing irregular settings within both simplified and advanced numerical analyses. Similarly, the simulation of degradation effects within laboratory tests is difficult with respect to old masonry in-situ boundary conditions. Numerical models, particularly of rubble masonry, are commonly simplified. They are mainly based on a kinematic chain of rigid blocks able to reproduce the different “modes of damage” of structures subjected to horizontal actions. This paper presents an innovative methodology for testing; its aim is to identify a simplified model for the out-of-plane response of rubblework consistent with the experimental evidence. The case study of the L’Aquila district is discussed.

  5. Computational Analyses in Support of Sub-scale Diffuser Testing for the A-3 Facility. Part 3; Aero-Acoustic Analyses and Experimental Validation

    NASA Technical Reports Server (NTRS)

    Allgood, Daniel C.; Graham, Jason S.; McVay, Greg P.; Langford, Lester L.

    2008-01-01

    A unique assessment of acoustic similarity scaling laws and acoustic analogy methodologies for predicting the far-field acoustic signature of a sub-scale altitude rocket test facility at the NASA Stennis Space Center was performed. A directional, point-source similarity analysis was implemented for predicting the acoustic far-field. In this approach, experimental acoustic data obtained from "similar" rocket engine tests were appropriately scaled using key geometric and dynamic parameters. The accuracy of this engineering-level method is discussed by comparing the predictions with the acoustic far-field measurements obtained. In addition, a CFD solver was coupled with a Lilley acoustic analogy formulation to determine the improvement of using a physics-based methodology over an experimental correlation approach. In the current work, steady-state Reynolds-averaged Navier-Stokes calculations were used to model the internal flow of the rocket engine and altitude diffuser. These internal flow simulations provided the necessary realistic input conditions for external plume simulations. The CFD plume simulations were then used to provide the spatial turbulent noise source distributions in the acoustic analogy calculations. Preliminary findings of these studies will be discussed.

  6. Measuring the Continuum of Literacy Skills among Adults: Educational Testing and the LAMP Experience

    ERIC Educational Resources Information Center

    Guadalupe, Cesar; Cardoso, Manuel

    2011-01-01

    The field of educational testing has become increasingly important for providing different stakeholders and decision-makers with information. This paper discusses basic standards for methodological approaches used in measuring literacy skills among adults. The authors address the increasing interest in skills measurement, the discourses on how…

  7. Borate protection of softwood from Coptotermes acinaciformis (Isoptera: Rhinotermitidae) damage: variation in protection thresholds explained.

    PubMed

    Peters, Brenton C; Fitzgerald, Christopher J

    2006-10-01

    Laboratory and field data reported in the literature are confusing with regard to "adequate" protection thresholds for borate timber preservatives. The confusion is compounded by differences in termite species, timber species, and test methodology. Laboratory data indicate that a borate retention of 0.5% mass/mass (m/m) boric acid equivalent (BAE) would cause > 90% termite mortality and restrict mass loss in test specimens to ≤ 5%. Field data generally suggest that borate retentions appreciably > 0.5% m/m BAE are required. We report two field experiments with varying amounts of untreated feeder material in which the responses of Coptotermes acinaciformis (Froggatt) (Isoptera: Rhinotermitidae) to borate-treated radiata (Monterey) pine, Pinus radiata D. Don, were measured. The apparently conflicting results between laboratory and field data are explained by the presence or absence of untreated feeder material in the test environment. In the absence of untreated feeder material, wood containing 0.5% BAE provided adequate protection from Coptotermes sp., whereas in the presence of untreated feeder material, increased retentions were required. Furthermore, the retentions required increased with increased amounts of susceptible material present. Some termites, Nasutitermes sp. and Mastotermes darwiniensis Froggatt, for example, are borate-tolerant, and borate timber preservatives are not a viable management option with these species. The lack of uniform standards for termite test methodology and assessment criteria for efficacy across the world is recognized as a difficulty for research into the performance of timber preservatives against termites. The many variables in laboratory and field assays make "prescriptive" standards difficult to recommend. The use of "performance" standards to define efficacy criteria ("adequate" protection) is discussed.

  8. Site characterization methodology for aquifers in support of bioreclamation activities. Volume 2: Borehole flowmeter technique, tracer tests, geostatistics and geology. Final report, August 1987-September 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S.C.

    1993-08-01

    This report discusses a field demonstration of a methodology for characterizing an aquifer's geohydrology in the detail required to design an optimum network of wells and/or infiltration galleries for bioreclamation systems. The project work was conducted on a 1-hectare test site at Columbus AFB, Mississippi. The technical report is divided into two volumes. Volume I describes the test site and the well network, the assumptions and application of the equations that define groundwater flow to a well, the results of three large-scale aquifer tests, and the results of 160 single-pump tests. Volume II describes the borehole flowmeter tests, the tracer tests, the geological investigations, the geostatistical analysis, and the guidelines for using groundwater models to design bioreclamation systems. Keywords: site characterization; hydraulic conductivity; groundwater flow; geostatistics; geohydrology; monitoring wells.

  9. Real-time management of an urban groundwater well field threatened by pollution.

    PubMed

    Bauser, Gero; Franssen, Harrie-Jan Hendricks; Kaiser, Hans-Peter; Kuhlmann, Ulrich; Stauffer, Fritz; Kinzelbach, Wolfgang

    2010-09-01

    We present an optimal real-time control approach for the management of drinking water well fields. The methodology is applied to the Hardhof field in the city of Zurich, Switzerland, which is threatened by diffuse pollution. The risk of attracting pollutants is higher if the pumping rate is increased and can be reduced by increasing artificial recharge (AR) or by adaptive allocation of the AR. The method was first tested in offline simulations with a three-dimensional finite element variably saturated subsurface flow model for the period January 2004-August 2005. The simulations revealed that (1) optimal control results were more effective than the historical control results and (2) the spatial distribution of AR should be different from the historical one. Next, the methodology was extended to a real-time control method based on the Ensemble Kalman Filter method, using 87 online groundwater head measurements, and tested at the site. The real-time control of the well field resulted in a decrease of the electrical conductivity of the water at critical measurement points which indicates a reduced inflow of water originating from contaminated sites. It can be concluded that the simulation and the application confirm the feasibility of the real-time control concept.
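
    The assimilation step behind such real-time control is the Ensemble Kalman Filter analysis; a minimal sketch with an invented head-state vector, the 87 gauges reduced to synthetic values, and an identity observation operator.

        import numpy as np

        rng = np.random.default_rng(3)
        n_state, n_obs, n_ens = 200, 87, 50

        X = rng.normal(400.0, 1.0, (n_state, n_ens))   # forecast heads (m)
        H = np.zeros((n_obs, n_state))
        H[np.arange(n_obs), np.arange(n_obs)] = 1.0    # observed nodes
        R = 0.01 * np.eye(n_obs)                       # obs error cov (m^2)
        y = rng.normal(400.0, 1.0, n_obs)              # measured heads (m)

        A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
        PHt = A @ (H @ A).T / (n_ens - 1)
        HPHt = (H @ A) @ (H @ A).T / (n_ens - 1)
        K = PHt @ np.linalg.inv(HPHt + R)              # Kalman gain

        # Perturbed observations, then the analysis update
        Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
        X_a = X + K @ (Y - H @ X)
        print("mean head update (m):", float((X_a - X).mean()))

    The updated ensemble then feeds the controller that reallocates pumping and artificial recharge before the next measurement cycle.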

  10. Laboratory and field testing of commercial rotational seismometers

    USGS Publications Warehouse

    Nigbor, R.L.; Evans, J.R.; Hutt, C.R.

    2009-01-01

    There are a small number of commercially available sensors to measure rotational motion in the frequency and amplitude ranges appropriate for earthquake motions on the ground and in structures. However, the performance of these rotational seismometers has not been rigorously and independently tested and characterized for earthquake monitoring purposes as is done for translational strong- and weak-motion seismometers. Quantities such as sensitivity, frequency response, resolution, and linearity are needed for the understanding of recorded rotational data. To address this need, we, with assistance from colleagues in the United States and Taiwan, have been developing performance test methodologies and equipment for rotational seismometers. In this article the performance testing methodologies are applied to samples of a commonly used commercial rotational seismometer, the eentec model R-1. Several samples were tested in various test sequences in 2006, 2007, and 2008. Performance testing of these sensors consisted of measuring: (1) sensitivity and frequency response; (2) clip level; (3) self noise and resolution; and (4) cross-axis sensitivity, both rotational and translational. These sensor-specific results will assist in understanding the performance envelope of the R-1 rotational seismometer, and the test methodologies can be applied to other rotational seismometers.
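
    Sensitivity and frequency response against a reference sensor on a common excitation table are commonly estimated with an H1 cross-spectral estimator; a sketch with a synthetic one-pole sensor model (the study's actual test equipment and procedures are not reproduced here).

        import numpy as np
        from scipy.signal import csd, welch, lfilter

        fs = 200.0
        rng = np.random.default_rng(5)
        x = rng.standard_normal(60_000)        # reference rotation-rate record

        # Pretend the test sensor responds with a one-pole low-pass filter
        y = lfilter([0.15], [1.0, -0.9], x) + 0.01 * rng.standard_normal(x.size)

        f, Sxy = csd(x, y, fs=fs, nperseg=4096)
        _, Sxx = welch(x, fs=fs, nperseg=4096)
        H1 = Sxy / Sxx                         # complex response: gain and phase
        print(f"low-frequency sensitivity ~ {abs(H1[1]):.2f}")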

  11. Factors Influencing the Adoption of Cloud Storage by Information Technology Decision Makers

    ERIC Educational Resources Information Center

    Wheelock, Michael D.

    2013-01-01

    This dissertation uses a survey methodology to determine the factors behind the decision to adopt cloud storage. The dependent variable in the study is the intent to adopt cloud storage. Four independent variables are utilized including need, security, cost-effectiveness and reliability. The survey includes a pilot test, field test and statistical…

  12. The CMC/3DPNS computer program for prediction of three-dimension, subsonic, turbulent aerodynamic juncture region flow. Volume 2: Users' manual

    NASA Technical Reports Server (NTRS)

    Manhardt, P. D.

    1982-01-01

    The CMC fluid mechanics program system was developed to translate finite element numerical solution methodology for nonlinear field problems into a versatile computer code for comprehensive flow field analysis. Data procedures for the CMC three-dimensional Parabolic Navier-Stokes (PNS) algorithm are presented. Along with the general data procedures, a juncture corner flow standard test case data deck is described. A listing of the data deck and an explanation of the grid generation methodology are presented. Tabulations of all commands and variables available to the user are given, in alphabetical order, with cross-reference numbers that refer to storage addresses.

  13. Open framework for objective evaluation of crater detection algorithms with first test-field subsystem based on MOLA data

    NASA Astrophysics Data System (ADS)

    Salamunićcar, G.; Lončarić, S.

    2008-07-01

    Crater Detection Algorithm (CDA) applications range from estimation of lunar/planetary surface age to autonomous landing on planets and asteroids and advanced statistical analyses. A large amount of work on CDAs has already been published. However, problems arise when evaluation results for a new CDA have to be compared with already published evaluation results. The problem is that different authors use different test-fields, different Ground-Truth (GT) catalogues, and even different methodologies for evaluation of their CDAs. Re-implementation of already published CDAs or their evaluation environments is a time-consuming and impractical solution to this problem. In addition, implementation details are often insufficiently described in publications. As a result, there is a need in the research community to develop a framework for objective evaluation of CDAs. A scientific question is how CDAs should be evaluated so that the results are easily and reliably comparable. In an attempt to solve this issue we first analyzed previously published work on CDAs. In this paper, we propose a framework for solution of the problem of objective CDA evaluation. The framework includes: (1) a definition of the measure for differences between craters; (2) test-field topography based on the 1/64° MOLA data; (3) the GT catalogue wherein each of 17,582 craters is aligned with MOLA data and confirmed with the catalogues by N.G. Barlow et al. and J.F. Rodionova et al.; (4) selection of a methodology for training and testing; and (5) Free-response Receiver Operating Characteristic (F-ROC) curves as a way to measure CDA performance. The handling of possible future improvements of the framework is additionally addressed as part of the discussion of results. Possible extensions are proposed as well: additional test-field subsystems based on visual images, data sets for other planets, and evaluation methodologies for CDAs developed for purposes other than cataloguing craters. The goal of the proposed framework is to contribute to the research community by establishing guidelines for objective evaluation of CDAs.
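
    One F-ROC point counts the fraction of GT craters detected against the number of false detections at a given confidence threshold; sweeping the threshold traces the curve. The matching rule below, a center distance plus size difference scaled by crater radius, is a simplified stand-in for the framework's measure.

        import numpy as np

        def froc_point(gt, det, scores, thr, tol=0.5):
            # gt, det: (n, 3) arrays of (x, y, radius); scores: confidences
            kept = det[scores >= thr]
            matched = np.zeros(len(gt), dtype=bool)
            false_det = 0
            for x, y, r in kept:
                d = np.hypot(gt[:, 0] - x, gt[:, 1] - y) / gt[:, 2]
                s = np.abs(gt[:, 2] - r) / gt[:, 2]
                cand = np.where((d + s < tol) & ~matched)[0]
                if cand.size:
                    matched[cand[0]] = True
                else:
                    false_det += 1
            return matched.mean(), false_det

        gt = np.array([[10, 10, 3.0], [40, 25, 6.0], [70, 70, 2.0]])
        det = np.array([[10.4, 9.8, 3.2], [41, 26, 5.5], [55, 55, 4.0]])
        scores = np.array([0.9, 0.8, 0.4])
        print(froc_point(gt, det, scores, thr=0.5))   # (0.67, 0)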

  14. Validations of Coupled CSD/CFD and Particle Vortex Transport Method for Rotorcraft Applications: Hover, Transition, and High Speed Flights

    NASA Technical Reports Server (NTRS)

    Anusonti-Inthra, Phuriwat

    2010-01-01

    This paper presents validations of a novel rotorcraft analysis that couples Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and Particle Vortex Transport Method (PVTM) methodologies. The CSD with its associated vehicle trim analysis is used to calculate blade deformations and trim parameters. The near-body CFD analysis is employed to provide detailed near-body flow field information, which is used to obtain high-fidelity blade aerodynamic loadings. The far-field, wake-dominated region is simulated using the PVTM analysis, which provides accurate prediction of the evolution of the rotor wake released from the near-body CFD domains. A loose coupling methodology between the CSD and CFD/PVTM modules is used, with appropriate information exchange amongst the CSD/CFD/PVTM modules. The coupled CSD/CFD/PVTM methodology is used to simulate various rotorcraft flight conditions (i.e., hover, transition, and high-speed flight), and the results are compared with several sets of experimental data. For the hover condition, the results are compared with hover data for the HART II rotor tested at the DLR Institute of Flight Systems, Germany. For the forward flight conditions, the results are validated with the UH-60A flight test data.

  15. Statistical Anomalies of Bitflips in SRAMs to Discriminate SBUs From MCUs

    NASA Astrophysics Data System (ADS)

    Clemente, Juan Antonio; Franco, Francisco J.; Villa, Francesca; Baylac, Maud; Rey, Solenne; Mecha, Hortensia; Agapito, Juan A.; Puchner, Helmut; Hubert, Guillaume; Velazco, Raoul

    2016-08-01

    Recently, the occurrence of multiple events in static tests has been investigated by checking the statistical distribution of the differences between the addresses of the words containing bitflips. That method has been successfully applied to Field Programmable Gate Arrays (FPGAs), and the original authors indicate that it is also valid for SRAMs. This paper presents a modified methodology based on checking the XOR of the addresses with bitflips, rather than their difference. Irradiation tests on CMOS 130 and 90 nm SRAMs with 14-MeV neutrons have been performed to validate this methodology. Results in high-altitude environments are also presented and cross-checked with theoretical predictions. In addition, this methodology has also been used to detect modifications in the organization of said memories. Theoretical predictions have been validated with actual data provided by the manufacturer.
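
    The XOR rule is easy to state concretely: with N faulty word addresses there are N(N-1)/2 pairwise XOR values spread over the whole address space, so under a pure-SBU (uniform random) model essentially no value should repeat; a value that repeats many times is the signature of multiple-cell events with a fixed topological offset. A synthetic sketch:

        from collections import Counter
        from itertools import combinations
        import numpy as np

        rng = np.random.default_rng(11)
        n_words = 1 << 17                        # 128K-word SRAM (invented)

        sbu = rng.choice(n_words, 60, replace=False)          # random SBUs
        seeds = rng.choice(n_words, 8, replace=False)
        mcu = np.concatenate([seeds, seeds ^ 0x4])            # MCU pairs, fixed offset

        addrs = np.unique(np.concatenate([sbu, mcu])).tolist()
        xor_counts = Counter(a ^ b for a, b in combinations(addrs, 2))

        # ~2300 pairs over ~131k possible XOR values: any count >> 1 is anomalous
        for value, cnt in xor_counts.most_common(3):
            print(hex(value), cnt)               # 0x4 tops the list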

  16. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, Part-II: Models And Procedures (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.

  17. Effects of surface chemistry on hot corrosion life

    NASA Technical Reports Server (NTRS)

    Fryxell, R. E.; Gupta, B. K.

    1984-01-01

    A hot corrosion life prediction methodology based on a combination of laboratory test data and field-service turbine components showing evidence of hot corrosion was examined. Components were evaluated by optical metallography, scanning electron microscopy (SEM), and electron microprobe (EMP) examination.

  18. Progress in sensor performance testing, modeling and range prediction using the TOD method: an overview

    NASA Astrophysics Data System (ADS)

    Bijl, Piet; Hogervorst, Maarten A.; Toet, Alexander

    2017-05-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide variety of well- and under-sampled imagers, systems with advanced image processing techniques such as dynamic super resolution and local adaptive contrast enhancement, and sensors showing smear or noise drift, for both static and dynamic test stimuli and as a function of target contrast. Recently, significant progress has been made in various directions. Dedicated visual and NIR test charts for lab and field testing are available and thermal test benches are on the market. Automated sensor testing using an objective synthetic human observer is within reach. Both an analytical and an image-based TOD model have recently been developed and are being implemented in the European Target Acquisition model ECOMOS and in the EOSTAR TDA. Further, the methodology is being applied for design optimization of high-end security camera systems. Finally, results from a recent perception study suggest that DRI ranges for real targets can be predicted by replacing the relevant distinctive target features by TOD test patterns of the same characteristic size and contrast, enabling a new TA modeling approach. This paper provides an overview.

  19. Methodology Investigation of AI(Artificial Intelligence) Test Officer Support Tool. Volume 1

    DTIC Science & Technology

    1989-03-01

    Glossary entries include AAAI (American Association for Artificial Intelligence), AI (artificial intelligence), and AMC (United States Army Materiel Command). Subject terms: artificial intelligence; expert systems; automated aids to testing. This report covers the application of artificial intelligence techniques to the problem of creating automated tools to

  20. Integrated Vehicle-Based Safety Systems (IVBSS) Light Vehicle Field Operational Test Independent Evaluation

    DOT National Transportation Integrated Search

    2011-10-01

    This report presents the methodology and results of the independent evaluation of a prototype integrated crash warning system for : light vehicles as part of the Integrated Vehicle-Based Safety Systems initiative of the United States Department of : ...

  1. Sheet metals characterization using the virtual fields method

    NASA Astrophysics Data System (ADS)

    Marek, Aleksander; Davis, Frances M.; Pierron, Fabrice

    2018-05-01

    In this work, a characterisation method involving a deep-notched specimen subjected to tensile loading is introduced. This specimen leads to heterogeneous states of stress and strain, the latter being measured using a stereo DIC system (MatchID). This heterogeneity enables the identification of multiple material parameters in a single test. In order to identify material parameters from the DIC data, an inverse method called the Virtual Fields Method is employed. Combined with recently developed sensitivity-based virtual fields, the method optimally locates the areas in the test where information about each material parameter is encoded, improving the accuracy of the identification over traditional user-defined virtual fields. It is shown that a single test performed at 45° to the rolling direction is sufficient to obtain all anisotropic plastic parameters, thus reducing the experimental effort involved in characterisation. The paper presents the methodology and some numerical validation.
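
    The method rests on the principle of virtual work, which for a quasi-static test with negligible body forces reads, in standard notation (a generic statement, not an equation reproduced from the paper):

        \int_V \boldsymbol{\sigma}(\boldsymbol{\varepsilon}; \theta) : \boldsymbol{\varepsilon}^*(\mathbf{u}^*) \, dV = \int_{\partial V} \mathbf{T} \cdot \mathbf{u}^* \, dS

    for every kinematically admissible virtual field u*. DIC supplies the strain field over V and the load cell the resultant of the boundary tractions, so each independent choice of u* yields one scalar equation in the constitutive parameters theta; sensitivity-based virtual fields choose u* to weight the zones where each parameter actually influences the stress.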

  2. A Pilot Study of a Picture- and Audio-Assisted Self-Interviewing Method (PIASI) for the Study of Sensitive Questions on HIV in the Field

    ERIC Educational Resources Information Center

    Aarnio, Pauliina; Kulmala, Teija

    2016-01-01

    Self-interview methods such as audio computer-assisted self-interviewing (ACASI) are used to improve the accuracy of interview data on sensitive topics in large trials. Small field studies on sensitive topics would benefit from methodological alternatives. In a study on male involvement in antenatal HIV testing in a largely illiterate population…

  3. Yield estimation of corn with multispectral data and the potential of using imaging spectrometers

    NASA Astrophysics Data System (ADS)

    Bach, Heike

    1997-05-01

    In the frame of the special yield estimation, a regular procedure conducted for the European Union to more accurately estimate agricultural yield, a project was conducted for the State Ministry for Rural Environment, Food and Forestry of Baden-Wuerttemberg, Germany, to test remote sensing data with advanced yield formation models for accuracy and timeliness of yield estimation of corn. The methodology employed uses field-based plant parameter estimation from atmospherically corrected multitemporal/multispectral LANDSAT-TM data. An agrometeorological plant-production model is used for yield prediction. Based solely on four LANDSAT-derived estimates and daily meteorological data, the grain yield of corn stands was determined for 1995. The modeled yield was compared with results independently gathered within the special yield estimation for 23 test fields in the Upper Rhine Valley. The agreement between LANDSAT-based estimates and the special yield estimation shows a relative error of 2.3 percent. The comparison of the results for single fields shows that, six weeks before harvest, the grain yield of single corn fields was estimated with a mean relative accuracy of 13 percent using satellite information. The presented methodology can be transferred to other crops and geographical regions. For future applications, hyperspectral sensors show great potential to further enhance the results of yield prediction with remote sensing.

  4. [Methods for evaluating diagnostic tests in Enfermedades Infecciosas y Microbiología Clínica].

    PubMed

    Ramos, J M; Hernández, I

    1998-04-01

    In the field of infectious diseases and clinical microbiology, the evaluation of diagnostic tests (DTs) is an important research area. The specific difficulties of this type of research have meant that it has not reached the methodological rigor of other areas of clinical research. This article aims to assess and characterize the methodology of articles about DTs published in the journal Enfermedades Infecciosas y Microbiología Clínica (EIMC). Forty-five articles published in EIMC during the 1990-1996 period that determined the sensitivity and specificity of different DTs were selected, and extensively accepted methodological standards were applied. In all articles except one (98%), the gold standard used was specified; however, 4 studies (9%) included the DT in the gold standard (incorporation bias). A correct description of the DT was reported in 75% of cases, but the reproducibility of the test was evaluated in only 11%. The source of the reference population, the inclusion criteria, and the spectrum of patients were described in 58%, 33%, and 40% of articles, respectively. Workup bias was present in 33% of the studies, only 6% reported blinded analysis of results, and 11% reported indeterminate test results. Half of the studies reported test indexes for clinical subgroups, only one article (2%) provided numerical precision for test indexes, and only 7% reported receiver operating characteristic curves. The methodological quality of DT research in EIMC could be improved in several aspects of design and presentation of results.
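
    The two indexes audited here come from a 2x2 table against the gold standard; since only one article reported numerical precision, a short sketch of the expected reporting, with invented counts and Wilson 95% intervals:

        from math import sqrt

        def wilson(k, n, z=1.96):
            p = k / n
            centre = (p + z * z / (2 * n)) / (1 + z * z / n)
            half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / (1 + z * z / n)
            return p, centre - half, centre + half

        tp, fn, tn, fp = 86, 14, 180, 20          # hypothetical 2x2 table
        for name, k, n in [("sensitivity", tp, tp + fn),
                           ("specificity", tn, tn + fp)]:
            p, lo, hi = wilson(k, n)
            print(f"{name} = {p:.2f} (95% CI {lo:.2f}-{hi:.2f})")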

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S; Larsen, S; Wagoner, J

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment was conducted that tested the full modeling-based approach to geological characterization using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and the need for geological heterogeneity to be accounted for in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32-million-node 3D model grid for E3D. Model linking issues were resolved and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed the reflection signal would be too small to be observed in the data due to trapped and attenuated energy in the weathered layer.
An analysis using only the few sensors coupled to bedrock did not improve the reflection signal strength sufficiently, because the shots, though buried, were within the surface layer and hence attenuated. The ability to model a complex 3D geological structure and calculate synthetic seismograms that are in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located, but simultaneous detonations would require a strategic placement of arrays.

  6. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  7. 76 FR 4096 - Notice of Submission for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ..., including the validity of the methodology and assumptions used; (3) Enhance the quality, utility, and...: Revision. Title of Collection: 2011-12 National Postsecondary Student Aid Study (NPSAS:12) Field Test...: Annually. Affected Public: Individuals or households; Businesses or other for-profit; Not-for-profit...

  8. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  9. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  10. Field significance of performance measures in the context of regional climate model evaluation. Part 2: precipitation

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker

    2018-04-01

    A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as 'field' or 'global' significance. The block length for the local resampling tests is precisely determined to adequately account for the time series structure. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is exemplarily applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Daily precipitation climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. While the downscaled precipitation distributions are statistically indistinguishable from the observed ones in most regions in summer, the biases of some distribution characteristics are significant over large areas in winter. WRF-NOAH generates appropriate stationary fine-scale climate features in the daily precipitation field over regions of complex topography in both seasons and appropriate transient fine-scale features almost everywhere in summer. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
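
    A field-significance screen over local p-values can be sketched as follows; the local test shown is a plain moving-block bootstrap of a bias in daily values, and the false-discovery-rate rule is one common way of controlling the proportion of falsely rejected local tests. Block length and all data here are illustrative, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(2)

        def block_bootstrap_p(model, obs, block=10, n_boot=999):
            # Two-sided p-value for mean(model) - mean(obs), resampling
            # `model` in blocks to crudely preserve autocorrelation
            d0 = model.mean() - obs.mean()
            n = model.size
            starts = rng.integers(0, n - block, (n_boot, n // block))
            idx = (starts[..., None] + np.arange(block)).reshape(n_boot, -1)
            d = model[idx].mean(axis=1) - obs.mean()
            return (np.sum(np.abs(d - d.mean()) >= abs(d0)) + 1) / (n_boot + 1)

        def fdr_reject(pvals, q=0.10):
            # Benjamini-Hochberg: reject up to the largest k with p_(k) <= q*k/m
            p = np.sort(pvals)
            m = p.size
            k = np.nonzero(p <= q * np.arange(1, m + 1) / m)[0]
            return pvals <= (p[k[-1]] if k.size else -1.0)

        # 100 grid cells; the first 30 get a real +0.2 bias
        pv = np.array([block_bootstrap_p(rng.normal(0.2 * (c < 30), 1, 1800),
                                         rng.normal(0, 1, 1800))
                       for c in range(100)])
        print((pv < 0.05).sum(), "local rejections;",
              fdr_reject(pv).sum(), "survive the FDR field test")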

  11. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    NASA Astrophysics Data System (ADS)

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-08-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.

  12. Integrated Aero-Propulsion CFD Methodology for the Hyper-X Flight Experiment

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.; Engelund, Walter C.; Bittner, Robert D.; Dilley, Arthur D.; Jentink, Tom N.; Frendi, Abdelkader

    2000-01-01

    Computational fluid dynamics (CFD) tools have been used extensively in the analysis and development of the X-43A Hyper-X Research Vehicle (HXRV). A significant element of this analysis is the prediction of integrated vehicle aero-propulsive performance, which includes an integration of aerodynamic and propulsion flow fields. This paper describes analysis tools used and the methodology for obtaining pre-flight predictions of longitudinal performance increments. The use of higher-fidelity methods to examine flow-field characteristics and scramjet flowpath component performance is also discussed. Limited comparisons with available ground test data are shown to illustrate the approach used to calibrate methods and assess solution accuracy. Inviscid calculations to evaluate lateral-directional stability characteristics are discussed. The methodology behind 3D tip-to-tail calculations is described and the impact of 3D exhaust plume expansion in the afterbody region is illustrated. Finally, future technology development needs in the area of hypersonic propulsion-airframe integration analysis are discussed.

  13. Towards Intelligent Interpretation of Low Strain Pile Integrity Testing Results Using Machine Learning Techniques

    PubMed Central

    Cui, De-Mi; Wang, Xiao-Quan; Lu, Lie-Min

    2017-01-01

    Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed only by experienced experts. For foundation construction sites where the number of piles to be tested is large, it may take days before the experts can finish interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening LSPIT’s turnaround time, are of great business value and in great demand. Motivated by this need, in this paper we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts’ interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology’s effectiveness using LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction. PMID:29068431
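
    A minimal sketch of the screening step described above, assuming each reflectogram has already been reduced to a fixed-length feature vector (e.g., reflection amplitudes, arrival times, spectral energies). The random-forest classifier and the synthetic features/labels are illustrative placeholders, not the CARI implementation.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 12))                           # placeholder feature vectors
    y = (X[:, 0] + rng.normal(0, 0.5, 400) > 0).astype(int)  # 1 = suspected defect

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    # Screen all piles quickly; route only high-probability suspects to experts.
    p_defect = clf.predict_proba(X_te)[:, 1]
    suspects = np.flatnonzero(p_defect > 0.5)
    print(f"{suspects.size} of {len(X_te)} piles flagged for manual review")
    ```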

  14. Optimization of magnetic field-assisted ultrasonication for the disintegration of waste activated sludge using Box-Behnken design with response surface methodology.

    PubMed

    Guan, Su; Deng, Feng; Huang, Si-Qi; Liu, Shu-Yang; Ai, Le-Xian; She, Pu-Ying

    2017-09-01

    This study investigated for the first time the feasibility of using a magnetic field for sludge disintegration. A disintegration degree (DD) of approximately 41.01% was reached after 30 min at 180 mT magnetic field intensity under magnetic field treatment alone. Protein and polysaccharide contents increased significantly. This test was optimized using a Box-Behnken design (BBD) with response surface methodology (RSM) to fit a multiple-regression equation for the DD. The maximum DD was 43.75%, and the protein and polysaccharide contents increased to 56.71 and 119.44 mg/L, respectively, when the magnetic field strength was 119.69 mT, the reaction time was 30.49 min, and the pH was 9.82 in the optimization experiment. We then analyzed the effects of ultrasound alone. We are the first to combine a magnetic field with ultrasound to disintegrate waste-activated sludge (WAS). The optimum effect of ultrasound alone was obtained at a frequency of 45 kHz, with a DD of about 58.09%. By contrast, a DD of 62.62% was reached in the combined magnetic field and ultrasound treatment. This combined test was also optimized using BBD with RSM to fit the multiple-regression equation for the DD. The maximum DD of 64.59% was achieved when the magnetic field intensity was 197.87 mT, the ultrasonic frequency was 42.28 kHz, the reaction time was 33.96 min, and the pH was 8.90. These results were consistent with those of particle size and electron microscopy analyses. This research proved that a magnetic field can effectively disintegrate WAS and can be combined with other physical techniques such as ultrasound for optimal results. Copyright © 2017 Elsevier B.V. All rights reserved.
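
    A minimal sketch of the response-surface step described above: fit a full second-order (quadratic) model of the disintegration degree (DD) in three coded factors (field intensity, time, pH) on Box-Behnken design points and locate its maximum. The design responses below are placeholders, not the study's data.

    ```python
    import numpy as np
    from itertools import combinations
    from scipy.optimize import minimize

    def quad_features(X):
        """Columns [1, x_i, x_i^2, x_i*x_j] of a full second-order model."""
        cols = [np.ones(len(X))]
        cols += [X[:, i] for i in range(X.shape[1])]
        cols += [X[:, i] ** 2 for i in range(X.shape[1])]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
        return np.column_stack(cols)

    # Box-Behnken coded design for 3 factors (edge midpoints + center points)
    # with placeholder DD responses (%):
    X = np.array([[-1,-1,0],[1,-1,0],[-1,1,0],[1,1,0],
                  [-1,0,-1],[1,0,-1],[-1,0,1],[1,0,1],
                  [0,-1,-1],[0,1,-1],[0,-1,1],[0,1,1],
                  [0,0,0],[0,0,0],[0,0,0]], dtype=float)
    dd = np.array([52.1,55.0,53.4,56.2,51.0,54.2,55.8,58.3,
                   50.7,53.9,56.5,59.1,62.0,61.5,62.4])

    beta, *_ = np.linalg.lstsq(quad_features(X), dd, rcond=None)
    res = minimize(lambda x: -(quad_features(x[None, :]) @ beta).item(),
                   x0=np.zeros(3), bounds=[(-1, 1)] * 3)
    print("coded optimum:", res.x, "predicted DD:", -res.fun)
    ```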

  15. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  16. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities. Phase I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strons, Philip; Bailey, James L.; Davis, John

    2016-03-01

    In this work, we apply CFD to model airflow and particulate transport. The modeling is then compared against field validation studies to both inform and validate the modeling assumptions. Based on the results of the field tests, modeling assumptions and boundary conditions are refined and the process is repeated until the results are found to be reliable with a high level of confidence.

  17. Methodology for the passive detection and discrimination of chemical and biological aerosols

    NASA Astrophysics Data System (ADS)

    Marinelli, William J.; Shokhirev, Kirill N.; Konno, Daisei; Rossi, David C.; Richardson, Martin

    2013-05-01

    The standoff detection and discrimination of aerosolized biological and chemical agents has traditionally been addressed through LIDAR approaches, but sensor systems using these methods have yet to be deployed. We discuss the development and testing of an approach to detect these aerosols using the deployed base of passive infrared hyperspectral sensors used for chemical vapor detection. The detection of aerosols requires the inclusion of downwelling sky and upwelling ground radiation in the description of the radiative transfer process. The wavelength- and size-dependent ratio of absorption to scattering provides much of the discrimination capability. The approach to the detection of aerosols utilizes much of the same phenomenology employed in vapor detection; however, the sensor system must acquire information on non-line-of-sight sources of radiation contributing to the scattering process. We describe the general methodology developed to detect chemical or biological aerosols, including justifications for the simplifying assumptions that enable the development of a real-time sensor system. Mie scattering calculations, aerosol size distribution dependence, and the angular dependence of the scattering on the aerosol signature are discussed. The methodology is then applied to two test cases: the ground-level release of a biological aerosol (BG) and a non-biological confuser (kaolin clay), as well as the debris field resulting from the intercept of a cruise missile carrying a thickened VX warhead. A field measurement conducted at the Utah Test and Training Range is used to illustrate the issues associated with the use of the method.

  18. A new methodology for hydro-abrasive erosion tests simulating penstock erosive flow

    NASA Astrophysics Data System (ADS)

    Aumelas, V.; Maj, G.; Le Calvé, P.; Smith, M.; Gambiez, B.; Mourrat, X.

    2016-11-01

    Hydro-abrasive resistance is an important property requirement for the hydroelectric power plant penstock coating systems used by EDF. The selection of durable coating systems requires an experimental characterization of coating performance, which can be achieved by performing accelerated and representative laboratory tests. For severe erosion induced by penstock flow, no suitable method or standard is representative of real erosive flow conditions. The present study aims to develop a new methodology and an associated laboratory experimental device. The objective of the laboratory apparatus is to subject coated test specimens to wear conditions similar to those generated at the penstock lower generatrix under actual flow conditions. Thirteen preselected coating solutions were first tested in a 45-hour erosion test, and a ranking of the thirteen coating solutions was then determined after characterisation. To complete this first evaluation and to determine the wear kinetics of the four best coating solutions, additional erosion tests were conducted with a longer duration of 216 hours. A comparison of this new method with standardized tests and with real service operating flow conditions is also discussed. To complete the final ranking based on hydro-abrasive erosion tests, some trial tests were carried out on penstock samples to check the application method of the selected coating systems. The paper gives some perspectives related to erosion test methodologies for materials and coating solutions for hydraulic applications. The developed test method can also be applied in other fields.

  19. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. For some quantities, the field generally consists of spatially coherent but mutually disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
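
    A minimal sketch of the pipeline's main ingredients, with placeholder pieces: Latin hypercube sampling of three model parameters, DBSCAN clustering of a thresholded toy "precipitation" field into objects, and a regression of an object feature on the parameters. The real methodology's multivariate multiple regression and multiple-testing control are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import qmc
    from sklearn.cluster import DBSCAN
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    sampler = qmc.LatinHypercube(d=3, seed=0)              # 3 model parameters
    params = qmc.scale(sampler.random(n=50), [0, 0, 0], [1, 1, 1])

    object_sizes = []
    for p in params:
        # Placeholder "forecast": a random field nudged by the first parameter,
        # thresholded into rainy cells that form spatial objects.
        field = rng.random((40, 40)) + 0.5 * p[0]
        pts = np.argwhere(field > 1.2)
        if len(pts) < 3:
            object_sizes.append(0.0)
            continue
        labels = DBSCAN(eps=1.5, min_samples=3).fit_predict(pts)
        sizes = [np.sum(labels == k) for k in set(labels) if k != -1]
        object_sizes.append(float(np.mean(sizes)) if sizes else 0.0)

    # Impact of the parameters on mean object size:
    reg = LinearRegression().fit(params, object_sizes)
    print("sensitivities (regression coefficients):", reg.coef_)
    ```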

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piekarski, D.; Brad, D.

    This report describes a work effort whose overall objectives were to establish a methodology and approach for selected transmission and distribution (T&D) grid modernization; monitor the results; and report on the findings, recommendations, and lessons learned. The work reported addressed T&D problems and solutions, related reliability issues, equipment and operation upgrades, and the respective field testing.

  1. Comparing two ground-cover measurement methodologies for semiarid rangelands

    USDA-ARS?s Scientific Manuscript database

    The limited field-of-view (FOV) associated with single-resolution very-large scale aerial (VLSA) imagery requires users to balance FOV and resolution needs. This balance varies by the specific questions being asked of the data. Here, we tested a FOV-resolution question by comparing ground-cover meas...

  2. About subjective evaluation of adaptive video streaming

    NASA Astrophysics Data System (ADS)

    Tavakoli, Samira; Brunnström, Kjell; Garcia, Narciso

    2015-03-01

    The usage of HTTP Adaptive Streaming (HAS) technology by content providers is increasing rapidly. With the video content available in multiple qualities, HAS allows the quality of the downloaded video to be adapted to the current network conditions, providing smooth playback. However, the time-varying video quality by itself introduces a new type of impairment. The quality adaptation can be done in different ways. In order to find the best adaptation strategy, maximizing users' perceptual quality, it is necessary to investigate the subjective perception of adaptation-related impairments. However, the novelty of these impairments and their comparatively long duration make most standardized assessment methodologies ill-suited for studying HAS degradations. Furthermore, in traditional testing methodologies, the quality of the video in audiovisual services is often evaluated separately, not in the presence of audio. Nevertheless, the requirement of jointly evaluating audio and video within a subjective test is a relatively under-explored research field. In this work, we address the research question of determining the appropriate assessment methodology to evaluate sequences with time-varying quality due to adaptation. This was done by studying the influence of different adaptation-related parameters through two subjective experiments using a methodology developed to evaluate long test sequences. In order to study the impact of audio presence on quality assessment by the test subjects, one of the experiments included audio stimuli. The experimental results were subsequently compared with those of another experiment using the standardized single-stimulus Absolute Category Rating (ACR) methodology.

  3. Challenges in the use of treatment to investigate cognition.

    PubMed

    Nickels, Lyndsey; Rapp, Brenda; Kohnen, Saskia

    2015-01-01

    The use of data from people with cognitive impairments to inform theories of cognition is an established methodology, particularly in the field of cognitive neuropsychology. However, it is less well known that studies that aim to improve cognitive functioning using treatment can also inform our understanding of cognition. This paper discusses a range of challenges that researchers face when testing theories of cognition and particularly when using treatment as a tool for doing so. It highlights the strengths of treatment methodology for testing causal relations and additionally discusses how generalization of treatment effects can shed light on the nature of cognitive representations and processes. These points are illustrated using examples from the Special Issue of Cognitive Neuropsychology entitled Treatment as a tool for investigating cognition.

  4. Next Generation Ship-Borne ASW-System: An Exemplary Exertion of Methodologies and Tools Applied According to the German Military Acquisition Guidelines

    DTIC Science & Technology

    2013-06-01

    as well as the evaluation of product parameters, operational test and functional limits. The product will be handed over to the designated ...which results in a system design that can be tested , produced, and fielded to satisfy the need. The concept development phase enables us to determine...specifications that can be tested or verified. The requirements presented earlier are the minimum necessary to allow the design process to find

  5. Integrating field methodology and web-based data collection to assess the reliability of the Alcohol Use Disorders Identification Test (AUDIT).

    PubMed

    Celio, Mark A; Vetter-O'Hagen, Courtney S; Lisman, Stephen A; Johansen, Gerard E; Spear, Linda P

    2011-12-01

    Field methodologies offer a unique opportunity to collect ecologically valid data on alcohol use and its associated problems within natural drinking environments. However, limitations in follow-up data collection methods have left unanswered questions regarding the psychometric properties of field-based measures. The aim of the current study is to evaluate the reliability of self-report data collected in a naturally occurring environment - as indexed by the Alcohol Use Disorders Identification Test (AUDIT) - compared to self-report data obtained through an innovative web-based follow-up procedure. Individuals recruited outside of bars (N=170; mean age=21; range 18-32) provided a BAC sample and completed a self-administered survey packet that included the AUDIT. BAC feedback was provided anonymously through a dedicated web page. Upon sign in, follow-up participants (n=89; 52%) were again asked to complete the AUDIT before receiving their BAC feedback. Reliability analyses demonstrated that AUDIT scores - both continuous and dichotomized at the standard cut-point - were stable across field- and web-based administrations. These results suggest that self-report data obtained from acutely intoxicated individuals in naturally occurring environments are reliable when compared to web-based data obtained after a brief follow-up interval. Furthermore, the results demonstrate the feasibility, utility, and potential of integrating field methods and web-based data collection procedures. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
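
    A rough illustration of the kind of test-retest analysis reported above: Pearson's r for continuous AUDIT totals and Cohen's kappa for scores dichotomized at the standard cut-point of 8. The paired scores below are simulated placeholders, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(1)
    field_scores = rng.integers(0, 30, size=89)                       # field administration
    web_scores = np.clip(field_scores + rng.integers(-3, 4, 89), 0, 40)  # web follow-up

    r, p = pearsonr(field_scores, web_scores)
    kappa = cohen_kappa_score(field_scores >= 8, web_scores >= 8)     # AUDIT cut-point 8
    print(f"test-retest r = {r:.2f} (p = {p:.3g}), dichotomized kappa = {kappa:.2f}")
    ```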

  6. Yield estimation of corn based on multitemporal LANDSAT-TM data as input for an agrometeorological model

    NASA Astrophysics Data System (ADS)

    Bach, Heike

    1998-07-01

    In order to test remote sensing data with advanced yield formation models for accuracy and timeliness of yield estimation of corn, a project was conducted for the State Ministry for Rural Environment, Food, and Forestry of Baden-Württemberg (Germany). This project was carried out during the course of the 'Special Yield Estimation', a regular procedure conducted for the European Union, to more accurately estimate agricultural yield. The methodology employed uses field-based plant parameter estimation from atmospherically corrected multitemporal/multispectral LANDSAT-TM data. An agrometeorological plant production model is used for yield prediction. Based solely on four LANDSAT-derived estimates (between May and August) and daily meteorological data, the grain yield of corn fields was determined for 1995. The modelled yields were compared with results gathered independently within the Special Yield Estimation for 23 test fields in the upper Rhine valley. The agreement between LANDSAT-based estimates (six weeks before harvest) and the Special Yield Estimation (at harvest) shows a relative error of 2.3%. The comparison of the results for single fields shows that six weeks before harvest, the grain yield of corn was estimated with a mean relative accuracy of 13% using satellite information. The presented methodology can be transferred to other crops and geographical regions. For future applications, hyperspectral sensors show great potential to further enhance the results of yield prediction with remote sensing.

  7. Sampling design for groundwater solute transport: Tests of methods and analysis of Cape Cod tracer test data

    USGS Publications Warehouse

    Knopman, Debra S.; Voss, Clifford I.; Garabedian, Stephen P.

    1991-01-01

    Tests of a one-dimensional sampling design methodology on measurements of bromide concentration collected during the natural gradient tracer test conducted by the U.S. Geological Survey on Cape Cod, Massachusetts, demonstrate its efficacy for field studies of solute transport in groundwater and the utility of one-dimensional analysis. The methodology was applied to design of sparse two-dimensional networks of fully screened wells typical of those often used in engineering practice. In one-dimensional analysis, designs consist of the downstream distances to rows of wells oriented perpendicular to the groundwater flow direction and the timing of sampling to be carried out on each row. The power of a sampling design is measured by its effectiveness in simultaneously meeting objectives of model discrimination, parameter estimation, and cost minimization. One-dimensional models of solute transport, differing in processes affecting the solute and assumptions about the structure of the flow field, were considered for description of tracer cloud migration. When fitting each model using nonlinear regression, additive and multiplicative error forms were allowed for the residuals which consist of both random and model errors. The one-dimensional single-layer model of a nonreactive solute with multiplicative error was judged to be the best of those tested. Results show the efficacy of the methodology in designing sparse but powerful sampling networks. Designs that sample five rows of wells at five or fewer times in any given row performed as well for model discrimination as the full set of samples taken up to eight times in a given row from as many as 89 rows. Also, designs for parameter estimation judged to be good by the methodology were as effective in reducing the variance of parameter estimates as arbitrary designs with many more samples. Results further showed that estimates of velocity and longitudinal dispersivity in one-dimensional models based on data from only five rows of fully screened wells each sampled five or fewer times were practically equivalent to values determined from moments analysis of the complete three-dimensional set of 29,285 samples taken during 16 sampling times.
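
    As an illustration of the one-dimensional transport modelling underlying the design methodology, the sketch below fits an instantaneous-pulse advection-dispersion solution to a synthetic bromide breakthrough curve by nonlinear regression with multiplicative error. The observation distance, parameter values, and data are placeholder assumptions, not the Cape Cod results.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def pulse_1d(t, v, D, m):
        """Advection-dispersion solution at x = 50 m for an instantaneous pulse:
        C = m / sqrt(4*pi*D*t) * exp(-(x - v*t)**2 / (4*D*t))."""
        x = 50.0
        return m / np.sqrt(4 * np.pi * D * t) * np.exp(-(x - v * t) ** 2 / (4 * D * t))

    t = np.linspace(20.0, 200.0, 40)        # days since injection
    rng = np.random.default_rng(2)
    # Synthetic observations with multiplicative error, as in the better-fitting
    # error model described above.
    c_obs = pulse_1d(t, v=0.4, D=0.5, m=30.0) * np.exp(rng.normal(0, 0.05, t.size))

    popt, pcov = curve_fit(pulse_1d, t, c_obs, p0=[0.3, 1.0, 20.0])
    print("v, D, m:", popt, "+/-", np.sqrt(np.diag(pcov)))
    ```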

  8. Field Test: Results from the One Year Mission

    NASA Technical Reports Server (NTRS)

    Reschke, M. F.; Kozlovskaya, I. B.; Kofman, I. S.; Tomilovskaya, E. S.; Cerisano, J. M.; Rosenberg, M. J. F.; Bloomberg, J. J.; Stenger, M. B.; Lee, S. M. C.; Laurie, S. S.; hide

    2017-01-01

    The One Year Mission was designed to aid in determining the effect that extending the duration on orbit aboard the International Space Station (ISS) would have on a number of biological and physiological systems. Two crewmembers were selected to participate in this endeavor, one U.S. On-Orbit Segment (USOS) astronaut and one Russian cosmonaut. The Neuroscience and Cardiovascular and Vision Laboratories at the Johnson Space Center and the Sensory-Motor and Countermeasures Division within the Institute for Biomedical Problems were selected to investigate vestibular, sensorimotor and cardiovascular function with the two long-duration crewmembers using the established methodology developed for the Field Test (FT).

  9. Developing a one-semester course in forensic chemical science for university undergraduates

    NASA Astrophysics Data System (ADS)

    Salem, Roberta Sue

    The purpose of this study was to research, develop, and validate a one-semester course for the general education of university undergraduates in forensic chemical education. The course outline was developed using the research and development (R&D) methodology recommended by Gall, Borg, and Gall (2003) and Dick and Carey (2001) through a three-step developmental cycle. Information was gathered and analyzed through a review of the literature and proof-of-concept interviews, laying the foundation for the framework of the course outline. A preliminary course outline was developed after a needs assessment showed the need for such a course. Professors with expertise in forensic science participated in the first field test of the course. Their feedback was recorded, and the course was revised for a main field test. Potential users of the guide served as readers for the main field test and offered further feedback to improve the course.

  10. Sweating Rate and Sweat Sodium Concentration in Athletes: A Review of Methodology and Intra/Interindividual Variability.

    PubMed

    Baker, Lindsay B

    2017-03-01

    Athletes lose water and electrolytes as a consequence of thermoregulatory sweating during exercise and it is well known that the rate and composition of sweat loss can vary considerably within and among individuals. Many scientists and practitioners conduct sweat tests to determine sweat water and electrolyte losses of athletes during practice and competition. The information gleaned from sweat testing is often used to guide personalized fluid and electrolyte replacement recommendations for athletes; however, unstandardized methodological practices and challenging field conditions can produce inconsistent/inaccurate results. The primary objective of this paper is to provide a review of the literature regarding the effect of laboratory and field sweat-testing methodological variations on sweating rate (SR) and sweat composition (primarily sodium concentration [Na+]). The simplest and most accurate method to assess whole-body SR is via changes in body mass during exercise; however, potential confounding factors to consider are non-sweat sources of mass change and trapped sweat in clothing. In addition, variability in sweat [Na+] can result from differences in the type of collection system used (whole body or localized), the timing/duration of sweat collection, skin cleaning procedure, sample storage/handling, and analytical technique. Another aim of this paper is to briefly review factors that may impact intra/interindividual variability in SR and sweat [Na+] during exercise, including exercise intensity, environmental conditions, heat acclimation, aerobic capacity, body size/composition, wearing of protective equipment, sex, maturation, aging, diet, and/or hydration status. In summary, sweat testing can be a useful tool to estimate athletes' SR and sweat Na+ loss to help guide fluid/electrolyte replacement strategies, provided that data are collected, analyzed, and interpreted appropriately.
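
    A minimal sketch of the mass-balance calculation described above, assuming body mass is measured before and after exercise and corrected for fluid intake, urine output, and sweat trapped in clothing; non-sweat mass changes (e.g., respiratory water) are ignored in this simplified version, and all values are illustrative.

    ```python
    def sweating_rate_l_per_h(pre_kg, post_kg, fluid_intake_kg, urine_kg,
                              trapped_sweat_kg, hours):
        """Whole-body sweat loss from the body-mass balance; 1 kg of mass change
        is taken as ~1 L of sweat. Trapped sweat is added back here on the
        assumption that the athlete is weighed wearing the clothing."""
        sweat_kg = (pre_kg - post_kg) + fluid_intake_kg - urine_kg + trapped_sweat_kg
        return sweat_kg / hours

    # Example: 70.0 -> 69.2 kg over 1.5 h, 0.6 L drunk, 0.1 kg trapped in clothing.
    print(f"SR = {sweating_rate_l_per_h(70.0, 69.2, 0.6, 0.0, 0.1, 1.5):.2f} L/h")
    ```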

  11. Establishing equivalence: methodological progress in group-matching design and analysis.

    PubMed

    Kover, Sara T; Atwood, Amy K

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, Fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios.
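
    A minimal sketch of the descriptive indices recommended above (a standardized mean difference and a variance ratio), together with a simple TOST equivalence test; the data and the ±0.5 SD equivalence bounds are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    g1 = rng.normal(100, 15, 40)      # e.g., matching-variable scores, group 1
    g2 = rng.normal(101, 14, 40)      # group 2

    sd_pooled = np.sqrt((g1.var(ddof=1) + g2.var(ddof=1)) / 2)
    d = (g1.mean() - g2.mean()) / sd_pooled       # standardized mean difference
    vr = g1.var(ddof=1) / g2.var(ddof=1)          # variance ratio
    print(f"d = {d:.2f}, variance ratio = {vr:.2f}")

    # TOST: reject both one-sided tests to conclude equivalence within +/-delta.
    delta = 0.5 * sd_pooled
    p_upper = stats.ttest_ind(g1 - delta, g2, alternative="less").pvalue
    p_lower = stats.ttest_ind(g1 + delta, g2, alternative="greater").pvalue
    print(f"TOST p = {max(p_upper, p_lower):.3f} (equivalent if below alpha)")
    ```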

  12. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    PubMed Central

    Kover, Sara T.; Atwood, Amy K.

    2017-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs utilized in behavioral research on cognition and language in neurodevelopmental disorders, including autism spectrum disorder, fragile X syndrome, Down syndrome, and Williams syndrome. The limitations of relying on p-values to establish group equivalence are discussed in the context of other existing methods: equivalence tests, propensity scores, and regression-based analyses. Our primary recommendation for advancing research on intellectual and developmental disabilities is the use of descriptive indices of adequate group matching: effect sizes (i.e., standardized mean differences) and variance ratios. PMID:23301899

  13. Combining 3D Hydraulic Tomography with Tracer Tests for Improved Transport Characterization.

    PubMed

    Sanchez-León, E; Leven, C; Haslauer, C P; Cirpka, O A

    2016-07-01

    Hydraulic tomography (HT) is a method for resolving the spatial distribution of hydraulic parameters to some extent, but many details important for solute transport usually remain unresolved. We present a methodology to improve solute transport predictions by combining data from HT with the breakthrough curve (BTC) of a single forced-gradient tracer test. We estimated the three dimensional (3D) hydraulic-conductivity field in an alluvial aquifer by inverting tomographic pumping tests performed at the Hydrogeological Research Site Lauswiesen close to Tübingen, Germany, using a regularized pilot-point method. We compared the estimated parameter field to available profiles of hydraulic-conductivity variations from direct-push injection logging (DPIL), and validated the hydraulic-conductivity field with hydraulic-head measurements of tests not used in the inversion. After validation, spatially uniform parameters for dual-domain transport were estimated by fitting tracer data collected during a forced-gradient tracer test. The dual-domain assumption was used to parameterize effects of the unresolved heterogeneity of the aquifer and deemed necessary to fit the shape of the BTC using reasonable parameter values. The estimated hydraulic-conductivity field and transport parameters were subsequently used to successfully predict a second independent tracer test. Our work provides an efficient and practical approach to predict solute transport in heterogeneous aquifers without performing elaborate field tracer tests with a tomographic layout. © 2015, National Ground Water Association.

  14. Research Methodology in Second Language Studies: Trends, Concerns, and New Directions

    ERIC Educational Resources Information Center

    King, Kendall A.; Mackey, Alison

    2016-01-01

    The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…

  15. Performance testing of 3D point cloud software

    NASA Astrophysics Data System (ADS)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry, and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open-source suites; however, users often lack methodologies to verify their performance properly. In this work, a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil, and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in point-cloud loading time and CPU usage. However, it is not as strong as the commercial suites in the working-set and commit-size tests.
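
    A minimal sketch of how such load-time and memory measurements can be scripted, assuming a generic point-cloud reader; the file name, the plain-text loader, and the use of psutil's resident-set size as a rough working-set proxy are all placeholder choices, not the paper's procedure.

    ```python
    import time
    import numpy as np
    import psutil

    def benchmark_load(path, loader):
        """Return (seconds, MiB of added resident memory, number of points)."""
        proc = psutil.Process()
        rss_before = proc.memory_info().rss
        t0 = time.perf_counter()
        cloud = loader(path)
        elapsed = time.perf_counter() - t0
        return elapsed, (proc.memory_info().rss - rss_before) / 2**20, len(cloud)

    # Synthetic "x y z" cloud file so the sketch is self-contained.
    np.savetxt("cloud.xyz", np.random.default_rng(0).random((200_000, 3)))

    seconds, mib, n = benchmark_load("cloud.xyz", np.loadtxt)
    print(f"loaded {n} points in {seconds:.2f} s, +{mib:.1f} MiB resident")
    ```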

  16. TRENDS IN FLOODS AND LOW FLOWS IN THE UNITED STATES: IMPACT OF SPATIAL CORRELATION. (R824992,R826888)

    EPA Science Inventory

    Trends in floods and low flows in the US were evaluated using a regional average Kendall's S trend test at two spatial scales and over two timeframes. Field significance was assessed using a bootstrap methodology to account for the observed regional cross-correlation of streamflow...
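
    A minimal sketch of the approach summarized above, under simplifying assumptions: site-level Mann-Kendall S statistics are averaged over the region, and field significance is assessed by bootstrapping whole years so the cross-correlation among sites is preserved in each replicate. The streamflow data are placeholders.

    ```python
    import numpy as np

    def kendall_s(x):
        """Mann-Kendall S: sum of signs of all later-minus-earlier differences."""
        i, j = np.triu_indices(len(x), k=1)
        return np.sign(x[j] - x[i]).sum()

    rng = np.random.default_rng(4)
    flows = rng.normal(size=(30, 25))       # 30 years x 25 sites (placeholder)

    s_obs = np.mean([kendall_s(flows[:, k]) for k in range(flows.shape[1])])

    # Resample whole years (rows) with replacement, keeping all sites together,
    # so each replicate retains the observed cross-correlation among sites
    # while destroying any temporal trend.
    n_boot = 999
    s_null = np.empty(n_boot)
    for b in range(n_boot):
        years = rng.integers(0, flows.shape[0], size=flows.shape[0])
        s_null[b] = np.mean([kendall_s(flows[years, k])
                             for k in range(flows.shape[1])])

    p_field = np.mean(np.abs(s_null) >= abs(s_obs))
    print(f"regional S = {s_obs:.1f}, bootstrap field p = {p_field:.3f}")
    ```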

  17. Application of CCG Sensors to a High-Temperature Structure Subjected to Thermo-Mechanical Load.

    PubMed

    Xie, Weihua; Meng, Songhe; Jin, Hua; Du, Chong; Wang, Libin; Peng, Tao; Scarpa, Fabrizio; Xu, Chenghai

    2016-10-13

    This paper presents a simple methodology for performing a high-temperature coupled thermo-mechanical test on ultra-high temperature ceramic (UHTC) material specimens equipped with chemical composition grating (CCG) sensors. The methodology also accounts for the presence of coupled loading within the response provided by the CCG sensors. The theoretical strain of the UHTC specimens calculated with this technique shows a maximum relative error of 2.15% between the analytical and experimental data. To further verify the validity of the test results, a Finite Element (FE) model has been developed to simulate the temperature, stress, and strain fields within the UHTC structure equipped with the CCG. The results show that the compressive stress exceeds the material strength at the bonding area, which causes fracture failure of the supporting structure in the hot environment. The results for the strain fields show that the relative error with respect to the experimental data decreases as the temperature increases; the relative error is less than 15% when the temperature is higher than 200 °C, and only 6.71% at 695 °C.

  18. A Field-Based Aquatic Life Benchmark for Conductivity in ...

    EPA Pesticide Factsheets

    This report adapts the standard U.S. EPA methodology for deriving ambient water quality criteria. Rather than use toxicity test results, the adaptation uses field data to determine the loss of 5% of genera from streams. The method is applied to derive effect benchmarks for dissolved salts as measured by conductivity in Central Appalachian streams using data from West Virginia and Kentucky. This report provides scientific evidence for a conductivity benchmark in a specific region rather than for the entire United States.
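
    A minimal sketch of the benchmark idea, assuming genus-level extirpation concentrations (XC95 values) have already been derived from the field data; the benchmark is then the conductivity at which 5% of genera are lost (HC05). The values below are placeholders, not the report's data.

    ```python
    import numpy as np

    # Placeholder genus-level extirpation concentrations (uS/cm), one per genus.
    xc95_uS_cm = np.array([180, 240, 300, 320, 410, 520, 640, 700, 880,
                           950, 1100, 1300, 1500, 1800, 2100, 2600, 3200])

    hc05 = np.percentile(xc95_uS_cm, 5)  # conductivity at which ~5% of genera are lost
    print(f"benchmark (HC05) ~= {hc05:.0f} uS/cm")
    ```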

  19. Volatile organic compound sensor system

    DOEpatents

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.; Bomstad, Theresa M [Laramie, WY; Sorini-Wong, Susan S [Laramie, WY

    2009-02-10

    Generally, this invention relates to the development of field monitoring methodology for new substances and sensing chemical warfare agents (CWAs) and terrorist substances. It also relates to a portable test kit which may be utilized to measure concentrations of halogenated volatile organic compounds (VOCs) in the field. Specifically it relates to systems for reliably field sensing the potential presence of such items while also distinguishing them from other elements potentially present. It also relates to overall systems and processes for sensing, reacting, and responding to an indicated presence of such substance, including modifications of existing halogenated sensors and arrayed sensing systems and methods.

  20. Volatile organic compound sensor system

    DOEpatents

    Schabron, John F.; Rovani, Jr., Joseph F.; Bomstad, Theresa M.; Sorini-Wong, Susan S.; Wong, Gregory K.

    2011-03-01

    Generally, this invention relates to the development of field monitoring methodology for new substances and sensing chemical warfare agents (CWAs) and terrorist substances. It also relates to a portable test kit which may be utilized to measure concentrations of halogenated volatile organic compounds (VOCs) in the field. Specifically it relates to systems for reliably field sensing the potential presence of such items while also distinguishing them from other elements potentially present. It also relates to overall systems and processes for sensing, reacting, and responding to an indicated presence of such substance, including modifications of existing halogenated sensors and arrayed sensing systems and methods.

  1. Enabling reliability assessments of pre-commercial perovskite photovoltaics with lessons learned from industrial standards

    NASA Astrophysics Data System (ADS)

    Snaith, Henry J.; Hacke, Peter

    2018-06-01

    Photovoltaic modules are expected to operate in the field for more than 25 years, so reliability assessment is critical for the commercialization of new photovoltaic technologies. In early development stages, understanding and addressing the device degradation mechanisms are the priorities. However, any technology targeting large-scale deployment must eventually pass industry-standard qualification tests and undergo reliability testing to validate the module lifetime. In this Perspective, we review the methodologies used to assess the reliability of established photovoltaics technologies and to develop standardized qualification tests. We present the stress factors and stress levels for degradation mechanisms currently identified in pre-commercial perovskite devices, along with engineering concepts for mitigation of those degradation modes. Recommendations for complete and transparent reporting of stability tests are given, to facilitate future inter-laboratory comparisons and to further the understanding of field-relevant degradation mechanisms, which will benefit the development of accelerated stress tests.

  2. Development of an autodissemination strategy for the deployment of novel control agents targeting the common malaria mosquito, Anopheles quadrimaculatus Say (Diptera: Culicidae).

    PubMed

    Swale, Daniel R; Li, Zhilin; Kraft, Jake Z; Healy, Kristen; Liu, Mei; David, Connie M; Liu, Zhijun; Foil, Lane D

    2018-04-01

    The reduced efficacy of current Anopheline mosquito control methods underscores the need to develop new methods of control that exploit unique target sites and/or utilize novel deployment methods. Interest in autodissemination methodologies using insect growth regulators (IGRs) is growing, and such methods have been shown to be effective at controlling Aedes mosquitoes in semi-field and field environments, yet little information exists for Anopheline mosquitoes. Therefore, we tested the hypothesis that female-driven autodissemination of an IGR combined with an insecticide with a new mechanism of action (a Kir channel inhibitor) could be employed to reduce Anopheline populations. We studied the ability of three IGRs to be transferred to the larval habitat during oviposition in laboratory and semi-field environments. Adult mosquitoes were exposed to the chemicals for 4 hours immediately after blood feeding, and efficacy was tested using classical methodologies, including adult emergence inhibition and high-performance liquid chromatography (HPLC). A complete autodissemination design was tested in a semi-field environment. Larval survivability and adult emergence were significantly reduced in habitats that were visited by novaluron-treated adults, but no statistical differences were observed with pyriproxyfen or triflumuron. These data suggest that novaluron, but not pyriproxyfen or triflumuron, was horizontally transferred from the adult mosquito to the larval habitat during oviposition. HPLC studies supported the toxicity data and showed that novaluron was present in the majority of larval habitats, suggesting that novaluron can be horizontally transferred by Anopheles quadrimaculatus. Importantly, the combination of novaluron and the Kir channel inhibitor VU041 was capable of reducing adult and larval populations in semi-field environments. Novaluron is transferred to the adult with greater efficacy and/or is not degraded as quickly during the gonotrophic cycle when compared to pyriproxyfen or triflumuron. Pending field confirmation, autodissemination approaches with novaluron may be a suitable tool to manage Anopheles populations.

  3. Willkommen, Mr. Chance: Methodologische Betrachtungen zur Güte empirischer Forschung in der Pädagogik, diskutiert vor allem an der neueren Untersuchung über Gewalt von Heitmeyer u.a. (1995) = Welcome, Mr. Chance: Methodological Considerations Concerning the Quality of Empirical Research in Educational Science Based on a Recent Study on Violence Published by Heitmeyer et al. (1995).

    ERIC Educational Resources Information Center

    Wellenreuther, Martin

    1997-01-01

    Argues that the usefulness of strictly quantitative research is still questioned in educational studies, primarily due to deficiencies in methodological training. Uses a critique of a recent study by Heitmeyer et al. (1995) to illustrate the requirements of "good" empirical research. Considers the problems of hypothesis testing in field research.…

  4. A systems-based food safety evaluation: an experimental approach.

    PubMed

    Higgins, Charles L; Hartfield, Barry S

    2004-11-01

    Food establishments are complex systems with inputs, subsystems, underlying forces that affect the system, outputs, and feedback. Building on past exploration of the hazard analysis critical control point concept and Ludwig von Bertalanffy General Systems Theory, the National Park Service (NPS) is attempting to translate these ideas into a realistic field assessment of food service establishments and to use information gathered by these methods in efforts to improve food safety. Over the course of the last two years, an experimental systems-based methodology has been drafted, developed, and tested by the NPS Public Health Program. This methodology is described in this paper.

  5. An empirical method to estimate shear wave velocity of soils in the New Madrid seismic zone

    USGS Publications Warehouse

    Wei, B.-Z.; Pezeshk, S.; Chang, T.-S.; Hall, K.H.; Liu, Huaibao P.

    1996-01-01

    In this study, a set of charts is developed to estimate the shear wave velocity of soils in the New Madrid seismic zone (NMSZ) using standard penetration test (SPT) N values and soil depths. Laboratory dynamic test results for soil samples collected from the NMSZ showed that the shear wave velocity of soils is related to the void ratio and the effective confining pressure applied to the soils. The void ratio of soils can be estimated from SPT N values, and the effective confining pressure depends on the depth of the soils. Therefore, the shear wave velocity of soils can be estimated from the SPT N value and the soil depth. To make the methodology practical, two corrections must be made. First, field SPT N values must be adjusted to a unified SPT (N1)60 value to account for the effects of overburden pressure and equipment. Second, the effect of the water table on the effective overburden pressure of soils must be considered. To verify the methodology, shear wave velocities at five sites in the NMSZ were estimated and compared with those obtained from field measurements. The comparison shows that our approach and the field tests are consistent, with an error of less than 15%. Thus, the method developed in this study is useful for dynamic studies and practical designs in the NMSZ region. Copyright © 1996 Elsevier Science Limited.
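
    A minimal sketch of the kind of empirical relation such charts encode, with placeholder coefficients (not the study's): shear wave velocity as a power law of the corrected SPT blow count and the effective overburden stress.

    ```python
    def shear_wave_velocity(n1_60, sigma_v_eff_kpa, a=30.0, b=0.33, c=0.20):
        """Vs (m/s) = a * N^b * (sigma_v')^c -- a, b, c are assumed placeholder
        coefficients to be calibrated against laboratory and field data."""
        return a * n1_60 ** b * sigma_v_eff_kpa ** c

    # Example: corrected blow count of 15 at ~10 m depth (sigma_v' ~ 100 kPa).
    print(f"Vs ~= {shear_wave_velocity(15, 100.0):.0f} m/s")
    ```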

  6. Application of cokriging techniques for the estimation of hail size

    NASA Astrophysics Data System (ADS)

    Farnell, Carme; Rigo, Tomeu; Martin-Vide, Javier

    2018-01-01

    There are primarily two ways of estimating hail size: the first is the direct interpolation of point observations, and the second is the transformation of remote sensing fields into measurements of hail properties. Both techniques have advantages and limitations as regards generating the resultant map of hail damage. This paper presents a new methodology that combines the above-mentioned techniques in an attempt to minimise the limitations and take advantage of the benefits of interpolation and the use of remote sensing data. The methodology was tested for several episodes, with good results obtained for the estimation of hail size at practically all the points analysed. The study area has a large database of hail episodes and, for this reason, constitutes an optimal test bench.

  7. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    PubMed

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  8. Development of Detonation Modeling Capabilities for Rocket Test Facilities: Hydrogen-Oxygen-Nitrogen Mixtures

    NASA Technical Reports Server (NTRS)

    Allgood, Daniel C.

    2016-01-01

    The objective of the presented work was to develop validated computational fluid dynamics (CFD) based methodologies for predicting propellant detonations and their associated blast environments. Applications of interest were scenarios relevant to rocket propulsion test and launch facilities. All model development was conducted within the framework of the Loci/CHEM CFD tool due to its reliability and robustness in predicting high-speed combusting flow-fields associated with rocket engines and plumes. During the course of the project, verification and validation studies were completed for hydrogen-fueled detonation phenomena such as shock-induced combustion, confined detonation waves, vapor cloud explosions, and deflagration-to-detonation transition (DDT) processes. The DDT validation cases included predicting flame acceleration mechanisms associated with turbulent flame-jets and flow-obstacles. Excellent comparison between test data and model predictions were observed. The proposed CFD methodology was then successfully applied to model a detonation event that occurred during liquid oxygen/gaseous hydrogen rocket diffuser testing at NASA Stennis Space Center.

  9. Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application

    NASA Technical Reports Server (NTRS)

    DeBonis, J. R.; Yungster, S.

    1996-01-01

    A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.

  10. Establishing a Ballistic Test Methodology for Documenting the Containment Capability of Small Gas Turbine Engine Compressors

    NASA Technical Reports Server (NTRS)

    Heady, Joel; Pereira, J. Michael; Ruggeri, Charles R.; Bobula, George A.

    2009-01-01

    A test methodology currently employed for large engines was extended to quantify the ballistic containment capability of a small turboshaft engine compressor case. The approach involved impacting the inside of a compressor case with a compressor blade. A gas gun propelled the blade into the case at energy levels representative of failed compressor blades. The test target was a full compressor case. The aft flange was rigidly attached to a test stand and the forward flange was attached to a main frame to provide accurate boundary conditions. A window machined in the case allowed the projectile to pass through and impact the case wall from the inside with the orientation, direction, and speed that would occur in a blade-out event. High-speed digital video cameras provided accurate velocity and orientation data. Calibrated cameras and digital image correlation software generated full-field displacement and strain information at the back side of the impact point.

  11. Toward a new methodological paradigm for testing theories of health behavior and health behavior change.

    PubMed

    Noar, Seth M; Mehrotra, Purnima

    2011-03-01

    Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Residential Two-Stage Gas Furnaces - Do They Save Energy?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lekov, Alex; Franco, Victor; Lutz, James

    2006-05-12

    Residential two-stage gas furnaces account for almost a quarter of the total number of models listed in the March 2005 GAMA directory of equipment certified for sale in the United States. Two-stage furnaces are expanding their presence in the market mostly because they meet consumer expectations for improved comfort. Currently, the U.S. Department of Energy (DOE) test procedure serves as the method for reporting furnace total fuel and electricity consumption under laboratory conditions. In 2006, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) proposed an update to its test procedure which corrects some of the discrepancies found in the DOE test procedure and provides an improved methodology for calculating the energy consumption of two-stage furnaces. The objectives of this paper are to explore the differences in the methods for calculating two-stage residential gas furnace energy consumption in the DOE test procedure and in the 2006 ASHRAE test procedure and to compare test results to research results from field tests. Overall, the DOE test procedure shows a reduction in the total site energy consumption of about 3 percent for two-stage compared to single-stage furnaces at the same efficiency level. In contrast, the 2006 ASHRAE test procedure shows almost no difference in the total site energy consumption. The 2006 ASHRAE test procedure appears to provide a better methodology for calculating the energy consumption of two-stage furnaces. The results indicate that, although two-stage technology by itself does not save site energy, the combination of two-stage furnaces with BPM motors provides electricity savings, which are confirmed by field studies.

  13. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires.

    PubMed

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found % attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  14. Pilot Testing of a Sampling Methodology for Assessing Seed Attachment Propensity and Transport Rate in a Soil Matrix Carried on Boot Soles and Bike Tires

    NASA Astrophysics Data System (ADS)

    Hardiman, Nigel; Dietz, Kristina Charlotte; Bride, Ian; Passfield, Louis

    2017-01-01

    Land managers of natural areas are under pressure to balance demands for increased recreation access with protection of the natural resource. Unintended dispersal of seeds by visitors to natural areas has high potential for weedy plant invasions, with initial seed attachment an important step in the dispersal process. Although walking and mountain biking are popular nature-based recreation activities, there are few studies quantifying propensity for seed attachment and transport rate on boot soles and none for bike tires. Attachment and transport rate can potentially be affected by a wide range of factors for which field testing can be time-consuming and expensive. We pilot tested a sampling methodology for measuring seed attachment and transport rate in a soil matrix carried on boot soles and bike tires traversing a known quantity and density of a seed analog (beads) over different distances and soil conditions. We found % attachment rate on boot soles was much lower overall than previously reported, but that boot soles had a higher propensity for seed attachment than bike tires in almost all conditions. We believe our methodology offers a cost-effective option for researchers seeking to manipulate and test effects of different influencing factors on these two dispersal vectors.

  15. Developing an oropharyngeal cancer (OPC) knowledge and behaviors survey.

    PubMed

    Dodd, Virginia J; Riley III, Joseph L; Logan, Henrietta L

    2012-09-01

    To use the community participation research model to (1) develop a survey assessing knowledge about mouth and throat cancer and (2) field test and establish test-retest reliability with newly developed instrument. Cognitive interviews with primarily rural African American adults to assess their perception and interpretation of survey items. Test-retest reliability was established with a racially diverse rural population. Test-retest reliabilities ranged from .79 to .40 for screening awareness and .74 to .19 for knowledge. Coefficients increased for composite scores. Community participation methodology provided a culturally appropriate survey instrument that demonstrated acceptable levels of reliability.

  16. Methodology in diagnostic laboratory test research in clinical chemistry and clinical chemistry and laboratory medicine.

    PubMed

    Lumbreras-Lacarra, Blanca; Ramos-Rincón, José Manuel; Hernández-Aguado, Ildefonso

    2004-03-01

    The application of epidemiologic principles to clinical diagnosis has been less developed than in other clinical areas. Knowledge of the main flaws affecting diagnostic laboratory test research is the first step for improving its quality. We assessed the methodologic aspects of articles on laboratory tests. We included articles that estimated indexes of diagnostic accuracy (sensitivity and specificity) and were published in Clinical Chemistry or Clinical Chemistry and Laboratory Medicine in 1996, 2001, and 2002. Clinical Chemistry has paid special attention to this field of research since 1996 by publishing recommendations, checklists, and reviews. Articles were identified through electronic searches in Medline. The strategy combined the MeSH term "sensitivity and specificity" (exploded) with the text words "specificity", "false negative", and "accuracy". We examined adherence to seven methodologic criteria used in the study by Reid et al. (JAMA 1995;274:645-51) of papers published in general medical journals. Three observers evaluated each article independently. Seventy-nine articles fulfilled the inclusion criteria. The percentage of studies that satisfied each criterion improved from 1996 to 2002. Substantial improvement was observed in reporting of the statistical uncertainty of indices of diagnostic accuracy, in criteria based on clinical information from the study population (spectrum composition), and in avoidance of workup bias. Analytical reproducibility was reported frequently (68%), whereas information about indeterminate results was rarely provided. The mean number of methodologic criteria satisfied showed a statistically significant increase over the 3 years in Clinical Chemistry but not in Clinical Chemistry and Laboratory Medicine. The methodologic quality of the articles on diagnostic test research published in Clinical Chemistry and Clinical Chemistry and Laboratory Medicine is comparable to the quality observed in the best general medical journals. The methodologic aspects that most need improvement are those linked to the clinical information of the populations studied. Editorial actions aimed to increase the quality of reporting of diagnostic studies could have a relevant positive effect, as shown by the improvement observed in Clinical Chemistry.
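
    One of the criteria assessed above is the reporting of statistical uncertainty for indices of diagnostic accuracy. A minimal sketch, with placeholder counts: sensitivity and specificity from a 2x2 table with Wilson score confidence intervals.

    ```python
    import math

    def wilson_ci(k, n, z=1.96):
        """Wilson score interval for a binomial proportion k/n."""
        p = k / n
        centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
        return centre - half, centre + half

    tp, fn, tn, fp = 90, 10, 160, 40          # placeholder 2x2 table
    sens, spec = tp / (tp + fn), tn / (tn + fp)

    lo, hi = wilson_ci(tp, tp + fn)
    print(f"sensitivity = {sens:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
    lo, hi = wilson_ci(tn, tn + fp)
    print(f"specificity = {spec:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
    ```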

  17. Air Force Energy Plan 2010

    DTIC Science & Technology

    2009-11-24

    production on Air Bases • Field the Critical Asset Prioritization Methodology (CAPM) tool • Manage costs • Provide energy leadership throughout the Air... residing on military installations • Field the Critical Asset Prioritization Methodology (CAPM) tool. This CAPM tool will allow prioritization of Air... fielding of the Critical Asset Prioritization Methodology (CAPM) tool and the adoption of financial standards to enable transparency across Air...

  18. Field Quality and Fabrication Analysis of HQ02 Reconstructed Nb3Sn Coil Cross Sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holik, Eddie Frank; Ambrosio, Giorgio; Carbonara, Andrea

    2017-01-23

    The US LHC Accelerator Research Program (LARP) quadrupole HQ02 was designed and fully tested as part of the low-beta quad development for Hi-Lumi LHC. HQ02's design is well documented with full fabrication accounting along with full field analysis at low and high current. With this history, HQ02 is an excellent test bed for developing a methodology for measuring turn locations from magnet cross sections and comparing them with CAD models and the measured field. All 4 coils of HQ02 were cut in identical locations along the magnetic length corresponding to magnetic field measurement and coil metrology. A real-time camera and coordinate measuring equipment were used to plot turn corners. Measurements include systematic and random displacements of winding blocks and individual turns along the magnetic length. The range of cable shifts and the field harmonic range along the length are in agreement, although correlating turn locations and measured harmonics in each cross section is challenging.

  19. Methodology, status and plans for development and assessment of Cathare code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, along with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.

  20. Methodological approach for substantiating disease freedom in a heterogeneous small population. Application to ovine scrapie, a disease with a strong genetic susceptibility.

    PubMed

    Martinez, Marie-José; Durand, Benoit; Calavas, Didier; Ducrot, Christian

    2010-06-01

    Demonstrating disease freedom is becoming important in different fields, including animal disease control. Most methods consider sampling only from a homogeneous population in which each animal has the same probability of becoming infected. In this paper, we propose a new methodology to calculate the probability of detecting the disease if it is present in a heterogeneous population of small size with potentially different risk groups, differences in risk being defined using relative risks. To calculate this probability, for each possible arrangement of the infected animals in the different groups, the probability that all the animals tested are test-negative given this arrangement is multiplied by the probability that this arrangement occurs. The probability formula is developed under the assumption of a perfect test and hypergeometric sampling for finite small-size populations. The methodology is applied to scrapie, a disease affecting small ruminants and characterized in sheep by a strong genetic susceptibility that defines different risk groups. It illustrates that the genotypes of the tested animals heavily influence the confidence level of detecting scrapie. The results present the statistical power for substantiating disease freedom in a small heterogeneous population as a function of the design prevalence, the structure of the sample tested, the structure of the herd and the associated relative risks. (c) 2010 Elsevier B.V. All rights reserved.
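
    The enumeration described above lends itself to a direct sketch. This is a minimal reading of the abstract, not the authors' code: the arrangement probability is assumed proportional to prod_g C(N_g, d_g) * r_g**d_g, and the flock structure, relative risks and sample below are hypothetical.

    ```python
    import math
    from itertools import product as iproduct

    def detection_probability(group_sizes, relative_risks, sample_sizes, n_infected):
        """P(at least one infected animal among those tested), assuming a
        perfect test and hypergeometric sampling within each risk group."""
        ranges = [range(min(n, n_infected) + 1) for n in group_sizes]
        weights, miss_probs = [], []
        for d in iproduct(*ranges):                 # arrangements of infected animals
            if sum(d) != n_infected:
                continue
            w, p_all_neg = 1.0, 1.0
            for Ng, rg, ng, dg in zip(group_sizes, relative_risks, sample_sizes, d):
                w *= math.comb(Ng, dg) * rg**dg     # assumed arrangement weight
                # hypergeometric chance that none of the ng tested hits the dg infected
                p_all_neg *= math.comb(Ng - dg, ng) / math.comb(Ng, ng)
            weights.append(w)
            miss_probs.append(p_all_neg)
        total = sum(weights)
        p_miss = sum(w * p for w, p in zip(weights, miss_probs)) / total
        return 1.0 - p_miss

    # Hypothetical 100-head flock, three genotype groups with relative risks 1, 5, 20:
    print(detection_probability([60, 30, 10], [1.0, 5.0, 20.0], [10, 10, 5], n_infected=2))
    ```

    Sampling preferentially from the high-risk genotypes raises the detection probability for the same total sample size, which is the abstract's central point.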

  1. Characterization of gamma field in the JSI TRIGA reactor

    NASA Astrophysics Data System (ADS)

    Ambrožič, Klemen; Radulović, Vladimir; Snoj, Luka; Gruel, Adrien; Guillou, Mael Le; Blaise, Patrick; Destouches, Christophe; Barbot, Loïc

    2018-01-01

    Research reactors such as the "Jožef Stefan" Institute TRIGA reactor have primarily been designed for experimentation and sample irradiation with neutrons. However, with recent developments in incorporating additional instrumentation for nuclear power plant support and with novel high-flux material testing reactor (MTR) designs, γ field characterization has become of great interest for characterizing changes in the operational parameters of electronic devices and for evaluating γ heating of MTR structural materials in a representative reactor γ spectrum. In this paper, we present ongoing work on γ field characterization both experimentally, by performing γ field measurements, and by simulations, using Monte Carlo particle transport codes in conjunction with the R2S methodology for delayed γ field characterization.

  2. Consumer Protection Strategies: A Literature Review and Synthesis. Improving the Consumer Protection Function in Postsecondary Education.

    ERIC Educational Resources Information Center

    Helliwell, Carolyn B.; Jung, Steven M.

    Summarized are the findings of an American Institutes for Research (AIR) project to field test a data capture and dissemination system that would provide information for improving consumer protection in postsecondary education. Presented is a discussion of the methodology used, examples of consumer abuses cited in the literature, an analysis of…

  3. National Study of Postsecondary Faculty (NSOPF:04) Field Test Methodology Report, 2004. Working Paper Series. NCES 2004-01

    ERIC Educational Resources Information Center

    Heuer, R. E.; Cahalan, M.; Fahimi, M.; Curry-Tucker, J. L.; Carley-Baxter, L.; Curtin, T. R.; Hinsdale, M.; Jewell, D. M.; Kuhr, B. D.; McLean, L.

    2004-01-01

    The 2004 National Study of Postsecondary Faculty (NSOPF:04), conducted by RTI International (RTI) and sponsored by the U.S. Department of Education's National Center for Education Statistics (NCES), is a nationally representative study that collects data regarding the characteristics, workload, and career paths of full- and part-time…

  4. High-throughput fabrication and screening improves gold nanoparticle chemiresistor sensor performance.

    PubMed

    Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard

    2015-02-09

    Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays, a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library consisting of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram-per-liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which, in combination with random forests analysis, was able to predict individual analyte concentrations with overall root-mean-square errors ranging between 8 and 17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications for future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening for improving the performance of gold nanoparticle based chemiresistors for both new and existing applications.
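
    To make the random-forests step concrete, the sketch below regresses analyte concentrations on array responses and reports per-analyte RMSE. Everything about the data is invented (array size, linear mixing model, noise level); only the analysis pattern follows the abstract.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_sensors = 200, 24                       # hypothetical 24-sensor array
    conc = rng.uniform(0, 100, size=(n_samples, 5))      # BTEXN concentrations, ug/L
    mixing = rng.normal(size=(5, n_sensors))             # synthetic sensor sensitivities
    response = conc @ mixing + rng.normal(scale=5.0, size=(n_samples, n_sensors))

    X_tr, X_te, y_tr, y_te = train_test_split(response, conc, random_state=0)
    model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

    rmse = np.sqrt(((model.predict(X_te) - y_te) ** 2).mean(axis=0))
    names = ["benzene", "toluene", "ethylbenzene", "p-xylene", "naphthalene"]
    print(dict(zip(names, rmse.round(2))))
    ```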

  5. [Effect of genotype and day or night time of testing on mice behavior in the light-dark box and the open-field tests].

    PubMed

    Morozova, M V; Kulikov, A V

    2010-01-01

    The light-dark box (LDB) and the open-field (OF) tests are widespread experimental models for studying locomotion and anxiety in laboratory rats and mice. The fact that rodents are nocturnal animals and more active at night raises a critical question of whether behavioral experiments carried out in the light phase are methodologically correct. Parameters of behavior of four mouse strains (C57BL/6J, DBA/2J, AKR/J and CBA/LacJ) in the light-dark box and open-field tests in the light and dark phases were compared. No significant influence of the phase of testing on anxiety in the LDB and OF tests was revealed. In the OF test, CBA mice showed increased locomotor activity, whereas AKR and C57BL/6 mice showed increased defecation in the dark phase. It was concluded that: 1) the phase of testing is not crucial for the expression of anxiety in the LDB and OF; 2) the sensitivity to the phase of testing depends on the genotype; 3) the indices of behavior in the genotypes sensitive to the phase of testing (locomotion in the CBA and defecation in the AKR and C57BL/6 mouse strains) are increased in the dark phase.

  6. A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinn, Heather M; Graham, Paul S; Morgan, Keith S

    2008-01-01

    Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both could be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper describes a three-tiered methodology for testing FPGA user designs for space-readiness. We describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
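
    The voting principle behind TMR, and the flavor of fault injection described above, can be shown with a toy model. This is not the authors' tool flow (real FPGA fault injection targets the configuration memory); it only illustrates why a single upset in one replica is masked.

    ```python
    import random

    def majority(a, b, c):
        """Bitwise 2-of-3 voter, the core of triple-modular redundancy."""
        return (a & b) | (a & c) | (b & c)

    def run_with_fault(value, bit):
        """Triplicate a result, inject one single-event upset, and vote."""
        copies = [value, value, value]
        victim = random.randrange(3)      # which replica is struck
        copies[victim] ^= (1 << bit)      # flip one bit in that replica
        return majority(*copies)

    # A single upset in any one replica is always outvoted:
    assert all(run_with_fault(0b1011_0010, b) == 0b1011_0010 for b in range(8))
    print("single upsets masked by TMR")
    ```

    Faulty TMR application, in this picture, corresponds to a voter or replica that is not actually independent, which is exactly what fault-injection campaigns are meant to catch.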

  7. Mediators and moderators in early intervention research.

    PubMed

    Breitborde, Nicholas J K; Srihari, Vinod H; Pollard, Jessica M; Addington, Donald N; Woods, Scott W

    2010-05-01

    The goal of this paper is to provide clarification with regard to the nature of mediator and moderator variables and the statistical methods used to test for the existence of these variables. Particular attention will be devoted to discussing the ways in which the identification of mediator and moderator variables may help to advance the field of early intervention in psychiatry. We completed a literature review of the methodological strategies used to test for mediator and moderator variables. Although several tests for mediator variables are currently available, recent evaluations suggest that tests which directly evaluate the indirect effect are superior. With regard to moderator variables, two approaches ('pick-a-point' and regions of significance) are available, and we provide guidelines with regard to how researchers can determine which approach may be most appropriate to use for their specific study. Finally, we discuss how to evaluate the clinical importance of mediator and moderator relationships as well as the methodology to calculate statistical power for tests of mediation and moderation. Further exploration of mediator and moderator variables may provide valuable information with regard to interventions provided early in the course of a psychiatric illness.
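
    A hedged sketch of one such direct evaluation of the indirect effect, via a percentile bootstrap of the product of the X→M and M→Y paths; the causal model, effect sizes and data are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 300
    x = rng.normal(size=n)                       # intervention exposure
    m = 0.5 * x + rng.normal(size=n)             # mediator
    y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # outcome

    def indirect(idx):
        a = np.polyfit(x[idx], m[idx], 1)[0]                 # slope of M on X
        X = np.column_stack([x[idx], m[idx], np.ones(len(idx))])
        b = np.linalg.lstsq(X, y[idx], rcond=None)[0][1]     # slope of Y on M given X
        return a * b

    boots = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")   # excludes 0 -> mediation
    ```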

  8. Investigating transport pathways in the ocean

    NASA Astrophysics Data System (ADS)

    Griffa, Annalisa; Haza, Angelique; Özgökmen, Tamay M.; Molcard, Anne; Taillandier, Vincent; Schroeder, Katrin; Chang, Yeon; Poulain, P.-M.

    2013-01-01

    The ocean is a very complex medium, with scales of motion that range from thousands of kilometers down to the dissipation scales. Transport by ocean currents plays an important role in many practical applications, ranging from climatic problems to coastal management and accident mitigation at sea. Understanding transport is challenging because of the chaotic nature of particle motion. In the last decade, new methods have been put forth to improve our understanding of transport. Powerful tools are provided by dynamical systems theory, which allows the identification of the barriers to transport and their time variability for a given flow. A shortcoming of this approach, though, is that it is based on the assumption that the velocity field is known with good accuracy, which is not always the case in practical applications. Improving model performance in terms of transport can be addressed using another important methodology that has been developed recently, namely the assimilation of Lagrangian data provided by floating buoys. The two methodologies are technically different but in many ways complementary. In this paper, we review examples of applications of both methodologies performed by the authors in the last few years, considering flows at different scales and in various ocean basins. The results are among the very first examples of applications of these methodologies to the real ocean, including testing with Lagrangian in situ data. The results are discussed in the general framework of the fields related to these methodologies, pointing out open questions and potential for improvements, with an outlook toward future strategies.

  9. Geomatics for Maritime Parks and Preserved Areas

    NASA Astrophysics Data System (ADS)

    Lo Tauro, Agata

    2009-11-01

    The aim of this research is to use hyperspectral MIVIS data for the protection of sensitive cultural and natural resources, nature reserves and maritime parks. Knowledge of the distribution of submerged vegetation is useful for monitoring the health of ecosystems in coastal areas. The objective of this project was to develop a new methodology within a geomatic environment to facilitate analysis and application by local institutions that are not familiar with spatial analysis software, and to enable new research activities in this field of study. Field controls may be carried out with the support of accurate in situ analysis in order to determine the training sites for the tested classification. The methodology applied demonstrates that the combination of hyperspectral sensors and ESA Remote Sensing (RS) data can be used to produce thematic cartography of submerged vegetation and land use analysis for sustainable development. This project will be implemented for innovative educational and research programmes.

  10. HEMATOPOIETIC STEM CELL GENE THERAPY: ASSESSING THE RELEVANCE OF PRE-CLINICAL MODELS

    PubMed Central

    Larochelle, Andre; Dunbar, Cynthia E.

    2013-01-01

    The modern laboratory mouse has become a central tool for biomedical research, with a notable influence in the field of hematopoiesis. Application of retroviral-based gene transfer approaches to mouse hematopoietic stem cells (HSCs) has led to a sophisticated understanding of the hematopoietic hierarchy in this model. However, the assumption that gene transfer methodologies developed in the mouse could be similarly applied to human HSCs for the treatment of human diseases left the field of gene therapy in a decade-long quandary. It was not until more relevant humanized xenograft mouse models and phylogenetically related large animal species were used to optimize gene transfer methodologies that unequivocal clinical successes were achieved. However, the subsequent reporting of severe adverse events in these clinical trials cast doubt on the predictive value of conventional pre-clinical testing and encouraged the development of new assays for assessing the relative genotoxicity of various vector designs. PMID:24014892

  11. Diagnosing kidney disease in the genetic era.

    PubMed

    Prakash, Sindhuri; Gharavi, Ali G

    2015-07-01

    Recent technological improvements have increased the use of genetic testing in the clinic. This review serves to summarize the many practical benefits of genetic testing, discusses various methodologies that can be used clinically, and exemplifies ways in which genetics is propelling the field forward in nephrology. The advent of next-generation sequencing and microarray technologies has heralded an unprecedented number of discoveries in the field of nephrology, providing many opportunities for incorporating genomic diagnostics into clinical care. The use of genetic testing, particularly in pediatrics, can provide accurate diagnoses in puzzling cases, resolve misclassification of disease, and identify subsets of individuals with treatable conditions. Genetic testing may have broad benefits for patients and their families. Knowing the precise molecular etiology of disease can help clinicians determine the exact therapeutic course, and counsel patients and their families about prognosis. Genetic discoveries can also improve the classification of kidney disease and identify new targets for therapy.

  12. Sweep-twist adaptive rotor blade : final project report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashwill, Thomas D.

    2010-02-01

    Knight & Carver was contracted by Sandia National Laboratories to develop a Sweep Twist Adaptive Rotor (STAR) blade that reduced operating loads, thereby allowing a larger, more productive rotor. The blade design used outer blade sweep to create twist coupling without angled fiber. Knight & Carver successfully designed, fabricated, tested and evaluated STAR prototype blades. Through laboratory and field tests, Knight & Carver showed the STAR blade met the engineering design criteria and economic goals for the program. A STAR prototype was successfully tested in Tehachapi during 2008 and a large data set was collected to support engineering and commercial development of the technology. This report documents the methodology used to develop the STAR blade design and reviews the approach used for laboratory and field testing. The effort demonstrated that STAR technology can provide significantly greater energy capture without higher operating loads on the turbine.

  13. Transport and attenuation of carboxylate-modified latex microspheres in fractured rock laboratory and field tracer tests

    USGS Publications Warehouse

    Becker, M.W.; Reimus, P.W.; Vilks, P.

    1999-01-01

    Understanding colloid transport in ground water is essential to assessing the migration of colloid-size contaminants, the facilitation of dissolved contaminant transport by colloids, in situ bioremediation, and the health risks of pathogen contamination in drinking water wells. Much has been learned through laboratory and field-scale colloid tracer tests, but progress has been hampered by a lack of consistent tracer testing methodology at different scales and fluid velocities. This paper presents laboratory and field tracer tests in fractured rock that use the same type of colloid tracer over an almost three orders-of-magnitude range in scale and fluid velocity. Fluorescently-dyed carboxylate-modified latex (CML) microspheres (0.19 to 0.98 μm diameter) were used as tracers in (1) a naturally fractured tuff sample, (2) a large block of naturally fractured granite, (3) a fractured granite field site, and (4) another fractured granite/schist field site. In all cases, the mean transport time of the microspheres was shorter than the solutes, regardless of detection limit. In all but the smallest scale test, only a fraction of the injected microsphere mass was recovered, with the smaller microspheres being recovered to a greater extent than the larger microspheres. Using existing theory, we hypothesize that the observed microsphere early arrival was due to volume exclusion and attenuation was due to aggregation and/or settling during transport. In most tests, microspheres were detected using flow cytometry, which proved to be an excellent method of analysis. CML microspheres appear to be useful tracers for fractured rock in forced gradient and short-term natural gradient tests, but longer residence times may result in small microsphere recoveries.

  14. Future Combat System Spinout 1 Technical Field Test - Establishing and Implementing Models and Simulations System of Systems Verification, Validation and Accreditation Practices, Methodologies and Procedures

    DTIC Science & Technology

    2009-11-24

    assisted by the Brigade Combat Team (BCT) Modernization effort, the use of Models and Simulations (M&S) becomes more crucial in supporting major... in 2008 via a slice of the Current Force (CF) BCT structure. To ensure realistic operational context, an M&S System-of-Systems (SoS) level... messages, and constructive representation of platforms, vehicles, and terrain. The M&S federation also provided test control, data collection, and live

  15. Methodology discourses as boundary work in the construction of engineering education.

    PubMed

    Beddoes, Kacey

    2014-04-01

    Engineering education research is a new field that emerged in the social sciences over the past 10 years. This analysis of engineering education research demonstrates that methodology discourses have played a central role in the construction and development of the field of engineering education, and that they have done so primarily through boundary work. This article thus contributes to science and technology studies literature by examining the role of methodology discourses in an emerging social science field. I begin with an overview of engineering education research before situating the case within relevant bodies of literature on methodology discourses and boundary work. I then identify two methodology discourses--rigor and methodological diversity--and discuss how they contribute to the construction and development of engineering education research. The article concludes with a discussion of how the findings relate to prior research on methodology discourses and boundary work and implications for future research.

  16. Kinetic Monte Carlo simulations for transient thermal fields: Computational methodology and application to the submicrosecond laser processes in implanted silicon.

    PubMed

    Fisicaro, G; Pelaz, L; Lopez, P; La Magna, A

    2012-09-01

    Pulsed laser irradiation of damaged solids promotes ultrafast nonequilibrium kinetics, on the submicrosecond scale, leading to microscopic modifications of the material state. Reliable theoretical predictions of this evolution can be achieved only by simulating particle interactions in the presence of large and transient gradients of the thermal field. We propose a kinetic Monte Carlo (KMC) method for the simulation of damaged systems in the extremely far-from-equilibrium conditions caused by laser irradiation. The reference systems are nonideal crystals containing point defect excesses an order of magnitude larger than the equilibrium density, due to a preirradiation ion implantation process. The thermal problem, including eventual melting, is solved within the phase-field methodology, and the numerical solutions for the space- and time-dependent thermal field are then dynamically coupled to the KMC code. The formalism, implementation, and related tests of our computational code are discussed in detail. As an application example, we analyze the evolution of the defect system caused by P ion implantation in Si under nanosecond pulsed irradiation. The simulation results suggest a significant annihilation of the implantation damage, which can be well controlled by the laser fluence.
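
    The coupling pattern can be sketched in a few lines: a rejection-free KMC step whose event rates are re-evaluated from the instantaneous temperature. This is a schematic reading, not the authors' code; the Arrhenius parameters and exponential temperature transient are placeholders, and the variation of temperature within a single step is ignored.

    ```python
    import math, random

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def arrhenius(prefactor_hz, barrier_ev, temp_k):
        """Thermally activated event rate."""
        return prefactor_hz * math.exp(-barrier_ev / (K_B * temp_k))

    def kmc_step(t, temp_of_t, events):
        """One rejection-free KMC step; events = [(name, prefactor_hz, barrier_ev)]."""
        temp = temp_of_t(t)
        rates = [arrhenius(nu, ea, temp) for _, nu, ea in events]
        total = sum(rates)
        dt = -math.log(1.0 - random.random()) / total   # exponential waiting time
        pick, acc = random.random() * total, 0.0
        for (name, _, _), rate in zip(events, rates):
            acc += rate
            if pick <= acc:
                return dt, name
        return dt, events[-1][0]

    # Hypothetical nanosecond-scale transient after a pulse, two competing events:
    temp_profile = lambda t: 300.0 + 1200.0 * math.exp(-t / 50e-9)
    events = [("vacancy_hop", 1e13, 0.45), ("defect_annihilation", 1e13, 0.70)]
    t = 0.0
    for _ in range(5):
        dt, name = kmc_step(t, temp_profile, events)
        t += dt
        print(f"t = {t:.3e} s: {name}")
    ```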

  17. A field assessment of the value of steady shape hydraulic tomography for characterization of aquifer heterogeneities

    USGS Publications Warehouse

    Bohling, Geoffrey C.; Butler, James J.; Zhan, Xiaoyong; Knoll, Michael D.

    2007-01-01

    Hydraulic tomography is a promising approach for obtaining information on variations in hydraulic conductivity on the scale of relevance for contaminant transport investigations. This approach involves performing a series of pumping tests in a format similar to tomography. We present a field‐scale assessment of hydraulic tomography in a porous aquifer, with an emphasis on the steady shape analysis methodology. The hydraulic conductivity (K) estimates from steady shape and transient analyses of the tomographic data compare well with those from a tracer test and direct‐push permeameter tests, providing a field validation of the method. Zonations based on equal‐thickness layers and cross‐hole radar surveys are used to regularize the inverse problem. The results indicate that the radar surveys provide some useful information regarding the geometry of the K field. The steady shape analysis provides results similar to the transient analysis at a fraction of the computational burden. This study clearly demonstrates the advantages of hydraulic tomography over conventional pumping tests, which provide only large‐scale averages, and small‐scale hydraulic tests (e.g., slug tests), which cannot assess strata connectivity and may fail to sample the most important pathways or barriers to flow.

  18. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    NASA Astrophysics Data System (ADS)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed to identify a good standard methodology for surveying very narrow spaces during 3D investigation of Cultural Heritage. This is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural and archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant field-of-view advantage compared with rectilinear lenses. This advantage alone can be crucial to reduce the total number of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral staircase located in the Duomo di Milano, with a total height of 25 meters, characterized by a narrow walkable space about 70 centimetres wide.
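
    The paper's general GSD formulation is not reproduced here, but the underlying geometry can be sketched for the standard fisheye projection models r = f*g(theta): one pixel of pitch p subtends d(theta) = p / (f*g'(theta)), giving a footprint of roughly D*d(theta) at object distance D, measured across the viewing ray. The focal length, pixel pitch and distance below are hypothetical.

    ```python
    import math

    PROJECTIONS = {  # r = f * g(theta)  ->  g'(theta)
        "rectilinear":   lambda th: 1.0 / math.cos(th) ** 2,
        "equidistant":   lambda th: 1.0,
        "stereographic": lambda th: 1.0 / math.cos(th / 2) ** 2,
        "equisolid":     lambda th: math.cos(th / 2),
    }

    def gsd(model, theta_deg, focal_mm, pixel_um, distance_m):
        """Approximate footprint (m) of one pixel at off-axis angle theta."""
        th = math.radians(theta_deg)
        d_theta = (pixel_um * 1e-3) / (focal_mm * PROJECTIONS[model](th))
        return distance_m * d_theta

    for model in PROJECTIONS:
        print(model, round(gsd(model, 60, focal_mm=8, pixel_um=4.8, distance_m=0.7), 6))
    ```

    The comparison makes the design trade-off visible: rectilinear lenses lose angular resolution per pixel toward the edge of the frame much faster than fisheye projections do.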

  19. Efficient testing methodologies for microcameras in a gigapixel imaging system

    NASA Astrophysics Data System (ADS)

    Youn, Seo Ho; Marks, Daniel L.; McLaughlin, Paul O.; Brady, David J.; Kim, Jungsang

    2013-04-01

    Multiscale parallel imaging--based on a monocentric optical design--promises revolutionary advances in diverse imaging applications by enabling high resolution, real-time image capture over a wide field-of-view (FOV), including sport broadcast, wide-field microscopy, astronomy, and security surveillance. The recently demonstrated AWARE-2 is a gigapixel camera consisting of an objective lens and 98 microcameras spherically arranged to capture an image over a FOV of 120° by 50°, using computational image processing to form a composite image of 0.96 gigapixels. Since microcameras are capable of individually adjusting exposure, gain, and focus, true parallel imaging is achieved with a high dynamic range. From the integration perspective, manufacturing and verifying consistent quality of microcameras is a key to successful realization of AWARE cameras. We have developed an efficient testing methodology that utilizes a precisely fabricated dot grid chart as a calibration target to extract critical optical properties such as optical distortion, veiling glare index, and modulation transfer function to validate imaging performance of microcameras. This approach utilizes an AWARE objective lens simulator which mimics the actual objective lens but operates with a short object distance, suitable for a laboratory environment. Here we describe the principles of the methodologies developed for AWARE microcameras and discuss the experimental results with our prototype microcameras. Reference: Brady, D. J., Gehm, M. E., Stack, R. A., Marks, D. L., Kittle, D. S., Golish, D. R., Vera, E. M., and Feller, S. D., "Multiscale gigapixel photography," Nature 486, 386-389 (2012).

  20. HDTS 2017.1 Testing and Verification Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteside, T.

    2017-12-01

    This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents including (Foley and Powell, 2010; Dixon, 2012; Whiteside, 2017b). In this report we have created a suite of automated test cases and a system to analyze the results of those tests, as well as documented the methodology to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect.

  1. Institutions and national development in Latin America: a comparative study

    PubMed Central

    Portes, Alejandro; Smith, Lori D.

    2013-01-01

    We review the theoretical and empirical literatures on the role of institutions in national development as a prelude to presenting a more rigorous and measurable definition of the concept and a methodology to study this relationship at the national and subnational levels. The existing research literature features conflicting definitions of the concept of "institutions" and empirical tests based mostly on reputational indices, with countries as units of analysis. The present study's methodology is based on a set of five strategic organizations studied comparatively in five Latin American countries. These include key federal agencies, public administrative organizations, and stock exchanges. Systematic analysis of the results shows a pattern of differences between economically oriented institutions and those entrusted with providing basic services to the general population. Consistent differences in institutional quality also emerge across countries, despite similar levels of economic development. Using the algebraic methods developed by Ragin, we test six hypotheses about factors determining the developmental character of particular institutions. Implications of the results for theory and for methodological practices of future studies in this field are discussed. PMID:26543407

  2. Seasonal hazards and health risks in lower-income countries: field testing a multi-disciplinary approach.

    PubMed

    Few, Roger; Lake, Iain; Hunter, Paul R; Tran, Pham Gia; Thien, Vu Trong

    2009-12-21

    Understanding how risks to human health change as a result of seasonal variations in environmental conditions is likely to become of increasing importance in the context of climatic change, especially in lower-income countries. A multi-disciplinary approach can be a useful tool for improving understanding, particularly in situations where existing data resources are limited but the environmental health implications of seasonal hazards may be high. This short article describes a multi-disciplinary approach combining analysis of changes in levels of environmental contamination, seasonal variations in disease incidence and a social scientific analysis of health behaviour. The methodology was field-tested in a peri-urban environment in the Mekong Delta, Vietnam, where poor households face alternate seasonal extremes in the local environment as the water level in the Delta changes from flood to dry season. Low-income households in the research sites rely on river water for domestic uses, including provision of drinking water, and it is commonly perceived that the seasonal changes alter risk from diarrhoeal diseases and other diseases associated with contamination of water. The discussion focuses on the implementation of the methodology in the field, and draws lessons from the research process that can help in refining and developing the approach for application in other locations where seasonal dynamics of disease risk may have important consequences for public health.

  3. Aerothermal Ground Testing of Flexible Thermal Protection Systems for Hypersonic Inflatable Aerodynamic Decelerators

    NASA Technical Reports Server (NTRS)

    Bruce, Walter E., III; Mesick, Nathaniel J.; Ferlemann, Paul G.; Siemers, Paul M., III; DelCorso, Joseph A.; Hughes, Stephen J.; Tobin, Steven A.; Kardell, Matthew P.

    2012-01-01

    Flexible TPS development involves the ground testing and analysis necessary to characterize performance of the FTPS candidates prior to flight testing. This paper provides an overview of the analysis and ground testing efforts performed over the last year at the NASA Langley Research Center and in the Boeing Large-Core Arc Tunnel (LCAT). In the LCAT test series, material layups were subjected to aerothermal loads commensurate with peak re-entry conditions enveloping a range of HIAD mission trajectories. The FTPS layups were tested over a heat flux range from 20 to 50 W/cm2 with associated surface pressures of 3 to 8 kPa. To support the testing effort, a significant redesign of the existing shear (wedge) model holder from previous testing efforts was undertaken to develop a new test technique for supporting and evaluating the FTPS in the high-temperature, arc jet flow. Since the FTPS test samples typically experience a geometry change during testing, computational fluid dynamic (CFD) models of the arc jet flow field and test model were developed to support the testing effort. The CFD results were used to help determine the test conditions experienced by the test samples as the surface geometry changed. This paper includes an overview of the Boeing LCAT facility, the general approach for testing FTPS, CFD analysis methodology and results, model holder design and test methodology, and selected thermal results of several FTPS layups.

  4. Joint proceedings, Second Genetics Workshop of the Society of American Foresters and Seventh Lake States Forest Tree Improvement Conference, October 21-23, 1965.

    Treesearch

    NCFES

    1966-01-01

    Included are (1) 22 technical papers (by researchers from many sections of the United States and Canada) pertaining to selection and progeny testing, radiation genetics, intraspecific variation, natural and artificial hybridization, breeding systems, breeding methodology and specialized tree breeding techniques, and applied breeding and allied fields; (2) concise...

  5. Soil properties differently influence estimates of soil CO2 efflux from three chamber-based measurement systems

    Treesearch

    John R. Butnor; Kurt H. Johnsen; Chris A. Maier

    2005-01-01

    Soil CO2 efflux is a major component of net ecosystem productivity (NEP) of forest systems. Combining data from multiple researchers for larger-scale modeling and assessment will only be valid if their methodologies provide directly comparable results. We conducted a series of laboratory and field tests to assess the presence and magnitude of...

  6. A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.

    PubMed

    Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut

    2017-08-01

    Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Even today, their quantification and taxonomic classification pose several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods: the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single-cell survey called aliquot PCR (aPCR). All these methods have been tested either using aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in every single case. With the live-counting technique, detection of living cells up to the morphospecies level is possible. Fixation of cells and staining methods are advantageous due to the possibility of long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools might compensate for the deficiency of LAM in terms of the missing detection of non-cultivable flagellates. In summary, we propose a combination of several investigation techniques to reduce the gap between the different methodological problems. Copyright © 2017 Elsevier GmbH. All rights reserved.

  7. Application of CCG Sensors to a High-Temperature Structure Subjected to Thermo-Mechanical Load

    PubMed Central

    Xie, Weihua; Meng, Songhe; Jin, Hua; Du, Chong; Wang, Libin; Peng, Tao; Scarpa, Fabrizio; Xu, Chenghai

    2016-01-01

    This paper presents a simple methodology to perform a high temperature coupled thermo-mechanical test using ultra-high temperature ceramic (UHTC) material specimens, which are equipped with chemical composition grating (CCG) sensors. The methodology also considers the presence of coupled loading within the response provided by the CCG sensors. The theoretical strain of the UHTC specimens calculated with this technique shows a maximum relative error of 2.15% between the analytical and experimental data. To further verify the validity of the test results, a Finite Element (FE) model has been developed to simulate the temperature, stress and strain fields within the UHTC structure equipped with the CCG. The results show that the compressive stress exceeds the material strength at the bonding area, which causes a failure by fracture of the supporting structure in the hot environment. The results for the strain fields show that the relative error with respect to the experimental data decreases with an increase in temperature. The relative error is less than 15% when the temperature is higher than 200 °C, and only 6.71% at 695 °C. PMID:27754356

  8. EAACI position paper for practical patch testing in allergic contact dermatitis in children.

    PubMed

    de Waard-van der Spek, Flora B; Darsow, Ulf; Mortz, Charlotte G; Orton, David; Worm, Margitta; Muraro, Antonella; Schmid-Grendelmeier, Peter; Grimalt, Ramon; Spiewak, Radoslaw; Rudzeviciene, Odilija; Flohr, Carsten; Halken, Susanne; Fiocchi, Alessandro; Borrego, Luis Miguel; Oranje, Arnold P

    2015-11-01

    Allergic contact dermatitis (ACD) in children appears to be on the increase, and contact sensitization may already begin in infancy. The diagnosis of contact dermatitis requires a careful evaluation of a patient's clinical history, physical examination, and skin testing. Patch testing is the gold standard diagnostic test. Based on consensus, the EAACI Task Force on Allergic Contact Dermatitis in Children produced this document to provide details on clinical aspects, the standardization of patch test methodology, and suggestions for future research in the field. We provide a baseline list of test allergens to be tested in children with suspected ACD. Additional tests should be performed only on specific indications. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Engine dynamic analysis with general nonlinear finite element codes. II - Bearing element implementation, overall numerical characteristics and benchmarking

    NASA Technical Reports Server (NTRS)

    Padovan, J.; Adams, M.; Lam, P.; Fertis, D.; Zeid, I.

    1982-01-01

    Second-year efforts within a three-year study to develop and extend finite element (FE) methodology to efficiently handle the transient/steady state response of rotor-bearing-stator structures associated with gas turbine engines are outlined. The two main areas aim at (1) implanting the squeeze film damper element into a general purpose FE code for testing and evaluation; and (2) determining the numerical characteristics of the FE-generated rotor-bearing-stator simulation scheme. The governing FE field equations are set out and the solution methodology is presented. The choice of ADINA as the general-purpose FE code is explained, and the numerical operational characteristics of the direct integration approach to FE-generated rotor-bearing-stator simulations are determined, including benchmarking, comparison of explicit vs. implicit direct integration methodologies, and demonstration problems.

  10. Field significance of performance measures in the context of regional climate model evaluation. Part 1: temperature

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker

    2018-04-01

    A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as "field" or "global" significance. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is exemplarily applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Monthly temperature climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. In winter and in most regions in summer, the downscaled distributions are statistically indistinguishable from the observed ones. A systematic cold summer bias occurs in deep river valleys due to overestimated elevations, in coastal areas due probably to enhanced sea breeze circulation, and over large lakes due to the interpolation of water temperatures. Urban areas in concave topography forms have a warm summer bias due to the strong heat islands, not reflected in the observations. WRF-NOAH generates appropriate fine-scale features in the monthly temperature field over regions of complex topography, but over spatially homogeneous areas even small biases can lead to significant deteriorations relative to the driving reanalysis. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
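
    One common route to controlling "the proportion of falsely rejected local tests" is the Benjamini-Hochberg false discovery rate procedure applied to the grid of local p-values; treating that as the intended control is an assumption here, and the p-value field below is synthetic.

    ```python
    import numpy as np

    def field_significant(p_values, alpha=0.05):
        """Boolean mask of locally significant grid cells under BH-FDR."""
        p = np.asarray(p_values).ravel()
        order = np.argsort(p)
        thresh = alpha * np.arange(1, p.size + 1) / p.size
        passed = p[order] <= thresh
        k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
        mask = np.zeros(p.size, dtype=bool)
        mask[order[:k]] = True                  # reject the k smallest p-values
        return mask.reshape(np.shape(p_values))

    rng = np.random.default_rng(2)
    p_field = rng.uniform(size=(40, 50))                      # null cells
    p_field[:10, :10] = rng.uniform(0, 0.01, size=(10, 10))   # a true signal patch
    print("significant cells:", field_significant(p_field).sum())
    ```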

  11. The effects of man-made smokes and battlefield-induced smokes on the propagation of electromagnetic radiation

    NASA Astrophysics Data System (ADS)

    Vandewal, Anthony

    1993-11-01

    This paper provides an unclassified overview of the U.S. Army program that collects and disseminates information about the effects of battlefield smokes and obscurants on weapon system performance. The primary mechanism for collecting field data is an annual exercise called SMOKE WEEK. In SMOKE WEEK testing, a complete characterization is made of the ambient test conditions, of the electromagnetic radiation propagation in clear and obscured conditions, and of the obscuring cloud and the particles that comprise it. This paper describes the instrumentation and methodology employed to make these field measurements, methods of analysis, and some typical results. The effects of these realistic battlefield environments on weapon system performance are discussed generically.

  12. Design Methodology of an Equalizer for Unipolar Non Return to Zero Binary Signals in the Presence of Additive White Gaussian Noise Using a Time Delay Neural Network on a Field Programmable Gate Array

    PubMed Central

    Pérez Suárez, Santiago T.; Travieso González, Carlos M.; Alonso Hernández, Jesús B.

    2013-01-01

    This article presents a methodology for designing an artificial neural network as an equalizer for a binary signal. Firstly, the system is modelled in floating point format using Matlab. Afterward, the design is described for a Field Programmable Gate Array (FPGA) using fixed point format. The FPGA design is based on the System Generator from Xilinx, which is a design tool over Simulink of Matlab. System Generator allows one to design in a fast and flexible way, working with low-level details of the circuits, and the functionality of the system can be fully tested. System Generator can be used to check the architecture and to analyse the effect of the number of bits on the system performance. Finally, the System Generator design is compiled for the Xilinx Integrated System Environment (ISE) and the system is described using a hardware description language. In ISE the circuits are managed with high-level details and physical performances are obtained. In the Conclusions section, some modifications are proposed to improve the methodology and to ensure portability across FPGA manufacturers.
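
    As a floating-point counterpart to the first design stage, here is a minimal sketch of a tapped-delay-line ("time delay") neural network equalizing a unipolar NRZ signal in AWGN. The channel, network size and training schedule are invented; the fixed-point FPGA stage is not modelled.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    bits = rng.integers(0, 2, 4000).astype(float)      # unipolar NRZ symbols
    channel = np.array([0.9, 0.4, 0.2])                # hypothetical ISI channel
    rx = np.convolve(bits, channel, mode="full")[: len(bits)]
    rx += rng.normal(scale=0.3, size=rx.shape)         # additive white Gaussian noise

    TAPS = 5                                           # tapped delay line length
    X = np.column_stack([np.roll(rx, k) for k in range(TAPS)])[TAPS:]
    y = bits[TAPS:]

    # One tanh hidden layer trained by plain gradient descent on logistic loss.
    W1 = rng.normal(scale=0.1, size=(TAPS, 8)); b1 = np.zeros(8)
    w2 = rng.normal(scale=0.1, size=8); b2 = 0.0
    for _ in range(300):
        h = np.tanh(X @ W1 + b1)
        p = 1 / (1 + np.exp(-(h @ w2 + b2)))
        g = (p - y) / len(y)                           # dLoss/dlogit
        w2 -= 0.5 * h.T @ g; b2 -= 0.5 * g.sum()
        gh = np.outer(g, w2) * (1 - h**2)
        W1 -= 0.5 * X.T @ gh; b1 -= 0.5 * gh.sum(axis=0)

    ber = np.mean((p > 0.5) != y.astype(bool))
    print(f"training BER after equalization: {ber:.4f}")
    ```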

  13. Anthropogenic microfibres pollution in marine biota. A new and simple methodology to minimize airborne contamination.

    PubMed

    Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi

    2016-12-15

    Research studies on the effects of microlitter on marine biota have become more and more frequent in the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from airborne fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps at high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices to isolate the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology provide the benefit that it could be applied not only in the laboratory but also in field or on-board work. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Developing Methods for Detection of Munitions and Explosives of Concern in Offshore Wind Energy Areas

    NASA Astrophysics Data System (ADS)

    DuVal, C.; Trembanis, A. C.; Miller, J. K.; Carton, G.

    2016-12-01

    Munitions and Explosives of Concern (MEC) have been acknowledged globally as a topic of concern. Increasing use of coastal and continental shelf environments for renewable energy development and other activities has placed and continues to place humans in contact with legacy military munitions. The Bureau of Ocean Energy Management (BOEM) recognized the need to develop guidance concerning methods for MEC detection in the case of offshore energy development. The study was designed to identify the MEC most likely to be encountered in the Atlantic Outer Continental Shelf (OCS) Wind Energy Areas (WEA), review available technologies, and develop a process for selecting appropriate technologies and methodologies for their detection. The process for selecting and optimizing technologies and methods for detection of MEC in BOEM OCS WEAs was developed and tested through the synthesis of historical research, physical site characterization, remote sensing technology review, and in-field trials. To test the selected approach, designated personnel were tasked with seeding a portion of the Delaware WEA with munitions surrogates, while a second group of researchers not privy to the surrogate locations tested and optimized the selected methodology. The effectiveness of a methodology will be related to ease of detection and other associated parameters. The approach for the in-field trial consists of a combination of wide-area assessment surveying by a vessel-mounted 230/550 kHz Edgetech 6205 phase-measuring sonar and near-seafloor surveying using a Teledyne Gavia autonomous underwater vehicle (AUV) equipped with a high-resolution 900/1800 kHz Marine Sonics side-scan sonar, a Geometrics G880-AUV cesium-vapor magnetometer, and a 2-megapixel Point Grey color camera. Survey parameters (e.g. track-line spacing, coverage overlap, AUV altitude) were varied to determine the optimal survey methods, as well as to simulate MEC burial to test magnetometer range performance. Preliminary results indicate the combination of high-resolution, near-bed side-scan sonar and magnetometry yields promising results for MEC identification, addressing the potential for both surficial and buried MEC.

  15. On the design of innovative heterogeneous tests using a shape optimization approach

    NASA Astrophysics Data System (ADS)

    Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.

    2018-05-01

    The development of full-field measurement methods has enabled a new trend in mechanical tests. By providing the inhomogeneous strain field of a test, these techniques are being widely used in sheet metal identification strategies based on heterogeneous mechanical tests. This work aims at a heterogeneous mechanical test with an innovative tool/specimen shape capable of producing rich heterogeneous strain paths that provide extensive information on material behavior. The specimen is found using a shape optimization process in which a dedicated indicator that evaluates the richness of the strain information is used. The methodology and results presented here are made independent of a particular specimen geometry, and of the geometry parametrization, through the use of the Ritz method for boundary value problems. Different curve models, such as Splines, B-Splines and NURBS, are used, and C1 continuity throughout the specimen is guaranteed. Moreover, various optimization methods, deterministic and stochastic, are used in order to find the method, or a combination of methods, able to effectively minimize the cost function.
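
    One ingredient mentioned above, a B-spline boundary description with guaranteed C1 continuity, can be sketched with scipy; the knot vector and control points below are hypothetical, and the continuity check simply compares the first derivative on both sides of the interior knot.

    ```python
    import numpy as np
    from scipy.interpolate import BSpline

    k = 3                                          # cubic curve
    t = np.array([0, 0, 0, 0, 0.5, 1, 1, 1, 1])    # clamped knot vector, interior knot 0.5
    c = np.array([[0, 0], [1, 2], [2, -1], [3, 2], [4, 0]], dtype=float)  # control points
    curve = BSpline(t, c, k)                       # vector-valued (2D) boundary curve

    d1 = curve.derivative(1)
    left, right = d1(0.5 - 1e-9), d1(0.5 + 1e-9)
    print("C1 at interior knot:", np.allclose(left, right, atol=1e-6))
    ```

    A cubic B-spline with simple interior knots is in fact C2, so C1 continuity comes for free from the curve model; the optimizer then only moves control points.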

  16. Digital imaging and remote sensing image generator (DIRSIG) as applied to NVESD sensor performance modeling

    NASA Astrophysics Data System (ADS)

    Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.

    2016-05-01

    The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.

  17. Comparative tests of bench equipment for fuel control system testing of gas-turbine engine

    NASA Astrophysics Data System (ADS)

    Shendaleva, E. V.

    2018-04-01

    The relevance of interlaboratory comparative research is confirmed by the attention the world metrological community pays to this field of activity. We propose using the interlaboratory comparison methodology not only for the collation of single gauges, but also for bench equipment complexes, such as modeling stands for fuel control system testing of gas-turbine engines. In this case, the comparison measure shared by the different test benches is a control fuel pump. This approach ensures traceability of the measurement results obtained at the test benches of various aviation enterprises, supports the development and introduction of national standards into bench-testing practice, and ultimately improves the quality and safety of aircraft equipment.
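
    The abstract does not name a comparison statistic; a standard choice in interlaboratory comparisons is the E_n number (ISO 13528 style), sketched below for hypothetical fuel-flow readings of the travelling control pump at two benches.

    ```python
    import math

    def en_score(lab_value, lab_U, ref_value, ref_U):
        """E_n number: |E_n| <= 1 means the bench result is consistent with
        the reference value within the expanded uncertainties."""
        return (lab_value - ref_value) / math.sqrt(lab_U**2 + ref_U**2)

    # Hypothetical control-pump fuel-flow readings (kg/h): value, expanded uncertainty
    reference = (1250.0, 6.0)
    benches = {"bench A": (1254.0, 8.0), "bench B": (1241.0, 5.0)}

    for name, (val, U) in benches.items():
        en = en_score(val, U, *reference)
        verdict = "consistent" if abs(en) <= 1 else "discrepant"
        print(f"{name}: E_n = {en:+.2f} ({verdict})")
    ```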

  18. Automated x-ray/light field congruence using the LINAC EPID panel.

    PubMed

    Polak, Wojciech; O'Doherty, Jim; Jones, Matt

    2013-03-01

    X-ray/light field alignment is a test described in many guidelines for the routine quality control of clinical linear accelerators (LINACs). Currently, the gold-standard method for measuring alignment is the use of radiographic film. However, many modern LINACs are equipped with an electronic portal imaging device (EPID) that may be used to perform this test, thereby reducing overall cost, processing, and analysis time, removing operator dependency, and eliminating the requirement to sustain the departmental film processor. This work describes a novel method of utilizing the EPID together with a custom in-house designed jig and automatic image processing software, allowing measurement of the light field size, the x-ray field size, and the congruence between them. The authors present results of testing the method for aS1000 and aS500 Varian EPID detectors on six LINACs at a range of energies (6, 10, and 15 MV), in comparison with the results obtained from radiographic film. Reproducibility of the software in fully automatic operation under a range of operating conditions for a single image showed a congruence of 0.01 cm with a coefficient of variation of 0. Slight variation in congruence repeatability was noted with semiautomatic processing by four independent operators, due to the manual marking of positions on the jig. Testing of the methodology using the automatic method shows a high precision of 0.02 mm, compared to a maximum of 0.06 mm determined by film processing. Intraindividual examination of operator measurements of congruence was shown to vary by as much as 0.75 mm. Similar congruence measurements of 0.02 mm were also determined for a lower-resolution EPID (aS500 model) after rescaling of the image to the aS1000 image size. The designed methodology proved to be time-efficient, cost-effective, and at least as accurate as the gold-standard radiographic film. Additionally, congruence testing can easily be performed for all four cardinal gantry angles, which can be difficult when using radiographic film. Therefore, the authors propose it can be used as an alternative to the radiographic film method, allowing decommissioning of the film processor.
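
    The authors' in-house software is not public; as an illustration of one common EPID analysis step, the sketch below locates the field edges of a profile at 50% of the central-axis signal with sub-pixel interpolation. The pixel pitch and the synthetic profile are assumed values, not detector specifications.

    ```python
    import numpy as np

    def field_edges(profile, pixel_mm):
        """Locate field edges at 50% of the central-axis signal, with linear
        interpolation between pixels; returns edge positions in mm."""
        half = 0.5 * profile[len(profile) // 2]
        above = profile >= half
        i_l = np.argmax(above)                           # first pixel above half-max
        i_r = len(profile) - 1 - np.argmax(above[::-1])  # last pixel above half-max
        left = i_l - 1 + (half - profile[i_l - 1]) / (profile[i_l] - profile[i_l - 1])
        right = i_r + (profile[i_r] - half) / (profile[i_r] - profile[i_r + 1])
        return left * pixel_mm, right * pixel_mm

    # Synthetic ~10 cm field profile sampled at an assumed 0.5 mm pixel pitch
    x = np.arange(400) * 0.5
    profile = 1.0 / (1 + np.exp(-(x - 25))) - 1.0 / (1 + np.exp(-(x - 125)))
    left, right = field_edges(profile, 0.5)
    print(f"field size = {right - left:.2f} mm")
    ```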

  19. Munitions and Explosives of Concern Survey Methodology and In-field Testing for Wind Energy Areas on the Atlantic Outer Continental Shelf

    NASA Astrophysics Data System (ADS)

    DuVal, C.; Carton, G.; Trembanis, A. C.; Edwards, M.; Miller, J. K.

    2017-12-01

    Munitions and explosives of concern (MEC) are present in U.S. waters as a result of past and ongoing live-fire testing and training, combat operations, and sea disposal. To identify MEC that may pose a risk to human safety during development of offshore wind facilities on the Atlantic Outer Continental Shelf (OCS), the Bureau of Ocean Energy Management (BOEM) is preparing to develop guidance on risk analysis and selection processes for methods and technologies to identify MEC in Wind Energy Areas (WEA). This study developed a process for selecting appropriate technologies and methodologies for MEC detection using a synthesis of historical research, physical site characterization, remote sensing technology review, and in-field trials. Personnel were tasked with seeding a portion of the Delaware WEA with munitions surrogates, while a second group of researchers not privy to the surrogate locations tested and optimized the selected methodology to find and identify the placed targets. This in-field trial, conducted in July 2016, emphasized the use of multiple sensors for MEC detection, and led to further guidance for future MEC detection efforts on the Atlantic OCS. An April 2017 follow-on study determined the fate of the munitions surrogates after the Atlantic storm season had passed. Using regional hydrodynamic models and incorporating the recommendations from the 2016 field trial, the follow-on study examined the fate of the MEC and compared the findings to existing research on munitions mobility, as well as to models developed as part of the Office of Naval Research Mine-Burial Program. Focus was given to characterizing the influence of sediment type on surrogate munitions behavior and the influence of morphodynamics and object burial on MEC detection. Consistent with the Mine-Burial models, ripple bedforms were observed to impede surrogate scour and burial in coarse sediments, while surrogate burial was both predicted and observed in finer sediments. Further, incorporation of recommendations from the previous trial into the 2017 study led to a fourfold improvement in MEC detection rates over the 2016 approach. The use of modeling to characterize local morphodynamics, MEC burial or mobility, and the impact of seasonal or episodic storm events is discussed in light of technology selection and timing for future MEC detection surveys.

  20. Weapon Simulator Test Methodology Investigation: Comparison of Live Fire and Weapon Simulator Test Methodologies and the Effects of Clothing and Individual Equipment on Marksmanship

    DTIC Science & Technology

    2016-09-15

    Final report, October 2014 – August 2015.

  1. A systematic review of model-based economic evaluations of diagnostic and therapeutic strategies for lower extremity artery disease.

    PubMed

    Vaidya, Anil; Joore, Manuela A; ten Cate-Hoek, Arina J; Kleinegris, Marie-Claire; ten Cate, Hugo; Severens, Johan L

    2014-01-01

    Lower extremity artery disease (LEAD) is a sign of widespread atherosclerosis, also affecting the coronary, cerebral and renal arteries, and is associated with an increased risk of cardiovascular events. Many economic evaluations have been published for LEAD due to its clinical, social and economic importance. The aim of this systematic review was to assess the modelling methods used in published economic evaluations in the field of LEAD. Our review appraised and compared the general characteristics, model structure and methodological quality of published models. The electronic databases MEDLINE and EMBASE were searched until February 2013 via the OVID interface. The Cochrane Database of Systematic Reviews, the Health Technology Assessment database hosted by the National Institute for Health Research, and the NHS Economic Evaluation Database (NHS EED) were also searched. The methodological quality of the included studies was assessed using the Philips checklist. Sixteen model-based economic evaluations were identified and included. Eleven models compared therapeutic health technologies, three models compared diagnostic tests, and two models compared a combination of diagnostic and therapeutic options for LEAD. The results of this systematic review revealed an acceptable to low methodological quality of the included studies. Methodological diversity and insufficient information posed a challenge for valid comparison of the included studies. In conclusion, there is a need for transparent, methodologically comparable and scientifically credible model-based economic evaluations in the field of LEAD. Future modelling studies should include clinically and economically important cardiovascular outcomes to reflect the wider impact of LEAD on individual patients and on society.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jornet, N; Carrasco de Fez, P; Jordi, O

    Purpose: To evaluate the accuracy of total scatter factor (Sc,p) determination for small fields using a commercial plastic scintillator detector (PSD). The manufacturer's spectral discrimination method for subtracting Cerenkov light from the signal is discussed. Methods: Sc,p for field sizes ranging from 0.5 to 10 cm was measured using the Exradin PSD (Standard Imaging) connected to a two-channel electrometer that measures the signal in two different spectral regions so the Cerenkov contribution can be subtracted from the PSD signal. A PinPoint 31006 ionisation chamber (PTW) and an unshielded semiconductor detector EFD (Scanditronix) were used for comparison. Measurements were performed for a 6 MV X-ray beam. Sc,p was measured at 10 cm depth in water at SSD = 100 cm and normalized to a 10×10 cm² field size at the isocenter. All detectors were placed with their symmetry axis parallel to the beam axis. We followed the manufacturer's recommended calibration methodology to subtract the Cerenkov contribution to the signal, as well as a modified method using smaller field sizes. The Sc,p calculated using the two calibration methodologies were compared. Results: Sc,p measured with the semiconductor and PinPoint detectors agreed within 1.5% for field sizes between 10×10 and 1×1 cm². Sc,p measured with the PSD using the manufacturer's calibration methodology were systematically 4% higher than those measured with the semiconductor detector for field sizes smaller than 5×5 cm². By using the modified calibration methodology for small fields, and keeping the manufacturer's calibration methodology for fields larger than 5×5 cm², Sc,p matched the semiconductor results within 2% for field sizes larger than 1.5 cm. Conclusion: The calibration methodology proposed by the manufacturer is not appropriate for dose measurements in small fields; the calibration parameters are not independent of the incident radiation spectrum for this PSD. This work was partially financed by a 2012 grant of the Barcelona board of the AECC.
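
    The two-channel (chromatic) Cerenkov-removal approach is commonly written as dose = a·ch1 − b·ch2, with (a, b) obtained from two calibration irradiations of known dose but different Cerenkov content. The sketch below shows that arithmetic with hypothetical readings; it illustrates the general two-channel method, not the manufacturer's exact procedure.

    ```python
    import numpy as np

    def psd_dose(ch1, ch2, a, b):
        """Chromatic Cerenkov removal for a two-channel PSD: dose = a*ch1 - b*ch2."""
        return a * ch1 - b * ch2

    def calibrate(ch1_pair, ch2_pair, dose_pair):
        """Solve for (a, b) from two calibration conditions of known dose,
        chosen to have different Cerenkov fractions (e.g., different field
        sizes, which is what the abstract's modified method varies)."""
        A = np.array([[ch1_pair[0], -ch2_pair[0]],
                      [ch1_pair[1], -ch2_pair[1]]])
        return np.linalg.solve(A, np.asarray(dose_pair))

    # Hypothetical channel readings (nC) at two conditions delivering 1 Gy each
    a, b = calibrate(ch1_pair=(10.2, 9.8), ch2_pair=(3.1, 5.6), dose_pair=(1.0, 1.0))
    print(f"a = {a:.4f}, b = {b:.4f}, check: {psd_dose(10.2, 3.1, a, b):.3f} Gy")
    ```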

  3. Prediction of work metabolism from heart rate measurements in forest work: some practical methodological issues.

    PubMed

    Dubé, Philippe-Antoine; Imbeau, Daniel; Dubeau, Denise; Auger, Isabelle; Leone, Mario

    2015-01-01

    Individual heart rate (HR) to workload relationships were determined using 93 submaximal step-tests administered to 26 healthy participants attending physical activities at a university training centre (laboratory study) and 41 experienced forest workers (field study). Predicted maximum aerobic capacity (MAC) was compared to measured MAC from a maximal treadmill test (laboratory study) to test the effect of two age-predicted maximum HR equations (220 - age and 207 - 0.7 × age) and two clothing insulation levels (0.4 and 0.91 clo) during the step-test. Work metabolism (WM) estimated from forest-work HR was compared against concurrent work V̇O2 measurements while taking into account the thermal component of HR. Results show that MAC and WM can be accurately predicted from work HR measurements and the simple regression models developed in this study (1% group mean prediction bias and up to 25% expected prediction bias for a single individual). Neither clothing insulation nor the choice of age-predicted maximum HR equation had an impact on predicted MAC. Practitioner summary: This study sheds light on four practical methodological issues faced by practitioners regarding the use of the HR methodology to assess WM in actual work environments. More specifically, the effects of wearing work clothes and of using two different maximum HR prediction equations on the ability of a submaximal step-test to assess MAC are examined, as well as the accuracy of using an individual's step-test HR-to-workload relationship to predict WM from HR data collected during actual work in the presence of thermal stress.
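
    A minimal sketch of the individual-calibration idea, assuming hypothetical step-test data: fit a per-worker linear HR-to-V̇O2 line, then apply it to field HR, optionally after removing an estimated thermal pulse component as the abstract recommends.

    ```python
    import numpy as np

    # Hypothetical step-test calibration points for one worker:
    # heart rate (bpm) vs. measured oxygen uptake (mL/kg/min)
    hr = np.array([85, 100, 115, 130])
    vo2 = np.array([12.0, 18.5, 25.0, 31.5])

    slope, intercept = np.polyfit(hr, vo2, 1)   # individual HR-VO2 calibration line

    def work_metabolism(hr_field, hr_thermal=0.0):
        """Predict VO2 from field HR, subtracting a thermal pulse (bpm) first."""
        return slope * (hr_field - hr_thermal) + intercept

    print(f"VO2 at 120 bpm: {work_metabolism(120):.1f} mL/kg/min")
    print(f"with a 10-bpm thermal component: {work_metabolism(120, 10):.1f} mL/kg/min")
    ```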

  4. A new transmission methodology for quality assurance in radiotherapy based on radiochromic film measurements

    PubMed Central

    do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi

    2015-01-01

    Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. It is therefore fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions as the acrylic support used for positioning the film, but at a source‐to‐detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertical, that is, perpendicular to the phantom. To validate this procedure, first, a Monte Carlo simulation using the PENELOPE code was performed to evaluate the differences between the dose distributions measured by the film at SDDs of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points passing the gamma evaluation comparing the dose distributions acquired at the two SDDs was 99.92%±0.14%. In the simple dose distribution tests, the mean percentage of points passing the gamma evaluation was 99.85%±0.26%, and the mean percentage difference in the normalization-point doses was −1.41%. The transmission methodology passed in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS number: 87.55.Qr, 87.55.km, 87.55.N‐ PMID:26699306
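
    A simplified one-dimensional version of the gamma evaluation used above as the pass/fail criterion (the study compares two-dimensional distributions; the 3%/3 mm tolerances and the profiles below are assumed for illustration).

    ```python
    import numpy as np

    def gamma_1d(ref, meas, x, dose_tol=0.03, dist_tol_mm=3.0):
        """Global 1-D gamma index: for each reference point, the minimum
        combined dose-difference / distance-to-agreement metric over the
        measured distribution. gamma <= 1 is a pass."""
        gammas = np.empty_like(ref)
        d_max = ref.max()
        for i, (xi, di) in enumerate(zip(x, ref)):
            dd = (meas - di) / (dose_tol * d_max)   # dose-difference term
            dx = (x - xi) / dist_tol_mm             # distance-to-agreement term
            gammas[i] = np.sqrt(dx**2 + dd**2).min()
        return gammas

    x = np.linspace(-50, 50, 201)                   # position (mm)
    planned = np.exp(-(x / 30.0) ** 2)              # toy planned profile
    film = 1.02 * np.exp(-((x - 1.0) / 30.0) ** 2)  # 2% scaling, 1 mm shift
    g = gamma_1d(planned, film, x)
    print(f"gamma pass rate: {100 * np.mean(g <= 1):.1f}%")
    ```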

  5. Initialization and Setup of the Coastal Model Test Bed: STWAVE

    DTIC Science & Technology

    2017-01-01

    Laboratory (CHL) Field Research Facility (FRF) in Duck, NC. The improved evaluation methodology will promote rapid enhancement of model capability and focus...Blanton 2008) study. This regional digital elevation model (DEM), with a cell size of 10 m, was generated from numerous datasets collected at different...INFORMATION: For additional information, contact Spicer Bak, Coastal Observation and Analysis Branch, Coastal and Hydraulics Laboratory, 1261 Duck Road

  6. Engineering Design Handbook. Army Weapon Systems Analysis. Part 2

    DTIC Science & Technology

    1979-10-01

    EXPERIMENTAL DESIGN...RESULTS OF THE ASARS lIX SIMULATIONS...LATIN...sciences and human factors engineering fields utilizing experimental methodology and multi-variable statistical techniques drawn from experimental...randomly to grenades for the test design. The nine experimental types of hand grenades (first nine in Table 33-2) had a "pip" on their spherical

  7. Wall-to-wall Landsat TM classifications for Georgia in support of SAFIS using FIA plots for training and verification

    Treesearch

    William H. Cooke; Andrew J. Hartsell

    2000-01-01

    Wall-to-wall Landsat TM classification efforts in Georgia require field validation. Validation using FIA data was tested by developing a new crown modeling procedure. A methodology is under development at the Southern Research Station to model crown diameter using Forest Health Monitoring data. These models are used to simulate the proportion of tree crowns that...

  8. Using Recent Planetary Science Data to Develop Advanced Undergraduate Physics and Astronomy Activities

    NASA Astrophysics Data System (ADS)

    Steckloff, Jordan; Lindell, Rebecca

    2016-10-01

    Teaching science by having students manipulate real data is a popular trend in astronomy and planetary science education. However, many existing activities simply couple these data with traditional "cookbook" style verification labs. As with most topics within science, this instructional technique does not enhance the average student's understanding of the phenomena being studied. Here we present a methodology for developing "science by doing" activities that combine the latest discoveries in planetary science with up-to-date constructivist pedagogy to teach advanced concepts in physics and astronomy. In our methodology, students are first guided to understand, analyze, and plot real raw scientific data; they then develop and test physical and computational models to understand and interpret the data; finally, they use their models to make predictions about the topic being studied and test them against real data. To date, two activities have been developed according to this methodology: Understanding Asteroids through their Light Curves (hereafter "Asteroid Activity") and Understanding Exoplanetary Systems through Simple Harmonic Motion (hereafter "Exoplanet Activity"). The Asteroid Activity allows students to explore light curves available in the Asteroid Light Curve Database (ALCDB) to discover general properties of asteroids, including their internal structure, strength, and the mechanism of asteroid moon formation. The Exoplanet Activity allows students to investigate the masses and semi-major axes of exoplanets in a system by comparing the radial velocity motion of their host star to that of a coupled simple harmonic oscillator. Students then explore how noncircular orbits lead to deviations from simple harmonic motion. These activities will be field tested during the Fall 2016 semester in advanced undergraduate mechanics and astronomy courses at a large Midwestern STEM-focused university. We will present the development methodologies for these activities, a description of the activities, and results from the pre-tests.

  9. Structural qualification testing and operational loading on a fiberglass rotor blade for the Mod-OA wind turbine

    NASA Technical Reports Server (NTRS)

    Sullivan, T. L.

    1983-01-01

    Fatigue tests were performed on full- and half-scale root end sections, first to qualify the root retention design, and second to induce failure. Test methodology and results are presented. Two operational blades were proof tested to design limit load to ascertain buckling resistance. Measurements of natural frequency, damping ratio, and deflection under load made on the operational blades are documented. The tests showed that all structural design requirements were met or exceeded. Blade loads measured during 3000 hr of field operation were close to those expected. The measured loads validated the loads used in the fatigue tests and gave high confidence in the ability of the blades to achieve design life.

  10. Mediators and moderators in early intervention research

    PubMed Central

    Breitborde, Nicholas J. K.; Srihari, Vinod H.; Pollard, Jessica M.; Addington, Donald N.; Woods, Scott W.

    2015-01-01

    Aim The goal of this paper is to provide clarification with regard to the nature of mediator and moderator variables and the statistical methods used to test for the existence of these variables. Particular attention will be devoted to discussing the ways in which the identification of mediator and moderator variables may help to advance the field of early intervention in psychiatry. Methods We completed a literature review of the methodological strategies used to test for mediator and moderator variables. Results Although several tests for mediator variables are currently available, recent evaluations suggest that tests which directly evaluate the indirect effect are superior. With regard to moderator variables, two approaches (‘pick-a-point’ and regions of significance) are available, and we provide guidelines with regard to how researchers can determine which approach may be most appropriate to use for their specific study. Finally, we discuss how to evaluate the clinical importance of mediator and moderator relationships as well as the methodology to calculate statistical power for tests of mediation and moderation. Conclusion Further exploration of mediator and moderator variables may provide valuable information with regard to interventions provided early in the course of a psychiatric illness. PMID:20536970
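
    A minimal sketch of a bootstrap test of the indirect effect, the approach the abstract singles out as superior. The data are simulated, and the ordinary-least-squares estimators are deliberately bare-bones.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def indirect_effect(x, m, y):
        """a*b estimate: a from regressing m on x, b from regressing y on x and m."""
        a = np.polyfit(x, m, 1)[0]
        X = np.column_stack([np.ones_like(x), x, m])
        b = np.linalg.lstsq(X, y, rcond=None)[0][2]
        return a * b

    # Simulated data in which x affects y partly through the mediator m
    n = 300
    x = rng.normal(size=n)
    m = 0.5 * x + rng.normal(size=n)
    y = 0.4 * m + 0.2 * x + rng.normal(size=n)

    boots = [indirect_effect(*(arr[idx] for arr in (x, m, y)))
             for idx in (rng.integers(0, n, n) for _ in range(2000))]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    print(f"ab = {indirect_effect(x, m, y):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
    ```

    The indirect effect is deemed significant when the bootstrap confidence interval excludes zero, which avoids the normality assumption of the classical Sobel test.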

  11. An Acoustical Comparison of Sub-Scale and Full-Scale Far-Field Measurements for the Reusable Solid Rocket Motor

    NASA Technical Reports Server (NTRS)

    Haynes, Jared; Kenny, R. Jeremy

    2010-01-01

    Recently, members of the Marshall Space Flight Center (MSFC) Fluid Dynamics Branch and Wyle Labs measured far-field acoustic data during a series of three Reusable Solid Rocket Motor (RSRM) horizontal static tests conducted in Promontory, Utah. The test motors included the Technical Evaluation Motor 13 (TEM-13), Flight Verification Motor 2 (FVM-2), and the Flight Simulation Motor 15 (FSM-15). Similar far-field data were collected during horizontal static tests of sub-scale solid rocket motors at MSFC. Far-field acoustical measurements were taken at multiple angles within a circular array centered about the nozzle exit plane, each positioned at a radial distance of 80 nozzle-exit-diameters from the nozzle. This type of measurement configuration is useful for calculating rocket noise characteristics such as those outlined in the NASA SP-8072 "Acoustic Loads Generated by the Propulsion System." Acoustical scaling comparisons are made between the test motors, with particular interest in the Overall Sound Power, Acoustic Efficiency, Non-dimensional Relative Sound Power Spectrum, and Directivity. Since most empirical data in the NASA SP-8072 methodology is derived from small rockets, this investigation provides an opportunity to check the data collapse between a sub-scale and full-scale rocket motor.

  12. Durability assessment to environmental impact of nano-structured consolidants on Carrara marble by field exposure tests.

    PubMed

    Bonazza, Alessandra; Vidorni, Giorgia; Natali, Irene; Ciantelli, Chiara; Giosuè, Chiara; Tittarelli, Francesca

    2017-01-01

    The EU policy of reducing the emissions of combustion-generated pollutants means that climate-induced deterioration is becoming more important. Moreover, products applied to preserve the outdoor built heritage, and their preliminary performance tests, often turn out to be inadequate. In this context, the paper reports the outcomes of the methodology adopted to assess the durability and efficiency of nano-based consolidating products used for the conservation of carbonate artworks, performing field exposure tests on Carrara marble model samples at different sites in the framework of the EC Project NANOMATCH. Surface properties and cohesion, the extent and penetration of the conservation products, and their interactions with the marble substrate and environmental conditions are examined after outdoor exposure for eleven months in four different European cities, and compared with the features of undamaged and of untreated damaged specimens subjected to the same exposure settings. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Logic Design Pathology and Space Flight Electronics

    NASA Technical Reports Server (NTRS)

    Katz, Richard; Barto, Rod L.; Erickson, K.

    1997-01-01

    Logic design errors have been observed in space flight missions and the final stages of ground test. The technologies used by designers and their design/analysis methodologies will be analyzed, giving insight into the root causes of the failures. These technologies include discrete integrated-circuit based systems, systems based on field and mask programmable logic, and the use of computer-aided engineering (CAE) systems. State-of-the-art (SOTA) design tools and methodologies will be analyzed with respect to high-reliability spacecraft design, and potential pitfalls are discussed. Case studies of faults, from large expensive programs to "smaller, faster, cheaper" missions, will be used to explore the fundamental reasons for logic design problems.

  14. Investigations of Low Temperature Time Dependent Cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van der Sluys, W A; Robitz, E S; Young, B A

    2002-09-30

    The objective of this project was to investigate metallurgical and mechanical phenomena associated with time dependent cracking of cold bent carbon steel piping at temperatures between 327°C and 360°C. Boiler piping failures have demonstrated that the current understanding of the fundamental metallurgical and mechanical parameters controlling these failures is insufficient to eliminate them from the field. The results of the project consisted of the development of a testing methodology to reproduce low temperature time dependent cracking in laboratory specimens. This methodology was used to evaluate the cracking resistance of candidate heats in order to identify the factors that enhance cracking sensitivity. The resultant data were integrated into currently available life prediction tools.

  15. A review of engineering development of aqueous phase solar photocatalytic detoxification and disinfection processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goswami, D.Y.

    1997-05-01

    Scientific research on photocatalytic oxidation of hazardous chemicals has been conducted extensively over the last three decades. Use of solar radiation in photocatalytic detoxification and disinfection has only been explored in the last decade. Developments of engineering-scale systems, design methodologies, and commercial and industrial applications have occurred even more recently. A number of reactor concepts and designs, including concentrating and nonconcentrating types and methods of catalyst deployment, have been developed. Some commercial and industrial field tests of solar detoxification systems have been conducted. This paper reviews the engineering developments of the solar photocatalytic detoxification and disinfection processes, including system design methodologies.

  16. Current target acquisition methodology in force on force simulations

    NASA Astrophysics Data System (ADS)

    Hixson, Jonathan G.; Miller, Brian; Mazz, John P.

    2017-05-01

    The U.S. Army RDECOM CERDEC NVESD MSD's target acquisition models have been used for many years by the military community in force on force simulations for training, testing, and analysis. There have been significant improvements to these models over the past few years. The significant improvements are the transition of ACQUIRE TTP-TAS (ACQUIRE Targeting Task Performance Target Angular Size) methodology for all imaging sensors and the development of new discrimination criteria for urban environments and humans. This paper is intended to provide an overview of the current target acquisition modeling approach and provide data for the new discrimination tasks. This paper will discuss advances and changes to the models and methodologies used to: (1) design and compare sensors' performance, (2) predict expected target acquisition performance in the field, (3) predict target acquisition performance for combat simulations, and (4) how to conduct model data validation for combat simulations.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz-Fellenz, Emily S.

    A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation and careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.

  18. Fuzzy logic controllers for electrotechnical devices - On-site tuning approach

    NASA Astrophysics Data System (ADS)

    Hissel, D.; Maussion, P.; Faucher, J.

    2001-12-01

    Fuzzy logic nowadays offers an interesting alternative to designers of nonlinear control laws for electrical or electromechanical systems. However, due to the large number of tuning parameters, this kind of control is only used in a few industrial applications. This paper proposes a new, very simple, on-site tuning strategy for a PID-like fuzzy logic controller. Using the design-of-experiments methodology, we propose sets of optimized pre-established settings for this kind of fuzzy controller. The proposed parameters depend only on a single on-site open-loop identification test. In this respect, the on-site tuning methodology is analogous to the Ziegler-Nichols approach for conventional controllers. Experimental results (on a permanent-magnet synchronous motor and on a DC/DC converter) demonstrate the efficiency of this tuning methodology. Finally, the domain of validity of the proposed pre-established settings is given.

  19. Magnetic Test Performance Capabilities at the Goddard Space Flight Center as Applied to the Global Geospace Science Initiative

    NASA Technical Reports Server (NTRS)

    Mitchell, Darryl R.

    1997-01-01

    Goddard Space Flight Center's (GSFC) Spacecraft Magnetic Test Facility (SMTF) is a historic test facility that has set the standard for all subsequent magnetic test facilities. The SMTF was constructed in the early 1960's for the purpose of simulating geomagnetic and interplanetary magnetic fields. Additionally, the facility provides the capability for measuring spacecraft-generated magnetic fields as well as calibrating magnetic attitude control systems and science magnetometers. The SMTF was designed for large, spacecraft-level tests and is currently the second largest spherical coil system in the world. The SMTF is a three-axis Braunbek system composed of four coils on each of three orthogonal axes. The largest coils are 12.7 meters (41.6 feet) in diameter. The three-axis Braunbek configuration provides a highly uniform cancellation of the geomagnetic field over the central 1.8 meter (6 foot) diameter primary test volume. Cancellation of the local geomagnetic field is to within +/-0.2 nanotesla, with a uniformity of up to 0.001% within the 1.8 meter (6 foot) diameter primary test volume. Artificial magnetic field vectors from 0-60,000 nanotesla can be generated along any axis with a 0.1 nanotesla resolution. Oscillating or rotating field vectors can also be produced about any axis with a frequency of up to 100 radians/second. Since becoming fully operational in July of 1967, the SMTF has been the site of numerous spacecraft magnetics tests. Spacecraft tested at the SMTF include the Solar Maximum Mission (SMM), Magsat, LANDSAT-D, the Fast Auroral Snapshot (FAST) Explorer and the Submillimeter Wave Astronomy Satellite (SWAS), among others. This paper describes the methodology and sequencing used for the Global Geospace Science (GGS) initiative magnetic testing program in the Goddard Space Flight Center's SMTF. The GGS initiative provides an exemplary model of a strict and comprehensive magnetic control program.

  20. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia

    NASA Astrophysics Data System (ADS)

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field-level crop residue coverage for a given plot, each with its own implications for survey budget, implementation speed, and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods is compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.

  1. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia.

    PubMed

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field-level crop residue coverage for a given plot, each with its own implications for survey budget, implementation speed, and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods is compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.
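
    The line-transect benchmark used in both versions of this study reduces to point counting; a minimal sketch with hypothetical transect counts:

    ```python
    import numpy as np

    def line_transect_cover(hits, points=100):
        """Line-transect estimate: fraction of equally spaced points along a
        stretched tape that fall directly over crop residue."""
        return hits / points

    # Hypothetical field: three 100-point transects
    transects = [34, 41, 29]
    covers = [line_transect_cover(h) for h in transects]
    print(f"mean cover = {100 * np.mean(covers):.0f}% "
          f"(>30% conservation-agriculture threshold: {np.mean(covers) > 0.30})")
    ```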

  2. Analysis of complex environment effect on near-field emission

    NASA Astrophysics Data System (ADS)

    Ravelo, B.; Lalléchère, S.; Bonnet, P.; Paladian, F.

    2014-10-01

    This article deals with uncertainty analysis of the electromagnetic compatibility emission of radiofrequency circuits, based on the near-field/near-field (NF/NF) transform combined with a stochastic approach. Using 2D data corresponding to the electromagnetic (EM) field (X = E or H) scanned in an observation plane placed at a position z0 above the circuit under test (CUT), the X field map was extracted. Uncertainty analyses were then assessed via the statistical moments of the X component. In addition, a stochastic collocation approach was considered, and the calculations were applied to the planar EM NF radiated by CUTs consisting of a Wilkinson power divider and a microstrip line operating at GHz frequencies. After Matlab implementation, the mean and standard deviation were assessed. The present study illustrates how variations of environmental parameters may impact EM fields. The NF uncertainty methodology can be applied to the effects of any physical parameter in a complex environment and is useful for printed circuit board (PCB) design guidelines.
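
    The paper uses a stochastic collocation scheme; the sketch below substitutes plain Monte Carlo sampling (simpler, though slower to converge) around a toy near-field model, to show how the statistical moments of the scanned field follow from uncertain setup parameters. The field model and all distributions are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def near_field_H(x_mm, height_mm, current_A):
        """Toy |H| of a long trace observed at a scan height (A/m); a stand-in
        for the scanned CUT data used in the paper."""
        r = np.hypot(x_mm, height_mm) * 1e-3
        return current_A / (2 * np.pi * r)

    x = np.linspace(-20, 20, 81)                  # scan line (mm)
    heights = rng.normal(2.0, 0.2, size=2000)     # uncertain probe height (mm)
    currents = rng.normal(0.1, 0.005, size=2000)  # uncertain drive current (A)

    samples = np.array([near_field_H(x, h, i) for h, i in zip(heights, currents)])
    print("mean |H| at center (A/m):", samples[:, 40].mean())
    print("std  |H| at center (A/m):", samples[:, 40].std())
    ```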

  3. “Positive” Results Increase Down the Hierarchy of the Sciences

    PubMed Central

    Fanelli, Daniele

    2010-01-01

    The hypothesis of a Hierarchy of the Sciences with physical sciences at the top, social sciences at the bottom, and biological sciences in-between is nearly 200 years old. This order is intuitive and reflected in many features of academic life, but whether it reflects the “hardness” of scientific research—i.e., the extent to which research questions and results are determined by data and theories as opposed to non-cognitive factors—is controversial. This study analysed 2434 papers published in all disciplines and that declared to have tested a hypothesis. It was determined how many papers reported a “positive” (full or partial) or “negative” support for the tested hypothesis. If the hierarchy hypothesis is correct, then researchers in “softer” sciences should have fewer constraints to their conscious and unconscious biases, and therefore report more positive outcomes. Results confirmed the predictions at all levels considered: discipline, domain and methodology broadly defined. Controlling for observed differences between pure and applied disciplines, and between papers testing one or several hypotheses, the odds of reporting a positive result were around 5 times higher among papers in the disciplines of Psychology and Psychiatry and Economics and Business compared to Space Science, 2.3 times higher in the domain of social sciences compared to the physical sciences, and 3.4 times higher in studies applying behavioural and social methodologies on people compared to physical and chemical studies on non-biological material. In all comparisons, biological studies had intermediate values. These results suggest that the nature of hypotheses tested and the logical and methodological rigour employed to test them vary systematically across disciplines and fields, depending on the complexity of the subject matter and possibly other factors (e.g., a field's level of historical and/or intellectual development). On the other hand, these results support the scientific status of the social sciences against claims that they are completely subjective, by showing that, when they adopt a scientific approach to discovery, they differ from the natural sciences only by a matter of degree. PMID:20383332

  4. "Positive" results increase down the Hierarchy of the Sciences.

    PubMed

    Fanelli, Daniele

    2010-04-07

    The hypothesis of a Hierarchy of the Sciences with physical sciences at the top, social sciences at the bottom, and biological sciences in-between is nearly 200 years old. This order is intuitive and reflected in many features of academic life, but whether it reflects the "hardness" of scientific research--i.e., the extent to which research questions and results are determined by data and theories as opposed to non-cognitive factors--is controversial. This study analysed 2434 papers published in all disciplines and that declared to have tested a hypothesis. It was determined how many papers reported a "positive" (full or partial) or "negative" support for the tested hypothesis. If the hierarchy hypothesis is correct, then researchers in "softer" sciences should have fewer constraints to their conscious and unconscious biases, and therefore report more positive outcomes. Results confirmed the predictions at all levels considered: discipline, domain and methodology broadly defined. Controlling for observed differences between pure and applied disciplines, and between papers testing one or several hypotheses, the odds of reporting a positive result were around 5 times higher among papers in the disciplines of Psychology and Psychiatry and Economics and Business compared to Space Science, 2.3 times higher in the domain of social sciences compared to the physical sciences, and 3.4 times higher in studies applying behavioural and social methodologies on people compared to physical and chemical studies on non-biological material. In all comparisons, biological studies had intermediate values. These results suggest that the nature of hypotheses tested and the logical and methodological rigour employed to test them vary systematically across disciplines and fields, depending on the complexity of the subject matter and possibly other factors (e.g., a field's level of historical and/or intellectual development). On the other hand, these results support the scientific status of the social sciences against claims that they are completely subjective, by showing that, when they adopt a scientific approach to discovery, they differ from the natural sciences only by a matter of degree.

  5. A New Kind of Single-Well Tracer Test for Assessing Subsurface Heterogeneity

    NASA Astrophysics Data System (ADS)

    Hansen, S. K.; Vesselinov, V. V.; Lu, Z.; Reimus, P. W.; Katzman, D.

    2017-12-01

    Single-well injection-withdrawal (SWIW) tracer tests have historically been interpreted using the idealized assumption of tracer path reversibility (i.e., negligible background flow), with background flow due to natural hydraulic gradient being an un-modeled confounding factor. However, we have recently discovered that it is possible to use background flow to our advantage to extract additional information about the subsurface. To wit: we have developed a new kind of single-well tracer test that exploits flow due to natural gradient to estimate the variance of the log hydraulic conductivity field of a heterogeneous aquifer. The test methodology involves injection under forced gradient and withdrawal under natural gradient, and makes use of a relationship, discovered using a large-scale Monte Carlo study and machine learning techniques, between power law breakthrough curve tail exponent and log-hydraulic conductivity variance. We will discuss how we performed the computational study and derived this relationship and then show an application example in which our new single-well tracer test interpretation scheme was applied to estimation of heterogeneity of a formation at the chromium contamination site at Los Alamos National Laboratory. Detailed core hole records exist at the same site, from which it was possible to estimate the log hydraulic conductivity variance using a Kozeny-Carman relation. The variances estimated using our new tracer test methodology and estimated by direct inspection of core were nearly identical, corroborating the new methodology. Assessment of aquifer heterogeneity is of critical importance to deployment of amendments associated with in-situ remediation strategies, since permeability contrasts potentially reduce the interaction between amendment and contaminant. Our new tracer test provides an easy way to obtain this information.
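
    A minimal sketch of the first half of the interpretation scheme, on synthetic data: estimate the late-time power-law tail exponent of a breakthrough curve by log-log regression. The paper's Monte Carlo / machine-learning mapping from that exponent to the log-conductivity variance is not reproduced here.

    ```python
    import numpy as np

    def tail_exponent(t, c, t_min):
        """Fit the late-time power-law slope k of a breakthrough curve,
        c(t) ~ t**(-k), by linear regression in log-log space."""
        mask = (t >= t_min) & (c > 0)
        slope, _ = np.polyfit(np.log(t[mask]), np.log(c[mask]), 1)
        return -slope

    # Synthetic breakthrough curve with a t^-2.5 tail plus multiplicative noise
    rng = np.random.default_rng(1)
    t = np.logspace(0, 3, 200)                   # time since withdrawal began
    c = t**-2.5 * np.exp(rng.normal(0, 0.05, t.size))
    print(f"fitted tail exponent: {tail_exponent(t, c, t_min=10.0):.2f}")
    ```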

  6. Design elements in implementation research: a structured review of child welfare and child mental health studies.

    PubMed

    Landsverk, John; Brown, C Hendricks; Rolls Reutz, Jennifer; Palinkas, Lawrence; Horwitz, Sarah McCue

    2011-01-01

    Implementation science is an emerging field of research with considerable penetration in physical medicine and less in the fields of mental health and social services. There remains a lack of consensus on methodological approaches to the study of implementation processes and tests of implementation strategies. This paper addresses the need for methods development through a structured review that describes design elements in nine studies testing implementation strategies for evidence-based interventions addressing mental health problems of children in child welfare and child mental health settings. Randomized trial designs were dominant with considerable use of mixed method designs in the nine studies published since 2005. The findings are discussed in reference to the limitations of randomized designs in implementation science and the potential for use of alternative designs.

  7. Diffusion orientation transform revisited.

    PubMed

    Canales-Rodríguez, Erick Jorge; Lin, Ching-Po; Iturria-Medina, Yasser; Yeh, Chun-Hung; Cho, Kuan-Hung; Melie-García, Lester

    2010-01-15

    Diffusion orientation transform (DOT) is a powerful imaging technique that allows the reconstruction of the microgeometry of fibrous tissues based on diffusion MRI data. The three main error sources involved in this methodology are the finite sampling of the q-space, the practical truncation of the series of spherical harmonics and the use of a mono-exponential model for the attenuation of the measured signal. In this work, a detailed mathematical description that provides an extension to the DOT methodology is presented. In particular, the limitations implied by the use of measurements with a finite support in q-space are investigated and clarified, as well as the impact of the harmonic series truncation. Near- and far-field analytical patterns for the diffusion propagator are examined. The near-field pattern makes available the direct computation of the probability of return to the origin. The far-field pattern allows probing the limitations of the mono-exponential model, which suggests the existence of a limit of validity for DOT. In the regime from moderate to large displacement lengths, the isosurfaces of the diffusion propagator reveal aberrations in the form of artifactual peaks. Finally, the major contribution of this work is the derivation of analytical equations that facilitate the accurate reconstruction of some orientational distribution functions (ODFs) and skewness ODFs that are relatively immune to these artifacts. The new formalism was tested using synthetic and real data from a phantom of intersecting capillaries. The results support the hypothesis that the revisited DOT methodology could enhance the estimation of the microgeometry of fiber tissues.
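
    For reference, the mono-exponential q-space model whose validity the abstract probes can be written as follows (the notation follows common q-space conventions and may differ from the paper's):

    ```latex
    % Mono-exponential attenuation and the propagator it implies under DOT.
    \[
      E(\mathbf{q},t) \;=\; \frac{S(\mathbf{q},t)}{S(\mathbf{0},t)}
      \;=\; \exp\!\left(-4\pi^{2} t\,\mathbf{q}^{\mathsf{T}}\mathbf{D}\,\mathbf{q}\right),
    \qquad
      P(\mathbf{R},t) \;=\; \int_{\mathbb{R}^{3}} E(\mathbf{q},t)\,
      e^{-i 2\pi\,\mathbf{q}\cdot\mathbf{R}}\,d\mathbf{q}.
    \]
    ```

    The far-field artifacts discussed above arise where the measured attenuation departs from this single-exponent form, which is what sets the limit of validity for the transform.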

  8. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  9. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  10. Detection of concrete dam leakage using an integrated geophysical technique based on flow-field fitting method

    NASA Astrophysics Data System (ADS)

    Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.

    2017-05-01

    An integrated geophysical investigation was performed at S dam, located in the Dadu basin in China, to assess the condition of the dam curtain. The key component of the integrated technique was the flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. The limitations of the flow-field fitting method were complemented by resistivity logging to identify internal erosion that had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, a water injection test, and rock quality designation.

  11. Modal Identification in an Automotive Multi-Component System Using HS 3D-DIC

    PubMed Central

    López-Alba, Elías; Felipe-Sesé, Luis; Díaz, Francisco A.

    2018-01-01

    Modal characterization of automotive lighting systems is difficult with contact sensors, due to the light weight of the elements that compose the component as well as the intricate access for mounting the sensors. In experimental modal analysis, high-speed 3D digital image correlation (HS 3D-DIC) is attracting attention, since its main advantage over other techniques is that it provides full-field, contactless measurements of 3D displacements. Different methodologies have been published that perform modal identification, i.e., extraction of natural frequencies, damping ratios, and mode shapes, using the full-field information. In this work, experimental modal analysis has been performed on a multi-component automotive lighting system using HS 3D-DIC. Base-motion excitation was applied to simulate operating conditions. A recently validated methodology has been employed for modal identification using transmissibility functions, i.e., the transfer functions from base-motion tests. The results make it possible to identify the local and global behavior of the different elements of injected polymeric and metallic materials. PMID:29401725
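
    A minimal sketch of the identification step, assuming synthetic signals: estimate the transmissibility between base excitation and one response point from spectral densities and read resonances off its peaks (an H1-type estimate; the paper applies the same idea to full-field DIC displacements).

    ```python
    import numpy as np
    from scipy.signal import TransferFunction, csd, lsim, welch

    def transmissibility(base, response, fs):
        """H1-type transmissibility T(f) = S_br(f) / S_bb(f) between base
        motion and a response point; peaks indicate resonances."""
        f, S_bb = welch(base, fs=fs, nperseg=1024)
        _, S_br = csd(base, response, fs=fs, nperseg=1024)
        return f, S_br / S_bb

    # Synthetic single-mode structure (~40 Hz, 2% damping) under random base motion
    fs, n = 2048, 16384
    rng = np.random.default_rng(2)
    base = rng.normal(size=n)
    t = np.arange(n) / fs
    wn, zeta = 2 * np.pi * 40, 0.02
    sys = TransferFunction([wn**2], [1, 2 * zeta * wn, wn**2])
    _, response, _ = lsim(sys, U=base, T=t)

    f, T = transmissibility(base, response, fs)
    print(f"resonance near {f[np.argmax(np.abs(T))]:.1f} Hz")
    ```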

  12. Rapid measurement of field-saturated hydraulic conductivity for areal characterization

    USGS Publications Warehouse

    Nimmo, J.R.; Schmidt, K.M.; Perkins, K.S.; Stock, J.D.

    2009-01-01

    To provide an improved methodology for characterizing the field-saturated hydraulic conductivity (Kfs) over broad areas with extreme spatial variability and ordinary limitations of time and resources, we developed and tested a simplified apparatus and procedure, correcting mathematically for the major deficiencies of the simplified implementation. The methodology includes use of a portable, falling-head, small-diameter (≈20 cm) single-ring infiltrometer and an analytical formula for Kfs that compensates both for nonconstant falling head and for the subsurface radial spreading that unavoidably occurs with small ring size. We applied this method to alluvial fan deposits varying in degree of pedogenic maturity in the arid Mojave National Preserve, California. The measurements are consistent with a more rigorous and time-consuming Kfs measurement method, produce the expected systematic trends in Kfs when compared among soils of contrasting degrees of pedogenic development, and relate in expected ways to results of widely accepted methods. © Soil Science Society of America. All rights reserved.

  13. Current status of antifungal susceptibility testing methods.

    PubMed

    Arikan, Sevtap

    2007-11-01

    Antifungal susceptibility testing is a very dynamic field of medical mycology. Standardization of in vitro susceptibility tests by the Clinical and Laboratory Standards Institute (CLSI) and the European Committee for Antimicrobial Susceptibility Testing (EUCAST), and current availability of reference methods constituted the major remarkable steps in the field. Based on the established minimum inhibitory concentration (MIC) breakpoints, it is now possible to determine the susceptibilities of Candida strains to fluconazole, itraconazole, voriconazole, and flucytosine. Moreover, utility of fluconazole antifungal susceptibility tests as an adjunct in optimizing treatment of candidiasis has now been validated. While the MIC breakpoints and clinical significance of susceptibility testing for the remaining fungi and antifungal drugs remain yet unclear, modifications of the available methods as well as other methodologies are being intensively studied to overcome the present drawbacks and limitations. Among the other methods under investigation are Etest, colorimetric microdilution, agar dilution, determination of fungicidal activity, flow cytometry, and ergosterol quantitation. Etest offers the advantage of practical application and favorable agreement rates with the reference methods that are frequently above acceptable limits. However, MIC breakpoints for Etest remain to be evaluated and established. Development of commercially available, standardized colorimetric panels that are based on CLSI method parameters has added more to the antifungal susceptibility testing armamentarium. Flow cytometry, on the other hand, appears to offer rapid susceptibility testing but requires specified equipment and further evaluation for reproducibility and standardization. Ergosterol quantitation is another novel approach, which appears potentially beneficial particularly in discrimination of azole-resistant isolates from heavy trailers. The method is yet investigational and requires to be further studied. Developments in methodology and applications of antifungal susceptibility testing will hopefully provide enhanced utility in clinical guidance of antifungal therapy. However, and particularly in immunosuppressed host, in vitro susceptibility is and will remain only one of several factors that influence clinical outcome.

  14. Toward Theory-Based Research in Political Communication.

    ERIC Educational Resources Information Center

    Simon, Adam F.; Iyengar, Shanto

    1996-01-01

    Praises the theoretical and methodological potential of the field of political communication. Calls for greater interaction and cross fertilization among the fields of political science, sociology, economics, and psychology. Briefly discusses relevant research methodologies. (MJP)

  15. Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors

    NASA Technical Reports Server (NTRS)

    Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele

    2010-01-01

    This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.

  16. Identifying and Investigating Unexpected Response to Treatment: A Diabetes Case Study.

    PubMed

    Ozery-Flato, Michal; Ein-Dor, Liat; Parush-Shear-Yashuv, Naama; Aharonov, Ranit; Neuvirth, Hani; Kohn, Martin S; Hu, Jianying

    2016-09-01

    The availability of electronic health records creates fertile ground for developing computational models of various medical conditions. We present a new approach for detecting and analyzing patients with unexpected responses to treatment, building on machine learning and statistical methodology. Given a specific patient, we compute a statistical score for the deviation of the patient's response from responses observed in other patients having similar characteristics and medication regimens. These scores are used to define cohorts of patients showing deviant responses. Statistical tests are then applied to identify clinical features that correlate with these cohorts. We implement this methodology in a tool that is designed to assist researchers in the pharmaceutical field to uncover new features associated with reduced response to a treatment. It can also aid physicians by flagging patients who are not responding to treatment as expected and hence deserve more attention. The tool provides comprehensive visualizations of the analysis results and the supporting data, both at the cohort level and at the level of individual patients. We demonstrate the utility of our methodology and tool in a population of type II diabetic patients, treated with antidiabetic drugs, and monitored by the HbA1C test.
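
    The scoring-and-testing loop lends itself to a compact illustration. The sketch below (Python, simulated data) is not the authors' implementation; the z-score deviation statistic and Fisher's exact test are illustrative choices standing in for whatever statistics the tool actually uses.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Simulated responses (e.g., change in HbA1c) for patients with
        # similar characteristics and medication regimens.
        deltas = rng.normal(-1.0, 0.5, size=200)

        # Deviation score: how far each patient's response lies from the
        # responses observed in the matched group.
        z = (deltas - deltas.mean()) / deltas.std(ddof=1)
        deviant = np.abs(z) > 2.0          # cohort of unexpected responders

        # Test whether a (simulated) binary clinical feature correlates
        # with membership in the deviant cohort.
        feature = rng.integers(0, 2, size=200).astype(bool)
        table = [[int(np.sum(deviant & feature)), int(np.sum(deviant & ~feature))],
                 [int(np.sum(~deviant & feature)), int(np.sum(~deviant & ~feature))]]
        odds, p = stats.fisher_exact(table)
        print(f"odds ratio = {odds:.2f}, p = {p:.3f}")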

  17. Using Noble Gas Tracers to Estimate CO2 Saturation in the Field: Results from the 2014 CO2CRC Otway Repeat Residual Saturation Test

    NASA Astrophysics Data System (ADS)

    LaForce, T.; Ennis-King, J.; Boreham, C.; Serno, S.; Cook, P. J.; Freifeld, B. M.; Gilfillan, S.; Jarrett, A.; Johnson, G.; Myers, M.; Paterson, L.

    2015-12-01

    Residual trapping efficiency is a critical parameter in the design of secure subsurface CO2 storage. Residual saturation is also a key parameter in oil and gas production when a field is under consideration for enhanced oil recovery. Tracers are an important tool that can be used to estimate saturation in field tests. A series of measurements of CO2 saturation in an aquifer were undertaken as part of the Otway stage 2B extension field project in Dec. 2014. These tests were a repeat of similar tests in the same well in 2011 with improvements to the data collection and handling method. Two single-well tracer tests using noble gas tracers were conducted. In the first test krypton and xenon are injected into the water-saturated formation to establish dispersivity of the tracers in single-phase flow. Near-residual CO2 saturation is then established near the well. In the second test krypton and xenon are injected with CO2-saturated water to measure the final CO2 saturation. The recovery rate of the tracers is similar to predicted rates using recently published partitioning coefficients. Due to technical difficulties, there was mobile CO2 in the reservoir throughout the second tracer test in 2014. As a consequence, it is necessary to use a variation of the previous simulation procedure to interpret the second tracer test. One-dimensional, radial simulations are used to estimate average saturation of CO2 near the well. Estimates of final average CO2 saturation are computed using two relative permeability models, thermal and isothermal simulations, and three sets of coefficients for the partitioning of the tracers between phases. Four of the partitioning coefficients used were not previously available in the literature. The noble gas tracer field test and analysis of the 2011 and 2014 data both give an average CO2 saturation that is consistent with other field measurements. This study has demonstrated the repeatability of the methodology for noble gas tracer tests in the field.
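
    For background, the idealized relation linking tracer retardation to saturation can be stated compactly. The sketch below shows only the textbook partitioning-tracer formula; as the abstract notes, the 2014 test actually required one-dimensional radial simulations because mobile CO2 was present, so this closed-form relation is the conceptual starting point, not the paper's interpretation method. The partition coefficient value is illustrative.

        def co2_saturation(R, K):
            """Average CO2 saturation S from the retardation R of a
            partitioning tracer (e.g., Kr or Xe) relative to a conservative
            tracer, using the textbook relation R = 1 + K * S / (1 - S),
            i.e. S = (R - 1) / (R - 1 + K), where K is the tracer's
            CO2-water partition coefficient."""
            return (R - 1.0) / (R - 1.0 + K)

        # Example: breakthrough retarded by 60% (R = 1.6) with an assumed
        # partition coefficient K = 8 (illustrative value only).
        print(co2_saturation(R=1.6, K=8.0))   # ~0.07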

  18. SELECTION AND TREATMENT OF STRIPPER GAS WELLS FOR PRODUCTION ENHANCEMENT IN THE MID-CONTINENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott Reeves

    2003-08-01

    Stripper gas wells are an important source of domestic energy supply and under constant threat of permanent loss (shut-in) due to marginal economics. In 1998, 192 thousand stripper gas wells produced over a Tcf of gas, at an average rate of less than 16 Mcfd. This represents about 57% of all producing gas wells in the onshore lower-48 states, yet only 8% of production. Reserves of stripper gas wells are estimated to be only 1.6 Tcf, or slightly over 1% of the onshore lower-48 total (end of year 1996 data). Obviously, stripper gas wells are at the very margin of economic sustenance. As the demand for natural gas in the U.S. grows to the forecasted estimate of over 30 Tcf annually by the year 2010, supply from current conventional sources is expected to decline. Therefore, an important need exists to fully exploit known domestic resources of natural gas, including those represented by stripper gas wells. The overall objectives of this project are to develop an efficient and low-cost methodology to broadly categorize the well performance characteristics for a stripper gas field, identify the high-potential candidate wells for remediation, and diagnose the specific causes for well underperformance. With this capability, stripper gas well operators can more efficiently and economically produce these resources and maximize these gas reserves. A further objective is to identify/develop, evaluate and test "new and novel," economically viable remediation options. Finally, it is the objective of this project that all the methods and technologies developed in this project, while being tested in the Mid-Continent, be widely applicable to stripper gas wells of all types across the country. The project activities during the reporting period were: (1) Compiled information and results of field activities that Oneok has conducted in relation to the project. Field activities have included performing six pressure transient tests, and implementing six workovers, four of which were Gas-Gun treatments. (2) Results indicate that the candidate selection methodology was marginally successful based on the pressure transient test results, but highly successful based on the workovers. For the selected candidate wells that were worked over, incremental reserve costs were $1.00/Mcf. (3) Based on the combined results, the accuracy of the candidate selection methodology tested under this project is unclear. Generally, however, the technique should provide better-than-average candidate selections.

  19. Fluvial sediment fingerprinting: literature review and annotated bibliography

    USGS Publications Warehouse

    Williamson, Joyce E.; Haj, Adel E.; Stamm, John F.; Valder, Joshua F.; Prautzch, Vicki L.

    2014-01-01

    The U.S. Geological Survey has evaluated and adopted various field methods for collecting real-time sediment and nutrient data. These methods have proven to yield valuable representations of sediment and nutrient concentrations and loads but are not able to accurately identify specific source areas. Recently, more advanced data collection and analysis techniques have been evaluated that show promise in identifying specific source areas. Application of field methods could include studies of sources of fluvial sediment, otherwise referred to as sediment “fingerprinting.” The identification of sediment sources is important, in part, because knowing the primary sediment source areas in watersheds ensures that best management practices are incorporated in areas that maximize reductions in sediment loadings. This report provides a literature review and annotated bibliography of existing methodologies applied in the field of fluvial sediment fingerprinting. The literature review provides a bibliography of publications where sediment fingerprinting methods have been used; however, it does not claim to be an exhaustive listing. Selected publications were categorized by methodology, with some additional summary information. The information contained in the summary may help researchers select methods better suited to their particular study or study area, and identify methods in need of more testing and application.

  20. Buoyant Outflows in the Presence of Complex Topography

    DTIC Science & Technology

    2010-09-30

    of the flow exchange through the Dardanelles Strait on the Aegean Sea coastal flows, cross-shelf exchanges and basin-wide eddy field; e) examine... enhance the predictive capability of operational Navy models, by developing and testing a methodology to link the Mediterranean and Black Sea basins... in the Aegean Sea through the Dardanelles Strait was shown to have a significant impact on the basin-wide circulation, with implications on the

  1. The Value of Being a Conscientious Learner: Examining the Effects of the Big Five Personality Traits on Self-Reported Learning from Training

    ERIC Educational Resources Information Center

    Woods, Stephen A.; Patterson, Fiona C.; Koczwara, Anna; Sofat, Juilitta A.

    2016-01-01

    Purpose: The aim of this paper is to examine the impact of personality traits of the Big Five model on training outcomes to help explain variation in training effectiveness. Design/Methodology/Approach: Associations of the Big Five with self-reported learning following training were tested in a pre- and post-design in a field sample of junior…

  2. Quantitative Field Testing Rotylenchulus reniformis DNA from Metagenomic Samples Isolated Directly from Soil

    PubMed Central

    Showmaker, Kurt; Lawrence, Gary W.; Lu, Shien; Balbalian, Clarissa; Klink, Vincent P.

    2011-01-01

    A quantitative PCR procedure targeting the β-tubulin gene determined the number of Rotylenchulus reniformis Linford & Oliveira 1940 in metagenomic DNA samples isolated from soil. Notably, quantification was achieved in the presence of other soil-dwelling plant-parasitic nematodes, including the sister genus Helicotylenchus Steiner, 1945. The methodology provides a framework for molecular diagnostics of nematodes from metagenomic DNA isolated directly from soil. PMID:22194958
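
    The record does not include the calibration details, so the sketch below illustrates the standard qPCR standard-curve workflow such an assay typically relies on; the Ct values and dilution series are hypothetical, not data from the study.

        import numpy as np

        # Hypothetical standard curve: Ct values measured for known
        # beta-tubulin copy numbers (a serial dilution of standards).
        copies  = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
        ct_stds = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

        # Fit Ct = m * log10(copies) + b.
        m, b = np.polyfit(np.log10(copies), ct_stds, 1)
        efficiency = 10 ** (-1.0 / m) - 1.0   # amplification efficiency from slope

        def quantify(ct_sample):
            """Copy number in a metagenomic soil-DNA sample from its Ct value."""
            return 10 ** ((ct_sample - b) / m)

        print(f"slope = {m:.2f}, efficiency = {efficiency:.1%}")
        print(f"estimated copies: {quantify(27.5):.0f}")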

  3. Measuring Sense of Community in the Military: Cross-Cultural Evidence for the Validity of the Brief Sense of Community Scale and Its Underlying Theory

    ERIC Educational Resources Information Center

    Wombacher, Jorg; Tagg, Stephen K.; Burgi, Thomas; MacBryde, Jillian

    2010-01-01

    In this article, the authors present a German Sense of Community (SOC) Scale for use in military settings. The scale is based on the translation and field-testing of an existing U.S.-based measure of neighborhood SOC (Peterson, Speer, & McMillan, 2008). The methodological intricacies underlying cross-cultural scale development are highlighted, as…

  4. Eye-Tracking Technology and the Dynamics of Natural Gaze Behavior in Sports: A Systematic Review of 40 Years of Research.

    PubMed

    Kredel, Ralf; Vater, Christian; Klostermann, André; Hossner, Ernst-Joachim

    2017-01-01

    Reviewing 60 studies on natural gaze behavior in sports, it becomes clear that, over the last 40 years, the use of eye-tracking devices has considerably increased. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives, on the one hand, for ecologically valid test settings (i.e., viewing conditions and response modes) and, on the other, for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses). To meet both demands, some promising methodological compromises have been proposed, in particular the integration of robust mobile eye-trackers into motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to carefully weigh the arguments for one or the other approach by accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of the current mobile eye-tracking methodology seems highly advisable, allowing the acquisition and algorithmic analysis of larger amounts of gaze data and increasing the explanatory power of the derived results.

  5. The case for causal influences of action videogame play upon vision and attention.

    PubMed

    Kristjánsson, Árni

    2013-05-01

    Over the past decade, exciting findings have surfaced suggesting that routine action videogame play improves attentional and perceptual skills. Apparently, performance during multiple-object tracking, useful-field-of-view tests, and task switching improves, contrast sensitivity and spatial-resolution thresholds decrease, and the attentional blink and backward masking are lessened by short-term training on action videogames. These are remarkable findings showing promise for the training of attention and the treatment of disorders of attentional function. While the findings are interesting, evidence of causal influences of videogame play is not as strong as is often claimed. In many studies, observers with game play experience and those without are tested. Such studies do not address causality, since preexisting differences are not controlled for. Other studies investigate the training of videogame play, with some evidence of training benefits. Methodological shortcomings and potential confounds limit their impact, however, and they have not always been replicated. No longitudinal studies on videogame training exist, but these may be required to provide conclusive answers about any benefits of videogame training and any interaction with preexisting differences. Suggestions for methodological improvement are made here, including recommendations for longitudinal studies. Such studies may become crucial for the field of attentional training to reach its full potential.

  6. Spatial genetic analyses reveal cryptic population structure and migration patterns in a continuously harvested grey wolf (Canis lupus) population in north-eastern Europe.

    PubMed

    Hindrikson, Maris; Remm, Jaanus; Männil, Peep; Ozolins, Janis; Tammeleht, Egle; Saarma, Urmas

    2013-01-01

    Spatial genetics is a relatively new field in wildlife and conservation biology that is becoming an essential tool for unravelling the complexities of animal population processes, and for designing effective strategies for conservation and management. Conceptual and methodological developments in this field are therefore critical. Here we present two novel methodological approaches that further the analytical possibilities of STRUCTURE and DResD. Using these approaches we analyse structure and migrations in a grey wolf (Canis lupus) population in north-eastern Europe. We genotyped 16 microsatellite loci in 166 individuals sampled from the wolf population in Estonia and Latvia that has been under strong and continuous hunting pressure for decades. Our analysis demonstrated that this relatively small wolf population comprises four genetic groups. We also used a novel methodological approach that uses linear interpolation to statistically test the spatial separation of genetic groups. The new method, which works directly on output from the program STRUCTURE, can be applied widely in population genetics to reveal both core areas and areas of low significance for genetic groups. We also used a recently developed, spatially explicit, individual-based method, DResD, and applied it for the first time to microsatellite data, revealing a migration corridor and barriers, and several contact zones.
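
    The interpolation-based significance test is described only at a high level here. As a plainly labeled stand-in, the sketch below uses a standard Moran's I permutation test on STRUCTURE-style membership coefficients to convey the idea of statistically testing whether a genetic group is spatially clustered; it is not the authors' linear-interpolation method, and the coordinates and membership values are synthetic.

        import numpy as np

        def morans_i(xy, q):
            """Moran's I of membership coefficients q at coordinates xy,
            with inverse-distance weights: positive values indicate that
            genetically similar individuals are spatially clustered."""
            d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
            w = np.where(d > 0.0, 1.0 / np.maximum(d, 1e-12), 0.0)
            z = q - q.mean()
            return (len(q) / w.sum()) * (z @ w @ z) / (z @ z)

        def permutation_p(xy, q, n_perm=999, seed=0):
            """One-sided permutation p-value for spatial clustering of q."""
            rng = np.random.default_rng(seed)
            obs = morans_i(xy, q)
            perms = np.array([morans_i(xy, rng.permutation(q))
                              for _ in range(n_perm)])
            return obs, (1 + np.sum(perms >= obs)) / (n_perm + 1)

        # Synthetic example: 166 individuals with a west-east membership cline.
        rng = np.random.default_rng(1)
        xy = rng.uniform(0, 100, size=(166, 2))                  # locations (km)
        q = np.clip(xy[:, 0] / 100 + rng.normal(0, 0.1, 166), 0, 1)
        print(permutation_p(xy, q))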

  7. Downward continuation of airborne gravity data by means of the change of boundary approach

    NASA Astrophysics Data System (ADS)

    Mansi, A. H.; Capponi, M.; Sampietro, D.

    2018-03-01

    Within the modelling of gravity data, a common practice is the upward/downward continuation of the signal, i.e. the process of continuing the gravitational signal in the vertical direction away from or closer to the sources, respectively. The gravity field, being a potential field, satisfies Laplace's equation outside the masses, which means that this analytical continuation can be performed unambiguously only in a source-free domain. The analytical continuation problem has been solved both in the space and spectral domains by exploiting different algorithms. As is well known, the downward continuation operator, unlike the upward one, is unstable, because its spectral characteristics are similar to those of a high-pass filter, and several regularization methods have been proposed to stabilize it. In this work, an iterative procedure to downward/upward continue gravity field observations acquired at different altitudes is proposed. The methodology is based on the change of boundary principle and has been designed expressly for aerogravimetric observations for geophysical exploration purposes. Within this field of application, several simplifications can usually be made, basically owing to the specific characteristics of airborne surveys, which are usually flown at almost constant altitude as close as possible to the terrain. As shown in the present work, these characteristics allow the downward continuation to be performed without the need for any regularization. The performance of the proposed methodology has been evaluated by means of a numerical test on real data acquired in the South of Australia. The test shows that it is possible to continue the aerogravimetric data, acquired along tracks with a maximum height difference of about 250 m, with accuracies of the order of 10^{-3} mGal.
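
    The record does not reproduce the continuation operator, but the instability it refers to is easy to see in the standard flat-earth wavenumber-domain form, where continuation by a height change dz multiplies each Fourier component by exp(-|k| dz); for downward continuation (dz < 0) this grows exponentially with wavenumber. A minimal numpy sketch on a synthetic 1D profile (not the paper's change-of-boundary iteration):

        import numpy as np

        def continue_field(g, dx, dz):
            """Upward (dz > 0) or downward (dz < 0) continue a gravity
            profile g, sampled at spacing dx, using the wavenumber-domain
            operator exp(-|k| * dz). Downward continuation amplifies high
            wavenumbers, which is why regularization (or, as in the paper,
            low flight heights plus an iterative scheme) is normally needed."""
            k = 2.0 * np.pi * np.fft.fftfreq(g.size, d=dx)
            G = np.fft.fft(g)
            return np.real(np.fft.ifft(G * np.exp(-np.abs(k) * dz)))

        # Example: continue a smooth synthetic anomaly down by 250 m
        # (sample spacing 100 m along track).
        x = np.arange(0.0, 51200.0, 100.0)
        g = np.exp(-((x - 25600.0) / 3000.0) ** 2)
        g_down = continue_field(g, dx=100.0, dz=-250.0)
        print(np.max(np.abs(g_down)))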

  8. Self-Study as an Emergent Methodology in Career and Technical Education, Adult Education and Technology: An Invitation to Inquiry

    ERIC Educational Resources Information Center

    Hawley, Todd S.; Hostetler, Andrew L.

    2017-01-01

    In this manuscript, the authors explore self-study as an emerging research methodology with the potential to open up spaces of inquiry for researchers, graduate students, and teachers in a broad array of fields. They argue that the fields of career and technical education (CTE), adult education and technology can leverage self-study methodology in…

  9. Preliminary Measurement of Electromagnetic Fields and Microdischarges From the Human Body.

    PubMed

    Zheng, Ying; Zhang, Houqi; Yip, Karr; Zheng, Zhen; Yang, Shiji

    2016-01-01

    From 1978-1999, a large number of experts and scholars in China tested and analyzed the external qi of qigong (ie, the electrical signals [ES] released from human practitioners). Development of negatives from some tests had revealed the existence of speckles on the films. In 1998, the current research team analyzed some of the negatives that had been exposed to the ES. The current research team intended to test for the presence of ES in qigong using the dielectric-barrier discharge (DBD) method. The study design involved 2 measurements: an electromagnetic test with an open, placebo-controlled methodology and an optical test with a single-blinded, open, placebo-controlled methodology. The study occurred in Taiyuan, Suzhou, and Shenzhen (China) as well as in Hong Kong. Participants were 10 qigong masters and practitioners and 5 nonpractitioners from 4 cities. In the ES test, the practitioners released ES and the nonpractitioners simulated the release of ES, using 2 channels. Any ambient disturbance was recorded on both channels. For the photo film, the practitioner or nonpractitioner could press his or her palm onto 1 envelope that contained film or could hold his or her palm a certain distance (5-30 cm) above the envelope to release ES or simulate its release, respectively. An oscilloscope, current probes, and photo negatives were used to acquire >50,000 images. A type of discharged electromagnetic field (EMF), with a frequency of approximately 0.3-200 MHz, was recorded. The microdischarge pulses were positive, with a pulse width from 2-100 ns and a total charge of approximately 0.001-0.2 nC. Many speckles could also be clearly seen in the photo negatives. Within the context of DBD theory, the speckles could be individual footprints of a barrier discharge for which the human skin acts as a barrier layer. Thus, the study recorded reproducible EMFs, with a frequency of approximately 0.3-200 MHz, and microdischarge pulses, suggesting that the microdischarge may act as a barrier discharge for which the human skin forms the barrier layer.

  10. PRELIMINARY DATA REPORT: HUMATE INJECTION AS AN ENHANCED ATTENUATION METHOD AT THE F-AREA SEEPAGE BASINS, SAVANNAH RIVER SITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millings, M.

    2013-09-16

    A field test of a humate technology for uranium and I-129 remediation was conducted at the F-Area Field Research Site as part of the Attenuation-Based Remedies for the Subsurface Applied Field Research Initiative (ABRS AFRI) funded by the DOE Office of Soil and Groundwater Remediation. Previous studies have shown that humic acid sorbed to sediments strongly binds uranium at mildly acidic pH and potentially binds iodine-129 (I-129). Use of humate could be applicable for contaminant stabilization at a wide variety of DOE sites; however, pilot field-scale tests and optimization of this technology are required to move this technical approach from basic science to actual field deployment and regulatory acceptance. The groundwater plume at the F-Area Field Research Site contains a large number of contaminants, the most important from a risk perspective being strontium-90 (Sr-90), uranium isotopes, I-129, tritium, and nitrate. Groundwater remains acidic, with pH as low as 3.2 near the basins, increasing to the background pH of approximately 5 at the plume fringes. The field test was conducted in monitoring well FOB 16D, which historically has shown low pH and elevated concentrations of Sr-90, uranium, I-129 and tritium. The field test included three months of baseline monitoring, followed by injection of a potassium humate solution and approximately four and a half months of post-injection monitoring. Samples were collected and analyzed for numerous constituents, but the focus was on attenuation of uranium, Sr-90, and I-129. This report provides background information, methodology, and preliminary field results for the humate field test. Results from the field monitoring show that most of the excess humate (i.e., humate that did not sorb to the sediments) has flushed through the surrounding formation. Furthermore, the data indicate that the test succeeded in loading a band of sediment surrounding the injection point such that pH could return to near normal during the study timeframe. Future work will involve a final report, which will include data trends, correlations, and interpretations of laboratory data.

  11. A comprehensive methodology for the multidimensional and synchronic data collecting in soundscape.

    PubMed

    Kogan, Pablo; Turra, Bruno; Arenas, Jorge P; Hinalaf, María

    2017-02-15

    The soundscape paradigm comprises complex living systems in which individuals interact moment by moment with one another and with the physical environment. Real environments provide promising conditions for revealing deep soundscape behavior, including the multiple components involved and their interrelations as a whole. However, measuring and analyzing the numerous simultaneous variables of soundscape represents a challenge that is not completely understood. This work proposes and applies a comprehensive methodology for multidimensional and synchronic data collection in soundscape. The soundscape variables were organized into three main entities: experienced environment, acoustic environment, and extra-acoustic environment, containing, in turn, subgroups of variables called components. The variables contained in these components were acquired through synchronic field techniques that include surveys, acoustic measurements, audio recordings, photography, and video. The proposed methodology was tested, optimized, and applied in diverse open environments, including squares, parks, fountains, university campuses, streets, and pedestrian areas. The systematization of this comprehensive methodology provides a framework for soundscape research, support for urban and environmental management, and a preliminary procedure for standardizing soundscape data collection. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Self-Contained Automated Methodology for Optimal Flow Control

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, Roy A.; Erlebacherl, Gordon; Hussaini, M. Yousuff

    1997-01-01

    This paper describes a self-contained, automated methodology for active flow control which couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields and controls (e.g., actuators), may be determined. The problem of boundary layer instability suppression through wave cancellation is used as the initial validation case to test the methodology. Here, the objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc. The present methodology has been extended to three dimensions and may potentially be applied to separation control, re-laminarization, and turbulence control applications using one to many sensors and actuators.
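
    The coupled structure described here (state solve, adjoint solve, control update from the optimality condition) can be conveyed with a toy linear-quadratic example. The sketch below is emphatically not the paper's Navier-Stokes formulation; it applies the same adjoint pattern to an assumed linear state equation, with all dimensions and values invented for illustration.

        import numpy as np

        # Toy adjoint-based control: state u solves A u = f + B c, and the
        # objective is J = 0.5 * ||u - u_target||^2. The adjoint system
        # A^T lam = (u - u_target) yields the gradient dJ/dc = B^T lam,
        # so the control c is improved by gradient descent.
        rng = np.random.default_rng(0)
        n, m = 50, 3                                   # state / control sizes
        A = np.eye(n) + 0.02 * rng.normal(size=(n, n)) # well-conditioned operator
        B = rng.normal(size=(n, m))
        f = rng.normal(size=n)
        u_target = rng.normal(size=n)

        c = np.zeros(m)
        for _ in range(500):
            u = np.linalg.solve(A, f + B @ c)            # forward (state) solve
            lam = np.linalg.solve(A.T, u - u_target)     # adjoint solve
            c -= 0.005 * (B.T @ lam)                     # gradient step on control

        print(0.5 * np.sum((np.linalg.solve(A, f + B @ c) - u_target) ** 2))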

  13. Stress-Strain Characterization for Reversed Loading Path and Constitutive Modeling for AHSS Springback Predictions

    NASA Astrophysics Data System (ADS)

    Zhu, Hong; Huang, Mai; Sadagopan, Sriram; Yao, Hong

    2017-09-01

    With increasing vehicle fuel economy standards, automotive OEMs are widely using various AHSS grades, including DP, TRIP, CP and 3rd Gen AHSS, to reduce vehicle weight because of their good combination of strength and formability. As one of the enabling technologies for AHSS application, the need for accurate prediction of springback for cold-stamped AHSS parts has stimulated a large number of investigations over the past decade into reversed loading paths at large strains and the associated constitutive modeling. Given the spectrum of complex loading histories occurring in production stamping processes, there are many challenges in this field, including issues of test data reliability, loading path representability, constitutive model robustness and non-unique constitutive parameter identification. In this paper, various testing approaches and constitutive models are reviewed briefly, and a systematic methodology spanning stress-strain characterization and constitutive model parameter identification for material card generation is presented in order to support automotive OEMs' needs in virtual stamping. This systematic methodology features a tension-compression test at large strain with a robust anti-buckling device and concurrent friction force correction, properly selected loading paths to represent material behavior during different springback modes, and the 10-parameter Yoshida model with knowledge-based parameter identification through nonlinear optimization. Validation cases for lab AHSS parts are also discussed to check the applicability of this methodology.
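
    The 10-parameter Yoshida model itself is beyond a short sketch, but the parameter-identification-by-nonlinear-optimization step can be illustrated with a deliberately simplified one-dimensional hardening law. The sketch below fits a Voce law to synthetic flow-stress data with scipy; it is a stand-in for, not an implementation of, the Yoshida calibration described in the paper, and all values are invented.

        import numpy as np
        from scipy.optimize import least_squares

        def voce(params, eps):
            """Voce hardening law: sigma = s0 + Q * (1 - exp(-b * eps))."""
            s0, Q, b = params
            return s0 + Q * (1.0 - np.exp(-b * eps))

        def residuals(params, eps, sigma_meas):
            return voce(params, eps) - sigma_meas

        # Synthetic "measured" flow-stress data (MPa) with noise.
        eps_meas = np.linspace(0.0, 0.15, 30)
        sigma_meas = voce([600.0, 350.0, 20.0], eps_meas)
        sigma_meas += np.random.default_rng(1).normal(0.0, 5.0, eps_meas.size)

        # Nonlinear least-squares identification of (s0, Q, b).
        fit = least_squares(residuals, x0=[500.0, 200.0, 10.0],
                            args=(eps_meas, sigma_meas), bounds=(0.0, np.inf))
        print(fit.x)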

  14. Cost analysis of ground-water supplies in the North Atlantic region, 1970

    USGS Publications Warehouse

    Cederstrom, Dagfin John

    1973-01-01

    The cost of municipal and industrial ground water (or, more specifically, large supplies of ground water) at the wellhead in the North Atlantic Region in 1970 generally ranged from 1.5 to 5 cents per thousand gallons. Water from crystalline rocks and shale is relatively expensive. Water from sandstone is less so. Costs of water from sands and gravels in glaciated areas and from Coastal Plain sediments range from moderate to very low. In carbonate rocks costs range from low to fairly high. The cost of ground water at the wellhead is low in areas of productive aquifers, but owing to the cost of connecting pipe, costs increase significantly in multiple-well fields. In the North Atlantic Region, development of small to moderate supplies of ground water may offer favorable cost alternatives to planners, but large supplies of ground water for delivery to one point cannot generally be developed inexpensively. Well fields in the less productive aquifers may be limited by costs to 1 or 2 million gallons a day, but in the more favorable aquifers development of several tens of millions of gallons a day may be practicable and inexpensive. The cost evaluations presented cannot be applied to any one specific well or site because yields of wells in any one place will depend on the local geologic and hydrologic conditions; however, with such cost adjustments as may be necessary, the methodology presented should have wide applicability. Data given show the cost of water at the wellhead based on the average yield of several wells. The cost of water delivered by a well field includes costs of connecting pipe and of wells that have the yields and spacings specified. The cost of transporting water from the well field to the point of consumption and the possible cost of treatment are not evaluated. In the methodology employed, the costs of drilling and testing, pumping equipment, and engineering for the well field, amortization at 5½ percent interest, maintenance, and the cost of power are considered. The report includes an analysis of test drilling costs leading to a production well field. The discussion shows that test drilling is a relatively low-cost item and that drilling more than the minimum number of test holes in a previously unexplored area is, above all, simple insurance for keeping down costs and may easily result in lower final costs for the system. Use of the jet drill for testing is considered short-sighted and may result in higher total costs and possibly failure to discover good aquifers. Economic development of ground-water supplies will depend on obtaining qualified hydrologic and engineering advice, on carrying out adequate test drilling, and on utilizing high-quality (at times, more costly) material.
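
    The amortization-based costing lends itself to a worked example. The sketch below annualizes capital with a capital recovery factor and divides by production to get cents per thousand gallons; the interest rate follows the report, but all other input numbers are invented for illustration and are not the report's data.

        def cost_per_kgal(capital, om_annual, power_annual, mgd,
                          rate=0.055, years=25):
            """Wellhead cost of ground water in cents per thousand gallons.

            Annualizes capital (drilling, testing, pumps, engineering) with
            a capital recovery factor, adds maintenance and power, and
            divides by annual production. Illustrative inputs only.
            """
            crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
            annual_cost = capital * crf + om_annual + power_annual
            kgal_per_year = mgd * 1000.0 * 365.0      # thousand gallons/year
            return 100.0 * annual_cost / kgal_per_year

        # Example: a $100k well field producing 1 Mgal/d
        # gives roughly 4 cents per thousand gallons.
        print(f"{cost_per_kgal(100_000, 3_000, 5_000, mgd=1.0):.1f} cents/kgal")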

  15. Framework and indicator testing protocol for developing and piloting quality indicators for the UK quality and outcomes framework.

    PubMed

    Campbell, Stephen M; Kontopantelis, Evangelos; Hannon, Kerin; Burke, Martyn; Barber, Annette; Lester, Helen E

    2011-08-10

    Quality measures should be subjected to a testing protocol before being used in practice, using key attributes such as acceptability, feasibility and reliability, as well as identifying issues derived from actual implementation and unintended consequences. We describe the methodologies and results of an indicator testing protocol (ITP) using data from proposed quality indicators for the United Kingdom Quality and Outcomes Framework (QOF). The indicator testing protocol involved a multi-step methodological process: 1) the RAND/UCLA Appropriateness Method, to test clarity and necessity; 2) data extraction from patients' medical records, to test technical feasibility and reliability; 3) diaries, to test workload; 4) cost-effectiveness modelling; and 5) semi-structured interviews, to test acceptability, implementation issues and unintended consequences. Testing was conducted in a sample of representative family practices in England. These methods were combined into an overall recommendation for each tested indicator. Using an indicator testing protocol as part of piloting was seen as a valuable way of testing potential indicators in 'real world' settings. Pilot 1 (October 2009-March 2010) involved thirteen indicators across six clinical domains; twelve indicators passed the indicator testing protocol. However, the indicator testing protocol identified a number of implementation issues and unintended consequences that can be rectified or removed prior to national roll out. A palliative care indicator is used as an exemplar of the value of piloting using a multiple-attribute indicator testing protocol: while technically feasible and reliable, it was unacceptable to practice staff and raised concerns about potentially causing actual patient harm. This indicator testing protocol is one example of a protocol that may be useful in assessing potential quality indicators when adapted to specific country health care settings, and it may be of use to policy-makers and researchers worldwide to test the likely effect of implementing indicators prior to roll out. It builds on and codifies existing literature and other testing protocols to create a field testing methodology that can be used to produce country-specific quality indicators for pay-for-performance or quality improvement schemes.

  16. The effects of noise on the cognitive performance of physicians in a hospital emergency department

    NASA Astrophysics Data System (ADS)

    Dodds, Peter

    In this research, the acoustic environment of a contemporary urban hospital emergency department has been characterized. Perceptual and cognitive tests relating to the acoustic environment were conducted on both medical professionals and lay people, and a methodology for developing augmentable acoustic simulations from field recordings was developed. While research on healthcare environments remains a popular area of investigation for the acoustics community, a lack of communication between medical and acoustics researchers, as well as a lack of sophistication in the methods used to evaluate hospital environments and their occupants, has led to stagnation. This research replicated traditional methods for the evaluation of hospital acoustic environments, including impulse-response-based room acoustics measurements as well as psychoacoustic evaluations. This thesis also demonstrates some of the issues associated with conducting such research and provides an outline and implementation for alternative advanced methods of research. Advancements include the use of the n-Back test to evaluate the effects of the acoustic environment on cognitive function, as well as the outline of a new methodology for implementing realistic immersive simulations for cognitive and perceptual testing using field recordings and signal processing techniques. Additionally, this research utilizes feedback from working emergency medicine physicians to determine the subjective degree of distraction subjects felt in response to a simulated acoustic environment. Results of the room acoustics measurements and all experiments are presented and analyzed, and possible directions for future research are discussed.

  17. Optimizing detection of noble gas emission at a former UNE site: sample strategy, collection, and analysis

    NASA Astrophysics Data System (ADS)

    Kirkham, R.; Olsen, K.; Hayes, J. C.; Emer, D. F.

    2013-12-01

    Underground nuclear tests may be first detected by seismic sensors or air samplers operated by the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization). After initial detection of a suspicious event, member nations may call for an On-Site Inspection (OSI) that, in part, will sample for localized releases of radioactive noble gases and particles. Although much of the commercially available equipment and many of the methods used for surface and subsurface environmental sampling of gases can be used in an OSI scenario, on-site sampling conditions, required sampling volumes and the establishment of background concentrations of noble gases require the development of specialized methodologies. To facilitate development of sampling equipment and methodologies that address OSI sampling volume and detection objectives, and to collect information required for model development, a field test site was created at a former underground nuclear explosion site located in welded volcanic tuff. A mixture of SF-6, Xe-127 and Ar-37 was metered into 4400 m3 of air as it was injected into the top region of the UNE cavity. These tracers were expected to move towards the surface primarily in response to barometric pumping or through delayed cavity pressurization (accelerated transport to minimize source decay time). Sampling approaches compared during the field exercise included sampling at the soil surface, inside surface fractures, and at soil vapor extraction points at depths down to 2 m. The effectiveness of the various sampling approaches and the results of the tracer gas measurements will be presented.

  18. Characterization of an In-Situ Ground Terminal via a Geostationary Satellite

    NASA Technical Reports Server (NTRS)

    Piasecki, Marie T.; Welch, Bryan W.; Mueller, Carl H.

    2015-01-01

    In 2015, the Space Communications and Navigation (SCaN) Testbed project completed an S-Band ground station located at the NASA Glenn Research Center in Cleveland, Ohio. This S-Band ground station was developed to create a fully characterized and controllable dynamic link environment for testing novel communication techniques for Software Defined Radios and Cognitive Communication Systems. In order to provide a useful environment for potential experimenters, it was necessary to characterize various RF devices at both the component level in the laboratory and at the system level after integration. This paper will discuss some of the laboratory testing of the ground station components, with a particular emphasis on the near-field measurements of the antenna. It will then describe the methodology for characterizing the installed ground station at the system level via a Tracking and Data Relay Satellite (TDRS), with specific focus given to the characterization of the ground station antenna pattern, where the maximum TDRS transmit power limited the validity of the non-noise-floor received power data to the antenna main lobe region. Finally, the paper compares the results of each test and provides lessons learned from this type of testing methodology.

  20. Fuzzy logic based sensor performance evaluation of vehicle mounted metal detector systems

    NASA Astrophysics Data System (ADS)

    Abeynayake, Canicious; Tran, Minh D.

    2015-05-01

    Vehicle Mounted Metal Detector (VMMD) systems are widely used for detection of threat objects in humanitarian demining and military route clearance scenarios. Due to the diverse nature of such operational conditions, operational use of VMMD without a proper understanding of its capability boundaries may lead to heavy casualties. Multi-criteria fitness evaluations are crucial for determining the capability boundaries of any sensor-based demining equipment. Evaluation of sensor-based military equipment is a multi-disciplinary topic combining the efforts of researchers, operators, managers and commanders having different professional backgrounds and knowledge profiles. Information acquired through field tests usually involves uncertainty, vagueness and imprecision due to variations in test and evaluation conditions during a single test or a series of tests. This report presents a fuzzy logic based methodology for experimental data analysis and performance evaluation of VMMD. This data evaluation methodology has been developed to evaluate sensor performance by consolidating expert knowledge with experimental data. A case study is presented by implementing the proposed data analysis framework in a VMMD evaluation scenario. The results of this analysis confirm the accuracy, practicability and reliability of the fuzzy logic based sensor performance evaluation framework.
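
    The abstract does not spell out the membership functions or rule base, so the following Python sketch is a toy illustration of the general fuzzy-evaluation pattern (fuzzify field measurements, fire rules, defuzzify to a fitness score); the variable names, membership shapes and rules are invented for illustration and are not the paper's.

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function peaking at b on support [a, c]."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def vmmd_fitness(det_rate, far_per_m2):
            """Toy fuzzy fitness score for a VMMD trial.

            Fuzzifies detection rate and false-alarm rate, fires two rules
            (high detection AND low FAR -> good; low detection OR high FAR
            -> poor), then defuzzifies with a weighted average."""
            det_high = tri(det_rate, 0.6, 1.0, 1.4)
            det_low  = tri(det_rate, -0.4, 0.0, 0.7)
            far_low  = tri(far_per_m2, -0.1, 0.0, 0.05)
            far_high = tri(far_per_m2, 0.02, 0.1, 0.2)

            good = min(det_high, far_low)      # rule 1 firing strength
            poor = max(det_low, far_high)      # rule 2 firing strength
            return (good * 1.0 + poor * 0.0) / max(good + poor, 1e-9)

        # Example trial: 92% detection, 0.01 false alarms per square metre.
        print(vmmd_fitness(det_rate=0.92, far_per_m2=0.01))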

  1. Heat tracer test in an alluvial aquifer: Field experiment and inverse modelling

    NASA Astrophysics Data System (ADS)

    Klepikova, Maria; Wildemeersch, Samuel; Hermans, Thomas; Jamin, Pierre; Orban, Philippe; Nguyen, Frédéric; Brouyère, Serge; Dassargues, Alain

    2016-09-01

    Using heat as an active tracer for aquifer characterization is a topic of increasing interest. In this study, we investigate the potential of heat tracer tests for characterization of a shallow alluvial aquifer. A thermal tracer test was conducted in the alluvial aquifer of the Meuse River, Belgium. The tracing experiment consisted of simultaneously injecting heated water and a dye tracer in an injection well and monitoring the evolution of groundwater temperature and tracer concentration in the pumping well and in measurement intervals. To gain insight into the 3D characteristics of the heat transport mechanisms, temperature data from a large number of observation wells closely spaced along three transects were used. Temperature breakthrough curves in observation wells are contrasted with what would be expected in an ideal layered aquifer. They reveal strongly unequal lateral and vertical components of the transport mechanisms. The observed complex behavior of the heat plume is explained by the groundwater flow gradient on the site and heterogeneities in the hydraulic conductivity field. Moreover, because of the high injection temperatures during the field experiment, a temperature-induced fluid density effect on heat transport occurred. By using a flow and heat transport numerical model with variable density, coupled with a pilot point approach for inversion of the hydraulic conductivity field, the main preferential flow paths were delineated. The successful application of a field heat tracer test at this site suggests that heat tracer tests are a promising approach to imaging the hydraulic conductivity field. This methodology could be applied in aquifer thermal energy storage (ATES) projects for assessing future efficiency, which is strongly linked to the hydraulic conductivity variability in the considered aquifer.

  2. Exploratory behaviour in the open field test adapted for larval zebrafish: impact of environmental complexity.

    PubMed

    Ahmad, Farooq; Richardson, Michael K

    2013-01-01

    This study aimed to develop and characterize a novel (standard) open field test adapted for larval zebrafish. We also developed and characterized a variant of the same assay consisting of a colour-enriched open field; this was used to assess the impact of environmental complexity on patterns of exploratory behaviours as well as to determine natural colour preference/avoidance. We report the following main findings: (1) zebrafish larvae display characteristic patterns of exploratory behaviours in the standard open field, such as thigmotaxis/centre avoidance; (2) environmental complexity (i.e. presence of colours) differentially affects patterns of exploratory behaviours and greatly attenuates natural zone preference; (3) larvae displayed the ability to discriminate colours. As reported previously in adult zebrafish, larvae showed avoidance towards blue and black; however, in contrast to the reported adult behaviour, larvae displayed avoidance towards red. Avoidance towards yellow and preference for green and orange are shown for the first time; (4) compared to standard open field tests, exposure to the colour-enriched open field resulted in an enhanced expression of anxiety-like behaviours. To conclude, we not only developed and adapted a traditional rodent behavioural assay that serves as a gold standard in preclinical drug screening, but also provide a version of the same test that affords the possibility to investigate the impact of environmental stress on behaviour in larval zebrafish while representing the first test for assessment of natural colour preference/avoidance in larval zebrafish. In the future, these assays will improve preclinical drug screening methodologies towards the goal of uncovering novel drugs. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Measurement accuracy of weighing and tipping-bucket rainfall intensity gauges under dynamic laboratory testing

    NASA Astrophysics Data System (ADS)

    Colli, M.; Lanza, L. G.; La Barbera, P.; Chan, P. W.

    2014-07-01

    The contribution of any single uncertainty factor to the resulting performance of in-field rain gauge measurements still has to be comprehensively assessed, owing to the high number of real-world error sources involved, such as the intrinsic variability of rainfall intensity (RI), wind effects, wetting losses, the ambient temperature, etc. In recent years the World Meteorological Organization (WMO) addressed these issues by fostering dedicated investigations, which revealed further difficulties in assessing the actual reference rainfall intensity in the field. This work reports on an extensive assessment of the OTT Pluvio2 weighing gauge's accuracy when measuring rainfall intensity under laboratory dynamic conditions (time-varying reference flow rates). The results obtained from the weighing rain gauge (WG) were also compared with those from an MTX tipping-bucket rain gauge (TBR) under the same test conditions. Tests were carried out by simulating various artificial precipitation events, with unsteady rainfall intensity, using a suitable dynamic rainfall generator. Real-world rainfall data measured by an Ogawa catching-type drop counter at a field test site located within the Hong Kong International Airport (HKIA) were used as a reference for the artificial rain generation system. Results demonstrate that the differences observed between the laboratory and field performance of catching-type gauges are only partially attributable to the weather and operational conditions in the field. The dynamics of real-world precipitation events are responsible for a large part of the measurement errors, which can be accurately assessed in the laboratory under controlled environmental conditions. This allows for new testing methodologies and the development of instruments with enhanced performance in the field.

  4. Evaluation of Mapping Methodologies at a Legacy Test Site

    NASA Astrophysics Data System (ADS)

    Sussman, A. J.; Schultz-Fellenz, E. S.; Roback, R. C.; Kelley, R. E.; Drellack, S.; Reed, D.; Miller, E.; Cooper, D. I.; Sandoval, M.; Wang, R.

    2013-12-01

    On June 12th, 1985, a nuclear test with an announced yield between 20 and 150 kt was detonated in rhyolitic lava in a vertical emplacement borehole at a depth of 608 m below the surface. This test did not collapse to the surface and form a crater, but rather resulted in a subsurface collapse with more subtle surface expressions of deformation, providing an opportunity to evaluate the site using a number of surface mapping methodologies. The site was investigated over a two-year time span by several mapping teams. In order to determine the most time-efficient and accurate approach for mapping post-shot surface features at a legacy test site, a number of different techniques were employed. The site was initially divided into four quarters, with teams applying various methodologies, techniques, and instrumentation to each quarter. Early methods included transect lines and site gridding with a Brunton pocket transit, flagging tape, measuring tape, and stakes; surveying using a hand-held personal GPS to locate observed features with an accuracy of ±5-10 m; and extensive photo-documentation. More recent methods have incorporated the use of near-survey-grade GPS devices to allow careful location and mapping of surface features. Initially, gridding was employed along with the high-resolution GPS surveys, but this was found to be time consuming and of little observational value. Raw visual observation (VOB) data included GPS coordinates for artifacts or features of interest, field notes, and photographs. A categorization system was used to organize the myriad of items, in order to aid in database searches and for visual presentation of findings. The collected data set was imported into a geographic information system (GIS) as points, lines, or polygons and overlain onto a digital color orthophoto map of the test site. Once these data were mapped, spectral data were collected using a high-resolution field spectrometer. In addition to geo-locating the field observations with 10-cm-resolution GPS, LiDAR and hyperspectral imagery were also acquired. The LiDAR and hyperspectral data are being processed and will be added to the existing geo-referenced database as separate information layers for remote sensing analysis of surface features associated with the legacy test. By consolidating the various components of a VOB data point (coordinates, photo and item description) into a standalone database, searching or querying for other components or collects, such as subsurface geophysical and/or airborne imagery, is made much easier. Work by Los Alamos National Laboratory was sponsored by the National Nuclear Security Administration Award No. DE-AC52-06NA25946/NST10-NCNS-PD00. Work by National Security Technologies, LLC, was performed under Contract No. DE AC52 06NA25946 with the U.S. Department of Energy.

  5. Leaching of organic contaminants from storage of reclaimed asphalt pavement.

    PubMed

    Norin, Malin; Strömvall, A M

    2004-03-01

    Recycling of asphalt has been promoted by rapid increases in both the use and price of petroleum-based bitumen. Semi-volatile organic compounds in leachates from reclaimed asphalt pavement, measured in field samples and in a laboratory column test, were analysed through a GC/MS screen-test methodology. Sixteen PAH (polyaromatic hydrocarbons) were also analysed in leachates from the column study. The highest concentrations of semi-volatile compounds, approximately 400 µg/l, were measured in field samples from the scarified stockpile. Naphthalene, butylated hydroxytoluene (BHT) and dibutyl phthalate (DBP) were the most dominant of the identified semi-volatiles. The occurrence of these compounds in urban groundwater also indicates high emission rates and persistent structures, making them potentially hazardous. Car exhausts, rubber tires and the asphalt material itself are all probable emission sources, judging from the organic contaminants released from the stockpiles. The major leaching mechanism indicated was dissolution of organic contaminants from the surface of the asphalt gravel. In the laboratory column test, the release of high-molecular-weight and more toxic PAH was higher in the leachates after two years than at the commencement of storage. The concentrations of semi-volatiles in these leachates were also several times lower than those from the field stockpile. These results demonstrate the need to follow up laboratory column tests with real field measurements.

  6. Advanced Capabilities for Wind Tunnel Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.

    2010-01-01

    Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate and higher quality test results, by eliminating uncertainties in testing, and to facilitate verification of computational tools for design. This paper discusses near-term developments underway in ground testing capabilities, which will enhance the quality of information about both the test article and the airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; efficient experiment methodologies, including quality assurance strategies within the test; and increasing test result information density by using extensive optical visualization together with computed flow field results. These points apply both to major investments in existing tunnel capabilities and to entirely new capabilities.

  7. A Multifunctional Interface Method for Coupling Finite Element and Finite Difference Methods: Two-Dimensional Scalar-Field Problems

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.

    2002-01-01

    A multifunctional interface method with capabilities for variable-fidelity modeling and multiple method analysis is presented. The methodology provides an effective capability by which domains with diverse idealizations can be modeled independently to exploit the advantages of one approach over another. The multifunctional method is used to couple independently discretized subdomains, and it is used to couple the finite element and the finite difference methods. The method is based on a weighted residual variational method and is presented for two-dimensional scalar-field problems. A verification test problem and a benchmark application are presented, and the computational implications are discussed.

  8. LSST Camera Optics Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riot, V J; Olivier, S; Bauman, B

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design; describe the methodology for fabricating, coating, mounting and testing the lenses and filters; and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  9. Optical holographic structural analysis of Kevlar rocket motor cases

    NASA Astrophysics Data System (ADS)

    Harris, W. J.

    1981-05-01

    The methodology of applying optical holography to the evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.

  10. AWARE Wide Field View

    DTIC Science & Technology

    2016-04-29

    G. Anderson, S. D. Feller, E. M. Vera, H. S. Son, S.-H. Youn, J. Kim, M. E. Gehm, D. J. Brady, J. M. Nichols, K. P. Judd, M. D. Duncan, J. R... "...scale in monocentric gigapixel cameras." Applied Optics 50(30): 5824-5833. Tremblay, E. J., et al. (2012). "Design and scaling of monocentric... ...cameras." Optomechanical Engineering 2013. A. E. Hatheway. 8836. Youn, S. H., et al. (2013). "Efficient testing methodologies for microcameras in a..."

  11. Historic Properties Report: Harry Diamond Laboratories, Maryland and Satellite Installations Woodbridge Research Facility, Virginia and Blossom Point Field Test Facility, Maryland

    DTIC Science & Technology

    1984-07-01

    HISTORIC PROPERTIES REPORT: HARRY DIAMOND LABORATORIES, MARYLAND, AND SATELLITE INSTALLATIONS: WOODBRIDGE RESEARCH FACILITY, VIRGINIA, AND BLOSSOM POINT FIELD TEST FACILITY, MARYLAND... report. METHODOLOGY. 1. Documentary Research. Harry Diamond Laboratories (HDL) and its two satellite facilities at Woodbridge and Blossom Point are... drawings, and written history. Interagency Archeological Services and U.S. Army, Harry Diamond Laboratories. 106 Case Report and Mitigation Plan: Ballast...

  12. The Development of NASA's Fault Management Handbook

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Fesq, Lorraine M.; Barth, Timothy; Clark, Micah; Day, John; Fretz, Kristen; Friberg, Kenneth; Johnson, Stephen; Hattis, Philip; McComas, David; hide

    2011-01-01

    NASA is developing an FM Handbook to establish guidelines and to provide recommendations for defining, developing, analyzing, evaluating, testing, and operating fault management (FM) systems. It establishes a process for developing FM throughout the lifecycle of a mission and provides a basis for moving the field toward a formal and consistent FM methodology to be applied on future programs. This paper describes the motivation for, the development of, and the future plans for the NASA FM Handbook.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapil, Sanjay; Oberst, R. D.; Bieker, Jill Marie

    Chemical disinfection and inactivation of viruses is largely understudied, but is very important, especially in the case of highly infectious viruses. The purpose of this LDRD was to determine the efficacy of the decontamination formulations developed at Sandia National Laboratories against Bovine Coronavirus (BCV) as a surrogate for the coronavirus that causes Severe Acute Respiratory Syndrome (SARS) in humans. The outbreak of SARS in late 2002 resulted from a highly infectious virus that was able to survive and remain infectious for extended periods. For this study, preliminary testing with Escherichia coli MS-2 (MS-2) and Escherichia coli T4 (T4) bacteriophages was conducted to develop virucidal methodology for verifying inactivation after treatment with the test formulations, following AOAC germicidal methodologies. After the determination of various experimental parameters (i.e., exposure, concentration) of the formulations, final testing was conducted on BCV. All experiments were conducted with various organic challenges (horse serum, bovine feces, compost) for results that more accurately represent field use conditions. The MS-2 and T4 were slightly more resistant than BCV and required a 2 minute exposure, while BCV was completely inactivated after a 1 minute exposure. These results were also consistent for the testing conducted in the presence of the various organic challenges, indicating that the test formulations are highly effective for real world application.

  14. Pumping tests in networks of multilevel sampling wells: Motivation and methodology

    USGS Publications Warehouse

    Butler, J.J.; McElwee, C.D.; Bohling, Geoffrey C.

    1999-01-01

    The identification of spatial variations in hydraulic conductivity (K) on a scale of relevance for transport investigations has proven to be a considerable challenge. Recently, a new field method for the estimation of interwell variations in K has been proposed. This method, hydraulic tomography, essentially consists of a series of short-term pumping tests performed in a tomographic-like arrangement. In order to fully realize the potential of this approach, information about lateral and vertical variations in pumping-induced head changes (drawdown) is required with a detail that has previously been unobtainable in the field. Pumping tests performed in networks of multilevel sampling (MLS) wells can provide data of the needed density if drawdown can be accurately and rapidly measured in the small-diameter tubing used in such wells. Field and laboratory experiments show that accurate transient drawdown data can be obtained in the small-diameter MLS tubing either directly, with miniature fiber-optic pressure sensors, or indirectly, using air-pressure transducers. As with data from many types of hydraulic tests, the quality of drawdown measurements from MLS tubing is quite dependent on the effectiveness of well development activities. Since MLS ports of the standard design are prone to clogging and are difficult to develop, alternate designs are necessary to ensure accurate drawdown measurements. Initial field experiments indicate that drawdown measurements obtained from pumping tests performed in MLS networks have considerable potential for providing valuable information about spatial variations in hydraulic conductivity.
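
    As a concrete illustration of the drawdown analysis that underlies such pumping tests, the sketch below evaluates the classical Theis (1935) solution for transient drawdown in a confined aquifer. This is standard well-hydraulics background rather than the study's own algorithm, and all parameter values are invented.

      # Minimal sketch of the Theis (1935) solution for transient drawdown,
      # the standard model behind pumping-test interpretation (illustrative
      # parameters, not values from the study).
      import numpy as np
      from scipy.special import exp1  # E1(u) equals the Theis well function W(u)

      def theis_drawdown(r, t, Q, T, S):
          """Drawdown s(r, t) [m] at radius r [m] and time t [s], for pumping
          rate Q [m^3/s], transmissivity T [m^2/s], and storativity S [-]."""
          u = r**2 * S / (4.0 * T * t)
          return Q / (4.0 * np.pi * T) * exp1(u)

      # Example: drawdown 10 m from a well pumping 10 L/s, after one hour.
      print(theis_drawdown(r=10.0, t=3600.0, Q=0.01, T=1e-3, S=1e-4))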

  15. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    PubMed

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals, with two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and the extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and the collaborative nature of the work; and (4) emphasis on generic measurement approaches that can be flexibly tailored to particular contexts of study.

  16. A literature review of empirical research on learning analytics in medical education

    PubMed Central

    Saqr, Mohammed

    2018-01-01

    The number of learning analytics publications in the field of medical education is still markedly low, despite recognition of the value of the discipline in the medical education literature and exponential growth of publications in other fields. This necessitates raising awareness of the research methods and potential benefits of learning analytics (LA). The aim of this paper was to offer a methodological systematic review of empirical LA research in the field of medical education and a general overview of the common methods used in the field in general. The search was done in the Medline database using the term “LA.” Inclusion criteria included empirical original research articles investigating LA using qualitative, quantitative, or mixed methodologies. Articles were also required to be written in English, published in a scholarly peer-reviewed journal, and have a dedicated section for methods and results. The Medline search resulted in only six articles fulfilling the inclusion criteria for this review. Most of the studies collected data about learners from learning management systems or online learning resources. Analysis used mostly quantitative methods, including descriptive statistics, correlation tests, and regression models in two studies. Patterns of online behavior and usage of the digital resources, as well as prediction of achievement, were the outcomes most studies investigated. Research about LA in the field of medical education is still in its infancy, with more questions than answers. The early studies are encouraging and showed that patterns of online learning can readily be revealed and students’ performance predicted. PMID:29599699

  17. A literature review of empirical research on learning analytics in medical education.

    PubMed

    Saqr, Mohammed

    2018-01-01

    The number of learning analytics publications in the field of medical education is still markedly low, despite recognition of the value of the discipline in the medical education literature and exponential growth of publications in other fields. This necessitates raising awareness of the research methods and potential benefits of learning analytics (LA). The aim of this paper was to offer a methodological systematic review of empirical LA research in the field of medical education and a general overview of the common methods used in the field in general. The search was done in the Medline database using the term "LA." Inclusion criteria included empirical original research articles investigating LA using qualitative, quantitative, or mixed methodologies. Articles were also required to be written in English, published in a scholarly peer-reviewed journal, and have a dedicated section for methods and results. The Medline search resulted in only six articles fulfilling the inclusion criteria for this review. Most of the studies collected data about learners from learning management systems or online learning resources. Analysis used mostly quantitative methods, including descriptive statistics, correlation tests, and regression models in two studies. Patterns of online behavior and usage of the digital resources, as well as prediction of achievement, were the outcomes most studies investigated. Research about LA in the field of medical education is still in its infancy, with more questions than answers. The early studies are encouraging and showed that patterns of online learning can readily be revealed and students' performance predicted.

  18. Cryogenic Insulation Standard Data and Methodologies Project

    NASA Technical Reports Server (NTRS)

    Summerfield, Burton; Thompson, Karen; Zeitlin, Nancy; Mullenix, Pamela; Fesmire, James; Swanger, Adam

    2015-01-01

    Extending some recent developments in the area of technical consensus standards for cryogenic thermal insulation systems, a preliminary inter-laboratory study of foam insulation materials was performed by NASA Kennedy Space Center and LeTourneau University. The initial focus was ambient-pressure cryogenic boiloff testing using the Cryostat-400 flat-plate instrument. Completion of a test facility at LETU has enabled direct, comparative testing, using identical cryostat instruments and methods, and the production of standard thermal data sets for a number of materials under sub-ambient conditions. The two sets of measurements were analyzed and indicate reasonable agreement between the two laboratories. Based on cryogenic boiloff calorimetry, new equipment and methods for testing thermal insulation systems have been successfully developed. These boiloff instruments (or cryostats) include both flat-plate and cylindrical models and are applicable to a wide range of materials under a wide range of test conditions. Test measurements are generally made at a large temperature difference (boundary temperatures of 293 K and 78 K are typical) and include the full vacuum pressure range. Results are generally reported as effective thermal conductivity (ke) and mean heat flux (q) through the insulation system. The new cryostat instruments provide an effective and reliable way to characterize the thermal performance of materials under sub-ambient conditions. Proven through thousands of tests of hundreds of material systems, they have supported a wide range of aerospace, industry, and research projects. Boiloff testing is not only for cryogenic applications: when adequately coupled with a technical standards basis, it provides a cost-effective, field-representative methodology to test any material or system for applications at sub-ambient to cryogenic temperatures. A growing need for energy efficiency and cryogenic applications is creating a worldwide demand for improved thermal insulation systems for low temperatures. The need for thermal characterization of these systems and materials raises a corresponding need for insulation test standards and thermal data targeted at cryogenic-vacuum applications. Such standards have a strong correlation to energy, transportation, and the environment, and to the advancement of new materials technologies in these areas. In conjunction with this project, two new standards on cryogenic insulation were recently published by ASTM International: C1774 and C740. Following the requirements of NPR 7120.10, Technical Standards for NASA Programs and Projects, the appropriate information in this report can be provided to the NASA Chief Engineer as input for NASA's annual report to NIST, as required by OMB Circular No. A-119, describing NASA's use of voluntary consensus standards and participation in the development of voluntary consensus standards and bodies.
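
    The data reduction behind boiloff calorimetry can be illustrated with a short sketch: a measured liquid-nitrogen boiloff rate is converted to a heat rate through the latent heat of vaporization, then to mean heat flux q and effective thermal conductivity ke via Fourier's law. This is a minimal illustration with invented numbers, not the Cryostat-400 instrument's actual reduction procedure, which includes calibration corrections.

      # Illustrative reduction of boiloff calorimetry data to mean heat flux (q)
      # and effective thermal conductivity (ke); all values are placeholders.
      H_FG_LN2 = 199e3   # latent heat of vaporization of liquid nitrogen, J/kg (approx.)
      RHO_LN2 = 807.0    # liquid nitrogen density, kg/m^3 (approx.)

      def boiloff_to_ke(volume_rate_Lph, area_m2, thickness_m, t_warm=293.0, t_cold=78.0):
          """Convert a measured boiloff volume rate (liters of liquid per hour)
          into heat flux q [W/m^2] and effective thermal conductivity ke [W/m-K]."""
          mass_rate = volume_rate_Lph / 1000.0 / 3600.0 * RHO_LN2   # kg/s
          heat_rate = mass_rate * H_FG_LN2                          # W
          q = heat_rate / area_m2                                   # W/m^2
          ke = q * thickness_m / (t_warm - t_cold)                  # Fourier's law
          return q, ke

      q, ke = boiloff_to_ke(volume_rate_Lph=0.5, area_m2=0.03, thickness_m=0.025)
      print(f"q = {q:.1f} W/m^2, ke = {ke*1000:.2f} mW/m-K")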

  19. Bioindicators of contaminant exposure and effect in aquatic and terrestrial monitoring

    USGS Publications Warehouse

    Melancon, Mark J.; Hoffman, David J.; Rattner, Barnett A.; Burton, G. Allen; Cairns, John

    2003-01-01

    Bioindicators of contaminant exposure presently used in environmental monitoring are discussed. Some have been extensively field-validated and are already in routine application. Included are (1) inhibition of brain or blood cholinesterase by anticholinesterase pesticides, (2) induction of hepatic microsomal cytochromes P450 by chemicals such as PAHs and PCBs, (3) reproductive problems such as terata and eggshell thinning, and (4) aberrations of hemoglobin synthesis, including the effects of lead and of certain chlorinated hydrocarbons. Many studies of DNA damage and histopathological effects, particularly in the form of tumors, have already been completed. There are presently numerous other opportunities for field validation. Bile metabolites of contaminants in fish reveal exposure to contaminants that might otherwise be difficult to detect or quantify. Bile analysis is beginning to be extended to species other than fishes. Assessment of oxidative damage and immune competence appear to be valuable biomarkers, needing only additional field validation for wider use. The use of metallothioneins as biomarkers depends on the development of convenient, inexpensive methodology that provides information not available from measurements of metal ions. The use of stress proteins as biomarkers depends on the development of convenient, inexpensive methodology and field validation. Gene arrays and proteomics hold promise as bioindicators for contaminant exposure or effect, particularly because of the large amount of data that could be generated, but they still need extensive development and testing.

  20. Segmentation of lung fields using Chan-Vese active contour model in chest radiographs

    NASA Astrophysics Data System (ADS)

    Sohn, Kiwon

    2011-03-01

    A CAD tool for chest radiographs consists of several procedures, and the very first step is segmentation of the lung fields. We develop a novel methodology for segmentation of lung fields in chest radiographs that satisfies the following two requirements. First, we aim for a segmentation method that does not need a training stage with manual estimation of anatomical features in a large training dataset of images. Secondly, for ease of implementation, it is desirable to apply a well-established model that is widely used for various image-partitioning practices. The Chan-Vese active contour model, which is based on the Mumford-Shah functional in the level set framework, is applied for segmentation of lung fields. With this model, segmentation of lung fields can be carried out without detailed prior knowledge of the radiographic anatomy of the chest, yet in some chest radiographs the trachea regions are unfavorably segmented out in addition to the lung field contours. To eliminate artifacts from the trachea, we locate the upper end of the trachea, find a vertical center line of the trachea and delineate it, and then brighten the trachea region to make it less distinctive. The segmentation process is finalized by subsequent morphological operations. We randomly selected 30 images from the Japanese Society of Radiological Technology image database to test the proposed methodology, and the results are shown. We hope our segmentation technique can help to promote CAD tools, especially for emerging chest radiographic imaging techniques such as dual-energy radiography and chest tomosynthesis.
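
    For readers who want to experiment with the underlying segmentation model, the sketch below runs the Chan-Vese active contour on a grayscale radiograph using the implementation in scikit-image. The paper's trachea suppression is omitted and its morphological post-processing is only crudely approximated; the file name is hypothetical.

      # Minimal Chan-Vese segmentation sketch using scikit-image; not the
      # paper's full pipeline. "chest_radiograph.png" is a hypothetical file.
      from skimage import io, img_as_float
      from skimage.segmentation import chan_vese
      from skimage.morphology import remove_small_objects

      image = img_as_float(io.imread("chest_radiograph.png", as_gray=True))

      # Level-set evolution of the Mumford-Shah-based Chan-Vese model.
      # mu penalizes contour length; larger values give smoother boundaries.
      segmentation = chan_vese(image, mu=0.25, lambda1=1.0, lambda2=1.0, tol=1e-3)

      # Lung fields are darker than the mediastinum: keep the low-intensity phase.
      if image[segmentation].mean() > image[~segmentation].mean():
          segmentation = ~segmentation
      lungs = remove_small_objects(segmentation, min_size=500)  # crude cleanup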

  1. Preclinical animal anxiety research - flaws and prejudices.

    PubMed

    Ennaceur, Abdelkader; Chazot, Paul L

    2016-04-01

    The current tests of anxiety in mice and rats used in preclinical research include the elevated plus-maze (EPM) or zero-maze (EZM), the light/dark box (LDB), and the open-field (OF). They are currently very popular, and despite their poor track record they continue to exert considerable constraints on the development of novel approaches. Hence, a novel anxiety test needs to be compared with these traditional tests and assessed against the various factors that have been identified as sources of their inconsistent and contradictory results. These constraints are very costly, and in most cases useless, as they originate from flawed methodologies. In the present report, we argue that the EPM or EZM, LDB, and OF do not provide unequivocal measures of anxiety, and that there is no evidence of a motivational conflict involved in these tests. They can be considered, at best, tests of natural preference for unlit and/or enclosed spaces. We also argue that pharmacological validation of a behavioral test is an inappropriate approach; it stems from confusing animal models of human behavior with animal models of pathophysiology. A behavioral test is developed to detect symptoms, not to produce them, and a drug is used to validate an identified physiological target. To overcome the major methodological flaws in animal anxiety studies, we propose an open space anxiety test, a 3D maze, which is described here with highlights of its various advantages over the traditional tests.

  2. Creativity and psychopathology: a systematic review.

    PubMed

    Thys, Erik; Sabbe, Bernard; De Hert, Marc

    2014-01-01

    The possible link between creativity and psychopathology has been a long-time focus of research up to the present day. However, the research results in this field are heterogeneous and contradictory. Links between creativity and specific psychiatric disorders have been confirmed and refuted in different studies. This disparity is partly explained by the methodological challenges peculiar to this field. In this systematic review of the literature from 1950, research articles in the field of creativity and psychopathology are presented, focusing on the methodology and results of the collected studies. This review confirms the methodological problems and the heterogeneity of the study designs and results. The assessment of psychopathology, but more so of creativity, remains a fundamental challenge. On the whole, study results cautiously confirm an association between creativity and both bipolar disorder and schizotypy. The research on creativity and psychopathology is hampered by serious methodological problems. Study results are to be interpreted with caution and future research needs more methodological rigor. © 2014 S. Karger AG, Basel.

  3. Lithographic process window optimization for mask aligner proximity lithography

    NASA Astrophysics Data System (ADS)

    Voelkel, Reinhard; Vogler, Uwe; Bramati, Arianna; Erdmann, Andreas; Ünal, Nezih; Hofmann, Ulrich; Hennemeyer, Marc; Zoberbier, Ralph; Nguyen, David; Brugger, Juergen

    2014-03-01

    We introduce a complete methodology for process window optimization in proximity mask aligner lithography. The commercially available lithography simulation software LAB from GenISys GmbH was used for simulation of light propagation and 3D resist development. The methodology was tested on the practical example of lines and spaces, 5 micron half-pitch, printed in a 1 micron thick layer of AZ® 1512HS positive photoresist on a silicon wafer. A SUSS MicroTec MA8 mask aligner, equipped with MO Exposure Optics®, was used in simulation and experiment. MO Exposure Optics® is the latest generation of illumination systems for mask aligners; it provides telecentric illumination and excellent light uniformity over the full mask field, and it allows the lithography engineer to freely shape the angular spectrum of the illumination light (customized illumination), which is a mandatory requirement for process window optimization. Three different illumination settings were tested for proximity gaps from 0 to 100 microns. The results obtained prove that the introduced process window methodology is a major step toward more robust processes in mask aligner lithography. The most remarkable outcome of the presented study is that a smaller exposure gap does not automatically lead to better print results in proximity lithography, contrary to what a lithographer's "good instinct" would expect. With more than 5,000 mask aligners installed in research and industry worldwide, the proposed process window methodology could have a significant impact on yield improvement and cost saving in industry.

  4. A methodology for TLD postal dosimetry audit of high-energy radiotherapy photon beams in non-reference conditions.

    PubMed

    Izewska, Joanna; Georg, Dietmar; Bera, Pranabes; Thwaites, David; Arib, Mehenna; Saravi, Margarita; Sergieva, Katia; Li, Kaibao; Yip, Fernando Garcia; Mahant, Ashok Kumar; Bulski, Wojciech

    2007-07-01

    A strategy for national TLD audit programmes has been developed by the International Atomic Energy Agency (IAEA). It involves progression through three sequential dosimetry audit steps. The first step audits the beam output in reference conditions for high-energy photon beams. The second step audits the dose in reference and non-reference conditions on the beam axis for photon and electron beams. The third step involves measurements of the dose in reference and non-reference conditions off-axis, for open and wedged symmetric and asymmetric fields, for photon beams. Through a co-ordinated research project, the IAEA developed the methodology to extend the scope of national TLD auditing activities to more complex audit measurements for regular fields. Based on the IAEA standard TLD holder for high-energy photon beams, a TLD holder was developed with a horizontal arm to enable measurements 5 cm off the central axis. Basic correction factors were determined for the holder in the energy range between Co-60 and 25 MV photon beams. New procedures were developed for the TLD irradiation in hospitals. The off-axis measurement methodology for photon beams was tested in a multi-national pilot study. The distribution of dosimetric parameters (off-axis ratios for open and wedged beam profiles, output factors, wedge transmission factors) checked in 146 measurements had a mean of 0.999 ± 0.012. The methodology of TLD audits in non-reference conditions with a modified IAEA TLD holder has been shown to be feasible.

  5. Bayesian methodology for the design and interpretation of clinical trials in critical care medicine: a primer for clinicians.

    PubMed

    Kalil, Andre C; Sun, Junfeng

    2014-10-01

    To review Bayesian methodology and its utility for clinical decision making and research in the critical care field. Clinical, epidemiological, and biostatistical studies on Bayesian methods were reviewed in PubMed and Embase from their inception to December 2013. Bayesian methods have been extensively used by a wide range of scientific fields, including astronomy, engineering, chemistry, genetics, physics, geology, paleontology, climatology, cryptography, linguistics, ecology, and the computational sciences. The application of medical knowledge in clinical research is analogous to the application of medical knowledge in clinical practice. Bedside physicians have to make most diagnostic and treatment decisions on critically ill patients every day without clear-cut evidence-based medicine (more subjective than objective evidence). Similarly, clinical researchers have to make most decisions about trial design with limited available data. Bayesian methodology allows both subjective and objective aspects of knowledge to be formally measured and transparently incorporated into the design, execution, and interpretation of clinical trials. In addition, various degrees of knowledge and several hypotheses can be tested at the same time in a single clinical trial without the risk of multiplicity. Notably, Bayesian methodology is naturally suited to the interpretation of clinical trial findings for the individualized care of critically ill patients and for the optimization of public health policies. We propose that the application of the versatile Bayesian methodology in conjunction with conventional statistical methods is not only ripe for actual use in critical care clinical research but also a necessary step to maximize the performance of clinical trials and their translation to the practice of critical care medicine.
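
    A toy example makes the Bayesian machinery concrete: with a conjugate Beta prior on a mortality rate, observed trial data update the prior analytically, and clinically meaningful probabilities can be read directly off the posterior. The prior and trial numbers below are invented for illustration.

      # Toy Beta-Binomial update, illustrating the Bayesian reasoning the
      # review discusses; all numbers are invented.
      from scipy import stats

      a_prior, b_prior = 2, 8      # Beta(2, 8): prior belief centered near 20% mortality
      deaths, patients = 15, 100   # hypothetical trial arm

      # Conjugate update: posterior is Beta(a + deaths, b + survivors).
      posterior = stats.beta(a_prior + deaths, b_prior + patients - deaths)

      print("posterior mean mortality:", posterior.mean())
      print("95% credible interval:", posterior.interval(0.95))
      # Probability that mortality is below a clinically meaningful 20% threshold:
      print("P(mortality < 0.20):", posterior.cdf(0.20))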

  6. Establishing a Research Agenda for Understanding the Role and Impact of Mental Health Peer Specialists.

    PubMed

    Chinman, Matthew; McInnes, D Keith; Eisen, Susan; Ellison, Marsha; Farkas, Marianne; Armstrong, Moe; Resnick, Sandra G

    2017-09-01

    Mental health peer specialists are individuals with serious mental illnesses who receive training to use their lived experiences to help others with serious mental illnesses in clinical settings. This Open Forum discusses the state of the research for mental health peer specialists and suggests a research agenda to advance the field. Studies have suggested that peer specialists vary widely in their roles, settings, and theoretical orientations. Theories of action have been proposed, but none have been tested. Outcome studies have shown benefits of peer specialists; however, many studies have methodological shortcomings. Qualitative descriptions of peer specialists are plentiful but lack grounding in implementation science frameworks. A research agenda advancing the field could include empirically testing theoretical mechanisms of peer specialists, developing a measure of peer specialist fidelity, conducting more rigorous outcomes studies, involving peer specialists in executing the research, and assessing various factors that influence implementing peer specialist services and testing strategies that could address those factors.

  7. Methodological approach in determination of small spatial units in a highly complex terrain in atmospheric pollution research: the case of Zasavje region in Slovenia.

    PubMed

    Kukec, Andreja; Boznar, Marija Z; Mlakar, Primoz; Grasic, Bostjan; Herakovic, Andrej; Zadnik, Vesna; Zaletel-Kragelj, Lijana; Farkas, Jerneja; Erzen, Ivan

    2014-05-01

    Research on atmospheric air pollution in complex terrain is challenged by the lack of appropriate methodology supporting the analysis of the spatial relationship between phenomena affected by a multitude of factors. The key is the optimal design of a meaningful approach based on small spatial units of observation. The Zasavje region, Slovenia, was chosen as the study area, with the main objective of investigating in practice the role of such units in a test environment. The process consisted of three steps: modelling of pollution in the atmosphere with dispersion models, transfer of the results to geographical information system software, and final determination of the function of small spatial units. A methodology capable of designing useful units for atmospheric air pollution research in highly complex terrain was created, and the results were deemed useful in offering starting points for further research in the field of geospatial health.

  8. Numerical Methodology for Coupled Time-Accurate Simulations of Primary and Secondary Flowpaths in Gas Turbines

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.

    2006-01-01

    Detailed information on the flow fields in the secondary flowpaths and their interaction with the primary flows in gas turbine engines is necessary for successful designs with optimized secondary flow streams. The present work focuses on the development of a simulation methodology for coupled, time-accurate solutions of the two flowpaths. The secondary flowstream is treated using SCISEAL, an unstructured adaptive Cartesian grid code developed for secondary flows and seals, while the main-path flow is solved using TURBO, a density-based code with the capability of resolving rotor-stator interaction in multi-stage machines. An interface is being tested that links the two codes at the rim seal to allow data exchange for parallel, coupled execution. A description of the coupling methodology and the current status of the interface development are presented. Representative steady-state solutions of the secondary flow in the UTRC HP Rig disc cavity are also presented.
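
    The staggered exchange at the rim seal can be sketched abstractly as two solvers that trade interface data once per time step. The mock functions below stand in for SCISEAL and TURBO; nothing here reflects the codes' actual APIs, only the shape of a loosely coupled, time-accurate loop.

      # Schematic staggered coupling loop with mock solvers; illustrative only.
      def advance_secondary(cavity_p, rim_p):
          """Mock secondary-flow step: relax cavity pressure toward the rim value."""
          return 0.9 * cavity_p + 0.1 * rim_p

      def advance_primary(rim_p, cavity_p):
          """Mock main-path step: rim pressure responds weakly to the cavity."""
          return 0.95 * rim_p + 0.05 * cavity_p

      cavity_p, rim_p = 1.00, 1.05  # nondimensional initial interface pressures
      for step in range(200):       # exchange interface data once per time step
          cavity_p = advance_secondary(cavity_p, rim_p)
          rim_p = advance_primary(rim_p, cavity_p)

      print(f"coupled interface pressures: cavity={cavity_p:.4f}, rim={rim_p:.4f}")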

  9. NGS tools for traceability in candies as high processed food products: Ion Torrent PGM versus conventional PCR-cloning.

    PubMed

    Muñoz-Colmenero, Marta; Martínez, Jose Luis; Roca, Agustín; Garcia-Vazquez, Eva

    2017-01-01

    Next Generation Sequencing (NGS) methodologies are considered the next step within DNA-based methods, and their applicability in different fields is being evaluated. Here, we tested the usefulness of the Ion Torrent Personal Genome Machine (PGM) in food traceability, analyzing candies as a model of highly processed foods, and compared the results with those obtained by PCR-cloning-sequencing (PCR-CS). The majority of samples exhibited consistency between methodologies, with the PGM platform yielding more information and more species per product than PCR-CS. A significantly higher AT content in sequences of the same species was also obtained from the PGM. This, together with some taxonomic discrepancies between methodologies, suggests that the PGM platform is still premature for use in food traceability of complex, highly processed products. It could be a good option for analysis of less complex foods, saving time and cost per sample. Copyright © 2016 Elsevier Ltd. All rights reserved.
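
    The AT-content comparison reported above reduces to a simple per-read computation, sketched below with invented sequences standing in for real platform output.

      # Per-read AT-content comparison; the read sequences are invented.
      def at_content(seq: str) -> float:
          """Fraction of A/T bases in a DNA sequence (case-insensitive)."""
          seq = seq.upper()
          return (seq.count("A") + seq.count("T")) / len(seq)

      pgm_reads = ["ATTATGCGTATAAT", "TTAACGATTTAGCA"]   # hypothetical PGM reads
      cs_reads = ["ATGCGCGTACGCAT", "GTACGCGATCGGCA"]    # hypothetical PCR-CS reads

      pgm_at = sum(map(at_content, pgm_reads)) / len(pgm_reads)
      cs_at = sum(map(at_content, cs_reads)) / len(cs_reads)
      print(f"mean AT content: PGM={pgm_at:.2f}, PCR-CS={cs_at:.2f}")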

  10. A discussion of current issues and concepts in the practice of skull-photo/craniofacial superimposition.

    PubMed

    Gordon, G M; Steyn, M

    2016-05-01

    A recent review paper on cranio-facial superimposition (CFS) stated that "there have been specific conceptual variances" from the original methods used in the practice of skull-photo superimposition, leading to poor results as far as accuracy is concerned. It was argued that the deviations in the practice of the technique have resulted in the reduced accuracies (for both failure to include and failure to exclude) noted in several recent studies. This paper presents results from recent research to highlight the advancement of skull-photo/cranio-facial superimposition and to discuss some of the issues raised regarding deviations from the original techniques. The evolving methodology of CFS is clarified in the context of advances in technology, forensic science and, specifically, the field of forensic anthropology. Developments in skull-photo/cranio-facial superimposition techniques have largely focused on testing reliability and accuracy objectively. Techniques now being employed by forensic anthropologists must conform to rigorous scientific testing and methodologies. Skull-photo/cranio-facial superimposition is constantly undergoing accuracy and repeatability testing, which is in line with the principles of the scientific method and additionally allows for advancement in the field. Much of the research has indicated that CFS is useful for exclusion, which is consistent with the concept of Popperian falsifiability: a hypothesis and experimental design that can be falsified. As a hypothesis is disproved or falsified, another evolves to replace it and explain the new observations. Current and future studies employing different methods to test the accuracy and reliability of skull-photo/cranio-facial superimposition will enable researchers to establish the contribution the technique can make to identification. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Open-Source Photometric System for Enzymatic Nitrate Quantification

    PubMed Central

    Wittbrodt, B. T.; Squires, D. A.; Walbeck, J.; Campbell, E.; Campbell, W. H.; Pearce, J. M.

    2015-01-01

    Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer would greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, this paper presents a methodology for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of equipment to perform this method of measurement for water quality, and iii) make the method easier to carry out in the field. The device performs as well as commercial proprietary systems for less than 15% of the cost of materials. This allows for greater access to the technology and the new, safer nitrate testing technique. PMID:26244342

  12. Open-Source Photometric System for Enzymatic Nitrate Quantification.

    PubMed

    Wittbrodt, B T; Squires, D A; Walbeck, J; Campbell, E; Campbell, W H; Pearce, J M

    2015-01-01

    Nitrate, the most oxidized form of nitrogen, is regulated to protect people and animals from harmful levels, as there is a large overabundance due to anthropogenic factors. Widespread field testing for nitrate could begin to address the nitrate pollution problem; however, the Cadmium Reduction Method, the leading certified method to detect and quantify nitrate, demands the use of a toxic heavy metal. An alternative, the recently proposed Environmental Protection Agency Nitrate Reductase Nitrate-Nitrogen Analysis Method, eliminates this problem but requires an expensive proprietary spectrophotometer. The development of an inexpensive, portable, handheld photometer would greatly expedite field nitrate analysis to combat pollution. To accomplish this goal, this paper presents a methodology for the design, development, and technical validation of an improved open-source water testing platform capable of performing the Nitrate Reductase Nitrate-Nitrogen Analysis Method. This approach is evaluated for its potential to i) eliminate the need for toxic chemicals in water testing for nitrate and nitrite, ii) reduce the cost of equipment to perform this method of measurement for water quality, and iii) make the method easier to carry out in the field. The device performs as well as commercial proprietary systems for less than 15% of the cost of materials. This allows for greater access to the technology and the new, safer nitrate testing technique.
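
    The photometric reduction such a device performs can be sketched in a few lines: Beer-Lambert absorbance from raw sample and blank intensities, then concentration from a linear calibration against nitrate-N standards. All readings below are invented; the actual open-source firmware and calibration protocol differ in detail.

      # Sketch of the photometric data reduction; all readings are invented.
      import numpy as np

      def absorbance(i_sample, i_blank):
          """Beer-Lambert absorbance A = -log10(I / I0)."""
          return -np.log10(i_sample / i_blank)

      # Hypothetical calibration standards (mg/L nitrate-N) and their absorbances.
      std_conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
      std_abs = np.array([0.002, 0.051, 0.098, 0.251, 0.498])
      slope, intercept = np.polyfit(std_conc, std_abs, 1)  # least-squares line

      # Unknown sample: detector counts for sample and reagent blank.
      a_unknown = absorbance(i_sample=6120.0, i_blank=9800.0)
      print("nitrate-N (mg/L):", (a_unknown - intercept) / slope)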

  13. Screening tests for hazard classification of complex waste materials - Selection of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weltens, R., E-mail: reinhilde.weltens@vito.be; Vanermen, G.; Tirez, K.

    In this study we describe the development of an alternative methodology for hazard characterization of waste materials. Such an alternative methodology for hazard assessment of complex waste materials is urgently needed, because the lack of a validated instrument leads to arbitrary hazard classification of such complex waste materials. False classification can lead to human and environmental health risks and also has important financial consequences for the waste owner. The Hazardous Waste Directive (HWD) describes the methodology for hazard classification of waste materials. For mirror entries, the HWD classification is based upon the hazardous properties (H1-15) of the waste, which can be assessed from the hazardous properties of individual identified waste compounds or - if not all compounds are identified - from test results of hazard assessment tests performed on the waste material itself. For the latter, the HWD recommends toxicity tests that were initially designed for risk assessment of chemicals in consumer products (pharmaceuticals, cosmetics, biocides, food, etc.). These tests (often using mammals) are neither designed nor suitable for the hazard characterization of waste materials. With the present study we want to contribute to the development of an alternative and transparent test strategy for hazard assessment of complex wastes that is in line with the HWD principles for waste classification. It is necessary to cope with this important shortcoming in hazardous waste classification and to demonstrate that alternative methods are available for hazard assessment of waste materials. Next, by describing the pros and cons of the available methods, and by identifying the needs for additional or further development of test methods, we hope to stimulate research efforts and development in this direction. In this paper we describe promising techniques and explain the rationale for the test selection for the pilot study that we have performed on different types of waste materials. Test results are presented in a second paper. As the application of many of the proposed test methods is new in the field of waste management, the principles of the tests are described. The selected tests tackle important hazardous properties, but refinement of the test battery is needed to fulfil the a priori conditions.

  14. Synthetic training sets for the development of discriminant functions for the detection of volatile organic compounds from passive infrared remote sensing data.

    PubMed

    Wan, Boyong; Small, Gary W

    2011-01-21

    A novel synthetic data generation methodology is described for use in the development of pattern recognition classifiers that are employed for the automated detection of volatile organic compounds (VOCs) during infrared remote sensing measurements. The approach used is passive Fourier transform infrared spectrometry implemented in a downward-looking mode on an aircraft platform. A key issue in developing this methodology in practice is the need for example data that can be used to train the classifiers. To replace the time-consuming and costly collection of training data in the field, this work implements a strategy for taking laboratory analyte spectra and superimposing them on background spectra collected from the air. The resulting synthetic spectra can be used to train the classifiers. This methodology is tested by developing classifiers for ethanol and methanol, two prevalent VOCs in wide industrial use. The classifiers are successfully tested with data collected from the aircraft during controlled releases of ethanol and during a methanol release from an industrial facility. For both ethanol and methanol, missed detections in the aircraft data are in the range of 4 to 5%, with false positive detections ranging from 0.1 to 0.3%.
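
    The synthetic training-set strategy lends itself to a compact sketch: laboratory analyte spectra are scaled and added to measured airborne background spectra to produce labeled examples for classifier training. The arrays below are mocked, with a Gaussian band standing in for an ethanol absorbance feature; the real work uses measured spectra and radiometric scaling.

      # Mock illustration of synthetic training-set generation; all arrays are
      # invented stand-ins for measured background and laboratory spectra.
      import numpy as np

      rng = np.random.default_rng(0)
      n_points = 500                                              # spectral channels
      backgrounds = rng.normal(1.0, 0.02, size=(200, n_points))   # airborne backgrounds

      # Mock laboratory ethanol absorbance spectrum: a single broad band.
      channels = np.arange(n_points)
      ethanol = np.exp(-0.5 * ((channels - 250) / 15.0) ** 2)

      # Superimpose the analyte at random strengths on half of the backgrounds.
      labels = rng.random(200) < 0.5
      strengths = rng.uniform(0.01, 0.05, size=200)
      synthetic = backgrounds + labels[:, None] * strengths[:, None] * ethanol

      # `synthetic` and `labels` can now train any standard classifier.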

  15. A design methodology of magnetorheological fluid damper using Herschel-Bulkley model

    NASA Astrophysics Data System (ADS)

    Liao, Linqing; Liao, Changrong; Cao, Jianguo; Fu, L. J.

    2003-09-01

    Magnetorheological fluid (MR fluid) is a highly concentrated suspension of very small magnetic particles in inorganic oil. The essential behavior of MR fluid is its ability to reversibly change from a free-flowing, linear viscous liquid to a semi-solid with controllable yield strength in milliseconds when exposed to a magnetic field. This feature provides simple, quiet, rapid-response interfaces between electronic controls and mechanical systems. In this paper, a mini-bus MR fluid damper based on the plate Poiseuille flow mode is analyzed using the Herschel-Bulkley model, which can account for post-yield shear thinning or thickening under the quasi-steady flow condition. For various values of the flow behavior index, the influences of post-yield shear thinning or thickening on the flow velocity profiles of MR fluid in the annular damping orifice are examined numerically. Analytical damping coefficient predictions are also compared between the nonlinear Bingham plastic model and the Herschel-Bulkley constitutive model. An MR fluid damper, designed and fabricated according to the design method presented in this paper, was tested with an electro-hydraulic servo vibrator and its control system at the National Center for Test and Supervision of Coach Quality. The experimental results reveal that the analysis methodology and design theory are reasonable and that MR fluid dampers can be designed according to the proposed design methodology.
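
    The Herschel-Bulkley constitutive law at the core of this design methodology is simple to state and evaluate: post-yield shear stress is tau = tau_y + K*(shear rate)^n, where n < 1 gives shear thinning, n > 1 shear thickening, and n = 1 recovers the Bingham plastic model. The parameter values below are illustrative only, not from the paper.

      # Herschel-Bulkley law with illustrative MR-fluid parameters.
      import numpy as np

      def herschel_bulkley(gamma_dot, tau_y, K, n):
          """Post-yield shear stress [Pa] for shear rate gamma_dot [1/s], yield
          stress tau_y [Pa], consistency K [Pa s^n], and flow index n [-]."""
          return tau_y + K * np.power(gamma_dot, n)

      gamma = np.logspace(0, 4, 5)  # shear rates from 1 to 10^4 1/s
      for n in (0.8, 1.0, 1.2):     # shear-thinning, Bingham, shear-thickening
          tau = herschel_bulkley(gamma, tau_y=30e3, K=1.0, n=n)
          print(f"n={n}: tau(10^4 1/s) = {tau[-1] / 1e3:.1f} kPa")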

  16. SEURAT: Safety Evaluation Ultimately Replacing Animal Testing--recommendations for future research in the field of predictive toxicology.

    PubMed

    Daston, George; Knight, Derek J; Schwarz, Michael; Gocht, Tilman; Thomas, Russell S; Mahony, Catherine; Whelan, Maurice

    2015-01-01

    The development of non-animal methodology to evaluate the potential for a chemical to cause systemic toxicity is one of the grand challenges of modern science. The European research programme SEURAT is active in this field and will conclude its first phase, SEURAT-1, in December 2015. Drawing on the experience gained in SEURAT-1 and appreciating international advancement in both basic and regulatory science, we reflect here on how SEURAT should evolve and propose that further research and development should be directed along two complementary and interconnecting work streams. The first work stream would focus on developing new 'paradigm' approaches for regulatory science. The goal here is the identification of 'critical biological targets' relevant for toxicity and to test their suitability to be used as anchors for predicting toxicity. The second work stream would focus on integration and application of new approach methods for hazard (and risk) assessment within the current regulatory 'paradigm', aiming for acceptance of animal-free testing strategies by regulatory authorities (i.e. translating scientific achievements into regulation). Components for both work streams are discussed and may provide a structure for a future research programme in the field of predictive toxicology.

  17. The complementarity of the technical tools of tissue engineering and the concepts of artificial organs for the design of functional bioartificial tissues.

    PubMed

    Lenas, Petros; Moreno, Angel; Ikonomou, Laertis; Mayer, Joerg; Honda, Hiroyuki; Novellino, Antonio; Pizarro, Camilo; Nicodemou-Lena, Eleni; Rodergas, Silvia; Pintor, Jesus

    2008-09-01

    Although tissue engineering uses powerful biological tools, it still has a weak conceptual foundation, which is restricted to the cell level. The design criteria at the cell level are not directly related to tissue functions, and consequently such functions cannot be implemented in bioartificial tissues with the currently used methods. By contrast, the field of artificial organs focuses on the function of the artificial organ, which is treated in the design as an integral entity, rather than on the optimization of its components. The field of artificial organs has already developed and tested methodologies that are based on system concepts and mathematical-computational methods connecting component properties with the desired global organ function. Such methodologies are needed in tissue engineering for the design of bioartificial tissues with tissue functions. Under the framework of biomedical engineering, artificial organs and tissue engineering do not represent competing approaches but are rather complementary, and should therefore design a common future for the benefit of patients.

  18. Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies

    PubMed Central

    El-Baz, Ayman; Beache, Garth M.; Gimel'farb, Georgy; Suzuki, Kenji; Okada, Kazunori; Elnakib, Ahmed; Soliman, Ahmed; Abdollahi, Behnoush

    2013-01-01

    This paper overviews one of the most important, interesting, and challenging problems in oncology, the problem of lung cancer diagnosis. Developing an effective computer-aided diagnosis (CAD) system for lung cancer is of great clinical importance and can increase the patient's chance of survival. For this reason, CAD systems for lung cancer have been investigated in a huge number of research studies. A typical CAD system for lung cancer diagnosis is composed of four main processing steps: segmentation of the lung fields, detection of nodules inside the lung fields, segmentation of the detected nodules, and diagnosis of the nodules as benign or malignant. This paper overviews the current state-of-the-art techniques that have been developed to implement each of these CAD processing steps. For each technique, various aspects of technical issues, implemented methodologies, training and testing databases, and validation methods, as well as achieved performances, are described. In addition, the paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems. PMID:23431282

  19. Novel methodology to characterize electromagnetic exposure of the brain

    NASA Astrophysics Data System (ADS)

    Crespo-Valero, Pedro; Christopoulou, Maria; Zefferer, Marcel; Christ, Andreas; Achermann, Peter; Nikita, Konstantina S.; Kuster, Niels

    2011-01-01

    Due to the greatly non-uniform field distribution induced in brain tissues by radio frequency electromagnetic sources, the exposure of anatomical and functional regions of the brain may be a key issue in interpreting laboratory findings and epidemiological studies concerning endpoints related to the central nervous system. This paper introduces the Talairach atlas into the characterization of the electromagnetic exposure of the brain. A hierarchical labeling scheme is mapped onto high-resolution human models. This procedure is fully automatic and allows identification of over a thousand different sites all over the brain. The electromagnetic absorption can then be extracted and interpreted in every region or combination of regions in the brain, depending on the characterization goals. The application examples show how this methodology enhances the dosimetry assessment of the brain based on results obtained either from finite difference time domain simulations or from measurements delivered by test compliance dosimetry systems. Applications include, among others, the detailed dosimetric analysis of the exposure of the brain during cell phone use, improved design of exposure setups for human studies, and medical diagnostic and therapeutic devices using electromagnetic fields or ultrasound.
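
    Once a labeled atlas volume is co-registered to the dosimetric grid, the region-wise exposure extraction reduces to masked aggregation, as in the sketch below. The SAR field, label volume, and region names are all invented stand-ins for real FDTD output and Talairach labels.

      # Region-wise SAR aggregation over a co-registered label volume; the
      # arrays and region IDs below are mocked for illustration.
      import numpy as np

      rng = np.random.default_rng(1)
      sar = rng.lognormal(mean=-3.0, sigma=1.0, size=(64, 64, 64))  # W/kg, mock field
      labels = rng.integers(0, 4, size=(64, 64, 64))                # 0 = outside brain

      region_names = {1: "superior temporal gyrus", 2: "thalamus", 3: "cerebellum"}
      for region_id, name in region_names.items():
          mask = labels == region_id
          print(f"{name}: mean SAR = {sar[mask].mean():.4f} W/kg "
                f"(peak {sar[mask].max():.4f} W/kg over {mask.sum()} voxels)")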

  20. Molecular modeling: An open invitation for applied mathematics

    NASA Astrophysics Data System (ADS)

    Mezey, Paul G.

    2013-10-01

    Molecular modeling methods provide a very wide range of challenges for innovative mathematical and computational techniques, where high dimensionality, large sets of data, and complicated interrelations often imply a multitude of iterative approximations. The physical and chemical basis of these methodologies involves quantum mechanics, with several non-intuitive aspects where classical interpretation and classical analogies are often misleading or outright wrong. Hence, instead of the everyday, common-sense approaches that work so well in engineering, in molecular modeling one often needs to rely on rather abstract mathematical constraints and conditions, again emphasizing the high level of reliance on applied mathematics. Yet the interdisciplinary aspects of the field of molecular modeling also generate some inertia and a perhaps too conservative reliance on tried and tested methodologies, caused at least in part by less than up-to-date involvement in the newest developments in applied mathematics. It is expected that as more applied mathematicians take up the challenge of employing the latest advances of their field in molecular modeling, important breakthroughs may follow. In this presentation some of the current challenges of molecular modeling are discussed.

  1. Potential Energy Cost Savings from Increased Commercial Energy Code Compliance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberg, Michael I.; Hart, Philip R.; Athalye, Rahul A.

    2016-08-22

    An important question for commercial energy code compliance is: "How much energy cost savings can better compliance achieve?" This question is in sharp contrast to prior efforts that used a checklist of code requirements, each of which was graded pass or fail; percent compliance for any given building was simply the percent of individual requirements that passed. A field investigation method is being developed that goes beyond the binary approach to determine how much energy cost savings is not realized. Prototype building simulations were used to estimate the energy cost impact of varying levels of non-compliance for newly constructed office buildings in climate zone 4C. Field data collected from actual buildings on specific conditions relative to code requirements were then applied to the simulation results to find the potential lost energy savings for a single building or for a sample of buildings. This new methodology was tested on nine office buildings in climate zone 4C, and the amount of additional energy cost savings they could have achieved had they complied fully with the 2012 International Energy Conservation Code was determined. This paper presents the results of the test and lessons learned, describes follow-on research that is needed to verify that the methodology is both accurate and practical, and discusses the benefits that might accrue if the method were widely adopted.
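
    The core arithmetic of the lost-savings estimate can be sketched simply: each code requirement's simulated energy cost penalty at full non-compliance is weighted by the degree of non-compliance observed in the field, and the weighted penalties are summed. The requirements and numbers below are invented for illustration, not results from the study.

      # Sketch of the lost-savings arithmetic; requirements and numbers invented.
      # Each entry: (cost penalty at full non-compliance [$/yr],
      #              observed non-compliance fraction from field data)
      observed = {
          "lighting power density": (1800.0, 0.30),
          "envelope insulation":    (1200.0, 0.10),
          "economizer controls":    (2500.0, 0.00),
      }

      lost = {req: penalty * frac for req, (penalty, frac) in observed.items()}
      for req, dollars in lost.items():
          print(f"{req}: ${dollars:,.0f}/yr in unrealized savings")
      print(f"total potential savings from full compliance: ${sum(lost.values()):,.0f}/yr")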

  2. Signal-adapted tomography as a tool for dust devil detection

    NASA Astrophysics Data System (ADS)

    Aguirre, C.; Franzese, G.; Esposito, F.; Vázquez, Luis; Caro-Carretero, Raquel; Vilela-Mendes, Rui; Ramírez-Nicolás, María; Cozzolino, F.; Popa, C. I.

    2017-12-01

    Dust devils are important phenomena to take into account in understanding the global dust circulation of a planet. On Earth, their contribution to the injection of dust into the atmosphere seems to be secondary. Elsewhere, there are many indications that the dust devil's role on other planets, in particular on Mars, could be fundamental, impacting the global climate. The ability to identify and study these vortices from acquired meteorological measurements is therefore of great importance for planetary science. Here we present a new methodology to identify dust devils from pressure time series, testing the method on data acquired during a 2013 field campaign performed in the Tafilalt region (Morocco) of the North-Western Sahara Desert. Although pressure records are usually analyzed in the time domain, we prefer here to follow a different approach and perform the analysis in a time signal-adapted domain, the relation between the two being a bilinear transformation, i.e., a tomogram. The tomographic technique has already been successfully applied in other research fields, such as plasma reflectometry and the analysis of neuronal signatures. Here we show its effectiveness in dust devil detection as well. To test our results, we compare the tomography with a phase-picker time-domain analysis, and we show the level of agreement between the two methodologies and the advantages and disadvantages of the tomographic approach.
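
    For orientation, a bare-bones time-domain detector of the kind used as a baseline can be sketched as follows: subtract a running-median background from the pressure record and flag excursions below it. This illustrates the signature being sought, a transient pressure drop, not the tomographic method itself; the pressure record is synthetic.

      # Simple time-domain pressure-drop detector on a synthetic record; this
      # is a baseline illustration, not the tomographic method of the paper.
      import numpy as np

      rng = np.random.default_rng(2)
      p = 1000.0 + rng.normal(0, 0.02, 7200)     # mock 2-hour, 1 Hz record, hPa
      p[3600:3630] -= 0.8 * np.hanning(30)       # inject a ~0.8 hPa vortex-like dip

      window = 120                               # 2-minute running median background
      background = np.array([np.median(p[max(0, i - window):i + 1])
                             for i in range(len(p))])
      residual = background - p                  # positive where pressure drops

      threshold = 5 * np.std(residual[:600])     # threshold from a quiet segment
      events = np.flatnonzero(residual > threshold)
      print("candidate dust devil samples:", events.min(), "to", events.max())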

  3. Rapid Mapping for Built Heritage at Risk Using Low-Cost and Cots Sensors. a Test in the Duomo Vecchio of San Severino Marche

    NASA Astrophysics Data System (ADS)

    Calantropio, A.; Colucci, E.; Teppati Losè, L.

    2017-11-01

    In recent years, researchers in the field of Geomatics have focused their attention on the experimentation and validation of new methodologies and techniques, stressing especially the potential of low-cost and COTS (Commercial Off The Shelf) solutions and sensors. In particular, these tools have been used for purposes of rapid mapping in different contexts (ranging from the construction industry to environmental monitoring, mining activities, etc.). The Built Heritage, due to its intrinsic nature as an endangered artefact, can benefit greatly from the technological and methodological innovations in this research field. The contribution presented in this paper highlights these main topics: the rapid mapping of the Built Heritage (in particular heritage subjected to different types of risk) using low-cost and COTS solutions. Different sensors and techniques were chosen to be evaluated on a specific test site: the Duomo Vecchio of San Severino Marche (MC - Italy), which was partially affected by the earthquake swarm that hit the area of Central Italy starting from the 24th of August 2016. One of the main aims of this work is to demonstrate how low-cost and COTS sensors can contribute to the documentation of the Built Heritage for its safeguarding, for damage assessment in case of disastrous events, and for operations of restoration and preservation.

  4. Assessment of Near-Field Sonic Boom Simulation Tools

    NASA Technical Reports Server (NTRS)

    Casper, J. H.; Cliff, S. E.; Thomas, S. D.; Park, M. A.; McMullen, M. S.; Melton, J. E.; Durston, D. A.

    2008-01-01

    A recent study for the Supersonics Project, within the National Aeronautics and Space Administration, has been conducted to assess current in-house capabilities for the prediction of near-field sonic boom. Such capabilities are required to simulate the highly nonlinear flow near an aircraft, wherein a sonic-boom signature is generated. There are many available computational fluid dynamics codes that could be used to provide the near-field flow for a sonic boom calculation. However, such codes have typically been developed for applications involving aerodynamic configurations, for which an efficiently generated computational mesh is usually not optimal for sonic boom prediction. Preliminary guidelines are suggested to characterize a state-of-the-art sonic boom prediction methodology. The available simulation tools that are best suited to incorporate into that methodology are identified; preliminary test cases are presented in support of the selection. During this phase of process definition and tool selection, parallel research was conducted in an attempt to establish criteria that link the properties of a computational mesh to the accuracy of a sonic boom prediction. Such properties include sufficient grid density near shocks and within the zone of influence, which are achieved by adaptation and mesh refinement strategies. Prediction accuracy is validated by comparison with wind tunnel data.

  5. Spatial and Temporal Extrapolation of Disdrometer Size Distributions Based on a Lagrangian Trajectory Model of Falling Rain

    NASA Technical Reports Server (NTRS)

    Lane, John E.; Kasparis, Takis; Jones, W. Linwood; Metzger, Philip T.

    2009-01-01

    Methodologies to improve disdrometer processing, loosely based on mathematical techniques common to the field of particle flow and fluid mechanics, are examined and tested. The inclusion of advection and vertical wind field estimates appears to produce significantly improved results in a Lagrangian hydrometeor trajectory model, in spite of very strict assumptions of noninteracting hydrometeors, constant vertical air velocity, and time-independent advection during the scan time interval. Wind field data can be extracted from each radar elevation scan by plotting and analyzing reflectivity contours over the disdrometer site and by collecting the radar radial velocity data to obtain estimates of advection. Specific regions of disdrometer spectra (drop size versus time) often exhibit strong gravitational sorting signatures, from which estimates of vertical velocity can be extracted. These independent wind field estimates become inputs and initial conditions to the Lagrangian trajectory simulation of falling hydrometeors.
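
    Under the stated assumptions (non-interacting drops, constant vertical air velocity, time-independent advection), the trajectory model reduces to integrating one simple ODE per drop-size bin. A minimal sketch follows; the terminal-velocity power law and all numbers are illustrative assumptions, not values from the paper.

```python
import numpy as np

def terminal_velocity(d_mm):
    """Illustrative raindrop terminal velocity (m/s) vs. diameter (mm);
    a commonly used power-law form, assumed here for the sketch."""
    return 3.78 * d_mm ** 0.67

def fall_time_and_drift(d_mm, z0=2000.0, w=0.5, u_adv=5.0, dt=1.0):
    """Integrate a single drop from altitude z0 (m) to the ground.
    w: constant vertical air velocity (m/s, upward positive);
    u_adv: time-independent horizontal advection (m/s)."""
    z, x, t = z0, 0.0, 0.0
    vt = terminal_velocity(d_mm)
    while z > 0.0:
        z -= (vt - w) * dt   # net fall speed
        x += u_adv * dt      # horizontal drift
        t += dt
    return t, x

# Larger drops arrive first and drift less: a gravitational sorting signature.
for d in (0.5, 1.0, 3.0):
    t, x = fall_time_and_drift(d)
    print(f"D={d} mm: fall time {t:.0f} s, drift {x/1000:.1f} km")
```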

  6. Likelihood Ratios for Glaucoma Diagnosis Using Spectral Domain Optical Coherence Tomography

    PubMed Central

    Lisboa, Renato; Mansouri, Kaweh; Zangwill, Linda M.; Weinreb, Robert N.; Medeiros, Felipe A.

    2014-01-01

    Purpose To present a methodology for calculating likelihood ratios for glaucoma diagnosis for continuous retinal nerve fiber layer (RNFL) thickness measurements from spectral domain optical coherence tomography (spectral-domain OCT). Design Observational cohort study. Methods 262 eyes of 187 patients with glaucoma and 190 eyes of 100 control subjects were included in the study. Subjects were recruited from the Diagnostic Innovations Glaucoma Study. Eyes with preperimetric and perimetric glaucomatous damage were included in the glaucoma group. The control group was composed of healthy eyes with normal visual fields from subjects recruited from the general population. All eyes underwent RNFL imaging with Spectralis spectral-domain OCT. Likelihood ratios for glaucoma diagnosis were estimated for specific global RNFL thickness measurements using a methodology based on estimating the tangents to the Receiver Operating Characteristic (ROC) curve. Results Likelihood ratios could be determined for continuous values of average RNFL thickness. Average RNFL thickness values lower than 86 μm were associated with positive LRs, i.e., LRs greater than 1; whereas RNFL thickness values higher than 86 μm were associated with negative LRs, i.e., LRs smaller than 1. A modified Fagan nomogram was provided to assist calculation of post-test probability of disease from the calculated likelihood ratios and pre-test probability of disease. Conclusion The methodology allowed calculation of likelihood ratios for specific RNFL thickness values. By avoiding arbitrary categorization of test results, it potentially allows for an improved integration of test results into diagnostic clinical decision-making. PMID:23972303
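
    The core relations behind this methodology are standard: the likelihood ratio at a given RNFL thickness equals the slope of the tangent to the ROC curve at the corresponding operating point, and it updates pretest odds multiplicatively. In generic notation (not the paper's exact symbols):

\[
\mathrm{LR}(t)=\frac{f_{\text{glaucoma}}(t)}{f_{\text{healthy}}(t)}
=\left.\frac{d\,\mathrm{Se}}{d(1-\mathrm{Sp})}\right|_{t},
\qquad
\frac{p_{\text{post}}}{1-p_{\text{post}}}=\mathrm{LR}(t)\cdot\frac{p_{\text{pre}}}{1-p_{\text{pre}}}.
\]

    For instance, a pretest probability of 20% (odds 0.25) combined with LR = 4 gives post-test odds of 1, i.e. a 50% post-test probability.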

  7. Failure mechanisms and lifetime prediction methodology for polybutylene pipe in water distribution system

    NASA Astrophysics Data System (ADS)

    Niu, Xiqun

    Polybutylene (PB) is a semicrystalline thermoplastic. It has been widely used in potable water distribution piping systems. However, field experience shows that failure occurs much earlier than the expected service lifetime; identifying the causes and appropriately evaluating the lifetime motivate this study. This thesis comprises three parts. The first is the understanding of PB, which includes thermal and mechanical characterization of the material, aging phenomena, and notch sensitivity. The second part analyzes the applicability of the existing lifetime testing method for PB. It is shown that PB is an anomaly in terms of the temperature-lifetime relation because of the fracture mechanism transition across the testing temperature range. The third part is the development of a methodology of lifetime prediction for PB pipe. The fracture process of PB pipe consists of three stages, i.e., crack initiation, slow crack growth (SCG) and crack instability. The practical lifetime of PB pipe is primarily determined by the duration of the first two stages. The mechanism of crack initiation and the quantitative estimation of the time to crack initiation are studied by employing an environmental stress cracking technique. A fatigue slow crack growth testing method has been developed and applied in the study of SCG. By using the Paris-Erdogan equation, a model is constructed to evaluate the time for SCG. As a result, the total lifetime is determined. Through this work, the failure mechanisms of PB pipe have been analyzed and a lifetime prediction methodology has been developed.
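
    The Paris-Erdogan relation used for the SCG stage, da/dN = C(ΔK)^m, can be integrated from the initial to the critical crack size to estimate the number of cycles spent in slow crack growth. A minimal numerical sketch with placeholder constants (C, m, the geometry factor Y, and the crack sizes are illustrative, not the thesis values):

```python
import numpy as np

# Paris-Erdogan law: da/dN = C * (dK)^m, with dK = Y * d_sigma * sqrt(pi * a).
# All constants below are illustrative placeholders, not the thesis values.
C, m, Y = 1.0e-3, 4.0, 1.12     # C in m/cycle per (MPa*sqrt(m))^m
d_sigma = 4.0                   # stress range, MPa
a0, ac = 0.1e-3, 2.0e-3         # initial and critical crack sizes, m

a = np.linspace(a0, ac, 10_000)
dK = Y * d_sigma * np.sqrt(np.pi * a)          # MPa*sqrt(m)
dN_da = 1.0 / (C * dK ** m)                    # cycles per metre of growth
N_scg = np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a))  # trapezoid rule
print(f"Cycles spent in slow crack growth: {N_scg:.3e}")
```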

  8. A Quasi-3-D Theory for Impedance Eduction in Uniform Grazing Flow

    NASA Technical Reports Server (NTRS)

    Watson, W. R.; Jones, M. G.; Parrott, T. L.

    2005-01-01

    A 2-D impedance eduction methodology is extended to quasi-3-D sound fields in uniform or shearing mean flow. We introduce a nonlocal, nonreflecting boundary condition to terminate the duct and then educe the impedance by minimizing an objective function. The introduction of a parallel, sparse, equation solver significantly reduces the wall clock time for educing the impedance when compared to that of the sequential band solver used in the 2-D methodology. The accuracy, efficiency, and robustness of the methodology is demonstrated using two examples. In the first example, we show that the method reproduces the known impedance of a ceramic tubular test liner. In the second example, we illustrate that the approach educes the impedance of a four-segment liner where the first, second, and fourth segments consist of a perforated face sheet bonded to honeycomb, and the third segment is a cut from the ceramic tubular test liner. The ability of the method to educe the impedances of multisegmented liners has the potential to significantly reduce the amount of time and cost required to determine the impedance of several uniform liners by allowing them to be placed in series in the test section and to educe the impedance of each segment using a single numerical experiment. Finally, we probe the objective function in great detail and show that it contains a single minimum. Thus, our objective function is ideal for use with local, inexpensive, gradient-based optimizers.
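
    The eduction step is a textbook inverse problem: choose the impedance that minimizes the mismatch between measured and modeled wall pressures. The sketch below shows only the optimization loop, with a throwaway stand-in for the forward model, since the actual quasi-3-D duct solver cannot be reproduced here; every function and number is illustrative:

```python
import numpy as np
from scipy.optimize import minimize

true_zeta = np.array([0.7, -1.3])   # "unknown" normalized impedance (R, X)
x = np.linspace(0.0, 1.0, 64)       # wall microphone positions (toy)

def forward_model(zeta, x):
    """Stand-in for the quasi-3-D duct propagation code: maps an impedance
    guess to wall-pressure samples. Purely illustrative physics."""
    r, xi = zeta
    return np.cos(2 * np.pi * x / (1 + r)) * np.exp(0.3 * xi * x)

p_meas = forward_model(true_zeta, x)          # synthetic "measured" pressures

def objective(zeta):
    return np.sum((forward_model(zeta, x) - p_meas) ** 2)

# Gradient-based optimizer, echoing the paper's use of local optimizers.
res = minimize(objective, x0=[1.0, 0.0], method="L-BFGS-B",
               bounds=[(0.1, 3.0), (-3.0, 3.0)])
print("Educed impedance:", res.x)             # recovers ~(0.7, -1.3)
```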

  9. Validity and Reliability of Field-Based Measures for Assessing Movement Skill Competency in Lifelong Physical Activities: A Systematic Review.

    PubMed

    Hulteen, Ryan M; Lander, Natalie J; Morgan, Philip J; Barnett, Lisa M; Robertson, Samuel J; Lubans, David R

    2015-10-01

    It has been suggested that young people should develop competence in a variety of 'lifelong physical activities' to ensure that they can be active across the lifespan. The primary aim of this systematic review is to report the methodological properties, validity, reliability, and test duration of field-based measures that assess movement skill competency in lifelong physical activities. A secondary aim was to clearly define those characteristics unique to lifelong physical activities. A search of four electronic databases (Scopus, SPORTDiscus, ProQuest, and PubMed) was conducted between June 2014 and April 2015 with no date restrictions. Studies addressing the validity and/or reliability of lifelong physical activity tests were reviewed. Included articles were required to assess lifelong physical activities using process-oriented measures, as well as to report at least one type of validity or reliability. Assessment criteria for methodological quality were adapted from a checklist used in a previous review of sport skill outcome assessments. Movement skill assessments for eight different lifelong physical activities (badminton, cycling, dance, golf, racquetball, resistance training, swimming, and tennis) in 17 studies were identified for inclusion. Methodological quality, validity, reliability, and test duration (time to assess a single participant) were assessed for each article. Moderate to excellent reliability results were found in 16 of 17 studies, with 71% reporting inter-rater reliability and 41% reporting intra-rater reliability. Only four studies in this review reported test-retest reliability. Ten studies reported validity results; content validity was cited in 41% of these studies. Construct validity was reported in 24% of studies, while criterion validity was only reported in 12% of studies. Numerous assessments for lifelong physical activities may exist, yet only assessments for eight lifelong physical activities were included in this review. Generalizability of results may be improved if more heterogeneous samples are used in future research. Moderate to excellent levels of inter- and intra-rater reliability were reported in the majority of studies. However, future work should look to establish test-retest reliability. Validity was less commonly reported than reliability, and types of validity other than content validity need to be established in future research. Specifically, predictive validity of 'lifelong physical activity' movement skill competency is needed to support the assertion that such activities provide the foundation for a lifetime of activity.

  10. Young adolescents' engagement in dietary behaviour - the impact of gender, socio-economic status, self-efficacy and scientific literacy. Methodological aspects of constructing measures in nutrition literacy research using the Rasch model.

    PubMed

    Guttersrud, Øystein; Petterson, Kjell Sverre

    2015-10-01

    The present study validates a revised scale measuring individuals' level of the 'engagement in dietary behaviour' aspect of 'critical nutrition literacy' and describes how background factors affect this aspect of Norwegian tenth-grade students' nutrition literacy. Data were gathered electronically during a field trial of a standardised sample test in science. Test items and questionnaire constructs were distributed evenly across four electronic field-test booklets. Data management and analysis were performed using the RUMM2030 item analysis package and the IBM SPSS Statistics 20 statistical software package. Students responded on computers at school. Seven hundred and forty tenth-grade students at twenty-seven randomly sampled public schools were enrolled in the field-test study. The engagement in dietary behaviour scale and the self-efficacy in science scale were distributed to 178 of these students. The dietary behaviour scale and the self-efficacy in science scale came out as valid, reliable and well-targeted instruments usable for the construction of measurements. Girls and students with high self-efficacy reported higher engagement in dietary behaviour than other students. Socio-economic status and scientific literacy - measured as ability in science by applying an achievement test - did not correlate significantly with students' engagement in dietary behaviour.

  11. [Marine, aviation and space physician, psychologist and physiologist (To the 90th anniversary of the birth of G. M. Zarakovskii)].

    PubMed

    Dvornikov, M V; Medenkov, A A

    2015-04-01

    This paper discusses problems of marine and aerospace medicine and psychophysiology addressed by Georgii Zarakovskii (1925-2014), a prominent Russian expert in the fields of military medicine, psychology and ergonomics. The authors focus on his methodological approaches and on the results of studies of psychophysiological characteristics and human capabilities that were taken into account in the design of equipment and in the organization of the work of flight crews, astronauts and military specialists. They highlight his contribution to the creation of a system integrating the psychophysiological features and characteristics of the person necessary for the development, testing and maintenance of aerospace equipment and the organization of professional activities. The possibilities of using the methodology of psychophysiological activity analysis to improve the professional reliability of military specialists are shown.

  12. Finding clusters of similar events within clinical incident reports: a novel methodology combining case based reasoning and information retrieval

    PubMed Central

    Tsatsoulis, C; Amthauer, H

    2003-01-01

    A novel methodological approach for identifying clusters of similar medical incidents by analyzing large databases of incident reports is described. The discovery of similar events allows the identification of patterns and trends, and makes possible the prediction of future events and the establishment of barriers and best practices. Two techniques from the fields of information science and artificial intelligence have been integrated—namely, case based reasoning and information retrieval—and very good clustering accuracies have been achieved on a test data set of incident reports from transfusion medicine. This work suggests that clustering should integrate the features of an incident captured in traditional form based records together with the detailed information found in the narrative included in event reports. PMID:14645892

  13. Validation of Vehicle Panel/Equipment Response from Diffuse Acoustic Field Excitation Using Spatially Correlated Transfer Function Approach

    NASA Technical Reports Server (NTRS)

    Smith, Andrew; LaVerde, Bruce; Fulcher, Clay; Hunt, Ron

    2012-01-01

    An approach for predicting the vibration, strain, and force responses of a flight-like vehicle panel assembly to acoustic pressures is presented. Important validation for the approach is provided by comparison to ground test measurements in a reverberant chamber. The test article and the corresponding analytical model were assembled in several configurations to demonstrate the suitability of the approach for response predictions when the vehicle panel is integrated with equipment. Critical choices in the analysis necessary for convergence of the predicted and measured responses are illustrated through sensitivity studies. The methodology includes representation of spatial correlation of the pressure field over the panel surface. Therefore, it is possible to demonstrate the effects of hydrodynamic coincidence in the response. The sensitivity to pressure patch density clearly illustrates the onset of coincidence effects on the panel response predictions.

  14. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  15. Critical reflections on methodological challenge in arts and dementia evaluation and research.

    PubMed

    Gray, Karen; Evans, Simon Chester; Griffiths, Amanda; Schneider, Justine

    2017-01-01

    Methodological rigour, or its absence, is often a focus of concern for the emerging field of evaluation and research around arts and dementia. However, this paper suggests that critical attention should also be paid to the way in which individual perceptions, hidden assumptions and underlying social and political structures influence methodological work in the field. Such attention will be particularly important for addressing methodological challenges relating to contextual variability, ethics, value judgement and signification identified through a literature review on this topic. Understanding how, where and when evaluators and researchers experience such challenges may help to identify fruitful approaches for future evaluation.

  16. HDTS 2017.0 Testing and verification document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteside, Tad S.

    2017-08-01

    This report is a continuation of the series of Hunter Dose Tracking System (HDTS) Quality Assurance documents including (Foley and Powell, 2010; Dixon, 2012). In this report we have created a suite of automated test cases and a system to analyze the results of those tests, and we have documented the methodology used to ensure the field system performs within specifications. The software test cases cover all of the functions and interactions of functions that are practical to test. With the developed framework, if software defects are discovered, it will be easy to create one or more test cases to reproduce the defect and ensure that code changes correct the defect. These tests confirm that HDTS version 2017.0 performs according to its specifications and documentation and that its performance meets the needs of its users at the Savannah River Site.

  17. Radiation resistance of elastomeric O-rings in mixed neutron and gamma fields: Testing methodology and experimental results

    NASA Astrophysics Data System (ADS)

    Zenoni, A.; Bignotti, F.; Donzella, A.; Donzella, G.; Ferrari, M.; Pandini, S.; Andrighetto, A.; Ballan, M.; Corradetti, S.; Manzolaro, M.; Monetti, A.; Rossignoli, M.; Scarpa, D.; Alloni, D.; Prata, M.; Salvini, A.; Zelaschi, F.

    2017-11-01

    Materials and components employed in the presence of intense neutron and gamma fields are expected to absorb high dose levels that may induce deep modifications of their physical and mechanical properties, possibly causing loss of their function. A protocol for irradiating elastomeric materials in reactor mixed neutron and gamma fields and for testing the evolution of their main mechanical and physical properties with absorbed dose has been developed. Four elastomeric compounds used for vacuum O-rings, one fluoroelastomer polymer (FPM) based and three ethylene propylene diene monomer rubber (EPDM) based, presently available on the market, have been selected for the test. One EPDM is rated as radiation resistant in gamma fields, while the other elastomers are general purpose products. Particular care has been devoted to dosimetry calculations, since absorbed dose in neutron fields, unlike pure gamma fields, is strongly dependent on the material composition and, in particular, on the hydrogen content. The products have been tested up to about 2 MGy absorbed dose. The FPM based elastomer, in spite of its lower dose absorption in fast neutron fields, features the largest variations of properties, with a dramatic increase in stiffness and brittleness. Out of the three EPDM based compounds, one shows large and rapid changes in the main mechanical properties, whereas the other two feature more stable behaviors. The performance of the EPDM rated as radiation resistant in pure gamma fields does not appear significantly better than that of the standard product. The predictive capability of the accelerated irradiation tests performed, as well as the applicable concepts of threshold of radiation damage, are discussed in view of the use of the examined products in the Selective Production of Exotic Species (SPES) facility, now under construction at the Legnaro National Laboratories of the Italian Istituto Nazionale di Fisica Nucleare. The results indicate that a careful account of dose rate effects and oxygen penetration in the material, both during test irradiations and under operating conditions, is needed to obtain reliable predictions.

  18. Radiation resistance of elastomeric O-rings in mixed neutron and gamma fields: Testing methodology and experimental results.

    PubMed

    Zenoni, A; Bignotti, F; Donzella, A; Donzella, G; Ferrari, M; Pandini, S; Andrighetto, A; Ballan, M; Corradetti, S; Manzolaro, M; Monetti, A; Rossignoli, M; Scarpa, D; Alloni, D; Prata, M; Salvini, A; Zelaschi, F

    2017-11-01

    Materials and components employed in the presence of intense neutron and gamma fields are expected to absorb high dose levels that may induce deep modifications of their physical and mechanical properties, possibly causing loss of their function. A protocol for irradiating elastomeric materials in reactor mixed neutron and gamma fields and for testing the evolution of their main mechanical and physical properties with absorbed dose has been developed. Four elastomeric compounds used for vacuum O-rings, one fluoroelastomer polymer (FPM) based and three ethylene propylene diene monomer rubber (EPDM) based, presently available on the market, have been selected for the test. One EPDM is rated as radiation resistant in gamma fields, while the other elastomers are general purpose products. Particular care has been devoted to dosimetry calculations, since absorbed dose in neutron fields, unlike pure gamma fields, is strongly dependent on the material composition and, in particular, on the hydrogen content. The products have been tested up to about 2 MGy absorbed dose. The FPM based elastomer, in spite of its lower dose absorption in fast neutron fields, features the largest variations of properties, with a dramatic increase in stiffness and brittleness. Out of the three EPDM based compounds, one shows large and rapid changes in the main mechanical properties, whereas the other two feature more stable behaviors. The performance of the EPDM rated as radiation resistant in pure gamma fields does not appear significantly better than that of the standard product. The predictive capability of the accelerated irradiation tests performed, as well as the applicable concepts of threshold of radiation damage, are discussed in view of the use of the examined products in the Selective Production of Exotic Species (SPES) facility, now under construction at the Legnaro National Laboratories of the Italian Istituto Nazionale di Fisica Nucleare. The results indicate that a careful account of dose rate effects and oxygen penetration in the material, both during test irradiations and under operating conditions, is needed to obtain reliable predictions.

  19. 12 CFR 46.6 - Stress test methodologies and practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Stress test methodologies and practices. 46.6 Section 46.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY ANNUAL STRESS TEST § 46.6 Stress test methodologies and practices. (a) Potential impact on capital. During each quarter of...

  20. 12 CFR 46.6 - Stress test methodologies and practices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 1 2013-01-01 2013-01-01 false Stress test methodologies and practices. 46.6 Section 46.6 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY ANNUAL STRESS TEST § 46.6 Stress test methodologies and practices. (a) Potential impact on capital. During each quarter of...

  1. Reliability Issues and Solutions in Flexible Electronics Under Mechanical Fatigue

    NASA Astrophysics Data System (ADS)

    Yi, Seol-Min; Choi, In-Suk; Kim, Byoung-Joon; Joo, Young-Chang

    2018-07-01

    Flexible devices are of significant interest due to their potential to expand the application of smart devices into various fields, such as energy harvesting, biological applications and consumer electronics. Due to the mechanically dynamic operation of flexible electronics, their mechanical reliability must be thoroughly investigated to understand their failure mechanisms and lifetimes. Reliability issues caused by bending fatigue, one of the typical operational limitations of flexible electronics, have been studied using various test methodologies; however, electromechanical evaluations, which are essential for assessing the reliability of electronic devices in flexible applications, had not been investigated because no testing method was established. By employing the in situ bending fatigue test, we have studied the failure mechanisms under various conditions and parameters, such as bending strain, fatigue area, film thickness, and lateral dimensions. Moreover, various methods for improving bending reliability have been developed based on the failure mechanisms. Nanostructures such as holes, pores, wires and composites of nanoparticles and nanotubes have been suggested for better reliability. Flexible devices were also investigated to find potential failures initiated by complex structures under bending fatigue strain. In this review, recent advances in test methodology, mechanism studies, and practical applications are introduced. Additionally, perspectives, including the future advance toward stretchable electronics, are discussed based on current achievements in research.
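
    For context, the nominal bending strain that drives this kind of fatigue in a thin film on a flexible substrate is commonly estimated from the substrate thickness and bend radius, a standard approximation rather than anything specific to this review:

\[
\varepsilon \approx \frac{t_s}{2R},
\]

    where \(t_s\) is the total substrate thickness and \(R\) the bending radius; for example, a 100 μm substrate bent to \(R = 5\) mm sees a nominal surface strain of about 1%.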

  2. Reliability Issues and Solutions in Flexible Electronics Under Mechanical Fatigue

    NASA Astrophysics Data System (ADS)

    Yi, Seol-Min; Choi, In-Suk; Kim, Byoung-Joon; Joo, Young-Chang

    2018-03-01

    Flexible devices are of significant interest due to their potential to expand the application of smart devices into various fields, such as energy harvesting, biological applications and consumer electronics. Due to the mechanically dynamic operation of flexible electronics, their mechanical reliability must be thoroughly investigated to understand their failure mechanisms and lifetimes. Reliability issues caused by bending fatigue, one of the typical operational limitations of flexible electronics, have been studied using various test methodologies; however, electromechanical evaluations, which are essential for assessing the reliability of electronic devices in flexible applications, had not been investigated because no testing method was established. By employing the in situ bending fatigue test, we have studied the failure mechanisms under various conditions and parameters, such as bending strain, fatigue area, film thickness, and lateral dimensions. Moreover, various methods for improving bending reliability have been developed based on the failure mechanisms. Nanostructures such as holes, pores, wires and composites of nanoparticles and nanotubes have been suggested for better reliability. Flexible devices were also investigated to find potential failures initiated by complex structures under bending fatigue strain. In this review, recent advances in test methodology, mechanism studies, and practical applications are introduced. Additionally, perspectives, including the future advance toward stretchable electronics, are discussed based on current achievements in research.

  3. VEGAS-SSS. A VST early-type galaxy survey: analysis of small stellar systems. Testing the methodology on the globular cluster system in NGC 3115

    NASA Astrophysics Data System (ADS)

    Cantiello, Michele; Capaccioli, Massimo; Napolitano, Nicola; Grado, Aniello; Limatola, Luca; Paolillo, Maurizio; Iodice, Enrica; Romanowsky, Aaron J.; Forbes, Duncan A.; Raimondo, Gabriella; Spavone, Marilena; La Barbera, Francesco; Puzia, Thomas H.; Schipani, Pietro

    2015-03-01

    We present a study of globular clusters (GCs) and other small stellar systems (SSSs) in the field of NGC 3115, observed as part of the ongoing wide-field imaging survey VEGAS, carried out with the 2.6 m VST telescope. We used deep g and i observations of NGC 3115, a well-studied lenticular galaxy that is exceptionally well covered in the scientific literature. This is fundamental to test the methodologies, verify the results, and probe the capabilities of the VEGAS-SSS. Leveraging the large field of view of the VST allowed us to accurately study the distribution and properties of SSSs as a function of galactocentric distance, well beyond ~20 galaxy effective radii, in a way that is rarely possible. Our analysis of colors, magnitudes, and sizes of SSS candidates confirms the results from existing studies, some of which were carried out with 8-10 m class telescopes, and further extends them to previously unreached galactocentric distances with similar accuracy. In particular, we find a color bimodality for the GC population and a de Vaucouleurs r^{1/4} profile for the surface density of GCs similar to the galaxy light profile. The radial color gradient of blue and red GCs previously found, for instance, by the SLUGGS survey with Subaru and Keck data, is further extended out to the largest galactocentric radii inspected, ~65 kpc. In addition, the surface density profiles of blue and red GCs taken separately are well approximated by an r^{1/4} density profile, with the fraction of blue GCs being slightly larger at larger radii. We do not find hints of a trend for the red GC subpopulation and for the GC turnover magnitude to vary with radius, but we observe a ~0.2 mag difference in the turnover magnitude of the blue and red GC subpopulations. Finally, from inspecting SSS sizes and colors, we obtain a list of ultracompact dwarf galaxies and GC candidates suitable for future spectroscopic follow-up. In conclusion, our study shows i) the reliability of the methodologies developed to study SSSs in the field of bright early-type galaxies; and ii) the great potential of the VEGAS survey to produce original results on SSS science, mainly thanks to the wide-field imaging adopted. Full Table 3 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/576/A14
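
    The r^{1/4} law cited here is the de Vaucouleurs profile; written for the GC surface density it takes the standard form (a standard relation with generic parameters, not the paper's fitted values):

\[
\Sigma_{\mathrm{GC}}(R)=\Sigma_e \exp\!\left\{-b_4\left[\left(\frac{R}{R_e}\right)^{1/4}-1\right]\right\},\qquad b_4\simeq 7.669,
\]

    where \(R_e\) is the effective radius enclosing half the clusters and \(\Sigma_e\) the surface density at that radius.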

  4. Development of Methodology and Technology for Identifying and Quantifying Emission Products from Open Burning and Open Detonation Thermal Treatment Methods. Field Test Series A, B, and C. Volume 1. Test Summary

    DTIC Science & Technology

    1992-01-01

    [Table-of-contents and table residue from the original report: Table 3.2, Nominal Composition of Explosive D; Table 3.3, Nominal Composition of PBXN-6. The recoverable text states that the RDX used during Phase C was PBXN-6, a mixture of RDX and Viton A (hereafter referred to as RDX), with RDX making up 95.0% of the composition by weight as given in Table 3.3.]

  5. Evaluation of Visual Analytics Environments: The Road to the Visual Analytics Science and Technology Challenge Evaluation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.

    The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation not only involves assessing the visualizations, interactions or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that the software can be incorporated into the end-user's infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics and evaluation methodologies. As noted in [Thomas 2005], it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult because access to representative end-users who can participate in the evaluation becomes part of the test methodology itself. In many cases the sensitive nature of data and tasks and difficult access to busy analysts put even more of a burden on researchers to complete this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997, Shneiderman 2009]. Having some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end-users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to work on the types of problems faced by inaccessible users. Commercial vendors have difficulties evaluating and improving their products when they cannot observe real users working with their products. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006, when the visual analytics community was initially getting organized, no such guidelines existed. Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology which would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to the researchers so that they could improve their systems.

  6. A pilot study to test psychophonetics methodology for self-care and empathy in compassion fatigue, burnout and secondary traumatic stress

    PubMed Central

    Butler, Nadine

    2013-01-01

    Abstract Background Home-based care is recognised as being a stressful occupation. Practitioners working with patients experiencing high levels of trauma may be susceptible to compassion fatigue, with the sustained need to remain empathic being a contributing factor. Objectives The aim of this research was to evaluate psychophonetics methodology for self-care and empathy skills as an intervention for compassion fatigue. Objectives were to measure levels of compassion fatigue pre-intervention, then to apply the intervention and retest levels one month and six months post-intervention. Method The research applied a pilot test of a developed intervention as a quasi-experiment. The study sample comprised home-based carers working with HIV-positive patients at a hospice in Grabouw, a settlement in the Western Cape facing socioeconomic challenges. Results The results of the pilot study showed a statistically significant improvement in secondary traumatic stress, a component of compassion fatigue, measured with the ProQOL v5 instrument post-intervention. Conclusion The results gave adequate indication for the implementation of a larger study in order to apply and test the intervention. The study highlights a dire need for further research in this field.

  7. Proteins, Platelets, and Blood Coagulation at Biomaterial Interfaces

    PubMed Central

    Xu, Li-Chong; Bauer, James; Siedlecki, Christopher A.

    2015-01-01

    Blood coagulation and platelet adhesion remain major impediments to the use of biomaterials in implantable medical devices. There is still significant controversy and question in the field regarding the role that surfaces play in this process. This manuscript addresses this topic area and reports on state of the art in the field. Particular emphasis is placed on the subject of surface engineering and surface measurements that allow for control and observation of surface-mediated biological responses in blood and test solutions. Appropriate use of surface texturing and chemical patterning methodologies allow for reduction of both blood coagulation and platelet adhesion, and new methods of surface interrogation at high resolution allow for measurement of the relevant biological factors. PMID:25448722

  8. Reconstructing Star Formation Histories of Galaxies

    NASA Astrophysics Data System (ADS)

    Fritze-v. Alvensleben, U.; Lilly, T.

    2007-12-01

    We present a methodological study to find out how far back and to what precision star formation histories of galaxies can be reconstructed from CMDs, from integrated spectra and Lick indices, and from integrated multi-band photometry. Our evolutionary synthesis models GALEV allow us to describe the evolution of galaxies in terms of all three approaches, and we have assumed typical observational uncertainties for each of them and then investigated to what extent and with what accuracy different star formation histories can be discriminated. For a field in the LMC bar region with both a deep CMD from HST observations and a trailing slit spectrum across exactly the same field of view, we could test our modelling results against real data.

  9. Diagnosing schistosomiasis: where are we?

    PubMed

    Gomes, Luciana Inácia; Enk, Martin Johannes; Rabello, Ana

    2014-01-01

    In light of the World Health Organization's initiative to extend schistosomiasis morbidity and mortality control programs by including a disease elimination strategy in low endemic settings, this paper reviews diagnostic tools described during the last decades and provides an overview of ongoing efforts in making an efficient diagnostic tool available worldwide. A literature search on PubMed using the search criteria schistosomiasis and diagnosis within the period from 1978 to 2013 was carried out. Articles with an abstract in English and that used laboratory techniques specifically developed for the detection of schistosomiasis in humans were included. Publications were categorized according to the methodology applied (parasitological, immunological, or molecular) and stage of development (in-house development, limited field, or large scale field testing). The initial search generated 4,535 publications, of which only 643 met the inclusion criteria. The vast majority (537) of the publications focused on immunological techniques; 81 focused on parasitological diagnosis, and 25 focused on molecular diagnostic methods. Regarding the stage of development, 307 papers referred to in-house development, 202 referred to limited field tests, and 134 referred to large scale field testing. The data obtained show that promising new diagnostic tools, especially for Schistosoma antigen and deoxyribonucleic acid (DNA) detection, which are characterized by high sensitivity and specificity, are being developed. In combination with international funding initiatives, these tools may result in a significant step forward in successful disease elimination and surveillance by making efficient tests accessible and their large-scale use self-sustainable for control programs in endemic countries.

  10. Numerical Modelling of Femur Fracture and Experimental Validation Using Bone Simulant.

    PubMed

    Marco, Miguel; Giner, Eugenio; Larraínzar-Garijo, Ricardo; Caeiro, José Ramón; Miguélez, María Henar

    2017-10-01

    Bone fracture pattern prediction is still a challenge and an active field of research. The main goal of this article is to present a combined methodology (experimental and numerical) for femur fracture onset analysis. Experimental work includes the characterization of the mechanical properties and fracture testing on a bone simulant. The numerical work focuses on the development of a model whose material properties are provided by the characterization tests. The fracture location and the early stages of the crack propagation are modelled using the extended finite element method and the model is validated by fracture tests developed in the experimental work. It is shown that the accuracy of the numerical results strongly depends on a proper bone behaviour characterization.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabrera-Palmer, Belkis

    Predicting the performance of radiation detection systems at field sites based on measured performance acquired under controlled conditions at test locations, e.g., the Nevada National Security Site (NNSS), remains an unsolved and standing issue within DNDO’s testing methodology. Detector performance can be defined in terms of the system’s ability to detect and/or identify a given source or set of sources, and depends on the signal generated by the detector for the given measurement configuration (i.e., source strength, distance, time, surrounding materials, etc.) and on the quality of the detection algorithm. Detector performance is usually evaluated in the performance and operational testing phases, where the measurement configurations are selected to represent radiation source and background configurations of interest to security applications.

  12. Measuring College Students' Alcohol Consumption in Natural Drinking Environments: Field Methodologies for Bars and Parties

    ERIC Educational Resources Information Center

    Clapp, John D.; Holmes, Megan R.; Reed, Mark B.; Shillington, Audrey M.; Freisthler, Bridget; Lange, James E.

    2007-01-01

    In recent years researchers have paid substantial attention to the issue of college students' alcohol use. One limitation to the current literature is an over reliance on retrospective, self-report survey data. This article presents field methodologies for measuring college students' alcohol consumption in natural drinking environments.…

  13. Neuroethics and animals: methods and philosophy.

    PubMed

    Takala, Tuija; Häyry, Matti

    2014-04-01

    This article provides an overview of the six other contributions in the Neuroethics and Animals special section. In addition, it discusses the methodological and theoretical problems of interdisciplinary fields. The article suggests that interdisciplinary approaches without established methodological and theoretical bases are difficult to assess scientifically. This might cause these fields to expand without actually advancing.

  14. Methods to Prove 20+ Year Life of CPV Products (in less than 20 Years)

    NASA Astrophysics Data System (ADS)

    Bowman, John; Spencer, Mark

    2011-12-01

    Due to the long term life expectations of photovoltaic products and the short duration of most introduced CPV technologies, it is critical for CPV companies to carefully construct field trials to prove product life. Because of the complicated geometric, thermal, and spectral characteristics of CPV systems, conducting very precise power output measurements reproducibly over many months is very difficult. Robust normalization methods specific to the exact optical system and PV cell type must be developed. Once the performance over a specific duration, e.g. one year, is established, then some justification is required to extrapolate to future performance. Comparisons to accelerated test results provide this justification. SolFocus has been conducting field trials of the SF-1100S CPV system for over two years. These field trials consist of controlled populations of SF-1100P modules, operating in grid-tied systems, which have been repeatedly measured at the individual module level over the duration of the trials. In this paper, field data will be presented along with normalization methodology and statistical methods for determining power degradation slope distributions for populations of individual modules. These results will be correlated with accelerated field tests which have been ongoing for 1.5 years and are estimated to be equivalent to 10 to 15 years of non-accelerated operation.
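
    To illustrate the population-level slope analysis described above, here is a minimal sketch: normalize each module's measured power, fit a per-module linear trend, and summarize the distribution of slopes. The module count, degradation rate, and noise level are hypothetical placeholders, not SolFocus data.

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(24)

# Hypothetical normalized power (ratio to initial) for 50 modules with a
# true mean degradation of -0.5 %/yr plus measurement noise.
true_slope = -0.005 / 12
power = 1.0 + true_slope * months + rng.normal(0, 0.004, (50, months.size))

# Per-module linear fit; convert monthly slope to an annual rate.
slopes = np.array([np.polyfit(months, p, 1)[0] for p in power]) * 12
print(f"median degradation: {np.median(slopes):+.2%}/yr")
print(f"95% of modules within: {np.percentile(slopes, [2.5, 97.5])}")
```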

  15. A methodology of SiP testing based on boundary scan

    NASA Astrophysics Data System (ADS)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

    System in Package (SiP) plays an important role in portable, aerospace and military electronics owing to its microminiaturization, light weight, high density, and high reliability. At present, SiP system testing faces problems of system complexity and malfunction location as system scale increases exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary scan technology. Combining the characteristics of SiP systems with boundary scan theory for PCB circuits and embedded core testing, a specific testing methodology and process is proposed. The hardware requirements of the SiP system under test are specified, and the hardware platform for testing has been constructed. The methodology offers high test efficiency and accurate malfunction location.

  16. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions different from damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear-T² damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% at a 99% confidence level.
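
    As a rough illustration of the index computation underlying this family of methods (standard linear-PCA monitoring with T² and Q statistics, not the authors' hierarchical nonlinear variant), the sketch below fits a baseline model and scores new strain snapshots; all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
baseline = rng.normal(size=(200, 32))          # 200 snapshots, 32 FBG strains

# PCA via SVD on mean-centered baseline data
mu = baseline.mean(axis=0)
U, s, Vt = np.linalg.svd(baseline - mu, full_matrices=False)
k = 5                                          # retained components
P = Vt[:k].T                                   # loadings (32 x k)
lam = (s[:k] ** 2) / (len(baseline) - 1)       # component variances

def damage_indices(x):
    """Return (T2, Q) for one strain snapshot x."""
    t = (x - mu) @ P                           # scores in model subspace
    T2 = np.sum(t ** 2 / lam)                  # variation inside the model
    resid = (x - mu) - t @ P.T                 # part not explained by model
    Q = resid @ resid                          # squared prediction error
    return T2, Q

healthy = rng.normal(size=32)
damaged = healthy + 0.8 * np.eye(32)[10]       # local strain anomaly
print("healthy:", damage_indices(healthy))
print("damaged:", damage_indices(damaged))
```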

  17. 1996-2016: Two decades of econophysics: Between methodological diversification and conceptual coherence

    NASA Astrophysics Data System (ADS)

    Schinckus, C.

    2016-12-01

    This article aimed at presenting the scattered econophysics literature as a unified and coherent field through a specific lens imported from philosophy of science. More precisely, I used the methodology developed by Imre Lakatos to cover the methodological evolution of econophysics over these last two decades. In this perspective, three co-existing approaches have been identified: statistical econophysics, bottom-up agent-based econophysics and top-down agent-based econophysics. Although the last of these is presented here as the latest step in the methodological evolution of econophysics, it is worth mentioning that this tradition is still very new. A quick look at the econophysics literature shows that the vast majority of works in this field deal with a strictly statistical approach or classical bottom-up agent-based modelling. In this context of diversification, the objective (and contribution) of this article is to emphasize the conceptual coherence of econophysics as a unique field of research. With this purpose, I used a theoretical framework from the philosophy of science to characterize how econophysics evolved by combining methodological enrichment with the preservation of its core conceptual statements.

  18. Recent Advances of MEMS Resonators for Lorentz Force Based Magnetic Field Sensors: Design, Applications and Challenges.

    PubMed

    Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio

    2016-08-24

    Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications such as biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate with Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature and gases).

  19. Recent Advances of MEMS Resonators for Lorentz Force Based Magnetic Field Sensors: Design, Applications and Challenges

    PubMed Central

    Herrera-May, Agustín Leobardo; Soler-Balcazar, Juan Carlos; Vázquez-Leal, Héctor; Martínez-Castillo, Jaime; Vigueras-Zuñiga, Marco Osvaldo; Aguilera-Cortés, Luz Antonio

    2016-01-01

    Microelectromechanical systems (MEMS) resonators have allowed the development of magnetic field sensors with potential applications such as biomedicine, the automotive industry, navigation systems, space satellites, telecommunications and non-destructive testing. We present a review of recent magnetic field sensors based on MEMS resonators, which operate with Lorentz force. These sensors have a compact structure, wide measurement range, low energy consumption, high sensitivity and suitable performance. The design methodology, simulation tools, damping sources, sensing techniques and future applications of magnetic field sensors are discussed. The design process is fundamental in achieving correct selection of the operation principle, sensing technique, materials, fabrication process and readout systems of the sensors. In addition, the main sensing systems and challenges of the MEMS sensors are discussed. To develop the best devices, research on their mechanical reliability, vacuum packaging, design optimization and temperature compensation circuits is needed. Future applications will require multifunctional sensors for monitoring several physical parameters (e.g., magnetic field, acceleration, angular rate, humidity, temperature and gases). PMID:27563912

  20. Phase II: Field Detector Development For Undeclared/Declared Nuclear Testing For Treaty Verification Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriz, M.; Hunter, D.; Riley, T.

    2015-10-02

    Radioactive xenon isotopes are a critical part of the Comprehensive Nuclear Test Ban Treaty (CTBT) for the detection or confirmation of nuclear weapons tests as well as for on-site treaty verification monitoring. On-site monitoring is not currently conducted because there are no commercially available small, robust field detector devices to measure the radioactive xenon isotopes. Xenon isotopes are an ideal signature for detecting clandestine nuclear events since they are difficult to contain and, due to their inert nature, can diffuse and migrate through soils. There are four key radioxenon isotopes used in monitoring: 135Xe (9 hour half-life), 133mXe (2 day half-life), 133Xe (5 day half-life) and 131mXe (12 day half-life), which decay through beta and gamma emission. Savannah River National Laboratory (SRNL) is a leader in the field of gas collection and has developed highly selective molecular sieves that allow for the collection of xenon gas directly from air. Phase I assessed the development of a small, robust beta-gamma coincidence counting system that combines collection and in situ detection methodologies. Phase II of the project began development of the custom electronics enabling 2D beta-gamma coincidence analysis in a field portable system. This will be a significant advancement for field detection/quantification of short-lived xenon isotopes that would not survive transport time for laboratory analysis.
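
    The urgency of in situ counting follows directly from the decay law A(t) = A0 · 2^(−t/T½); a small sketch comparing how much of each radioxenon signature would survive a hypothetical 3-day shipment to an off-site laboratory:

```python
import numpy as np

half_lives_days = {"Xe-135": 9 / 24, "Xe-133m": 2, "Xe-133": 5, "Xe-131m": 12}
transport_days = 3.0  # hypothetical shipping delay

for iso, t_half in half_lives_days.items():
    surviving = np.exp(-np.log(2) * transport_days / t_half)
    print(f"{iso}: {surviving:.1%} of activity remains after transport")
# Xe-135 drops below 1%, which is why field-portable counting matters.
```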

  1. Small Hot Jet Acoustic Rig Validation

    NASA Technical Reports Server (NTRS)

    Brown, Cliff; Bridges, James

    2006-01-01

    The Small Hot Jet Acoustic Rig (SHJAR), located in the Aeroacoustic Propulsion Laboratory (AAPL) at the NASA Glenn Research Center in Cleveland, Ohio, was commissioned in 2001 to test jet noise reduction concepts at low technology readiness levels (TRL 1-3) and develop advanced measurement techniques. The first series of tests on the SHJAR were designed to prove its capabilities and establish the quality of the jet noise data produced. Towards this goal, a methodology was employed dividing all noise sources into three categories: background noise, jet noise, and rig noise. Background noise was directly measured. Jet noise and rig noise were separated by using the distance and velocity scaling properties of jet noise. Effectively, any noise source that did not follow these rules of jet noise was labeled as rig noise. This method led to the identification of a high frequency noise source related to the Reynolds number. Experiments using boundary layer treatment and hot wire probes documented this noise source and its removal, allowing clean testing of low Reynolds number jets. Other tests performed characterized the amplitude and frequency of the valve noise, confirmed the location of the acoustic far field, and documented the background noise levels under several conditions. Finally, a full set of baseline data was acquired. This paper contains the methodology and test results used to verify the quality of the SHJAR rig.
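
    The rig-noise separation leans on classical scaling laws: jet mixing noise intensity grows roughly as the eighth power of jet velocity (Lighthill) and decays with the square of distance. A hedged sketch of collapsing spectra to a common reference condition; the exponents are textbook values and the numbers are hypothetical, not SHJAR calibration data:

```python
import numpy as np

def collapse_spl(spl_db, v_jet, r_mic, v_ref=300.0, r_ref=1.0):
    """Scale a measured SPL spectrum (dB) to reference velocity/distance.
    Sources obeying V^8 and 1/r^2 jet-noise scaling will overlay after
    this correction; anything that does not collapse is suspect rig noise."""
    return (spl_db
            - 80.0 * np.log10(v_jet / v_ref)   # 10*log10(V^8) velocity term
            + 20.0 * np.log10(r_mic / r_ref))  # spherical spreading to r_ref

spl = np.array([95.0, 98.0, 96.0])             # hypothetical 1/3-octave levels
print(collapse_spl(spl, v_jet=250.0, r_mic=2.5))
```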

  2. Assessing Methods for Mapping 2D Field Concentrations of CO2 Over Large Spatial Areas for Monitoring Time Varying Fluctuations

    NASA Astrophysics Data System (ADS)

    Zaccheo, T. S.; Pernini, T.; Botos, C.; Dobler, J. T.; Blume, N.; Braun, M.; Levine, Z. H.; Pintar, A. L.

    2014-12-01

    This work presents a methodology for constructing 2D estimates of CO2 field concentrations from integrated open path measurements of CO2 concentrations. It provides a description of the methodology, an assessment based on simulated data and results from preliminary field trials. The Greenhouse gas Laser Imaging Tomography Experiment (GreenLITE) system, currently under development by Exelis and AER, consists of a set of laser-based transceivers and a number of retro-reflectors coupled with a cloud-based compute environment to enable real-time monitoring of integrated CO2 path concentrations, and provides 2D maps of estimated concentrations over an extended area of interest. The GreenLITE transceiver-reflector pairs provide laser absorption spectroscopy (LAS) measurements of differential absorption due to CO2 along intersecting chords within the field of interest. These differential absorption values for the intersecting chords of horizontal path are not only used to construct estimated values of integrated concentration, but also employed in an optimal estimation technique to derive 2D maps of underlying concentration fields. This optimal estimation technique combines these sparse data with in situ measurements of wind speed/direction and an analytic plume model to provide tomographic-like reconstruction of the field of interest. This work provides an assessment of this reconstruction method and preliminary results from the Fall 2014 testing at the Zero Emissions Research and Technology (ZERT) site in Bozeman, Montana. This work is funded in part under the GreenLITE program developed under a cooperative agreement between Exelis and the National Energy and Technology Laboratory (NETL) under the Department of Energy (DOE), contract # DE-FE0012574. Atmospheric and Environmental Research, Inc. is a major partner in this development.
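
    The core inversion step, recovering a gridded 2D field from sparse chord integrals, can be illustrated with a toy linear reconstruction: each chord contributes one row of per-cell path lengths, and a regularized least-squares solve returns a smoothed field. This stand-in deliberately omits the wind data and analytic plume model used in the actual GreenLITE estimation; the grid size, chord layout, and regularization weight below are all assumed.

        import numpy as np

        def reconstruct(A, y, alpha=1e-2):
            """Tikhonov-regularized least squares: min ||Ax - y||^2 + alpha*||x||^2."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

        # Toy setup: 4x4 grid, one chord along every row and every column.
        n = 4
        truth = np.zeros((n, n)); truth[1, 2] = 5.0   # a single "plume" cell
        A = np.zeros((2 * n, n * n))
        for i in range(n):
            A[i, i * n:(i + 1) * n] = 1.0             # chord along row i
            A[n + i, i::n] = 1.0                      # chord along column i
        y = A @ truth.ravel()                         # synthetic path integrals
        print(np.round(reconstruct(A, y).reshape(n, n), 2))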

  3. Applications of multivariate modeling to neuroimaging group analysis: a comprehensive alternative to univariate general linear model.

    PubMed

    Chen, Gang; Adleman, Nancy E; Saad, Ziad S; Leibenluft, Ellen; Cox, Robert W

    2014-10-01

    All neuroimaging packages can handle group analysis with t-tests or general linear modeling (GLM). However, they are quite hamstrung when there are multiple within-subject factors or when quantitative covariates are involved in the presence of a within-subject factor. In addition, sphericity is typically assumed for the variance-covariance structure when there are more than two levels in a within-subject factor. To overcome such limitations of the traditional AN(C)OVA and GLM, we adopt a multivariate modeling (MVM) approach to analyzing neuroimaging data at the group level, with the following advantages: a) there is no limit on the number of factors as long as sample sizes are deemed appropriate; b) quantitative covariates can be analyzed together with within-subject factors; c) when a within-subject factor is involved, three testing methodologies are provided: traditional univariate testing (UVT) with the sphericity assumption (UVT-UC) and with correction when the assumption is violated (UVT-SC), and within-subject multivariate testing (MVT-WS); d) to correct for sphericity violation at the voxel level, we propose a hybrid testing (HT) approach that achieves equal or higher power by combining traditional sphericity correction methods (Greenhouse-Geisser and Huynh-Feldt) with MVT-WS. To validate the MVM methodology, we performed simulations to assess the control of false positives and the power achieved. A real FMRI dataset was analyzed to demonstrate the capability of the MVM approach. The methodology has been implemented in the open source program 3dMVM in AFNI, and all the statistical tests can be performed through symbolic coding with variable names instead of the tedious process of dummy coding. Our data indicate that the severity of sphericity violation varies substantially across brain regions. The differences among the various modeling methodologies were addressed through direct comparisons between the MVM approach and some of the GLM implementations in the field, raising the following two issues: a) the improper formulation of test statistics in some univariate GLM implementations when a within-subject factor is involved in a data structure with two or more factors, and b) the unjustified presumption of uniform sphericity violation and the practice of estimating the variance-covariance structure through pooling across brain regions. Published by Elsevier Inc.
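
    For reference, the Greenhouse-Geisser correction combined in the hybrid test rests on a sphericity estimate epsilon computed from the covariance of orthonormalized within-subject contrasts. The sketch below is the generic textbook formula, not code from 3dMVM, and the helper name is an assumption.

        import numpy as np
        from scipy.linalg import helmert

        def greenhouse_geisser_epsilon(data):
            """data: (n_subjects, k_conditions) responses for one voxel/region.
            Returns epsilon in [1/(k-1), 1]; 1 means sphericity holds."""
            k = data.shape[1]
            C = helmert(k)                            # (k-1, k) orthonormal contrasts
            V = C @ np.cov(data, rowvar=False) @ C.T  # contrast covariance
            return np.trace(V) ** 2 / ((k - 1) * np.trace(V @ V))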

  4. Tools of the trade: theory and method in mindfulness neuroscience.

    PubMed

    Tang, Yi-Yuan; Posner, Michael I

    2013-01-01

    Mindfulness neuroscience is an emerging research field that investigates the underlying mechanisms of different mindfulness practices, different stages and different states of practice as well as different effects of practice over the lifespan. Mindfulness neuroscience research integrates theory and methods from eastern contemplative traditions, western psychology and neuroscience, and from neuroimaging techniques, physiological measures and behavioral tests. We here review several key theoretical and methodological challenges in the empirical study of mindfulness neuroscience and provide suggestions for overcoming these challenges.

  5. Tests for the Assessment of Sport-Specific Performance in Olympic Combat Sports: A Systematic Review With Practical Recommendations.

    PubMed

    Chaabene, Helmi; Negra, Yassine; Bouguezzi, Raja; Capranica, Laura; Franchini, Emerson; Prieske, Olaf; Hbacha, Hamdi; Granacher, Urs

    2018-01-01

    The regular monitoring of physical fitness and sport-specific performance is important in elite sports to increase the likelihood of success in competition. This study aimed to systematically review and critically appraise the methodological quality, validation data, and feasibility of sport-specific performance assessments in Olympic combat sports such as amateur boxing, fencing, judo, karate, taekwondo, and wrestling. A systematic search was conducted in the electronic databases PubMed, Google Scholar, and ScienceDirect up to October 2017. Studies in combat sports were included that reported validation data (e.g., reliability, validity, sensitivity) of sport-specific tests. Overall, 39 studies were eligible for inclusion in this review. The majority of studies (74%) contained sample sizes <30 subjects. Nearly one-third of the reviewed studies lacked a sufficient description (e.g., anthropometrics, age, expertise level) of the included participants. Seventy-two percent of studies did not sufficiently report inclusion/exclusion criteria for their participants. In 62% of the included studies, the description and/or inclusion of familiarization session(s) was either incomplete or missing. Sixty percent of studies did not report any details about the stability of testing conditions. Approximately half of the studies examined reliability measures of the included sport-specific tests (intraclass correlation coefficient [ICC] = 0.43-1.00). Content validity was addressed in all included studies; criterion validity (only its concurrent aspect) was addressed in approximately half of the studies, with correlation coefficients ranging from r = -0.41 to 0.90. Construct validity was reported in 31% of the included studies and predictive validity in only one. Test sensitivity was addressed in 13% of the included studies. The majority of studies (64%) ignored and/or provided incomplete information on test feasibility and the methodological limitations of the sport-specific test. In 28% of the included studies, insufficient or no information was provided on the respective field of application of the test. Several methodological gaps exist in studies that used sport-specific performance tests in Olympic combat sports. Additional research should adopt more rigorous validation procedures in the application and description of sport-specific performance tests in Olympic combat sports.

  6. Background element content of the lichen Pseudevernia furfuracea: A supra-national state of art implemented by novel field data from Italy.

    PubMed

    Cecconi, Elva; Incerti, Guido; Capozzi, Fiore; Adamo, Paola; Bargagli, Roberto; Benesperi, Renato; Candotto Carniel, Fabio; Favero-Longo, Sergio Enrico; Giordano, Simonetta; Puntillo, Domenico; Ravera, Sonia; Spagnuolo, Valeria; Tretiach, Mauro

    2018-05-01

    In biomonitoring, knowledge of background element content (BEC) values is an essential prerequisite for the correct assessment of pollution levels. Here, we estimated the BEC values of a highly performing biomonitor, the epiphytic lichen Pseudevernia furfuracea, by means of a careful review of literature data, integrated by an extensive field survey. Methodologically homogeneous element content datasets, reflecting different exposure conditions across European and extra-European countries, were compiled and comparatively analysed. Element content in samples collected in remote areas was compared to that of potentially enriched samples, testing differences between medians for 25 elements. This analysis confirmed that the former samples were substantially unaffected by anthropogenic contributions, and their metrics were therefore proposed as a first overview at the supra-national background level. We also showed that bioaccumulation studies suffer from considerable methodological variability. Limited to the original field data, we investigated the background variability of 43 elements in 62 remote Italian sites, characterized in a GIS environment for anthropization, land use, climate and lithology at different scale resolutions. The relationships between selected environmental descriptors and BEC were tested using Principal Component Regression (PCR) modelling. Elemental composition proved to be significantly dependent on land use, climate and lithology. In the case of lithogenic elements, regression models correctly reproduced the lichen content throughout the country at randomly selected sites. Further descriptors should be identified only for As, Co, and V. Through a multivariate approach we also identified three geographically homogeneous macro-regions for which specific BECs were provided for use as references in biomonitoring applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Path planning in uncertain flow fields using ensemble method

    NASA Astrophysics Data System (ADS)

    Wang, Tong; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.

    2016-10-01

    An ensemble-based approach is developed to conduct optimal path planning in unsteady ocean currents under uncertainty. We focus our attention on two-dimensional steady and unsteady uncertain flows, and adopt a sampling methodology that is well suited to operational forecasts, where an ensemble of deterministic predictions is used to model and quantify uncertainty. In an operational setting, much about dynamics, topography, and forcing of the ocean environment is uncertain. To address this uncertainty, the flow field is parametrized using a finite number of independent canonical random variables with known densities, and the ensemble is generated by sampling these variables. For each of the resulting realizations of the uncertain current field, we predict the path that minimizes the travel time by solving a boundary value problem (BVP), based on the Pontryagin maximum principle. A family of backward-in-time trajectories starting at the end position is used to generate suitable initial values for the BVP solver. This allows us to examine and analyze the performance of the sampling strategy and to develop insight into extensions dealing with general circulation ocean models. In particular, the ensemble method enables us to perform a statistical analysis of travel times and consequently develop a path planning approach that accounts for these statistics. The proposed methodology is tested for a number of scenarios. We first validate our algorithms by reproducing simple canonical solutions, and then demonstrate our approach in more complex flow fields, including idealized, steady and unsteady double-gyre flows.
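
    The ensemble statistics step can be sketched independently of the BVP solver: evaluate one candidate path against every sampled realization of the flow and summarize the resulting travel times. The toy below uses a gyre-like flow with a single random amplitude and an along-track kinematic approximation (cross-track drift is ignored); the path, speeds, and distribution parameters are all assumed.

        import numpy as np

        def travel_time(path, flow, v_rel=1.0):
            """path: (n, 2) waypoints; flow(x, y) -> (u, v); returns total time."""
            t = 0.0
            for p, q in zip(path[:-1], path[1:]):
                seg = q - p
                ds = np.linalg.norm(seg)
                tang = seg / ds
                u = np.asarray(flow(*(0.5 * (p + q))))   # current at midpoint
                ground = v_rel + float(u @ tang)          # along-track aid/opposition
                t += ds / max(ground, 1e-6)               # guard against stalling
            return t

        def make_flow(a):                                 # one uncertain amplitude
            return lambda x, y: (-a * np.sin(np.pi * x) * np.cos(np.pi * y),
                                  a * np.cos(np.pi * x) * np.sin(np.pi * y))

        rng = np.random.default_rng(0)
        path = np.linspace([0.1, 0.1], [0.9, 0.9], 50)
        times = [travel_time(path, make_flow(a)) for a in rng.normal(0.3, 0.1, 200)]
        print(f"travel time: mean {np.mean(times):.3f}, std {np.std(times):.3f}")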

  8. Methodologies for evaluating performance and assessing uncertainty of atmospheric dispersion models

    NASA Astrophysics Data System (ADS)

    Chang, Joseph C.

    This thesis describes methodologies to evaluate the performance and assess the uncertainty of atmospheric dispersion models, tools that predict the fate of gases and aerosols upon their release into the atmosphere. Because of the large economic and public-health impacts often associated with the use of dispersion model results, these models should be properly evaluated, and their uncertainty should be properly accounted for and understood. The CALPUFF, HPAC, and VLSTRACK dispersion modeling systems were applied to the Dipole Pride (DP26) field data (~20 km in scale) in order to demonstrate the evaluation and uncertainty assessment methodologies. Dispersion model performance was found to be strongly dependent on the wind models used to generate gridded wind fields from observed station data; despite the test site being flat, the observed surface wind fields still showed considerable spatial variability, partly because of the surrounding mountains. The two error components, variability (random turbulence) and input-data uncertainty, were found to be comparable for the DP26 field data, with variability more important than uncertainty close to the source and less important farther away. Therefore, reducing errors in input meteorology may not necessarily increase model accuracy, owing to random turbulence. DP26 was a research-grade field experiment, where the source, meteorological, and concentration data were all well measured. Another typical application of dispersion modeling is a forensic study, where the data are usually quite scarce. An example is the modeling of the alleged releases of chemical warfare agents during the 1991 Persian Gulf War, where the source data had to rely on intelligence reports, and where Iraq had stopped reporting weather data to the World Meteorological Organization since the Iran-Iraq war in 1981. The meteorological fields inside Iraq therefore had to be estimated by models such as prognostic mesoscale meteorological models, based on observational data from areas outside Iraq and using the global fields simulated by global meteorological models as the initial and boundary conditions for the mesoscale models. It was found that while model predictions compared to observations outside Iraq showed surface wind direction errors between 30 and 90 deg, the inter-model differences (or uncertainties) in the predicted surface wind directions inside Iraq, where there were no onsite data, were fairly constant at about 70 deg. (Abstract shortened by UMI.)

  9. Joint Test Protocol: Environmentally Friendly Zirconium Oxide Pretreatment Demonstration

    DTIC Science & Technology

    2013-12-01

    Excerpts: the test methodology sections cover pencil hardness testing of conversion pretreatment coatings, for which loss of paint adhesion is the primary failure mode on aluminum and steel. From the abstract: there is a need to implement innovative and cost-effective replacement technologies to address the multiple health, safety...

  10. Robot navigation research using the HERMIES mobile robot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, D.L.

    1989-01-01

    In recent years robot navigation has attracted much attention from researchers around the world. Not only are theoretical studies being simulated on sophisticated computers, but many mobile robots are now used as test vehicles for these theoretical studies. Various algorithms have been perfected for navigation in a known static environment, but navigation in an unknown and dynamic environment poses a much more challenging problem. Many different methodologies have been developed for autonomous robot navigation, but each is usually restricted to a particular type of environment. One important research focus of the Center for Engineering Systems Advanced Research (CESAR) at Oak Ridge National Laboratory is autonomous navigation in unknown and dynamic environments using the series of HERMIES mobile robots. The research uses an expert system for high-level planning interfaced with C-coded routines for implementing the plans and for quick processing of data requested by the expert system. With this approach, navigation is not restricted to one methodology, since the expert system can activate a rule module for the methodology best suited to the current situation. Rule modules can be added to the rule base as they are developed and tested. Modules are being developed or enhanced for navigating from a map, searching for a target, exploring, artificial potential-field navigation, navigation using edge detection, etc. This paper reports on the various rule modules and methods of navigation in use or under development at CESAR, using the HERMIES-IIB robot as a testbed. 13 refs., 5 figs., 1 tab.

  11. A systematic review of stakeholder views of selection methods for medical schools admission.

    PubMed

    Kelly, M E; Patterson, F; O'Flynn, S; Mulligan, J; Murphy, A W

    2018-06-15

    The purpose of this paper is to systematically review the literature with respect to stakeholder views of selection methods for medical school admissions. An electronic search of nine databases was conducted covering January 2000 to July 2014. Two reviewers independently assessed all titles (n = 1017) and retained abstracts (n = 233) for relevance. The methodological quality of quantitative papers was assessed using the MERSQI instrument. The overall quality of evidence in this field was low. Evidence was synthesised in a narrative review. Applicants support interviews and multiple mini-interviews (MMIs). There is emerging evidence that situational judgement tests (SJTs) and selection centres (SCs) are also well regarded, but aptitude tests less so. Selectors endorse the use of interviews in general, and MMIs in particular, judging them to be fair, relevant and appropriate, with emerging evidence of similarly positive reactions to SCs. Aptitude tests and academic records were valued in decisions of whom to call to interview. Medical students prefer interview-based selection to cognitive aptitude tests. They are unconvinced about the transparency and veracity of written applications. Perceptions of organisational justice, which describe views of fairness in organisational processes, appear to be highly influential on stakeholders' views of the acceptability of selection methods. In particular, procedural justice (perceived fairness of selection tools in terms of job relevance and characteristics of the test) and distributive justice (perceived fairness of selection outcomes in terms of equal opportunity and equity) appear to be important considerations when deciding on the acceptability of selection methods. There were significant gaps with respect to both key stakeholder groups and the range of selection tools assessed. Notwithstanding the observed limitations in the quality of research in this field, there appears to be broad concordance of views on the various selection methods across the diverse stakeholder groups. This review highlights the need for better standards, more appropriate methodologies and a broader scope of stakeholder research.

  12. Finite element analysis of steady and transiently moving/rolling nonlinear viscoelastic structure. III - Impact/contact simulations

    NASA Technical Reports Server (NTRS)

    Nakajima, Yukio; Padovan, Joe

    1987-01-01

    In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling-load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving problems involving contact/impact-type loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered, including the rolling/sliding impact of tires with road obstructions.

  13. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    DTIC Science & Technology

    2018-02-01

    Excerpts: Soldier Equipment Configuration Impact on Performance: Establishing a Test Methodology for the... Performance of Medium Rucksack Prototypes; an investigation comparing live-fire and weapon-simulator test methodologies and the... of three extremity armor...

  14. The SAMI Galaxy Survey: cubism and covariance, putting round pegs into square holes

    NASA Astrophysics Data System (ADS)

    Sharp, R.; Allen, J. T.; Fogarty, L. M. R.; Croom, S. M.; Cortese, L.; Green, A. W.; Nielsen, J.; Richards, S. N.; Scott, N.; Taylor, E. N.; Barnes, L. A.; Bauer, A. E.; Birchall, M.; Bland-Hawthorn, J.; Bloom, J. V.; Brough, S.; Bryant, J. J.; Cecil, G. N.; Colless, M.; Couch, W. J.; Drinkwater, M. J.; Driver, S.; Foster, C.; Goodwin, M.; Gunawardhana, M. L. P.; Ho, I.-T.; Hampton, E. J.; Hopkins, A. M.; Jones, H.; Konstantopoulos, I. S.; Lawrence, J. S.; Leslie, S. K.; Lewis, G. F.; Liske, J.; López-Sánchez, Á. R.; Lorente, N. P. F.; McElroy, R.; Medling, A. M.; Mahajan, S.; Mould, J.; Parker, Q.; Pracy, M. B.; Obreschkow, D.; Owers, M. S.; Schaefer, A. L.; Sweet, S. M.; Thomas, A. D.; Tonini, C.; Walcher, C. J.

    2015-01-01

    We present a methodology for the regularization and combination of sparsely sampled and irregularly gridded observations from fibre-optic multiobject integral field spectroscopy. The approach minimizes interpolation and retains image resolution when combining subpixel-dithered data. We discuss the methodology in the context of the Sydney-AAO multiobject integral field spectrograph (SAMI) Galaxy Survey underway at the Anglo-Australian Telescope. The SAMI instrument uses 13 fibre bundles to perform high-multiplex integral field spectroscopy across a 1° diameter field of view. The SAMI Galaxy Survey is targeting ~3000 galaxies drawn from the full range of galaxy environments. We demonstrate that the subcritical sampling of the seeing and the incomplete fill factor of the integral field bundles result in only a 10 per cent degradation in the final image resolution recovered. We also implement a new methodology for tracking covariance between elements of the resulting data cubes which retains 90 per cent of the covariance information while incurring only a modest increase in the survey data volume.

  15. [Field investigations of the air pollution level of populated territories].

    PubMed

    Vinokurov, M V

    2014-01-01

    The assessment and management of the air quality of settlements is one of the priorities in the field of environmental protection. The backbone of air-quality management is the methodology by which field investigations are organized, performed and interpreted. The present article analyses the existing methodological approaches and the practical aspects of applying them when organizing and performing field investigations, with the aims of confirming the adequacy of sanitary protection zone boundaries in old industrial regions and of hygienically evaluating field data on air pollution levels.

  16. Achieving Systemic Information Operations for Australian Defence

    DTIC Science & Technology

    1999-10-01

    Excerpts: ...is Checkland's Soft Systems Methodology, and some emphasis is placed on this methodology in the present document. Other soft methodologies also exist... Widespread concern... The methodology that will be adopted will be one chosen from the burgeoning field of soft systems theory, for example Checkland's Soft Systems Methodology (SSM)...

  17. Teaching Research through Field Studies: A Cumulative Opportunity for Teaching Methodology to Human Geography Undergraduates

    ERIC Educational Resources Information Center

    Panelli, Ruth; Welch, Richard V.

    2005-01-01

    Notwithstanding its iconic status within geography, the debate continues about how fieldwork should be taught to undergraduate students. The authors engage with this debate and argue that field studies should follow the teaching of research methodology. In this paper they review relevant literature on the place of fieldwork in geography training,…

  18. A Methodology for Calculating Prestige Ranks of Academic Journals in Communication: A More Inclusive Alternative to Citation Metrics

    ERIC Educational Resources Information Center

    Stephen, Timothy D.

    2011-01-01

    The problem of how to rank academic journals in the communication field (human interaction, mass communication, speech, and rhetoric) is one of practical importance to scholars, university administrators, and librarians, yet there is no methodology that covers the field's journals comprehensively and objectively. This article reports a new ranking…

  19. A RLS-SVM Aided Fusion Methodology for INS during GPS Outages

    PubMed Central

    Yao, Yiqing; Xu, Xiaosu

    2017-01-01

    In order to maintain relatively high navigation accuracy during global positioning system (GPS) outages, a novel robust least squares support vector machine (LS-SVM)-aided fusion methodology is explored to provide pseudo-GPS position information for the inertial navigation system (INS). The relationship between the yaw, specific force, velocity, and the position increment is modeled, and historical information is also involved to better represent the vehicle dynamics. Rather than sharing the same weight as in the traditional LS-SVM, the proposed algorithm allocates different weights to different data, which makes the system immune to outliers. Field test data were collected to evaluate the proposed algorithm. The comparison results indicate that the proposed algorithm can effectively provide position corrections for a standalone INS during a 300 s GPS outage, outperforming the traditional LS-SVM method. PMID:28245549
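
    The weighting idea described here reduces, in the standard (Suykens-style) LS-SVM formulation, to one linear solve in which each sample's regularization term is scaled by its weight, so suspect samples pull less on the fit. The sketch below is that generic formulation, not the authors' implementation; the kernel choice and hyperparameters are assumed.

        import numpy as np

        def rbf_kernel(X, Z, sigma=1.0):
            d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2 * sigma ** 2))

        def fit_weighted_lssvm(X, y, weights, gamma=10.0, sigma=1.0):
            """Solve [[0, 1^T], [1, K + diag(1/(gamma*w))]] [b; a] = [0; y]."""
            n = len(y)
            A = np.zeros((n + 1, n + 1))
            A[0, 1:] = A[1:, 0] = 1.0
            A[1:, 1:] = rbf_kernel(X, X, sigma) + np.diag(1.0 / (gamma * weights))
            sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
            b, alpha = sol[0], sol[1:]
            return lambda Xq: rbf_kernel(Xq, X, sigma) @ alpha + b

        X = np.linspace(0, 4, 40)[:, None]
        y = np.sin(X).ravel(); y[10] += 3.0      # inject one gross outlier
        w = np.ones(40); w[10] = 0.1             # ...and down-weight it
        predict = fit_weighted_lssvm(X, y, w)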

  20. A RLS-SVM Aided Fusion Methodology for INS during GPS Outages.

    PubMed

    Yao, Yiqing; Xu, Xiaosu

    2017-02-24

    In order to maintain relatively high navigation accuracy during global positioning system (GPS) outages, a novel robust least squares support vector machine (LS-SVM)-aided fusion methodology is explored to provide pseudo-GPS position information for the inertial navigation system (INS). The relationship between the yaw, specific force, velocity, and the position increment is modeled, and historical information is also involved to better represent the vehicle dynamics. Rather than sharing the same weight as in the traditional LS-SVM, the proposed algorithm allocates different weights to different data, which makes the system immune to outliers. Field test data were collected to evaluate the proposed algorithm. The comparison results indicate that the proposed algorithm can effectively provide position corrections for a standalone INS during a 300 s GPS outage, outperforming the traditional LS-SVM method.

  1. Blood grouping based on PCR methods and agarose gel electrophoresis.

    PubMed

    Sell, Ana Maria; Visentainer, Jeane Eliete Laguila

    2015-01-01

    The study of erythrocyte antigens continues to be an intense field of research, particularly since the development of molecular testing methods. More than 300 specificities have been described by the International Society of Blood Transfusion as belonging to 33 blood group systems. The polymerase chain reaction (PCR) is a central tool for red blood cell (RBC) genotyping. PCR and agarose gel electrophoresis are low-cost, easy and versatile in vitro methods for amplifying defined target DNA (RBC polymorphic regions). Multiplex-PCR, AS-PCR (allele-specific PCR), and RFLP-PCR (restriction fragment length polymorphism PCR) techniques are usually used to identify RBC polymorphisms. Furthermore, the methodology is easy to implement. This chapter describes the PCR and agarose gel electrophoresis methodology used to identify polymorphisms of the Kell, Duffy, Kidd, and MNS blood group systems.

  2. A methodological analysis of chaplaincy research: 2000-2009.

    PubMed

    Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F

    2011-01-01

    The present article presents a comprehensive review and analysis of quantitative research conducted in the United States on chaplaincy and closely related topics published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies, compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs that are used.

  3. Estimating the hydraulic parameters of a confined aquifer based on the response of groundwater levels to seismic Rayleigh waves

    NASA Astrophysics Data System (ADS)

    Sun, Xiaolong; Xiang, Yang; Shi, Zheming

    2018-05-01

    Groundwater flow models implemented to manage regional water resources require aquifer hydraulic parameters. Traditional methods for obtaining these parameters include laboratory experiments, field tests and model inversions, each potentially hindered by its own limitations. Here, we propose a methodology for estimating hydraulic conductivity and storage coefficients from the spectral characteristics of coseismic groundwater-level oscillations and seismic Rayleigh waves. The results from Well X10 are consistent with the variations and spectral characteristics of the water-level oscillations and seismic waves, and yield an estimated hydraulic conductivity of approximately 1 × 10⁻³ m s⁻¹ and a storativity of 15 × 10⁻⁶. The proposed methodology for estimating hydraulic parameters in confined aquifers is a practical and novel approach for groundwater management and seismic precursor anomaly analyses.
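
    The spectral quantities behind such an estimate, the amplification ratio and phase lag between the water-level oscillation and the Rayleigh-wave forcing at the dominant frequency, can be extracted with a short FFT computation like the sketch below. The well-aquifer response model that the authors actually invert is not reproduced; the function name and the simple peak-picking are assumptions.

        import numpy as np

        def spectral_response(water_level, ground_motion, dt):
            """Amplitude ratio and phase lag at the dominant forcing frequency."""
            W = np.fft.rfft(water_level - np.mean(water_level))
            G = np.fft.rfft(ground_motion - np.mean(ground_motion))
            f = np.fft.rfftfreq(len(water_level), dt)
            k = np.argmax(np.abs(G[1:])) + 1      # skip the DC bin
            ratio = np.abs(W[k]) / np.abs(G[k])   # amplification
            phase = np.angle(W[k] / G[k])         # phase lag [rad]
            return f[k], ratio, phase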

  4. Approaching sign language test construction: adaptation of the German sign language receptive skills test.

    PubMed

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired in preschool- and school-aged children (4-8 years old) is urgently needed. Using the British Sign Language Receptive Skills Test, that has been standardized and has sound psychometric properties, as a template for adaptation thus provides a starting point for tests of a sign language that is less documented, such as DGS. This article makes a novel contribution to the field by examining linguistic, cultural, and methodological issues in the process of adapting a test from the source language to the target language. The adapted DGS test has sound psychometric properties and provides the basis for revision prior to standardization. © The Author 2011. Published by Oxford University Press. All rights reserved.

  5. An integrative typology of personality assessment for aggression: implications for predicting counterproductive workplace behavior.

    PubMed

    Bing, Mark N; Stewart, Susan M; Davison, H Kristl; Green, Philip D; McIntyre, Michael D; James, Lawrence R

    2007-05-01

    This study presents an integrative typology of personality assessment for aggression. In this typology, self-report and conditional reasoning (L. R. James, 1998) methodologies are used to assess 2 separate, yet often congruent, components of aggressive personalities. Specifically, self-report is used to assess explicit components of aggressive tendencies, such as self-perceived aggression, whereas conditional reasoning is used to assess implicit components, in particular, the unconscious biases in reasoning that are used to justify aggressive acts. These 2 separate components are then integrated to form a new theoretical typology of personality assessment for aggression. Empirical tests of the typology were subsequently conducted using data gathered across 3 samples in laboratory and field settings and reveal that explicit and implicit components of aggression can interact in the prediction of counterproductive, deviant, and prosocial behaviors. These empirical tests also reveal that when either the self-report or conditional reasoning methodology is used in isolation, the resulting assessment of aggression may be incomplete. Implications for personnel selection, team composition, and executive coaching are discussed. 2007 APA, all rights reserved

  6. Epidemiology Characteristics, Methodological Assessment and Reporting of Statistical Analysis of Network Meta-Analyses in the Field of Cancer

    PubMed Central

    Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu

    2016-01-01

    Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of the literature search, methodological quality and reporting of the statistical analysis process of NMAs in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer, 61 of which were conducted using a Bayesian framework. Of these, more than half did not report an assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report an assessment of similarity (86.89%) and did not use the GRADE tool to assess the quality of evidence (95.08%). Forty-three NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of them. Only 4.65% of NMAs described the details of handling multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00-8.25). Methodological quality and reporting of statistical analysis did not substantially differ by the selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997

  7. Eye-Tracking as a Tool in Process-Oriented Reading Test Validation

    ERIC Educational Resources Information Center

    Solheim, Oddny Judith; Uppstad, Per Henning

    2011-01-01

    The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arntzen, Evan V.; Hand, Kristine D.; Carter, Kathleen M.

    At the request of the U.S. Army Corps of Engineers (USACE; Portland District), Pacific Northwest National Laboratory (PNNL) undertook a project in 2006 to look further into issues of total dissolved gas (TDG) supersaturation in the lower Columbia River downstream of Bonneville Dam. In FY 2008, the third year of the project, PNNL conducted field monitoring and laboratory toxicity testing to both verify results from 2007 and answer some additional questions about how salmonid sac fry respond to elevated TDG in the field and the laboratory. For FY 2008, three objectives were 1) to repeat the 2006-2007 field effort to collect empirical data on TDG from the Ives Island and Multnomah Falls study sites; 2) to repeat the static laboratory toxicity tests on hatchery chum salmon fry to verify 2007 results and to expose wild chum salmon fry to incremental increases in TDG, above those of the static test, until external symptoms of gas bubble disease were clearly present; and 3) to assess physiological responses to TDG levels in wild chum salmon sac fry incubating below Bonneville Dam during spill operations. This report summarizes the tasks conducted and results obtained in pursuit of the three objectives. Chapter 1 discusses the field monitoring, Chapter 2 reports the findings of the laboratory toxicity tests, and Chapter 3 describes the field-sampling task. Each chapter contains an objective-specific introduction, description of the study site and methods, results of research, and discussion of findings. Literature cited throughout this report is listed in Chapter 4. Additional details on the monitoring methodology and results are provided in Appendices A and B included on the compact disc bound inside the back cover of the printed version of this report.

  9. Wave Resource Characterization at US Wave Energy Converter (WEC) Test Sites

    NASA Astrophysics Data System (ADS)

    Dallman, A.; Neary, V. S.

    2016-02-01

    The US Department of Energy's (DOE) Marine and Hydrokinetic (MHK) energy program is supporting a diverse research and development portfolio intended to accelerate commercialization of the marine renewable industry by improving technology performance, reducing market barriers, and lowering the cost of energy. Wave resource characterization at potential and existing wave energy converter (WEC) test sites and deployment locations contributes to this DOE goal by providing a catalogue of wave energy resource characteristics, met-ocean data, and site infrastructure information, developed using a consistent methodology. The purpose of the catalogue is to enable the comparison of resource characteristics among sites, facilitating the selection of test sites that are most suitable for a developer's device and that best meet their testing needs and objectives. It also provides inputs for the design of WEC test devices and the planning of WEC tests, including deployment and operations and maintenance. The first edition included three sites: the Pacific Marine Energy Center (PMEC) North Energy Test Site (NETS) offshore of Newport, Oregon; the Kaneohe Bay Naval Wave Energy Test Site (WETS) offshore of Oahu, HI; and a potential site offshore of Humboldt Bay, CA (Eureka, CA). The recently completed second edition adds five sites: the Jennette's Pier Wave Energy Converter Test Site in North Carolina, the US Army Corps of Engineers (USACE) Field Research Facility (FRF), the PMEC Lake Washington site, the proposed PMEC South Energy Test Site (SETS), and the proposed CalWave Central Coast WEC Test Site. Operational sea states are included according to the IEC Technical Specification on wave energy resource assessment and characterization, with additional information on extreme sea states, weather windows, and representative spectra. The methodology and a summary of results will be discussed.
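
    One of the standard characterization quantities behind such a catalogue is the deep-water omnidirectional wave power per unit crest width, J = rho * g^2 * Hs^2 * Te / (64 * pi). The sketch below evaluates it for placeholder sea-state values; it is a textbook estimate, not a figure for any site in the catalogue.

        import math

        def wave_power_flux(hs_m, te_s, rho=1025.0, g=9.81):
            """hs_m: significant wave height [m]; te_s: energy period [s];
            returns wave power per metre of crest [W/m] (deep water)."""
            return rho * g ** 2 * hs_m ** 2 * te_s / (64 * math.pi)

        # Assumed example sea state: Hs = 2.5 m, Te = 9 s  ->  roughly 28 kW/m
        print(f"{wave_power_flux(2.5, 9.0) / 1e3:.1f} kW/m")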

  10. Falsifiability is not optional.

    PubMed

    LeBel, Etienne P; Berger, Derek; Campbell, Lorne; Loving, Timothy J

    2017-08-01

    Finkel, Eastwick, and Reis (2016; FER2016) argued that the post-2011 methodological reform movement has focused narrowly on replicability, neglecting other essential goals of research. We agree that multiple scientific goals are essential, but argue that a more fine-grained language, conceptualization, and approach to replication are needed to accomplish these goals. Replication is the general empirical mechanism for testing and falsifying theory. Sufficiently methodologically similar replications, also known as direct replications, test the basic existence of phenomena and ensure that cumulative progress is possible a priori. In contrast, increasingly methodologically dissimilar replications, also known as conceptual replications, test the relevance of auxiliary hypotheses (e.g., manipulation and measurement issues, contextual factors) required to productively investigate validity and generalizability. Without prioritizing replicability, a field is not empirically falsifiable. We also disagree with FER2016's position that "bigger samples are generally better, but . . . that very large samples could have the downside of commandeering resources that would have been better invested in other studies" (abstract). We identify problematic assumptions in FER2016's modifications of our original research-economic model, and present an improved model that quantifies when (and whether) it is reasonable to worry that increasing statistical power will engender potential trade-offs. Sufficiently powering studies (i.e., >80%) maximizes both research efficiency and confidence in the literature (research quality). Given that we agree with FER2016 on all key open-science points, we are eager to see the accelerated rate of cumulative knowledge development of social psychological phenomena that such a sufficiently transparent, powered, and falsifiable approach will generate. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
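
    The >80% power threshold argued for above maps directly onto sample-size planning. The sketch below shows the routine calculation with statsmodels; the effect size of Cohen's d = 0.4 is an assumed illustration, not a value from the article.

        from statsmodels.stats.power import TTestIndPower

        n_per_group = TTestIndPower().solve_power(
            effect_size=0.4,            # assumed Cohen's d
            power=0.80,                 # the >=80% threshold discussed above
            alpha=0.05,
            alternative='two-sided',
        )
        print(f"~{n_per_group:.0f} participants per group")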

  11. Ethical and Legal Implications of the Methodological Crisis in Neuroimaging.

    PubMed

    Kellmeyer, Philipp

    2017-10-01

    Currently, many scientific fields such as psychology or biomedicine face a methodological crisis concerning the reproducibility, replicability, and validity of their research. In neuroimaging, similar methodological concerns have taken hold of the field, and researchers are working frantically toward finding solutions for the methodological problems specific to neuroimaging. This article examines some ethical and legal implications of this methodological crisis in neuroimaging. With respect to ethical challenges, the article discusses the impact of flawed methods in neuroimaging research in cognitive and clinical neuroscience, particularly with respect to faulty brain-based models of human cognition, behavior, and personality. Specifically examined is whether such faulty models, when they are applied to neurological or psychiatric diseases, could put patients at risk, and whether this places special obligations on researchers using neuroimaging. In the legal domain, the actual use of neuroimaging as evidence in United States courtrooms is surveyed, followed by an examination of ways that the methodological problems may create challenges for the criminal justice system. Finally, the article reviews and promotes some promising ideas and initiatives from within the neuroimaging community for addressing the methodological problems.

  12. Work plan for conducting an ecological risk assessment at J-Field, Aberdeen Proving Ground, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hlohowskyj, I.; Hayse, J.; Kuperman, R.

    1995-03-01

    The Environmental Management Division of Aberdeen Proving Ground (APG), Maryland, is conducting a remedial investigation and feasibility study (RI/FS) of the J-Field area at APG pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), as amended. J-Field is within the Edgewood Area of APG in Harford County, Maryland, and activities at the Edgewood Area since World War II have included the development, manufacture, testing, and destruction of chemical agents and munitions. The J-Field site was used to destroy chemical agents and munitions by open burning and open detonation. This work plan presents the approach proposed to conduct an ecological risk assessment (ERA) as part of the RI/FS program at J-Field. This work plan identifies the locations and types of field studies proposed for each area of concern (AOC), the laboratory studies proposed to evaluate toxicity of media, and the methodology to be used in estimating doses to ecological receptors and discusses the approach that will be used to estimate and evaluate ecological risks at J-Field. Eight AOCs have been identified at J-Field, and the proposed ERA is designed to evaluate the potential for adverse impacts to ecological receptors from contaminated media at each AOC, as well as over the entire J-Field site. The proposed ERA approach consists of three major phases, incorporating field and laboratory studies as well as modeling. Phase 1 includes biotic surveys of the aquatic and terrestrial habitats, biological tissue sampling and analysis, and media toxicity testing at each AOC and appropriate reference locations. Phase 2 includes definitive toxicity testing of media from areas of known or suspected contamination or of media for which the Phase 1 results indicate toxicity or adverse ecological effects. In Phase 3, the uptake models initially developed in Phase 2 will be finalized, and contaminant dose to each receptor from all complete pathways will be estimated.

  13. Psychophysiological Studies in Extreme Environments

    NASA Technical Reports Server (NTRS)

    Toscano, William B.

    2011-01-01

    This paper reviews the results from two studies that employed the methodology of multiple converging indicators (physiological measures, subjective self-reports and performance metrics) to examine individual differences in the ability of humans to adapt and function in high-stress environments. The first study was a joint collaboration between researchers at the US Army Research Laboratory (ARL) and NASA Ames Research Center. Twenty-four active-duty soldiers, men and women, volunteered as participants. Field tests were conducted in the Command and Control Vehicle (C2V), an enclosed armored vehicle designed to support both stationary and on-the-move operations. This vehicle contains four computer workstations where crew members are expected to perform command decisions in the field under combat conditions. The study objectives were: 1) to determine the incidence of motion sickness in the C2V relative to interior seat orientation/position and to parked, moving, and short-haul test conditions; and 2) to determine the impact of the above conditions on cognitive performance, mood, and physiology. Data collected during the field tests included heart rate, respiration rate, skin temperature, and skin conductance, self-reports of mood and symptoms, and cognitive performance metrics comprising seven subtests of the DELTA performance test battery. Results showed that during 4-hour operational tests over varied terrain, motion sickness symptoms increased and performance degraded by at least 5 percent; physiological response profiles of individuals were categorized according to good versus poor cognitive performance. No differences were observed relative to seating orientation or position.

  14. Real-time monitoring of BTEX in air via ambient-pressure MPI

    NASA Astrophysics Data System (ADS)

    Swenson, Orven F.; Carriere, Josef P.; Isensee, Harlan; Gillispie, Gregory D.; Cooper, William F.; Dvorak, Michael A.

    1998-05-01

    We have developed and begun to field test a very sensitive method for real-time measurement of single-ring aromatic hydrocarbons in ambient air. In this study, we focus on the efficient 1 + 1 resonance-enhanced multiphoton ionization (REMPI) of the BTEX species in the narrow region between 266 and 267 nm. We particularly emphasize 266.7 nm, a wavelength at which both benzene and toluene exhibit a sharp absorbance feature and benzene and its alkylated derivatives all absorb. An optical parametric oscillator system generating 266.7 nm light, a REMPI cell, and a digital oscilloscope detector are mounted on a breadboard attached to a small cart. In the first field test, the cart was wheeled through the various rooms of a chemistry research complex. Leakage of fuel through the gas caps of cars and light trucks in a parking lot was the subject of the second field test. The same apparatus was also used for a study in which the performance of the REMPI detector and a conventional photoionization detector were compared as a BTEX mixture was eluted by gas chromatography. Among the potential applications of the methodology are on-site analysis of combustion and manufacturing processes, soil-gas and water-headspace monitoring, space cabin and building air quality, and fuel leak detection.

  15. Measurement of magnetic property of FePt granular media at near Curie temperature

    NASA Astrophysics Data System (ADS)

    Yang, H. Z.; Chen, Y. J.; Leong, S. H.; An, C. W.; Ye, K. D.; Hu, J. F.

    2017-02-01

    The characterization of the magnetic switching behavior of heat-assisted magnetic recording (HAMR) media near the Curie temperature (Tc) is important for high-density recording. In this study, we measured the magnetic properties of FePt granular media (with a room-temperature coercivity of 25 kOe) near Tc with a home-built HAMR testing instrument. A local area of the HAMR media is heated to near Tc by a flat-top optical heating beam, and the magnetic properties in the heated area are measured in situ by a magneto-optic Kerr effect (MOKE) testing beam. The switching field distribution (SFD) and coercive field (Hc) of the FePt granular media, and their dependence on the optical heating power near Tc, were studied. We measured the DC demagnetization (DCD) signal with pulsed laser heating at different optical powers, and we measured the Tc distribution of the media by recording the AC magnetic signal as a function of optical heating power. In summary, we studied the SFD and Hc of the HAMR media near Tc in a static manner. The present methodology will facilitate HAMR media testing.

  16. Complementary and alternative medicine for the treatment and diagnosis of asthma and allergic diseases.

    PubMed

    Passalacqua, G; Compalati, E; Schiappoli, M; Senna, G

    2005-03-01

    The use of complementary/alternative medicine (CAM) is widespread and constantly increasing, especially in the field of allergic diseases and asthma. Homeopathy, acupuncture and phytotherapy are the most frequently utilised treatments, whereas complementary diagnostic techniques are mainly used in the field of food allergy/intolerance. In the literature, the majority of clinical trials of CAMs are of low methodological quality and thus difficult to interpret. There are very few studies performed in a rigorously controlled fashion, and those studies provided inconclusive results. In asthma, none of the CAMs has thus far been proved more effective than placebo or as effective as standard treatments. Some herbal products containing active principles have displayed a clinical effect, but herbal remedies are usually neither standardised nor quantified and thus carry the risk of toxic effects or interactions. None of the alternative diagnostic techniques (electrodermal testing, kinesiology, the leukocytotoxic test, iridology, hair analysis) has been proved able to distinguish between healthy and allergic subjects or to diagnose sensitizations. These tests must therefore not be used, since they can lead to delayed or incorrect diagnosis and therapy.

  17. Culture and neuroscience: additive or synergistic?

    PubMed Central

    Dapretto, Mirella; Iacoboni, Marco

    2010-01-01

    The investigation of cultural phenomena using neuroscientific methods—cultural neuroscience (CN)—is receiving increasing attention. Yet it is unclear whether the integration of cultural study and neuroscience is merely additive, providing additional evidence of neural plasticity in the human brain, or truly synergistic, yielding discoveries that neither discipline could have achieved alone. We discuss how the parent fields of CN (cross-cultural psychology, psychological anthropology and cognitive neuroscience) inform the investigation of the role of cultural experience in shaping the brain. Drawing on well-established methodologies from cross-cultural psychology and cognitive neuroscience, we outline a set of guidelines for CN, evaluate 17 CN studies in terms of these guidelines, and provide a summary table of our results. We conclude that the combination of culture and neuroscience is both additive and synergistic; while some CN methodologies and findings will represent the direct union of information from the parent fields, CN studies employing the methodological rigor required by this logistically challenging new field have the potential to transform existing methodologies and produce unique findings. PMID:20083533

  18. The effects of node exclusion on the centrality measures in graph models of interacting economic agents

    NASA Astrophysics Data System (ADS)

    Caetano, Marco Antonio Leonel; Yoneyama, Takashi

    2015-07-01

    This work concerns the effects felt by a network as a whole when a specific node is perturbed. Many real-world systems can be described by network models in which the interactions of the various agents are represented as edges of a graph. With a graph model in hand, it is possible to evaluate the effect of deleting some of its edges on the architecture and node values of the network. Eventually a node may end up isolated from the rest of the network, and an interesting problem is to quantify the impact of such an event. In the field of finance, for instance, network models are very popular, and the proposed methodology allows one to carry out "what if" tests in terms of weakening the links between the economic agents, represented as nodes. The two main concepts employed in the proposed methodology are (i) the vibrational Information Centrality (IC), which can provide a measure of the relative importance of a particular node in a network, and (ii) autocatalytic networks, which can indicate the evolutionary trends of the network. Although these concepts were originally proposed in other fields of knowledge, they were also found to be useful in analyzing financial networks. To illustrate the applicability of the proposed methodology, a case study using actual data comprising the stock market indices of 12 countries is presented.
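
    A node-exclusion "what if" test of the kind described can be sketched with standard graph tooling: delete a node, recompute centralities on the remaining connected component, and inspect which nodes weakened most. The sketch below uses NetworkX's current-flow (information) centrality on a stock synthetic graph as a stand-in; the paper's vibrational IC measure and its 12-country financial network are not reproduced here.

        import networkx as nx

        G = nx.karate_club_graph()                 # stand-in network
        base = nx.information_centrality(G)

        target = 33                                 # node whose links we sever
        H = G.copy()
        H.remove_node(target)                       # deletes all its edges too
        H = H.subgraph(max(nx.connected_components(H), key=len)).copy()

        after = nx.information_centrality(H)
        shifts = {v: after[v] - base[v] for v in after}
        print(sorted(shifts.items(), key=lambda kv: kv[1])[:5])  # most weakened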

  19. The reporting characteristics and methodological quality of Cochrane reviews about health policy research.

    PubMed

    Xiu-xia, Li; Ya, Zheng; Yao-long, Chen; Ke-hu, Yang; Zong-jiu, Zhang

    2015-04-01

    The systematic review has increasingly become a popular tool for researching health policy. However, owing to the complexity and diversity of health policy research, it has also encountered challenges. We took Cochrane reviews on health policy research as a representative sample to provide the first examination of their epidemiological and descriptive characteristics, as well as the compliance of their methodological quality with AMSTAR. Ninety-nine reviews met the inclusion criteria, of which 73% concerned Implementation Strategies, 15% Financial Arrangements and 12% Governance Arrangements; they involved Public Health (34%), Theoretical Exploration (18%), Hospital Management (17%), Medical Insurance (12%), Pharmaceutical Policy (9%), Community Health (7%) and Rural Health (2%). Only 39% conducted a meta-analysis, 49% reported being updates, and none was rated as of low methodological quality. Our research reveals that the quantity and quality of the evidence should be improved, especially for Financial Arrangements and Governance Arrangements involving Rural Health, Health Care Reform, Health Equity, etc. The reliability of AMSTAR also needs to be tested more widely in this field. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Evolutionary Maps: A new model for the analysis of conceptual development, with application to the diurnal cycle

    NASA Astrophysics Data System (ADS)

    Navarro, Manuel

    2014-05-01

    This paper presents a model of how children generate concrete concepts from perception through processes of differentiation and integration. The model informs the design of a novel methodology (evolutionary maps or emaps), whose implementation on certain domains unfolds the web of itineraries that children may follow in the construction of concrete conceptual knowledge and pinpoints, for each conception, the architecture of the conceptual change that leads to the scientific concept. Remarkably, the generative character of its syntax yields conceptions that, if unknown, amount to predictions that can be tested experimentally. Its application to the diurnal cycle (including the sun's trajectory in the sky) indicates that the model is correct and the methodology works (in some domains). Specifically, said emap predicts a number of exotic trajectories of the sun in the sky that, in the experimental work, were drawn spontaneously both on paper and a dome. Additionally, the application of the emaps theoretical framework in clinical interviews has provided new insight into other cognitive processes. The field of validity of the methodology and its possible applications to science education are discussed.

  1. Improving the completion of Quality Improvement projects amongst psychiatry core trainees.

    PubMed

    Ewins, Liz

    2015-01-01

    Quality Improvement (QI) projects are seen increasingly as more valuable and effective in developing services than traditional audit. However, the development of this methodology has been slower in the mental health field and QI projects are new to most psychiatrists. This project describes a way of engaging trainees across Avon and Wiltshire Mental Health Partnership (AWP) Trust and the Severn School of Psychiatry in QI projects, using QI methodology itself. Through the implementation and development of training sessions and simple, low cost and sustainable interventions over a 10 month period, two thirds of core trainees and over a half of the advanced psychiatry trainees in the School are now participating in 28 individual QI projects and QI project methodology is to become embedded in the core psychiatry training course. As an additional positive outcome, specialty doctors, consultants, foundation doctors, GP trainees, medical students, as well as the wider multidisciplinary team, have all become engaged in QI projects alongside trainees, working with service users and their families to identify problems to tackle and ideas to test.

  2. Human-centred approaches in slipperiness measurement

    PubMed Central

    Grönqvist, Raoul; Abeysekera, John; Gard, Gunvor; Hsiang, Simon M.; Leamon, Tom B.; Newman, Dava J.; Gielo-Perczak, Krystyna; Lockhart, Thurmon E.; Pai, Clive Y.-C.

    2010-01-01

    A number of human-centred methodologies—subjective, objective, and combined—are used for slipperiness measurement. They comprise a variety of approaches from biomechanically-oriented experiments to psychophysical tests and subjective evaluations. The objective of this paper is to review some of the research done in the field, including such topics as awareness and perception of slipperiness, postural and balance control, rating scales for balance, adaptation to slippery conditions, measurement of unexpected movements, kinematics of slipping, and protective movements during falling. The role of human factors in slips and falls will be discussed. Strengths and weaknesses of human-centred approaches in relation to mechanical slip test methodologies are considered. Current friction-based criteria and thresholds for walking without slipping are reviewed for a number of work tasks. These include activities such as walking on a level or an inclined surface, running, stopping and jumping, as well as stair ascent and descent, manual exertion (pushing and pulling, load carrying, lifting) and particular concerns of the elderly and mobility disabled persons. Some future directions for slipperiness measurement and research in the field of slips and falls are outlined. Human-centred approaches for slipperiness measurement do have many applications. First, they are utilized to develop research hypotheses and models to predict workplace risks caused by slipping. Second, they are important alternatives to apparatus-based friction measurements and are used to validate such methodologies. Third, they are used as practical tools for evaluating and monitoring slip resistance properties of foot wear, anti-skid devices and floor surfaces. PMID:11794763

  3. Ten years later: Evaluation of the effectiveness of 12.5% amitraz against a field population of Rhipicephalus (Boophilus) microplus using field studies, artificial infestation (Stall tests) and adult immersion tests.

    PubMed

    Maciel, Willian Giquelin; Lopes, Welber Daniel Zanetti; Cruz, Breno Cayeiro; Gomes, Lucas Vinicius Costa; Teixeira, Weslen Fabrício Pires; Buzzulini, Carolina; Bichuette, Murilo Abud; Campos, Gabriel Pimentel; Felippelli, Gustavo; Soares, Vando Edésio; de Oliveira, Gilson Pereira; da Costa, Alvimar José

    2015-12-15

    Using field trials, artificial infestations (Stall tests) and in vitro adult immersion tests, the present study evaluated the acaricidal efficacy of 12.5% amitraz administered via whole body spraying against a Rhipicephalus (Boophilus) microplus population that did not have any contact with chemical products belonging to this acaricide family for 10 years (approximately 40 generations). Two natural infestation trials, two artificial infestation trials (Stall tests) and two adult immersion tests were performed in two different stages in 2005 and 2015. Between 2002 and 2015, the bovine herd of this property was formed by approximately 450 animals from the Simmental breed that were divided into nine paddocks formed by Cynodon dactylon (L.) Pers. For the natural infestation experiments in 2005 and 2015, we selected nearly 70 animals naturally infested with ticks from the same herd that belonged to the "São Paulo" farm located in São José do Rio Pardo, São Paulo, Brazil. Field studies were performed in the same paddock (9). To evaluate anti-R. (B.) microplus activity in the artificially infested cattle (Stall tests) and adult immersion tests, two experiments of each methodology were performed at CPPAR (the Center of Research in Animal Health located on the FCAV/UNESP campus in Jaboticabal, São Paulo, Brazil) in 2005 and 2015. R. (B.) microplus used in the artificial infestation, and adult immersion test experiments were obtained from paddocks 1-9 in 2005 and 2015 from the commercial farm where the field studies were performed. Based on the obtained results, it was possible to conclude that amitraz use in rotation with pyrethroids every 28 days for three consecutive years (2002-2004) previous to the beginning of the first trial (2005) was sufficient to generate a R. (B.) microplus strain resistant to amitraz. Moreover, using field trials, artificial infestations (Stall tests) and adult immersion tests, we verified that 40 generations of the tick species with no contact to the aforementioned compound (amitraz) were not sufficient to revert or modify the efficacy/resistance of amitraz for this analyzed R. (B.) microplus strain. The reversion of amitraz efficacy values in R. (B.) microplus may only occur when resistance of the field strain is incipient. Alternatively, the differences in the results may be due to differences in the Rhipicephalus spp. species between current study locations. Therefore, future studies must be performed to prove this hypothesis. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Searching for globally optimal functional forms for interatomic potentials using genetic programming with parallel tempering.

    PubMed

    Slepoy, A; Peters, M D; Thompson, A P

    2007-11-30

    Molecular dynamics and other molecular simulation methods rely on a potential energy function, based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process, relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms, while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling and parallel tempering, to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.

  5. Using satellite imagery to identify and analyze tumuli on Earth and Mars

    NASA Astrophysics Data System (ADS)

    Diniega, Serina; Sangha, Simran; Browne, Brandon

    2018-01-01

    Tumuli are small, dome-like features that form when magmatic pressures build within a subsurface lava pathway, causing the overlying crust to bulge upwards. As the appearance of these features has been linked to lava flow structure (e.g., underlying lava flow tubes) and conditions, there is interest in identifying such features in satellite images so they can be used to expand our understanding of lava flows within regions difficult to access (such as on other planets). Here, we define a methodology for identifying (and measuring) tumuli within satellite imagery, and validate it by comparing our results with fieldwork results of terrestrial tumuli reported in the literature and with independent measurements we made within Amboy Field, CA. In addition, we present aggregated results from the application of our methodology to satellite images of six terrestrial fields and seven martian fields (with >2100 tumuli identified, per planet). Comparisons of tumuli morphometrics on Earth and Mars yield similarities in size and overall shape, which were surprising given the many differences in the environmental and planetary conditions within which these features have formed. Given our measurements, we identify constraints for tumulus formation models and drivers that would yield similar shapes and sizes on two different planets. Furthermore, we test a published hypothesis regarding the number of tumuli that form per a square kilometer, and find it unlikely that a diagnostic "tumuli density" value exists.

  6. Diagnosis of Ebola Virus Disease: Past, Present, and Future

    PubMed Central

    Brooks, Tim J. G.

    2016-01-01

    SUMMARY Laboratory diagnosis of Ebola virus disease plays a critical role in outbreak response efforts; however, establishing safe and expeditious testing strategies for this high-biosafety-level pathogen in resource-poor environments remains extremely challenging. Since the discovery of Ebola virus in 1976 via traditional viral culture techniques and electron microscopy, diagnostic methodologies have trended toward faster, more accurate molecular assays. Importantly, technological advances have been paired with increasing efforts to support decentralized diagnostic testing capacity that can be deployed at or near the point of patient care. The unprecedented scope of the 2014-2015 West Africa Ebola epidemic spurred tremendous innovation in this arena, and a variety of new diagnostic platforms that have the potential both to immediately improve ongoing surveillance efforts in West Africa and to transform future outbreak responses have reached the field. In this review, we describe the evolution of Ebola virus disease diagnostic testing and efforts to deploy field diagnostic laboratories in prior outbreaks. We then explore the diagnostic challenges pervading the 2014-2015 epidemic and provide a comprehensive examination of novel diagnostic tests that are likely to address some of these challenges moving forward. PMID:27413095

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoenbauer, B.; Bohac, D.; Huelman, P.

    Combined space and water heater (combi or combo) systems are defined by their dual functionality. Combi systems provide both space heating and water heating capabilities with a single heat source. This guideline will focus on the installation and operation of residential systems with forced air heating and domestic hot water (DHW) functionality. Past NorthernSTAR research has used a combi system to replace a natural gas forced air distribution system furnace and tank type water heater (Schoenbauer et al. 2012; Schoenbauer, Bohac, and McAlpine 2014). The combi systems consisted of a water heater or boiler heating plant teamed with a hydronicmore » air handler that included an air handler, water coil, and water pump to circulate water between the heating plant and coil. The combi water heater or boiler had a separate circuit for DHW. Past projects focused on laboratory testing, field characterization, and control optimization of combi systems. Laboratory testing was done to fully characterize and test combi system components; field testing was completed to characterize the installed performance of combi systems; and control methodologies were analyzed to understand the potential of controls to simplify installation and design and to improve system efficiency and occupant comfort. This past work was relied upon on to create this measure guideline.« less

  8. An Evaluation Methodology for Protocol Analysis Systems

    DTIC Science & Technology

    2007-03-01

    Main Memory Requirement NS: Needham-Schroeder NSL: Needham-Schroeder-Lowe OCaml : Objective Caml POSIX: Portable Operating System...methodology is needed. A. PROTOCOL ANALYSIS FIELD As with any field, there is a specialized language used within the protocol analysis community. Figure...ProVerif requires that Objective Caml ( OCaml ) be installed on the system, OCaml version 3.09.3 was installed. C. WINDOWS CONFIGURATION OS

  9. Air Force Energy Program Policy Memorandum

    DTIC Science & Technology

    2009-06-16

    Critical Asset Prioritization Methodology ( CAPM ) tool Manage costs. 3.4.2.5. Metrics Percentage of alternative/renewable fuel used for aviation fuel...supporting critical assets residing on military installations Field the Critical Asset Prioritization Methodology ( CAPM ) tool by Spring 2008. This CAPM ...Increase the number of flexible fuel systems • Identify/develop privately financed/operated energy production on Air Bases • Field the Critical

  10. Type A behavior and physiological responsivity in young women.

    PubMed

    Lawler, K A; Schmied, L; Mitchell, V P; Rixse, A

    1984-01-01

    The purpose of this study was to assess the coronary-prone behavior pattern and physiological responses to stress in young women. Thirty-seven women, aged 18-25 yr, were tested; half were studying in nontraditional fields for women, half in traditional. Based on the Jenkins Activity Survey, women in the male-dominated fields of study were more Type A. Subjects were monitored while resting and while solving mental arithmetic problems and visual puzzles; the dependent variables were heart rate, and blood pressure. Comparisons were made based on both the Jenkins Activity Survey and the structured interview, and using both median splits and extreme groups. There were no physiological differences between Types A and B women. Possible methodological issues accounting for the lack of results are considered.

  11. Force on Force Modeling with Formal Task Structures and Dynamic Geometry

    DTIC Science & Technology

    2017-03-24

    task framework, derived using the MMF methodology to structure a complex mission. It further demonstrated the integration of effects from a range of...application methodology was intended to support a combined developmental testing (DT) and operational testing (OT) strategy for selected systems under test... methodology to develop new or modify existing Models and Simulations (M&S) to: • Apply data from multiple, distributed sources (including test

  12. High Temperature, Permanent Magnet Biased, Fault Tolerant, Homopolar Magnetic Bearing Development

    NASA Technical Reports Server (NTRS)

    Palazzolo, Alan; Tucker, Randall; Kenny, Andrew; Kang, Kyung-Dae; Ghandi, Varun; Liu, Jinfang; Choi, Heeju; Provenza, Andrew

    2008-01-01

    This paper summarizes the development of a magnetic bearing designed to operate at 1,000 F. A novel feature of this high temperature magnetic bearing is its homopolar construction which incorporates state of the art high temperature, 1,000 F, permanent magnets. A second feature is its fault tolerance capability which provides the desired control forces with over one-half of the coils failed. The construction and design methodology of the bearing is outlined and test results are shown. The agreement between a 3D finite element, magnetic field based prediction for force is shown to be in good agreement with predictions at room and high temperature. A 5 axis test rig will be complete soon to provide a means to test the magnetic bearings at high temperature and speed.

  13. Signal/noise analysis to compare tests for measuring visual field loss and its progression.

    PubMed

    Artes, Paul H; Chauhan, Balwantray C

    2009-10-01

    To describe a methodology for establishing signal-to-noise ratios (SNRs) for different perimetric techniques, and to compare SNRs of frequency-doubling technology (FDT2) perimetry and standard automated perimetry (SAP). Fifteen patients with open-angle glaucoma (median MD, -2.6 dB, range +0.2 to -16.1 dB) were tested six times with FDT2 and SAP (SITA Standard program 24-2) within a 4-week period. Signals were estimated from the average superior-inferior difference between the mean deviation (MD) values in five mirror-pair sectors of the Glaucoma Hemifield Test, and noise from the dispersion of these differences over the six repeated tests. SNRs of FDT2 and SAP were compared by mixed-effects modeling. There was moderate correlation between the signals of FDT2 and SAP (r(2) = 0.68, P < 0.001), but no correlation of noise (r(2) = 0.01, P = 0.16). Although both signal as well as noise estimates were higher with FDT2 compared with SAP, 60% to 70% of sector pairs showed higher SNRs with FDT2. The SNRs of FDT2 were between 20% and 40% higher than those of SAP (P = 0.01). There were no meaningful differences between parametric and nonparametric estimates of signal, noise, or SNR. The higher SNRs of FDT2 suggest that this technique is at least as efficient as SAP at detecting localized visual field losses. Signal/noise analyses may provide a useful approach for comparing visual field tests independent of their decibel scales and may provide an initial indication of sensitivity to visual field change over time.

  14. Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation

    DTIC Science & Technology

    2016-05-01

    identifying and mapping flaw size distributions on glass surfaces for predicting mechanical response. International Journal of Applied Glass ...ARL-TN-0756 ● MAY 2016 US Army Research Laboratory Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation...Stress Optical Coefficient, Test Methodology, and Glass Standard Evaluation by Clayton M Weiss Oak Ridge Institute for Science and Education

  15. Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments

    DTIC Science & Technology

    2016-03-24

    NUWC-NPT Technical Report 12,186 24 March 2016 Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology , and...Microfibers, Test Methodology , and Experiments 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Anthony B...5 4 MEASUREMENTS AND EXPERIMENTAL APPARATUS ...........................................9 5 SAMPLE PREPARATION

  16. TESTING TREE-CLASSIFIER VARIANTS AND ALTERNATE MODELING METHODOLOGIES IN THE EAST GREAT BASIN MAPPING UNIT OF THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT (SW REGAP)

    EPA Science Inventory

    We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...

  17. 76 FR 67515 - Self-Regulatory Organizations; Chicago Mercantile Exchange, Inc.; Notice of Filing and Order...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ..., determined by the Clearing House using stress test methodology equal to the theoretical two largest IRS Clearing Member losses produced by such stress test or such other methodology determined by the IRS Risk... portion, determined by the Clearing House using stress test methodology equal to the theoretical third and...

  18. Simulation of Asymmetric Destabilization of Mine-void Rock Masses Using a Large 3D Physical Model

    NASA Astrophysics Data System (ADS)

    Lai, X. P.; Shan, P. F.; Cao, J. T.; Cui, F.; Sun, H.

    2016-02-01

    When mechanized sub-horizontal section top coal caving (SSTCC) is used as an underground mining method for exploiting extremely steep and thick coal seams (ESTCS), a large-scale surrounding rock caving may be violently created and have the potential to induce asymmetric destabilization from mine voids. In this study, a methodology for assessing the destabilization was developed to simulate the Weihuliang coal mine in the Urumchi coal field, China. Coal-rock mass and geological structure characterization were integrated with rock mechanics testing for assessment of the methodology and factors influencing asymmetric destabilization. The porous rock-like composite material ensured accuracy for building a 3D geological physical model of mechanized SSTCC by combining multi-mean timely track monitoring including acoustic emission, crack optical acquirement, roof separation observation, and close-field photogrammetry. An asymmetric 3D modeling analysis for destabilization characteristics was completed. Data from the simulated hydraulic support and buried pressure sensor provided effective information that was linked with stress-strain relationship of the working face in ESTCS. The results of the 3D physical model experiments combined with hybrid statistical methods were effective for predicting dynamic hazards in ESTCS.

  19. S4MPLE--Sampler for Multiple Protein-Ligand Entities: Methodology and Rigid-Site Docking Benchmarking.

    PubMed

    Hoffer, Laurent; Chira, Camelia; Marcou, Gilles; Varnek, Alexandre; Horvath, Dragos

    2015-05-19

    This paper describes the development of the unified conformational sampling and docking tool called Sampler for Multiple Protein-Ligand Entities (S4MPLE). The main novelty in S4MPLE is the unified dealing with intra- and intermolecular degrees of freedom (DoF). While classically programs are either designed for folding or docking, S4MPLE transcends this artificial specialization. It supports folding, docking of a flexible ligand into a flexible site and simultaneous docking of several ligands. The trick behind it is the formal assimilation of inter-molecular to intra-molecular DoF associated to putative inter-molecular contact axes. This is implemented within the genetic operators powering a Lamarckian Genetic Algorithm (GA). Further novelty includes differentiable interaction fingerprints to control population diversity, and fitting a simple continuum solvent model and favorable contact bonus terms to the AMBER/GAFF force field. Novel applications-docking of fragment-like compounds, simultaneous docking of multiple ligands, including free crystallographic waters-were published elsewhere. This paper discusses: (a) methodology, (b) set-up of the force field energy functions and (c) their validation in classical redocking tests. More than 80% success in redocking was achieved (RMSD of top-ranked pose < 2.0 Å).

  20. Urban Flow and Pollutant Dispersion Simulation with Multi-scale coupling of Meteorological Model with Computational Fluid Dynamic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Yushi; Poh, Hee Joo

    2014-11-01

    The Computational Fluid Dynamics analysis has become increasingly important in modern urban planning in order to create highly livable city. This paper presents a multi-scale modeling methodology which couples Weather Research and Forecasting (WRF) Model with open source CFD simulation tool, OpenFOAM. This coupling enables the simulation of the wind flow and pollutant dispersion in urban built-up area with high resolution mesh. In this methodology meso-scale model WRF provides the boundary condition for the micro-scale CFD model OpenFOAM. The advantage is that the realistic weather condition is taken into account in the CFD simulation and complexity of building layout can be handled with ease by meshing utility of OpenFOAM. The result is validated against the Joint Urban 2003 Tracer Field Tests in Oklahoma City and there is reasonably good agreement between the CFD simulation and field observation. The coupling of WRF- OpenFOAM provide urban planners with reliable environmental modeling tool in actual urban built-up area; and it can be further extended with consideration of future weather conditions for the scenario studies on climate change impact.

  1. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.

  2. Investigation of metallurgical coatings for automotive applications

    NASA Astrophysics Data System (ADS)

    Su, Jun Feng

    Metallurgical coatings have been widely used in the automotive industry from component machining, engine daily running to body decoration due to their high hardness, wear resistance, corrosion resistance and low friction coefficient. With high demands in energy saving, weight reduction and limiting environmental impact, the use of new materials such as light Aluminum/magnesium alloys with high strength-weight ratio for engine block and advanced high-strength steel (AHSS) with better performance in crash energy management for die stamping, are increasing. However, challenges are emerging when these new materials are applied such as the wear of the relative soft light alloys and machining tools for hard AHSS. The protective metallurgical coatings are the best option to profit from these new materials' advantages without altering largely in mass production equipments, machinery, tools and human labor. In this dissertation, a plasma electrolytic oxidation (PEO) coating processing on aluminum alloys was introduced in engine cylinder bores to resist wear and corrosion. The tribological behavior of the PEO coatings under boundary and starve lubrication conditions was studied experimentally and numerically for the first time. Experimental results of the PEO coating demonstrated prominent wear resistance and low friction, taking into account the extreme working conditions. The numerical elastohydrodynamic lubrication (EHL) and asperity contact based tribological study also showed a promising approach on designing low friction and high wear resistant PEO coatings. Other than the fabrication of the new coatings, a novel coating evaluation methodology, namely, inclined impact sliding tester was presented in the second part of this dissertation. This methodology has been developed and applied in testing and analyzing physical vapor deposition (PVD)/ chemical vapor deposition (CVD)/PEO coatings. Failure mechanisms of these common metallurgical hard coatings were systematically studied and summarized via the new testing methodology. Field tests based on the new coating characterization technique proved that this methodology is reliable, effective and economical.

  3. A critical investigation of post-liquefaction strength and steady-state flow behavior of saturated soils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jong, H.L.

    1988-01-01

    The first objective was to perform a critical evaluation of the recently proposed steady-state analysis methodology for evaluation of post-liquefaction stability of potentially liquefiable soils. This analysis procedure is based on direct comparison between the in-situ undrained residual (steady state) strength of soils in an embankment or foundation, and the driving shear stresses in these soils. A laboratory investigation was performed to investigate factors affecting steady-state strengths, and also to evaluate the validity of assumptions involved in correcting the results of laboratory steady-state strength tests on undisturbed samples for effects of sampling disturbance in order to estimate in-situ strengths. Next,more » a field case study was performed using the steady-state analysis and testing methodologies to analyze Lower San Fernando Dam, which suffered a liquefaction-induced slope failure as a results of a 1971 earthquake. This leads to the second objective which was to extend the Lower San Fernando Dam case study to consideration of analysis methods used to evaluate the likelihood of triggering liquefaction during an earthquake. Finally, a number of the high quality undisturbed samples were subjected to undrained cyclic testing in order to repeat an earlier (1973) study of the use of cyclic tests data to predict liquefaction behavior at Lower San Fernando Dam.« less

  4. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  5. Discourse Analysis and the Study of Educational Leadership

    ERIC Educational Resources Information Center

    Anderson, Gary; Mungal, Angus Shiva

    2015-01-01

    Purpose: The purpose of this paper is to provide an overview of the current and past work using discourse analysis in the field of educational administration and of discourse analysis as a methodology. Design/Methodology/Approach: Authors reviewed research in educational leadership that uses discourse analysis as a methodology. Findings: While…

  6. Discipline and Methodology in Higher Education Research

    ERIC Educational Resources Information Center

    Tight, Malcolm

    2013-01-01

    Higher education research is a multidisciplinary field, engaging researchers from across the academy who make use of a wide range of methodological approaches. This article examines the relation between discipline and methodology in higher education research, analysing a database of 567 articles published in 15 leading higher education journals…

  7. Comparison of ISRU Excavation System Model Blade Force Methodology and Experimental Results

    NASA Technical Reports Server (NTRS)

    Gallo, Christopher A.; Wilkinson, R. Allen; Mueller, Robert P.; Schuler, Jason M.; Nick, Andrew J.

    2010-01-01

    An Excavation System Model has been written to simulate the collection and transportation of regolith on the Moon. The calculations in this model include an estimation of the forces on the digging tool as a result of excavation into the regolith. Verification testing has been performed and the forces recorded from this testing were compared to the calculated theoretical data. A prototype lunar vehicle built at the NASA Johnson Space Center (JSC) was tested with a bulldozer type blade developed at the NASA Kennedy Space Center (KSC) attached to the front. This is the initial correlation of actual field test data to the blade forces calculated by the Excavation System Model and the test data followed similar trends with the predicted values. This testing occurred in soils developed at the NASA Glenn Research Center (GRC) which are a mixture of different types of sands and whose soil properties have been well characterized. Three separate analytical models are compared to the test data.

  8. The effects of group supervision of nurses: a systematic literature review.

    PubMed

    Francke, Anneke L; de Graaff, Fuusje M

    2012-09-01

    To gain insight into the existing scientific evidence on the effects of group supervision for nurses. A systematic literature study of original research publications. Searches were performed in February 2010 in PubMed, CINAHL, Cochrane Library, Embase, ERIC, the NIVEL catalogue, and PsycINFO. No limitations were applied regarding date of publication, language or country. Original research publications were eligible for review when they described group supervision programmes directed at nurses; used a control group or a pre-test post-test design; and gave information about the effects of group supervision on nurse or patient outcomes. The two review authors independently assessed studies for inclusion. The methodological quality of included studies was also independently assessed by the review authors, using a check list developed by Van Tulder et al. in collaboration with the Dutch Cochrane Centre. Data related to the original publications were extracted by one review author and checked by a second review author. No statistical pooling of outcomes was performed, because there was large heterogeneity of outcomes. A total of 1087 potentially relevant references were found. After screening of the references, eight studies with a control group and nine with a pre-test post-test design were included. Most of the 17 studies included have serious methodological limitations, but four Swedish publications in the field of dementia care had high methodological quality and all point to positive effects on nurses' attitudes and skills and/or nurse-patient interactions. However, in interpreting these positive results, it must be taken into account that these four high-quality publications concern sub-studies of one 'sliced' research project using the same study sample. Moreover, these four publications combined a group supervision intervention with the introduction of individual care planning, which also hampers conclusions about the effectiveness of group supervision alone. Although there are rather a lot of indications that group supervision of nurses is effective, evidence on the effects is still scarce. Further methodologically sound research is needed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. University of Washington/ Northwest National Marine Renewable Energy Center Tidal Current Technology Test Protocol, Instrumentation, Design Code, and Oceanographic Modeling Collaboration: Cooperative Research and Development Final Report, CRADA Number CRD-11-452

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, Frederick R.

    The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation systemmore » and testing methodologies, standards and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbine and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single device and array modeling codes. As part of this effort UW NNMREC will also work with NREL to run simulations on NREL's high performance computer system.« less

  10. Journal: A Review of Some Tracer-Test Design Equations for ...

    EPA Pesticide Factsheets

    Determination of necessary tracer mass, initial sample-collection time, and subsequent sample-collection frequency are the three most difficult aspects to estimate for a proposed tracer test prior to conducting the tracer test. To facilitate tracer-mass estimation, 33 mass-estimation equations are reviewed here, 32 of which were evaluated using previously published tracer-test design examination parameters. Comparison of the results produced a wide range of estimated tracer mass, but no means is available by which one equation may be reasonably selected over the others. Each equation produces a simple approximation for tracer mass. Most of the equations are based primarily on estimates or measurements of discharge, transport distance, and suspected transport times. Although the basic field parameters commonly employed are appropriate for estimating tracer mass, the 33 equations are problematic in that they were all probably based on the original developers' experience in a particular field area and not necessarily on measured hydraulic parameters or solute-transport theory. Suggested sampling frequencies are typically based primarily on probable transport distance, but with little regard to expected travel times. This too is problematic in that tends to result in false negatives or data aliasing. Simulations from the recently developed efficient hydrologic tracer-test design methodology (EHTD) were compared with those obtained from 32 of the 33 published tracer-

  11. Evaluation of near field atmospheric dispersion around nuclear facilities using a Lorentzian distribution methodology.

    PubMed

    Hawkley, Gavin

    2014-12-01

    Atmospheric dispersion modeling within the near field of a nuclear facility typically applies a building wake correction to the Gaussian plume model, whereby a point source is modeled as a plane source. The plane source results in greater near field dilution and reduces the far field effluent concentration. However, the correction does not account for the concentration profile within the near field. Receptors of interest, such as the maximally exposed individual, may exist within the near field and thus the realm of building wake effects. Furthermore, release parameters and displacement characteristics may be unknown, particularly during upset conditions. Therefore, emphasis is placed upon the need to analyze and estimate an enveloping concentration profile within the near field of a release. This investigation included the analysis of 64 air samples collected over 128 wk. Variables of importance were then derived from the measurement data, and a methodology was developed that allowed for the estimation of Lorentzian-based dispersion coefficients along the lateral axis of the near field recirculation cavity; the development of recirculation cavity boundaries; and conservative evaluation of the associated concentration profile. The results evaluated the effectiveness of the Lorentzian distribution methodology for estimating near field releases and emphasized the need to place air-monitoring stations appropriately for complete concentration characterization. Additionally, the importance of the sampling period and operational conditions were discussed to balance operational feedback and the reporting of public dose.

  12. How to Identify E-Learning Trends in Academic Teaching: Methodological Approaches and the Analysis of Scientific Discourses

    ERIC Educational Resources Information Center

    Fischer, Helge; Heise, Linda; Heinz, Matthias; Moebius, Kathrin; Koehler, Thomas

    2015-01-01

    Purpose: The purpose of this paper is to introduce methodology and findings of a trend study in the field of e-learning. The overall interest of the study was the analysis of scientific e-learning discourses. What comes next in the field of academic e-learning? Which e-learning trends dominate the discourse at universities? Answering such…

  13. Methodology capture: discriminating between the "best" and the rest of community practice

    PubMed Central

    Eales, James M; Pinney, John W; Stevens, Robert D; Robertson, David L

    2008-01-01

    Background The methodologies we use both enable and help define our research. However, as experimental complexity has increased the choice of appropriate methodologies has become an increasingly difficult task. This makes it difficult to keep track of available bioinformatics software, let alone the most suitable protocols in a specific research area. To remedy this we present an approach for capturing methodology from literature in order to identify and, thus, define best practice within a field. Results Our approach is to implement data extraction techniques on the full-text of scientific articles to obtain the set of experimental protocols used by an entire scientific discipline, molecular phylogenetics. Our methodology for identifying methodologies could in principle be applied to any scientific discipline, whether or not computer-based. We find a number of issues related to the nature of best practice, as opposed to community practice. We find that there is much heterogeneity in the use of molecular phylogenetic methods and software, some of which is related to poor specification of protocols. We also find that phylogenetic practice exhibits field-specific tendencies that have increased through time, despite the generic nature of the available software. We used the practice of highly published and widely collaborative researchers ("expert" researchers) to analyse the influence of authority on community practice. We find expert authors exhibit patterns of practice common to their field and therefore act as useful field-specific practice indicators. Conclusion We have identified a structured community of phylogenetic researchers performing analyses that are customary in their own local community and significantly different from those in other areas. Best practice information can help to bridge such subtle differences by increasing communication of protocols to a wider audience. We propose that the practice of expert authors from the field of evolutionary biology is the closest to contemporary best practice in phylogenetic experimental design. Capturing best practice is, however, a complex task and should also acknowledge the differences between fields such as the specific context of the analysis. PMID:18761740

  14. Extending methods: using Bourdieu's field analysis to further investigate taste

    NASA Astrophysics Data System (ADS)

    Schindel Dimick, Alexandra

    2015-06-01

    In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.

  15. Optimal Multi-Type Sensor Placement for Structural Identification by Static-Load Testing

    PubMed Central

    Papadopoulou, Maria; Vernay, Didier; Smith, Ian F. C.

    2017-01-01

    Assessing ageing infrastructure is a critical challenge for civil engineers due to the difficulty in the estimation and integration of uncertainties in structural models. Field measurements are increasingly used to improve knowledge of the real behavior of a structure; this activity is called structural identification. Error-domain model falsification (EDMF) is an easy-to-use model-based structural-identification methodology which robustly accommodates systematic uncertainties originating from sources such as boundary conditions, numerical modelling and model fidelity, as well as aleatory uncertainties from sources such as measurement error and material parameter-value estimations. In most practical applications of structural identification, sensors are placed using engineering judgment and experience. However, since sensor placement is fundamental to the success of structural identification, a more rational and systematic method is justified. This study presents a measurement system design methodology to identify the best sensor locations and sensor types using information from static-load tests. More specifically, three static-load tests were studied for the sensor system design using three types of sensors for a performance evaluation of a full-scale bridge in Singapore. Several sensor placement strategies are compared using joint entropy as an information-gain metric. A modified version of the hierarchical algorithm for sensor placement is proposed to take into account mutual information between load tests. It is shown that a carefully-configured measurement strategy that includes multiple sensor types and several load tests maximizes information gain. PMID:29240684

  16. Electromagnetic Compatibility (EMC) for Integration and Use of Near Field Communication (NFC) in Aircraft

    NASA Astrophysics Data System (ADS)

    Nalbantoglu, Cemal; Kiehl, Thorsten; God, Ralf; Stadtler, Thiemo; Kebel, Robert; Bienert, Renke

    2016-05-01

    For portable electronic devices (PEDs), e.g. smartphones or tablets, near field communication (NFC) enables easy and convenient man-machine interaction by simply tapping a PED to a tangible NFC user interface. Usage of NFC technology in the air transport system is supposed to facilitate travel processes and self-services for passengers and to support digital interaction with other participating stakeholders. One of the potential obstacles to benefit from NFC technology in the aircraft cabin is the lack of an explicit qualification guideline for electromagnetic compatibility (EMC) testing. In this paper, we propose a methodology for EMC testing and for characterizing NFC devices and their emissions according to aircraft industry standards (RTCA DO-160, DO-294, DO-307 and EUROCAE ED- 130). A potential back-door coupling scenario of radiated NFC emissions and possible effects to nearby aircraft wiring are discussed. A potential front-door- coupling effect on NAV/COM equipment is not investigated in this paper.

  17. 76 FR 50993 - Agency Information Collection Activities: Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-17

    ...: Proposed Collection; Comment Request--Generic Clearance to Conduct Methodological Testing, Surveys, Focus... proposed information collection. This information collection will conduct research by methodological... Methodological Testing, Surveys, Focus Groups, and Related Tools to Improve the Management of Federal Nutrition...

  18. Analysis of Two Advanced Smoothing Algorithms.

    DTIC Science & Technology

    1985-09-01

    59 B. METHODOLOGY . ......... ........... 60 6 C. TESTING AND RESULTS ---- LINEAR UNDERLYING FUNCTION...SMOOTHING ALGORITHMS ...... .................... 94 A. GENERAL ......... ....................... .. 94 B. METHODOLOGY ............................ .95 C...to define succinctly. 59 B. METHODOLOGY There is no established procedure to follow in testing the efficiency and effectiveness of a smoothing

  19. Proposed Objective Odor Control Test Methodology for Waste Containment

    NASA Technical Reports Server (NTRS)

    Vos, Gordon

    2010-01-01

    The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer reviewed documentation of human odor thresholds for standardized contaminates, industry stardnard atmostpheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complimented with a qualitative subjective assessment. Isoamyl acetate (IAA - also known at isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for both measuring its quantitative airborne concentrations. IAA is a clear, colorless liquid with a banana-like odor, documented detectable smell threshold for humans of 0.025 PPM, and a 15 PPB level of quantation limit.

  20. Development of a Polarizable Force Field For Proteins via Ab Initio Quantum Chemistry: First Generation Model and Gas Phase Tests

    PubMed Central

    KAMINSKI, GEORGE A.; STERN, HARRY A.; BERNE, B. J.; FRIESNER, RICHARD A.; CAO, YIXIANG X.; MURPHY, ROBERT B.; ZHOU, RUHONG; HALGREN, THOMAS A.

    2014-01-01

    We present results of developing a methodology suitable for producing molecular mechanics force fields with explicit treatment of electrostatic polarization for proteins and other molecular system of biological interest. The technique allows simulation of realistic-size systems. Employing high-level ab initio data as a target for fitting allows us to avoid the problem of the lack of detailed experimental data. Using the fast and reliable quantum mechanical methods supplies robust fitting data for the resulting parameter sets. As a result, gas-phase many-body effects for dipeptides are captured within the average RMSD of 0.22 kcal/mol from their ab initio values, and conformational energies for the di- and tetrapeptides are reproduced within the average RMSD of 0.43 kcal/mol from their quantum mechanical counterparts. The latter is achieved in part because of application of a novel torsional fitting technique recently developed in our group, which has already been used to greatly improve accuracy of the peptide conformational equilibrium prediction with the OPLS-AA force field.1 Finally, we have employed the newly developed first-generation model in computing gas-phase conformations of real proteins, as well as in molecular dynamics studies of the systems. The results show that, although the overall accuracy is no better than what can be achieved with a fixed-charges model, the methodology produces robust results, permits reasonably low computational cost, and avoids other computational problems typical for polarizable force fields. It can be considered as a solid basis for building a more accurate and complete second-generation model. PMID:12395421

  1. Methodological Standards for Meta-Analyses and Qualitative Systematic Reviews of Cardiac Prevention and Treatment Studies: A Scientific Statement From the American Heart Association.

    PubMed

    Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer

    2017-09-05

    Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.

  2. Advancements in the LEWICE Ice Accretion Model

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1993-01-01

    Recent evidence has shown that the NASA/Lewis Ice Accretion Model, LEWICE, does not predict accurate ice shapes for certain glaze ice conditions. This paper will present the methodology used to make a first attempt at improving the ice accretion prediction in these regimes. Importance is given to the correlations for heat transfer coefficient and ice density, as well as runback flow, selection of the transition point, flow field resolution, and droplet trajectory models. Further improvements and refinement of these modules will be performed once tests in NASA's Icing Research Tunnel, scheduled for 1993, are completed.

  3. An outflow boundary condition for aeroacoustic computations

    NASA Technical Reports Server (NTRS)

    Hayder, M. Ehtesham; Hagstrom, Thomas

    1995-01-01

    A formulation of boundary condition for flows with small disturbances is presented. The authors test their methodology in an axisymmetric jet flow calculation, using both the Navier-Stokes and Euler equations. Solutions in the far field are assumed to be oscillatory. If the oscillatory disturbances are small, the growth of the solution variables can be predicted by linear theory. Eigenfunctions of the linear theory are used explicitly in the formulation of the boundary conditions. This guarantees correct solutions at the boundary in the limit where the predictions of linear theory are valid.

  4. Nutrient Stress Detection in Corn Using Neural Networks and AVIRIS Hyperspectral Imagery

    NASA Technical Reports Server (NTRS)

    Estep, Lee

    2001-01-01

    AVIRIS image cube data have been processed for the detection of nutrient stress in corn by both known, ratio-type algorithms and by trained neural networks. The USDA Shelton, NE, ARS Variable Rate Nitrogen Application (VRAT) experimental farm was the site used in the study. Upon application of ANOVA and Dunnett multiple comparison tests to the outcomes of both the neural network processing and the ratio-type algorithms, it was found that the neural network methodology provides a better overall capability to separate nutrient-stressed crops from in-field controls.

  5. A hierarchical approach to ecological assessment of contaminated soils at Aberdeen Proving Ground, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuperman, R.G.

    1995-12-31

    Despite the expansion of environmental toxicology studies over the past decade, soil ecosystems have largely been ignored in ecotoxicological studies in the United States. The objective of this project was to develop and test the efficacy of a comprehensive methodology for assessing ecological impacts of soil contamination. A hierarchical approach that integrates biotic parameters and ecosystem processes was used to give insight into the mechanisms that lead to alterations in the structure and function of soil ecosystems in contaminated areas. This approach involved (1) a thorough survey of the soil biota to determine community structure, (2) laboratory and field tests on critical ecosystem processes, (3) toxicity trials, and (4) the use of spatial analyses to provide input to the decision-making process. This methodology appears to offer an efficient and potentially cost-saving tool for remedial investigations of contaminated sites.

  6. A Methodological Analysis of Randomized Clinical Trials of Computer-Assisted Therapies for Psychiatric Disorders: Toward Improved Standards for an Emerging Field

    PubMed Central

    Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.

    2013-01-01

    Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689

  7. Aging and feature search: the effect of search area.

    PubMed

    Burton-Danner, K; Owsley, C; Jackson, G R

    2001-01-01

    The preattentive system involves the rapid parallel processing of visual information in the visual scene so that attention can be directed to meaningful objects and locations in the environment. This study used the feature search methodology to examine whether there are aging-related deficits in parallel-processing capabilities when older adults are required to visually search a large area of the visual field. Like young subjects, older subjects displayed flat, near-zero slopes for the Reaction Time x Set Size function when searching over a broad area (30 degrees radius) of the visual field, implying parallel processing of the visual display. These same older subjects exhibited impairment in another task, also dependent on parallel processing, performed over the same broad field area; this task, called the useful field of view test, has more complex task demands. Results imply that aging-related breakdowns of parallel processing over a large visual field area are not likely to emerge when required responses are simple, there is only one task to perform, and there is no limitation on visual inspection time.

  8. Methodology for Calculating Latency of GPS Probe Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhongxiang; Hamedi, Masoud; Young, Stanley

    Crowdsourced GPS probe data, used for applications such as travel times on changeable-message signs and incident detection, have been gaining popularity in recent years as a source of real-time traffic information for driver operations and transportation systems management and operations. Efforts have been made to evaluate the quality of such data from different perspectives. Although such crowdsourced data are already in widespread use in many states, particularly the high-traffic areas on the Eastern seaboard, concerns about latency - the time between traffic being perturbed as a result of an incident and reflection of the disturbance in the outsourced data feed - have escalated in importance. Latency is critical for the accuracy of real-time operations, emergency response, and traveler information systems. This paper offers a methodology for measuring probe data latency with respect to a selected reference source. Although Bluetooth reidentification data are used as the reference source, the methodology can be applied to any other ground-truth data source of choice. The core of the methodology is an algorithm for maximum pattern matching that works with three fitness objectives. To test the methodology, sample field reference data were collected on multiple freeway segments for a 2-week period by using portable Bluetooth sensors as ground truth. Equivalent GPS probe data were obtained from a private vendor, and their latency was evaluated. Latency at different times of the day, the impact of the road segmentation scheme on latency, and the sensitivity of the latency to both speed-slowdown and recovery-from-slowdown episodes are also discussed.
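    The paper's maximum pattern matching algorithm and its three fitness objectives are not detailed in the abstract. As a rough illustration of the underlying idea only, the hedged sketch below estimates latency as the time shift that best aligns a probe speed series with a ground-truth reference series, using a single correlation objective; the function names and the common-clock sampling assumption are ours, not the authors'.

```python
import numpy as np

def estimate_latency(probe, reference, max_shift):
    """Latency estimate (in samples) for a probe speed series relative
    to a ground-truth series sampled on the same clock: the shift of
    the probe series that maximizes Pearson correlation."""
    best_shift, best_r = 0, -np.inf
    for shift in range(max_shift + 1):
        a = probe[shift:]                       # probe advanced by `shift`
        b = reference[:len(reference) - shift]  # matching reference window
        n = min(len(a), len(b))
        r = np.corrcoef(a[:n], b[:n])[0, 1]
        if r > best_r:
            best_shift, best_r = shift, r
    return best_shift, best_r
```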

  9. A methodology for testing fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Andrews, D. M.; Mahmood, A.; Mccluskey, E. J.

    1985-01-01

    A methodology for testing fault-tolerant software is presented. Testing fault-tolerant software is problematic because many errors are masked or corrected by voters, limiters, or automatic channel synchronization. This methodology illustrates how the same strategies used for testing fault-tolerant hardware can be applied to testing fault-tolerant software. For example, one strategy used in testing fault-tolerant hardware is to disable the redundancy during testing. A similar testing strategy is proposed for software, namely, to move the major emphasis of testing earlier in the development cycle (before the redundancy is in place), thus reducing the possibility that undetected errors will be masked when limiters and voters are added.

  10. Standardized Laboratory Test Requirements for Hardening Equipment to Withstand Wave Impact Shock in Small High Speed Craft

    DTIC Science & Technology

    2017-02-06

    The engineering rationale, assumptions, and methodology for transitioning craft acceleration data to laboratory shock test requirements are summarized, and example requirements are presented for small high-speed craft structure, equipment, shock isolation seats, and human performance at sea.

  11. Methodology for designing accelerated aging tests for predicting life of photovoltaic arrays

    NASA Technical Reports Server (NTRS)

    Gaines, G. B.; Thomas, R. E.; Derringer, G. C.; Kistler, C. W.; Bigg, D. M.; Carmichael, D. C.

    1977-01-01

    A methodology for designing aging tests in which life prediction was paramount was developed. The methodology builds upon experience with the aging behavior of the material classes expected to be utilized as encapsulant elements, viz., glasses and polymers, and upon experience with the design of aging tests. This experience was reviewed, and the results are discussed in detail.

  12. Methodological Approaches in MOOC Research: Retracing the Myth of Proteus

    ERIC Educational Resources Information Center

    Raffaghelli, Juliana Elisa; Cucchiara, Stefania; Persico, Donatella

    2015-01-01

    This paper explores the methodological approaches most commonly adopted in the scholarly literature on Massive Open Online Courses (MOOCs), published during the period January 2008-May 2014. In order to identify trends, gaps and criticalities related to the methodological approaches of this emerging field of research, we analysed 60 papers…

  13. Using Design-Based Research in Gifted Education

    ERIC Educational Resources Information Center

    Jen, Enyi; Moon, Sidney; Samarapungavan, Ala

    2015-01-01

    Design-based research (DBR) is a new methodological framework that was developed in the context of the learning sciences; however, it has not been used very often in the field of gifted education. Compared with other methodologies, DBR is more process-oriented and context-sensitive. In this methodological brief, the authors introduce DBR and…

  14. Improving the Quality of Experience Journals: Training Educational Psychology Students in Basic Qualitative Methodology

    ERIC Educational Resources Information Center

    Reynolds-Keefer, Laura

    2010-01-01

    This study evaluates the impact of teaching basic qualitative methodology to preservice teachers enrolled in an educational psychology course in the quality of observation journals. Preservice teachers enrolled in an educational psychology course requiring 45 hr of field experience were given qualitative methodological training as a part of the…

  15. A systematic review of grounded theory studies in physiotherapy.

    PubMed

    Ali, Nancy; May, Stephen; Grafton, Kate

    2018-05-23

    This systematic review aimed to appraise the methodological rigor of grounded theory research published in the field of physiotherapy and to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINAHL, SPORTDiscus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or its methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices, including sampling, data collection, and analysis.

  16. Validating a Finite Element Model of a Structure Subjected to Mine Blast with Experimental Modal Analysis

    DTIC Science & Technology

    2017-11-01

    The Under-body Blast Methodology (UBM) for the Test and Evaluation (T&E) program was established to provide a capability for the US Army Test and Evaluation Command to assess the vulnerability of vehicles to under-body blast. Finite element (FE) models are part of the current UBM for T&E methodology.

  17. Tests for the Assessment of Sport-Specific Performance in Olympic Combat Sports: A Systematic Review With Practical Recommendations

    PubMed Central

    Chaabene, Helmi; Negra, Yassine; Bouguezzi, Raja; Capranica, Laura; Franchini, Emerson; Prieske, Olaf; Hbacha, Hamdi; Granacher, Urs

    2018-01-01

    The regular monitoring of physical fitness and sport-specific performance is important in elite sports to increase the likelihood of success in competition. This study aimed to systematically review and critically appraise the methodological quality, validation data, and feasibility of sport-specific performance assessments in Olympic combat sports such as amateur boxing, fencing, judo, karate, taekwondo, and wrestling. A systematic search was conducted in the electronic databases PubMed, Google Scholar, and Science Direct up to October 2017. Studies in combat sports were included if they reported validation data (e.g., reliability, validity, sensitivity) of sport-specific tests. Overall, 39 studies were eligible for inclusion in this review. The majority of studies (74%) used sample sizes of <30 subjects. Nearly one-third of the reviewed studies lacked a sufficient description (e.g., anthropometrics, age, expertise level) of the included participants. Seventy-two percent of studies did not sufficiently report inclusion/exclusion criteria for their participants. In 62% of the included studies, the description and/or inclusion of a familiarization session(s) was either incomplete or nonexistent. Sixty percent of studies did not report any details about the stability of testing conditions. Approximately half of the studies examined reliability measures of the included sport-specific tests (intraclass correlation coefficient [ICC] = 0.43-1.00). Content validity was addressed in all included studies, and criterion validity (only its concurrent aspect) in approximately half of the studies, with correlation coefficients ranging from r = -0.41 to 0.90. Construct validity was reported in 31% of the included studies and predictive validity in only one. Test sensitivity was addressed in 13% of the included studies. The majority of studies (64%) ignored and/or provided incomplete information on test feasibility and methodological limitations of the sport-specific test. In 28% of the included studies, insufficient or no information was provided on the test's field of application. Several methodological gaps exist in studies that used sport-specific performance tests in Olympic combat sports. Additional research should adopt more rigorous validation procedures in the application and description of sport-specific performance tests in Olympic combat sports. PMID:29692739

  18. Analysis of the penumbra enlargement in lung versus the quality index of photon beams: a methodology to check the dose calculation algorithm.

    PubMed

    Tsiakalos, Miltiadis F; Theodorou, Kiki; Kappas, Constantin; Zefkili, Sofia; Rosenwold, Jean-Claude

    2004-04-01

    It is well known that considerable underdosage can occur at the edges of a tumor inside the lung because of the degradation of penumbra due to the lack of lateral electronic equilibrium. Although present even at lower energies, this phenomenon is more pronounced at higher energies. Apart from Monte Carlo calculation, most existing Treatment Planning Systems (TPSs) cannot handle this effect at all, or cannot do so with acceptable accuracy. A methodology has been developed for assessing dose calculation algorithms in the lung region, where lateral electronic disequilibrium exists, based on the Quality Index (QI) of the incident beam. A phantom, consisting of layers of polystyrene and lung material, was irradiated using photon beams of 4, 6, 15, and 20 MV. The cross-plane profiles of each beam for 5x5, 10x10, and 25x10 fields were measured at the middle of the phantom with the use of films. The penumbra (20%-80%) and fringe (50%-90%) enlargement was measured, and the ratio of the widths for the lung to those for polystyrene was defined as the Correction Factor (CF). Monte Carlo calculations in the two phantoms have also been performed for energies of 6, 15, and 20 MV. Five commercial TPS algorithms were tested for their ability to predict the penumbra and fringe enlargement. A linear relationship was found between the QI of the beams and the CF of the penumbra and fringe enlargement for all the examined fields. Monte Carlo calculations agree very well (less than 1% difference) with the film measurements. The CF values range between 1.1 for 4 MV (QI 0.620) and 2.28 for 20 MV (QI 0.794). Three of the tested TPS algorithms could not predict any enlargement at all, for all energies and all fields, and two of them could predict the penumbra enlargement to some extent. The proposed methodology can help any user or developer check the accuracy of their algorithm for lung cases, based on a simple phantom geometry and the QI of the incident beam. This check is especially important when higher energies are used, as the inaccuracies in existing algorithms can lead to an incorrect choice of energy for lung treatment and consequently to a failure in tumor control.
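    As a worked illustration of the linear QI-CF relationship reported above, the sketch below interpolates an expected correction factor from the two quoted endpoints (QI 0.620, CF 1.1 and QI 0.794, CF 2.28). The actual fit coefficients in the paper may differ, and the example QI value is purely illustrative.

```python
def expected_cf(qi, qi_lo=0.620, cf_lo=1.1, qi_hi=0.794, cf_hi=2.28):
    """Linearly interpolate the penumbra-enlargement correction factor
    (CF) for a given beam quality index (QI), anchored at the two
    endpoint values quoted in the abstract (4 MV and 20 MV beams)."""
    slope = (cf_hi - cf_lo) / (qi_hi - qi_lo)
    return cf_lo + slope * (qi - qi_lo)

# Example: a beam with QI = 0.67 (illustrative value) gives CF ~ 1.44,
# i.e., a penumbra roughly 44% wider in lung than in polystyrene.
print(round(expected_cf(0.67), 2))
```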

  19. Validation of a Rapid Rabies Diagnostic Tool for Field Surveillance in Developing Countries

    PubMed Central

    Léchenne, Monique; Naïssengar, Kemdongarti; Lepelletier, Anthony; Alfaroukh, Idriss Oumar; Bourhy, Hervé; Zinsstag, Jakob; Dacheux, Laurent

    2016-01-01

    Background One root cause of the neglect of rabies is the lack of adequate diagnostic tests in the context of low-income countries. A rapid, performance-friendly, and low-cost method to detect rabies virus (RABV) in brain samples would contribute positively to surveillance and consequently to accurate data reporting, which is presently missing in the majority of rabies-endemic countries. Methodology/Principal findings We evaluated a rapid immunodiagnostic test (RIDT) in comparison with the standard fluorescent antibody test (FAT) and confirmed the detection of the viral RNA by real-time reverse transcription polymerase chain reaction (RT-qPCR). Our analysis is a multicentre approach to validate the performance of the RIDT in both a field laboratory (N'Djamena, Chad) and an international reference laboratory (Institut Pasteur, Paris, France). In the field laboratory, 48 samples from dogs were tested; in the reference laboratory setting, a total of 73 samples was tested, representing a wide diversity of RABV in terms of animal species tested (13 different species), geographical origin of isolates with special emphasis on Africa, and phylogenetic clades. Under reference laboratory conditions, specificity was 93.3% and sensitivity was 95.3% compared with the gold-standard FAT. Under field laboratory conditions, the RIDT yielded higher reliability than the FAT, particularly on fresh and decomposed samples. Viral RNA was later extracted directly from the test filter paper and further used successfully for sequencing and genotyping. Conclusion/Significance The RIDT shows excellent performance qualities both in regard to user friendliness and reliability of the result. In addition, the test cassettes can be used as a vehicle to ship viral RNA to reference laboratories for further laboratory confirmation of the diagnosis and for epidemiological investigations using nucleotide sequencing. The potential for satisfactory use in remote locations is therefore very high and could improve the global knowledge of rabies epidemiology. However, we suggest some changes to the protocol, as well as careful further validation, before promotion and wider use. PMID:27706156

  20. Methodology for testing infrared focal plane arrays in simulated nuclear radiation environments

    NASA Astrophysics Data System (ADS)

    Divita, E. L.; Mills, R. E.; Koch, T. L.; Gordon, M. J.; Wilcox, R. A.; Williams, R. E.

    1992-07-01

    This paper summarizes a test methodology for focal plane arrays (FPAs) that can be used in benign (clear) and radiation environments, and describes the use of custom dewars and integrated test equipment in an example environment. The test methodology, consistent with American Society for Testing and Materials (ASTM) standards, is presented for total accumulated gamma dose, transient dose rate, gamma flux, and neutron fluence environments. The merits and limitations of using cobalt-60 for gamma environment simulations and of using various fast-neutron reactors and neutron sources for neutron simulations are presented. Examples of test results are presented to demonstrate test data acquisition and FPA parameter performance under different measurement conditions and environmental simulations.

  1. VFMA: Topographic Analysis of Sensitivity Data From Full-Field Static Perimetry

    PubMed Central

    Weleber, Richard G.; Smith, Travis B.; Peters, Dawn; Chegarnov, Elvira N.; Gillespie, Scott P.; Francis, Peter J.; Gardiner, Stuart K.; Paetzold, Jens; Dietzsch, Janko; Schiefer, Ulrich; Johnson, Chris A.

    2015-01-01

    Purpose: To analyze static visual field sensitivity with topographic models of the hill of vision (HOV), and to characterize several visual function indices derived from the HOV volume. Methods: A software application, Visual Field Modeling and Analysis (VFMA), was developed for static perimetry data visualization and analysis. Three-dimensional HOV models were generated for 16 healthy subjects and 82 retinitis pigmentosa patients. Volumetric visual function indices, which are measures of quantity and comparable regardless of perimeter test pattern, were investigated. Cross-validation, reliability, and cross-sectional analyses were performed to assess this methodology and compare the volumetric indices to conventional mean sensitivity and mean deviation. Floor effects were evaluated by computer simulation. Results: Cross-validation yielded an overall R2 of 0.68 and index of agreement of 0.89, which were consistent among subject groups, indicating good accuracy. Volumetric and conventional indices were comparable in terms of test–retest variability and discriminability among subject groups. Simulated floor effects did not negatively impact the repeatability of any index, but large floor changes altered the discriminability for regional volumetric indices. Conclusions: VFMA is an effective tool for clinical and research analyses of static perimetry data. Topographic models of the HOV aid the visualization of field defects, and topographically derived indices quantify the magnitude and extent of visual field sensitivity. Translational Relevance: VFMA assists with the interpretation of visual field data from any perimetric device and any test location pattern. Topographic models and volumetric indices are suitable for diagnosis, monitoring of field loss, patient counseling, and endpoints in therapeutic trials. PMID:25938002
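    The volumetric indices described above are measures of quantity derived from the volume under a modeled hill of vision, which makes them comparable across perimeter test patterns. The hedged sketch below shows one simple way such a volume could be approximated from scattered test locations; VFMA's actual topographic modeling is more sophisticated, and the gridding and interpolation choices here are our assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def hov_volume(x_deg, y_deg, sens_db, grid_step=0.5):
    """Approximate the volume (dB * deg^2) under a hill-of-vision
    surface interpolated from scattered static-perimetry sensitivities
    measured at field locations (x_deg, y_deg)."""
    xi = np.arange(min(x_deg), max(x_deg) + grid_step, grid_step)
    yi = np.arange(min(y_deg), max(y_deg) + grid_step, grid_step)
    gx, gy = np.meshgrid(xi, yi)
    surface = griddata((x_deg, y_deg), sens_db, (gx, gy), method="linear")
    # NaN outside the convex hull of the test locations is ignored.
    return float(np.nansum(surface) * grid_step ** 2)
```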

  2. Children's knowledge of the earth: a new methodological and statistical approach.

    PubMed

    Straatemeier, Marthe; van der Maas, Han L J; Jansen, Brenda R J

    2008-08-01

    In the field of children's knowledge of the earth, much debate has concerned the question of whether children's naive knowledge (that is, their knowledge before they acquire the standard scientific theory) is coherent (i.e., theory-like) or fragmented. We conducted two studies with large samples (N=328 and N=381) using a new paper-and-pencil test, denoted the EARTH (EArth Representation Test for cHildren), to discriminate between these two alternatives. We performed latent class analyses on the responses to the EARTH to test the mental models associated with these alternatives. The naive mental models, as formulated by Vosniadou and Brewer, were not supported by the results. The results indicated that children's knowledge of the earth becomes more consistent as children grow older. These findings support the view that children's naive knowledge is fragmented.

  3. Radar remote sensing for crop classification and canopy condition assessment: Ground-data documentation

    NASA Technical Reports Server (NTRS)

    Ulaby, F. T. (Principal Investigator); Jung, B.; Gillespie, K.; Hemmat, M.; Aslam, A.; Brunfeldt, D.; Dobson, M. C.

    1983-01-01

    A vegetation and soil-moisture experiment was conducted in order to examine the microwave emission and backscattering from vegetation canopies and soils. The data-acquisition methodology used in conjunction with the mobile radar scatterometer (MRS) systems is described and associated ground-truth data are documented. Test fields were located in the Kansas River floodplain north of Lawrence, Kansas. Ten fields each of wheat, corn, and soybeans were monitored over the greater part of their growing seasons. The tabulated data summarize measurements made by the sensor systems and represent target characteristics. Target parameters describing the vegetation and soil characteristics include plant moisture, density, height, and growth stage, as well as soil moisture and soil-bulk density. Complete listings of pertinent crop-canopy and soil measurements are given.

  4. Preliminary evaluation of the Environmental Research Institute of Michigan crop calendar shift algorithm for estimation of spring wheat development stage. [North Dakota, South Dakota, Montana, and Minnesota

    NASA Technical Reports Server (NTRS)

    Phinney, D. E. (Principal Investigator)

    1980-01-01

    An algorithm for estimating spectral crop calendar shifts of spring small grains was applied to 1978 spring wheat fields. The algorithm provides estimates of the date of peak spectral response by maximizing the cross correlation between a reference profile and the observed multitemporal pattern of Kauth-Thomas greenness for a field. A methodology was developed for estimation of crop development stage from the date of peak spectral response. Evaluation studies showed that the algorithm provided stable estimates with no geographical bias. Crop development stage estimates had a root mean square error near 10 days. The algorithm was recommended for comparative testing against other models which are candidates for use in AgRISTARS experiments.
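    The core of the shift algorithm, as described above, is finding the date offset that maximizes the cross correlation between a reference greenness profile and the observed multitemporal profile. The sketch below illustrates that step under the assumption that both profiles are resampled to a common daily grid; the function and variable names are ours.

```python
import numpy as np

def peak_greenness_lag(observed, reference):
    """Lag (in days) that maximizes the cross correlation between an
    observed greenness profile and a reference profile sampled on the
    same daily grid. Positive lag = observed profile peaks later."""
    obs = np.array(observed, dtype=float)
    ref = np.array(reference, dtype=float)
    obs -= obs.mean()   # remove the means so the correlation
    ref -= ref.mean()   # is not dominated by baseline greenness
    xcorr = np.correlate(obs, ref, mode="full")
    lags = np.arange(-(len(ref) - 1), len(obs))
    return int(lags[np.argmax(xcorr)])
```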

  5. Electrical insulation system for the shell-vacuum vessel and poloidal field gap in the ZTH machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reass, W.A.; Ballard, E.O.

    1989-01-01

    The electrical insulation systems for the ZTH machine have many unusual design problems. The poloidal field gap insulation must be capable of conforming to poloidal and toroidal contours, provide a 25 kV hold-off, and adhere sufficiently to the epoxy backfill between the overlapping conductors. The shell-vacuum vessel system will use stretchable and flexible insulation along with protective hats, boots, and sleeves. The shell-vacuum vessel system must be able to withstand a 12.5 kV pulse, with provision for thermal insulation to limit the effects of the 300°C vacuum vessel during operation and bakeout. The methodology required to provide the electrical protection, along with testing data and material characteristics, will be presented.

  6. Guidelines for reporting evaluations based on observational methodology.

    PubMed

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study was thus to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with designs similar to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  7. An enhanced methodology for spacecraft correlation activity using virtual testing tools

    NASA Astrophysics Data System (ADS)

    Remedia, Marcello; Aglietti, Guglielmo S.; Appolloni, Matteo; Cozzani, Alessandro; Kiley, Andrew

    2017-11-01

    Test planning and post-test correlation activity have been issues of growing importance in the last few decades, and many methodologies have been developed to either quantify or improve the correlation between computational and experimental results. In this article the methodologies established so far are enhanced with the implementation of a recently developed procedure called Virtual Testing. In the context of fixed-base sinusoidal tests (commonly used in the space sector for correlation), there are several factors in the test campaign that affect the behaviour of the satellite and are not normally taken into account when performing analyses: different boundary conditions created by the shaker's own dynamics, a non-perfect control system, signal delays, etc. All these factors are the core of the Virtual Testing implementation, which will be thoroughly explained in this article and applied to the specific case of the BepiColombo spacecraft tested on the ESA QUAD Shaker. Correlation activity will be performed at the various stages of the process, showing important improvements observed after applying the final complete methodology.

  8. Interactive Methods for Teaching Action Potentials, an Example of Teaching Innovation from Neuroscience Postdoctoral Fellows in the Fellowships in Research and Science Teaching (FIRST) Program.

    PubMed

    Keen-Rhinehart, E; Eisen, A; Eaton, D; McCormack, K

    2009-01-01

    Acquiring a faculty position in academia is extremely competitive and now typically requires more than just solid research skills and knowledge of one's field. Recruiting institutions currently desire new faculty that can teach effectively, but few postdoctoral positions provide any training in teaching methods. Fellowships in Research and Science Teaching (FIRST) is a successful postdoctoral training program funded by the National Institutes of Health (NIH) providing training in both research and teaching methodology. The FIRST program provides fellows with outstanding interdisciplinary biomedical research training in fields such as neuroscience. The postdoctoral research experience is integrated with a teaching program which includes a How to Teach course, instruction in classroom technology and course development and mentored teaching. During their mentored teaching experiences, fellows are encouraged to explore innovative teaching methodologies and to perform science teaching research to improve classroom learning. FIRST fellows teaching neuroscience to undergraduates have observed that many of these students have difficulty with the topic of neuroscience. Therefore, we investigated the effects of interactive teaching methods for this topic. We tested two interactive teaching methodologies to determine if they would improve learning and retention of this information when compared with standard lectures. The interactive methods for teaching action potentials increased understanding and retention. Therefore, FIRST provides excellent teaching training, partly by enhancing the ability of fellows to integrate innovative teaching methods into their instruction. This training in turn provides fellows that matriculate from this program more of the characteristics that hiring institutions desire in their new faculty.

  9. Interactive Methods for Teaching Action Potentials, an Example of Teaching Innovation from Neuroscience Postdoctoral Fellows in the Fellowships in Research and Science Teaching (FIRST) Program

    PubMed Central

    Keen-Rhinehart, E.; Eisen, A.; Eaton, D.; McCormack, K.

    2009-01-01

    Acquiring a faculty position in academia is extremely competitive and now typically requires more than just solid research skills and knowledge of one’s field. Recruiting institutions currently desire new faculty that can teach effectively, but few postdoctoral positions provide any training in teaching methods. Fellowships in Research and Science Teaching (FIRST) is a successful postdoctoral training program funded by the National Institutes of Health (NIH) providing training in both research and teaching methodology. The FIRST program provides fellows with outstanding interdisciplinary biomedical research training in fields such as neuroscience. The postdoctoral research experience is integrated with a teaching program which includes a How to Teach course, instruction in classroom technology and course development and mentored teaching. During their mentored teaching experiences, fellows are encouraged to explore innovative teaching methodologies and to perform science teaching research to improve classroom learning. FIRST fellows teaching neuroscience to undergraduates have observed that many of these students have difficulty with the topic of neuroscience. Therefore, we investigated the effects of interactive teaching methods for this topic. We tested two interactive teaching methodologies to determine if they would improve learning and retention of this information when compared with standard lectures. The interactive methods for teaching action potentials increased understanding and retention. Therefore, FIRST provides excellent teaching training, partly by enhancing the ability of fellows to integrate innovative teaching methods into their instruction. This training in turn provides fellows that matriculate from this program more of the characteristics that hiring institutions desire in their new faculty. PMID:23493377

  10. Integrating Test-Form Formatting into Automated Test Assembly

    ERIC Educational Resources Information Center

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
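    As a toy illustration of the mixed integer programming formulation mentioned above, the hedged sketch below assembles a fixed-length test form that maximizes total item information subject to a content quota. It assumes the open-source PuLP modeling library; the item parameters, constraints, and names are invented for illustration and are not from the article.

```python
import pulp

n_items = 100
info = [0.2 + 0.01 * (i % 50) for i in range(n_items)]  # item information (invented)
is_geometry = [i % 4 == 0 for i in range(n_items)]      # content attribute (invented)

prob = pulp.LpProblem("test_assembly", pulp.LpMaximize)
x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(n_items)]

prob += pulp.lpSum(info[i] * x[i] for i in range(n_items))               # objective
prob += pulp.lpSum(x) == 30                                              # test length
prob += pulp.lpSum(x[i] for i in range(n_items) if is_geometry[i]) >= 8  # content quota

prob.solve()
form = [i for i in range(n_items) if x[i].value() == 1]
print(len(form), "items selected")
```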

  11. Experimental Methodology for Measuring Combustion and Injection-Coupled Responses

    NASA Technical Reports Server (NTRS)

    Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.

    2006-01-01

    A Russian scaling methodology for liquid rocket engines utilizing a single, full-scale element is reviewed. The scaling methodology exploits the supercritical phase of the full-scale propellants to simplify scaling requirements. Many assumptions are utilized in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify the assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for an RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.

  12. Data Mining: A Hybrid Methodology for Complex and Dynamic Research

    ERIC Educational Resources Information Center

    Lang, Susan; Baehr, Craig

    2012-01-01

    This article provides an overview of the ways in which data and text mining have potential as research methodologies in composition studies. It introduces data mining in the context of the field of composition studies and discusses ways in which this methodology can complement and extend our existing research practices by blending the best of what…

  13. Determining Faculty and Student Views: Applications of Q Methodology in Higher Education

    ERIC Educational Resources Information Center

    Ramlo, Susan

    2012-01-01

    William Stephenson specifically developed Q methodology, or Q, as a means of measuring subjectivity. Q has been used to determine perspectives/views in a wide variety of fields from marketing research to political science but less frequently in education. In higher education, the author has used Q methodology to determine views about a variety of…

  14. A Methodological Review of the Articles Published in "Georgia Educational Researcher" from 2003-2010

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Griffin, Andrea E.; Zeiger, Samara R.; Falbe, Kristina N.; Freeman, Noreen A.; Taylor, Bridget E.; Westbrook, Amy F.; Lico, Cheryl C.; Starling, Cristy N.; Sprull, Nakiesha M.; Holt, Carolyn; Smith, Kristie; McAnespie, Hannah

    2011-01-01

    Methodological reviews, reviews that concentrate on research methods rather than research outcomes, have been used in a variety of fields to improve research practice, inform debate, and identify islands of practice. In this article, we report on the results of a methodological review of all of the articles published in "Georgia Educational…

  15. Development of Techniques for Visualization of Scalar and Vector Fields in the Immersive Environment

    NASA Technical Reports Server (NTRS)

    Bidasaria, Hari B.; Wilson, John W.; Nealy, John E.

    2005-01-01

    Visualization of scalar and vector fields in the immersive environment (CAVE, Cave Automatic Virtual Environment) is important for its application to radiation shielding research at NASA Langley Research Center. A complete methodology and the underlying software for this purpose have been developed. The developed software has been put to use for the visualization of the earth's magnetic field, and in particular for the study of the South Atlantic Anomaly. The methodology has also been put to use for the visualization of geomagnetically trapped protons and electrons within Earth's magnetosphere.

  16. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…

  17. Participatory methodologies in research with children: creative and innovative approaches.

    PubMed

    Pereira, Viviane Ribeiro; Coimbra, Valéria Cristina Christello; Cardoso, Clarissa de Souza; Oliveira, Naiana Alves; Vieira, Ana Cláudia Garcia; Nobre, Márcia de Oliveira; Nino, Magda Eliete Lamas

    2017-05-18

    To describe the use of participatory methodologies in research with children. This is an experience report with a qualitative approach, conducted with children between six and eleven years of age from a municipal school in Pelotas and from the Psychosocial Children and Youth Care Center in São Lourenço do Sul, both municipalities of the Rio Grande do Sul State. Data collection was based on records made in field and observation diaries from April to July 2016. The report showed that Photovoice promoted motivation in the group, in addition to increasing the self-esteem and self-confidence of the children. The Five Field Map made it possible to help children express feelings through play. Photovoice and the Five Field Map are seen as tools that enable new methodological approaches in research with children, facilitating the construction of the proposed activities and supporting innovative and creative research processes in health/nursing.

  18. Inter-ethnic variation of ocular traits-design and methodology of comparison study among American Caucasians, American Chinese and mainland Chinese.

    PubMed

    Wang, Dan Dan; Huang, Guo Fu; He, Ming Guang; Wu, Ling Ling; Lin, Shan

    2011-03-01

    To summarize the design and methodology of a multi-center study. Given the known ethnic differences in glaucoma, this survey will explore differences in anterior and posterior ocular segment parameters between Caucasians and Chinese. In this study, four cohorts, including American Caucasians and American Chinese from San Francisco, southern mainland Chinese from Guangzhou, and northern mainland Chinese from Beijing, were prospectively enrolled for a series of eye examinations and tests from May 2008 to December 2010. A total of 120 subjects, including 15 of each gender in each age decade from the 40s to the 70s, were recruited for each group. Data from the following tests were collected: a questionnaire eliciting systemic and ocular disease history, blood pressure, presenting and best corrected visual acuity, auto-refraction, Goldmann applanation tonometry, gonioscopy, A-scan, anterior segment optical coherence tomography (ASOCT), ultrasound biomicroscopy (UBM), visual field (VF), Heidelberg retinal tomography (HRT), OCT for the optic nerve, and digital fundus photography. This study will provide insights into the etiologies of glaucoma, especially PACG, through inter-ethnic comparisons of relevant ocular anatomic and functional parameters.

  19. Dry-Surface Simulation Method for the Determination of the Work of Adhesion of Solid-Liquid Interfaces.

    PubMed

    Leroy, Frédéric; Müller-Plathe, Florian

    2015-08-04

    We introduce a methodology, referred to as the dry-surface method, to calculate the work of adhesion of heterogeneous solid-liquid interfaces by molecular simulation. This method employs a straightforward thermodynamic integration approach to calculate the work of adhesion as the reversible work to turn off the attractive part of the actual solid-liquid interaction potential. It is formulated in such a way that it may be used either to evaluate the ability of force fields to reproduce reference values of the work of adhesion or to optimize force-field parameters with reference values of the work of adhesion as target quantities. The methodology is tested in the case of water on a generic model of nonpolar substrates with the structure of gold. It is validated through a quantitative comparison to phantom-wall calculations and against a previous characterization of the thermodynamics of the gold-water interface. It is found that the work of adhesion of water on nonpolar substrates is a nonlinear function of the microscopic solid-liquid interaction energy parameter. We also comment on the ability of mean-field approaches to predict the work of adhesion of water on nonpolar substrates. In addition, we discuss in detail the information on the solid-liquid interfacial thermodynamics delivered by the phantom-wall approach. We show that phantom-wall calculations yield the solid-liquid interfacial tension relative to the solid surface tension rather than the absolute solid-liquid interfacial tension as previously believed.
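    In thermodynamic integration form, the reversible work described above is typically obtained from an expression of the following kind, where lambda scales the attractive part of the solid-liquid potential from fully on (lambda = 1) to off (lambda = 0). The sign and normalization conventions below are our assumptions; the paper's exact coupling path may differ.

```latex
% Work of adhesion per unit interfacial area A from thermodynamic
% integration, assuming U_SL(lambda) = U_rep + lambda * U_att:
W_{\mathrm{adh}} \,=\, -\frac{1}{A}\int_{0}^{1}
  \left\langle \frac{\partial U_{\mathrm{SL}}(\lambda)}{\partial \lambda}
  \right\rangle_{\lambda}\,\mathrm{d}\lambda
```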

  20. Community Profiling of Fusarium in Combination with Other Plant-Associated Fungi in Different Crop Species Using SMRT Sequencing.

    PubMed

    Walder, Florian; Schlaeppi, Klaus; Wittwer, Raphaël; Held, Alain Y; Vogelgsang, Susanne; van der Heijden, Marcel G A

    2017-01-01

    Fusarium head blight, caused by fungi from the genus Fusarium, is one of the most harmful cereal diseases, resulting not only in severe yield losses but also in mycotoxin contaminated and health-threatening grains. Fusarium head blight is caused by a diverse set of species that have different host ranges, mycotoxin profiles and responses to agricultural practices. Thus, understanding the composition of Fusarium communities in the field is crucial for estimating their impact and also for the development of effective control measures. Up to now, most molecular tools that monitor Fusarium communities on plants are limited to certain species and do not distinguish other plant associated fungi. To close these gaps, we developed a sequencing-based community profiling methodology for crop-associated fungi with a focus on the genus Fusarium. By analyzing a 1600 bp long amplicon spanning the highly variable segments ITS and D1-D3 of the ribosomal operon by PacBio SMRT sequencing, we were able to robustly quantify Fusarium down to species level through clustering against reference sequences. The newly developed methodology was successfully validated in mock communities and provided similar results as the culture-based assessment of Fusarium communities by seed health tests in grain samples from different crop species. Finally, we exemplified the newly developed methodology in a field experiment with a wheat-maize crop sequence under different cover crop and tillage regimes. We analyzed wheat straw residues, cover crop shoots and maize grains and we could reveal that the cover crop hairy vetch (Vicia villosa) acts as a potent alternative host for Fusarium (OTU F.ave/tri) showing an eightfold higher relative abundance compared with other cover crop treatments. Moreover, as the newly developed methodology also allows to trace other crop-associated fungi, we found that vetch and green fallow hosted further fungal plant pathogens including Zymoseptoria tritici. Thus, besides their beneficial traits, cover crops can also entail phytopathological risks by acting as alternative hosts for Fusarium and other noxious plant pathogens. The newly developed sequencing based methodology is a powerful diagnostic tool to trace Fusarium in combination with other fungi associated to different crop species.

  1. Community Profiling of Fusarium in Combination with Other Plant-Associated Fungi in Different Crop Species Using SMRT Sequencing

    PubMed Central

    Walder, Florian; Schlaeppi, Klaus; Wittwer, Raphaël; Held, Alain Y.; Vogelgsang, Susanne; van der Heijden, Marcel G. A.

    2017-01-01

    Fusarium head blight, caused by fungi from the genus Fusarium, is one of the most harmful cereal diseases, resulting not only in severe yield losses but also in mycotoxin contaminated and health-threatening grains. Fusarium head blight is caused by a diverse set of species that have different host ranges, mycotoxin profiles and responses to agricultural practices. Thus, understanding the composition of Fusarium communities in the field is crucial for estimating their impact and also for the development of effective control measures. Up to now, most molecular tools that monitor Fusarium communities on plants are limited to certain species and do not distinguish other plant associated fungi. To close these gaps, we developed a sequencing-based community profiling methodology for crop-associated fungi with a focus on the genus Fusarium. By analyzing a 1600 bp long amplicon spanning the highly variable segments ITS and D1–D3 of the ribosomal operon by PacBio SMRT sequencing, we were able to robustly quantify Fusarium down to species level through clustering against reference sequences. The newly developed methodology was successfully validated in mock communities and provided similar results as the culture-based assessment of Fusarium communities by seed health tests in grain samples from different crop species. Finally, we exemplified the newly developed methodology in a field experiment with a wheat-maize crop sequence under different cover crop and tillage regimes. We analyzed wheat straw residues, cover crop shoots and maize grains and we could reveal that the cover crop hairy vetch (Vicia villosa) acts as a potent alternative host for Fusarium (OTU F.ave/tri) showing an eightfold higher relative abundance compared with other cover crop treatments. Moreover, as the newly developed methodology also allows to trace other crop-associated fungi, we found that vetch and green fallow hosted further fungal plant pathogens including Zymoseptoria tritici. Thus, besides their beneficial traits, cover crops can also entail phytopathological risks by acting as alternative hosts for Fusarium and other noxious plant pathogens. The newly developed sequencing based methodology is a powerful diagnostic tool to trace Fusarium in combination with other fungi associated to different crop species. PMID:29234337

  2. Development and Validation of a Translation Test.

    ERIC Educational Resources Information Center

    Ghonsooly, Behzad

    1993-01-01

    Translation testing methodology has been criticized for its subjective character. No real strides have so far been made in developing an objective translation test. In this paper, certain detailed procedures including various phases of pretesting have been performed to achieve objectivity and scorability in translation testing methodology. In…

  3. Assays of homeopathic remedies in rodent behavioural and psychopathological models.

    PubMed

    Bellavite, Paolo; Magnani, Paolo; Marzotto, Marta; Conforti, Anita

    2009-10-01

    The first part of this paper reviews the effects of homeopathic remedies on several models of anxiety-like behaviours developed and described in rodents. The existing literature in this field comprises some fifteen exploratory studies, often published in non-indexed and non-peer-reviewed journals. Only a few results have been confirmed by multiple laboratories, and these concern Ignatia, Gelsemium, and Chamomilla (in homeopathic dilutions/potencies). Nevertheless, there are some interesting results pointing to the possible efficacy of other remedies, and confirming a statistically significant effect of high dilutions of neurotrophic molecules and antibodies. In the second part of this paper we report some recent results obtained in our laboratory, testing Aconitum, Nux vomica, Belladonna, Argentum nitricum, Tabacum (all 5CH potency) and Gelsemium (5, 7, 9 and 30CH potencies) on mice using ethological models of behaviour. The test was performed using coded drugs and controls in double blind (operations and calculations). After an initial screening that showed all the tested remedies (except for Belladonna) to have some effects on the behavioural parameters (light-dark test and open-field test), but with high experimental variability, we focused our study on Gelsemium and carried out two complete series of experiments. The results showed that Gelsemium had several effects on the exploratory behaviour of mice, which in some models were highly statistically significant (p < 0.001), in all the dilutions/dynamizations used, but with complex differences according to the experimental conditions and test performed. Finally, some methodological issues of animal research in this field of homeopathy are discussed. The "Gelsemium model" - encompassing experimental studies in vitro and in vivo from different laboratories and with different methods, including significant effects of its major active principle gelsemine - may play a pivotal role in investigations of other homeopathic remedies.

  4. Developments toward more accurate molecular modeling of liquids

    NASA Astrophysics Data System (ADS)

    Evans, Tom J.

    2000-12-01

    The general goal of this research has been to improve upon existing combined quantum mechanics/molecular mechanics (QM/MM) methodologies. Error-weighting functions have been introduced into the perturbative Monte Carlo (PMC) method for use with QM/MM. The PMC approach, introduced earlier, provides a means to reduce the number of full self-consistent field (SCF) calculations in simulations using the QM/MM potential by invoking perturbation theory to calculate energy changes due to displacements of an MM molecule. This will allow the ab initio QM/MM approach to be applied to systems that require more advanced, computationally demanding treatments of the QM and/or MM regions. Efforts have also been made to improve the accuracy of the representation of the solvent molecules usually represented by MM force fields. Results from an investigation of the applicability of the embedded density functional theory (EDFT) for studying physical properties of solutions are presented. In this approach, the solute wavefunction is solved self-consistently in the field of individually frozen electron-density solvent molecules. To test its accuracy, the potential curves for interactions of Li+, Cl-, and H2O with a single frozen-density H2O molecule in different orientations have been calculated. With the development of the more sophisticated effective fragment potential (EFP) representation of solvent molecules, a QM/EFP technique was created. This hybrid QM/EFP approach was used to investigate the solvation of Li+ by small clusters of water, as a test case for larger ionic clusters. The EFP appears to provide an accurate representation of the strong interactions that exist between Li+ and H2O. With the QM/EFP methodology comes an increased computational expense, resulting in an even greater need to rely on the PMC approach. However, while incorporating the PMC into the hybrid QM/EFP technique, it was discovered that the previous implementation of the PMC was incorrect, invalidating earlier test results. The PMC implementation was therefore reworked, and tests were performed to investigate the method's usefulness in reducing the computational load of these types of simulations. The results obtained while studying F-(H2O) and F-(H2O)2 show that PMC can be used cautiously to increase computational efficiency.

  5. Lagrangian condensation microphysics with Twomey CCN activation

    NASA Astrophysics Data System (ADS)

    Grabowski, Wojciech W.; Dziekan, Piotr; Pawlowska, Hanna

    2018-01-01

    We report the development of a novel Lagrangian microphysics methodology for simulations of warm ice-free clouds. The approach applies the traditional Eulerian method for the momentum and continuous thermodynamic fields such as the temperature and water vapor mixing ratio, and uses Lagrangian super-droplets to represent the condensed phase, such as cloud droplets and drizzle or rain drops. In other applications of the Lagrangian warm-rain microphysics, the super-droplets outside clouds represent unactivated cloud condensation nuclei (CCN) that become activated upon entering a cloud and can further grow through diffusional and collisional processes. The original methodology allows for the detailed study not only of the effects of CCN on cloud microphysics and dynamics, but also of CCN processing by a cloud. However, when cloud processing is not of interest, a simpler and computationally more efficient approach can be used, with super-droplets forming only when CCN are activated and no super-droplets existing outside a cloud. This is possible by applying the Twomey activation scheme, where the local supersaturation dictates the concentration of cloud droplets that need to be present inside a cloudy volume, as typically used in Eulerian bin microphysics schemes. Since a cloud volume is a small fraction of the computational domain volume, the Twomey super-droplets provide a significant computational advantage when compared to the original super-droplet methodology. An additional advantage comes from the significantly longer time steps that can be used when modeling of CCN deliquescence is avoided. Moreover, other formulations of droplet activation can be applied in the case of low vertical resolution of the host model, for instance, linking the concentration of activated cloud droplets to the local updraft speed. This paper discusses the development and testing of the Twomey super-droplet methodology, focusing on activation and diffusional growth. Details of the activation implementation, the transport of super-droplets in physical space, and the coupling between super-droplets and the Eulerian temperature and water vapor fields are discussed in detail. Some of these are relevant to the original super-droplet methodology as well, and to ice-phase modeling using the Lagrangian approach. As a computational example, the scheme is applied to an idealized moist thermal rising in a stratified environment, with the original super-droplet methodology providing a benchmark to which the new scheme is compared.
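    The key step of the Twomey scheme described above is that the local supersaturation dictates how many droplets a cloudy grid volume should contain, and new super-droplets are created to make up any deficit. The sketch below illustrates that bookkeeping with the classic Twomey activation spectrum N(s) = C s^k; the parameter values are illustrative assumptions, not the paper's.

```python
def twomey_ccn_number(s_percent, c=100.0e6, k=0.5):
    """Twomey activation spectrum N(s) = C * s^k: cumulative number
    concentration (m^-3) of CCN active at supersaturation s (%)."""
    return c * max(s_percent, 0.0) ** k

def new_droplets_needed(s_percent, n_existing):
    """Concentration of droplets to create (via new super-droplets) so
    the cell matches the Twomey spectrum at its supersaturation."""
    return max(twomey_ccn_number(s_percent) - n_existing, 0.0)

# Example: a cell at 0.4% supersaturation already holding 40 cm^-3:
print(new_droplets_needed(0.4, 40.0e6) / 1.0e6, "cm^-3 to activate")
```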

  6. [Conceptual and methodological issues involved in the research field of diagnostic reasoning].

    PubMed

    Di Persia, Francisco N

    2016-05-01

    The psychopathological field is crossed by dilemmas that call into question its methodological, conceptual, and philosophical filiations. From the early works of Ey and Jaspers to the recent work of Berrios, the position of psychopathology within medicine in general, and within psychiatry in particular, has been in question: should it follow the principles of natural science, or does it hold an autonomous position between them? This debate has led to two opposing positions reflecting two different models of psychopathology: the biomedical model and the socio-constructionist model. This work reviews the scope and difficulties of each model along two central axes: diagnostic reasoning and the conceptual problem of mental illness. Then, as a synthesis of the proposed analysis, central concepts of each model are identified that could allow the development of a hybrid model in psychopathology; among them, the comprehensive framework employed in symptom recognition and the social component that characterizes it are highlighted. In conclusion, these concepts are proposed as central aspects for the conceptual and methodological clarification of the research field of diagnostic reasoning in psychopathology.

  7. 76 FR 52892 - Energy Conservation Program: Energy Conservation Standards for Fluorescent Lamp Ballasts

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-24

    ... between the DOE test data and the data submitted by NEMA; describe the methodological changes DOE is... differences between test data obtained by DOE and test data submitted by NEMA; (3) describe the methodological...

  8. Measure Guideline: Combined Space and Water Heating Installation and Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoenbauer, B.; Bohac, D.; Huelman, P.

    Combined space and water heater (combi or combo) systems are defined by their dual functionality. Combi systems provide both space heating and water heating capabilities with a single heat source. This guideline focuses on the installation and operation of residential systems with forced-air heating and domestic hot water (DHW) functionality. Past NorthernSTAR research has used a combi system to replace a natural gas forced-air furnace and tank-type water heater (Schoenbauer et al. 2012; Schoenbauer, Bohac, and McAlpine 2014). The combi systems consisted of a water heater or boiler heating plant teamed with a hydronic air handler that included an air handler, water coil, and water pump to circulate water between the heating plant and coil. The combi water heater or boiler had a separate circuit for DHW. Past projects focused on laboratory testing, field characterization, and control optimization of combi systems. Laboratory testing was done to fully characterize and test combi system components; field testing was completed to characterize the installed performance of combi systems; and control methodologies were analyzed to understand the potential of controls to simplify installation and design and to improve system efficiency and occupant comfort. This past work was relied upon to create this measure guideline.

  9. Methodological strategies in using home sleep apnea testing in research and practice.

    PubMed

    Miller, Jennifer N; Schulz, Paula; Pozehl, Bunny; Fiedler, Douglas; Fial, Alissa; Berger, Ann M

    2017-11-14

    Home sleep apnea testing (HSAT) has increased due to improvements in technology, accessibility, and changes in third-party reimbursement requirements. Research studies using HSAT have not consistently reported procedures and methodological challenges. This paper had two objectives: (1) summarize the literature on the use of HSAT in research on adults and (2) identify methodological strategies to use in research and practice to standardize HSAT procedures and information. The search strategy included studies of participants undergoing sleep testing for obstructive sleep apnea (OSA) using HSAT; MEDLINE via PubMed, CINAHL, and Embase were searched with the following terms: "polysomnography," "home," "level III," "obstructive sleep apnea," and "out of center testing." Research articles that met inclusion criteria (n = 34) inconsistently reported methods and methodological challenges in terms of: (a) participant sampling; (b) instrumentation issues; (c) clinical variables; (d) data processing; and (e) patient acceptability. Ten methodological strategies were identified for adoption when using HSAT in research and practice. Future studies need to address the methodological challenges summarized in this paper as well as identify and report consistent HSAT procedures and information.

  10. Real-time PCR detection of Plasmodium directly from whole blood and filter paper samples

    PubMed Central

    2011-01-01

    Background Real-time PCR is a sensitive and specific method for the analysis of Plasmodium DNA. However, prior purification of genomic DNA from blood is necessary since PCR inhibitors and quenching of fluorophores from blood prevent efficient amplification and detection of PCR products. Methods Reagents designed to specifically overcome PCR inhibition and quenching of fluorescence were evaluated for real-time PCR amplification of Plasmodium DNA directly from blood. Whole blood from clinical samples and dried blood spots collected in the field in Colombia were tested. Results Amplification and fluorescence detection by real-time PCR were optimal with 40× SYBR® Green dye and 5% blood volume in the PCR reaction. Plasmodium DNA was detected directly from both whole blood and dried blood spots from clinical samples. The sensitivity and specificity ranged from 93-100% compared with PCR performed on purified Plasmodium DNA. Conclusions The methodology described facilitates high-throughput testing of blood samples collected in the field by fluorescence-based real-time PCR. This method can be applied to a broad range of clinical studies with the advantages of immediate sample testing, lower experimental costs and time-savings. PMID:21851640

  11. Automated test-site radiometer for vicarious calibration

    NASA Astrophysics Data System (ADS)

    Li, Xin; Yin, Ya-peng; Liu, En-chao; Zhang, Yan-na; Xun, Li-na; Wei, Wei; Zhang, Zhi-peng; Qiu, Gang-gang; Zhang, Quan; Zheng, Xiao-bing

    2014-11-01

    In order to realize unmanned vicarious calibration, the Automated Test-site Radiometer (ATR) was developed for surface reflectance measurements. ATR samples the spectrum from 400 nm to 1600 nm with 8 interference filters coupled with silicon and InGaAs detectors. The field of view of each channel is 10° with parallel optical axes. One SWIR channel lies in the center and the other seven VNIR channels are on a circle of 4.8 cm diameter, which guarantees that each channel views nearly the same section of ground. The optical head as a whole is temperature controlled using a TE cooler for greater stability and lower noise. ATR is powered by a solar panel and transmits its data through a BDS (China's BeiDou Navigation Satellite System) terminal, enabling long-term measurements without personnel on site. ATR was deployed at the Dunhuang test site, viewing a ground field of about 30 cm diameter, for multi-spectral reflectance measurements. Other instruments at the site include a Cimel sunphotometer and a diffuser-to-globe irradiance meter for atmosphere observations. The methodology for band-averaged reflectance retrieval and the hyperspectral reflectance fitting process are described. The hyperspectral reflectance and atmospheric parameters are then input into the 6S code to predict TOA radiance, which is compared with MODIS radiance.
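
    As a rough illustration of the band-averaged reflectance retrieval mentioned above, the sketch below weights a hyperspectral surface reflectance by a channel's filter response (and optionally by the incident irradiance) over the passband. The function name, toy spectra, and the optional irradiance weighting are assumptions for illustration; the paper's actual retrieval and fitting procedure is not reproduced here.

```python
import numpy as np

def band_averaged_reflectance(reflectance, response, irradiance=None):
    """Discrete band average of a hyperspectral surface reflectance over
    one channel, weighted by the filter response (and optionally by the
    incident irradiance); assumes an evenly sampled wavelength grid."""
    w = response if irradiance is None else response * irradiance
    return np.sum(reflectance * w) / np.sum(w)

# toy channel: Gaussian filter response centered at 650 nm on a 1 nm grid
wl = np.arange(400, 901)                        # wavelength grid [nm]
resp = np.exp(-0.5 * ((wl - 650) / 10.0) ** 2)  # channel response
refl = 0.2 + 1e-4 * (wl - 400)                  # slowly rising desert-like slope
print(band_averaged_reflectance(refl, resp))
```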

  12. Collection of Infrasonic Sound From Sources of Military Importance

    NASA Technical Reports Server (NTRS)

    Masterman, Michael; Shams, Qamar A.; Burkett, Cecil G.; Zuckerwar, Allan J.; Stihler, Craig; Wallace, Jack

    2008-01-01

    Extreme Endeavors is collaborating with NASA Langley Research Center (LaRC) in the development, testing, and analysis of an infrasonic detection system under a Space Act Agreement. Acoustic studies of atmospheric events like convective storms, shear-induced turbulence, acoustic gravity waves, microbursts, hurricanes, and clear air turbulence (CAT) over the past thirty years have established that these events are strong emitters of infrasound. Recently NASA Langley Research Center designed and developed a portable infrasonic detection system which can be used to make useful infrasound measurements at locations where it was not possible previously, such as a mountain crag, inside a cave, or on the battlefield. The system comprises an electret condenser microphone, having a 3-inch membrane diameter, and a small, compact windscreen. Extreme Endeavors will present the findings from field testing using this portable infrasonic detection system. Field testing of the infrasonic detection system was partly funded by Greer Industries, with support provided by the West Virginia Division of Natural Resources. The findings from this work illustrate the ability to detect structure and other information about the contents of caves. The presentation will describe a methodology for utilizing infrasound to locate and portray underground facilities.

  13. Solving a methodological challenge in work stress evaluation with the Stress Assessment and Research Toolkit (StART): a study protocol.

    PubMed

    Guglielmi, Dina; Simbula, Silvia; Vignoli, Michela; Bruni, Ilaria; Depolo, Marco; Bonfiglioli, Roberta; Tabanelli, Maria Carla; Violante, Francesco Saverio

    2013-06-22

    Stress evaluation is a field of strong interest, and one that remains challenging due to several methodological aspects of the evaluation process. The aim of this study is to propose a study protocol to test a new method (i.e., the Stress Assessment and Research Toolkit) to assess psychosocial risk factors at work. This method addresses several methodological issues (e.g., subjective vs. objective, qualitative vs. quantitative data) by assessing work-related stressors using different kinds of data: i) organisational archival data (organisational indicators sheet); ii) qualitative data (focus group); iii) worker perception (questionnaire); and iv) observational data (observational checklist), using mixed methods research. In addition, it allows positive and negative aspects of work to be considered conjointly, using an approach that considers job demands and job resources at the same time. The integration of these sources of data can reduce the theoretical and methodological bias related to stress research in the work setting, allows researchers and professionals to obtain a reliable description of workers' stress, providing a more articulate vision of psychosocial risks, and allows a large amount of data to be collected. Finally, the implementation of the method ensures long-term primary prevention for psychosocial risk management in that it aims to reduce or modify the intensity, frequency, or duration of organisational demands.

  14. Solving a methodological challenge in work stress evaluation with the Stress Assessment and Research Toolkit (StART): a study protocol

    PubMed Central

    2013-01-01

    Background Stress evaluation is a field of strong interest, and one that remains challenging due to several methodological aspects of the evaluation process. The aim of this study is to propose a study protocol to test a new method (i.e., the Stress Assessment and Research Toolkit) to assess psychosocial risk factors at work. Design This method addresses several methodological issues (e.g., subjective vs. objective, qualitative vs. quantitative data) by assessing work-related stressors using different kinds of data: i) organisational archival data (organisational indicators sheet); ii) qualitative data (focus group); iii) worker perception (questionnaire); and iv) observational data (observational checklist), using mixed methods research. In addition, it allows positive and negative aspects of work to be considered conjointly, using an approach that considers job demands and job resources at the same time. Discussion The integration of these sources of data can reduce the theoretical and methodological bias related to stress research in the work setting, allows researchers and professionals to obtain a reliable description of workers' stress, providing a more articulate vision of psychosocial risks, and allows a large amount of data to be collected. Finally, the implementation of the method ensures long-term primary prevention for psychosocial risk management in that it aims to reduce or modify the intensity, frequency, or duration of organisational demands. PMID:23799950

  15. A proof of concept study to assess the potential of PCR testing to detect natural Mycobacterium bovis infection in South American camelids.

    PubMed

    Crawshaw, Timothy R; Chanter, Jeremy I; McGoldrick, Adrian; Line, Kirsty

    2014-02-07

    Cases of Mycobacterium bovis infection in South American camelids have been increasing in Great Britain. Current antemortem immunological tests have some limitations, and cases at post-mortem examination frequently show extensive pathology. The feasibility of detecting Mycobacterium bovis DNA in clinical samples was investigated. A sensitive extraction methodology was developed and used on nasal swabs and faeces taken post-mortem to assess the potential for a PCR test to detect Mycobacterium bovis in clinical samples. The gross pathology of the studied South American camelids was scored, and a significantly greater proportion of South American camelids with more severe pathology were positive in both the nasal swab and faecal PCR tests. A combination of the nasal swab and faecal PCR tests detected 63.9% of all the South American camelids with pathology that were tested. The results suggest that antemortem diagnosis of Mycobacterium bovis in South American camelids may be possible using a PCR test on clinical samples; however, more work is required to determine sensitivity and specificity, and the practicalities of applying the test in the field.

  16. Quantification and regionalization of groundwater recharge in South-Central Kansas: Integrating field characterization, statistical analysis, and GIS

    USGS Publications Warehouse

    Sophocleous, M.

    2000-01-01

    A practical methodology for recharge characterization was developed based on several years of field-oriented research at 10 sites in the Great Bend Prairie of south-central Kansas. This methodology combines the soil-water budget on a storm-by-storm, year-round basis with the resulting water-table rises. The estimated 1985-1992 average annual recharge was less than 50 mm/year, with a range from 15 mm/year (during the 1988 drought) to 178 mm/year (during the 1993 flood year). Most of this recharge occurs during the spring months. To regionalize these site-specific estimates, an additional methodology based on multiple (forward) regression analysis combined with classification and GIS overlay analyses was developed and implemented. The multiple regression analysis showed that the most influential variables were, in order of decreasing importance, total annual precipitation, average maximum springtime soil-profile water storage, average shallowest springtime depth to water table, and average springtime precipitation rate. Therefore, four GIS (ARC/INFO) data "layers" or coverages were constructed for the study region based on these four variables, and each such coverage was classified into the same number of data classes to avoid biasing the results. The normalized regression coefficients were employed to weight the class rankings of each recharge-affecting variable. This approach resulted in recharge zonations that agreed well with the site recharge estimates. During the "Great Flood of 1993," when rainfall totals exceeded normal levels by ~200% in the northern portion of the study region, the developed regionalization methodology was tested against such extreme conditions and proved to be both practical, being based on readily available or easily measurable data, and robust. It was concluded that the combination of multiple regression and GIS overlay analyses is a powerful and practical approach to regionalizing small samples of recharge estimates.
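
    A minimal sketch of the regression-weighted GIS overlay described above: each of the four classed coverages is multiplied by its normalized regression coefficient, and the products are summed into a recharge zonation score. The array shapes, class counts, and coefficient values are invented for illustration only.

```python
import numpy as np

def recharge_zonation(class_layers, norm_coeffs):
    """Sum classed GIS layers, weighting each layer's class ranking by the
    normalized regression coefficient of its recharge-affecting variable."""
    score = np.zeros(class_layers[0].shape, dtype=float)
    for layer, w in zip(class_layers, norm_coeffs):
        score += w * layer
    return score

# four toy 3x3 coverages, each classed into ranks 1..5 (same class count)
rng = np.random.default_rng(0)
layers = [rng.integers(1, 6, size=(3, 3)) for _ in range(4)]
coeffs = [0.45, 0.25, 0.20, 0.10]   # illustrative normalized coefficients
print(recharge_zonation(layers, coeffs))
```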

  17. Damage methodology approach on a composite panel based on a combination of Fringe Projection and 2D Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Felipe-Sesé, Luis; Díaz, Francisco A.

    2018-02-01

    The recent improvement in accessibility to high-speed digital cameras has enabled three-dimensional (3D) vibration measurements employing full-field optical techniques. Moreover, there is a need to develop a cost-effective and non-destructive testing method to quantify the severity of damage arising from impacts and thus enhance service life. This is of particular interest for composite structures, since possible internal damage has little external manifestation. Such damage has previously been studied experimentally using vibration testing; those analyses focused on variations in the modal frequencies or, more recently, on mode shape variations measured with point accelerometers or vibrometers. This paper presents an alternative method to investigate the severity of damage on a composite structure, and how the damage affects its integrity, through the analysis of the full-field modal behaviour. In this case, instead of point measurements, displacement maps are analysed by employing a combination of Fringe Projection and 2D Digital Image Correlation (FP + 2D-DIC) during vibration experiments on an industrial component. In addition, to analyse possible mode shape changes, differences between damaged and undamaged specimens are studied by employing a recent methodology based on an Adaptive Image Decomposition (AGMD) procedure. It is demonstrated that the AGMD procedure, which decomposes the displacement field into shape descriptors, is capable of detecting and quantifying differences between mode shapes. As an application example, the proposed approach has been evaluated on two large industrial components (car bonnets) made of short-fibre reinforced composite. Specifically, the evolution of normalized AGMD shape descriptors has been evaluated for three different components with different damage levels. The results demonstrate the potential of the presented approach, making it possible to measure the severity of structural damage by evaluating the mode shape through the analysis of its shape descriptors.

  18. Field Placement Treatments: A Comparative Study

    ERIC Educational Resources Information Center

    Parkison, Paul T.

    2008-01-01

    Field placement within teacher education represents a topic of interest for all preservice teacher programs. Present research addresses a set of important questions regarding field placement: (1) What pedagogical methodologies facilitate deep learning during field experiences? (2) Is there a significant difference in treatment effect for…

  19. From Unity to Diversity: Twenty-Five Years of Language-Teaching Methodology

    ERIC Educational Resources Information Center

    Larsen-Freeman, Diane

    2012-01-01

    This article was written for the 25th anniversary of "English Teaching Forum" and published in 1987. In this article, the author describes methodological developments in the field of English language teaching over the past 25 years. The author has found it helpful to think of methodology being depicted as a triangle, with each angle of the…

  20. Excavating and (Re)Presenting Stories: Narrative Inquiry as an Emergent Methodology in the Field of Adult Vocational Education and Technology

    ERIC Educational Resources Information Center

    Zimmerman, Aaron Samuel; Kim, Jeong-Hee

    2017-01-01

    Narrative inquiry has been a popular methodology in different disciplines for the last few decades. Using stories, narrative inquiry illuminates lived experience, serving as a valuable complement to research methodologies that are rooted in positivist epistemologies. In this article, we present a brief introduction to narrative inquiry including…

  1. Methodological Challenges in Researching Threshold Concepts: A Comparative Analysis of Three Projects

    ERIC Educational Resources Information Center

    Quinlan, K. M.; Male, S.; Baillie, C.; Stamboulis, A.; Fill, J.; Jaffer, Z.

    2013-01-01

    Threshold concepts were introduced nearly 10 years ago by Ray Land and Jan Meyer. This work has spawned four international conferences and hundreds of papers. Although the idea has clearly gained traction in higher education, this sub-field does not yet have a fully fledged research methodology or a strong critical discourse about methodology.…

  2. A mapping closure for turbulent scalar mixing using a time-evolving reference field

    NASA Technical Reports Server (NTRS)

    Girimaji, Sharath S.

    1992-01-01

    A general mapping-closure approach for modeling scalar mixing in homogeneous turbulence is developed. This approach differs from previous methods in that the reference field also evolves according to the same equations as the physical scalar field. The use of a time-evolving Gaussian reference field results in a model that is similar to the mapping closure model of Pope (1991), which is based on the methodology of Chen et al. (1989). Both models yield identical relationships between the scalar variance and higher-order moments, which are in good agreement with heat conduction simulation data and can be consistent with any type of ε(φ) evolution. The present methodology can be extended to any reference field whose behavior is known. The possibility of a beta-PDF reference field is explored. The shortcomings of the mapping closure methods are discussed, and the limit at which the mapping becomes invalid is identified.
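
    To make the mapping idea concrete, the sketch below constructs the monotone mapping X = F⁻¹(Φ(η)) that carries a Gaussian reference field η onto a physical scalar field with a prescribed one-point PDF (here a beta PDF, echoing the reference-field possibility mentioned above). This is only a static illustration of the mapping concept under assumed distributions, not the paper's time-evolving closure.

```python
import numpy as np
from scipy import stats

def map_reference_to_scalar(eta, target_dist):
    """Map a Gaussian reference-field sample eta onto the physical scalar
    field through the monotone mapping X = F^-1(Phi(eta)), so the mapped
    field has the one-point PDF of target_dist."""
    return target_dist.ppf(stats.norm.cdf(eta))

# beta-PDF physical scalar example (scalar bounded in [0, 1])
eta = np.random.default_rng(1).standard_normal(100_000)
phi = map_reference_to_scalar(eta, stats.beta(2.0, 5.0))
print(phi.var(), stats.moment(phi, 4))   # variance vs. a higher-order moment
```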

  3. Comparison of geochemical data obtained using four brine sampling methods at the SECARB Phase III Anthropogenic Test CO2 injection site, Citronelle Oil Field, Alabama

    USGS Publications Warehouse

    Conaway, Christopher; Thordsen, James J.; Manning, Michael A.; Cook, Paul J.; Trautz, Robert C.; Thomas, Burt; Kharaka, Yousif K.

    2016-01-01

    The chemical composition of formation water and associated gases from the lower Cretaceous Paluxy Formation was determined using four different sampling methods at a characterization well in the Citronelle Oil Field, Alabama, as part of the Southeast Regional Carbon Sequestration Partnership (SECARB) Phase III Anthropogenic Test, which is an integrated carbon capture and storage project. In this study, formation water and gas samples were obtained from well D-9-8 #2 at Citronelle using gas lift, an electric submersible pump, a U-tube, and a downhole vacuum sampler (VS), and subjected to both field and laboratory analyses. Field chemical analyses included electrical conductivity, dissolved sulfide concentration, alkalinity, and pH; laboratory analyses included major, minor and trace elements, dissolved carbon, volatile fatty acids, and free and dissolved gas species. The formation water obtained from this well is a Na–Ca–Cl-type brine with a salinity of about 200,000 mg/L total dissolved solids. Differences were evident between sampling methodologies, particularly in pH, Fe, and alkalinity. There was little gas in the samples, and gas composition results were strongly influenced by sampling methods. The results of the comparison demonstrate the difficulty and importance of preserving volatile analytes in samples, with the VS and U-tube system performing most favorably in this aspect.

  4. Malaria surveys using rapid diagnostic tests and validation of results using post hoc quantification of Plasmodium falciparum histidine-rich protein 2.

    PubMed

    Plucinski, Mateusz; Dimbu, Rafael; Candrinho, Baltazar; Colborn, James; Badiane, Aida; Ndiaye, Daouda; Mace, Kimberly; Chang, Michelle; Lemoine, Jean F; Halsey, Eric S; Barnwell, John W; Udhayakumar, Venkatachalam; Aidoo, Michael; Rogier, Eric

    2017-11-07

    Rapid diagnostic test (RDT) positivity is supplanting microscopy as the standard measure of malaria burden at the population level. However, there is currently no standard for externally validating RDT results from field surveys. Individuals' blood concentrations of the Plasmodium falciparum histidine-rich protein 2 (HRP2) were compared to the results of HRP2-detecting RDTs in participants from field surveys in Angola, Mozambique, Haiti, and Senegal. A logistic regression model was used to estimate the HRP2 concentrations corresponding to the 50% and 90% levels of detection (LOD) specific to each survey. There was a sigmoidal dose-response relationship between HRP2 concentration and RDT positivity for all surveys. Variation was noted in estimates of field RDT sensitivity, with the 50% LOD ranging between 0.076 and 6.1 ng/mL and the 90% LOD ranging between 1.1 and 53 ng/mL. Surveys conducted in two different provinces of Angola using the same brand of RDT and the same study methodology showed a threefold difference in LOD. Measures of malaria prevalence estimated using population RDT positivity should be interpreted in the context of potentially large variation in RDT LODs between, and even within, surveys. Surveys based on RDT positivity would benefit from external validation of field RDT results by comparing RDT positivity and antigen concentration.
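
    A minimal sketch of the LOD estimation described above: fit a logistic model of RDT positivity against log-concentration, then invert the fitted curve at 50% and 90% detection probability. The use of scikit-learn with a large C (to approximate an unpenalized fit), the synthetic survey data, and all variable names are assumptions for illustration; the paper's exact model specification is not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def lod_estimates(conc_ng_ml, rdt_positive):
    """Fit P(positive) = logistic(b0 + b1*log10(conc)) and invert the curve
    for the HRP2 concentrations giving 50% and 90% detection probability."""
    x = np.log10(conc_ng_ml).reshape(-1, 1)
    model = LogisticRegression(C=1e6).fit(x, rdt_positive)  # ~unpenalized fit
    b0, b1 = model.intercept_[0], model.coef_[0, 0]
    inv = lambda p: 10.0 ** ((np.log(p / (1.0 - p)) - b0) / b1)
    return inv(0.5), inv(0.9)

# synthetic survey: detection probability rises with log10 concentration
rng = np.random.default_rng(7)
conc = 10 ** rng.uniform(-2, 2, 500)                  # 0.01-100 ng/mL
p_true = 1 / (1 + np.exp(-(2.0 * np.log10(conc) - 0.5)))
pos = rng.random(500) < p_true
print(lod_estimates(conc, pos))                       # (LOD50, LOD90)
```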

  5. Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deline, Chris; MacAlpine, Sara; Marion, Bill

    2016-11-21

    1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including a controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.

  6. Evaluation and Field Assessment of Bifacial Photovoltaic Module Power Rating Methodologies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deline, Chris; MacAlpine, Sara; Marion, Bill

    2016-06-16

    1-sun power ratings for bifacial modules are currently undefined. This is partly because there is no standard definition of rear irradiance given 1000 Wm-2 on the front. Using field measurements and simulations, we evaluate multiple deployment scenarios for bifacial modules and provide details on the amount of irradiance that could be expected. A simplified case that represents a single module deployed under conditions consistent with existing 1-sun irradiance standards leads to a bifacial reference condition of 1000 Wm-2 Gfront and 130-140 Wm-2 Grear. For fielded systems of bifacial modules, Grear magnitude and spatial uniformity will be affected by self-shade from adjacent modules, varied ground cover, and ground-clearance height. A standard measurement procedure for bifacial modules is also currently undefined. A proposed international standard is under development, which provides the motivation for this work. Here, we compare outdoor field measurements of bifacial modules with irradiance on both sides with proposed indoor test methods where irradiance is only applied to one side at a time. The indoor method has multiple advantages, including a controlled and repeatable irradiance and thermal environment, along with allowing the use of conventional single-sided flash test equipment. The comparison results are promising, showing that the indoor and outdoor methods agree within 1%-2% for multiple rear-irradiance conditions and bifacial module types.

  7. Sampling flies or sampling flaws? Experimental design and inference strength in forensic entomology.

    PubMed

    Michaud, J-P; Schoenly, Kenneth G; Moreau, G

    2012-01-01

    Forensic entomology is an inferential science because postmortem interval estimates are based on the extrapolation of results obtained in field or laboratory settings. Although enormous gains in scientific understanding and methodological practice have been made in forensic entomology over the last few decades, a majority of the field studies we reviewed do not meet the standards for inference, which are 1) adequate replication, 2) independence of experimental units, and 3) experimental conditions that capture a representative range of natural variability. Using a mock case-study approach, we identify design flaws in field and lab experiments and suggest methodological solutions for increasing inference strength that can inform future casework. Suggestions for improving data reporting in future field studies are also proposed.

  8. The Impact of Explicit Teaching of Methodological Aspects of Physics on Scientistic Beliefs and Interest

    ERIC Educational Resources Information Center

    Korte, Stefan; Berger, Roland; Hänze, Martin

    2017-01-01

    We assessed the impact of teaching methodological aspects of physics on students' scientistic beliefs and subject interest in physics in a repeated-measurement design with a total of 142 students of upper secondary physics classes. Students gained knowledge of methodological aspects from the pre-test to the post-test and reported reduced…

  9. On Improving the Experiment Methodology in Pedagogical Research

    ERIC Educational Resources Information Center

    Horakova, Tereza; Houska, Milan

    2014-01-01

    The paper shows how the methodology for a pedagogical experiment can be improved through including the pre-research stage. If the experiment has the form of a test procedure, an improvement of methodology can be achieved using for example the methods of statistical and didactic analysis of tests which are traditionally used in other areas, i.e.…

  10. Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets

    DTIC Science & Technology

    2017-07-01

    ...principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; proved... semi-supervised clustering methodology... robust hypothesis testing... embedding in a finite-dimensional Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed...

  11. 77 FR 6971 - Establishment of User Fees for Filovirus Testing of Nonhuman Primate Liver Samples

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... (ELISA) or other appropriate methodology. Each specimen will be held for six months. After six months.../CDC's analysis of costs to the Government is based on the current methodology (ELISA) used to test NHP... different methodology or changes in the availability of ELISA reagents will affect the amount of the user...

  12. Instrumentation Methodology for Automobile Crash Testing

    DOT National Transportation Integrated Search

    1974-08-01

    Principal characteristics of existing data acquisition practices and instrumentation methodologies have been reviewed to identify differences which are responsible for difficulties in comparing and interpreting structural crash test data. Recommendat...

  13. A primer on systematic reviews in toxicology.

    PubMed

    Hoffmann, Sebastian; de Vries, Rob B M; Stephens, Martin L; Beck, Nancy B; Dirven, Hubert A A M; Fowle, John R; Goodman, Julie E; Hartung, Thomas; Kimber, Ian; Lalu, Manoj M; Thayer, Kristina; Whaley, Paul; Wikoff, Daniele; Tsaioun, Katya

    2017-07-01

    Systematic reviews, pioneered in the clinical field, provide a transparent, methodologically rigorous and reproducible means of summarizing the available evidence on a precisely framed research question. Having matured to a well-established approach in many research fields, systematic reviews are receiving increasing attention as a potential tool for answering toxicological questions. In the larger framework of evidence-based toxicology, the advantages and obstacles of, as well as the approaches for, adapting and adopting systematic reviews to toxicology are still being explored. To provide the toxicology community with a starting point for conducting or understanding systematic reviews, we herein summarized available guidance documents from various fields of application. We have elaborated on the systematic review process by breaking it down into ten steps, starting with planning the project, framing the question, and writing and publishing the protocol, and concluding with interpretation and reporting. In addition, we have identified the specific methodological challenges of toxicological questions and have summarized how these can be addressed. Ultimately, this primer is intended to stimulate scientific discussions of the identified issues to fuel the development of toxicology-specific methodology and to encourage the application of systematic review methodology to toxicological issues.

  14. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
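
    The probabilistic core of the PFA approach — propagating uncertainty about analysis parameters through a failure-phenomenon model to obtain a failure probability — can be sketched as below for a fatigue crack growth failure mode. The Paris-law surrogate, the parameter distributions, and the service-life figure are invented for illustration; this is not the documented PFA software.

```python
import numpy as np

rng = np.random.default_rng(42)

def cycles_to_failure(C, m, a0=1e-3, ac=1e-2, d_sigma=100.0):
    """Toy Paris-law life: integrate da/dN = C*(d_sigma*sqrt(pi*a))**m in
    closed form (valid for m != 2) from initial to critical crack size."""
    Y = d_sigma * np.sqrt(np.pi)
    expo = 1.0 - m / 2.0
    return (ac**expo - a0**expo) / (expo * C * Y**m)

# propagate parameter uncertainty by Monte Carlo (assumed distributions)
C = rng.lognormal(mean=np.log(1e-12), sigma=0.3, size=100_000)
m = rng.normal(3.0, 0.1, size=100_000)
life = cycles_to_failure(C, m)
service_cycles = 5e6
print("P(failure) ~", np.mean(life < service_cycles))
```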

  15. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  17. Has Toxicity Testing Moved into the 21st Century? A Survey and Analysis of Perceptions in the Field of Toxicology.

    PubMed

    Zaunbrecher, Virginia; Beryt, Elizabeth; Parodi, Daniela; Telesca, Donatello; Doherty, Joseph; Malloy, Timothy; Allard, Patrick

    2017-08-30

    Ten years ago, leaders in the field of toxicology called for a transformation of the discipline and a shift from primarily relying on traditional animal testing to incorporating advances in biotechnology and predictive methodologies into alternative testing strategies (ATS). Governmental agencies and academic and industry partners initiated programs to support such a transformation, but a decade later, the outcomes of these efforts are not well understood. We aimed to assess the use of ATS and the perceived barriers and drivers to their adoption by toxicologists and by others working in, or closely linked with, the field of toxicology. We surveyed 1,381 toxicologists and experts in associated fields regarding the viability and use of ATS and the perceived barriers and drivers of ATS for a range of applications. We performed ranking, hierarchical clustering, and correlation analyses of the survey data. Many respondents indicated that they were already using ATS, or believed that ATS were already viable approaches, for toxicological assessment of one or more end points in their primary area of interest or concern (26-86%, depending on the specific ATS/application pair). However, the proportions of respondents reporting use of ATS in the previous 12 mo were smaller (4.5-41%). Concern about regulatory acceptance was the most commonly cited factor inhibiting the adoption of ATS, and a variety of technical concerns were also cited as significant barriers to ATS viability. The factors most often cited as playing a significant role (currently or in the future) in driving the adoption of ATS were the need for expedited toxicology information, the need for reduced toxicity testing costs, demand by regulatory agencies, and ethical or moral concerns. Our findings indicate that the transformation of the field of toxicology is partly implemented, but significant barriers to acceptance and adoption remain. https://doi.org/10.1289/EHP1435.

  18. Hypothesis testing on the fractal structure of behavioral sequences: the Bayesian assessment of scaling methodology.

    PubMed

    Moscoso del Prado Martín, Fermín

    2013-12-01

    I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  19. Virtual Shaker Testing: Simulation Technology Improves Vibration Test Performance

    NASA Technical Reports Server (NTRS)

    Ricci, Stefano; Peeters, Bart; Fetter, Rebecca; Boland, Doug; Debille, Jan

    2008-01-01

    In the field of vibration testing, the interaction between the structure being tested and the instrumentation hardware used to perform the test is a critical issue. This is particularly true when testing massive structures (e.g. satellites), because, due to physical design and manufacturing limits, the dynamics of the testing facility often couples with that of the test specimen in the frequency range of interest. A further issue in this field is the standard use of a closed-loop real-time vibration control scheme, which could potentially shift poles and change damping of the aforementioned coupled system. Virtual shaker testing is a novel approach to deal with these issues. It means performing a simulation which closely represents the real vibration test on the specific facility by taking into account all parameters which might impact the dynamic behavior of the specimen. In this paper, such a virtual shaker testing approach is developed. It consists of the following components: (1) either a physical-based or an equation-based coupled electro-mechanical lumped-parameter shaker model is created, with model parameters obtained from manufacturer's specifications or by carrying out dedicated experiments; (2) existing real-time vibration control algorithms are ported to the virtual simulation environment; and (3) a structural model of the test object is created and, after defining proper interface conditions, structural modes are computed by means of the well-established Craig-Bampton CMS technique. At this stage, a virtual shaker test can be run by coupling the three described models (shaker, control loop, structure) in a co-simulation routine. Numerical results have been correlated with experimental ones in order to assess the robustness of the proposed methodology.
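
    A minimal sketch of an equation-based electro-mechanical lumped-parameter shaker model of the kind described in component (1): a voice-coil circuit (with back-EMF) coupled to a single mass-spring-damper armature, integrated with SciPy. All parameter values are illustrative assumptions, not those of any real facility.

```python
import numpy as np
from scipy.integrate import solve_ivp

# lumped electro-mechanical shaker parameters (illustrative values)
L, R, Bl = 1e-3, 2.0, 30.0   # coil inductance [H], resistance [ohm], force factor [N/A]
m, c, k = 25.0, 400.0, 4e5   # moving mass [kg], damping [N s/m], stiffness [N/m]

def shaker(t, y, v_amp=10.0, f_hz=50.0):
    i, x, v = y                                  # coil current, table position/velocity
    v_in = v_amp * np.sin(2 * np.pi * f_hz * t)  # drive voltage
    di = (v_in - R * i - Bl * v) / L             # coil circuit with back-EMF Bl*v
    dv = (Bl * i - c * v - k * x) / m            # armature + table dynamics
    return [di, v, dv]

sol = solve_ivp(shaker, (0.0, 0.2), [0.0, 0.0, 0.0], max_step=1e-4)
print("peak table acceleration:", np.max(np.abs(np.gradient(sol.y[2], sol.t))))
```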

  20. Development of Methodology and Technology for Identifying and Quantifying Emission Products from Open Burning and Open Detonation Thermal Treatment Methods. Field Test Series A, B, and C. Volume 2, Part B. Quality Assurance and Quality Control. Appendices

    DTIC Science & Technology

    1992-01-01

    ...the uncertainty. The above method can give an estimate of the precision of the analysis. However, determining the accuracy cannot be done as... speciation has been determined from analyzing model samples as well as by comparison with other methods and combinations of other methods with this method... laboratory. The output of the sensor is characterized over its working range and an appropriate response factor determined by linear regression of the...

  1. Application of computational aeroacoustic methodologies to advanced propeller configurations - A review

    NASA Technical Reports Server (NTRS)

    Korkan, Kenneth D.; Eagleson, Lisa A.; Griffiths, Robert C.

    1991-01-01

    Current research in the area of advanced propeller configurations for performance and acoustics is briefly reviewed. Particular attention is given to the techniques of Lock and Theodorsen, modified for use in the design of counterrotating propeller configurations; a numerical method known as SSTAGE, which is an Euler solver for the unducted fan concept; the NASPROP-E numerical analysis, also based on an Euler solver and used to study the near acoustic fields of the SR-series propfan configurations; and a counterrotating propeller test rig designed to obtain an experimental performance/acoustic database for various propeller configurations.

  2. Development of Methodology and Technology for Identifying and Quantifying Emission Products from Open Burning and Open Detonation Thermal Treatment Methods. Field Test Series A, B, and C. Volume 2, Part A. Quality Assurance and Quality Control

    DTIC Science & Technology

    1992-01-01

    ...instrument logbook was maintained, but all calibration printouts for the SFC/MS were put in a dedicated loose-leaf notebook. The temperature of the... up-to-date temperature-monitoring sheets were located at the freezer. Each worker maintained a project-specific personal logbook to enter data... driven 10-cm-diameter gate valve into a 1.5-m3 carbon-impregnated polyethylene (Velostat) sampling bag. The bag, constructed of electrically...

  3. Designing a Field Experience Tracking System in the Area of Special Education

    ERIC Educational Resources Information Center

    He, Wu; Watson, Silvana

    2014-01-01

    Purpose: To improve the quality of field experience, support field experience cooperation and streamline field experience management, the purpose of this paper is to describe the experience in using Activity Theory to design and develop a web-based field experience tracking system for a special education program. Design/methodology/approach: The…

  4. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  5. A methodological approach to improve the sexual health of vulnerable female populations: incentivized peer-recruitment and field-based STD testing.

    PubMed

    Roth, Alexis M; Rosenberger, Joshua G; Reece, Michael; Van Der Pol, Barbara

    2012-02-01

    Transactional sex has been associated with increased risk of adverse health outcomes, including sexually transmitted infections (STIs). Participants included female sex workers and men they recruited utilizing incentivized snowball sampling. Participants provided specimens for STI diagnostic testing and completed a semi-structured interview. Forty-four participants aged 19-65 were interviewed. Participants found self-sampling to be acceptable and overwhelmingly endorsed sampling outside of a clinic (90%) for reasons such as convenience, privacy, and lack of stigma. A substantial minority (38%) tested positive for at least one STI. Novel strategies may encourage sexual health care and prevent STIs among sex workers. High infection and screening acceptance rates across the sample suggests that individuals engaged in transactional sex would benefit from, and would be responsive to, community-based self-sampling for STI screening.

  6. Theory and research in audiology education: understanding and representing complexity through informed methodological decisions.

    PubMed

    Ng, Stella L

    2013-05-01

    The discipline of audiology has the opportunity to embark on research in education from an informed perspective, learning from professions that began this journey decades ago. The goal of this article is to position our discipline as a new member in the academic field of health professional education (HPE), with much to learn and contribute. In this article, I discuss the need for theory in informing HPE research. I also stress the importance of balancing our research goals by selecting appropriate methodologies for relevant research questions, to ensure that we respect the complexity of social processes inherent in HPE. Examples of relevant research questions are used to illustrate the need to consider alternative methodologies and to rethink the traditional hierarchy of evidence. I also provide an example of the thought processes and decisions that informed the design of an educational research study using a constructivist grounded theory methodology. As audiology enters the scholarly field of HPE, we need to arm ourselves with some of the knowledge and perspective that informs the field. Thus, we need to broaden our conceptions of what we consider to be appropriate styles of academic writing, relevant research questions, and valid evidence. Also, if we are to embark on qualitative inquiry into audiology education (or other audiology topics), we need to ensure that we conduct this research with an adequate understanding of the theories and methodologies informing such approaches. We must strive to conduct high quality, rigorous qualitative research more often than uninformed, generic qualitative research. These goals are imperative to the advancement of the theoretical landscape of audiology education and evolving the place of audiology in the field of HPE. American Academy of Audiology.

  7. Spectral characteristics and the extent of paleosols of the Palouse formation

    NASA Technical Reports Server (NTRS)

    Frazier, B. E.; Busacca, Alan; Cheng, Yaan; Wherry, David; Hart, Judy; Gill, Steve

    1988-01-01

    The objective of this study is to test the hypothesis that TM data are adequate in band selection, bandwidth, and spatial resolution to distinguish soil organic matter, iron oxide, and lime-silica contents in order to map several severity classes of erosion in soils of the Palouse region. The methodology used is as follows: (1) to develop spectral relationships from TM data that define the spatial distribution of soil areas by levels of (a) organic matter in the surface soil, (b) iron oxide and clay in exposed paleosol B horizons, and (c) lime-silica accumulations in exposed paleosol B horizons; (2) to compare the areas determined by the method outlined in (1) to patterns interpreted from color aerial photos and to ground observations on bare-soil fields; and (3) to define, on the basis of the results of (1) and (2) and to the extent possible, where exposed paleosols exist within fields that are not bare but have a crop cover, and the distribution of desirable and undesirable soil properties in each field.

  8. The rise of moral cognition.

    PubMed

    Greene, Joshua D

    2015-02-01

    The field of moral cognition has grown rapidly in recent years thanks in no small part to Cognition. Consistent with its interdisciplinary tradition, Cognition encouraged the growth of this field by supporting empirical research conducted by philosophers as well as research native to neighboring fields such as social psychology, evolutionary game theory, and behavioral economics. This research has been exceptionally diverse both in its content and methodology. I argue that this is because morality is unified at the functional level, but not at the cognitive level, much as vehicles are unified by shared function rather than shared mechanics. Research in moral cognition, then, has progressed by explaining the phenomena that we identify as "moral" (for high-level functional reasons) in terms of diverse cognitive components that are not specific to morality. In light of this, research on moral cognition may continue to flourish, not as the identification and characterization of distinctive moral processes, but as a testing ground for theories of high-level, integrative cognitive function. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Aquatic ecotoxicology: from the ecosystem to the cellular and molecular levels.

    PubMed Central

    Boudou, A; Ribeyre, F

    1997-01-01

    This review of aquatic ecotoxicology is presented in three parts. First, we discuss the fundamental concepts and stress the importance of its ecological basis and the complexity and diversity of the field of investigation, which result from actions and interactions between the physicochemical characteristics of the biotopes, the structural and functional properties of the living organisms, and the contamination modalities. Ecotoxicological mechanisms, regardless of the level of biological complexity, primarily depend on the bioavailability of the toxic products. Numerous processes control the chemical fate of contaminants in the water column and/or sediment compartments; accessibility to the biological barriers that separate the organisms from their surrounding medium depends directly on bioavailability. Second, we review the principal methodologies of the field, from in situ studies at the ecosystem/ecocomplex level to bioassays or single species tests. Third, we focus on mercury, selected as a reference contaminant, in order to illustrate the main ecotoxicological concepts, the complementarity between field and laboratory studies, and the indispensable multidisciplinarity of the approaches. PMID:9114275

  10. Measurement of ²²⁶Ra in soil from oil field: advantages of γ-ray spectrometry and application to the IAEA-448 CRM.

    PubMed

    Ceccatelli, A; Katona, R; Kis-Benedek, G; Pitois, A

    2014-05-01

    The analytical performance of gamma-ray spectrometry for the measurement of (226)Ra in TENORM (Technically Enhanced Naturally Occurring Radioactive Material) soil was investigated by the IAEA. Fast results were obtained for the characterization and certification of a new TENORM Certified Reference Material (CRM), identified as IAEA-448 (soil from oil field). The combined standard uncertainty of the gamma-ray spectrometry results is of the order of 2-3% for massic activity values ranging from 16500 Bq kg(-1) to 21500 Bq kg(-1). Methodologies used for the production and certification of the IAEA-448 CRM are presented. Analytical results were confirmed by alpha spectrometry. The "t" test showed agreement between the alpha and gamma results at the 95% confidence level. © 2013 Published by Elsevier Ltd.
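
    For illustration, a two-sample "t" test of the kind used to compare the alpha- and gamma-spectrometry results can be run as below. The replicate massic-activity values are invented, since the record does not give the underlying measurements.

```python
from scipy import stats

# compare replicate massic-activity results (Bq/kg) from the two techniques
gamma = [21100, 20800, 21350, 20950, 21200]   # illustrative values only
alpha = [20900, 21400, 21050, 21150, 20700]
t, p = stats.ttest_ind(gamma, alpha)
print(f"t = {t:.2f}, p = {p:.3f}  -> agreement at the 95% level if p > 0.05")
```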

  11. A 3D QSAR CoMFA study of non-peptide angiotensin II receptor antagonists

    NASA Astrophysics Data System (ADS)

    Belvisi, Laura; Bravi, Gianpaolo; Catalano, Giovanna; Mabilia, Massimo; Salimbeni, Aldo; Scolastico, Carlo

    1996-12-01

    A series of non-peptide angiotensin II receptor antagonists was investigated with the aim of developing a 3D QSAR model using comparative molecular field analysis (CoMFA) descriptors and approaches. The main goals of the study were dictated by an interest in the methodologies and by the need to understand the binding requirements of the AT1 receptor. Consistency with previously derived activity models was checked throughout to test the validity of the various hypotheses. The specific conformations chosen for the study, the procedures used to superimpose all structures, the conditions employed to generate steric and electrostatic field values, and the various PCA/PLS runs are discussed in detail. Particular attention is paid to the effect of experimental design techniques for selecting objects (molecules) and variables (descriptors) on the predictive power of the derived QSAR models.
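    Since the abstract names the core numerical step (PLS on molecular field descriptors), a generic sketch may help. This is not the authors' workflow: it is a CoMFA-style PLS fit on synthetic data, using scikit-learn as an assumed implementation, with a leave-one-out q² as the customary figure of merit.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)

    # Hypothetical data: 30 molecules x 500 steric/electrostatic field probe values.
    X = rng.normal(size=(30, 500))
    y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.1, size=30)  # synthetic activity

    pls = PLSRegression(n_components=3)  # number of latent variables, as in CoMFA/PLS
    pls.fit(X, y)

    # Leave-one-out cross-validated predictions -> q^2, the usual CoMFA validation metric.
    y_cv = cross_val_predict(pls, X, y, cv=len(y)).ravel()
    q2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"q2 = {q2:.2f}")
    ```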

  12. Phthalocyanine identification in paintings by reflectance spectroscopy. A laboratory and in situ study

    NASA Astrophysics Data System (ADS)

    Poldi, G.; Caglio, S.

    2013-06-01

    Identifying pigments by non-invasive (n.i.) analyses has gained increasing importance in the field of spectroscopy applied to art conservation and art studies. Among the large set of pigments synthesized and marketed during the 20th century, phthalocyanine blue and green pigments play an important role in painting (including restoration) and printing, thanks to characteristics such as brightness and fastness. This research focused on the most widely used phthalocyanine blue (PB15:1 and PB15:3) and green (PG7) pigments, and on the possibility of identifying these organic compounds by reflectance spectroscopy in the UV, visible and near-IR range (UV-vis-NIR RS), performed easily with portable instruments. Laboratory tests and three examples carried out on real paintings are discussed.
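    Reflectance-based identification of this kind typically reduces to comparing a measured curve against a reference library. The sketch below shows one common similarity measure, the spectral angle; the library curves and the "unknown" are synthetic Gaussian stand-ins, not measured pigment spectra.

    ```python
    import numpy as np

    def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
        """Spectral angle (radians) between two reflectance curves; smaller = more similar."""
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return float(np.arccos(np.clip(cos, -1.0, 1.0)))

    # Hypothetical reference library: pigment name -> reflectance on a common wavelength grid.
    wavelengths = np.linspace(380, 1100, 200)  # nm, spanning the UV-vis-NIR range
    library = {
        "PB15:3": np.exp(-((wavelengths - 460) / 60) ** 2),  # stand-in curves, not measured data
        "PG7":    np.exp(-((wavelengths - 540) / 50) ** 2),
    }

    unknown = np.exp(-((wavelengths - 465) / 62) ** 2)  # a measured spectrum would go here

    best = min(library, key=lambda name: spectral_angle(unknown, library[name]))
    print("closest reference pigment:", best)
    ```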

  13. Bringing Standardized Processes in Atom-Probe Tomography: I Establishing Standardized Terminology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ian M; Danoix, F; Forbes, Richard

    2011-01-01

    Defining standardized methods requires careful consideration of the entire field and its applications. The International Field Emission Society (IFES) has elected a Standards Committee, whose task is to determine the steps needed to establish atom-probe tomography as an accepted metrology technique. Specific tasks include developing protocols or standards for: terminology and nomenclature; metrology and instrumentation, including specifications for reference materials; test methodologies; modeling and simulations; and science-based health, safety, and environmental practices. The Committee is currently working on defining terminology related to atom-probe tomography, with the goal of including the terms in a document published by the International Organization for Standardization (ISO). Many terms also used in other disciplines have already been defined and will be discussed for adoption in the context of atom-probe tomography.

  14. Energy development through modeling and territorial intelligence: A participatory decision-making tool for the sustainable development of wind farm projects

    NASA Astrophysics Data System (ADS)

    Vazquez Rascon, Maria de Lourdes

    This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods, multicriteria decision aid (MCDA) and participatory geographic information systems (GIS), making it possible to represent these value systems as criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of the environmental, economic and socio-cultural aspects of wind energy projects and, thus, a vision of sustainable wind project development. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were defined to ensure a logical progression of the work and the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology in a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective produced a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is represented visually by a figure expressing the idea of co-constructed decisions, with all stakeholders at the center of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows different wind turbine implementation scenarios to be analyzed in order to choose the best one through a participatory and transparent decision-making process that takes stakeholders' concerns into account. The second objective enabled the testing of TIMED in an ex-post study of a wind farm in operation since 2006. In this test, 11 people participated, representing four categories of stakeholders: the private sector, the public sector, experts and civil society. The test allowed us to analyze the context in which wind projects are currently developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through the third objective, which involved simulations taking strategic-level assumptions into account; examples of the strategic level are the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
    Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes socio-cultural, environmental and economic variables into account; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; using the proposed methodology for renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy project development, renewable energy, social participation, robustness concern, SWOT analysis.
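    The multicriteria core of an approach like this can be illustrated in a few lines. The sketch below uses only the simplest weighted-sum aggregation over hypothetical scenarios, criteria and stakeholder weights; the thesis's actual MCDA method is richer, so treat this as a conceptual illustration rather than TIMED itself.

    ```python
    import numpy as np

    # Hypothetical scores (rows: wind-farm siting scenarios, columns: criteria),
    # each criterion already normalised to [0, 1] with "higher is better".
    criteria = ["environmental", "economic", "socio-cultural"]
    scores = np.array([
        [0.8, 0.6, 0.7],   # scenario A
        [0.5, 0.9, 0.4],   # scenario B
        [0.7, 0.7, 0.8],   # scenario C
    ])

    # Weights elicited from stakeholders (must sum to 1).
    weights = np.array([0.4, 0.3, 0.3])

    # Weighted-sum aggregation, then rank scenarios from best to worst.
    totals = scores @ weights
    for i, total in sorted(enumerate(totals), key=lambda p: -p[1]):
        print(f"scenario {'ABC'[i]}: {total:.2f}")
    ```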

  15. Methodological Review of Studies on Educational Leaders and Emotions (1992-2012): Insights into the Meaning of an Emerging Research Field in Educational Administration

    ERIC Educational Resources Information Center

    Berkovich, Izhak; Eyal, Ori

    2017-01-01

    Purpose: The purpose of this paper is to conduct a methodological review of the literature on educational leaders and emotions, covering 49 empirical studies published in peer-reviewed journals between 1992 and 2012. Design/methodology/approach: The work systematically analyzes descriptive information, methods, and designs in these studies, and their…

  16. A protocol for a three-arm cluster randomized controlled superiority trial investigating the effects of two pedagogical methodologies in Swedish preschool settings on language and communication, executive functions, auditive selective attention, socioemotional skills and early maths skills.

    PubMed

    Gerholm, Tove; Hörberg, Thomas; Tonér, Signe; Kallioinen, Petter; Frankenberg, Sofia; Kjällander, Susanne; Palmer, Anna; Taguchi, Hillevi Lenz

    2018-06-19

    During the preschool years, children develop abilities and skills in areas crucial for later success in life. These abilities include language, executive functions, attention, and socioemotional skills. The pedagogical methods used in preschools hold the potential to enhance these abilities, but our knowledge of which pedagogical practices aid which abilities, and for which children, is limited. The aim of this paper is to describe an intervention study designed to evaluate and compare two pedagogical methodologies in terms of their effect on the above-mentioned skills in Swedish preschool children. The study is a randomized controlled trial (RCT) in which two pedagogical methodologies were tested to evaluate how they enhanced children's language, executive functions and attention, socioemotional skills, and early maths skills during an intensive 6-week intervention. Eighteen preschools comprising 28 units and 432 children were enrolled in a municipality close to Stockholm, Sweden. The children were between 4;0 and 6;0 years old, and each preschool unit was randomly assigned to one of the interventions or to the control group. Background information on all children was collected via questionnaires completed by parents and preschools. Pre- and post-intervention testing consisted of a test battery including tests of language, executive functions, selective auditive attention, socioemotional skills and early maths skills. The interventions consisted of 6 weeks of intensive practice of either a socioemotional and material learning paradigm (SEMLA), in which group-based activities and interactional structures were the main focus, or an individual, digitally implemented attention and maths training paradigm that also included a set of self-regulation practices (DIL). All preschools were evaluated with the ECERS-3. If this intervention study shows evidence of a difference between group-based learning paradigms and individual training of specific skills in terms of enhancing children's abilities in fundamental areas like language, executive functions and attention, socioemotional skills and early maths, this will have a major impact on the preschool agenda in the future. The potential for different pedagogical methodologies to have different impacts on children of different ages and from different backgrounds invites a wider discussion within the field about how to develop a preschool curriculum suited to all children.
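    One concrete step in such a protocol is the cluster-level randomization, with preschool units rather than individual children as the unit of allocation. The sketch below shows a minimal, unstratified allocation of 28 units to the three arms named above; the trial's actual randomization procedure is not described in the abstract, so this is purely illustrative.

    ```python
    import random

    # Hypothetical cluster randomization: assign 28 preschool units to three arms.
    units = [f"unit_{i:02d}" for i in range(1, 29)]
    arms = ["SEMLA", "DIL", "control"]

    random.seed(2018)       # fixed seed so the allocation is reproducible
    random.shuffle(units)

    # Deal the shuffled units round-robin into the three arms.
    allocation = {arm: units[i::3] for i, arm in enumerate(arms)}
    for arm, members in allocation.items():
        print(arm, len(members), "units, e.g.", members[:3])
    ```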

  17. Development of a biocidal treatment regime to inhibit biological growths on cultural heritage: BIODAM

    NASA Astrophysics Data System (ADS)

    Young, M. E.; Alakomi, H.-L.; Fortune, I.; Gorbushina, A. A.; Krumbein, W. E.; Maxwell, I.; McCullagh, C.; Robertson, P.; Saarela, M.; Valero, J.; Vendrell, M.

    2008-12-01

    Existing chemical treatments to prevent biological damage to monuments often involve considerable amounts of potentially dangerous and even poisonous biocides. The scientific approach described in this paper aims at a drastic reduction in biocide concentrations through a polyphasic approach combining biocides with cell permeabilisers, polysaccharide and pigment inhibitors, and a photodynamic treatment. A variety of potential agents were screened to determine the most effective combination. Promising compounds were tested under laboratory conditions on cultures of rock-deteriorating bacteria, algae, cyanobacteria and fungi. A subsequent field trial involved two sandstone types with natural biofilms, which were treated with multiple combinations of chemicals and exposed to three different climatic conditions. Although the treatments proved successful in the laboratory, the field trials were inconclusive, and further testing will be required to determine the most effective treatment regime. While the most effective combination of chemicals and their application methodology are still being optimised, results to date indicate that this is a promising and effective treatment for the control of a wide variety of potentially damaging organisms colonising stone substrates.

  18. Combining machine learning and ontological data handling for multi-source classification of nature conservation areas

    NASA Astrophysics Data System (ADS)

    Moran, Niklas; Nieland, Simon; Tintrup gen. Suntrup, Gregor; Kleinschmit, Birgit

    2017-02-01

    Manual field surveys for nature conservation management are expensive and time-consuming and could be supplemented and streamlined by using remote sensing (RS). RS is critical for meeting the requirements of existing laws such as the EU Habitats Directive (HabDir) and, more importantly, for meeting future challenges. The full potential of RS has yet to be harnessed, as differing nomenclatures and procedures hinder interoperability, comparison and provenance. Automated tools are therefore needed to turn RS data into comparable, empirical data outputs that lend themselves to data discovery and provenance. These issues are addressed by a novel, semi-automatic ontology-based classification method that uses machine learning algorithms and Web Ontology Language (OWL) ontologies and yields traceable, interoperable and observation-based classification outputs. The method was tested on European Union Nature Information System (EUNIS) grasslands in Rhineland-Palatinate, Germany. The developed methodology is a first step in developing observation-based ontologies in the field of nature conservation. The tests show promising results for the determination of the grassland indicators wetness and alkalinity, with an overall accuracy of 85% for alkalinity and 76% for wetness.
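    The supervised-classification step that produces such overall-accuracy figures can be sketched generically. The snippet below trains a random forest on synthetic stand-in features and reports overall accuracy; the paper's actual pipeline additionally routes inputs and outputs through OWL ontologies, which this sketch does not attempt.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)

    # Hypothetical stand-ins for per-plot RS features (e.g. spectral indices, texture).
    X = rng.normal(size=(500, 8))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # e.g. wet vs. dry

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    # "Overall accuracy" in the remote-sensing sense: fraction of correctly classified samples.
    print(f"overall accuracy = {accuracy_score(y_te, clf.predict(X_te)):.2f}")
    ```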

  19. Detection of cocaine in cargo containers by high-volume vapor sampling: field test at Port of Miami

    NASA Astrophysics Data System (ADS)

    Neudorfl, Pavel; Hupe, Michael; Pilon, Pierre; Lawrence, Andre H.; Drolet, Gerry; Su, Chih-Wu; Rigdon, Stephen W.; Kunz, Terry D.; Ulwick, Syd; Hoglund, David E.; Wingo, Jeff J.; Demirgian, Jack C.; Shier, Patrick

    1997-02-01

    The use of marine containers is a well-known method for smuggling large shipments of drugs. Such containers are attractive to smugglers because examining them is time-consuming, difficult and expensive for the importing community. At present, various methods are being studied for screening containers that would allow officers to rapidly distinguish between innocent and suspicious cargo. Air sampling is one such method: air is withdrawn from the inside of a container and analyzed for telltale vapors uniquely associated with the drug. The attractive feature of the technique is that containers can be sampled without being destuffed or opened, since air can be conveniently withdrawn via the ventilation ducts. In the present paper, the development of an air sampling methodology for the detection of cocaine hydrochloride is discussed, and the results of a recent field test are presented. The results indicated that vapors of cocaine and its decomposition product, ecgonidine methyl ester, can serve as sensitive indicators of the presence of the drug in containers.

  20. One-dimensional statistical parametric mapping in Python.

    PubMed

    Pataky, Todd C

    2012-01-01

    Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
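    A minimal usage example may help readers place the package. The sketch below follows the two-sample t test workflow as documented for spm1d (ttest2 followed by inference); the input arrays here are synthetic curves, and the API details should be checked against the package documentation.

    ```python
    import numpy as np
    import spm1d  # pip install spm1d

    rng = np.random.default_rng(1)

    # Two groups of registered 1D curves (e.g. joint-angle trajectories), 10 curves x 101 nodes.
    YA = rng.normal(size=(10, 101)) + np.sin(np.linspace(0, np.pi, 101))
    YB = rng.normal(size=(10, 101))

    # Two-sample t test computed at every node, then field-level (random field theory) inference.
    t = spm1d.stats.ttest2(YA, YB)
    ti = t.inference(alpha=0.05, two_tailed=True)

    print(ti)  # reports the critical threshold and any supra-threshold clusters
    ```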
