Dobbertin, Matthias; Hug, Christian; Mizoue, Nobuya
2004-11-01
In this study we used photographs of tree crowns to test whether the assessment methods for tree defoliation in Switzerland have changed over time. We randomly selected 24 series of slides of Norway spruce with field assessments made between 1986 and 1995. The slides were randomly arranged and assessed by three experts without prior knowledge of the year when the slide was taken or the tree number. Defoliation was assessed using the Swiss reference photo guide. Although the correlations between the field assessments and slide assessments were high (Spearman's rank correlation coefficient ranged between 0.79 and 0.83), we found significant differences between field and slide assessments (4.3 to 9% underprediction by the slide assessors) and between the slide assessments. However, no significant trends in field assessment methods could be detected. When the mean differences between field and slide assessments were subtracted, in some years, field assessors consistently underpredicted (1990, 1992) or overpredicted defoliation (1987, 1991). Defoliation tended to be overpredicted in slides taken against the light, and underpredicted for trees with more than 25% crown overlap. We conclude that slide series can be used to detect changes in assessment methods. However, potential observer bias calls for more objective methods of assessment.
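As a minimal sketch of the agreement statistics this abstract relies on (Spearman's rank correlation and a systematic field-versus-slide difference), the example below uses hypothetical paired defoliation scores; the values and the choice of a paired t-test are illustrative assumptions, not the study's data or exact analysis.

```python
# Sketch of the agreement statistics described above (hypothetical paired defoliation scores, %).
import numpy as np
from scipy.stats import spearmanr, ttest_rel

field = np.array([20, 35, 10, 50, 25, 40, 15, 30])   # field assessments
slide = np.array([15, 30, 10, 45, 20, 30, 10, 25])   # slide-based assessments

rho, p_rho = spearmanr(field, slide)      # rank correlation between the two assessments
bias = np.mean(field - slide)             # mean field-minus-slide difference (systematic offset)
t_stat, p_diff = ttest_rel(field, slide)  # paired test for a systematic difference

print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
print(f"mean field - slide difference = {bias:.1f} percentage points (paired p = {p_diff:.3f})")
```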
Thors, Björn; Thielens, Arno; Fridén, Jonas; Colombi, Davide; Törnevik, Christer; Vermeeren, Günter; Martens, Luc; Joseph, Wout
2014-05-01
In this paper, different methods for practical numerical radio frequency exposure compliance assessments of radio base station products were investigated. Both multi-band base station antennas and antennas designed for multiple input multiple output (MIMO) transmission schemes were considered. For the multi-band case, various standardized assessment methods were evaluated in terms of resulting compliance distance with respect to the reference levels and basic restrictions of the International Commission on Non-Ionizing Radiation Protection. Both single frequency and multiple frequency (cumulative) compliance distances were determined using numerical simulations for a mobile communication base station antenna transmitting in four frequency bands between 800 and 2600 MHz. The assessments were conducted in terms of root-mean-squared electromagnetic fields, whole-body averaged specific absorption rate (SAR) and peak 10 g averaged SAR. In general, assessments based on peak field strengths were found to be less computationally intensive, but led to larger compliance distances than spatial averaging of electromagnetic fields used in combination with localized SAR assessments. For adult exposure, the results indicated that even shorter compliance distances were obtained by using assessments based on localized and whole-body SAR. Numerical simulations using base station products employing MIMO transmission schemes were performed as well and were in agreement with reference measurements. The applicability of various field combination methods for correlated exposure was investigated, and best estimate methods were proposed. Our results showed that field combining methods generally considered as conservative could be used to efficiently assess compliance boundary dimensions of single- and dual-polarized multicolumn base station antennas with only minor increases in compliance distances. © 2014 Wiley Periodicals, Inc.
Park, SangWook; Kim, Minhyuk
2016-01-01
In this paper, a numerical exposure assessment method is presented for quasi-static analysis using the finite-difference time-domain (FDTD) algorithm. The proposed method combines the scattered-field FDTD method with a quasi-static approximation for analyzing low-frequency electromagnetic problems. It provides an effective tool to compute induced electric fields in an anatomically realistic human voxel model exposed to an arbitrary non-uniform field source in the low-frequency range. The method is verified, and excellent agreement with theoretical solutions is found for a dielectric sphere model exposed to a magnetic dipole source. The assessment method is demonstrated with a practical example of the electric fields, current densities, and specific absorption rates induced in a human head and body in close proximity to a 150-kHz wireless power transfer system for cell phone charging. The results are compared to the limits recommended by the International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the IEEE standard guidelines.
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
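A hedged sketch of how the reported agreement and accuracy figures can be computed, assuming hypothetical binary progression calls and using expert opinion as the reference standard; the labels and the choice of scikit-learn are illustrative, not the study's actual data or software.

```python
# Sketch of kappa agreement and sensitivity/specificity against expert opinion (hypothetical labels).
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# 1 = progression, 0 = stable; expert consensus is used as the reference standard.
expert    = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0])
gpa_event = np.array([1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0])

kappa = cohen_kappa_score(expert, gpa_event)
tn, fp, fn, tp = confusion_matrix(expert, gpa_event).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"kappa = {kappa:.2f}, sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```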
ERIC Educational Resources Information Center
Paule, Lynde; Murray, Stephen L.
As a precursor to making recommendations for more effective assessment methods for teacher certification, an examination is made of the professions of law and medicine. Specifically, the examination covers the relationship between training programs and the methods employed for evaluating prospective lawyers and doctors to ensure that they have the…
Comparison of methods used to estimate conventional undiscovered petroleum resources: World examples
Ahlbrandt, T.S.; Klett, T.R.
2005-01-01
Various methods for assessing undiscovered oil, natural gas, and natural gas liquid resources were compared in support of the USGS World Petroleum Assessment 2000. Discovery process, linear fractal, parabolic fractal, engineering estimates, PETRIMES, Delphi, and the USGS 2000 methods were compared. Three comparisons of these methods were made in: (1) the Neuquén Basin province, Argentina (different assessors, same input data); (2) provinces in North Africa, Oman, and Yemen (same assessors, different methods); and (3) the Arabian Peninsula, Arabian (Persian) Gulf, and North Sea (different assessors, different methods). A fourth comparison (same assessors, same assessment methods but different geologic models), between results from structural and stratigraphic assessment units in the North Sea, used only the USGS 2000 method, and hence compared the type of assessment unit rather than the method. In comparing methods, differences arise from inherent differences in assumptions regarding: (1) the underlying distribution of the parent field population (all fields, discovered and undiscovered), (2) the population of fields being estimated; that is, the entire parent distribution or the undiscovered resource distribution, (3) inclusion or exclusion of large outlier fields, (4) inclusion or exclusion of field (reserve) growth, (5) deterministic or probabilistic models, (6) data requirements, and (7) scale and time frame of the assessment. Discovery process, Delphi subjective consensus, and the USGS 2000 method yield comparable results because similar procedures are employed. In mature areas such as the Neuquén Basin province in Argentina, the linear and parabolic fractal and engineering methods were conservative compared to the other five methods and relative to new reserve additions there since 1995. The PETRIMES method gave the most optimistic estimates in the Neuquén Basin. In less mature areas, the linear fractal method yielded larger estimates relative to other methods. A geologically based model, such as one using the total petroleum system approach, is preferred in that it combines the elements of petroleum source, reservoir, trap and seal with the tectono-stratigraphic history of basin evolution in assessing petroleum resource potential. Care must be taken to demonstrate that homogeneous populations in terms of geology, geologic risk, exploration, and discovery processes are used in the assessment process. The USGS 2000 method (7th Approximation Model, EMC computational program) is robust; that is, it can be used in both mature and immature areas, and provides comparable results when using different geologic models (e.g. stratigraphic or structural) with differing numbers of subdivisions (assessment units) within the total petroleum system. © 2005 International Association for Mathematical Geology.
Assessment of concentrated flow through riparian buffers
M.G. Dosskey; M.J. Helmers; D.E. Eisenhauer; T.G. Franti; K.D. Hoagland
2002-01-01
Concentrated flow of surface runoff from agricultural fields may limit the capability of riparian buffers to remove pollutants. This study was conducted on four farms in southeastern Nebraska to develop a method for assessing the extent of concentrated flow in riparian buffers and for evaluating the impact that it has on sediment-trapping efficiency. Field methods...
ERIC Educational Resources Information Center
Common, Eric Alan; Lane, Kathleen Lynne; Pustejovsky, James E.; Johnson, Austin H.; Johl, Liane Elizabeth
2017-01-01
This systematic review investigated one systematic approach to designing, implementing, and evaluating functional assessment-based interventions (FABI) for use in supporting school-age students with or at-risk for high-incidence disabilities. We field tested several recently developed methods for single-case design syntheses. First, we appraised…
ERIC Educational Resources Information Center
Patton, Beth J.; Marty-Snyder, Melissa
2014-01-01
Peer assessment (PA) occurs in many higher education programs. However, there is limited research examining PA in physical education teacher education (PETE) in regards to student teaching experiences. PA may be a method to better prepare PETE students to assess their future students. The field experience students assessed their fellow peers on…
Dyer, Bryce; Disley, B Xavier
2018-02-01
Lower-limb amputees typically require some form of prosthetic limb to ride a bicycle for recreation or when competing. At elite-level racing speeds, aerodynamic drag can represent the majority of the resistance acting against a cyclist's forward motion. As a result, the reduction of such resistance is beneficial to an amputee whereby the form and function of the prosthetic limb can be optimized through engineering. To measure the performance of such limbs, field testing provides a cost-effective and context-specific method of aerodynamic drag measurement. However, few methods have been formally validated and none have been applied to amputees with lower-limb amputations. In this paper, an elite-level para-cyclist wore two different prosthetic limb designs and had their total aerodynamic drag measured with a wind tunnel reference method and statistically correlated against a velodrome-based virtual elevation field test method. The calculated coefficient of variation was in the range of 0.7-0.9% for the wind tunnel method and 2-3% for the virtual elevation method. A 0.03 m² difference was identified in the absolute values recorded between the two methods. Ultimately, both methods exhibited high levels of precision, yet their results were relative to each other rather than absolute. The virtual elevation method is proposed as a suitable technique to assess the aerodynamic drag of amputee para-cyclists. Implications for rehabilitation: This assessment method will provide practitioners with a reliable means of assessing the impact of changes made to prosthetics design for cyclists with limb absence. The proposed method offers a low-cost and geographically accessible solution compared to others proposed in the past. This assessment method has significant potential for impact among prosthetic limb users looking to improve their cycling performance, whereas previous attention in this field has been extremely limited.
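The precision comparison above can be illustrated with the coefficient of variation of repeated CdA measurements; the run values below are hypothetical and only chosen to be consistent in magnitude with the reported 0.7-0.9% and 2-3% figures and the 0.03 m² offset.

```python
# Coefficient of variation of repeated CdA measurements from each method (hypothetical runs, m^2).
import numpy as np

def coefficient_of_variation(samples):
    samples = np.asarray(samples, dtype=float)
    return samples.std(ddof=1) / samples.mean()

wind_tunnel  = [0.245, 0.247, 0.244, 0.246]   # wind tunnel reference runs
virtual_elev = [0.272, 0.281, 0.275, 0.266]   # velodrome virtual-elevation runs

for name, runs in [("wind tunnel", wind_tunnel), ("virtual elevation", virtual_elev)]:
    print(f"{name}: CV = {100 * coefficient_of_variation(runs):.1f}%")
print(f"absolute offset between methods = {abs(np.mean(wind_tunnel) - np.mean(virtual_elev)):.3f} m^2")
```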
Rapid assessment of rice seed availability for wildlife in harvested fields
Halstead, B.J.; Miller, M.R.; Casazza, Michael L.; Coates, P.S.; Farinha, M.A.; Benjamin, Gustafson K.; Yee, J.L.; Fleskes, J.P.
2011-01-01
Rice seed remaining in commercial fields after harvest (waste rice) is a critical food resource for wintering waterfowl in rice-growing regions of North America. Accurate and precise estimates of the seed mass density of waste rice are essential for planning waterfowl wintering habitat extents and management. In the Sacramento Valley of California, USA, the existing method for obtaining estimates of availability of waste rice in harvested fields produces relatively precise estimates, but the labor-, time-, and machinery-intensive process is not practical for routine assessments needed to examine long-term trends in waste rice availability. We tested several experimental methods designed to rapidly derive estimates that would not be burdened with the disadvantages of the existing method. We first conducted a simulation study of the efficiency of each method and then conducted field tests. For each approach, methods did not vary in root mean squared error, although some methods did exhibit bias for both simulations and field tests. Methods also varied substantially in the time to conduct each sample and in the number of samples required to detect a standard trend. Overall, modified line-intercept methods performed well for estimating the density of rice seeds. Waste rice in the straw, although not measured directly, can be accounted for by a positive relationship with density of rice on the ground. Rapid assessment of food availability is a useful tool to help waterfowl managers establish and implement wetland restoration and agricultural habitat-enhancement goals for wintering waterfowl. © 2011 The Wildlife Society.
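A minimal sketch of the bias and root-mean-squared-error comparison described above, assuming hypothetical paired seed-density estimates; the numbers and units are placeholders, not the study's measurements.

```python
# Bias and root mean squared error of a rapid method against the reference (hypothetical pairs, kg/ha).
import numpy as np

reference = np.array([310, 280, 450, 120, 390, 205])   # existing labor-intensive estimates
rapid     = np.array([295, 300, 430, 140, 370, 225])   # e.g. modified line-intercept estimates

error = rapid - reference
rmse  = np.sqrt(np.mean(error ** 2))
bias  = np.mean(error)
print(f"RMSE = {rmse:.1f} kg/ha, bias = {bias:+.1f} kg/ha")
```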
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... and its relationship to several other key Agency initiatives that are currently under development and... Assessment Methods for Workers, Children of Workers in Agricultural Fields, and Pesticides with No Food Uses... for comment a policy paper entitled "Revised Risk Assessment Methods for Workers, Children of Workers...
Vibrational near-field mapping of planar and buried three-dimensional plasmonic nanostructures
Dregely, Daniel; Neubrech, Frank; Duan, Huigao; Vogelgesang, Ralf; Giessen, Harald
2013-01-01
Nanoantennas confine electromagnetic fields at visible and infrared wavelengths to volumes of only a few cubic nanometres. Assessing their near-field distribution offers fundamental insight into light–matter coupling and is of special interest for applications such as radiation engineering, attomolar sensing and nonlinear optics. Most experimental approaches to measure near-fields employ either diffraction-limited far-field methods or intricate near-field scanning techniques. Here, using diffraction-unlimited far-field spectroscopy in the infrared, we directly map the intensity of the electric field close to plasmonic nanoantennas. We place a patch of probe molecules with 10 nm accuracy at different locations in the near-field of a resonant antenna and extract the molecular vibrational excitation. We map the field intensity along a dipole antenna and gap-type antennas. Moreover, this method is able to assess the near-field intensity of complex buried plasmonic structures. We demonstrate this by measuring for the first time the near-field intensity of a three-dimensional plasmonic electromagnetically induced transparency structure. PMID:23892519
Field test comparison of two dermal tolerance assessment methods of hand hygiene products.
Girard, R; Carré, E; Pires-Cronenberger, S; Bertin-Mandy, M; Favier-Bulit, M C; Coyault, C; Coudrais, S; Billard, M; Regard, A; Kerhoas, A; Valdeyron, M L; Cracco, B; Misslin, P
2008-06-01
This study aimed to compare the sensitivity and workload requirement of two dermal tolerance assessment methods of hand hygiene products, in order to select a suitable pilot testing method for field tests. An observer-rating method and a self-assessment method were compared in 12 voluntary hospital departments (autumn/winter of 2005-2006). Three test-periods of three weeks were separated by two-week intervals during which the routine products were reintroduced. The observer-rating method scored dryness and irritation on four-point scales. In the self-assessment method, the user rated appearance, intactness, moisture content, and sensation on a visual analogue scale which was converted into a 10-point numerical scale. Eleven products (soaps) were tested (223/250 complete reports for observer rating, 131/251 for self-assessment). Two products were significantly less well tolerated than the routine product according to the observers, four products according to the self-assessments. There was no significant difference between the two methods when products were classified according to tolerance (Fisher's test: P=0.491). For the symptom common to both assessment methods (dryness), there was a good correlation between the two methods (Spearman's Rho: P=0.032). The workload was higher for the observer-rating method (288 h of observer time plus 122 h of prevention team and pharmacist time compared with 15 h of prevention team and pharmacist time for self-assessment). In conclusion, the self-assessment method was considered more suitable for pilot testing, although further time should be allocated for educational measures as the return rate of complete self-assessment forms was poor.
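A hedged sketch of the two statistical comparisons mentioned above (Fisher's exact test on tolerance classifications and Spearman's rho on dryness scores); all counts and scores below are hypothetical.

```python
# Sketch of the two statistical comparisons reported above (hypothetical counts and scores).
from scipy.stats import fisher_exact, spearmanr

# Products classified as less well tolerated than the routine product vs. not, per method.
#                   worse  not worse
observer_rating  = [2,     9]
self_assessment  = [4,     7]
odds_ratio, p_fisher = fisher_exact([observer_rating, self_assessment])

# Dryness scores for the same products under the two methods (hypothetical rankings).
dryness_observer = [1.2, 2.0, 0.5, 3.1, 1.8, 0.9]
dryness_self     = [2.0, 3.0, 1.0, 4.2, 3.5, 1.5]
rho, p_rho = spearmanr(dryness_observer, dryness_self)

print(f"Fisher's exact p = {p_fisher:.3f}; Spearman's rho = {rho:.2f} (p = {p_rho:.3f})")
```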
Assessment of soil compaction properties based on surface wave techniques
NASA Astrophysics Data System (ADS)
Jihan Syamimi Jafri, Nur; Rahim, Mohd Asri Ab; Zahid, Mohd Zulham Affandi Mohd; Faizah Bawadi, Nor; Munsif Ahmad, Muhammad; Faizal Mansor, Ahmad; Omar, Wan Mohd Sabki Wan
2018-03-01
Soil compaction plays an important role in construction activities to reduce the risk of damage. Traditional methods of assessing compaction, including field tests and invasive penetration tests of compacted areas, have great limitations and are time-consuming when evaluating large areas. Thus, this study explored the possibility of using a non-invasive surface wave method, Multi-channel Analysis of Surface Waves (MASW), as a useful tool for assessing soil compaction. The aim of this study was to determine the shear wave velocity profiles and field density of compacted soils under varying compaction efforts using the MASW method. Pre- and post-compaction MASW surveys were conducted at Pauh Campus, UniMAP, after applying rolling compaction with varying numbers of passes (2, 6 and 10). Each seismic dataset was recorded by a GEODE seismograph. A sand replacement test was conducted for each survey line to obtain the field density data. All seismic data were processed using SeisImager/SW software. The results show that the shear wave velocity profiles increase with the number of passes from 0 to 6, but decrease after 10 passes. This method could attract the interest of the geotechnical community, as it can be an alternative tool to the standard test for assessing soil compaction in field operations.
Work toward a standardized version of a mobile tracer correlation measurement method is discussed. The method was used for assessment of methane emissions from 15 landfills in 56 field deployments from 2009 to 2013. This general area source measurement method uses advances in instrum...
Results from the field testing of some innovative sampling methods developed to evaluate risk management strategies for polychlorinated biphenyl (PCB) contaminated sediments are presented. Semipermeable membrane devices (SPMDs) were combined with novel deployment methods to quan...
Use of refractometry and colorimetry as field methods to rapidly assess antimalarial drug quality.
Green, Michael D; Nettey, Henry; Villalva Rojas, Ofelia; Pamanivong, Chansapha; Khounsaknalath, Lamphet; Grande Ortiz, Miguel; Newton, Paul N; Fernández, Facundo M; Vongsack, Latsamy; Manolin, Ot
2007-01-04
The proliferation of counterfeit and poor-quality drugs is a major public health problem, especially in developing countries lacking adequate resources to effectively monitor their prevalence. Simple and affordable field methods provide a practical means of rapidly monitoring drug quality in circumstances where more advanced techniques are not available. Therefore, we have evaluated refractometry, colorimetry and a technique combining both processes as simple and accurate field assays to rapidly test the quality of the commonly available antimalarial drugs: artesunate, chloroquine, quinine, and sulfadoxine. Method bias, sensitivity, specificity and accuracy relative to high-performance liquid chromatographic (HPLC) analysis of drugs collected in the Lao PDR were assessed for each technique. The HPLC method for each drug was evaluated in terms of assay variability and accuracy. The accuracy of the combined method ranged from 0.96 to 1.00 for artesunate tablets, chloroquine injectables, quinine capsules, and sulfadoxine tablets, while the accuracy was 0.78 for enterically coated chloroquine tablets. These techniques provide a generally accurate, yet simple and affordable means to assess drug quality in resource-poor settings.
Evaluating co-creation of knowledge: from quality criteria and indicators to methods
NASA Astrophysics Data System (ADS)
Schuck-Zöller, Susanne; Cortekar, Jörg; Jacob, Daniela
2017-11-01
Basic research in the natural sciences rests on a long tradition of evaluation. However, since the San Francisco Declaration on Research Assessment (DORA) came out in 2012, there has been intense discussion in the natural sciences, above all amongst researchers and funding agencies in the different fields of applied research and scientific service. This discussion intensified when climate services and other fields that involve users in research and development activities (co-creation) demanded new evaluation methods appropriate to this new research mode. This paper starts by describing a comprehensive and interdisciplinary literature overview of indicators to evaluate co-creation of knowledge, including the different fields of integrated knowledge production. The authors then harmonize the different elements of evaluation from the literature in an evaluation cascade that scales down from very general evaluation dimensions to tangible assessment methods. They describe evaluation indicators already documented in the literature and include a mixture of different assessment methods for two exemplary criteria. It is shown what can be deduced from existing methodology for climate services, and it is envisaged how climate services can further develop their specific evaluation methods.
Until recently, lake physical habitat assessment has been an underemployed tool for assessing lake and reservoir ecological condition. We outline and evaluate a rapid field sampling and analytical approach for quantifying near-shore physical habitat. We quantified the repeatabil...
Abstract - A standardized version of a mobile tracer correlation measurement method was developed and used for assessment of methane emissions from 15 landfills in 56 field deployments from 2009 to 2013. This general area source measurement method uses advances in instrumentation...
Untangling Autophagy Measurements: All Fluxed Up
Gottlieb, Roberta A.; Andres, Allen M.; Sin, Jon; Taylor, David
2015-01-01
Autophagy is an important physiological process in the heart, and alterations in autophagic activity can exacerbate or mitigate injury during various pathological processes. Methods to assess autophagy have changed rapidly as the field of research has expanded. As with any new field, methods and standards for data analysis and interpretation evolve as investigators acquire experience and insight. The purpose of this review is to summarize current methods to measure autophagy, selective mitochondrial autophagy (mitophagy), and autophagic flux. We will examine several published studies where confusion arose in data interpretation, in order to illustrate the challenges. Finally, we will discuss methods to assess autophagy in vivo and in patients. PMID:25634973
Information from the previously approved extended abstract: A standardized area source measurement method based on mobile tracer correlation was used for methane emissions assessment in 52 field deployments...
Lee, Ping-Shin; Gan, Han Ming; Clements, Gopalasamy Reuben; Wilson, John-James
2016-11-01
Mammal diversity assessments based on DNA derived from invertebrates have been suggested as alternatives to assessments based on traditional methods; however, no study has field-tested both approaches simultaneously. In Peninsular Malaysia, we calibrated the performance of mammal DNA derived from blowflies (Diptera: Calliphoridae) against traditional methods used to detect species. We first compared five methods (cage trapping, mist netting, hair trapping, scat collection, and blowfly-derived DNA) in a forest reserve with no recent reports of megafauna. Blowfly-derived DNA and mist netting detected the joint highest number of species (n = 6). Only one species was detected by multiple methods. Compared to the other methods, blowfly-derived DNA detected both volant and non-volant species. In another forest reserve, rich in megafauna, we calibrated blowfly-derived DNA against camera traps. Blowfly-derived DNA detected more species (n = 11) than camera traps (n = 9), with only one species detected by both methods. The rarefaction curve indicated that blowfly-derived DNA would continue to detect more species with greater sampling effort. With further calibration, blowfly-derived DNA may join the list of traditional field methods. Areas for further investigation include blowfly feeding and dispersal biology, primer biases, and the assembly of a comprehensive and taxonomically-consistent DNA barcode reference library.
Lake shore and littoral habitat structure: a field survey method and its precision
Until recently, lake physical habitat assessment has been an underemployed tool for assessing lake and reservoir ecological condition. Herein, we outline and evaluate a rapid (2 persons: 1.5-3.5 h) field sampling and analytical approach for quantifying near-shore physical habit...
Many ecosystem monitoring and assessment programs are expanding their focus to address changes in ecosystem condition. This is a challenging task, given the complexity of ecosystems and the changes they undergo in response to a variety of human activities and landscape alteration...
The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...
Formal Method of Description Supporting Portfolio Assessment
ERIC Educational Resources Information Center
Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou
2006-01-01
Teachers need to assess learner portfolios in the field of education. However, they need support in the process of designing and practicing what kind of portfolios are to be assessed. To solve the problem, a formal method of describing the relations between the lesson forms and portfolios that need to be collected and the relations between…
Böttrich, Marcel; Tanskanen, Jarno M A; Hyttinen, Jari A K
2017-06-26
Our aim is to introduce a method to enhance the design process of microelectrode array (MEA) based electric bioimpedance measurement systems for improved detection and viability assessment of living cells and tissues. We propose the application of electromagnetic lead field theory and reciprocity for MEA design and measurement result interpretation. Further, we simulated impedance spectroscopy (IS) with two- and four-electrode setups and a biological cell to illustrate the tool in the assessment of the capabilities of given MEA electrode constellations for detecting cells on or in the vicinity of the microelectrodes. The results show the power of lead field theory in electromagnetic simulations of cell-microelectrode systems, depicting the fundamental differences between two- and four-electrode IS measurement configurations in detecting cells. Accordingly, the use in MEA system design is demonstrated by assessing the differences between the two- and four-electrode IS configurations. Further, our results show how cells affect the lead fields in these MEA systems, and how we can utilize the differences between the two- and four-electrode setups in cell detection. The COMSOL simulator model is provided freely in the public domain as open source. Lead field theory can be successfully applied in MEA design for the IS based assessment of biological cells, providing the necessary visualization and insight for MEA design. The proposed method is expected to enhance the design and usability of automated cell and tissue manipulation systems required for bioreactors, which are intended for the automated production of cell and tissue grafts for medical purposes. MEA systems are also intended for toxicology to assess the effects of chemicals on living cells. Our results demonstrate that the lead field concept is also expected to enhance the development of such methods and devices.
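To illustrate the lead-field idea in this abstract, the sketch below computes an impedance sensitivity map as the dot product of the current-injection and voltage-pickup lead fields (the reciprocity/Geselowitz sensitivity relation) for a toy two-dimensional point-electrode model; the geometry, homogeneous medium, and electrode positions are simplifying assumptions, not the authors' COMSOL model.

```python
# Minimal sketch of lead-field sensitivity for impedance measurement (toy 2-D point-electrode model).
# Sensitivity S = J_cc . J_vv, the dot product of the current-injection and voltage-pickup lead fields;
# regions with large |S| are where a cell most strongly changes the measured impedance.
import numpy as np

def point_lead_field(grid_x, grid_y, src, sink):
    """Unit-current lead field of a source/sink electrode pair in a homogeneous 2-D medium (toy model)."""
    field = np.zeros(grid_x.shape + (2,))
    for pos, sign in [(src, 1.0), (sink, -1.0)]:
        dx, dy = grid_x - pos[0], grid_y - pos[1]
        r2 = dx**2 + dy**2 + 1e-9
        field[..., 0] += sign * dx / (2 * np.pi * r2)
        field[..., 1] += sign * dy / (2 * np.pi * r2)
    return field

x, y = np.meshgrid(np.linspace(-100, 100, 201), np.linspace(0, 100, 101))  # micrometres
# Four-electrode setup: outer pair injects current, inner pair picks up voltage.
J_current = point_lead_field(x, y, src=(-60, 0), sink=(60, 0))
J_voltage = point_lead_field(x, y, src=(-20, 0), sink=(20, 0))
sensitivity = np.sum(J_current * J_voltage, axis=-1)
print("peak |sensitivity| near the electrodes:", np.abs(sensitivity).max())
```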
Using Benchmarking To Strengthen the Assessment of Persistence.
McLachlan, Michael S; Zou, Hongyan; Gouin, Todd
2017-01-03
Chemical persistence is a key property for assessing chemical risk and chemical hazard. Current methods for evaluating persistence are based on laboratory tests. The relationship between the laboratory based estimates and persistence in the environment is often unclear, in which case the current methods for evaluating persistence can be questioned. Chemical benchmarking opens new possibilities to measure persistence in the field. In this paper we explore how the benchmarking approach can be applied in both the laboratory and the field to deepen our understanding of chemical persistence in the environment and create a firmer scientific basis for laboratory to field extrapolation of persistence test results.
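A heavily hedged sketch of the benchmarking idea: if a benchmark chemical is effectively persistent and co-transported with the test chemical, the decline of their concentration ratio over travel time yields a field degradation rate constant. The numbers and the first-order assumption below are illustrative only, not the authors' data or model.

```python
# Benchmarking-based persistence estimate (illustrative numbers, first-order decay assumed).
# Assumes the benchmark chemical is effectively persistent and co-transported with the test
# chemical, so the decline of their concentration ratio over travel time reflects degradation only.
import numpy as np

ratio_upstream   = 0.80   # [test chemical] / [benchmark] at the upstream station
ratio_downstream = 0.55   # same ratio at the downstream station
travel_time_days = 12.0   # water travel time between stations

k = -np.log(ratio_downstream / ratio_upstream) / travel_time_days  # first-order rate constant (1/day)
half_life_days = np.log(2) / k
print(f"k = {k:.3f} per day, field half-life = {half_life_days:.0f} days")
```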
C. Alina Cansler; Donald McKenzie
2012-01-01
Remotely sensed indices of burn severity are now commonly used by researchers and land managers to assess fire effects, but their relationship to field-based assessments of burn severity has been evaluated only in a few ecosystems. This analysis illustrates two cases in which methodological refinements to field-based and remotely sensed indices of burn severity...
Fischer, David L
2005-11-01
Long-term risks of pesticides to birds and mammals are currently assessed by comparing effects thresholds determined in chronic laboratory studies to exposure levels expected to occur in the field. However, there is often a mismatch between exposure patterns used in the laboratory tests (exposure levels held constant) and those experienced by animals in the field (exposure levels varying over time). Three methods for addressing this problem are presented and discussed. Time-weighted averaging (TWA) converts a variable field exposure regime to a single value that can be compared directly to the laboratory test results. Body-burden modeling (BBM) is applied to both laboratory and field exposure regimes, allowing a straightforward comparison of body residue levels expected for each situation. Temporal analysis (TA) uses expert judgment to decide whether the length of time that exposure exceeds a toxicity threshold is long enough to cause biologically significant effects. To reduce uncertainty in long-term assessments, the conduct of specialized laboratory tests in which test subjects are administered a time-varying exposure that mimics what occurs in the field should be considered. Such tests may also be useful for testing the validity of each of these assessment methods.
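The time-weighted averaging (TWA) step lends itself to a short worked example; the residue time series and the chronic threshold below are hypothetical, and trapezoidal integration is one reasonable way to average a variable exposure.

```python
# Sketch of time-weighted averaging (TWA) of a variable field exposure (hypothetical residue series).
import numpy as np

days    = np.array([0, 1, 2, 4, 7, 10, 14])        # days after application
residue = np.array([120, 95, 70, 40, 18, 8, 3])    # exposure level, e.g. mg/kg diet

# TWA over the monitoring window via trapezoidal integration.
twa = np.trapz(residue, days) / (days[-1] - days[0])

chronic_noec = 35.0   # hypothetical laboratory effects threshold (same units)
print(f"TWA exposure = {twa:.1f} mg/kg vs. chronic threshold = {chronic_noec} mg/kg")
print("exceeds threshold" if twa > chronic_noec else "below threshold")
```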
DOE Office of Scientific and Technical Information (OSTI.GOV)
Downing, D.J.
1993-10-01
This paper discusses Carol Gotway's paper, "The Use of Conditional Simulation in Nuclear Waste Site Performance Assessment." The paper centers on the use of conditional simulation and the use of geostatistical methods to simulate an entire field of values for subsequent use in a complex computer model. The issues of sampling designs for geostatistics, semivariogram estimation and anisotropy, the turning bands method for random field generation, and estimation of the cumulative distribution function are brought out.
Leisman, Gerald; Ashkenazi, Maureen
1979-01-01
Objective psychophysical techniques for investigating visual fields are described. The paper concerns methods for the collection and analysis of evoked potentials using a small laboratory computer and provides efficient methods for obtaining information about the conduction pathways of the visual system.
Numerical assessment of low-frequency dosimetry from sampled magnetic fields
NASA Astrophysics Data System (ADS)
Freschi, Fabio; Giaccone, Luca; Cirimele, Vincenzo; Canova, Aldo
2018-01-01
Low-frequency dosimetry is commonly assessed by evaluating the electric field in the human body using the scalar potential finite difference method. This method is effective only when the sources of the magnetic field are completely known and the magnetic vector potential can be analytically computed. The aim of the paper is to present a rigorous method to characterize the source term when only the magnetic flux density is available at discrete points, e.g. in case of field measurements. The method is based on the solution of the discrete magnetic curl equation. The system is restricted to the independent set of magnetic fluxes and circulations of magnetic vector potential using the topological information of the computational mesh. The solenoidality of the magnetic flux density is preserved using a divergence-free interpolator based on vector radial basis functions. The analysis of a benchmark problem shows that the complexity of the proposed algorithm is linearly dependent on the number of elements with a controllable accuracy. The method proposed in this paper also proves to be useful and effective when applied to a real world scenario, where the magnetic flux density is measured in proximity of a power transformer. A 8 million voxel body model is then used for the numerical dosimetric analysis. The complete assessment is completed in less than 5 min, that is more than acceptable for these problems.
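The divergence-free interpolation step can be sketched with a matrix-valued radial basis function built from a scalar Gaussian, Phi = (-Laplacian I + grad grad^T) phi, whose columns are divergence-free by construction; the synthetic sampled field and the kernel shape parameter below are assumptions, not the authors' implementation.

```python
# Minimal sketch of divergence-free interpolation of sampled B with a matrix-valued Gaussian RBF
# (Phi = (-laplacian*I + grad grad^T) exp(-eps^2 |r|^2)); sampled field and eps are assumptions.
import numpy as np

EPS = 0.8  # kernel shape parameter (assumed)

def div_free_kernel(r):
    """3x3 divergence-free kernel block for offset vector r."""
    r = np.asarray(r, dtype=float)
    r2 = r @ r
    phi = np.exp(-EPS**2 * r2)
    return ((4 * EPS**2 - 4 * EPS**4 * r2) * np.eye(3) + 4 * EPS**4 * np.outer(r, r)) * phi

rng = np.random.default_rng(0)
points = rng.uniform(-1, 1, size=(20, 3))                  # measurement positions (m)
B_samples = np.column_stack([points[:, 1], -points[:, 0],  # synthetic solenoidal field B = (y, -x, 0)
                             np.zeros(len(points))])

# Assemble the 3N x 3N interpolation system and solve for the vector-valued coefficients.
n = len(points)
K = np.block([[div_free_kernel(points[i] - points[j]) for j in range(n)] for i in range(n)])
coeffs = np.linalg.solve(K, B_samples.ravel()).reshape(n, 3)

def interpolate_B(x):
    """Divergence-free interpolant evaluated at position x."""
    return sum(div_free_kernel(x - points[j]) @ coeffs[j] for j in range(n))

x_test = np.array([0.2, -0.3, 0.1])
print("interpolated B:", interpolate_B(x_test), "reference:", [x_test[1], -x_test[0], 0.0])
```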
Sommerlot, Andrew R; Pouyan Nejadhashemi, A; Woznicki, Sean A; Prohaska, Michael D
2013-10-15
Non-point source pollution from agricultural lands is a significant contributor to sediment pollution in United States lakes and streams. Therefore, quantifying the impact of individual field management strategies at the watershed scale provides valuable information to watershed managers and conservation agencies to enhance decision-making. In this study, four methods employing some of the most cited models in field- and watershed-scale analysis were compared to find a practical yet accurate method for evaluating field management strategies at the watershed outlet. The models used in this study included a field-scale model (the Revised Universal Soil Loss Equation 2, RUSLE2), a spatially explicit overland sediment delivery model (SEDMOD), and a watershed-scale model (the Soil and Water Assessment Tool, SWAT). These models were used to develop four modeling strategies (methods) for the River Raisin watershed: Method 1) predefined field-scale subbasin and reach layers were used in the SWAT model; Method 2) a subbasin-scale sediment delivery ratio was employed; Method 3) results obtained from the field-scale RUSLE2 model were incorporated as point source inputs to the SWAT watershed model; and Method 4) a hybrid solution combining analyses from the RUSLE2, SEDMOD, and SWAT models. Method 4 was selected as the most accurate among the studied methods. In addition, the effectiveness of six best management practices (BMPs) in terms of water quality improvement and associated cost was assessed. Economic analysis was performed using Method 4, and producer-requested prices for BMPs were compared with prices defined by the Environmental Quality Incentives Program (EQIP). On a per-unit-area basis, producers requested higher prices than EQIP in four out of six BMP categories. Meanwhile, the true cost of sediment reduction at the field and watershed scales was greater than EQIP in five of six BMP categories according to producer-requested prices. Copyright © 2013 Elsevier Ltd. All rights reserved.
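The field-to-outlet chain of Methods 3 and 4 can be caricatured as field erosion (RUSLE2) scaled by a hillslope delivery ratio (SEDMOD-style) and injected as a point-source load for the watershed model; all field identifiers, areas, rates, and ratios below are placeholders, not values from the study.

```python
# Caricature of the field-to-watershed sediment chain described above (all numbers are placeholders).
fields = [
    # (field id, area in ha, RUSLE2 erosion rate t/ha/yr, SEDMOD-style delivery ratio to the stream)
    ("F01", 32.0, 4.1, 0.35),
    ("F02", 18.5, 7.6, 0.22),
    ("F03", 54.0, 2.3, 0.48),
]

def delivered_load(area_ha, erosion_t_per_ha, delivery_ratio):
    """Sediment load (t/yr) reaching the channel, to be supplied to the watershed model as a point source."""
    return area_ha * erosion_t_per_ha * delivery_ratio

loads = {fid: delivered_load(a, e, d) for fid, a, e, d in fields}
print(loads)
print(f"total field-derived load at channel entry: {sum(loads.values()):.1f} t/yr")
```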
Using Unmanned Helicopters to Assess Vegetation Cover in Sagebrush Steppe Ecosystems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert P. Breckenridge; Maxine Dakins; Stephen Bunting
2012-07-01
Evaluating vegetation cover is an important factor in understanding the sustainability of many ecosystems. Methods that have sufficient accuracy and improved cost efficiency could dramatically alter how biotic resources are monitored on both public and private lands. This will be of interest to land managers because there are rarely enough resource specialists or funds available for comprehensive ground evaluations. In this project, unmanned helicopters were used to collect still-frame imagery to assess vegetation cover during May, June, and July in 2005. The images were used to estimate percent cover for six vegetative cover classes (shrub, dead shrub, grass, forbs, litter, and bare ground). The field plots were located on the INL site west of Idaho Falls, Idaho. Ocular assessments of digital imagery were performed using a software program called SamplePoint, and the results were compared against field measurements collected using a point-frame method to assess accuracy. The helicopter imagery evaluation showed a high degree of agreement with field cover class values for litter, bare ground, and grass, and reasonable agreement for dead shrubs. Shrub cover was often overestimated and forbs were generally underestimated. The helicopter method took 45% less time than the field method to set plots and collect and analyze data. This study demonstrates that UAV technology provides a viable method for monitoring vegetative cover on rangelands in less time and with lower costs. Tradeoffs between cost and accuracy are critical management decisions that are important when managing vegetative conditions across vast sagebrush ecosystems throughout the Intermountain West.
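A small sketch of the SamplePoint-style step of turning classified image points into percent cover per class; the point labels below are hypothetical, not the project's classifications.

```python
# Percent-cover estimation from ocular classification of sample points on an image (hypothetical labels).
from collections import Counter

classes = ["shrub", "dead_shrub", "grass", "forbs", "litter", "bare_ground"]
# One label per sampled point on the aerial image.
point_labels = (["shrub"] * 18 + ["dead_shrub"] * 4 + ["grass"] * 27 +
                ["forbs"] * 6 + ["litter"] * 20 + ["bare_ground"] * 25)

counts = Counter(point_labels)
total = len(point_labels)
percent_cover = {c: 100 * counts[c] / total for c in classes}
print(percent_cover)
```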
Tenax extraction as a simple approach to improve environmental risk assessments.
Harwood, Amanda D; Nutile, Samuel A; Landrum, Peter F; Lydy, Michael J
2015-07-01
It is well documented that using exhaustive chemical extractions is not an effective means of assessing exposure of hydrophobic organic compounds in sediments and that bioavailability-based techniques are an improvement over traditional methods. One technique that has shown special promise as a method for assessing the bioavailability of hydrophobic organic compounds in sediment is the use of Tenax-extractable concentrations. A 6-h or 24-h single-point Tenax-extractable concentration correlates to both bioaccumulation and toxicity. This method has demonstrated effectiveness for several hydrophobic organic compounds in various organisms under both field and laboratory conditions. In addition, a Tenax bioaccumulation model was developed for multiple compounds relating 24-h Tenax-extractable concentrations to oligochaete tissue concentrations exposed in both the laboratory and field. This model has demonstrated predictive capacity for additional compounds and species. Use of Tenax-extractable concentrations to estimate exposure is rapid, simple, straightforward, and relatively inexpensive, as well as accurate. Therefore, this method would be an invaluable tool if implemented in risk assessments. © 2015 SETAC.
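The Tenax bioaccumulation model referred to above is not specified in this abstract, so the sketch below simply assumes a log-log regression from 24-h Tenax-extractable concentration to tissue concentration, with placeholder coefficients rather than the published model parameters.

```python
# Hypothetical sketch of a Tenax bioaccumulation relationship: predicting oligochaete tissue
# concentration from a 24-h Tenax-extractable sediment concentration via a log-log regression.
# The slope and intercept are placeholders, not the published model parameters.
import numpy as np

log_slope, log_intercept = 1.0, 0.7   # assumed regression coefficients (log10 space)

def predict_tissue(tenax_24h):
    """Predicted tissue concentration from a 24-h Tenax-extractable concentration (assumed model)."""
    return 10 ** (log_intercept + log_slope * np.log10(tenax_24h))

for c_tenax in (0.05, 0.5, 5.0):
    print(f"Tenax 24 h = {c_tenax} -> predicted tissue = {predict_tissue(c_tenax):.2f}")
```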
Lewis-Fernández, Roberto; Aggarwal, Neil Krishan; Lam, Peter C; Galfalvy, Hanga; Weiss, Mitchell G; Kirmayer, Laurence J; Paralikar, Vasudeo; Deshpande, Smita N; Díaz, Esperanza; Nicasio, Andel V; Boiler, Marit; Alarcón, Renato D; Rohlof, Hans; Groen, Simon; van Dijk, Rob C J; Jadhav, Sushrut; Sarmukaddam, Sanjeev; Ndetei, David; Scalco, Monica Z; Bassiri, Kavoos; Aguilar-Gaxiola, Sergio; Ton, Hendry; Westermeyer, Joseph; Vega-Dienstmaier, Johann M
2017-04-01
Background: There is a need for clinical tools to identify cultural issues in diagnostic assessment. Aims: To assess the feasibility, acceptability and clinical utility of the DSM-5 Cultural Formulation Interview (CFI) in routine clinical practice. Method: Mixed-methods evaluation of field trial data from six countries. The CFI was administered to diagnostically diverse psychiatric out-patients during a diagnostic interview. In post-evaluation sessions, patients and clinicians completed debriefing qualitative interviews and Likert-scale questionnaires. The duration of CFI administration and the full diagnostic session were monitored. Results: Mixed-methods data from 318 patients and 75 clinicians found the CFI feasible, acceptable and useful. Clinician feasibility ratings were significantly lower than patient ratings and other clinician-assessed outcomes. After administering one CFI, however, clinician feasibility ratings improved significantly and subsequent interviews required less time. Conclusions: The CFI was included in DSM-5 as a feasible, acceptable and useful cultural assessment tool. © The Royal College of Psychiatrists 2017.
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analysis. Our study aims to investigate epidemiological characteristics, the conduct of literature searches, methodological quality, and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework. Of them, more than half did not report assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report assessment of similarity (86.89%) and did not use the GRADE tool to assess the quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of NMAs. Only 4.65% of NMAs described the details of handling multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997
Video Documentaries in the Assessment of Human Geography Field Courses
ERIC Educational Resources Information Center
Mavroudi, Elizabeth; Jons, Heike
2011-01-01
This paper critically reviews the use of video documentaries in the assessment of human geography field courses. It aims to contribute to recent debates about the role of visual methods for developing active and deep learning in student-centred teaching. Based on four days of group work in Crete, 30 third-year students produced individual…
An overview on the emerging area of identification, characterization, and assessment of health apps.
Paglialonga, Alessia; Lugo, Alessandra; Santoro, Eugenio
2018-05-28
The need to characterize and assess health apps has inspired a significant amount of research in the past years, in search of methods able to provide potential app users with relevant, meaningful knowledge. This article presents an overview of the recent literature in this field and categorizes - by discussing some specific examples - the various methodologies introduced so far for the identification, characterization, and assessment of health apps. Specifically, this article outlines the most significant web-based resources for app identification, relevant frameworks for descriptive characterization of apps' features, and a number of methods for the assessment of quality along its various components (e.g., evidence base, trustworthiness, privacy, or user engagement). The development of methods to characterize the apps' features and to assess their quality is important to define benchmarks and minimum requirements. Similarly, such methods are important to categorize potential risks and challenges in the field so that risks can be minimized, whenever possible, by design. Understanding methods to assess apps is key to raising the standards of quality of health apps on the market, towards the final goal of delivering apps that are built on the pillars of evidence base, reliability, long-term effectiveness, and user-oriented quality. Copyright © 2018. Published by Elsevier Inc.
Linder, G.; ,
2003-01-01
Mining activities frequently impact wildlife habitats, and a wide range of habitats may require evaluations of the linkages between wildlife and environmental stressors common to mining activities (e.g., physical alteration of habitat, releases of chemicals such as metals and other inorganic constituents as part of the mining operation). Wetlands, for example, are frequently impacted by mining activities. Within an ecological assessment for a wetland, toxicity evaluations for representative species may be advantageous to the site evaluation, since these species could be exposed to complex chemical mixtures potentially released from the site. Amphibian species common to these transition zones between terrestrial and aquatic habitats are one key biological indicator of exposure, and integrated approaches which involve both field and laboratory methods focused on amphibians are critical to the assessment process. The laboratory and field evaluations of a wetland in western Montana illustrate the integrated approach to risk assessment and causal analysis. Here, amphibians were used to evaluate the potential toxicity associated with heavy metal-laden sediments deposited in a reservoir. Field and laboratory methods were applied to a toxicity assessment for metals characteristic of mine tailings to reduce potential "lab to field" extrapolation errors and provide adaptive management programs with critical site-specific information targeted on remediation.
Jabbour, Noel; Sidman, James
2011-10-01
There has been an increasing interest in assessment of technical skills in most medical and surgical disciplines. Many of these assessments involve microscopy or endoscopy and are thus amenable to video recording for post hoc review. An ideal skills assessment video would provide the reviewer with a simultaneous view of the examinee's instrument handling and the operative field. Ideally, a reviewer should be blinded to the identity of the examinee and whether the assessment was performed as a pretest or posttest examination, when given in conjunction with an educational intervention. We describe a simple method for reliably creating deidentified, multicamera, time-synced videos, which may be used in technical skills assessments. We pilot tested this method in a pediatric airway endoscopy Objective Assessment of Technical Skills (OSATS). Total video length was compared with the OSATS administration time. Thirty-nine OSATS were administered. There were no errors encountered in time-syncing the videos using this method. Mean duration of OSATS videos was 11 minutes and 20 seconds, which was significantly less than the time needed for an expert to be present at the administration of each 30-minute OSATS (P < 0.001). The described method for creating time-synced, multicamera skills assessment videos is reliable and may be used in endosurgical or microsurgical skills assessments. Compared with live review, post hoc video review using this method can save valuable expert reviewer time. Most importantly, this method allows a reviewer to simultaneously evaluate an examinee's instrument handling and the operative field while being blinded to the examinee's identity and timing of examination administration.
Clarke, Christina K; Gregory, Peter J; Lukac, Martin; Burridge, Amanda J; Allen, Alexandra M; Edwards, Keith J; Gooding, Mike J
2017-09-01
The genetic basis of increased rooting below the plough layer, post-anthesis in the field, of an elite wheat line (Triticum aestivum 'Shamrock') with recent introgression from wild emmer (T. dicoccoides), is investigated. Shamrock has a non-glaucous canopy phenotype mapped to the short arm of chromosome 2B (2BS), derived from the wild emmer. A secondary aim was to determine whether genetic effects found in the field could have been predicted by other assessment methods. Roots of doubled haploid (DH) lines from a winter wheat ('Shamrock' × 'Shango') population were assessed using a seedling screen in moist paper rolls, in rhizotrons to the end of tillering, and in the field post-anthesis. A linkage map was produced using single nucleotide polymorphism markers to identify quantitative trait loci (QTLs) for rooting traits. Shamrock had greater root length density (RLD) at depth than Shango, in the field and within the rhizotrons. The DH population exhibited diversity for rooting traits within the three environments studied. QTLs were identified on chromosomes 5D, 6B and 7B, explaining variation in RLD post-anthesis in the field. Effects associated with the non-glaucous trait on RLD interacted significantly with depth in the field, and some of this interaction mapped to 2BS. The effect of genotype was strongly influenced by the method of root assessment, e.g. glaucousness expressed in the field was negatively associated with root length in the rhizotrons, but positively associated with length in the seedling screen. To our knowledge, this is the first study to identify QTLs for rooting at depth in field-grown wheat at mature growth stages. Within the population studied here, our results are consistent with the hypothesis that some of the variation in rooting is associated with recent introgression from wild emmer. The expression of genetic effects differed between the methods of root assessment. © The Author 2017. Published by Oxford University Press on behalf of the Annals of Botany Company.
ERIC Educational Resources Information Center
Angeli, Charoula; Valanides, Nicos
2013-01-01
The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…
ERIC Educational Resources Information Center
Patel, Kamna
2015-01-01
Development studies employs theories, tools and methods often found in geography, including the international field trip to a "developing" country. In 2013 and 2014, I led a two-week trip to Ethiopia. To better comprehend the effects of "the field" on students' learning, I introduced an assessed reflexive field diary to…
Test site suitability assessment for radiation measurements
NASA Astrophysics Data System (ADS)
Borsero, M.; Nano, E.
1980-04-01
Field and attenuation methods for site suitability assessment for radiation measurements are presented. Attention is given to the IEC procedure for checking the suitability of a radiation measurement site.
NASA Astrophysics Data System (ADS)
Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata
2016-09-01
Gaining an understanding of degradation mechanisms and their characterization is critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, the Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely, Damp Heat and Thermal Cycling. The method is based on the design of an accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix, a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data becomes available. While the demonstration of the method in this work is for thin-film flexible PV modules, the framework and methodology can be adapted to other PV products.
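The abstract does not state which acceleration-factor forms were used, so the sketch below shows two commonly used stand-ins, the Peck model for damp heat and the Coffin-Manson relation for thermal cycling, with assumed parameter values; everything here is illustrative, not the SMART work's actual models.

```python
# Illustrative acceleration-factor forms sometimes used for damp heat (Peck) and thermal cycling
# (Coffin-Manson); the model choice and parameter values are assumptions, not from the paper.
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

def peck_af(rh_test, t_test_c, rh_field, t_field_c, n=2.7, ea=0.8):
    """Peck humidity/temperature acceleration factor (test relative to field)."""
    t_test, t_field = t_test_c + 273.15, t_field_c + 273.15
    return (rh_test / rh_field) ** n * np.exp(ea / K_B * (1 / t_field - 1 / t_test))

def coffin_manson_af(dt_test, dt_field, m=2.0):
    """Coffin-Manson acceleration factor for thermal-cycling fatigue."""
    return (dt_test / dt_field) ** m

print(f"damp heat 85C/85%RH vs. 25C/60%RH field: AF = {peck_af(85, 85, 60, 25):.0f}")
print(f"thermal cycling dT=165K vs. dT=40K field: AF = {coffin_manson_af(165, 40):.0f}")
```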
Bioelectrical Impedance and Body Composition Assessment
ERIC Educational Resources Information Center
Martino, Mike
2006-01-01
This article discusses field tests that can be used in physical education programs. The most common field tests are anthropometric measurements, which include body mass index (BMI), girth measurements, and skinfold testing. Another field test that is gaining popularity is bioelectrical impedance analysis (BIA). Each method has particular strengths…
Benthic macroinvertebrate field sampling effort required to ...
This multi-year pilot study evaluated a proposed field method for its effectiveness in the collection of a benthic macroinvertebrate sample adequate for use in the condition assessment of streams and rivers in the Neuquén Province, Argentina. A total of 13 sites, distributed across three rivers, were sampled. At each site, benthic macroinvertebrates were collected at 11 transects. Each sample was processed independently in the field and laboratory. Based on a literature review and resource considerations, the collection of 300 organisms (minimum) at each site was determined to be necessary to support a robust condition assessment, and was therefore selected as the criterion for judging the adequacy of the method. This targeted number of organisms was collected at all sites, at a minimum, when collections from all 11 transects were combined. Subsequent bootstrapping analysis of the data was used to estimate whether collecting at fewer transects would reach the minimum target number of organisms for all sites. In a subset of sites, the total number of organisms frequently fell below the target when fewer than 11 transect collections were combined. Site conditions where <300 organisms might be collected are discussed. These preliminary results suggest that the proposed field method results in a sample that is adequate for robust condition assessment of the rivers and streams of interest. When data become available from a broader range of sites, the adequacy of the field
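The bootstrapping step described above can be sketched in a few lines; the per-transect counts, the resampling scheme (sampling transects with replacement) and the number of replicates below are hypothetical illustrations rather than the study's data or exact procedure.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical organism counts from the 11 transects at one site (not study data).
transect_counts = np.array([35, 28, 41, 19, 52, 30, 24, 38, 27, 33, 45])
TARGET = 300  # minimum number of organisms judged necessary for a robust assessment

def prob_reaching_target(counts, k, n_boot=10000):
    """Bootstrap estimate of P(sum of k resampled transects >= TARGET)."""
    hits = 0
    for _ in range(n_boot):
        sample = rng.choice(counts, size=k, replace=True)  # one plausible resampling scheme
        if sample.sum() >= TARGET:
            hits += 1
    return hits / n_boot

for k in range(6, 12):
    print(k, "transects:", prob_reaching_target(transect_counts, k))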
NASA Astrophysics Data System (ADS)
Roger-Estrade, Jean; Boizard, Hubert; Peigné, Josephine; Sasal, Maria Carolina; Guimaraes, Rachel; Piron, Denis; Tomis, Vincent; Vian, Jean-François; Cadoux, Stephane; Ralisch, Ricardo; Filho, Tavares; Heddadj, Djilali; de Battista, Juan; Duparque, Annie
2016-04-01
In France, agronomists have studied the effects of cropping systems on soil structure, using a field method based on a visual description of soil structure. The "profil cultural" method (Manichon and Gautronneau, 1987) was designed to perform a field diagnosis of the effects of tillage and compaction on soil structure dynamics. This method is of great use to agronomists improving crop management for a better preservation of soil structure. However, this method was developed and mainly used in conventional tillage systems, with ploughing. As several forms of reduced, minimum and no tillage systems are expanding in many parts of the world, it is necessary to re-evaluate the ability of this method to describe and interpret soil macrostructure in unploughed situations. In unploughed fields, the soil structure dynamics of untilled layers are mainly driven by compaction and regeneration by natural agents (climatic conditions, root growth and macrofauna), and it is of major importance to evaluate the contribution of these natural processes to soil structure regeneration. These concerns have led us to adapt the standard method and to propose amendments based on a series of field observations and experimental work across different cropping systems, soil types and climatic conditions. We improved the description of crack type and we introduced an index of biological activity, based on the visual examination of clods. To test the improved method, a comparison with the reference method was carried out and the ability of the "profil cultural" method to make a diagnosis was tested on five experiments in France, Brazil and Argentina. Using the improved method, the impact of cropping systems on soil functioning was better assessed when natural processes were integrated into the description.
Puppa, Giacomo; Risio, Mauro; Sheahan, Kieran; Vieth, Michael; Zlobec, Inti; Lugli, Alessandro; Pecori, Sara; Wang, Lai Mun; Langner, Cord; Mitomi, Hiroyuki; Nakamura, Takatoshi; Watanabe, Masahiko; Ueno, Hideki; Chasle, Jacques; Senore, Carlo; Conley, Stephen A; Herlin, Paulette; Lauwers, Gregory Y
2011-01-01
In histopathology, the quantitative assessment of various morphologic features is based on methods originally conceived for specific areas observed through the microscope in use. Failure to reproduce the same reference field of view using a different microscope will change the assessed score. Visualization of a digital slide on a screen through a dedicated viewer allows selection of the magnification. However, the field of view is rectangular, unlike the circular field of optical microscopy. In addition, the size of the selected area is not evident, and must be calculated. A digital slide morphometric system was conceived to reproduce the various methods published for assessing tumor budding in colorectal cancer. Eighteen international experts in colorectal cancer were invited to participate in a web-based study by assessing tumor budding with five different methods in 100 digital slides. The specific areas to be tested by each method were marked by colored circles. The areas were grouped in a target-like pattern and then saved as an .xml file. When a digital slide was opened, the .xml file was imported in order to perform the measurements. Since the morphometric tool is composed of layers that can be freely moved on top of the digital slide, the technique was named digital slide dynamic morphometry. Twelve investigators completed the task, the majority of them performing the multiple evaluations of each of the cases in less than 12 minutes. Digital slide dynamic morphometry has various potential applications and might be a useful tool for the assessment of histologic parameters originally conceived for optical microscopy that need to be quantified.
Hans-Erik Andersen; Stephen E. Reutebuch; Robert J. McGaughey
2006-01-01
Tree height is an important variable in forest inventory programs but is typically time-consuming and costly to measure in the field using conventional techniques. Airborne light detection and ranging (LIDAR) provides individual tree height measurements that are highly correlated with field-derived measurements, but the imprecision of conventional field techniques does...
Nevada STORMS project: Measurement of mercury emissions from naturally enriched surfaces
Gustin, M.S.; Lindberg, S.; Marsik, F.; Casimir, A.; Ebinghaus, R.; Edwards, G.; Hubble-Fitzgerald, C.; Kemp, R.; Kock, H.; Leonard, T.; London, J.; Majewski, M.; Montecinos, C.; Owens, J.; Pilote, M.; Poissant, L.; Rasmussen, P.; Schaedlich, F.; Schneeberger, D.; Schroeder, W.; Sommar, J.; Turner, R.; Vette, A.; Wallschlaeger, D.; Xiao, Z.; Zhang, H.
1999-01-01
Diffuse anthropogenic and naturally mercury-enriched areas represent long-lived sources of elemental mercury to the atmosphere. The Nevada Study and Tests of the Release of Mercury From Soils (STORMS) project focused on the measurement of mercury emissions from a naturally enriched area. During the project, concurrent measurements of mercury fluxes from naturally mercury-enriched substrate were made September 1-4, 1997, using four micrometeorological methods and seven field flux chambers. Ambient air mercury concentrations ranged from 2 to nearly 200 ng m-3 indicating that the field site is a source of atmospheric mercury. The mean daytime mercury fluxes, during conditions of no precipitation, measured with field chambers were 50 to 360 ng m-2 h-1, and with the micrometeorological methods were 230 to 600 ng m-2 h-1. This wide range in mercury emission rates reflects differences in method experimental designs and local source strengths. Mercury fluxes measured by many field chambers were significantly different (p < 0.05) but linearly correlated. This indicates that field chambers responded similarly to environmental conditions, but differences in experimental design and site heterogeneity had a significant influence on the magnitude of mercury fluxes. Data developed during the field study demonstrated that field flux chambers are ideal for assessment of the physicochemical processes driving mercury flux and development of an understanding of the magnitude of the influence of individual factors on flux. In general, mean mercury fluxes measured with micrometeorological methods during daytime periods were nearly 3 times higher than mean fluxes measured with field flux chambers. Micrometeorological methods allow for derivation of a representative mercury flux occurring from an unconstrained system and provide an assessment of the actual magnitude and variability of fluxes occurring from an area. Copyright 1999 by the American Geophysical Union.
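For context on how a field flux chamber measurement of the kind compared above is typically reduced to a flux, the sketch below applies the standard dynamic-chamber mass balance F = Q(C_out - C_in)/A; the flow rate, footprint and concentrations are invented numbers, and the calculation is not tied to any specific chamber used in the STORMS project.

def chamber_flux_ng_m2_h(c_out_ng_m3, c_in_ng_m3, flow_l_min, area_m2):
    """Mercury flux from a dynamic flux chamber.

    F = Q * (C_out - C_in) / A, with Q converted from L/min to m^3/h.
    All inputs here are hypothetical example values.
    """
    q_m3_h = flow_l_min * 60.0 / 1000.0
    return q_m3_h * (c_out_ng_m3 - c_in_ng_m3) / area_m2

# Hypothetical example: 5 ng/m^3 enrichment at the outlet, 1.5 L/min flushing flow,
# 0.03 m^2 chamber footprint.
print(chamber_flux_ng_m2_h(c_out_ng_m3=12.0, c_in_ng_m3=7.0, flow_l_min=1.5, area_m2=0.03))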
Helicopter noise in hover: Computational modelling and experimental validation
NASA Astrophysics Data System (ADS)
Kopiev, V. F.; Zaytsev, M. Yu.; Vorontsov, V. I.; Karabasov, S. A.; Anikin, V. A.
2017-11-01
The aeroacoustic characteristics of a helicopter rotor are calculated by a new method in order to assess its applicability to the evaluation of rotor performance in hover. Direct solution of the Euler equations in a noninertial coordinate system is used to calculate the near-field flow around the spinning rotor. The far-field noise is calculated by the Ffowcs Williams-Hawkings (FW-H) method using permeable control surfaces that include the blade. For a multiblade rotor, the signal obtained is duplicated and shifted in phase for each successive blade. By that means, the spectral characteristics of the far-field noise may be obtained. To determine the integral aerodynamic characteristics of the rotor, software is written to calculate the thrust and torque characteristics from the near-field flow solution. The results of numerical simulation are compared with experimental acoustic and aerodynamic data for a large-scale model of a helicopter main rotor in an open test facility. Two- and four-blade configurations of the rotor are considered, in different hover conditions. The proposed method satisfactorily predicts the aerodynamic characteristics of the blades in such conditions and gives good estimates for the first harmonics of the noise. That permits the practical use of the proposed method, not only for hovering but also for forward flight.
Peripheral neuropathy is a classical symptom of arsenic poisoning. Nerve conduction velocity (NCV) is the preferred measure for clinical assessment of peripheral neuropathy, but this method is not practical for field studies. Alternative methods available for assessing functi...
ERIC Educational Resources Information Center
Potter, Penny F.; Graham-Moore, Brian E.
Most organizations planning to assess adverse impact or perform a stock analysis for affirmative action planning must correctly classify their jobs into appropriate occupational categories. Two methods of job classification were assessed in a combination archival and field study. Classification results from expert judgment of functional job…
A novel approach to assess the treatment response using Gaussian random field in PET
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Mengdie; Guo, Ning; Hu, Guangshu
2016-02-15
Purpose: The assessment of early therapeutic response to anticancer therapy is vital for treatment planning and patient management in the clinic. With the development of personalized treatment plans, assessing the early treatment response, especially before any anatomically apparent post-treatment changes, has become an urgent clinical need. Positron emission tomography (PET) imaging serves an important role in clinical oncology for tumor detection, staging, and therapy response assessment. Many studies on therapy response involve interpretation of differences between two PET images, usually in terms of standardized uptake values (SUVs). However, the quantitative accuracy of this measurement is limited. This work proposes a statistically robust approach for therapy response assessment based on a Gaussian random field (GRF) to provide a statistically more meaningful scale to evaluate therapy effects. Methods: The authors propose a new criterion for therapeutic assessment by incorporating image noise into the traditional SUV method. An analytical method based on the approximate expressions of the Fisher information matrix was applied to model the variance of individual pixels in reconstructed images. A zero mean unit variance GRF under the null hypothesis (no response to therapy) was obtained by normalizing each pixel of the post-therapy image with the mean and standard deviation of the pretherapy image. The performance of the proposed method was evaluated by Monte Carlo simulation, where XCAT phantoms (128 × 128 pixels) with lesions of various diameters (2–6 mm), multiple tumor-to-background contrasts (3–10), and different changes in intensity (6.25%–30%) were used. The receiver operating characteristic curves and the corresponding areas under the curve were computed for both the proposed method and the traditional methods whose figure of merit is the percentage change of SUVs. The formula for the false positive rate (FPR) estimation was developed for the proposed therapy response assessment utilizing a local average method based on the random field. The accuracy of the estimation was validated in terms of Euler distance and correlation coefficient. Results: It is shown that the performance of therapy response assessment is significantly improved by the introduction of variance, with a higher area under the curve (97.3%) than SUVmean (91.4%) and SUVmax (82.0%). In addition, the FPR estimation serves as a good prediction for the specificity of the proposed method, consistent with the simulation outcome with a correlation coefficient of ∼1. Conclusions: In this work, the authors developed a method to evaluate therapy response from PET images, which were modeled as Gaussian random fields. The digital phantom simulations demonstrated that the proposed method achieved a large reduction in statistical variability through incorporating knowledge of the variance of the original Gaussian random field. The proposed method has the potential to enable prediction of early treatment response and shows promise for application to clinical practice. In future work, the authors will report on the robustness of the estimation theory for application to clinical practice of therapy response evaluation, which pertains to binary discrimination tasks at a fixed location in the image such as the detection of small and weak lesions.
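The core normalization step described in the Methods, standardizing the post-therapy image against the pre-therapy image so that the result behaves as a zero-mean, unit-variance Gaussian random field under the null hypothesis, can be sketched as follows. The variance map here is a plain input array standing in for the Fisher-information-based estimate used in the paper, and the phantom numbers are toy values.

import numpy as np

def response_z_field(pre_img, post_img, pre_var):
    """Pixel-wise standardized difference image.

    Under the null hypothesis of no therapy response, z is approximately a
    zero-mean, unit-variance Gaussian random field. pre_var stands in for the
    reconstruction variance map (the Fisher-information approximation in the
    paper); here it is just an input array.
    """
    sigma = np.sqrt(np.maximum(pre_var, 1e-12))
    return (post_img - pre_img) / sigma

# Toy 128x128 example with a small simulated uptake decrease in a lesion region.
rng = np.random.default_rng(0)
pre = 10.0 + rng.normal(0.0, 1.0, (128, 128))
post = pre.copy()
post[60:66, 60:66] -= 2.0          # 20 % drop in a 6x6-pixel "lesion"
var_map = np.full_like(pre, 1.0)   # placeholder variance map
z = response_z_field(pre, post, var_map)
print("max |z| inside lesion:", np.abs(z[60:66, 60:66]).max())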
Towards standardized assessment of endoscope optical performance: geometric distortion
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Desai, Viraj N.; Ngo, Ying Z.; Cheng, Wei-Chung; Pfefer, Joshua
2013-12-01
Technological advances in endoscopes, such as capsule, ultrathin and disposable devices, promise significant improvements in safety, clinical effectiveness and patient acceptance. Unfortunately, the industry lacks test methods for preclinical evaluation of key optical performance characteristics (OPCs) of endoscopic devices that are quantitative, objective and well-validated. As a result, it is difficult for researchers and developers to compare image quality and evaluate equivalence to, or improvement upon, prior technologies. While endoscope OPCs include resolution, field of view, and depth of field, among others, our focus in this paper is geometric image distortion. We reviewed specific test methods for distortion and then developed an objective, quantitative test method based on well-defined experimental and data processing steps to evaluate radial distortion in the full field of view of an endoscopic imaging system. Our measurements and analyses showed that a second-degree polynomial equation could well describe the radial distortion curve of a traditional endoscope. The distortion evaluation method was effective for correcting the image and can be used to explain other widely accepted evaluation methods such as picture height distortion. Development of consensus standards based on promising test methods for image quality assessment, such as the method studied here, will facilitate clinical implementation of innovative endoscopic devices.
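The observation that a second-degree polynomial describes the radial distortion curve suggests a simple fitting step; the sketch below fits such a polynomial to hypothetical observed-versus-true radial positions of grid targets and reports the edge distortion. It is an illustration, not the validated test protocol itself.

import numpy as np

# Hypothetical calibration data: true radial position of grid targets (mm from
# image center) and the radial position actually observed in the endoscope image.
r_true = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
r_observed = np.array([0.0, 1.95, 3.82, 5.58, 7.22, 8.74, 10.12])  # barrel distortion

# Fit the observed radius as a second-degree polynomial of the true radius.
coeffs = np.polyfit(r_true, r_observed, deg=2)
fit = np.poly1d(coeffs)

# Percent radial distortion at the edge of the field of view.
edge = r_true[-1]
distortion_pct = 100.0 * (fit(edge) - edge) / edge
print("fitted coefficients:", coeffs)
print(f"radial distortion at r = {edge} mm: {distortion_pct:.1f} %")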
Teaching Structure from Motion to Undergraduates: New Learning Module for Field Geoscience Courses
NASA Astrophysics Data System (ADS)
Pratt-Sitaula, B. A.; Shervais, K.; Crosby, C. J.; Douglas, B. J.; Crosby, B. T.; Charlevoix, D. J.
2016-12-01
With photogrammetry use expanding rapidly, it is essential to integrate these methods into undergraduate geosciences courses. The NSF-funded "GEodetic Tools for Societal Issues" (GETSI) project has recently published a module for field geoscience courses called "Analyzing High Resolution Topography with TLS and SfM" (serc.carleton.edu/getsi/teaching_materials/high-rez-topo/index.html). Structure from motion (SfM) and terrestrial laser scanning (TLS) are two valuable methods for generating high-resolution topographic landscape models. In addition to teaching the basic surveying methods, the module includes several specific applications that are tied to societally important geoscience research questions. The module goals are that students will be able to: 1) design and conduct a complex TLS and/or SfM survey to address a geologic research question; 2) articulate the societal impetus for answering a given research question; and 3) justify why TLS and/or SfM is the appropriate method in some circumstances. The module includes 6 units: Unit 1-TLS Introduction, Unit 1-SfM Introduction, Unit 2 Stratigraphic Section Survey, Unit 3 Fault Scarp Survey, Unit 4 Geomorphic Change Detection Survey, and Unit 5 Summative Assessment. One or both survey methods can be taught. Instructors choose which application/s to use from Units 2-4. Unit 5 Summative Assessment is flexibly written and can be used to assess any of the learned applications or others such as dinosaur tracks or seismic trench photomosaics. Prepared data sets are also provided for courses unable to visit the field. The included SfM learning manuals may also be of interest to researchers seeking to start with SfM; these are "SfM Guide of Instructors and Investigators" and "SfM Data Exploration and Processing Manual (Agisoft)". The module is appropriate for geoscience courses with field components such as field methods, geomorphology, geophysics, tectonics, and structural geology. All GETSI modules are designed and developed by teams of faculty and content experts and undergo rigorous review and classroom testing. GETSI is a collaborative project by UNAVCO (which runs NSF's Geodetic Facility), Indiana University, and Idaho State University. The Science Education Resource Center (SERC) provides assessment and evaluation expertise and webhosting.
Niemiec, Joanna; Adamczyk, Agnieszka; Ambicka, Aleksandra; Mucha-Małecka, Anna; Wysocki, Wojciech; Mituś, Jerzy; Ryś, Janusz
2012-11-01
Lymphangiogenesis is a potential indicator of cancer patients' survival. However, there is no standardisation of methodologies applied to the assessment of lymphatic vessel density. In 156 invasive ductal breast cancers (T1/N+/M0), lymphatic and blood vessels were visualised using podoplanin and CD34, respectively. Based on the expression of these markers, four parameters were assessed: (i) distribution of podoplanin-stained vessels (DPV) - the percentage of fields with at least one lymphatic vessel (a simple method proposed by us), (ii) lymphatic vessel density (LVD), (iii) LVD to microvessel density ratio (LVD/MVD) and (iv) the expression of podoplanin in cancer-associated fibroblasts. Next, we estimated relations between the above-mentioned parameters and: (i) breast cancer subtype, (ii) tumour grade, and (iii) basal marker expression. We found that intensive lymphangiogenesis, assessed using all studied methods, is positively related to high tumour grade, triple-negative or HER2 subtype and expression of basal markers. In contrast, the absence of podoplanin expression in cancer stroma fibroblasts is related to the luminal A subtype, low tumour grade or lack of basal marker expression. Distribution of podoplanin-stained vessels, assessed by a simple method proposed by us (indicating the percentage of fields with at least one lymphatic vessel), might be used instead of the "hot-spot" method.
Research Designs and Methods in Self-Assessment Studies: A Content Analysis
ERIC Educational Resources Information Center
Pastore, Serafina
2017-01-01
This paper focuses on self-assessment studies in the higher education field. In the assessment for learning perspective, self-assessment is related to reflection, metacognition, and self-regulation: all these aspects are considered as fundamental prerequisites for students' future professional development. Despite the recognition of…
Natarajan, Annamalai; Angarita, Gustavo; Gaiser, Edward; Malison, Robert; Ganesan, Deepak; Marlin, Benjamin M
2016-09-01
Mobile health research on illicit drug use detection typically involves a two-stage study design in which data for training detectors are first collected in lab-based trials, followed by a deployment to subjects in a free-living environment to assess detector performance. While recent work has demonstrated the feasibility of wearable sensors for illicit drug use detection in the lab setting, several key problems can limit lab-to-field generalization performance. For example, lab-based data collection often has low ecological validity, the ground-truth event labels collected in the lab may not be available at the same level of temporal granularity in the field, and there can be significant variability between subjects. In this paper, we present domain adaptation methods for assessing and mitigating potential sources of performance loss in lab-to-field generalization and apply them to the problem of cocaine use detection from wearable electrocardiogram sensor data.
Predicting implementation from organizational readiness for change: a study protocol
2011-01-01
Background There is widespread interest in measuring organizational readiness to implement evidence-based practices in clinical care. However, there are a number of challenges to validating organizational measures, including inferential bias arising from the halo effect and method bias - two threats to validity that, while well-documented by organizational scholars, are often ignored in health services research. We describe a protocol to comprehensively assess the psychometric properties of a previously developed survey, the Organizational Readiness to Change Assessment. Objectives Our objective is to conduct a comprehensive assessment of the psychometric properties of the Organizational Readiness to Change Assessment incorporating methods specifically to address threats from halo effect and method bias. Methods and Design We will conduct three sets of analyses using longitudinal, secondary data from four partner projects, each testing interventions to improve the implementation of an evidence-based clinical practice. Partner projects field the Organizational Readiness to Change Assessment at baseline (n = 208 respondents; 53 facilities), and prospectively assess the degree to which the evidence-based practice is implemented. We will assess predictive and concurrent validity using hierarchical linear modeling and multivariate regression, respectively. For predictive validity, the outcome is the change from baseline to follow-up in the use of the evidence-based practice. We will use intra-class correlations derived from hierarchical linear models to assess inter-rater reliability. Two partner projects will also field measures of job satisfaction for convergent and discriminant validity analyses, and will field Organizational Readiness to Change Assessment measures at follow-up for concurrent validity (n = 158 respondents; 33 facilities). Convergent and discriminant validities will test associations between organizational readiness and different aspects of job satisfaction: satisfaction with leadership, which should be highly correlated with readiness, versus satisfaction with salary, which should be less correlated with readiness. Content validity will be assessed using an expert panel and a modified Delphi technique. Discussion We propose a comprehensive protocol for validating a survey instrument for assessing organizational readiness to change that specifically addresses key threats of bias related to halo effect, method bias and questions of construct validity that often go unexplored in research using measures of organizational constructs. PMID:21777479
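For the planned inter-rater reliability analysis, an intra-class correlation can be computed from between- and within-facility variance components; the sketch below uses the classical one-way ANOVA estimator ICC(1) on hypothetical facility-grouped readiness scores, which may differ in detail from the hierarchical linear models specified in the protocol.

import numpy as np

def icc1(groups):
    """One-way ANOVA estimator of ICC(1) for ratings nested in groups.

    groups: list of 1-D arrays, one array of respondent scores per facility.
    Uses the average group size in the standard (MSB - MSW) / (MSB + (k-1)*MSW) form.
    """
    k = np.mean([len(g) for g in groups])          # average group size
    grand = np.concatenate(groups).mean()
    msb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups) / (len(groups) - 1)
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (
        sum(len(g) for g in groups) - len(groups))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical readiness scores (1-5 scale) from respondents at three facilities.
facilities = [np.array([4.1, 3.9, 4.3, 4.0]),
              np.array([2.8, 3.1, 2.9, 3.3, 3.0]),
              np.array([3.6, 3.4, 3.8])]
print("ICC(1) =", icc1(facilities))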
ERIC Educational Resources Information Center
Faikhamta, Chatree; Jantarakantee, Ekgapoom; Roadrangka, Vantipa
2011-01-01
This research explored the current situation in managing the field experience of a five-year science teacher education program in one university in Thailand. A number of methods were used to assess field experience situation: (1) a questionnaire on the perceptions of pre-service science teachers of field experience management; (2) participant…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raktoe, Sawan A.S.; Dehnad, Homan, E-mail: h.dehnad@umcutrecht.nl; Raaijmakers, Cornelis P.J.
Purpose: To model locoregional recurrences of oropharyngeal squamous cell carcinomas (OSCC) treated with primary intensity modulated radiation therapy (IMRT) in order to find the origins from which recurrences grow and relate their location to original target volume borders. Methods and Materials: This was a retrospective analysis of OSCC treated with primary IMRT between January 2002 and December 2009. Locoregional recurrence volumes were delineated on diagnostic scans and coregistered rigidly with treatment planning computed tomography scans. Each recurrence was analyzed with two methods. First, overlapping volumes of a recurrence and original target were measured ('volumetric approach') and assessed as 'in-field', 'marginal', or 'out-field'. Then, the center of mass (COM) of a recurrence volume was assumed to be the origin from which a recurrence expanded; the COM location was compared with original target volume borders and assessed as 'in-field', 'marginal', or 'out-field'. Results: One hundred thirty-one OSCC were assessed. For all patients alive at the end of follow-up, the mean follow-up time was 40 months (range, 12-83 months); 2 patients were lost to follow-up. The locoregional recurrence rate was 27%. Of all recurrences, 51% were local, 23% were regional, and 26% had both local and regional recurrences. Of all recurrences, 74% had imaging available for assessment. Regarding volumetric analysis of local recurrences, 15% were in-field gross tumor volume (GTV), and 65% were in-field clinical tumor volume (CTV). Using the COM approach, we found that 70% of local recurrences were in-field GTV and 90% were in-field CTV. Of the regional recurrences, 25% were volumetrically in-field GTV, and using the COM approach, we found 54% were in-field GTV. The COMs of local out-field CTV recurrences were maximally 16 mm outside CTV borders, whereas for regional recurrences, this was 17 mm. Conclusions: The COM model is practical and specific for recurrence assessment. Most recurrences originated in the GTV. This suggests radioresistance in certain tumor parts.
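Once the recurrence and target volumes are available as rigidly coregistered binary masks, the center-of-mass classification described above reduces to a short computation; the sketch below is a generic illustration with toy arrays, and the 5 mm "marginal" band is a hypothetical placeholder rather than the study's criterion.

import numpy as np
from scipy import ndimage

def classify_recurrence_origin(recurrence_mask, ctv_mask, voxel_mm, margin_mm=5.0):
    """Classify the recurrence center of mass relative to the original CTV.

    Returns 'in-field' if the COM lies inside the CTV, 'marginal' if it lies
    within margin_mm of the CTV surface, otherwise 'out-field'. margin_mm is
    an illustrative value, not the criterion used in the study above.
    """
    com = ndimage.center_of_mass(recurrence_mask)
    com_idx = tuple(int(round(c)) for c in com)
    if ctv_mask[com_idx]:
        return "in-field"
    # Distance (mm) from every voxel outside the CTV to the nearest CTV voxel.
    dist_mm = ndimage.distance_transform_edt(~ctv_mask.astype(bool), sampling=voxel_mm)
    return "marginal" if dist_mm[com_idx] <= margin_mm else "out-field"

# Toy example: a block-shaped CTV and a small recurrence just outside it.
ctv = np.zeros((60, 60, 60), dtype=bool)
ctv[20:40, 20:40, 20:40] = True
rec = np.zeros_like(ctv)
rec[41:45, 30:34, 30:34] = True
print(classify_recurrence_origin(rec, ctv, voxel_mm=(1.0, 1.0, 1.0)))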
NASA Astrophysics Data System (ADS)
Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.
2017-05-01
An integrated geophysical investigation was performed at S dam, located in the Dadu basin in China, to assess the condition of the dam curtain. The key component of the integrated technique was the flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. Limitations of the flow-field fitting method were addressed by complementing it with resistivity logging to identify internal erosion that had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, water injection tests, and rock quality designation.
Monajjemzadeh, Farnaz; Shokri, Javad; Mohajel Nayebi, Ali Reza; Nemati, Mahboob; Azarmi, Yadollah; Charkhpour, Mohammad; Najafi, Moslem
2014-01-01
Purpose: This study aimed to design an Objective Structured Field Examination (OSFE) and to standardize the course plan of the community pharmacy clerkship at the Pharmacy Faculty of Tabriz University of Medical Sciences (Iran). Methods: The study was composed of several stages, including evaluation of the old program, standardization and implementation of the new course plan, design and implementation of the OSFE, and finally evaluation of the results. Results: The lack of a fair final assessment protocol and of a properly organized teaching system for the various community pharmacy clerkship skills were identified as the main weaknesses of the old program. Educational priorities were determined and students' feedback was assessed to design the new curriculum, consisting of sessions to fulfill a 60-hour training course. More than 70% of the students were satisfied, and the success and efficiency of the new clerkship program were significantly greater than those of the old program (P<0.05). In addition, they believed that OSFE was a suitable testing method. Conclusion: The new course plan successfully improved different skills of the students, and OSFE was concluded to be a proper performance-based assessment method. This is easily adoptable by pharmacy faculties to improve the educational outcomes of the clerkship course. PMID:24511477
Delatour, Vincent; Lalere, Beatrice; Saint-Albin, Karène; Peignaux, Maryline; Hattchouel, Jean-Marc; Dumont, Gilles; De Graeve, Jacques; Vaslin-Reimann, Sophie; Gillery, Philippe
2012-11-20
The reliability of biological tests is a major public health issue for patient care, one that involves high economic stakes. Reference methods, as well as regular external quality assessment schemes (EQAS), are needed to monitor the analytical performance of field methods. However, control material commutability is a major concern when assessing method accuracy. To overcome material non-commutability, we investigated the possibility of using lyophilized serum samples together with a limited number of frozen serum samples to assign matrix-corrected target values, taking the example of glucose assays. The trueness of the current glucose assays was first measured against a primary reference method by using human frozen sera. Methods using hexokinase and glucose oxidase with spectroreflectometric detection proved very accurate, with bias ranging between -2.2% and +2.3%. Bias of methods using glucose oxidase with spectrophotometric detection was +4.5%. Matrix-related bias of the lyophilized materials was then determined and ranged from +2.5% to -14.4%. Matrix-corrected target values were assigned and used to assess the trueness of 22 sub-peer groups. We demonstrated that matrix-corrected target values can be a valuable tool to assess field method accuracy in large-scale surveys where commutable materials are not available in sufficient amounts at acceptable cost. Copyright © 2012 Elsevier B.V. All rights reserved.
2004-05-01
following digestion using method 3005A. Copper concentrations were verified using atomic absorption spectroscopy/graphite furnace. Each chamber...1995. Ammonia Variation in Sediments: Spatial, Temporal and Method-Related Effects. Environ. Toxicol. Chem. 14:1499-1506. Savage, W.K., F.W...Regulator Approved Methods and Protocols for Conducting Marine and Terrestrial Risk Assessments 1.III.01.k - Improved Field Analytical Sensors
Pamela G. Sikkink; Roy Renkin; Geneva Chong; Art Sikkink
2013-01-01
The five field sampling methods tested for this study differed in richness and Simpson's Index values calculated from the raw data. How much the methods differed, and which ones were most similar to each other, depended on which diversity measure and which type of data were used for comparisons. When the number of species (richness) was used as a measure of...
10 CFR 851.21 - Hazard identification and assessment.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...
10 CFR 851.21 - Hazard identification and assessment.
Code of Federal Regulations, 2010 CFR
2010-01-01
.... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...
10 CFR 851.21 - Hazard identification and assessment.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...
Testing a simple field method for assessing nitrate removal in riparian zones
Philippe Vidon; Michael G. Dosskey
2008-01-01
Being able to identify riparian sites that function better for nitrate removal from groundwater is critical to using riparian zones efficiently for water quality management. For this purpose, managers need a method that is quick, inexpensive, and accurate enough to enable effective management decisions. This study assesses the precision and accuracy of a simple...
Learning when Serious: Psychophysiological Evaluation of a Technology-Enhanced Learning Game
ERIC Educational Resources Information Center
Cowley, Ben; Fantato, Martino; Jennett, Charlene; Ruskov, Martin; Ravaja, Niklas
2014-01-01
We report an evaluation study of a novel learning platform, motivated by the growing need for methods for assessing serious game efficacy. The study was a laboratory experiment combining evaluation methods from the fields of learning assessment and psychophysiology. Fifteen participants used the TARGET game platform for 25 minutes, while the…
Evaluation of Bare Ground on Rangelands using Unmanned Aerial Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert P. Breckenridge; Maxine Dakins
2011-01-01
Attention is currently being given to methods that assess the ecological condition of rangelands throughout the United States. There are a number of different indicators that assess the ecological condition of rangelands. Bare Ground is being considered by a number of agencies and resource specialists as a lead indicator that can be evaluated over a broad area. Traditional methods of measuring bare ground rely on field technicians collecting data along a line transect or from a plot. Unmanned aerial vehicles (UAVs) provide an alternative to collecting field data, can monitor a large area in a relatively short period of time, and in many cases can enhance safety and reduce the time required to collect data. In this study, both fixed-wing and helicopter UAVs were used to measure bare ground in a sagebrush steppe ecosystem. The data were collected with digital imagery and read using the image analysis software SamplePoint. The approach was tested over seven different plots and compared against traditional field methods to evaluate accuracy for assessing bare ground. The field plots were located on the Idaho National Laboratory (INL) site west of Idaho Falls, Idaho, in locations where there is very little disturbance by humans and the area is grazed only by wildlife. The comparison of fixed-wing and helicopter UAV technology against field estimates shows good agreement for the measurement of bare ground. This study shows that if a high degree of detail and data accuracy is desired, then a helicopter UAV may be a good platform. If the data collection objective is to assess broad-scale landscape-level changes, then the collection of imagery with a fixed-wing system is probably more appropriate.
Preprocessing of gravity gradients at the GOCE high-level processing facility
NASA Astrophysics Data System (ADS)
Bouman, Johannes; Rispens, Sietse; Gruber, Thomas; Koop, Radboud; Schrama, Ernst; Visser, Pieter; Tscherning, Carl Christian; Veicherts, Martin
2009-07-01
One of the products derived from the gravity field and steady-state ocean circulation explorer (GOCE) observations is the gravity gradients. These gravity gradients are provided in the gradiometer reference frame (GRF) and are calibrated in-flight using satellite shaking and star sensor data. To use these gravity gradients for application in Earth sciences and gravity field analysis, additional preprocessing needs to be done, including corrections for temporal gravity field signals to isolate the static gravity field part, screening for outliers, calibration by comparison with existing external gravity field information and error assessment. The temporal gravity gradient corrections consist of tidal and nontidal corrections. These are all generally below the gravity gradient error level, which is predicted to show a 1/f behaviour for low frequencies. In the outlier detection, the 1/f error is compensated for by subtracting a local median from the data, while the data error is assessed using the median absolute deviation. The local median acts as a high-pass filter and it is robust, as is the median absolute deviation. Three different methods have been implemented for the calibration of the gravity gradients. All three methods use a high-pass filter to compensate for the 1/f gravity gradient error. The baseline method uses state-of-the-art global gravity field models, and the most accurate results are obtained if star sensor misalignments are estimated along with the calibration parameters. A second calibration method uses GOCE GPS data to estimate a low-degree gravity field model as well as gravity gradient scale factors. Both methods allow gravity gradient scale factors to be estimated down to the 10^-3 level. The third calibration method uses highly accurate terrestrial gravity data in selected regions to validate the gravity gradient scale factors, focussing on the measurement band. Gravity gradient scale factors may be estimated down to the 10^-2 level with this method.
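The outlier screening described, subtracting a running median to compensate for the 1/f error and flagging points whose residual exceeds a multiple of the median absolute deviation, can be sketched as below; the window length and threshold are illustrative choices, not the operational settings of the processing facility.

import numpy as np

def median_mad_outliers(series, window=101, n_mad=5.0):
    """Flag outliers in a gravity-gradient-like time series.

    A running median acts as a robust high-pass filter (removing the 1/f
    error), and points whose residual exceeds n_mad times the (scaled) median
    absolute deviation are flagged. window and n_mad are illustrative values.
    """
    half = window // 2
    padded = np.pad(series, half, mode="edge")
    local_median = np.array([np.median(padded[i:i + window]) for i in range(len(series))])
    residual = series - local_median
    mad = np.median(np.abs(residual - np.median(residual)))
    return np.abs(residual) > n_mad * 1.4826 * mad  # 1.4826 scales MAD to a std-like unit

# Toy series: drift + smooth signal + noise, with two injected spikes.
rng = np.random.default_rng(1)
t = np.arange(2000)
series = 1e-3 * t + np.sin(t / 200.0) + rng.normal(0, 0.05, t.size)
series[500] += 2.0
series[1500] -= 3.0
print("flagged indices:", np.where(median_mad_outliers(series))[0])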
Artificial intelligence in radiology.
Hosny, Ahmed; Parmar, Chintan; Quackenbush, John; Schwartz, Lawrence H; Aerts, Hugo J W L
2018-05-17
Artificial intelligence (AI) algorithms, particularly deep learning, have demonstrated remarkable progress in image-recognition tasks. Methods ranging from convolutional neural networks to variational autoencoders have found myriad applications in the medical image analysis field, propelling it forward at a rapid pace. Historically, in radiology practice, trained physicians visually assessed medical images for the detection, characterization and monitoring of diseases. AI methods excel at automatically recognizing complex patterns in imaging data and providing quantitative, rather than qualitative, assessments of radiographic characteristics. In this Opinion article, we establish a general understanding of AI methods, particularly those pertaining to image-based tasks. We explore how these methods could impact multiple facets of radiology, with a general focus on applications in oncology, and demonstrate ways in which these methods are advancing the field. Finally, we discuss the challenges facing clinical implementation and provide our perspective on how the domain could be advanced.
Individualized Educational Assessment: Twelfth-Grade Science.
ERIC Educational Resources Information Center
Bock, R. Darrell; Zimowski, Michele
The goals, principles, and methods of an individualized educational assessment are described as implemented in a 12th-grade science assessment instrument undergoing field trials in Ohio. Pilot tests were planned for December 1990 and March and April 1991. The assessment design incorporates the duplex design of R. D. Bock and R. J. Mislevy (1988)…
ERIC Educational Resources Information Center
Kingdon, J. M.; Hartley, D. J.
1982-01-01
Candidates taking the University of London Advanced Level Biology Examination submit their practical/field-work notebooks for assessment (contributing 10 percent to the final grade). Describes research undertaken during the first operational examination, reviewing the assessment method and analyzing and discussing moderation techniques. Indicates assessment and…
Respirable dust and respirable silica exposure in Ontario gold mines.
Verma, Dave K; Rajhans, Gyan S; Malik, Om P; des Tombe, Karen
2014-01-01
A comprehensive survey of respirable dust and respirable silica in Ontario gold mines was conducted by the Ontario Ministry of Labor during 1978-1979. The aim was to assess the feasibility of introducing gravimetric sampling to replace the assessment method that used konimeters, devices that gave results in terms of the number of particles per cubic centimeter (ppcc) of air. The study involved both laboratory and field assessments. The field assessment involved measurement of airborne respirable dust and respirable silica at all eight operating gold mines of the time. This article describes the details of the field assessment. A total of 288 long-term (7-8 hr) personal respirable dust air samples were collected from seven occupational categories in eight gold mines. The respirable silica (α-quartz) was determined by the x-ray diffraction method. The results show that during 1978-1979, the industry-wide mean respirable dust was about 1 mg/m(3), and the mean respirable silica was 0.08 mg/m(3). The mean % silica in respirable dust was 7.5%. The data set would be useful in future epidemiological and health studies, as well as in the assessment of workers' compensation claims for occupational diseases such as silicosis, chronic obstructive pulmonary disease (COPD), and autoimmune diseases such as renal disease and rheumatoid arthritis.
DO TIE LABORATORY BASED ASSESSMENT METHODS REALLY PREDICT FIELD EFFECTS?
Sediment Toxicity Identification and Evaluation (TIE) methods have been developed for both porewaters and whole sediments. These relatively simple laboratory methods are designed to identify specific toxicants or classes of toxicants in sediments; however, the question of whethe...
ERIC Educational Resources Information Center
Ling, Guangming
2012-01-01
To assess the value of individual students' subscores on the Major Field Test in Business (MFT Business), I examined the test's internal structure with factor analysis and structural equation model methods, and analyzed the subscore reliabilities using the augmented scores method. Analyses of the internal structure suggested that the MFT Business…
Field test of a new Australian method of rangeland monitoring
Suzanne Mayne; Neil West
2001-01-01
Managers need more efficient means of monitoring changes on the lands they manage. Accordingly, a new Australian approach was field tested and compared to the Daubenmire method of assessing plant cover, litter, and bare soil. The study area was a 2 mile wide by 30.15 mile long strip, mostly covered by salt desert shrub ecosystem types, centered along the SE boundary of...
NASA Astrophysics Data System (ADS)
Salhi, Mohammed Adnan; Kazemipour, Alireza; Gentille, Gennaro; Spirito, Marco; Kleine-Ostmann, Thomas; Schrader, Thorsten
2016-09-01
We present the design and characterization of planar mm-wave patch antenna arrays with waveguide-to-microstrip transition using both near- and far-field methods. The arrays were designed for metrological assessment of error sources in antenna measurement. One antenna was designed for the automotive radar frequency range at 77 GHz, while another was designed for the frequency of 94 GHz, which is used, e.g., for imaging radar applications. In addition to the antennas, a simple transition from rectangular waveguide WR-10 to planar microstrip line on Rogers 3003™ substrate has been designed based on probe coupling. For determination of the far-field radiation pattern of the antennas, we compare results from two different measurement methods to simulations. Both a far-field antenna measurement system and a planar near-field scanner with near-to-far-field transformation were used to determine the antenna diagrams. The fabricated antennas achieve good matching and good agreement between measured and simulated antenna diagrams. The results also show that the far-field scanner achieves measurement results that are more accurate with respect to the simulations than those of the near-field scanner. The far-field antenna scanning system is built for metrological assessment and antenna calibration. The antennas are the first designed to be tested with this measurement system.
Methods for land use impact assessment: A review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perminova, Tataina, E-mail: tatiana.perminova@utt.fr; Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk; Sirina, Natalia, E-mail: natalia.sirina@utt.fr
Many types of methods to assess land use impact have been developed. Nevertheless, a systematic synthesis of all these approaches is necessary to highlight the most commonly used and most effective methods. Given the growing interest in this area of research, a review of the different methods of assessing land use impact (LUI) was performed using bibliometric analysis. One hundred eighty-seven articles from the agricultural, biological and environmental sciences were examined. According to our results, the most frequently used land use assessment methods are Life-Cycle Assessment, Material Flow Analysis/Input–Output Analysis, Environmental Impact Assessment and Ecological Footprint. Comparison of the methods allowed their specific features to be identified and led to the conclusion that a combination of several methods is the best basis for a comprehensive analysis of land use impact assessment. - Highlights: • We identified the most frequently used methods in land use impact assessment. • A comparison of the methods based on several criteria was carried out. • Agricultural land use is by far the most common area of study within the methods. • Incentive-driven methods, like LCA, arouse the most interest in this field.
NASA Astrophysics Data System (ADS)
Hand, J. W.
2008-08-01
Numerical modelling of the interaction between electromagnetic fields (EMFs) and the dielectrically inhomogeneous human body provides a unique way of assessing the resulting spatial distributions of internal electric fields, currents and rate of energy deposition. Knowledge of these parameters is of importance in understanding such interactions and is a prerequisite when assessing EMF exposure or when assessing or optimizing therapeutic or diagnostic medical applications that employ EMFs. In this review, computational methods that provide this information through full time-dependent solutions of Maxwell's equations are summarized briefly. This is followed by an overview of safety- and medical-related applications where modelling has contributed significantly to development and understanding of the techniques involved. In particular, applications in the areas of mobile communications, magnetic resonance imaging, hyperthermal therapy and microwave radiometry are highlighted. Finally, examples of modelling the potentially new medical applications of recent technologies such as ultra-wideband microwaves are discussed.
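As a minimal illustration of the time-domain solution of Maxwell's equations on which such dosimetric codes are built, the sketch below steps a one-dimensional FDTD (Yee) scheme in free space with a simple sinusoidal source; real exposure-assessment codes add three-dimensional grids, voxelized tissue properties, absorbing boundaries and SAR bookkeeping, none of which appear here.

import numpy as np

# 1-D FDTD in free space, normalized units, Courant number 0.5.
nx, n_steps = 400, 600
ez = np.zeros(nx)  # electric field samples
hy = np.zeros(nx)  # magnetic field samples (staggered half a cell)

for n in range(n_steps):
    # Update the magnetic field from the spatial difference of E.
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])
    # Update the electric field from the spatial difference of H.
    ez[1:] += 0.5 * (hy[1:] - hy[:-1])
    # Hard sinusoidal source in the middle of the grid (no absorbing boundaries).
    ez[nx // 2] = np.sin(2.0 * np.pi * 0.05 * n)

print("peak |Ez| on the grid after", n_steps, "steps:", np.abs(ez).max())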
Reconciling laboratory and field assessments of neonicotinoid toxicity to honeybees
Henry, Mickaël; Cerrutti, Nicolas; Aupinel, Pierrick; Decourtye, Axel; Gayrard, Mélanie; Odoux, Jean-François; Pissard, Aurélien; Rüger, Charlotte; Bretagnolle, Vincent
2015-01-01
European governments have banned the use of three common neonicotinoid pesticides due to insufficiently identified risks to bees. This policy decision is controversial given the absence of clear consistency between toxicity assessments of those substances in the laboratory and in the field. Although laboratory trials report deleterious effects in honeybees at trace levels, field surveys reveal no decrease in the performance of honeybee colonies in the vicinity of treated fields. Here we provide the missing link, showing that individual honeybees near thiamethoxam-treated fields do indeed disappear at a faster rate, but the impact of this is buffered by the colonies' demographic regulation response. Although we could ascertain the exposure pathway of thiamethoxam residues from treated flowers to honeybee dietary nectar, we uncovered an unexpected pervasive co-occurrence of similar concentrations of imidacloprid, another neonicotinoid normally restricted to non-entomophilous crops in the study country. Thus, its origin and transfer pathways through the succession of annual crops need be elucidated to conveniently appraise the risks of combined neonicotinoid exposures. This study reconciles the conflicting laboratory and field toxicity assessments of neonicotinoids on honeybees and further highlights the difficulty in actually detecting non-intentional effects on the field through conventional risk assessment methods. PMID:26582026
Peng, Henry T; Savage, Erin; Vartanian, Oshin; Smith, Shane; Rhind, Shawn G; Tenn, Catherine; Bjamason, Stephen
2016-05-01
A convenient biosensor for real-time measurement of biomarkers for in-field psychophysiological stress research and military operations is desirable. We evaluated a hand-held device for measuring salivary amylase as a stress marker in medical technicians undergoing combat casualty care training using two different modalities in operating room and field settings. Salivary amylase activity was measured by two biosensor methods: directly sampling saliva with a test strip placed under the tongue or pipetting a fixed volume of precollected saliva onto the test strip, followed by analyzing the sample on the strip using a biosensor. The two methods were compared for their accuracy and sensitivity to detect the stress response using an enzyme assay method as a standard. The measurements from the under-the-tongue method were not as consistent with those from the standard assay method as the values obtained from the pipetting method. The under-the-tongue method did not detect any significant increase in the amylase activity due to stress in the operating room (P > 0.1), in contrast to the significant increases observed using the pipetting method and assay method with a significance level less than 0.05 and 0.1, respectively. Furthermore, the under-the-tongue method showed no increased amylase activity in the field testing, while both the pipetting method and assay method showed increased amylase activity in the same group (P < 0.1). The accuracy and consistency of the biosensors need to be improved when used to directly measure salivary amylase activity under the tongue for stress assessment in military medical training. © 2015 Her Majesty the Queen in Right of Canada. Journal of Clinical Laboratory Analysis published by Wiley Periodicals, Inc. Reproduced with the permission DRDC Editorial Board.
Analysis of rainfall-induced slope instability using a field of local factor of safety
Lu, Ning; Şener-Kaya, Başak; Wayllace, Alexandra; Godt, Jonathan W.
2012-01-01
Slope-stability analyses are mostly conducted by identifying or assuming a potential failure surface and assessing the factor of safety (FS) of that surface. This approach of assigning a single FS to a potentially unstable slope provides little insight into where the failure initiates or the ultimate geometry and location of a landslide rupture surface. We describe a method to quantify a scalar field of FS based on the concept of the Coulomb stress and the shift in the state of stress toward failure that results from rainfall infiltration. The FS at each point within a hillslope is called the local factor of safety (LFS) and is defined as the ratio of the Coulomb stress at the current state of stress to the Coulomb stress of the potential failure state under the Mohr-Coulomb criterion. Comparative assessment with limit-equilibrium and hybrid finite element limit-equilibrium methods shows that the proposed LFS is consistent with these approaches and yields additional insight into the geometry and location of the potential failure surface and how instability may initiate and evolve with changes in pore water conditions. Quantitative assessments applying the new LFS field method to slopes under infiltration conditions demonstrate that the LFS has the potential to overcome several major limitations in the classical FS methodologies such as the shape of the failure surface and the inherent underestimation of slope instability. Comparison with infinite-slope methods, including a recent extension to variably saturated conditions, shows further enhancement in assessing shallow landslide occurrence using the LFS methodology. Although we use only a linear elastic solution for the state of stress with no post-failure analysis, which would require more sophisticated elastoplastic or other theories, the LFS provides a new means to quantify the potential instability zones in hillslopes under variably saturated conditions using stress-field-based methods.
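A pointwise Mohr-Coulomb safety factor of the general kind described can be sketched as follows; the construction below compares the current maximum shear stress with the shear stress at failure for the same mean stress, which is a common textbook form and only approximates the authors' local factor of safety, whose treatment of suction stress and sign conventions under variable saturation is not reproduced here.

import numpy as np

def local_factor_of_safety(sigma1, sigma3, cohesion, phi_deg):
    """Generic pointwise Mohr-Coulomb safety factor from effective principal stresses.

    Compares the radius of the Mohr circle at the current stress state with the
    radius of the circle that would just touch the Mohr-Coulomb envelope at the
    same mean stress. Values below 1 indicate local failure in this convention.
    """
    phi = np.radians(phi_deg)
    tau_current = 0.5 * (sigma1 - sigma3)                              # current maximum shear
    sigma_mean = 0.5 * (sigma1 + sigma3)                               # center of Mohr circle
    tau_failure = cohesion * np.cos(phi) + sigma_mean * np.sin(phi)    # radius at failure
    return tau_failure / tau_current

# Hypothetical effective principal stresses (kPa) at two points in a hillslope.
print(local_factor_of_safety(sigma1=np.array([80.0, 120.0]),
                             sigma3=np.array([30.0, 20.0]),
                             cohesion=5.0, phi_deg=30.0))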
ERIC Educational Resources Information Center
Wang, Yanqing; Ai, Wenguo; Liang, Yaowen; Liu, Ying
2015-01-01
Peer assessment is an efficient and effective learning assessment method that has been used widely in diverse fields in higher education. Despite its many benefits, a fundamental problem in peer assessment is that participants lack the motivation to assess others' work faithfully and fairly. Nonconsensus is a common challenge that makes the…
Near-field hazard assessment of March 11, 2011 Japan Tsunami sources inferred from different methods
Wei, Y.; Titov, V.V.; Newman, A.; Hayes, G.; Tang, L.; Chamberlin, C.
2011-01-01
The tsunami source is the origin of the subsequent transoceanic water waves, and thus the most critical component in modern tsunami forecast methodology. Although impractical to quantify directly, a tsunami source can be estimated by different methods based on a variety of measurements provided by deep-ocean tsunameters, seismometers, GPS, and other advanced instruments, some in real time, some in post real-time. Here we assess these different sources of the devastating March 11, 2011 Japan tsunami by model-data comparison for generation, propagation and inundation in the near field of Japan. This comparative study helps to further clarify the advantages and shortcomings of different methods that may potentially be used in real-time warning and forecast of tsunami hazards, especially in the near field. The model study also highlights the critical role of deep-ocean tsunami measurements for high-quality tsunami forecasts, and their combination with land GPS measurements may lead to a better understanding of both the earthquake mechanism and the tsunami generation process. © 2011 MTS.
Assessment of exposure to EMF in a Danish case-control study of childhood cancer.
Jensen, J K; Olsen, J H; Folkersen, E
1994-01-01
In Denmark it is permitted to draw overhead lines across residential areas. In connection with a Danish case-control study, we developed a method for estimating the historical values of magnetic fields at residences. The study included 1,707 cases of childhood cancer and 4,788 matched population controls. A total of 16,082 different addresses had been occupied by the families from the time of conception until the date of diagnosis. The values of the extreme, maximum, middle and minimum 50 Hz magnetic field strengths originating from a 50-400 kV high-voltage installation were estimated for each of the dwellings included in a potential exposure area. Thirty children were exposed to an average level of magnetic fields of 0.1 microT or more. The evaluated Danish method of exposure assessment was compared with the method for residential wiring codes developed by Wertheimer and Leeper /1/. We concluded that the US wiring codes are inappropriate for use in connection with the Danish electricity transmission system.
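For orientation, the 50 Hz magnetic flux density near an overhead line can be approximated by superposing the fields of long straight phase conductors; the sketch below does this for a single three-phase circuit and is a strong simplification of the Danish calculation method described above (conductor sag, ground return and multiple circuits are ignored, and the geometry and currents are hypothetical).

import numpy as np

MU0 = 4.0e-7 * np.pi  # vacuum permeability (T*m/A)

def b_field_rms_ut(x, y, conductors):
    """RMS flux density (microtesla) at point (x, y) from infinite straight conductors.

    conductors: list of (x_c, y_c, current_rms_A, phase_deg). Each conductor
    contributes B = mu0 * I / (2*pi*r) directed perpendicular to the radius;
    the phasor components are summed before taking the RMS magnitude.
    """
    bx, by = 0j, 0j
    for x_c, y_c, i_rms, phase_deg in conductors:
        dx, dy = x - x_c, y - y_c
        r2 = dx * dx + dy * dy
        i_phasor = i_rms * np.exp(1j * np.radians(phase_deg))
        coef = MU0 * i_phasor / (2.0 * np.pi * r2)
        bx += coef * (-dy)   # tangential field direction (-dy, dx)/r
        by += coef * dx
    return 1e6 * np.sqrt(abs(bx) ** 2 + abs(by) ** 2)

# Hypothetical 400 kV flat-configuration line: phases at -8, 0 and +8 m, 15 m height,
# 500 A per phase, evaluated 1.5 m above ground at 30 m lateral distance.
line = [(-8.0, 15.0, 500.0, 0.0), (0.0, 15.0, 500.0, 120.0), (8.0, 15.0, 500.0, 240.0)]
print(f"{b_field_rms_ut(30.0, 1.5, line):.2f} microtesla")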
We developed an index of relative bed stability (LRBS) based on low flow survey data collected using the U.S. Environmental Protection Agency’s Environmental Monitoring and Assessment Program (EMAP) field methods to assess anthropogenic sedimentation in streams. LRBS is the log ...
Poster Sessions in Marketing Education: An Empirical Examination
ERIC Educational Resources Information Center
Stegemann, Nicole; Sutton-Brady, Catherine
2009-01-01
Poster sessions provide a creative and stimulating alternative to traditional assessment methods in marketing. Poster sessions, as a means of assessment, have long been used in science fields. This article presents the successful implementation of poster sessions as a means of assessment in a postgraduate unit of study. Poster sessions in…
Israel, Michel
2015-09-01
The exposure and risk evaluation process in Bulgaria concerning non-ionizing radiation health and safety started in the early 1970s, when the first research laboratory, "Electromagnetic fields in the working environment", was founded within the Centre of Hygiene of the Medical Academy, Sofia. The main activities involved developing legislation, new equipment for measurement of electromagnetic fields, new methods for measurement and exposure assessment, in vivo and human studies to develop methods, studying the effects of non-ionizing radiation on the human body, and developing exposure limits. Most occupational settings, such as the metal industry, plastic welding, the energy sector, physiotherapy, broadcasting, telephone stations, and the computer industry, have been covered by epidemiological investigations and risk evaluation. In 1986, the ANSI standard for safe use of lasers was implemented as national legislation, which initiated studies in the field of risk assessment concerning the use of lasers in industry and medicine. Environmental exposure studies started in 1991, following the very fast rollout of telecommunication technologies. At present, funds for research are very limited, and studies in the field of risk assessment are few. Nevertheless, Bulgaria has been an active member of the WHO International EMF Project since 1997, which provides a good opportunity for collaboration with other Member States and for implementing a new approach in EMF policy for protecting workers and the general public against non-ionizing radiation exposure.
A new method of quantitative cavitation assessment in the field of a lithotripter.
Jöchle, K; Debus, J; Lorenz, W J; Huber, P
1996-01-01
Transient cavitation appears to be a very important effect in the interaction of pulsed high-energy ultrasound with biologic tissues. Using a newly developed laser optical system we are able to determine the life-span of transient cavities (relative error less than +/- 5%) in the focal region of a lithotripter (Lithostar, Siemens). The laser scattering method is based on the detection of scattered laser light reflected during a bubble's life. The method requires no sensor material in the pathway of the sound field and thus avoids any interference with bubble dynamics during the measurement. Knowledge of the time of bubble decay allows conclusions to be drawn about the destructive power of the cavities. By combining the results of life-span measurements with the maximum bubble radius obtained from stroboscopic photographs, we found that the measured time of bubble decay and the time predicted by Rayleigh's law differ by only about 13%, even in the case of complex bubble fields. It can be shown that the laser scattering method is feasible for assessing cavitation events quantitatively. Moreover, it will enable us to compare different medical ultrasound sources that are capable of generating cavitation.
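For reference, Rayleigh's law mentioned above gives the collapse time of an empty spherical cavity as roughly 0.915 R_max sqrt(rho / delta_p). A minimal sketch, with placeholder bubble radius, pressure and measured decay time rather than the lithotripter measurements:

    import math

    def rayleigh_collapse_time(r_max, rho=998.0, delta_p=1.013e5):
        """Classical Rayleigh collapse time of an empty spherical cavity (s).

        r_max   : maximum bubble radius (m)
        rho     : liquid density (kg/m^3)
        delta_p : driving pressure difference between liquid and cavity (Pa)
        """
        return 0.915 * r_max * math.sqrt(rho / delta_p)

    r_max = 1.0e-3                 # placeholder: 1 mm maximum radius
    t_pred = rayleigh_collapse_time(r_max)
    t_measured = 100e-6            # hypothetical measured decay time (s)
    deviation = abs(t_measured - t_pred) / t_pred * 100.0
    print(f"predicted {t_pred*1e6:.1f} us, measured {t_measured*1e6:.1f} us, "
          f"deviation {deviation:.1f}%")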
Constructing a Grounded Theory of E-Learning Assessment
ERIC Educational Resources Information Center
Alonso-Díaz, Laura; Yuste-Tosina, Rocío
2015-01-01
This study traces the development of a grounded theory of assessment in e-learning environments, a field in need of research to establish the parameters of an assessment that is both reliable and worthy of higher learning accreditation. Using grounded theory as a research method, we studied an e-assessment model that does not require physical…
Assessment of soil moisture dynamics on an irrigated maize field using cosmic ray neutron sensing
NASA Astrophysics Data System (ADS)
Scheiffele, Lena Maria; Baroni, Gabriele; Oswald, Sascha E.
2015-04-01
In recent years cosmic ray neutron sensing (CRS) has developed into a valuable, indirect and non-invasive method to estimate soil moisture at the scale of tens of hectares, covering the gap between point-scale measurements and large-scale remote sensing techniques. The method is particularly promising in cropped and irrigated fields, where invasive installation of belowground measurement devices could conflict with agricultural management. However, CRS is affected by all hydrogen pools in the measurement footprint, and fast-growing biomass poses challenges for interpreting the signal and applying the method to detect soil moisture. To this end, in this study a cosmic ray probe was installed on a field near Braunschweig (Germany) during one maize growing season (2014). The field was irrigated in stripes of 50 m width using sprinkler devices for a total of seven events. Three soil sampling campaigns were conducted throughout the growing season to assess the effect of different hydrogen pools on calibration results. Additionally, leaf area index and biomass measurements were collected to quantify the relative contribution of the biomass to the CRS signal. Calibration results obtained with the different soil sampling campaigns showed some discrepancies that were well correlated with biomass growth. However, after the calibration function was adjusted to also account for lattice water and soil organic carbon, thus representing an equivalent water content of the soil, the differences decreased. Soil moisture estimated with CRS responded well to precipitation and irrigation events, also confirming the effective footprint of the method (i.e., a radius of about 300 m) and revealing water stress occurring for the crop. The dynamics are thus in agreement with the soil moisture determined from point-scale measurements, but are less affected by the heterogeneous moisture conditions within the field. For this reason, with a detailed calibration, CRS proves to be a valuable method for application at agricultural sites to assess and improve irrigation management.
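A minimal sketch of the kind of calibration function commonly used to convert moderated neutron counts to soil moisture, assuming the widely cited form theta = a0/(N/N0 - a1) - a2 with the usual literature parameter values, and subtracting lattice water and an organic-carbon water equivalent as the abstract describes. The study may have used a site-specific variant, and all numbers below are placeholders.

    def crns_soil_moisture(n_counts, n0, bulk_density,
                           lattice_water=0.0, soc_water_equiv=0.0,
                           a0=0.0808, a1=0.372, a2=0.115):
        """Volumetric soil moisture (m3/m3) from moderated neutron counts.

        Uses the commonly cited calibration shape theta_g = a0/(N/N0 - a1) - a2,
        where theta_g is total gravimetric hydrogen expressed as water (g/g).
        Lattice water and the water equivalent of soil organic carbon are
        removed so that only pore water remains.
        """
        theta_total = a0 / (n_counts / n0 - a1) - a2      # g water per g dry soil
        theta_pore = theta_total - lattice_water - soc_water_equiv
        return max(theta_pore, 0.0) * bulk_density        # convert to volumetric

    # Placeholder counts, reference count rate and soil properties.
    print(round(crns_soil_moisture(n_counts=2200.0, n0=3200.0, bulk_density=1.4,
                                   lattice_water=0.02, soc_water_equiv=0.01), 3))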
DEVELOPMENT AND APPLICATION OF METHODS TO ASSESS HUMAN EXPOSURE TO PESTICIDES
Note: this task is scheduled to end September 2003. Two tasks will take its place: method development for emerging pesticides, including chiral chemistry applications, and in-house laboratory operations. Field sampling methods are covered under a new task proposed this year.
Assessment of Proficiency and Competency in Laboratory Animal Biomethodologies
Clifford, Paula; Melfi, Natasha; Bogdanske, John; Johnson, Elizabeth J; Kehler, James; Baran, Szczepan W
2013-01-01
Personnel working with laboratory animals are required by laws and guidelines to be trained and qualified to perform biomethodologic procedures. The assessment of competency and proficiency is a vital component of a laboratory animal training program, because this process confirms that the trainees have met the learning objectives for a particular procedure. The approach toward qualification assessment differs between organizations because laws and guidelines do not outline how the assessment should be performed or which methods and tools should be used. Assessment of clinical and surgical medicine has received considerable attention over the last few decades and has progressed from simple subjective methods to well-defined and objective methods of assessing competency. Although biomethodology competency and proficiency assessment is discussed in the literature, a standard and objective assessment method has not yet been developed. The development and implementation of an objective and standardized biomethodologic assessment program can serve as a tool to improve standards, ensure consistent training, and decrease research variables yet ensure animal welfare. Here we review the definition and goals of training and assessment, review assessment methods, and propose a method to develop a standard and objective assessment program for the laboratory animal science field, particularly training departments and IACUC. PMID:24351758
Narayan, Sreenath; Kalhan, Satish C.; Wilson, David L.
2012-01-01
Purpose: To reduce swaps in fat-water separation methods, a particular issue on 7T small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Materials and Methods: Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Results: Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Conclusion: Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. PMID:23023815
NASA Astrophysics Data System (ADS)
Wu, Jingjing; Wu, Xinming; Li, Pengfei; Li, Nan; Mao, Xiaomei; Chai, Lihe
2017-04-01
The meridian system is not only the basis of traditional Chinese medicine (TCM) methods (e.g., acupuncture, massage), but also the core of TCM's basic theory. This paper introduces a new informational perspective for understanding the reality and the holographic field of meridians. Based on the maximum information entropy principle (MIEP), a dynamic equation for the holographic field is deduced, reflecting the evolutionary characteristics of meridians. Using a self-organizing artificial neural network as the algorithm, the evolutionary dynamic equation of the holographic field can be solved to assess properties of meridians and clinically diagnose the health characteristics of patients. Finally, through cases from clinical patients (e.g., a 30-year-old male patient, an apoplectic patient, an epilepsy patient), we use this model to assess the evolutionary properties of meridians. It is shown that this model not only has significant implications for revealing the essence of the meridian in TCM, but may also play a guiding role in clinical assessment of patients based on the holographic field of meridians.
Agarwal, Siddharth; Sethi, Vani; Pandey, Ravindra Mohan; Kondal, Dimple
2008-06-01
We examined the diagnostic accuracy of the human touch (HT) method for assessing hypothermia against axillary digital thermometry (ADT) by a trained non-medical field investigator (who supervised activities of community health volunteers) in seven villages of Agra district, Uttar Pradesh, India. Body temperature of 148 newborns born between March and August 2005 was measured at four points in time for each enrolled newborn (within 48 h and on days 7, 30 and 60) by the field investigator under the axilla using a digital thermometer and by the HT method using standard methodology, giving a total of 533 observations. Hypothermia assessed by HT was in agreement with that assessed by ADT (<36.5 degrees C) in 498 observations. Hypothermia assessed by HT showed high diagnostic accuracy when compared against ADT (kappa 0.65-0.81; sensitivity 74%; specificity 96.7%; positive likelihood ratio 22; negative likelihood ratio 0.26). HT is a simple, quick, inexpensive and programmatically important method. However, being a subjective assessment, its reliability depends on the investigator being adequately trained and competent in making consistently accurate assessments. There is also a need to assess whether, with training and supervision, even less literate mothers, traditional birth attendants and community health volunteers can accurately assess mild and moderate hypothermia before HT is promoted for early identification of neonatal risk in community-based programs.
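The agreement statistics quoted above can be reproduced from a 2x2 table of screening results against the reference method. A small sketch with hypothetical counts (not the study's raw data) chosen to give similar sensitivity and specificity:

    def diagnostic_stats(tp, fp, fn, tn):
        """Sensitivity, specificity, Cohen's kappa and likelihood ratios for a
        screening test (e.g. human touch) against a reference (e.g. axillary
        digital thermometry), from a 2x2 table of counts."""
        n = tp + fp + fn + tn
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        po = (tp + tn) / n                                           # observed agreement
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
        kappa = (po - pe) / (1 - pe)
        lr_pos = sens / (1 - spec)
        lr_neg = (1 - sens) / spec
        return sens, spec, kappa, lr_pos, lr_neg

    # Hypothetical counts: 100 hypothermic and 433 normothermic observations.
    sens, spec, kappa, lrp, lrn = diagnostic_stats(tp=74, fp=14, fn=26, tn=419)
    print(f"sensitivity={sens:.2f} specificity={spec:.3f} kappa={kappa:.2f} "
          f"LR+={lrp:.1f} LR-={lrn:.2f}")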
Gravity gradient preprocessing at the GOCE HPF
NASA Astrophysics Data System (ADS)
Bouman, J.; Rispens, S.; Gruber, T.; Schrama, E.; Visser, P.; Tscherning, C. C.; Veicherts, M.
2009-04-01
Among the products derived from the GOCE observations are the gravity gradients. These gravity gradients are provided in the Gradiometer Reference Frame (GRF) and are calibrated in-flight using satellite shaking and star sensor data. In order to use these gravity gradients for applications in Earth sciences and gravity field analysis, additional pre-processing needs to be done, including corrections for temporal gravity field signals to isolate the static gravity field part, screening for outliers, calibration by comparison with existing external gravity field information, and error assessment. The temporal gravity gradient corrections consist of tidal and non-tidal corrections. These are all generally below the gravity gradient error level, which is predicted to show a 1/f behaviour for low frequencies. In the outlier detection the 1/f error is compensated for by subtracting a local median from the data, while the data error is assessed using the median absolute deviation; both the local median, which acts as a high-pass filter, and the median absolute deviation are robust. Three different methods have been implemented for the calibration of the gravity gradients. All three methods use a high-pass filter to compensate for the 1/f gravity gradient error. The baseline method uses state-of-the-art global gravity field models, and the most accurate results are obtained if star sensor misalignments are estimated along with the calibration parameters. A second calibration method uses GOCE GPS data to estimate a low-degree gravity field model as well as gravity gradient scale factors. Both methods allow gravity gradient scale factors to be estimated down to the 10⁻³ level. The third calibration method uses highly accurate terrestrial gravity data in selected regions to validate the gravity gradient scale factors, focusing on the measurement band. Gravity gradient scale factors may be estimated down to the 10⁻² level with this method.
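A minimal sketch of the robust outlier screening idea described above (a running local median as high-pass filter, with residuals tested against a threshold in median absolute deviations). The window length and threshold are placeholders, not the GOCE HPF settings.

    import numpy as np

    def screen_outliers(series, window=51, n_mad=5.0):
        """Flag outliers in a time series whose noise grows as 1/f at low
        frequencies. The local median removes the slow error, and residuals
        larger than n_mad scaled median absolute deviations are flagged."""
        x = np.asarray(series, dtype=float)
        half = window // 2
        padded = np.pad(x, half, mode="edge")
        local_med = np.array([np.median(padded[i:i + window]) for i in range(x.size)])
        resid = x - local_med
        mad = np.median(np.abs(resid - np.median(resid)))
        sigma = 1.4826 * mad                    # MAD scaled to a Gaussian sigma
        return np.abs(resid) > n_mad * sigma    # boolean outlier mask

    rng = np.random.default_rng(0)
    signal = np.cumsum(rng.normal(0, 1, 2000)) * 0.01 + rng.normal(0, 0.5, 2000)
    signal[700] += 20.0                          # inject one gross outlier
    print(np.flatnonzero(screen_outliers(signal)))   # expect index 700 to be flagged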
Ball, Sarah C; Benjamin, Sara E; Ward, Dianne S
2007-04-01
To our knowledge, a direct observation protocol for assessing dietary intake among young children in child care has not been published. This article reviews the development and testing of a diet observation system for child care facilities that occurred during a larger intervention trial. Development of this system was divided into five phases, done in conjunction with the larger intervention study: (a) protocol development, (b) training of field staff, (c) certification of field staff in a laboratory setting, (d) implementation in a child-care setting, and (e) certification of field staff in a child-care setting. During the certification phases, methods were used to assess the accuracy and reliability of all observers at estimating types and amounts of food and beverages commonly served in child care. Tests of agreement show strong agreement among five observers, as well as strong accuracy between the observers and 20 measured portions of foods and beverages, with a mean intraclass correlation coefficient value of 0.99. This structured observation system shows promise as a valid and reliable approach for assessing dietary intake of children in child care and makes a valuable contribution to the growing body of literature on the dietary assessment of young children.
Hill, Ryan C; Oman, Trent J; Wang, Xiujuan; Shan, Guomin; Schafer, Barry; Herman, Rod A; Tobias, Rowel; Shippar, Jeff; Malayappan, Bhaskar; Sheng, Li; Xu, Austin; Bradshaw, Jason
2017-07-12
As part of the regulatory approval process in Europe, comparison of endogenous soybean allergen levels between genetically engineered (GE) and non-GE plants has been requested. A quantitative multiplex analytical method using tandem mass spectrometry was developed and validated to measure 10 potential soybean allergens in soybean seed. The analytical method was implemented at six laboratories to demonstrate its robustness and was further applied to three soybean field studies across multiple growing seasons (including 21 non-GE soybean varieties) to assess the natural variation of allergen levels. The results show that environmental factors contribute more than genetic factors to the large variation in allergen abundance (2- to 50-fold between environmental replicates) and that Gly m 5 and Gly m 6 dominate the total allergen profile, calling into question the scientific rationale for measuring endogenous allergen levels between GE and non-GE varieties in the safety assessment.
Laamrani, Ahmed; Pardo Lara, Renato; Berg, Aaron A; Branson, Dave; Joosse, Pamela
2018-02-27
Quantifying the amount of crop residue left in the field after harvest is a key issue for sustainability. Conventional assessment approaches (e.g., line-transect) are labor intensive, time-consuming and costly. Many proximal remote sensing devices and systems have been developed for agricultural applications such as cover crop and residue mapping. For instance, current mobile devices (smartphones and tablets) are usually equipped with digital cameras and global positioning systems and use applications (apps) for in-field data collection and analysis. In this study, we assess the feasibility and strength of a mobile device app developed to estimate crop residue cover. The performance of this novel technique (from here on referred to as the "app" method) was compared against two point-counting approaches: an established digital photograph-grid method and a new automated residue counting script developed in MATLAB at the University of Guelph. Both photograph-grid and script methods were used to count residue under 100 grid points. Residue percent cover was estimated using the app, script and photograph-grid methods on 54 vertical digital photographs (images of the ground taken from above at a height of 1.5 m) collected from eighteen fields (9 corn and 9 soybean, 3 samples each) located in southern Ontario. Results showed that residue estimates from the app method were in good agreement with those obtained from both photograph-grid and script methods (R² = 0.86 and 0.84, respectively). The app underestimated residue coverage relative to the photograph-grid and script methods, with biases of -6.3% and -10.8%, respectively. With regard to residue type, soybean had a slightly lower bias than corn (-5.3% vs. -7.4%). For photos with residue <30%, the app-derived residue measurements are within ±5% (bias) of both the photograph-grid- and script-derived measurements. These methods could therefore be used to track the recommended minimum soil residue cover of 30%, implemented to reduce farmland topsoil and nutrient losses that impact water quality. Overall, the app method was found to be a good alternative to the point-counting methods, which are more time-consuming.
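A minimal sketch of the point-counting idea underlying the photograph-grid and script methods: classify the image under a regular 10 x 10 grid of points and report percent cover. The brightness threshold rule is a simplification, not the actual classification used by either method, and the synthetic image is a placeholder.

    import numpy as np

    def residue_cover_percent(gray_image, n_rows=10, n_cols=10, threshold=0.55):
        """Percent residue cover from a regular grid of sample points.

        gray_image : 2-D array of brightness values in [0, 1]; crop residue is
                     assumed to appear brighter than bare soil (a simplification).
        Returns the share of grid points classified as residue, in percent.
        """
        img = np.asarray(gray_image, dtype=float)
        h, w = img.shape
        rows = (np.arange(n_rows) + 0.5) / n_rows * h
        cols = (np.arange(n_cols) + 0.5) / n_cols * w
        hits = sum(img[int(r), int(c)] > threshold for r in rows for c in cols)
        return 100.0 * hits / (n_rows * n_cols)

    # Synthetic image: dark soil with a brighter band of residue across the middle.
    img = np.full((300, 450), 0.3)
    img[120:180, :] = 0.8
    print(residue_cover_percent(img))   # about 20% of grid points fall on the band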
Sousa, Marcelo R; Jones, Jon P; Frind, Emil O; Rudolph, David L
2013-01-01
In contaminant travel from ground surface to groundwater receptors, the time taken in travelling through the unsaturated zone is known as the unsaturated zone time lag. Depending on the situation, this time lag may or may not be significant within the context of the overall problem. A method is presented for assessing the importance of the unsaturated zone in the travel time from source to receptor in terms of estimates of both the absolute and the relative advective times. A choice of different techniques for both unsaturated and saturated travel time estimation is provided. This method may be useful for practitioners to decide whether to incorporate unsaturated processes in conceptual and numerical models and can also be used to roughly estimate the total travel time between points near ground surface and a groundwater receptor. This method was applied to a field site located in a glacial aquifer system in Ontario, Canada. Advective travel times were estimated using techniques with different levels of sophistication. The application of the proposed method indicates that the time lag in the unsaturated zone is significant at this field site and should be taken into account. For this case, sophisticated and simplified techniques lead to similar assessments when the same knowledge of the hydraulic conductivity field is assumed. When there is significant uncertainty regarding the hydraulic conductivity, simplified calculations did not lead to a conclusive decision. Copyright © 2012 Elsevier B.V. All rights reserved.
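A minimal sketch of the simplest advective travel-time estimate (length divided by seepage velocity, with seepage velocity K i / theta), which is one kind of simplified technique such an assessment can draw on. The authors' specific techniques and the site parameters are not reproduced here, and all values below are placeholders.

    def advective_time_years(length_m, hydraulic_conductivity_m_per_d,
                             gradient, effective_water_content):
        """Advective travel time = length / seepage velocity, where
        seepage velocity = K * i / theta (Darcy flux over mobile water content)."""
        seepage_velocity = (hydraulic_conductivity_m_per_d * gradient
                            / effective_water_content)
        return length_m / seepage_velocity / 365.25

    # Placeholder values: 8 m thick unsaturated zone under unit gradient,
    # 200 m saturated flow path to the receptor.
    t_unsat = advective_time_years(8.0, 0.001, 1.0, 0.25)
    t_sat = advective_time_years(200.0, 5.0, 0.005, 0.30)
    total = t_unsat + t_sat
    print(f"unsaturated {t_unsat:.1f} y, saturated {t_sat:.1f} y, "
          f"unsaturated fraction {100 * t_unsat / total:.0f}%")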
K. Bruce Jones; Anne C. Neale; Timothy G. Wade; James D. Wickham; Chad L. Cross; Curtis M. Edmonds; Thomas R. Loveland; Maliha S. Nash; Kurt H. Riitters; Elizabeth R. Smith
2001-01-01
Spatially explicit identification of changes in ecological conditions over large areas is key to targeting and prioritizing areas for environmental protection and restoration by managers at watershed, basin, and regional scales. A critical limitation to this point has been the lack of methods to conduct such broad-scale assessments. Field-based methods have...
NASA Astrophysics Data System (ADS)
Cramer, Timothy F.
The Desert National Wildlife Refuge in southern Nevada has been selected for remote sensing analysis as part of a mineral assessment required for renewal of mineral withdrawal. The area of interest is nearly 3,000 km² and covers portions of 5 different ranges with little to no infrastructure. Assessing such a large area using traditional field methods is very time intensive and expensive. The study described here serves as a pilot study, testing the capability of Landsat ETM+ and ASTER satellite imagery to remotely identify areas of potentially mineralized lithologies. This is done by generating a number of band ratio, band index, and mineral likelihood maps identifying 5 key mineral classes (silica, clay, iron oxide, dolomite and calcite), which commonly show patterned zonation around ore deposits. When compiled with available geologic and geochemical data sets, these intermediate products can provide guidance for targeted field evaluation and exploration. Field observations and spectral data collected in the laboratory can then be integrated with ASTER imagery to guide a Spectral Angle Mapper algorithm to generate a distribution map of the five mineral classes. The methods presented found the ASTER platform to be capable of remotely assessing the distribution of various lithologies and the mineral potential of large, remote areas. Furthermore, areas of both high and low potential for ore deposits can be identified and used to guide field evaluation and exploration. Remote sensing studies of this caliber can be performed relatively quickly and inexpensively, producing datasets that support more accurate mapping and the identification of both lithologic boundaries and previously unidentified alteration associated with mineralization. Future mineral assessments and exploration activity should consider similar studies prior to field work.
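The Spectral Angle Mapper step mentioned above reduces to computing the angle between each pixel spectrum and a set of reference spectra and assigning the class with the smallest angle. A minimal sketch with toy four-band spectra standing in for the field and laboratory reference spectra:

    import numpy as np

    def spectral_angle_mapper(pixels, references):
        """Classify pixel spectra by the smallest spectral angle to a set of
        reference spectra: angle = arccos( p.r / (|p| |r|) ).

        pixels     : (n_pixels, n_bands) array of image spectra
        references : (n_classes, n_bands) array of library/field spectra
        Returns the best-matching class index per pixel and the angle matrix.
        """
        p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
        r = references / np.linalg.norm(references, axis=1, keepdims=True)
        angles = np.arccos(np.clip(p @ r.T, -1.0, 1.0))   # (n_pixels, n_classes)
        return np.argmin(angles, axis=1), angles

    # Toy spectra; reference rows stand in for e.g. silica, clay, calcite.
    refs = np.array([[0.9, 0.7, 0.3, 0.2],
                     [0.4, 0.6, 0.8, 0.5],
                     [0.2, 0.3, 0.5, 0.9]])
    pix = np.array([[0.85, 0.65, 0.35, 0.25],
                    [0.25, 0.35, 0.45, 0.85]])
    labels, _ = spectral_angle_mapper(pix, refs)
    print(labels)   # expected: [0 2]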
Bromage, Erin S; Vadas, George G; Harvey, Ellen; Unger, Michael A; Kaattari, Stephen L
2007-10-15
Nitroaromatics are common pollutants of soil and groundwater at military installations because of their manufacture, storage, and use at these sites. Long-term monitoring of these pollutants comprises a significant percentage of restoration costs. Further, remediation activities often have to be delayed while the samples are processed via traditional chemical assessment protocols. Here we describe a rapid (<5 min), cost-effective, accurate method using a KinExA Inline Biosensor for monitoring of 2,4,6-trinitrotoluene (TNT) in field water samples. The biosensor, which is based on KinExA technology, accurately estimated the concentration of TNT in double-blind comparisons, with accuracy similar to traditional high-performance liquid chromatography (HPLC). In the assessment of field samples, the biosensor accurately predicted the concentration of TNT over the range of 1-30,000 microg/L when compared to either HPLC or quantitative gas chromatography-mass spectrometry (GC-MS). Various pre-assessment techniques were explored to examine whether field samples could be assessed untreated, without the removal of particulates or the use of solvents. In most cases, the KinExA Inline Biosensor gave a uniform assessment of TNT concentration independent of pretreatment method, indicating that this sensor holds significant promise for rapid, on-site assessment of TNT pollution in environmental water samples.
Employing broadband spectra and cluster analysis to assess thermal defoliation of cotton
USDA-ARS?s Scientific Manuscript database
Growers and field scouts need assistance in surveying cotton (Gossypium hirsutum L.) fields subjected to thermal defoliation to reap the benefits provided by this nonchemical defoliation method. A study was conducted to evaluate broadband spectral data and unsupervised classification as tools for s...
Assessing User Needs and Requirements for Assistive Robots at Home.
Werner, Katharina; Werner, Franz
2015-01-01
'Robots in healthcare' is currently a trending topic. This paper gives an overview of methods commonly used to gather user needs and requirements in research projects in the field of assistive robotics. Strategies common to different authors are presented, as well as examples of exceptions, which can help future researchers find methods suitable for their own work. Typical problems of the field are discussed and partial solutions are proposed.
Reconciling laboratory and field assessments of neonicotinoid toxicity to honeybees.
Henry, Mickaël; Cerrutti, Nicolas; Aupinel, Pierrick; Decourtye, Axel; Gayrard, Mélanie; Odoux, Jean-François; Pissard, Aurélien; Rüger, Charlotte; Bretagnolle, Vincent
2015-11-22
European governments have banned the use of three common neonicotinoid pesticides due to insufficiently identified risks to bees. This policy decision is controversial given the absence of clear consistency between toxicity assessments of those substances in the laboratory and in the field. Although laboratory trials report deleterious effects in honeybees at trace levels, field surveys reveal no decrease in the performance of honeybee colonies in the vicinity of treated fields. Here we provide the missing link, showing that individual honeybees near thiamethoxam-treated fields do indeed disappear at a faster rate, but that the impact of this is buffered by the colonies' demographic regulation response. Although we could ascertain the exposure pathway of thiamethoxam residues from treated flowers to honeybee dietary nectar, we uncovered an unexpected, pervasive co-occurrence of similar concentrations of imidacloprid, another neonicotinoid normally restricted to non-entomophilous crops in the study country. Thus, its origin and transfer pathways through the succession of annual crops need to be elucidated to properly appraise the risks of combined neonicotinoid exposures. This study reconciles the conflicting laboratory and field toxicity assessments of neonicotinoids on honeybees and further highlights the difficulty of actually detecting non-intentional effects in the field through conventional risk assessment methods. © 2015 The Author(s).
Measuring physical activity environments: a brief history.
Sallis, James F
2009-04-01
Physical activity is usually done in specific types of places, referred to as physical activity environments. These often include parks, trails, fitness centers, schools, and streets. In recent years, scientific interest has increased notably in measuring physical activity environments. The present paper provides an historical overview of the contributions of the health, planning, and leisure studies fields to the development of contemporary measures. The emphasis is on attributes of the built environment that can be affected by policies to contribute to the promotion of physical activity. Researchers from health fields assessed a wide variety of built environment variables expected to be related to recreational physical activity. Settings of interest were schools, workplaces, and recreation facilities, and most early measures used direct observation methods with demonstrated inter-observer reliability. Investigators from the city planning field evaluated aspects of community design expected to be related to people's ability to walk from homes to destinations. GIS was used to assess walkability defined by the 3Ds of residential density, land-use diversity, and pedestrian-oriented designs. Evaluating measures for reliability or validity was rarely done in the planning-related fields. Researchers in the leisure studies and recreation fields studied mainly people's use of leisure time rather than physical characteristics of parks and other recreation facilities. Although few measures of physical activity environments were developed, measures of aesthetic qualities are available. Each of these fields made unique contributions to the contemporary methods used to assess physical activity environments.
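As an illustration of the land-use diversity component of the walkability measures described above, which is commonly computed as a normalized entropy over land-use shares, the sketch below combines it with density and intersection density into a simple z-score index. The exact GIS formulas used in the planning literature vary, and all neighbourhood values are placeholders.

    import math

    def land_use_entropy(area_shares):
        """Normalized land-use mix: -sum(p * ln p) / ln(k), in [0, 1]."""
        shares = [p for p in area_shares if p > 0]
        k = len(shares)
        if k < 2:
            return 0.0
        return -sum(p * math.log(p) for p in shares) / math.log(k)

    def simple_walkability(neighbourhoods):
        """Sum of z-scores of density, land-use entropy and intersection density."""
        keys = ("density", "mix", "intersections")
        stats = {}
        for key in keys:
            vals = [n[key] for n in neighbourhoods]
            mean = sum(vals) / len(vals)
            sd = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5 or 1.0
            stats[key] = (mean, sd)
        return [sum((n[k] - stats[k][0]) / stats[k][1] for k in keys)
                for n in neighbourhoods]

    # Placeholder attributes: residential density per ha, land-use mix,
    # intersections per km².
    hoods = [
        {"density": 120, "mix": land_use_entropy([0.4, 0.3, 0.2, 0.1]), "intersections": 90},
        {"density": 35,  "mix": land_use_entropy([0.9, 0.1]),           "intersections": 25},
    ]
    print([round(s, 2) for s in simple_walkability(hoods)])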
A PDF projection method: A pressure algorithm for stand-alone transported PDFs
NASA Astrophysics Data System (ADS)
Ghorbani, Asghar; Steinhilber, Gerd; Markus, Detlev; Maas, Ulrich
2015-03-01
In this paper, a new formulation of the projection approach is introduced for stand-alone probability density function (PDF) methods. The method is suitable for applications in low-Mach number transient turbulent reacting flows. The method is based on a fractional step method in which first the advection-diffusion-reaction equations are modelled and solved within a particle-based PDF method to predict an intermediate velocity field. Then the mean velocity field is projected onto a space where the continuity for the mean velocity is satisfied. In this approach, a Poisson equation is solved on the Eulerian grid to obtain the mean pressure field. Then the mean pressure is interpolated at the location of each stochastic Lagrangian particle. The formulation of the Poisson equation avoids the time derivatives of the density (due to convection) as well as second-order spatial derivatives. This in turn eliminates the major sources of instability in the presence of stochastic noise that are inherent in particle-based PDF methods. The convergence of the algorithm (in the non-turbulent case) is investigated first by the method of manufactured solutions. Then the algorithm is applied to a one-dimensional turbulent premixed flame in order to assess the accuracy and convergence of the method in the case of turbulent combustion. As a part of this work, we also apply the algorithm to a more realistic flow, namely a transient turbulent reacting jet, in order to assess the performance of the method.
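A minimal sketch of the generic pressure-projection idea on an Eulerian grid (solve a Poisson equation for the mean pressure from the divergence of the intermediate velocity, then correct the velocity). The paper's particle-based PDF formulation, variable density and interpolation of the pressure to Lagrangian particles are not reproduced here; this is a constant-density, periodic-grid illustration only.

    import numpy as np

    def divergence(u, v, dx):
        """Central-difference divergence on a periodic grid."""
        return ((np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) +
                (np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0))) / (2 * dx)

    def project(u_star, v_star, rho, dt, dx):
        """Projection step: solve lap(p) = rho/dt * div(u*) spectrally (with the
        modified wavenumber of the central difference), then correct
        u = u* - dt/rho * grad(p) so the corrected field is divergence-free."""
        n = u_star.shape[0]
        div = divergence(u_star, v_star, dx)
        k = np.fft.fftfreq(n, d=dx) * 2.0 * np.pi
        ksin2 = (np.sin(k * dx) / dx) ** 2
        k2 = ksin2[None, :] + ksin2[:, None]
        rhs_hat = np.fft.fft2(rho / dt * div)
        p_hat = np.zeros_like(rhs_hat)
        mask = k2 > 1e-12
        p_hat[mask] = -rhs_hat[mask] / k2[mask]
        p = np.real(np.fft.ifft2(p_hat))
        u = u_star - dt / rho * (np.roll(p, -1, axis=1) - np.roll(p, 1, axis=1)) / (2 * dx)
        v = v_star - dt / rho * (np.roll(p, -1, axis=0) - np.roll(p, 1, axis=0)) / (2 * dx)
        return u, v, p

    n, dx, dt, rho = 64, 1.0 / 64, 1.0e-3, 1.0
    x = np.arange(n) * dx
    X, Y = np.meshgrid(x, x)
    u_star = np.sin(2 * np.pi * X) * np.cos(2 * np.pi * Y)   # intermediate field
    v_star = np.cos(2 * np.pi * X) * np.sin(2 * np.pi * Y)   # (not divergence-free)
    u, v, p = project(u_star, v_star, rho, dt, dx)
    print(f"max |div| before: {np.abs(divergence(u_star, v_star, dx)).max():.2e}, "
          f"after: {np.abs(divergence(u, v, dx)).max():.2e}")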
Enhancing Field Research Methods with Mobile Survey Technology
ERIC Educational Resources Information Center
Glass, Michael R.
2015-01-01
This paper assesses the experience of undergraduate students using mobile devices and a commercial application, iSurvey, to conduct a neighborhood survey. Mobile devices offer benefits for enhancing student learning and engagement. This field exercise created the opportunity for classroom discussions on the practicalities of urban research, the…
ERIC Educational Resources Information Center
Jacobson, Lena; Rydberg, Agneta; Eliasson, Ann-Christin; Kits, Annika; Flodmark, Olof
2010-01-01
Aim: To relate visual field function to brain morphology in children with unilateral cerebral palsy (CP). Method: Visual field function was assessed using the confrontation technique and Goldmann perimetry in 29 children (15 males, 14 females; age range 7-17y, median age 11y) with unilateral CP classified at Gross Motor Function Classification…
Methods for assessing the impact of avermectins on the decomposer community of sheep pastures.
King, K L
1993-06-01
This paper outlines methods which can be used in the field assessment of potentially toxic chemicals such as the avermectins. The procedures focus on measuring the effects of the drug on decomposer organisms and the nutrient cycling process in pastures grazed by sheep. Measurements of decomposer activity are described along with methods for determining dry and organic matter loss and mineral loss from dung to the underlying soil. Sampling methods for both micro- and macro-invertebrates are discussed along with determination of the percentage infection of plant roots with vesicular-arbuscular mycorrhizal fungi. An integrated sampling unit for assessing the ecotoxicity of ivermectin in pastures grazed by sheep is presented.
On noninvasive assessment of acoustic fields acting on the fetus
NASA Astrophysics Data System (ADS)
Antonets, V. A.; Kazakov, V. V.
2014-05-01
The aim of this study is to verify a noninvasive technique for assessing the characteristics of acoustic fields in the audible range arising in the uterus under the action of maternal voice, external sounds, and vibrations. This problem is very important in view of actively developed methods for delivery of external sounds to the uterus: music, maternal voice recordings, sounds from outside the mother's body, etc., that supposedly support development of the fetus at the prenatal stage psychologically and cognitively. However, the parameters of acoustic signals have been neither measured nor normalized, which may be dangerous for the fetus and hinder actual assessment of their impact on fetal development. The authors show that at frequencies below 1 kHz, acoustic pressure in the uterus may be measured noninvasively using a hydrophone placed in a soft capsule filled with liquid. It was found that the acoustic field at frequencies up to 1 kHz arising in the uterus under the action of an external sound field has amplitude-frequency parameters close to those of the external field; i.e., the external field penetrates the uterus with hardly any difficulty.
Comparison of Two Methods Used to Model Shape Parameters of Pareto Distributions
Liu, C.; Charpentier, R.R.; Su, J.
2011-01-01
Two methods are compared for estimating the shape parameters of Pareto field-size (or pool-size) distributions for petroleum resource assessment. Both methods assume mature exploration in which most of the larger fields have been discovered. Both methods use the sizes of larger discovered fields to estimate the numbers and sizes of smaller fields: (1) the tail-truncated method uses a plot of field size versus size rank, and (2) the log-geometric method uses data binned in field-size classes and the ratios of adjacent bin counts. Simulation experiments were conducted using discovered oil and gas pool-size distributions from four petroleum systems in Alberta, Canada and using Pareto distributions generated by Monte Carlo simulation. The estimates of the shape parameters of the Pareto distributions, calculated by both the tail-truncated and log-geometric methods, generally stabilize where discovered pool numbers are greater than 100. However, with fewer than 100 discoveries, these estimates can vary greatly with each new discovery. The estimated shape parameters of the tail-truncated method are more stable and larger than those of the log-geometric method where the number of discovered pools is more than 100. Both methods, however, tend to underestimate the shape parameter. Monte Carlo simulation was also used to create sequences of discovered pool sizes by sampling from a Pareto distribution with a discovery process model using a defined exploration efficiency (in order to show how biased the sampling was in favor of larger fields being discovered first). A higher (more biased) exploration efficiency gives better estimates of the Pareto shape parameters. © 2011 International Association for Mathematical Geosciences.
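A rough sketch of the size-rank idea behind the tail-truncated approach: for a Pareto upper tail, log(rank) is approximately linear in log(size) with slope equal to minus the shape parameter, so a least-squares slope provides an estimate. This is an illustration only, not the authors' exact estimator, and the simulated pool sizes are synthetic.

    import numpy as np

    def pareto_shape_from_ranks(sizes):
        """Rough Pareto shape estimate from discovered field sizes.

        Sort sizes in decreasing order; for a Pareto upper tail the points
        (log size, log rank) fall close to a line with slope minus the shape
        parameter, so an ordinary least-squares slope gives an estimate."""
        s = np.sort(np.asarray(sizes, dtype=float))[::-1]
        ranks = np.arange(1, s.size + 1)
        slope, _ = np.polyfit(np.log(s), np.log(ranks), 1)
        return -slope

    rng = np.random.default_rng(1)
    true_shape = 1.2
    # Pareto-distributed pool sizes via inverse transform: x = x_min * U**(-1/shape).
    sizes = 1.0 * rng.uniform(size=300) ** (-1.0 / true_shape)
    print(round(pareto_shape_from_ranks(sizes), 2))   # should be near 1.2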
Schwenke, M; Hennemuth, A; Fischer, B; Friman, O
2012-01-01
Phase-contrast MRI (PC MRI) can be used to assess blood flow dynamics noninvasively inside the human body. The acquired images can be reconstructed into flow vector fields. Traditionally, streamlines can be computed based on the vector fields to visualize flow patterns and particle trajectories. The traditional methods may give a false impression of precision, as they do not consider the measurement uncertainty in the PC MRI images. In our prior work, we incorporated the uncertainty of the measurement into the computation of particle trajectories. As a major part of the contribution, a novel numerical scheme for solving the anisotropic Fast Marching problem is presented. A computing time comparison to state-of-the-art methods is conducted on artificial tensor fields. A visual comparison of healthy to pathological blood flow patterns is given. The comparison shows that the novel anisotropic Fast Marching solver outperforms previous schemes in terms of computing time. The visual comparison of flow patterns directly visualizes large deviations of pathological flow from healthy flow. The novel anisotropic Fast Marching solver efficiently resolves even strongly anisotropic path costs. The visualization method enables the user to assess the uncertainty of particle trajectories derived from PC MRI images.
Wang, Huiyong; Johnson, Nicholas; Bernardy, Jeffrey; Hubert, Terry; Li, Weiming
2013-01-01
Pheromones guide adult sea lamprey (Petromyzon marinus) to suitable spawning streams and mates, and therefore, when quantified, can be used to assess population size and guide management. Here, we present an efficient sample preparation method in which 100 mL of river water was spiked with deuterated pheromone as an internal standard and underwent rapid solid-phase extraction (SPE) and elution in the field. The combination of field extraction with laboratory UPLC-MS/MS reduced the sample consumption from 1 to 0.1 L, decreased the sample processing time from more than 1 h to 10 min, and increased the precision and accuracy. The sensitivity was improved by more than one order of magnitude compared with the previous method. The influences of experimental conditions were assessed to optimize the separation and peak shapes. The analytical method has been validated by studies of stability, selectivity, precision, and linearity and by determination of the limits of detection and quantification. The method was used to quantify pheromone concentrations in five streams tributary to Lake Ontario and to estimate that the environmental half-life of 3kPZS is about 26 h.
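A minimal sketch of generic stable-isotope internal-standard quantification, where the analyte concentration follows from the analyte-to-internal-standard peak-area ratio and a response factor from calibration. The peak areas, spike level and response factor below are placeholders, not parameters of the published 3kPZS method.

    def quantify_with_internal_standard(area_analyte, area_is, conc_is_ngL,
                                        relative_response_factor=1.0):
        """Analyte concentration (ng/L) by isotope dilution:
        C_analyte = (A_analyte / A_IS) * C_IS / RRF."""
        return (area_analyte / area_is) * conc_is_ngL / relative_response_factor

    # Placeholder peak areas from a 100 mL river-water extract spiked with a
    # deuterated internal standard at 10 ng/L.
    print(quantify_with_internal_standard(area_analyte=54000, area_is=36000,
                                          conc_is_ngL=10.0,
                                          relative_response_factor=0.95))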
Hybrid Upwind Splitting (HUS) by a Field-by-Field Decomposition
NASA Technical Reports Server (NTRS)
Coquel, Frederic; Liou, Meng-Sing
1995-01-01
We introduce and develop a new approach for upwind biasing: the hybrid upwind splitting (HUS) method. This original procedure is based on a suitable hybridization of current prominent flux vector splitting (FVS) and flux difference splitting (FDS) methods. The HUS method is designed to naturally combine the respective strengths of the above methods while excluding their main deficiencies. Specifically, the HUS strategy yields a family of upwind methods that exhibit the robustness of FVS schemes in the capture of nonlinear waves and the accuracy of some FDS schemes in the resolution of linear waves. We give a detailed construction of the HUS methods following a general and systematic procedure performed directly at the basic level of the field-by-field (i.e., wave) decomposition involved in FDS methods. For such a given decomposition, each field is endowed either with FVS or FDS numerical fluxes, depending on the nonlinear nature of the field under consideration. Such a design principle is made possible thanks to the introduction of a convenient formalism that provides us with a unified framework for upwind methods. The HUS methods we propose bring significant improvements over current methods in terms of accuracy and robustness. They yield entropy-satisfying approximate solutions, as is strongly supported by numerical experiments. Field-by-field hybrid numerical fluxes also achieve fairly simple and explicit expressions and hence require a computational effort between that of the FVS and FDS methods. Several numerical experiments, ranging from stiff 1D shock-tube problems to high-speed viscous flows, are presented to illustrate the benefits of the present approach. We assess in particular the relevance of our HUS schemes to viscous flow calculations.
Li, Longxiang; Gong, Jianhua; Zhou, Jieping
2014-01-01
Effective assessments of air-pollution exposure depend on the ability to accurately predict pollutant concentrations at unmonitored locations, which can be achieved through spatial interpolation. However, most interpolation approaches currently in use are based on the Euclidean distance, which cannot account for the complex nonlinear features displayed by air-pollution distributions in the wind-field. In this study, an interpolation method based on the shortest path distance is developed to characterize the impact of complex urban wind-field on the distribution of the particulate matter concentration. In this method, the wind-field is incorporated by first interpolating the observed wind-field from a meteorological-station network, then using this continuous wind-field to construct a cost surface based on Gaussian dispersion model and calculating the shortest wind-field path distances between locations, and finally replacing the Euclidean distances typically used in Inverse Distance Weighting (IDW) with the shortest wind-field path distances. This proposed methodology is used to generate daily and hourly estimation surfaces for the particulate matter concentration in the urban area of Beijing in May 2013. This study demonstrates that wind-fields can be incorporated into an interpolation framework using the shortest wind-field path distance, which leads to a remarkable improvement in both the prediction accuracy and the visual reproduction of the wind-flow effect, both of which are of great importance for the assessment of the effects of pollutants on human health. PMID:24798197
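A minimal sketch of the core idea described above: replace Euclidean distances in inverse distance weighting with shortest-path distances computed over a cost surface. The cost raster here is a simple stand-in for the wind-field-derived cost surface of the paper, and all station locations and concentration values are placeholders.

    import numpy as np
    from scipy.sparse import lil_matrix
    from scipy.sparse.csgraph import dijkstra

    def shortest_path_idw(cost, stations, values, power=2.0):
        """Inverse distance weighting with shortest-path (cost-weighted) distances.

        cost     : 2-D array of per-cell traversal costs (stand-in for a
                   wind-field-derived cost surface; higher = harder to traverse)
        stations : list of (row, col) monitor locations
        values   : observed concentrations at the stations
        """
        nr, nc = cost.shape
        idx = lambda r, c: r * nc + c
        graph = lil_matrix((nr * nc, nr * nc))
        for r in range(nr):
            for c in range(nc):
                for dr, dc in ((0, 1), (1, 0)):          # 4-neighbour lattice
                    rr, cc = r + dr, c + dc
                    if rr < nr and cc < nc:
                        w = 0.5 * (cost[r, c] + cost[rr, cc])
                        graph[idx(r, c), idx(rr, cc)] = w
        sources = [idx(r, c) for r, c in stations]
        dist = dijkstra(graph.tocsr(), directed=False, indices=sources)
        dist = dist.reshape(len(stations), nr, nc)
        weights = 1.0 / np.maximum(dist, 1e-6) ** power
        return (weights * np.asarray(values)[:, None, None]).sum(0) / weights.sum(0)

    # Toy example: uniform cost except a high-cost "upwind barrier" column.
    cost = np.ones((20, 20))
    cost[:, 10] = 15.0
    stations = [(2, 2), (17, 17)]
    values = [80.0, 20.0]                      # e.g. PM concentrations (placeholder)
    surface = shortest_path_idw(cost, stations, values)
    print(round(surface[2, 2], 1), round(surface[17, 17], 1), round(surface[10, 5], 1))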
Magnetic field measurements near stand-alone transformer stations.
Kandel, Shaiela; Hareuveny, Ronen; Yitzhak, Nir-Mordechay; Ruppin, Raphael
2013-12-01
Extremely low-frequency (ELF) magnetic field (MF) measurements around and above three stand-alone 22/0.4-kV transformer stations have been performed. The low-voltage (LV) cables between the transformer and the LV switchgear were found to be the major source of strong ELF MFs of limited spatial extent. The strong fields measured above the transformer stations support the assessment method, to be used in future epidemiological studies, of classifying apartments located right above the transformer stations as highly exposed to MFs. The results of the MF measurements above the ground around the transformer stations provide a basis for the assessment of the option of implementing precautionary procedures.
For EPA, this Summer 2014 DISCOVER-AQ field research activity in Denver, CO, focused on assessing Federal Reference Methods (FRMs) and Federal Equivalent Methods (FEMs) for ozone (O3) and nitrogen dioxide (NO2), while comparing their operational performance to each other and to smal...
Rapid-estimation method for assessing scour at highway bridges
Holnbeck, Stephen R.
1998-01-01
A method was developed by the U.S. Geological Survey for rapid estimation of scour at highway bridges using limited site data and analytical procedures to estimate pier, abutment, and contraction scour depths. The basis for the method was a procedure recommended by the Federal Highway Administration for conducting detailed scour investigations, commonly referred to as the Level 2 method. Using pier, abutment, and contraction scour results obtained from Level 2 investigations at 122 sites in 10 States, envelope curves and graphical relations were developed that enable determination of scour-depth estimates at most bridge sites in a matter of a few hours. Rather than using complex hydraulic variables, surrogate variables more easily obtained in the field were related to calculated scour-depth data from Level 2 studies. The method was tested by having several experienced individuals apply the method in the field, and results were compared among the individuals and with previous detailed analyses performed for the sites. Results indicated that the variability in predicted scour depth among individuals applying the method generally was within an acceptable range, and that conservatively greater scour depths generally were obtained by the rapid-estimation method compared to the Level 2 method. The rapid-estimation method is considered most applicable for conducting limited-detail scour assessments and as a screening tool to determine those bridge sites that may require more detailed analysis. The method is designed to be applied only by a qualified professional possessing knowledge and experience in the fields of bridge scour, hydraulics, and flood hydrology, and having specific expertise with the Level 2 method.
ERIC Educational Resources Information Center
Perrotta, Carlo
2014-01-01
This paper uses methods derived from the field of futures studies to explore the future of technology-enhanced assessment. Drawing on interviews and consultation activities with experts, the paper aims to discuss the conditions that can impede or foster "innovation" in assessment and education more broadly. Through a review of relevant…
Fox, Aaron S; Bonacci, Jason; McLean, Scott G; Spittle, Michael; Saunders, Natalie
2016-05-01
Laboratory-based measures provide an accurate method to identify risk factors for anterior cruciate ligament (ACL) injury; however, these methods are generally prohibitive to the wider community. Screening methods that can be completed in a field or clinical setting may be more applicable for wider community use. Examination of field-based screening methods for ACL injury risk can aid in identifying the most applicable method(s) for use in these settings. The objective of this systematic review was to evaluate and compare field-based screening methods for ACL injury risk to determine their efficacy of use in wider community settings. An electronic database search was conducted on the SPORTDiscus™, MEDLINE, AMED and CINAHL databases (January 1990-July 2015) using a combination of relevant keywords. A secondary search of the same databases, using relevant keywords from identified screening methods, was also undertaken. Studies identified as potentially relevant were independently examined by two reviewers for inclusion. Where consensus could not be reached, a third reviewer was consulted. Original research articles that examined screening methods for ACL injury risk that could be undertaken outside of a laboratory setting were included for review. Two reviewers independently assessed the quality of included studies. Included studies were categorized according to the screening method they examined. A description of each screening method, and data pertaining to the ability to prospectively identify ACL injuries, validity and reliability, recommendations for identifying 'at-risk' athletes, equipment and training required to complete screening, time taken to screen athletes, and applicability of the screening method across sports and athletes were extracted from relevant studies. Of 1077 citations from the initial search, a total of 25 articles were identified as potentially relevant, with 12 meeting all inclusion/exclusion criteria. From the secondary search, eight further studies met all criteria, resulting in 20 studies being included for review. Five ACL screening methods were identified: the Landing Error Scoring System (LESS), the Clinic-Based Algorithm, Observational Screening of Dynamic Knee Valgus (OSDKV), the 2D-Cam Method, and the Tuck Jump Assessment. There was limited evidence supporting the use of field-based screening methods in predicting ACL injuries across a range of populations. Differences relating to the equipment and time required to complete screening methods were identified. Only screening methods for ACL injury risk were included for review. Field-based screening methods developed for lower-limb injury risk in general may also incorporate, and be useful in, screening for ACL injury risk. Limited studies were available relating to the OSDKV and 2D-Cam Method. The LESS showed predictive validity in identifying ACL injuries, although only in a youth athlete population. The LESS also appears practical for community-wide use due to the minimal equipment and set-up/analysis time required. The Clinic-Based Algorithm may have predictive value for ACL injury risk as it identifies athletes who exhibit high frontal plane knee loads during a landing task, but requires extensive additional equipment and time, which may limit its application to wider community settings.
Validation of Field Methods to Assess Body Fat Percentage in Elite Youth Soccer Players.
Munguia-Izquierdo, Diego; Suarez-Arrones, Luis; Di Salvo, Valter; Paredes-Hernandez, Victor; Alcazar, Julian; Ara, Ignacio; Kreider, Richard; Mendez-Villanueva, Alberto
2018-05-01
This study determined the most effective field method for quantifying body fat percentage in male elite youth soccer players and developed prediction equations based on anthropometric variables. Forty-four male elite-standard youth soccer players aged 16.3-18.0 years underwent body fat percentage assessments, including bioelectrical impedance analysis and the calculation of various skinfold-based prediction equations. Dual X-ray absorptiometry provided a criterion measure of body fat percentage. Correlation coefficients, bias, limits of agreement, and differences were used as validity measures, and regression analyses were used to develop soccer-specific prediction equations. The equations from Sarria et al. (1998) and Durnin & Rahaman (1967) showed very large correlations and the lowest biases, and their differences from the criterion reached neither the practically worthwhile nor the substantial difference threshold. The new youth soccer-specific skinfold equation included a combination of triceps and supraspinale skinfolds. None of the practical methods compared in this study are adequate for estimating body fat percentage in male elite youth soccer players, except for the equations from Sarria et al. (1998) and Durnin & Rahaman (1967). The new youth soccer-specific equation calculated in this investigation is the only field method specifically developed and validated in elite male players, and it shows potentially good predictive power. © Georg Thieme Verlag KG Stuttgart · New York.
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim
2017-10-01
A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-modeling approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement and modeling data in the framework of the multi-model approach is described. The methodology and models of risk assessment in the framework of a decision support approach are defined and described. A method of water quality assessment using satellite observation data is described; the method is based on analysis of the spectral reflectance of water bodies. Spectral signatures of freshwater bodies and offshore waters are analyzed. Correlations between spectral reflectance, pollution and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 have been utilized and verified by in-field spectrometry and laboratory measurements. A fuzzy-logic-based approach for decision support in the field of water quality degradation risk is discussed. The decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimation of the water quality degradation rate and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
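A minimal sketch of a fuzzy-rule classification over a couple of uncertain indicators, using triangular membership functions and simple max-min inference. The rule base, indicators and thresholds are illustrative only and are not the authors' model.

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def water_quality_category(turbidity_ntu, chlorophyll_ugL):
        """Very small fuzzy rule base (illustrative thresholds only):
        clean if both indicators are 'low'; degraded if either is 'high';
        otherwise moderate. Winner-take-all defuzzification."""
        low_turb = tri(turbidity_ntu, -1, 0, 15)
        high_turb = tri(turbidity_ntu, 10, 40, 200)
        low_chl = tri(chlorophyll_ugL, -1, 0, 12)
        high_chl = tri(chlorophyll_ugL, 8, 30, 150)
        scores = {
            "clean": min(low_turb, low_chl),
            "degraded": max(high_turb, high_chl),
        }
        scores["moderate"] = max(0.0, 1.0 - max(scores.values()))
        return max(scores, key=scores.get), scores

    print(water_quality_category(turbidity_ntu=5.0, chlorophyll_ugL=3.0))
    print(water_quality_category(turbidity_ntu=35.0, chlorophyll_ugL=20.0))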
The Multidimensional Assessment of Interoceptive Awareness (MAIA)
Mehling, Wolf E.; Price, Cynthia; Daubenmier, Jennifer J.; Acree, Mike; Bartmess, Elizabeth; Stewart, Anita
2012-01-01
This paper describes the development of a multidimensional self-report measure of interoceptive body awareness. The systematic mixed-methods process involved reviewing the current literature, specifying a multidimensional conceptual framework, evaluating prior instruments, developing items, and analyzing focus group responses to scale items by instructors and patients of body awareness-enhancing therapies. Following refinement by cognitive testing, items were field-tested in students and instructors of mind-body approaches. Final item selection was achieved by submitting the field test data to an iterative process using multiple validation methods, including exploratory cluster and confirmatory factor analyses, comparison between known groups, and correlations with established measures of related constructs. The resulting 32-item multidimensional instrument assesses eight concepts. The psychometric properties of these final scales suggest that the Multidimensional Assessment of Interoceptive Awareness (MAIA) may serve as a starting point for research and further collaborative refinement. PMID:23133619
Risk analysis for veterinary biologicals released into the environment.
Silva, S V; Samagh, B S; Morley, R S
1995-12-01
All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses 'scenario tree analysis' to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
Kinematic measurement from panned cinematography.
Gervais, P; Bedingfield, E W; Wronko, C; Kollias, I; Marchiori, G; Kuntz, J; Way, N; Kuiper, D
1989-06-01
Traditional 2-D cinematography has used a stationary camera with its optical axis perpendicular to the plane of motion. This method has constrained the size of the object plane or has introduced potential errors from a small subject image size with large object field widths. The purpose of this study was to assess a panning technique that could overcome the inherent limitations of small object field widths, small object image sizes and limited movement samples. The proposed technique used a series of reference targets in the object field that provided the necessary scales and origin translations. A 102 m object field was panned. Comparisons between criterion distances and film measured distances for field widths of 46 m and 22 m resulted in absolute mean differences that were comparable to that of the traditional method.
Šubelj, Lovro; van Eck, Nees Jan; Waltman, Ludo
2016-01-01
Clustering methods are applied regularly in the bibliometric literature to identify research areas or scientific fields. These methods are for instance used to group publications into clusters based on their relations in a citation network. In the network science literature, many clustering methods, often referred to as graph partitioning or community detection techniques, have been developed. Focusing on the problem of clustering the publications in a citation network, we present a systematic comparison of the performance of a large number of these clustering methods. Using a number of different citation networks, some of them relatively small and others very large, we extensively study the statistical properties of the results provided by different methods. In addition, we also carry out an expert-based assessment of the results produced by different methods. The expert-based assessment focuses on publications in the field of scientometrics. Our findings seem to indicate that there is a trade-off between different properties that may be considered desirable for a good clustering of publications. Overall, map equation methods appear to perform best in our analysis, suggesting that these methods deserve more attention from the bibliometric community. PMID:27124610
Narayan, Sreenath; Kalhan, Satish C; Wilson, David L
2013-05-01
The aim of this work was to reduce swaps in fat-water separation methods, a particular issue on 7 Tesla (T) small animal scanners due to field inhomogeneity, using image postprocessing innovations that detect and correct errors in the B0 field map. Fat-water decompositions and B0 field maps were computed for images of mice acquired on a 7T Bruker BioSpec scanner, using a computationally efficient method for solving the Markov Random Field formulation of the multi-point Dixon model. The B0 field maps were processed with a novel hole-filling method, based on edge strength between regions, and a novel k-means method, based on field-map intensities, which were iteratively applied to automatically detect and reinitialize error regions in the B0 field maps. Errors were manually assessed in the B0 field maps and chemical parameter maps both before and after error correction. Partial swaps were found in 6% of images when processed with FLAWLESS. After REFINED correction, only 0.7% of images contained partial swaps, resulting in an 88% decrease in error rate. Complete swaps were not problematic. Ex post facto error correction is a viable supplement to a priori techniques for producing globally smooth B0 field maps, without partial swaps. With our processing pipeline, it is possible to process image volumes rapidly, robustly, and almost automatically. Copyright © 2012 Wiley Periodicals, Inc.
A new method to assess damage to RCMRFs from period elongation and Park-Ang damage index using IDA
NASA Astrophysics Data System (ADS)
Aghagholizadeh, Mehrdad; Massumi, Ali
2016-09-01
Despite significant progress in the loading and design codes for seismic-resistant structures and technological improvements in building construction, the field of civil engineering still faces critical challenges. One example is assessing the state of damage imposed on a structure by earthquakes of different intensities. Quick assessment of damage after an earthquake is crucial for determining the operability of a structure and its resistance to probable future earthquakes. Present methods for calculating damage to structures are time consuming and do not accurately provide the degree of damage. Damage estimation is an important task in the fields of structural health monitoring and decision-making. This study examines the relationship between period elongation and the Park-Ang damage index. A dynamic non-linear analysis is performed with the IDARC program to calculate the amount of damage and the period of the current state. The new method is shown to be a quick and accurate technique for damage assessment, since the period of an existing structure, and changes in that period reflecting changes in the stiffness matrix, are easy to calculate.
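For readers unfamiliar with the damage measure referenced above, the Park-Ang index is commonly written as a displacement term plus a hysteretic energy term. The sketch below uses that textbook form with illustrative input values; it is not the specific IDARC implementation employed in the study, and the threshold comment reflects commonly cited interpretations rather than results of this work.

```python
# Hedged sketch of the commonly cited Park-Ang damage index:
#   DI = delta_m / delta_u + beta * E_h / (F_y * delta_u)
# delta_m: maximum displacement demand, delta_u: ultimate displacement
# capacity under monotonic loading, E_h: cumulative hysteretic energy,
# F_y: yield strength, beta: calibration parameter. Values are illustrative.

def park_ang_damage_index(delta_m, delta_u, e_h, f_y, beta=0.1):
    return delta_m / delta_u + beta * e_h / (f_y * delta_u)

if __name__ == "__main__":
    di = park_ang_damage_index(delta_m=0.06, delta_u=0.12,
                               e_h=45.0, f_y=250.0, beta=0.1)
    # DI < 0.4 is often read as repairable damage, DI >= 1.0 as collapse.
    print(f"Park-Ang damage index: {di:.2f}")
```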
The use of computed radiography plates to determine light and radiation field coincidence.
Kerns, James R; Anand, Aman
2013-11-01
Photo-stimulable phosphor computed radiography (CR) has characteristics that allow the output to be manipulated by both radiation and optical light. The authors have developed a method that uses these characteristics to carry out radiation field and light field coincidence quality assurance on linear accelerators. CR detectors from Kodak were used outside their cassettes to measure both radiation and light field edges from a Varian linear accelerator. The CR detector was first exposed to a radiation field and then to a slightly smaller light field. The light impinged on the detector's latent image, removing to an extent the portion exposed to the light field. The detector was then digitally scanned. A MATLAB-based algorithm was developed to automatically analyze the images and determine the edges of the light and radiation fields, the vector between the field centers, and the crosshair center. Radiographic film was also used as a control to confirm the radiation field size. Analysis showed a high degree of repeatability with the proposed method. Results between the proposed method and radiographic film showed excellent agreement of the radiation field. The effect of varying monitor units and light exposure time was tested and found to be very small. Radiation and light field sizes were determined with an uncertainty of less than 1 mm, and light and crosshair centers were determined within 0.1 mm. A new method was developed to digitally determine the radiation and light field size using CR photo-stimulable phosphor plates. The method is quick and reproducible, allowing for the streamlined and robust assessment of light and radiation field coincidence, with no observer interpretation needed.
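One conventional way to turn a scanned profile into a field size is the 50%-of-central-value crossing. The sketch below illustrates that idea on a synthetic 1D profile; it is not a reconstruction of the authors' MATLAB algorithm, and the pixel spacing is an assumed value.

```python
import numpy as np

# Sketch of locating field edges on a 1D profile across a scanned CR image,
# using the common 50%-of-central-plateau criterion (illustration only).

def field_edges(profile, pixel_mm=0.1):
    """Return left/right edge positions (mm) where the profile crosses
    50% of its central plateau value, with linear sub-pixel interpolation."""
    profile = np.asarray(profile, dtype=float)
    n = len(profile)
    half = 0.5 * np.median(profile[n // 2 - 5: n // 2 + 5])  # central plateau
    idx = np.flatnonzero(profile >= half)
    left, right = idx[0], idx[-1]

    def interp(i_lo, i_hi):
        y0, y1 = profile[i_lo], profile[i_hi]
        return (i_lo + (half - y0) / (y1 - y0)) * pixel_mm

    left_mm = interp(left - 1, left) if left > 0 else left * pixel_mm
    right_mm = interp(right, right + 1) if right < n - 1 else right * pixel_mm
    return left_mm, right_mm, right_mm - left_mm

if __name__ == "__main__":
    x = np.arange(400)
    # Synthetic field profile with edges near pixels 100 and 300
    profile = 1000 / (1 + np.exp(-(x - 100))) - 1000 / (1 + np.exp(-(x - 300)))
    print(field_edges(profile, pixel_mm=0.1))
```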
Analyzing Empirical Evaluations of Non-Experimental Methods in Field Settings
ERIC Educational Resources Information Center
Steiner, Peter M.; Wong, Vivian
2016-01-01
Despite recent emphasis on the use of randomized control trials (RCTs) for evaluating education interventions, in most areas of education research, observational methods remain the dominant approach for assessing program effects. Over the last three decades, the within-study comparison (WSC) design has emerged as a method for evaluating the…
Topography measurements and applications in ballistics and tool mark identifications*
Vorburger, T V; Song, J; Petraco, N
2016-01-01
The application of surface topography measurement methods to the field of firearm and toolmark analysis is fairly new. The field has been boosted by the development of a number of competing optical methods, which has improved the speed and accuracy of surface topography acquisitions. We describe here some of these measurement methods as well as several analytical methods for assessing similarities and differences among pairs of surfaces. We also provide a few examples of research results to identify cartridge cases originating from the same firearm or tool marks produced by the same tool. Physical standards and issues of traceability are also discussed. PMID:27182440
A new method for water quality assessment: by harmony degree equation.
Zuo, Qiting; Han, Chunhui; Liu, Jing; Ma, Junxia
2018-02-22
Water quality assessment is fundamental to the development, utilization, management, and protection of water resources, and a prerequisite for water safety. In this paper, the harmony degree equation (HDE) was introduced into water quality assessment research, and a new method based on the HDE was proposed: water quality assessment by harmony degree equation (WQA-HDE). First, the calculation steps and underlying ideas of the method are described in detail; then, the method, together with several other important water quality assessment methods (the single factor assessment method, the mean-type comprehensive index assessment method, and the multi-level gray correlation assessment method), was used to assess the water quality of the Shaying River (the largest tributary of the Huaihe in China). For this purpose, a two-year (2013-2014) dataset of nine water quality variables covering seven monitoring sites, comprising approximately 189 observations, was used to compare and analyze the characteristics and advantages of the new method. The results showed that the calculation steps of WQA-HDE are similar to those of the comprehensive assessment method, and that WQA-HDE is more practical when its results are compared with those of the other water quality assessment methods. In addition, the new method shows good flexibility through the setting of the water quality judgment criterion HD0: when HD0 = 0.8, the results are closer to reality and more reliable. In particular, when HD0 = 1, the results of WQA-HDE are consistent with the single factor assessment method, as both methods are subject to the most stringent "one vote veto" judgment condition. WQA-HDE is therefore a composite method that combines single factor assessment and comprehensive assessment. This research not only broadens the theoretical method system of harmony theory but also promotes the unification of water quality assessment methods, and can serve as a reference for other comprehensive assessments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, J.J. Jr.; Hyder, Z.
The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.
ERIC Educational Resources Information Center
Katch, Frank I.; Katch, Victor L.
1980-01-01
Sources of error in body composition assessment by laboratory and field methods can be found in hydrostatic weighing, residual air volume, skinfolds, and circumferences. Statistical analysis can and should be used in the measurement of body composition. (CJ)
A comparison of field methods to assess body composition in a diverse group of sedentary women.
D'Alonzo, Karen T; Aluf, Ana; Vincent, Linda; Cooper, Kristin
2009-01-01
Accurate assessment of body composition is essential in the evaluation of obesity. While laboratory methods are commonly used to assess fat mass, field measures (e.g., skinfold thickness [SKF] and bioelectrical impedance [BIA]) may be more practical for screening large numbers of individuals in intervention studies. In this study, a correlational design was used among 46 racially and ethnically diverse, sedentary women (mean age = 25.73 years) to (a) compare the percentage of body fat as determined by SKF and the upper body BIA and (b) examine the effects of body mass index (BMI), racial/ethnic background, age, and stage of the menstrual cycle on differences in the estimated percentage of body fat obtained using the SKF and BIA. Overall, a significant correlation between SKF and BIA (r = .98, p < .001) was found, with similar findings among Black, Hispanic and White non-Hispanic women. The mean differences between BIA and SKF were not significantly correlated with BMI, age, race/ethnicity or stage of the menstrual cycle. Data from this study suggest that BIA showed similar body fat prediction values compared with SKF and may be a viable alternative to SKF among diverse groups of healthy women. Additional testing and comparison of these field methods with the laboratory methods of hydro-densitometry or dual energy X-ray absorptiometry is recommended to further determine whether BIA devices can be routinely recommended as an alternative to the SKF.
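The agreement statistics typical of such field-method comparisons (a Pearson correlation plus a Bland-Altman-style mean difference) can be computed as in the sketch below; the paired percent-fat values are synthetic placeholders, not the study data.

```python
import numpy as np

# Sketch of agreement statistics for two field methods of percent body fat
# (skinfolds [SKF] vs bioelectrical impedance [BIA]). Synthetic values only.

skf = np.array([24.1, 31.5, 28.0, 35.2, 22.8, 30.1])   # % fat by skinfolds
bia = np.array([24.8, 32.0, 27.1, 36.0, 23.5, 30.9])   # % fat by BIA

r = np.corrcoef(skf, bia)[0, 1]            # Pearson correlation
diff = bia - skf
bias = diff.mean()                          # mean difference (BIA - SKF)
loa = 1.96 * diff.std(ddof=1)               # Bland-Altman limits of agreement

print(f"r = {r:.2f}, mean difference = {bias:.2f}% fat, "
      f"95% limits of agreement = +/-{loa:.2f}% fat")
```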
Field Assessment of Enclosed Cab Filtration System Performance Using Particle Counting Measurements
Organiscak, John A.; Cecala, Andrew B.; Noll, James D.
2015-01-01
Enclosed cab filtration systems are typically used on mobile mining equipment to reduce miners’ exposure to airborne dust generated during mining operations. The National Institute for Occupational Safety and Health (NIOSH) Office of Mine Safety and Health Research (OMSHR) has recently worked with a mining equipment manufacturer to examine a new cab filtration system design for underground industrial minerals equipment. This cab filtration system uses a combination of three particulate filters to reduce equipment operators’ exposure to dust and diesel particulates present in underground industrial mineral mines. NIOSH initially examined this cab filtration system using a two-instrument particle counting method at the equipment company’s manufacturing shop facility to assess several alternative filters. This cab filtration system design was further studied on several pieces of equipment during a two- to seven-month period at two underground limestone mines. The two-instrument particle counting method was used outside the underground mine at the end of the production shifts to regularly test the cabs’ long-term protection factor performance with particulates present in the ambient air. This particle counting method showed that three of the four cabs achieved protection factors greater than 1,000 during the field studies. The fourth cab did not perform at this level because it had a damaged filter in the system. The particle counting measurements of submicron particles present in the ambient air were shown to be a timely and useful quantification method in assessing cab performance during these field studies. PMID:23915268
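The protection factor reported for the cabs is conventionally the ratio of the particle concentration measured outside the cab to that measured inside over the same interval. A minimal sketch with placeholder counts follows.

```python
# Sketch of the cab protection factor from simultaneous particle counts
# outside and inside an enclosed cab: PF = C_outside / C_inside.
# The counts below are placeholders for illustration.

def protection_factor(outside_counts_per_cc, inside_counts_per_cc):
    return outside_counts_per_cc / inside_counts_per_cc

if __name__ == "__main__":
    pf = protection_factor(outside_counts_per_cc=150_000,
                           inside_counts_per_cc=120)
    print(f"Protection factor: {pf:.0f}")   # > 1000 indicates good cab integrity
```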
Using Bacterial Growth on Insects to Assess Nutrient Impacts in Streams
A. Dennis Lemly
2000-01-01
A combination field and laboratory study was conducted to evaluate the ability of a recently developed bioindicator to detect detrimental nutrient conditions in streams. The method utilizes bacterial growth on aquatic insects to determine nutrient impacts. Field investigations indicated that elevated concentrations of nitrate and phosphate were associated with growth...
Field assessment of wood stake decomposition in forest soil
Xiping Wang; Deborah Page-Dumroese; Martin F. Jurgensen; Robert J. Ross
2007-01-01
A pulse-echo acoustic method was investigated for evaluating wood stake decomposition in the field. A total of 58 wood stakes (29 loblolly pine, Pinus taeda, and 29 aspen, Populus tremuloides) that were vertically installed (full length) in forest soils were non-destructively tested by means of a laboratory-type acoustic...
Neuroethics and animals: methods and philosophy.
Takala, Tuija; Häyry, Matti
2014-04-01
This article provides an overview of the six other contributions in the Neuroethics and Animals special section. In addition, it discusses the methodological and theoretical problems of interdisciplinary fields. The article suggests that interdisciplinary approaches without established methodological and theoretical bases are difficult to assess scientifically. This might cause these fields to expand without actually advancing.
Resource-assessment perspectives for unconventional gas systems
Schmoker, J.W.
2002-01-01
Concepts are described for assessing those unconventional gas systems that can also be defined as continuous accumulations. Continuous gas accumulations exist more or less independently of the water column and do not owe their existence directly to the buoyancy of gas in water. They cannot be represented in terms of individual, countable fields or pools delineated by downdip water contacts. For these reasons, traditional resource-assessment methods based on estimating the sizes and numbers of undiscovered discrete fields cannot be applied to continuous accumulations. Specialized assessment methods are required. Unconventional gas systems that are also continuous accumulations include coalbed methane, basin-centered gas, so-called tight gas, fractured shale (and chalk) gas, and gas hydrates. Deep-basin and bacterial gas systems may or may not be continuous accumulations, depending on their geologic setting. Two basic resource-assessment approaches have been employed for continuous accumulations. The first approach is based on estimates of gas in place. A volumetric estimate of total gas in place is commonly coupled with an overall recovery factor to narrow the assessment scope from a treatment of gas volumes residing in sedimentary strata to a prediction of potential additions to reserves. The second approach is based on the production performance of continuous gas reservoirs, as shown empirically by wells and reservoir-simulation models. In these methods, production characteristics (as opposed to gas in place) are the foundation for forecasts of potential additions to reserves.
Knowledge Transfer on Complex Social Interventions in Public Health: A Scoping Study
Dagenais, Christian; Malo, Marie; Robert, Émilie; Ouimet, Mathieu; Berthelette, Diane; Ridde, Valéry
2013-01-01
Objectives: Scientific knowledge can help develop interventions that improve public health. The objectives of this review are (1) to describe the status of research on knowledge transfer strategies in the field of complex social interventions in public health and (2) to identify priorities for future research in this field. Method: A scoping study is an exploratory study. After searching databases of bibliographic references and specialized periodicals, we summarized the relevant studies using a predetermined assessment framework. In-depth analysis focused on the following items: types of knowledge transfer strategies, fields of public health, types of publics, types of utilization, and types of research specifications. Results: From the 1,374 references identified, we selected 26 studies. The strategies targeted mostly administrators of organizations and practitioners. The articles generally dealt with instrumental utilization and most often used qualitative methods. In general, the bias risk for the studies is high. Conclusion: Researchers need to consider the methodological challenges in this field of research in order to improve assessment of more complex knowledge transfer strategies (when they exist), not just diffusion/dissemination strategies and conceptual and persuasive utilization. PMID:24324593
Assessment of the integrity of concrete bridge structures by acoustic emission technique
NASA Astrophysics Data System (ADS)
Yoon, Dong-Jin; Park, Philip; Jung, Juong-Chae; Lee, Seung-Seok
2002-06-01
This study was aimed at developing a new method for assessing the integrity of concrete structures, using the acoustic emission (AE) technique in both laboratory experiments and field applications. A previous laboratory study confirmed that AE analysis provides a promising approach for estimating the level of damage and distress in concrete structures. The Felicity ratio, one of the key parameters for assessing damage, exhibits a favorable correlation with the overall damage level, and the total number of AE events under stepwise cyclic loading also showed good agreement with the damage level. In this study, the suggested technique was applied to several concrete bridges in Korea in order to verify its applicability in the field. The AE response was analyzed to obtain key parameters such as the total number and rate of AE events, the AE parameters of each event, and the characteristic features of the waveform, as well as the Felicity ratio. A stepwise loading-unloading procedure for AE generation was introduced in the field tests by using vehicles of different weights. Depending on the condition of the bridge, for instance whether it was new or old, the AE event rate and AE generation behavior differed in many respects. The results showed that the suggested analysis method is a promising approach for assessing the integrity of concrete structures.
Software for improved field surveys of nesting marine turtles.
Anastácio, R; Gonzalez, J M; Slater, K; Pereira, M J
2017-09-07
Field data are still recorded on paper in many worldwide beach surveys of nesting marine turtles. The data must subsequently be transferred into an electronic database, and this can introduce errors into the dataset. To minimize such errors, the "Turtles" software was developed and piloted for recording field data, with one software user accompanying one Tortuguero on the Akumal beaches, Quintana Roo, Mexico, during night patrols from June 1st to July 31st. Comparisons were made between the data exported from the software and the paper forms entered into a database (henceforth the traditional method). A preliminary assessment indicated that the software user tended to record a greater number of metrics (an average of 18.3 fields ± 5.4 sd vs. 8.6 fields ± 2.1 sd recorded by the traditional method). The traditional method introduced three types of "errors" into the dataset: missing values in relevant fields (40.1%), different answers for the same value (9.8%), and inconsistent data (0.9%). Only 5.8% of these (missing values) were found with the software methodology. Although only tested by a single user, the software suggests increased efficacy and warrants further examination to accurately assess the merit of replacing traditional methods of data recording in beach monitoring programmes.
A modular Human Exposure Model (HEM) framework to ...
Life Cycle Impact Analysis (LCIA) has proven to be a valuable tool for systematically comparing processes and products, and has been proposed for use in Chemical Alternatives Analysis (CAA). The exposure assessment portion of the human health impact scores of LCIA has historically focused on far-field sources (environmentally mediated exposures), while research has shown that use-related (near-field) exposures typically dominate population exposure. Characterizing the human health impacts of chemicals in consumer products over the life cycle of these products requires an evaluation of both near-field and far-field sources. Assessing the impacts of near-field exposures requires bridging the scientific and technical gaps that currently prevent the harmonious use of the best available methods and tools from the fields of LCIA and human health exposure and risk assessment. The U.S. EPA's Chemical Safety and Sustainability LC-HEM project is developing the Human Exposure Model (HEM) to assess near-field exposures to chemicals that occur to various populations over the life cycle of a commercial product. The HEM will be a publicly available, web-based, modular system which will allow for the evaluation of chemical/product impacts in an LCIA framework to support CAA. We present here an overview of the framework for the modular HEM system. The framework includes a data flow diagram of in-progress and future planned modules, the definition of each mod
The state of rehabilitation research: art or science?
Tate, Denise G
2006-02-01
Rehabilitation research has been criticized as not standing up enough to the rigors of scientific method to be called "science." The field has been portrayed as slow to promote its scientific achievements and to include them under the rubric of evidence-based rehabilitation. Following in the footsteps of psychology, rehabilitation as a broad-based discipline has faced many similar obstacles in achieving scientific status. Controversy exists about what exactly constitutes rehabilitation science versus its art and its respective multidisciplinary domains. The conception of these domains is directly related to current methods available to assess the state of the discipline and its research accomplishments. I used quantitative methods, such as randomized clinical and/or controlled trials (RCTs) and systematic reviews, to assess the status of rehabilitation research. Findings suggest that, as a field, rehabilitation makes significant contributions to science, measurable by the number and quality of RCTs and systematic reviews conducted so far on topics of critical importance for clinical care. In "artful" complement, qualitative approaches can be used as research tools to aid investigators in seeking knowledge beyond that obtained by quantitative methods, assessing many complexities associated with the various contexts of rehabilitation research. Other requirements to develop a common vision of rehabilitation science are also discussed.
NASA Astrophysics Data System (ADS)
Yi, Chen; Isaev, A. E.; Yuebing, Wang; Enyakov, A. M.; Teng, Fei; Matveev, A. N.
2011-01-01
A description is given of the COOMET project 473/RU-a/09: a pilot comparison of hydrophone calibrations at frequencies from 250 Hz to 200 kHz between the Hangzhou Applied Acoustics Research Institute (HAARI, China), the pilot laboratory, and the Russian National Research Institute for Physicotechnical and Radio Engineering Measurements (VNIIFTRI, Designated Institute of Russia of the CIPM MRA). Two standard hydrophones, B&K 8104 and TC 4033, were calibrated and compared to assess the current state of hydrophone calibration at HAARI (China) and in Russia. Three different calibration methods were applied: a vibrating column method, a free-field reciprocity method and a comparison method. The standard facilities of each laboratory were used, and three different sound fields were applied: pressure field, free field and reverberant field. The maximum deviation of the sensitivities of the two hydrophones between the participants' results was 0.36 dB. The final report has been peer-reviewed and approved for publication by the CCAUV-KCWG.
Simultaneous computation of jet turbulence and noise
NASA Technical Reports Server (NTRS)
Berman, C. H.; Ramos, J. I.
1989-01-01
The existing flow computation methods, wave computation techniques, and theories based on noise source models are reviewed in order to assess the capabilities of numerical techniques to compute jet turbulence noise and understand the physical mechanisms governing it over a range of subsonic and supersonic nozzle exit conditions. In particular, attention is given to (1) methods for extrapolating near field information, obtained from flow computations, to the acoustic far field and (2) the numerical solution of the time-dependent Lilley equation.
Assessing occupational exposure to sea lamprey pesticides
Ceballos, Diana M; Beaucham, Catherine C; Kurtz, Kristine; Musolin, Kristin
2015-01-01
Background: Sea lampreys are parasitic fish found in lakes of the United States and Canada. Sea lamprey is controlled through manual application of the pesticides 3-trifluoromethyl-4-nitrophenol (TFM) and Bayluscide™ into streams and tributaries. 3-Trifluoromethyl-4-nitrophenol may cause irritation and central nervous system depression and Bayluscide may cause irritation, dermatitis, blisters, cracking, edema, and allergic skin reactions. Objectives: To assess occupational exposures to sea lamprey pesticides. Methods: We developed a wipe method for evaluating surface and skin contamination with these pesticides. This method was field tested at a biological field station and at a pesticide river application. We also evaluated exposures using control banding tools. Results: We verified TFM surface contamination at the biological station. At the river application, we found surfaces and worker's skin contaminated with pesticides. Conclusion: We recommended minimizing exposures by implementing engineering controls and improved use of personal protective equipment. PMID:25730600
Van der Fels-Klerx, H J; Van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'agostino, M; Coles, D; Marvin, H J P; Frewer, L J
2018-01-22
This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food, environmental science and socio-economic sciences. The review used a predefined search protocol, and covered the bibliographic databases Scopus, CAB Abstracts, Web of Sciences, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and the risk ranking method characterized. The methods were then clustered, based on their characteristics, into eleven method categories. These categories included: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health adjusted life years (HALY), multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.
FIELD MEASUREMENT OF DISSOLVED OXYGEN: A COMPARISON OF METHODS
The ability to confidently measure the concentration of dissolved oxygen (D.O.) in ground water is a key aspect of remedial selection and assessment. Presented here is a comparison of the commonly practiced methods for determining D.O. concentrations in ground water, including c...
This report presents examples of the relationships between the results of laboratory leaching tests, as defined by the Leaching Environmental Assessment Framework (LEAF) or analogous international test methods, and leaching of constituents from a broad range of materials under di...
Introduction to a New Approach to Experiential Learning.
ERIC Educational Resources Information Center
Jackson, Lewis; MacIsaac, Doug
1994-01-01
A process model for experiential learning (EL) in adult education begins with the characteristics and needs of adult learners and conceptual foundations of EL. It includes methods and techniques for in-class and field-based experiences, building a folio (point-in-time performance assessment), and portfolio construction (assessing transitional…
Laamrani, Ahmed; Branson, Dave; Joosse, Pamela
2018-01-01
Quantifying the amount of crop residue left in the field after harvest is a key issue for sustainability. Conventional assessment approaches (e.g., line-transect) are labor intensive, time-consuming and costly. Many proximal remote sensing devices and systems have been developed for agricultural applications such as cover crop and residue mapping. For instance, current mobile devices (smartphones & tablets) are usually equipped with digital cameras and global positioning systems and use applications (apps) for in-field data collection and analysis. In this study, we assess the feasibility and strength of a mobile device app developed to estimate crop residue cover. The performance of this novel technique (from here on referred to as “app” method) was compared against two point counting approaches: an established digital photograph-grid method and a new automated residue counting script developed in MATLAB at the University of Guelph. Both photograph-grid and script methods were used to count residue under 100 grid points. Residue percent cover was estimated using the app, script and photograph-grid methods on 54 vertical digital photographs (images of the ground taken from above at a height of 1.5 m) collected from eighteen fields (9 corn and 9 soybean, 3 samples each) located in southern Ontario. Results showed that residue estimates from the app method were in good agreement with those obtained from both photograph–grid and script methods (R2 = 0.86 and 0.84, respectively). This study has found that the app underestimates the residue coverage by −6.3% and −10.8% when compared to the photograph-grid and script methods, respectively. With regards to residue type, soybean has a slightly lower bias than corn (i.e., −5.3% vs. −7.4%). For photos with residue <30%, the app derived residue measurements are within ±5% difference (bias) of both photograph-grid- and script-derived residue measurements. These methods could therefore be used to track the recommended minimum soil residue cover of 30%, implemented to reduce farmland topsoil and nutrient losses that impact water quality. Overall, the app method was found to be a good alternative to the point counting methods, which are more time-consuming. PMID:29495497
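The point-counting logic behind the photograph-grid and script methods reduces to the fraction of grid intersections falling on residue, and the app bias is simply the signed difference from that reference. A minimal sketch with hypothetical numbers follows.

```python
# Sketch of percent residue cover from a 100-point counting grid and of the
# bias of an app estimate relative to that point count. Numbers are hypothetical.

def percent_cover(points_on_residue, total_points=100):
    return 100.0 * points_on_residue / total_points

def bias(app_estimate_pct, reference_pct):
    """Signed difference, app minus reference, in percentage points."""
    return app_estimate_pct - reference_pct

if __name__ == "__main__":
    grid_cover = percent_cover(points_on_residue=42)   # 42 of 100 grid points
    app_cover = 35.5                                    # app-reported cover
    print(f"grid: {grid_cover:.1f}%, app: {app_cover:.1f}%, "
          f"bias: {bias(app_cover, grid_cover):+.1f} points")
```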
Rossi, Gina; Videler, Arjan; van Alphen, S P J
2018-04-01
Since older adults often show an atypical presentation of (mal)adaptive personality traits and pathological states, the articles in this special issue will concisely discuss some perennial issues in clinical assessment in older adults and thus outline the main challenges this domain faces. By bringing empirical work and meta-analytic studies from leading scholars in the field of geropsychology, the articles will also address these challenges by reporting the latest developments in the field. This way, we hope to reshape the way clinicians and researchers assess (mal)adaptive personality and pathological states in older adults into a more reliable and valid assessment method that integrates the specific biopsychosocial context of older age.
An analytical framework for estimating aquatic species density from environmental DNA
Chambert, Thierry; Pilliod, David S.; Goldberg, Caren S.; Doi, Hideyuki; Takahara, Teruhiko
2018-01-01
Environmental DNA (eDNA) analysis of water samples is on the brink of becoming a standard monitoring method for aquatic species. This method has improved detection rates over conventional survey methods and thus has demonstrated effectiveness for estimation of site occupancy and species distribution. The frontier of eDNA applications, however, is to infer species density. Building upon previous studies, we present and assess a modeling approach that aims at inferring animal density from eDNA. The modeling combines eDNA and animal count data from a subset of sites to estimate species density (and associated uncertainties) at other sites where only eDNA data are available. As a proof of concept, we first perform a cross-validation study using experimental data on carp in mesocosms. In these data, fish densities are known without error, which allows us to test the performance of the method with known data. We then evaluate the model using field data from a study on a stream salamander species to assess the potential of this method to work in natural settings, where density can never be known with absolute certainty. Two alternative distributions (Normal and Negative Binomial) to model variability in eDNA concentration data are assessed. Assessment based on the proof of concept data (carp) revealed that the Negative Binomial model provided much more accurate estimates than the model based on a Normal distribution, likely because eDNA data tend to be overdispersed. Greater imprecision was found when we applied the method to the field data, but the Negative Binomial model still provided useful density estimates. We call for further model development in this direction, as well as further research targeted at sampling design optimization. It will be important to assess these approaches on a broad range of study systems.
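The calibration idea at the core of the approach (fit the eDNA-density relation where both are known, then invert it where only eDNA is available) can be sketched with a simple log-log regression, as below. The authors' actual model is hierarchical, with Normal or Negative Binomial observation error, which this toy version does not reproduce; all numbers are synthetic.

```python
import numpy as np

# Toy calibration sketch for eDNA-based density estimation. Synthetic data.

# Calibration sites: known density (ind/m^2) and eDNA concentration (copies/L)
density = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
edna = np.array([120.0, 260.0, 480.0, 1100.0, 2050.0])

# Fit log(eDNA) = a + b * log(density)
b, a = np.polyfit(np.log(density), np.log(edna), 1)

def predict_density(edna_copies_per_l):
    """Invert the fitted relation to estimate density from eDNA alone."""
    return np.exp((np.log(edna_copies_per_l) - a) / b)

if __name__ == "__main__":
    print(f"fit: log(eDNA) = {a:.2f} + {b:.2f} * log(density)")
    print(f"estimated density at 800 copies/L: {predict_density(800.0):.2f} ind/m^2")
```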
The contribution of inductive electric fields to particle energization in the inner magnetosphere
NASA Astrophysics Data System (ADS)
Ilie, R.; Toth, G.; Liemohn, M. W.; Chan, A. A.
2017-12-01
Assessing the relative contribution of potential versus inductive electric fields to the energization of the hot ion population in the inner magnetosphere is only possible through thorough examination of the time-varying magnetic field and current systems using global modeling of the entire system. We present here a method to calculate the inductive and potential components of the electric field in the entire magnetosphere region. This method is based on the Helmholtz vector decomposition of the motional electric field as calculated by the BATS-R-US model, and is subject to boundary conditions. This approach removes the need to trace independent field lines and lifts the assumption that the magnetic field lines can be treated as frozen in a stationary ionosphere. In order to quantify the relative contributions of potential and inductive electric fields in driving plasma sheet ions into the inner magnetosphere, we apply this method to the March 17th, 2013 geomagnetic storm. We present the consequences of slow continuous changes in the geomagnetic field, as well as of strong tail dipolarizations, for the distortion of the near-Earth magnetic field and current systems. Our findings indicate that the inductive component of the electric field is comparable to, and at times even higher than, the potential component, suggesting that the electric field induced by the time-varying magnetic field plays a crucial role in the overall particle energization in the inner magnetosphere.
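As an illustration of what a Helmholtz split of a vector field looks like numerically, the sketch below projects a periodic three-dimensional field onto its curl-free and divergence-free parts in Fourier space. The decomposition in the study is applied to the BATS-R-US motional electric field with physical boundary conditions, which this periodic toy version does not capture.

```python
import numpy as np

# Illustrative FFT-based Helmholtz decomposition on a periodic grid:
# F = F_curlfree + F_divfree. Periodic toy version of the projection idea only.

def helmholtz_decompose(fx, fy, fz):
    shape = fx.shape
    k = [np.fft.fftfreq(n) * 2 * np.pi for n in shape]
    kx, ky, kz = np.meshgrid(*k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                        # avoid division by zero at k = 0

    fhat = [np.fft.fftn(f) for f in (fx, fy, fz)]
    div_hat = kx * fhat[0] + ky * fhat[1] + kz * fhat[2]

    # Curl-free (irrotational, "potential-like") part: projection onto k
    irr_hat = [div_hat * kk / k2 for kk in (kx, ky, kz)]
    irr = [np.real(np.fft.ifftn(c)) for c in irr_hat]
    sol = [f - c for f, c in zip((fx, fy, fz), irr)]   # divergence-free remainder
    return irr, sol

if __name__ == "__main__":
    n = 32
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
    # A field with both a gradient part and a rotational part
    fx = np.cos(X) + np.sin(Y)
    fy = np.cos(Y) - np.sin(X)
    fz = np.zeros_like(fx)
    (ix, iy, iz), (sx, sy, sz) = helmholtz_decompose(fx, fy, fz)
    print("max |curl-free x|:", np.abs(ix).max(), " max |div-free x|:", np.abs(sx).max())
```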
Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR
ERIC Educational Resources Information Center
Baglin, James
2014-01-01
Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many…
RIVERINE ASSESSMENT USING MACROINVERTEBRATES: ALL METHODS ARE NOT CREATED EQUAL
In 1999, we compared six benthic macroinvertebrate field sampling methods for nonwadeable streams based on those developed for three major programs (EMAP-SW, NAWQA, and Ohio EPA), at each of sixty sites across four tributaries to the Ohio River. Water chemistry samples and physi...
Quantin, P; Thélu, A; Catoire, S; Ficheux, H
2015-11-01
Risk assessment for personal care products requires the use of alternative methods since animal testing is now totally banned. Some of these methods are effective and have been validated by the "European Union Reference Laboratory for alternatives to animal testing", but there is still a need for the development and implementation of methods for specific endpoints. In this review, we have focused on dermal risk assessment because the skin is the prime route of absorption and the main target organ for personal care products. Within this field, several areas must be assessed: irritation, sensitisation and toxicokinetics. Personal care product behaviour after use by the consumer and potential effects on the environment are also discussed. The purpose of this review is to show the evolution and prospects of alternative methods for dermal safety assessment. Assessment strategies must be adapted to the different chemical classes of the substances studied but also to the way in which they are used. Finally, experimental and theoretical technical parameters that may impact the measured effects have been identified and discussed. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Wang, Hua; Zeng, Deping; Chen, Ziguang; Yang, Zengtao
2017-04-12
Based on the acousto-optic interaction, we propose a laser deflection method for rapidly, non-invasively and quantitatively measuring the peak positive pressure of HIFU fields. In the characterization of HIFU fields, the effect of nonlinear propagation is considered, and the relation between the laser deflection length and the peak positive pressure is derived. The laser deflection method is then assessed by comparing it with the hydrophone method. The experimental results show that the peak positive pressure measured by the laser deflection method is slightly higher than that obtained by the hydrophone, and the two are in reasonable agreement. Considering that the peak pressure measured by hydrophones is usually underestimated, the laser deflection method is likely to be more accurate than the hydrophone method, since it avoids the errors of hydrophone spatial-averaging measurements and the influence of waveform distortion on hydrophone corrections. Moreover, since the Lorentz formula remains applicable in high-pressure environments, the laser deflection method shows great potential for measuring HIFU fields of high pressure amplitude. Additionally, the laser deflection method provides a rapid way to measure the peak positive pressure, without the scan time required by hydrophones.
Systematic Field Study of NO(x) Emission Control Methods for Utility Boilers.
ERIC Educational Resources Information Center
Bartok, William; And Others
A utility boiler field test program was conducted. The objectives were to determine new or improved NO(x) emission factors by fossil fuel type and boiler design, and to assess the scope of applicability of combustion modification techniques for controlling NO(x) emissions from such installations. A statistically designed test program was…
Field method to measure changes in percent body fat of young women: The TIGER Study
USDA-ARS?s Scientific Manuscript database
Body mass index (BMI), waist (W) and hip (H) circumference (C) are commonly used to assess changes in body composition for field research. We developed a model to estimate changes in dual-energy X-ray absorptiometry (DXA) percent fat (% fat) from these variables with a diverse sample of young women fro...
ERIC Educational Resources Information Center
Begue-Simon, A-M.; Drolet, R. A.
1993-01-01
Difficulties in using the double-blind method of evaluation with use of Pulsed Electromagnetic Fields led to an open evaluation with 96 patients with musculoskeletal diseases, neurological disorders, circulatory diseases, or gastroenterological diseases. This paper reports the impact of use on dependency, pain, and patient satisfaction. (DB)
Flotemersch, Joseph E; North, Sheila; Blocksom, Karen A
2014-02-01
Benthic macroinvertebrates are sampled in streams and rivers as one of the assessment elements of the US Environmental Protection Agency's National Rivers and Streams Assessment. In a 2006 report, the recommendation was made that different yet comparable methods be evaluated for different types of streams (e.g., low gradient vs. high gradient). Consequently, a research element was added to the 2008-2009 National Rivers and Streams Assessment to conduct a side-by-side comparison of the standard macroinvertebrate sampling method with an alternate method specifically designed for low-gradient wadeable streams and rivers that focused more on stream edge habitat. Samples were collected using each method at 525 sites in five of nine aggregate ecoregions located in the conterminous USA. Methods were compared using the benthic macroinvertebrate multimetric index developed for the 2006 Wadeable Streams Assessment. Statistical analysis did not reveal any trends that would suggest the overall assessment of low-gradient streams on a regional or national scale would change if the alternate method was used rather than the standard sampling method, regardless of the gradient cutoff used to define low-gradient streams. Based on these results, the National Rivers and Streams Survey should continue to use the standard field method for sampling all streams.
Using Perturbation Theory to Reduce Noise in Diffusion Tensor Fields
Bansal, Ravi; Staib, Lawrence H.; Xu, Dongrong; Laine, Andrew F.; Liu, Jun; Peterson, Bradley S.
2009-01-01
We propose the use of Perturbation theory to reduce noise in Diffusion Tensor (DT) fields. Diffusion Tensor Imaging (DTI) encodes the diffusion of water molecules along different spatial directions in a positive-definite, 3 × 3 symmetric tensor. Eigenvectors and eigenvalues of DTs allow the in vivo visualization and quantitative analysis of white matter fiber bundles across the brain. The validity and reliability of these analyses are limited, however, by the low spatial resolution and low Signal-to-Noise Ratio (SNR) in DTI datasets. Our procedures can be applied to improve the validity and reliability of these quantitative analyses by reducing noise in the tensor fields. We model a tensor field as a three-dimensional Markov Random Field and then compute the likelihood and the prior terms of this model using Perturbation theory. The prior term constrains the tensor field to be smooth, whereas the likelihood term constrains the smoothed tensor field to be similar to the original field. Thus, the proposed method generates a smoothed field that is close in structure to the original tensor field. We evaluate the performance of our method both visually and quantitatively using synthetic and real-world datasets. We quantitatively assess the performance of our method by computing the SNR for eigenvalues and the coherence measures for eigenvectors of DTs across tensor fields. In addition, we quantitatively compare the performance of our procedures with the performance of one method that uses a Riemannian distance to compute the similarity between two tensors, and with another method that reduces noise in tensor fields by anisotropically filtering the diffusion weighted images that are used to estimate diffusion tensors. These experiments demonstrate that our method significantly increases the coherence of the eigenvectors and the SNR of the eigenvalues, while simultaneously preserving the fine structure and boundaries between homogeneous regions, in the smoothed tensor field. PMID:19540791
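One of the quantitative checks mentioned above, an SNR for the tensor eigenvalues over a nominally homogeneous region, can be sketched as follows; the tensors are synthetic placeholders rather than measured DTI data.

```python
import numpy as np

# Sketch of eigenvalue SNR for a stack of 3x3 diffusion tensors drawn from a
# nominally homogeneous region: SNR = mean / std per eigenvalue. Synthetic data.

rng = np.random.default_rng(0)
n_voxels = 500
base = np.diag([1.7e-3, 0.4e-3, 0.3e-3])          # mm^2/s, prolate tensor

tensors = np.empty((n_voxels, 3, 3))
for i in range(n_voxels):
    noise = 2e-5 * rng.standard_normal((3, 3))
    noise = 0.5 * (noise + noise.T)                # keep the tensor symmetric
    tensors[i] = base + noise

eigvals = np.linalg.eigvalsh(tensors)              # sorted ascending per voxel
snr = eigvals.mean(axis=0) / eigvals.std(axis=0)   # one SNR per eigenvalue

for lam, s in zip(eigvals.mean(axis=0)[::-1], snr[::-1]):
    print(f"eigenvalue ~ {lam:.2e} mm^2/s, SNR ~ {s:.1f}")
```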
Malhat, Farag; Kasiotis, Konstantinos M; Shalaby, Shehata
2018-02-05
Cyantraniliprole is an anthranilic diamide insecticide, belonging to the ryanoid class, with a broad range of applications against several pests. In the presented work, a reliable analytical technique employing high-performance liquid chromatography coupled with photodiode array detector (HPLC-DAD) for analyzing cyantraniliprole residues in tomato was developed. The method was then applied to field-incurred tomato samples collected after applications under open field conditions. The latter aimed to ensure the safe application of cyantraniliprole to tomato and contribute the derived residue data to the risk assessment under field conditions. Sample preparation involved a single step extraction with acetonitrile and sodium chloride for partitioning. The extract was purified utilizing florisil as cleanup reagent. The developed method was further evaluated by comparing the analytical results with those obtained using the QuEChERS technique. The novel method outbalanced QuEChERS regarding matrix interferences in the analysis, while it met all guideline criteria. Hence, it showed excellent linearity over the assayed concentration and yielded satisfactory recovery rate in the range of 88.9 to 96.5%. The half-life of degradation of cyantraniliprole was determined at 2.6 days. Based on the Codex MRL, the pre-harvest interval (PHI) for cyantraniliprole on tomato was 3 days, after treatment at the recommended dose. To our knowledge, the present work provides the first record on PHI determination of cyantraniliprole in tomato under open field conditions in Egypt and the broad Mediterranean region.
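The reported half-life follows from fitting first-order dissipation kinetics to the residue decline, i.e. ln(C_t) = ln(C_0) - kt and t_1/2 = ln 2 / k. A small sketch of that calculation with placeholder residue values:

```python
import numpy as np

# Sketch of a first-order dissipation fit behind a half-life estimate.
# Residue values below are placeholders, not the study's field data.

days = np.array([0, 1, 3, 5, 7, 10], dtype=float)
residue_mg_kg = np.array([0.95, 0.72, 0.42, 0.25, 0.15, 0.07])

slope, intercept = np.polyfit(days, np.log(residue_mg_kg), 1)
k = -slope                      # first-order dissipation rate constant
half_life = np.log(2) / k

print(f"dissipation rate k = {k:.3f} /day, half-life = {half_life:.1f} days")
```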
Grane, Camilla
2018-01-01
Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM²E model, whose purpose is to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM²E includes a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM²E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through feet movements and a proposed Optimal Risk Management Model. Failure-GAM²E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.
Accuracy assessment of high resolution satellite imagery orientation by leave-one-out method
NASA Astrophysics Data System (ADS)
Brovelli, Maria Antonia; Crespi, Mattia; Fratarcangeli, Francesca; Giannone, Francesca; Realini, Eugenio
Interest in high-resolution satellite imagery (HRSI) is spreading in several application fields, at both scientific and commercial levels. Fundamental and critical goals for the geometric use of this kind of imagery are its orientation and orthorectification, processes able to georeference the imagery and correct the geometric deformations it undergoes during acquisition. In order to exploit the actual potential of orthorectified imagery in Geomatics applications, the definition of a methodology to assess the spatial accuracy achievable from oriented imagery is a crucial topic. In this paper we propose a new method for accuracy assessment based on Leave-One-Out Cross-Validation (LOOCV), a model validation method already applied in different fields such as machine learning, bioinformatics and, generally, in any other field requiring an evaluation of the performance of a learning algorithm (e.g. in geostatistics), but never applied to HRSI orientation accuracy assessment. The proposed method exhibits interesting features which are able to overcome the most remarkable drawbacks of the commonly used method (Hold-Out Validation, HOV), based on partitioning the known ground points into two sets: the first is used in the orientation-orthorectification model (GCPs, Ground Control Points) and the second is used to validate the model itself (CPs, Check Points). In fact the HOV is generally not reliable and is not applicable when only a low number of ground points is available. To test the proposed method we implemented a new routine that performs the LOOCV in the software SISAR, developed by the Geodesy and Geomatics Team at the Sapienza University of Rome to perform the rigorous orientation of HRSI; this routine was tested on some EROS-A and QuickBird images. Moreover, these images were also oriented using the internationally recognized commercial software OrthoEngine v. 10 (included in the Geomatica suite by PCI), performing the LOOCV manually since only the HOV is implemented. The software comparison confirmed the overall correctness and good performance of the SISAR model, whereas the results showed the good features of the LOOCV method.
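The LOOCV procedure itself is generic: refit the orientation model without one ground point, predict that point, and accumulate the residuals. In the sketch below a simple 2D affine transform stands in for the rigorous sensor model implemented in SISAR, and the coordinates are synthetic.

```python
import numpy as np

# Generic sketch of Leave-One-Out Cross-Validation (LOOCV) for orientation
# accuracy assessment. A 2D affine transform stands in for the rigorous
# sensor model; image/ground coordinates are synthetic placeholders.

def fit_affine(img_xy, gnd_xy):
    """Least-squares affine mapping image coords -> ground coords."""
    a = np.hstack([img_xy, np.ones((len(img_xy), 1))])     # [x, y, 1]
    params, *_ = np.linalg.lstsq(a, gnd_xy, rcond=None)    # 3x2 parameter matrix
    return params

def predict(params, img_xy):
    a = np.hstack([img_xy, np.ones((len(img_xy), 1))])
    return a @ params

def loocv_rmse(img_xy, gnd_xy):
    residuals = []
    for i in range(len(img_xy)):
        keep = np.arange(len(img_xy)) != i
        params = fit_affine(img_xy[keep], gnd_xy[keep])
        pred = predict(params, img_xy[i:i + 1])
        residuals.append(np.linalg.norm(pred - gnd_xy[i:i + 1]))
    return np.sqrt(np.mean(np.square(residuals)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    img = rng.uniform(0, 1000, size=(12, 2))                # image coords (px)
    truth = np.array([[2.0, 0.1], [-0.1, 2.0]])
    gnd = img @ truth.T + np.array([500.0, 300.0])          # ground coords (m)
    gnd += rng.normal(0, 0.5, gnd.shape)                    # 0.5 m measurement noise
    print(f"LOOCV RMSE: {loocv_rmse(img, gnd):.2f} m")
```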
NASA Astrophysics Data System (ADS)
Talvik, Silja; Oja, Tõnis; Ellmann, Artu; Jürgenson, Harli
2014-05-01
Gravity field models on a regional scale are needed for a number of applications, for example national geoid computation, processing of precise levelling data and geological modelling. Thus the methods applied for modelling the gravity field from surveyed gravimetric information need to be considered carefully. The influence of using different gridding methods, the inclusion of unit or realistic weights and the indirect gridding of free air anomalies (FAA) are investigated in this study. Known gridding methods such as kriging (KRIG), least squares collocation (LSCO), continuous curvature (CCUR) and optimal Delaunay triangulation (ODET) are used to produce gridded gravity field surfaces. As the quality of the collected data varies considerably depending on the methods and instruments available or used in surveying, it is important to weight the input data in some way. This puts additional demands on data maintenance, since accuracy information needs to be available for each data point participating in the modelling; this is complicated by older gravity datasets, for which the uncertainties of not only the gravity values but also supplementary information, such as survey point position, are not always known very accurately. A number of gravity field applications (e.g. geoid computation) demand an FAA model, the acquisition of which is also investigated. Instead of direct gridding, it could be more appropriate to proceed with indirect FAA modelling using a Bouguer anomaly grid to reduce the effect of topography on the resulting FAA model (e.g. near terraced landforms). The inclusion of different gridding methods, weights and indirect FAA modelling helps to improve gravity field modelling methods. It becomes possible to estimate the impact of varying methodological approaches on gravity field modelling as the statistical output is compared. Such knowledge helps assess the accuracy of gravity field models and their effect on the aforementioned applications.
Quantitative Estimation of Risks for Production Unit Based on OSHMS and Process Resilience
NASA Astrophysics Data System (ADS)
Nyambayar, D.; Koshijima, I.; Eguchi, H.
2017-06-01
Three principal elements in the production field of the chemical/petrochemical industry are (i) Production Units, (ii) Production Plant Personnel and (iii) the Production Support System (a computer system introduced for improving productivity). Each principal element has production process resilience, i.e. a capability to restrain disruptive signals occurring in and outside the production field, and for each principal element risk assessment is indispensable. In production facilities, the occupational safety and health management system (hereafter referred to as OSHMS) has been introduced to reduce the risk of accidents and troubles that may occur during production. In OSHMS, a risk assessment is specified to reduce potential risks in a production facility such as a factory, and PDCA activities are required for continual improvement of safe production environments. However, there is no clear statement on how to adopt the OSHMS standard in the production field. This study introduces a metric to estimate the resilience of the production field by using the resilience generated by the production plant personnel and the result of the risk assessment in the production field. A method for evaluating how systematically OSHMS functions are installed in the production field is also discussed, based on the resilience of the three principal elements.
Clinical efficacy and effectiveness of 3D printing: a systematic review
Diment, Laura E; Thompson, Mark S; Bergmann, Jeroen H M
2017-01-01
Objective To evaluate the clinical efficacy and effectiveness of using 3D printing to develop medical devices across all medical fields. Design Systematic review compliant with Preferred Reporting Items for Systematic Reviews and Meta-Analyses. Data sources PubMed, Web of Science, OVID, IEEE Xplore and Google Scholar. Methods A double-blinded review method was used to select all abstracts up to January 2017 that reported on clinical trials of a three-dimensional (3D)-printed medical device. The studies were ranked according to their level of evidence, divided into medical fields based on the International Classification of Diseases chapter divisions and categorised into whether they were used for preoperative planning, aiding surgery or therapy. The Downs and Black Quality Index critical appraisal tool was used to assess the quality of reporting, external validity, risk of bias, risk of confounding and power of each study. Results Of the 3084 abstracts screened, 350 studies met the inclusion criteria. Oral and maxillofacial surgery contained 58.3% of studies, and 23.7% covered the musculoskeletal system. Only 21 studies were randomised controlled trials (RCTs), and all fitted within these two fields. The majority of RCTs were 3D-printed anatomical models for preoperative planning and guides for aiding surgery. The main benefits of these devices were decreased surgical operation times and increased surgical accuracy. Conclusions All medical fields that assessed 3D-printed devices concluded that they were clinically effective. The fields that most rigorously assessed 3D-printed devices were oral and maxillofacial surgery and the musculoskeletal system, both of which concluded that the 3D-printed devices outperformed their conventional comparators. However, the efficacy and effectiveness of 3D-printed devices remain undetermined for the majority of medical fields. 3D-printed devices can play an important role in healthcare, but more rigorous and long-term assessments are needed to determine if 3D-printed devices are clinically relevant before they become part of standard clinical practice. PMID:29273650
Lecourt, Julien; Bishop, Gerard
2018-01-01
Global food security for the increasing world population not only requires increased sustainable production of food but a significant reduction in pre- and post-harvest waste. The timing of when a fruit is harvested is critical for reducing waste along the supply chain and increasing fruit quality for consumers. The early in-field assessment of fruit ripeness and prediction of the harvest date and yield by non-destructive technologies have the potential to revolutionize farming practices and enable the consumer to eat the tastiest and freshest fruit possible. A variety of non-destructive techniques have been applied to estimate the ripeness or maturity but not all of them are applicable for in situ (field or glasshouse) assessment. This review focuses on the non-destructive methods which are promising for, or have already been applied to, the pre-harvest in-field measurements including colorimetry, visible imaging, spectroscopy and spectroscopic imaging. Machine learning and regression models used in assessing ripeness are also discussed. PMID:29320410
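As a hedged, purely illustrative example of the calibration step common to many of the reviewed non-destructive techniques, the sketch below fits a linear model relating a hypothetical optical ripeness index to days-to-harvest; the index name and all numbers are invented for illustration and are not taken from the review.

```python
# Hedged sketch: relating a non-destructive optical index to fruit ripeness.
# The "optical_index" and calibration values are synthetic illustrations.
import numpy as np

days_to_harvest = np.array([28, 24, 20, 16, 12, 8, 4, 0], dtype=float)
optical_index = np.array([0.12, 0.18, 0.27, 0.35, 0.48, 0.61, 0.72, 0.80])  # hypothetical

# Simple linear calibration: predicted days to harvest from the optical index.
slope, intercept = np.polyfit(optical_index, days_to_harvest, deg=1)
predict = lambda idx: slope * idx + intercept
print(round(predict(0.55), 1))  # estimated days remaining for a newly measured fruit
```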
Dyer, Bryce
2015-06-01
This study introduces the importance of aerodynamics to prosthetic limb design for athletes with either a lower-limb or upper-limb amputation. The study comprises two elements: 1) an initial experiment investigating the stability of outdoor velodrome-based field tests, and 2) an experiment evaluating the application of outdoor velodrome aerodynamic field tests to detect small-scale changes in aerodynamic drag representative of prosthetic limb componentry changes. An outdoor field-testing method is used to detect small and repeatable changes in the aerodynamic drag of an able-bodied cyclist. These changes were made at levels typical of alterations in prosthetic componentry. The field-based test method of assessment is used at a smaller level of resolution than previously reported. With a carefully applied protocol, the field test method proved to be statistically stable. The results of the field test experiments demonstrate a noticeable change in overall athlete performance. Aerodynamic refinement of artificial limbs is worthwhile for athletes looking to maximise their competitive performance. The field-testing protocol undertaken in this study gives prosthetists and sports engineers an accessible and affordable means of aerodynamic optimisation of prosthetic limb components. Using simple and accessible field-testing methods, this exploratory experiment demonstrates how small changes to riders' equipment, commensurate with the scale of a small change in prosthetic componentry, can affect the performance of an athlete. Prosthetists should consider such opportunities for performance enhancement when possible. © The International Society for Prosthetics and Orthotics 2014.
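The abstract does not detail the drag-estimation protocol; as an assumed illustration of one common outdoor field-test analysis, the sketch below regresses steady-speed power demand against speed to separate aerodynamic drag area (CdA) from rolling resistance. All values are synthetic and do not reproduce the study's data or method.

```python
# Hedged sketch of one common outdoor field-test analysis (not necessarily the author's):
# estimate drag area (CdA) and rolling resistance from steady-speed power measurements.
import numpy as np

rho, mass, g = 1.20, 80.0, 9.81              # air density [kg/m^3], rider+bike mass [kg], gravity [m/s^2]
v = np.array([8.0, 9.0, 10.0, 11.0, 12.0])   # laps at steady speeds [m/s]
p = np.array([130., 176., 232., 300., 380.]) # measured power per lap [W] (synthetic)

# Model: P = 0.5*rho*CdA*v^3 + Crr*m*g*v  ->  linear in [0.5*rho*v^3, m*g*v]
A = np.column_stack([0.5 * rho * v**3, mass * g * v])
(cda, crr), *_ = np.linalg.lstsq(A, p, rcond=None)
print(f"CdA ~ {cda:.3f} m^2, Crr ~ {crr:.4f}")
```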
Monitoring colony-level effects of sublethal pesticide exposure on honey bees
USDA-ARS?s Scientific Manuscript database
The effects of sublethal pesticide exposure to honey bee colonies may be significant but difficult to detect in the field using standard visual assessment methods. Here we describe methods to measure the quantities of adult bees, brood and food resources by weighing hives and hive parts, by photogra...
Position Description Analysis: A Method for Describing Academic Roles and Functions.
ERIC Educational Resources Information Center
Renner, K. Edward; Skibbens, Ronald J.
1990-01-01
The Position Description Analysis method for assessing the discrepancy between status quo and specializations needed by institutions to meet new demands and expectations is presented using Dalhousie University (Nova Scotia) as a case study. Dramatic realignment of fields of specialization and change strategies accommodating the aging professoriate…
Concepts in ecological risk assessment. Professional paper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, R.K.; Seligman, P.F.
1991-05-01
Assessing the risk of impact to natural ecosystems from xenobiotic compounds requires an accurate characterization of the threatened ecosystem, direct measures or estimates of environmental exposure, and a comprehensive evaluation of the biological effects from actual and potential contamination. Field and laboratory methods have been developed to obtain direct measures of environmental health. These methods have been implemented in monitoring programs to assess and verify the ecological risks of contamination from past events, such as hazardous waste disposal sites, as well as future scenarios, such as the environmental consequences from the use of biocides in antifouling bottom paints for ships.
Measuring zoo animal welfare: theory and practice.
Hill, Sonya P; Broom, Donald M
2009-11-01
The assessment of animal welfare relates to investigations of how animals try to cope with their environment, and how easy or how difficult it is for them to do so. The use of rigorous scientific methods to assess this has grown over the past few decades, and so our understanding of the needs of animals has improved during this time. Much of the work in the field of animal welfare has been conducted on farm animals, but it is important to consider how the methods and approaches used in assessing farm animal welfare have been, and can be, adapted and applied to the measurement of welfare in animals in other domains, such as in zoos. This is beneficial to our understanding of both the theoretical knowledge, and the practicability of methods. In this article, some of the commonly-used methods for measuring animal welfare will be discussed, as well as some practical considerations in assessing the welfare of zoo animals.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-04
... implemented in the field to benefit highly vulnerable populations, including recommendations on the most cost... Institute of Food and Agriculture, USDA. ACTION: Notice; request for public comment. SUMMARY: The National... Extension Service (CSREES) requests input from the public regarding (1) the assessment of methods and tools...
Using Oral Exams to Assess Communication Skills in Business Courses
ERIC Educational Resources Information Center
Burke-Smalley, Lisa A.
2014-01-01
Business, like many other fields in higher education, continues to rely largely on conventional testing methods for assessing student learning. In the current article, another evaluation approach--the oral exam--is examined as a means for building and evaluating the professional communication and oral dialogue skills needed and utilized by…
Variables Associated with Treatment Failure among Adolescent Sex Offenders
ERIC Educational Resources Information Center
Eastman, Brenda J.
2005-01-01
While an adolescent sexual offender's response to treatment is thought to be impacted by both static and dynamic factors, there is no objective method of assessing the likelihood of success or failure in treatment. The assessment of amenability to treatment is generally a subjective process completed by clinicians in the field. Using descriptive…
The research from this REMAP project produced results that demonstrate various stages of an assessment strategy and produced tools including an inventory classification, field methods and multimetric biotic indices that are now available for use by environmental resource managers...
Friedman, Lawrence H.; Vaudin, Mark D.; Stranick, Stephan J.; Stan, Gheorghe; Gerbig, Yvonne B.; Osborn, William; Cook, Robert F.
2016-01-01
The accuracy of electron backscatter diffraction (EBSD) and confocal Raman microscopy (CRM) for small-scale strain mapping is assessed using the multi-axial strain field surrounding a wedge indentation in Si as a test vehicle. The strain field is modeled using finite element analysis (FEA) that is adapted to the near-indentation surface profile measured by atomic force microscopy (AFM). The assessment consists of (1) direct experimental comparisons of strain and deformation and (2) comparisons in which the modeled strain field is used as an intermediate step. Direct experimental methods (1) consist of comparisons of surface elevation and gradient measured by AFM and EBSD and of Raman shifts measured and predicted by CRM and EBSD, respectively. Comparisons that utilize the combined FEA-AFM model (2) consist of predictions of distortion, strain, and rotation for comparison with EBSD measurements and predictions of Raman shift for comparison with CRM measurements. For both EBSD and CRM, convolution of measurements in depth-varying strain fields is considered. The interconnected comparisons suggest that EBSD was able to provide an accurate assessment of the wedge indentation deformation field to within the precision of the measurements, approximately 2 × 10^-4 in strain. CRM was similarly precise, but was limited in accuracy to several times this value. PMID:26939030
NASA Astrophysics Data System (ADS)
Tuckness, D. G.; Jost, B.
1995-08-01
Current knowledge of the lunar gravity field is presented. The various methods used in determining these gravity fields are investigated and analyzed. It will be shown that weaknesses exist in the current models of the lunar gravity field. The dominant part of this weakness is caused by the lack of lunar tracking data information (farside, polar areas), which makes modeling the total lunar potential difficult. Comparisons of the various lunar models reveal an agreement in the low-order coefficients of the Legendre polynomials expansions. However, substantial differences in the models can exist in the higher-order harmonics. The main purpose of this study is to assess today's lunar gravity field models for use in tomorrow's lunar mission designs and operations.
Liorni, I; Parazzini, M; Fiocchi, S; Guadagnin, V; Ravazzani, P
2014-01-01
Polynomial Chaos (PC) is a decomposition method used to build a meta-model, which approximates the unknown response of a model. In this paper the PC method is applied to stochastic dosimetry to assess the variability of human exposure due to changes in the orientation of the B-field vector with respect to the human body. In detail, the exposure of a pregnant woman at 7 months of gestational age is analysed to build up a statistical meta-model of the induced electric field for each fetal tissue and for the fetal whole body, by means of the PC expansion as a function of the B-field orientation, considering a uniform exposure at 50 Hz.
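A minimal, hedged sketch of a regression-based polynomial chaos expansion over a single uniform input (the B-field orientation angle): the induced_e_field function is a synthetic stand-in for the dosimetric solver, and the training design and expansion order are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of a regression-based polynomial chaos (PC) meta-model over one uniform input.
import numpy as np
from numpy.polynomial import legendre

def induced_e_field(theta):                  # placeholder for the expensive dosimetry model
    return 1.0 + 0.3 * np.cos(theta) + 0.05 * np.sin(3 * theta)

angles = np.linspace(0.0, 2 * np.pi, 40)     # training designs over the orientation range
xi = 2 * angles / (2 * np.pi) - 1            # map to the standard uniform variable on [-1, 1]
samples = induced_e_field(angles)

order = 6
coeffs = legendre.legfit(xi, samples, order) # PC coefficients by least squares

# Statistics follow directly from the coefficients (for uniform xi, E[P_k^2] = 1/(2k+1)).
mean = coeffs[0]
variance = sum(c**2 / (2 * k + 1) for k, c in enumerate(coeffs) if k > 0)
print(mean, variance)
```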
Briand, Françoise; Guerin, Patrick M.; Charmillot, Pierre-Joseph; Kehrli, Patrik
2012-01-01
Mating disruption by sex pheromones is a sustainable, effective and widely used pest management scheme. A drawback of this technique is its challenging assessment of effectiveness in the field (e.g., spatial scale, pest density). The aim of this work was to facilitate the evaluation of field-deployed pheromone dispensers. We tested the suitability of small insect field cages for a pre-evaluation of the impact of sex pheromones on mating using the grape moths Eupoecilia ambiguella and Lobesia botrana, two major pests in vineyards. Cages consisted of a cubic metal frame of 35 cm sides, which was covered with a mosquito net of 1500 μm mesh size. Cages were installed in the centre of pheromone-treated and untreated vineyards. In several trials, 1 to 20 couples of grape moths per cage were released for one to three nights. The proportion of mated females was between 15 to 70% lower in pheromone-treated compared to untreated vineyards. Overall, the exposure of eight couples for one night was adequate for comparing different control schemes. Small cages may therefore provide a fast and cheap method to compare the effectiveness of pheromone dispensers under standardised semi-field conditions and may help predict the value of setting-up large-scale field trials. PMID:22645483
Dynamical Downscaling of NASA/GISS ModelE: Continuous, Multi-Year WRF Simulations
NASA Astrophysics Data System (ADS)
Otte, T.; Bowden, J. H.; Nolte, C. G.; Otte, M. J.; Herwehe, J. A.; Faluvegi, G.; Shindell, D. T.
2010-12-01
The WRF Model is being used at the U.S. EPA for dynamical downscaling of the NASA/GISS ModelE fields to assess regional impacts of climate change in the United States. The WRF model has been successfully linked to the ModelE fields in their raw hybrid vertical coordinate, and continuous, multi-year WRF downscaling simulations have been performed. WRF will be used to downscale decadal time slices of ModelE for recent past, current, and future climate as the simulations being conducted for the IPCC Fifth Assessment Report become available. This presentation will focus on the sensitivity to interior nudging within the RCM. The use of interior nudging for downscaled regional climate simulations has been somewhat controversial over the past several years but has been recently attracting attention. Several recent studies that have used reanalysis (i.e., verifiable) fields as a proxy for GCM input have shown that interior nudging can be beneficial toward achieving the desired downscaled fields. In this study, the value of nudging will be shown using fields from ModelE that are downscaled using WRF. Several different methods of nudging are explored, and it will be shown that the method of nudging and the choices made with respect to how nudging is used in WRF are critical to balance the constraint of ModelE against the freedom of WRF to develop its own fields.
2011-01-01
Background Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set. PMID:21324199
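As a hedged illustration of the difference/equivalence logic described above, the sketch below runs a two one-sided tests (TOST) procedure with equivalence limits set crudely from the spread of the reference varieties; the paper derives its limits from a linear mixed model, so this is a simplified stand-in using synthetic data.

```python
# Hedged sketch: two one-sided tests (TOST) for equivalence of a GM variety against
# reference varieties; equivalence limits here are a simplification of the paper's approach.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gm = rng.normal(10.2, 0.8, 12)               # GM variety measurements of one compound
refs = rng.normal(10.0, 1.0, 60)             # pooled reference varieties with a history of safe use

lo = np.mean(refs) - 2 * np.std(refs, ddof=1)
hi = np.mean(refs) + 2 * np.std(refs, ddof=1)

# One-sided t statistics that the GM mean lies above 'lo' and below 'hi'.
se = np.std(gm, ddof=1) / np.sqrt(len(gm))
t_lo = (np.mean(gm) - lo) / se
t_hi = (hi - np.mean(gm)) / se
p_equiv = max(stats.t.sf(t_lo, len(gm) - 1), stats.t.sf(t_hi, len(gm) - 1))
print("equivalent at 5% level:", p_equiv < 0.05)
```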
NASA Astrophysics Data System (ADS)
Zaccheo, T. S.; Pernini, T.; Botos, C.; Dobler, J. T.; Blume, N.; Braun, M.; Levine, Z. H.; Pintar, A. L.
2014-12-01
This work presents a methodology for constructing 2D estimates of CO2 field concentrations from integrated open path measurements of CO2 concentrations. It provides a description of the methodology, an assessment based on simulated data and results from preliminary field trials. The Greenhouse gas Laser Imaging Tomography Experiment (GreenLITE) system, currently under development by Exelis and AER, consists of a set of laser-based transceivers and a number of retro-reflectors coupled with a cloud-based compute environment to enable real-time monitoring of integrated CO2 path concentrations, and provides 2D maps of estimated concentrations over an extended area of interest. The GreenLITE transceiver-reflector pairs provide laser absorption spectroscopy (LAS) measurements of differential absorption due to CO2 along intersecting chords within the field of interest. These differential absorption values for the intersecting chords of horizontal path are not only used to construct estimated values of integrated concentration, but also employed in an optimal estimation technique to derive 2D maps of underlying concentration fields. This optimal estimation technique combines these sparse data with in situ measurements of wind speed/direction and an analytic plume model to provide tomographic-like reconstruction of the field of interest. This work provides an assessment of this reconstruction method and preliminary results from the Fall 2014 testing at the Zero Emissions Research and Technology (ZERT) site in Bozeman, Montana. This work is funded in part under the GreenLITE program developed under a cooperative agreement between Exelis and the National Energy and Technology Laboratory (NETL) under the Department of Energy (DOE), contract # DE-FE0012574. Atmospheric and Environmental Research, Inc. is a major partner in this development.
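A hedged sketch of the underlying inverse problem: recovering a gridded 2D concentration field from path-integrated chord measurements by regularized least squares. The grid, chord geometry, noise level and Tikhonov weight are illustrative assumptions and do not reproduce the GreenLITE retrieval, which additionally uses wind data and an analytic plume model.

```python
# Hedged sketch: tomographic-like retrieval of a 2D field from chord-integrated measurements.
import numpy as np

nx = ny = 10
xg = yg = np.linspace(0, 100, nx + 1)                       # cell edges [m]
xc, yc = np.meshgrid((xg[:-1] + xg[1:]) / 2, (yg[:-1] + yg[1:]) / 2)
truth = 400 + 40 * np.exp(-((xc - 60)**2 + (yc - 40)**2) / 300)  # ppm-like field with a "plume"

rng = np.random.default_rng(2)
chords = [((0, rng.uniform(0, 100)), (100, rng.uniform(0, 100))) for _ in range(40)]

def row_for_chord(p0, p1, npts=400):
    """Approximate per-cell path lengths by dense sampling along the chord."""
    t = np.linspace(0, 1, npts)
    xs, ys = p0[0] + t * (p1[0] - p0[0]), p0[1] + t * (p1[1] - p0[1])
    ix = np.clip(np.searchsorted(xg, xs) - 1, 0, nx - 1)
    iy = np.clip(np.searchsorted(yg, ys) - 1, 0, ny - 1)
    seg = np.hypot(p1[0] - p0[0], p1[1] - p0[1]) / npts
    row = np.zeros(nx * ny)
    np.add.at(row, iy * nx + ix, seg)
    return row

A = np.array([row_for_chord(p0, p1) for p0, p1 in chords])
y = A @ truth.ravel() + rng.normal(0, 1.0, len(chords))     # simulated path integrals

lam = 1e-2                                                  # Tikhonov regularization weight
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(nx * ny), A.T @ y)
field = x_hat.reshape(ny, nx)                               # reconstructed concentration map
```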
Cross-comparison and evaluation of air pollution field estimation methods
NASA Astrophysics Data System (ADS)
Yu, Haofei; Russell, Armistead; Mulholland, James; Odman, Talat; Hu, Yongtao; Chang, Howard H.; Kumar, Naresh
2018-04-01
Accurate estimates of human exposure are critical for air pollution health studies, and a variety of methods are currently being used to assign pollutant concentrations to populations. Results from these methods may differ substantially, which can affect the outcomes of health impact assessments. Here, we applied 14 methods for developing spatiotemporal air pollutant concentration fields of eight pollutants to the Atlanta, Georgia region. These methods include eight methods relying mostly on air quality observations (CM: central monitor; SA: spatial average; IDW: inverse distance weighting; KRIG: kriging; TESS-D: discontinuous tessellation; TESS-NN: natural neighbor tessellation with interpolation; LUR: land use regression; AOD: downscaled satellite-derived aerosol optical depth), one using the RLINE dispersion model, and five methods using a chemical transport model (CMAQ), with and without using observational data to constrain results. The derived fields were evaluated and compared. Overall, all methods generally perform better in urban than rural areas, and for secondary than primary pollutants. We found the CM and SA methods may be appropriate only for small domains, and for secondary pollutants, though the SA method led to large negative spatial correlations when using data withholding for PM2.5 (spatial correlation coefficient R = -0.81). The TESS-D method was found to have major limitations. Results of the IDW, KRIG and TESS-NN methods are similar. They are found to be better suited for secondary pollutants because of their satisfactory temporal performance (e.g. average temporal R2 > 0.85 for PM2.5 but less than 0.35 for the primary pollutant NO2). In addition, they are suitable only for areas with relatively dense monitoring networks, owing to their inability to capture spatial concentration variabilities, as indicated by the negative spatial R (lower than -0.2 for PM2.5 when assessed using data withholding). The performance of the LUR and AOD methods was similar to that of kriging. Using RLINE and CMAQ fields without fusing observational data led to substantial errors and biases, though the CMAQ model captured spatial gradients reasonably well (spatial R = 0.45 for PM2.5). Two unique tests conducted here included quantifying autocorrelation of method biases (which can be important in time series analyses) and how well the methods capture the observed interspecies correlations (which would be of particular importance in multipollutant health assessments). Autocorrelation of method biases lasted longest and interspecies correlations of primary pollutants were higher than observed when air quality models were used without data fusion. Use of hybrid methods that combine air quality model outputs with observational data overcomes some of these limitations and is better suited for health studies. Results from this study contribute to a better understanding of the strengths and weaknesses of different methods for estimating human exposures.
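For concreteness, a minimal sketch of one of the observation-based methods compared above, inverse distance weighting (IDW); the monitor locations and PM2.5 values are synthetic, and the power parameter is an assumption.

```python
# Hedged sketch of inverse distance weighting (IDW) for assigning concentrations to locations.
import numpy as np

def idw(x_obs, y_obs, z_obs, x_new, y_new, power=2.0):
    d = np.hypot(x_new[:, None] - x_obs[None, :], y_new[:, None] - y_obs[None, :])
    d = np.maximum(d, 1e-9)                       # avoid division by zero at a monitor
    w = 1.0 / d**power
    return (w * z_obs[None, :]).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(3)
xm, ym = rng.uniform(0, 50, 8), rng.uniform(0, 50, 8)   # monitor coordinates [km]
pm25 = rng.uniform(6, 14, 8)                            # observed PM2.5 [ug/m^3]
print(idw(xm, ym, pm25, np.array([25.0]), np.array([25.0])))  # estimate at a receptor
```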
Learning Receptive Fields and Quality Lookups for Blind Quality Assessment of Stereoscopic Images.
Shao, Feng; Lin, Weisi; Wang, Shanshan; Jiang, Gangyi; Yu, Mei; Dai, Qionghai
2016-03-01
Blind quality assessment of 3D images encounters more new challenges than its 2D counterparts. In this paper, we propose a blind quality assessment for stereoscopic images by learning the characteristics of receptive fields (RFs) from the perspective of dictionary learning, and constructing quality lookups to replace human opinion scores without performance loss. The important feature of the proposed method is that we do not need a large set of samples of distorted stereoscopic images and the corresponding human opinion scores to learn a regression model. To be more specific, in the training phase, we learn local RFs (LRFs) and global RFs (GRFs) from the reference and distorted stereoscopic images, respectively, and construct their corresponding local quality lookups (LQLs) and global quality lookups (GQLs). In the testing phase, blind quality pooling can be easily achieved by searching optimal GRF and LRF indexes from the learnt LQLs and GQLs, and the quality score is obtained by combining the LRF and GRF indexes together. Experimental results on three publicly available 3D image quality assessment databases demonstrate that, in comparison with the existing methods, the devised algorithm achieves highly consistent alignment with subjective assessment.
Selvakumar, N; Murthy, B N; Prabhakaran, E; Sivagamasundari, S; Vasanthan, Samuel; Perumal, M; Govindaraju, R; Chauhan, L S; Wares, Fraser; Santha, T; Narayanan, P R
2005-02-01
Assessment of 12 microscopy centers in a tuberculosis unit by blinded checking of eight sputum smears selected by using a lot quality assurance sampling (LQAS) method and by unblinded checking of all positive and five negative slides, among the slides examined in a month in a microscopy centre, revealed that the LQAS method can be implemented in the field to monitor the performance of acid-fast bacillus microscopy centers in national tuberculosis control programs.
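A hedged sketch of the sampling logic behind LQAS: with eight randomly selected smears, the probability that a centre passes the check depends on its true error rate and on the allowed number of discordant readings. The acceptance threshold used here is illustrative, not the programme's actual decision rule.

```python
# Hedged sketch of lot quality assurance sampling (LQAS) acceptance probabilities.
from scipy.stats import binom

sample_size = 8
max_allowed_errors = 0          # illustrative rule: reject the lot if any checked smear is discordant

for true_error_rate in (0.02, 0.05, 0.10, 0.20):
    p_accept = binom.cdf(max_allowed_errors, sample_size, true_error_rate)
    print(f"error rate {true_error_rate:.0%}: P(lot passes) = {p_accept:.2f}")
```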
Evaluation of methods for the assessment of attention while driving.
Kircher, Katja; Ahlstrom, Christer
2018-05-01
The ability to assess the current attentional state of the driver is important for many aspects of driving, not least in the field of partial automation for transfer of control between vehicle and driver. Knowledge about the driver's attentional state is also necessary for the assessment of the effects of additional tasks on attention. The objective of this paper is to evaluate different methods that can be used to assess attention, first theoretically, and then empirically in a controlled field study and in the laboratory. Six driving instructors participated in all experimental conditions of the study, delivering within-subjects data for all tested methods. Additional participants were recruited for some of the conditions. The test route consisted of 14 km of motorway with low to moderate traffic, which was driven three times per participant per condition. The on-road conditions were: baseline, driving with eye tracking and self-paced visual occlusion, and driving while thinking aloud. The laboratory conditions were: describing how attention should be distributed on a motorway, and thinking aloud while watching a video from the baseline drive. The results show that visual occlusion, especially in combination with eye tracking, was appropriate for assessing spare capacity. The think-aloud protocol was appropriate for gaining insight into the driver's actual mental representation of the situation at hand. Expert judgement in the laboratory was not reliable for the assessment of drivers' attentional distribution in traffic. Across all assessment techniques, it is evident that meaningful assessment of attention in a dynamic traffic situation can only be achieved when the infrastructure layout, surrounding road users, and intended manoeuvres are taken into account. This requires advanced instrumentation of the vehicle, and subsequent data reduction, analysis and interpretation are demanding. In conclusion, driver attention assessment in real traffic is a complex task, but a combination of visual occlusion, eye tracking and thinking aloud is a promising way forward. Copyright © 2017 Elsevier Ltd. All rights reserved.
Near-Sun and 1 AU magnetic field of coronal mass ejections: a parametric study
NASA Astrophysics Data System (ADS)
Patsourakos, S.; Georgoulis, M. K.
2016-11-01
Aims: The magnetic field of coronal mass ejections (CMEs) determines their structure, evolution, and energetics, as well as their geoeffectiveness. However, we currently lack routine diagnostics of the near-Sun CME magnetic field, which is crucial for determining the subsequent evolution of CMEs. Methods: We recently presented a method to infer the near-Sun magnetic field magnitude of CMEs and then extrapolate it to 1 AU. This method uses relatively easy to deduce observational estimates of the magnetic helicity in CME-source regions along with geometrical CME fits enabled by coronagraph observations. We hereby perform a parametric study of this method aiming to assess its robustness. We use statistics of active region (AR) helicities and CME geometrical parameters to determine a matrix of plausible near-Sun CME magnetic field magnitudes. In addition, we extrapolate this matrix to 1 AU and determine the anticipated range of CME magnetic fields at 1 AU representing the radial falloff of the magnetic field in the CME out to interplanetary (IP) space by a power law with index αB. Results: The resulting distribution of the near-Sun (at 10 R⊙) CME magnetic fields varies in the range [0.004, 0.02] G, comparable to, or higher than, a few existing observational inferences of the magnetic field in the quiescent corona at the same distance. We also find that a theoretically and observationally motivated range exists around αB = -1.6 ± 0.2, thereby leading to a ballpark agreement between our estimates and observationally inferred field magnitudes of magnetic clouds (MCs) at L1. Conclusions: In a statistical sense, our method provides results that are consistent with observations.
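Using the ranges quoted in the abstract, a short sketch of the power-law extrapolation from 10 solar radii to 1 AU; the conversions 1 AU ≈ 215 R⊙ and 1 G = 10^5 nT are standard, and the code is only a numerical restatement of the stated relation, not the paper's full method.

```python
# Hedged sketch: extrapolating a near-Sun CME field estimate to 1 AU with a power law
# B(r) = B0 * (r / r0)**alpha, using the ranges quoted in the abstract.
import numpy as np

r0, r1 = 10.0, 215.0                      # 10 solar radii and ~1 AU, in solar radii
B0 = np.array([0.004, 0.02])              # near-Sun field range at 10 Rsun [G]
for alpha in (-1.4, -1.6, -1.8):          # alpha_B = -1.6 +/- 0.2
    B1_nT = B0 * (r1 / r0)**alpha * 1e5   # 1 G = 1e5 nT
    print(f"alpha = {alpha}: B(1 AU) ~ {B1_nT[0]:.1f}-{B1_nT[1]:.1f} nT")
```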
Gravity Field Recovery from the Cartwheel Formation by the Semi-analytical Approach
NASA Astrophysics Data System (ADS)
Li, Huishu; Reubelt, Tilo; Antoni, Markus; Sneeuw, Nico; Zhong, Min; Zhou, Zebing
2016-04-01
Past and current gravimetric satellite missions have contributed substantially to our knowledge of the Earth's gravity field. Nevertheless, several geoscience disciplines push for even higher requirements on accuracy, homogeneity and time- and space-resolution of the Earth's gravity field. Apart from better instruments or new observables, alternative satellite formations could improve the signal and error structure. With respect to other methods, one significant advantage of the semi-analytical approach is its effective pre-mission error assessment for gravity field missions. The semi-analytical approach builds a linear analytical relationship between the Fourier spectrum of the observables and the spherical harmonic spectrum of the gravity field. The spectral link between observables and gravity field parameters is given by the transfer coefficients, which constitute the observation model. In connection with a stochastic model, it can be used for pre-mission error assessment of gravity field missions. The cartwheel formation is formed by two satellites on elliptic orbits in the same plane. The time-dependent ranging is considered in the transfer coefficients via convolution, including the series expansion of the eccentricity functions. The transfer coefficients are applied to assess the error patterns, which are caused by different orientations of the cartwheel, for range-rate and range-acceleration observations. This work will present the isotropy and magnitude of the formal errors of the gravity field coefficients for different orientations of the cartwheel.
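As a hedged illustration of the general principle only (not the actual transfer-coefficient formulas), the sketch below shows how formal errors of estimated coefficients follow from a linear observation model and an assumed noise level; the transfer matrix is random and merely stands in for the real spectral transfer coefficients.

```python
# Hedged sketch of pre-mission formal error assessment with a linear model y = A x:
# observable spectrum = (transfer coefficients) * (spherical harmonic coefficients).
import numpy as np

rng = np.random.default_rng(4)
n_obs, n_coeff = 500, 60
A = rng.normal(size=(n_obs, n_coeff))          # stand-in for the transfer coefficients
sigma = 1e-9                                   # assumed white observation noise level

cov = sigma**2 * np.linalg.inv(A.T @ A)        # formal covariance of estimated coefficients
formal_errors = np.sqrt(np.diag(cov))
print(formal_errors[:5])
```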
Capodaglio, E M; Facioli, M; Bazzini, G
2001-01-01
Pathologies due to repetitive activity of the upper limbs constitute a growing share of work-related musculoskeletal disorders. At the moment, there are no universally accepted and validated methods for the description and assessment of the work-related risks. Yet the criteria that fundamentally characterize the exposure are rather clear and consistent. This study reports a practical example of the application of some recent risk assessment methods proposed in the literature, combining objective and subjective measures obtained in the field with traditional activity analysis.
[Impact factor, its variants and its influence in academic promotion].
Puche, Rodolfo C
2011-01-01
Bibliometrics is a set of methods used to study or measure texts and information. While bibliometric methods are most often used in the field of library and information science, bibliometric variables have wide applications in other areas. One popular bibliometric variable is Garfield's Impact Factor (IF). The IF is used to explore the impact of a given field, the impact of a set of researchers, or the impact of a particular paper. It is used to assess academic output and is believed to adversely affect the traditional approach to the assessment of scientific research. In our country, the members of the evaluation committees of research-intensive institutions, e.g. the National Scientific and Technical Research Council (CONICET), use the IF to assess the quality of research. This article reviews the exponential growth of bibliometrics and attempts to expose the overall dissatisfaction with the analytical quality of the IF, a dissatisfaction expressed in the number of investigations attempting to obtain a variable of improved analytical quality.
NASA Astrophysics Data System (ADS)
Heo, Seung; Cheong, Cheolung; Kim, Taehoon
2015-09-01
In this study, an efficient numerical method is proposed for predicting tonal and broadband noises of a centrifugal fan unit. The proposed method is based on Hybrid Computational Aero-Acoustic (H-CAA) techniques combined with the Unsteady Fast Random Particle Mesh (U-FRPM) method. The U-FRPM method is developed by extending the FRPM method proposed by Ewert et al. and is utilized to synthesize the turbulence flow field from unsteady RANS solutions. The H-CAA technique combined with the U-FRPM method is applied to predict broadband as well as tonal noises of a centrifugal fan unit in a household refrigerator. Firstly, the unsteady flow field driven by a rotating fan is computed by solving the RANS equations with Computational Fluid Dynamic (CFD) techniques. Main source regions around the rotating fan are identified by examining the computed flow fields. Then, turbulence flow fields in the main source regions are synthesized by applying the U-FRPM method. The acoustic analogy is applied to model acoustic sources in the main source regions. Finally, the centrifugal fan noise is predicted by feeding the modeled acoustic sources into an acoustic solver based on the Boundary Element Method (BEM). The sound spectral levels predicted using the current numerical method show good agreement with the measured spectra at the Blade Pass Frequencies (BPFs) as well as in the high frequency range. Moreover, the present method enables quantitative assessment of the relative contributions of identified source regions to the sound field by comparing the predicted sound pressure spectra due to the modeled sources.
The Impact of Farmer Field Schools on Human and Social Capital: A Case Study from Ghana
ERIC Educational Resources Information Center
David, Soniia; Asamoah, Christopher
2011-01-01
Based on a case study of Ghanaian cocoa farmers who attended farmer field schools (FFS), this paper explores the impact of the FFS methodology on farmers' technical knowledge, experimentation, knowledge diffusion, group formation and social skills as a way of assessing whether the relatively high costs associated with the method are justified. We…
ERIC Educational Resources Information Center
Ying, Yu-Wen
2011-01-01
The author used a mixed methods design to assess field work-related educational disequilibrium and its effect on the self-concept and mental health of MSW students. Twenty-eight advanced, fourth-semester MSW students were compared with 37 entering, first-semester MSW students in practice-related sense of accomplishment. Compared with first-year…
ERIC Educational Resources Information Center
Higgins, Monica; Ishimaru, Ann; Holcombe, Rebecca; Fowler, Amy
2012-01-01
This study draws upon theory and methods from the field of organizational behavior to examine organizational learning (OL) in the context of a large urban US school district. We build upon prior literature on OL from the field of organizational behavior to introduce and validate three subscales that assess key dimensions of organizational learning…
Competency in Teaching Reading of Fieldbased and On-Campus Students at Cleveland State University.
ERIC Educational Resources Information Center
Boehnlein, Mary Maher; Gans, Thomas G.
The purpose of this study was to determine if students in a field-based program performed significantly better on a test of ability to assess and to teach specific reading skills than students enrolled in on-campus reading methods courses which employed the same textual materials and different amounts of field experiences with children. The…
ERIC Educational Resources Information Center
Smith, Ryan C.; Bowdring, Molly A.; Geller, E. Scott
2015-01-01
Objective: The determinants of alcohol consumption among university students were investigated in a downtown field setting with blood alcohol content (BAC) as the dependent variable. Participants: In total, 521 participants completed a brief survey and had their BAC assessed during April 2013. Methods: Between 10:00 pm and 2:00 am, teams of…
Particle image velocimetry of a flow at a vaulted wall.
Kertzscher, U; Berthe, A; Goubergrits, L; Affeld, K
2008-05-01
The assessment of flow along a vaulted wall (with two main finite radii of curvature) is of general interest; in biofluid mechanics, it is of special interest. Unlike the geometry of flows in engineering, flow geometry in nature is often determined by vaulted walls. Specifically the flow adjacent to the wall of blood vessels is particularly interesting since this is where either thrombi are formed or atherosclerosis develops. Current measurement methods have problems assessing the flow along vaulted walls. In contrast with conventional particle image velocimetry (PIV), this new method, called wall PIV, allows the investigation of a flow adjacent to transparent flexible surfaces with two finite radii of curvature. Using an optical method which allows the observation of particles up to a predefined depth enables the visualization solely of the boundary layer flow. This is accomplished by adding a specific dye to the fluid which absorbs the monochromatic light used to illuminate the region of observation. The obtained images can be analysed with the methods of conventional PIV and result in a vector field of the velocities along the wall. With wall PIV, the steady flow adjacent to the vaulted wall of a blood pump was investigated and the resulting velocity field as well as the velocity fluctuations were assessed.
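A hedged sketch of the correlation step shared with conventional PIV: estimating the displacement of a particle pattern between two frames from the peak of their cross-correlation. The synthetic frames and window size are illustrative; wall PIV itself additionally relies on the dye-limited observation depth described above.

```python
# Hedged sketch of the core PIV step: cross-correlating an interrogation window between
# two frames to obtain a displacement vector. Synthetic frames with a known shift.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(5)
frame1 = rng.random((64, 64))
shift = (3, 5)                                          # known displacement [pixels]
frame2 = np.roll(frame1, shift, axis=(0, 1))

win1 = frame1 - frame1.mean()
win2 = frame2 - frame2.mean()
corr = fftconvolve(win2, win1[::-1, ::-1], mode="same") # cross-correlation via FFT
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
print(dy - corr.shape[0] // 2, dx - corr.shape[1] // 2) # recovered displacement (3, 5)
```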
NASA Astrophysics Data System (ADS)
Schaumburg, F.; Guarnieri, F. A.
2017-05-01
A 3D anatomical computational model is developed to assess thermal effects due to exposure to the electromagnetic field required to power a new investigational active implantable microvalve for the treatment of glaucoma. Such a device, located in the temporal superior eye quadrant, produces a filtering bleb, which is included in the geometry of the model, together with the relevant ocular structures. The electromagnetic field source—a planar coil—as well as the microvalve antenna and casing are also included. Exposure to the electromagnetic field source of an implanted and a non-implanted subject are simulated by solving a magnetic potential formulation, using the finite element method. The maximum SAR10 is reached in the eyebrow and remains within the limits suggested by the IEEE and ICNIRP standards. The anterior chamber, filtering bleb, iris and ciliary body are the ocular structures where more absorption occurs. The temperature rise distribution is also obtained by solving the bioheat equation with the finite element method. The numerical results are compared with the in vivo measurements obtained from four rabbits implanted with the microvalve and exposed to the electromagnetic field source.
Field modeling and ray-tracing of a miniature scanning electron microscope beam column.
Loyd, Jody S; Gregory, Don A; Gaskin, Jessica A
2017-08-01
A miniature scanning electron microscope (SEM) focusing column design is introduced and its potential performance assessed through an estimation of parameters that affect the probe radius, to include source size, spherical and chromatic aberration, diffraction and space charge broadening. The focusing column, a critical component of any SEM capable of operating on the lunar surface, was developed by the NASA Marshall Space Flight Center and Advanced Research Systems. The ray-trace analysis presented uses a model of the electrostatic field (within the focusing column) that is first calculated using the boundary element method (BEM). This method provides flexibility in modeling the complex electrode shapes of practical electron lens systems. A Fourier series solution of the lens field is then derived within a cylindrical domain whose boundary potential is provided by the BEM. Used in this way, the Fourier series solution is an accuracy enhancement to the BEM solution, allowing sufficient precision to assess geometric aberrations through direct ray-tracing. Two modes of operation with distinct lens field solutions are described. © The Author 2017. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A
2016-07-01
Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
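A hedged sketch of the kind of measurement the batch macro automates, here just the percent leaf area covered by lesions obtained by intensity thresholding; the synthetic "leaf" array and the threshold value are illustrative and are not the study's ImageJ settings.

```python
# Hedged sketch: percent leaf area covered by lesions from a thresholded grayscale image.
import numpy as np

rng = np.random.default_rng(6)
leaf = np.full((200, 600), 180, dtype=float) + rng.normal(0, 5, (200, 600))  # healthy tissue
leaf[80:120, 200:400] = 90 + rng.normal(0, 5, (40, 200))                     # darker lesion

lesion_mask = leaf < 130                    # simple intensity threshold (illustrative)
pct_lesion_area = 100.0 * lesion_mask.mean()
print(f"lesion area: {pct_lesion_area:.1f}% of leaf")
```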
NASA Astrophysics Data System (ADS)
Boyarshinov, Michael G.; Vaismana, Yakov I.
2016-10-01
The following methods were used to identify the pollution fields of urban air caused by motor transport exhaust gases: a mathematical model, which makes it possible to consider the influence of the main factors that determine the formation of pollution fields in a complex spatial domain; software developed by the authors for computational modeling of the gas flow generated by numerous mobile point sources; and the results of computing experiments on pollutant spread and the evolution of concentration fields. The computational model of exhaust gas distribution and dispersion in a spatial domain, which includes urban buildings, structures and main traffic arteries, takes into account the stochastic character of car arrivals at the borders of the examined territory and uses a Poisson process. The model also considers traffic light switching and allows the fields of velocity, pressure and temperature of the discharged gases in urban air to be determined. Verification of the mathematical model and software confirmed their satisfactory fit to in-situ measurement data and the possibility of using the computed results for assessment and prediction of urban air pollution caused by motor transport exhaust gases.
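A minimal sketch of the stochastic boundary condition mentioned above: car arrival times at a border point drawn from a Poisson process (exponential inter-arrival times). The arrival rate and simulation period are illustrative assumptions.

```python
# Hedged sketch: Poisson arrivals of cars at a domain border point.
import numpy as np

rng = np.random.default_rng(7)
rate = 0.2                                     # assumed mean arrivals per second at one border point
horizon = 600.0                                # simulated period [s]

gaps = rng.exponential(1.0 / rate, size=int(rate * horizon * 2))  # exponential inter-arrival times
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals < horizon]        # car arrival times within the period
print(len(arrivals), "cars in", horizon, "s")
```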
Mathematics for the Student Scientist.
ERIC Educational Resources Information Center
Lauten, A. Darien; Lauten, Gary N.
1998-01-01
Describes the Earth Day-Forest Watch Program which introduces kindergarten through high school level students to field laboratory and satellite-data analysis methods for assessing the health of Eastern White Pine forest stands. (DDR)
Li, Hui; Kayhanian, Masoud; Harvey, John T
2013-03-30
Fully permeable pavement is gradually gaining support as an alternative best management practice (BMP) for stormwater runoff management. As the use of these pavements increases, a definitive test method is needed to measure hydraulic performance and to evaluate clogging, both for performance studies and for assessment of permeability for construction quality assurance and maintenance needs assessment. Two of the most commonly used permeability measurement tests for porous asphalt and pervious concrete are the National Center for Asphalt Technology (NCAT) permeameter and ASTM C1701, respectively. This study was undertaken to compare measured values for both methods in the field on a variety of permeable pavements used in current practice. The field measurements were performed using six experimental section designs with different permeable pavement surface types including pervious concrete, porous asphalt and permeable interlocking concrete pavers. Multiple measurements were performed at five locations on each pavement test section. The results showed that: (i) silicone gel is a superior sealing material to prevent water leakage compared with conventional plumbing putty; (ii) both methods (NCAT and ASTM) can effectively be used to measure the permeability of all pavement types and the surface material type will not impact the measurement precision; (iii) the permeability values measured with the ASTM method were 50-90% (75% on average) lower than those measured with the NCAT method; (iv) the larger permeameter cylinder diameter used in the ASTM method improved the reliability and reduced the variability of the measured permeability. Copyright © 2013 Elsevier Ltd. All rights reserved.
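As a hedged illustration of the reported comparison, the sketch below computes the per-location percentage by which ASTM readings fall below NCAT readings; all numbers are synthetic stand-ins, not the study's measurements.

```python
# Hedged sketch: paired comparison of two permeability test methods at the same locations.
import numpy as np

k_ncat = np.array([1.20, 0.95, 2.10, 1.60, 0.70])   # cm/s, hypothetical NCAT readings
k_astm = np.array([0.30, 0.25, 0.55, 0.35, 0.20])   # cm/s, hypothetical ASTM C1701 readings

pct_lower = 100.0 * (1.0 - k_astm / k_ncat)          # how much lower ASTM is, per location
print(f"ASTM lower than NCAT by {pct_lower.min():.0f}-{pct_lower.max():.0f}% "
      f"(mean {pct_lower.mean():.0f}%)")
```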
Amoutzopoulos, B; Steer, T; Roberts, C; Cade, J E; Boushey, C J; Collins, C E; Trolle, E; de Boer, E J; Ziauddeen, N; van Rossum, C; Buurma, E; Coyle, D; Page, P
2018-01-01
The aim of the present paper is to summarise current and future applications of dietary assessment technologies in nutrition surveys in developed countries. It includes the discussion of key points and highlights of subsequent developments from a panel discussion to address strengths and weaknesses of traditional dietary assessment methods (food records, FFQ, 24 h recalls, diet history with interviewer-assisted data collection) v. new technology-based dietary assessment methods (web-based and mobile device applications). The panel discussion 'Traditional methods v. new technologies: dilemmas for dietary assessment in population surveys', was held at the 9th International Conference on Diet and Activity Methods (ICDAM9), Brisbane, September 2015. Despite respondent and researcher burden, traditional methods have been most commonly used in nutrition surveys. However, dietary assessment technologies offer potential advantages including faster data processing and better data quality. This is a fast-moving field and there is evidence of increasing demand for the use of new technologies amongst the general public and researchers. There is a need for research and investment to support efforts being made to facilitate the inclusion of new technologies for rapid, accurate and representative data.
NASA Astrophysics Data System (ADS)
Repmann, Frank; Gerwin, Werner; Freese, Dirk
2017-04-01
An ever-growing demand for energy and the widely proposed switch from fossil fuels to more sustainable energy sources put the cultivation and use of bioenergy plants into focus. However, bioenergy production on regular and fertile agricultural soils might conflict with the worldwide growing demand for food. To mitigate or avoid this potential conflict, the use of low-quality or marginal land for the cultivation of bioenergy plants becomes favorable. Against this background, the definition and assessment of land marginality, and the evaluation of whether and to what extent specific areas are marginal and thus suitable for sustainable bioenergy production, become highly relevant. Within the framework of the EU-funded Horizon 2020 project SEEMLA, we attempted to assess the land marginality of designated test sites in Ukraine, Greece and Germany by direct field survey. For that purpose, soil and site properties were investigated and evaluated by applying the Muencheberg Soil Quality Rating (SQR) method, developed at the Leibniz Centre for Agricultural Landscape Research (ZALF). The method deploys a comprehensive set of biogeophysical and chemical indicators to describe and finally evaluate the quality of the soil and site by a score ranging from 1 to 100 points. Field survey data were supported by additional laboratory tests on a representative set of soil samples. Practical field work and analysis of the field and lab data from the investigated sites proved the applicability of the SQR method within the SEEMLA context. The SQR indices calculated from the field and lab data ranged from 2 to < 40 and clearly demonstrated the marginality of the investigated sites in Ukraine, Greece and Germany, which differed considerably with respect to their characteristics. Correlating the site quality index with yield estimates for common bioenergy plants such as willow (Salix sp.), black locust (Robinia pseudoacacia) and poplar (Populus sp.) cultivated at the respective test sites revealed that the SQR might additionally reflect the potential yield of the investigated sites.
Weng, Chunhua
2013-01-01
Objective To review the methods and dimensions of data quality assessment in the context of electronic health record (EHR) data reuse for research. Materials and methods A review of the clinical research literature discussing data quality assessment methodology for EHR data was performed. Using an iterative process, the aspects of data quality being measured were abstracted and categorized, as well as the methods of assessment used. Results Five dimensions of data quality were identified, which are completeness, correctness, concordance, plausibility, and currency, and seven broad categories of data quality assessment methods: comparison with gold standards, data element agreement, data source agreement, distribution comparison, validity checks, log review, and element presence. Discussion Examination of the methods by which clinical researchers have investigated the quality and suitability of EHR data for research shows that there are fundamental features of data quality, which may be difficult to measure, as well as proxy dimensions. Researchers interested in the reuse of EHR data for clinical research are recommended to consider the adoption of a consistent taxonomy of EHR data quality, to remain aware of the task-dependence of data quality, to integrate work on data quality assessment from other fields, and to adopt systematic, empirically driven, statistically based methods of data quality assessment. Conclusion There is currently little consistency or potential generalizability in the methods used to assess EHR data quality. If the reuse of EHR data for clinical research is to become accepted, researchers should adopt validated, systematic methods of EHR data quality assessment. PMID:22733976
Vadose Zone Transport Field Study: Detailed Test Plan for Simulated Leak Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Anderson L.; Gee, Glendon W.
2000-06-23
This report describes controlled transport experiments at well-instrumented field tests to be conducted during FY 2000 in support of DOE's Vadose Zone Transport Field Study (VZTFS). The VZTFS supports the Groundwater/Vadose Zone Integration Project Science and Technology Initiative. The field tests will improve understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. These methods will capture the extent of contaminant plumes using existing steel-cased boreholes. Specific objectives are to 1) identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; 2) reduce uncertainty in conceptual models; 3) develop a detailed and accurate data base of hydraulic and transport parameters for validation of three-dimensional numerical models; and 4) identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. Pacific Northwest National Laboratory (PNNL) manages the VZTFS for DOE.
Deborah S. Page-Dumroese; Ann M. Abbott; Thomas M. Rice
2009-01-01
Volume I and volume II of the Forest Soil Disturbance Monitoring Protocol (FSDMP) provide information for a wide range of users, including technicians, field crew leaders, private landowners, land managers, forest professionals, and researchers. Volume I: Rapid Assessment includes the basic methods for establishing forest soil monitoring transects and consistently...
ERIC Educational Resources Information Center
Bothe, Anne K.; Richardson, Jessica D.
2011-01-01
Purpose: To discuss constructs and methods related to assessing the magnitude and the meaning of clinical outcomes, with a focus on applications in speech-language pathology. Method: Professionals in medicine, allied health, psychology, education, and many other fields have long been concerned with issues referred to variously as practical…
Assessing occupational exposure to sea lamprey pesticides.
Ceballos, Diana M; Beaucham, Catherine C; Kurtz, Kristine; Musolin, Kristin
2015-01-01
Sea lampreys are parasitic fish found in lakes of the United States and Canada. Sea lamprey is controlled through manual application of the pesticides 3-trifluoromethyl-4-nitrophenol (TFM) and Bayluscide(TM) into streams and tributaries. 3-Trifluoromethyl-4-nitrophenol may cause irritation and central nervous system depression and Bayluscide may cause irritation, dermatitis, blisters, cracking, edema, and allergic skin reactions. To assess occupational exposures to sea lamprey pesticides. We developed a wipe method for evaluating surface and skin contamination with these pesticides. This method was field tested at a biological field station and at a pesticide river application. We also evaluated exposures using control banding tools. We verified TFM surface contamination at the biological station. At the river application, we found surfaces and worker's skin contaminated with pesticides. We recommended minimizing exposures by implementing engineering controls and improved use of personal protective equipment.
Torondel, Belen; Gyekye-Aboagye, Yaw; Routray, Parimita; Boisson, Sophie; Schimdt, Wolf; Clasen, Thomas
2015-06-01
Sentinel toys are increasingly used as a method of assessing young children's exposure to faecal pathogens in households in low-income settings. However, there is no consensus on the suitability of different approaches. We evaluated three types of toy balls with different surfaces (plastic, rubber, urethane) in the laboratory to compare the uptake of faecal indicator bacteria (Escherichia coli) on their surface. We performed bacterial survival analyses under different environmental conditions and tested laboratory methods for bacteria removal and recovery. In a field study we distributed sterile urethane balls to children <5 years from 360 households in rural India. After 24 hours, we collected and rinsed the toys in sterile water, assayed for thermotolerant coliforms (TTC) and explored associations between the level of contamination and household characteristics. In the laboratory, urethane foam balls took up more indicator bacteria than the other balls. Bacterial recovery did not differ between mechanical and no agitation. Higher temperatures and moisture levels increased bacterial yield. In the field, the only factor associated with a decreased recovery of TTC from the balls was having a soil (unpaved) floor. Sentinel toys may be an effective tool for assessing young children's exposure to faecal pathogens. However, even using methods designed to increase bacterial recovery, limited sensitivity may require larger sample sizes. © The Author 2015. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Rodriguez, Christina M.; Cook, Anne E.; Jedrziewski, Chezlie T.
2012-01-01
Objective: Researchers in the child maltreatment field have traditionally relied on explicit self-reports to study factors that may exacerbate physical child abuse risk. The current investigation evaluated an implicit analog task utilizing eye tracking technology to assess both parental attributions of child misbehavior and empathy. Method: Based…
Making Numbers Come to Life: Two Scoring Methods for Creativity in Aurora's Cartoon Numbers
ERIC Educational Resources Information Center
Tan, Mei; Mourgues, Catalina; Bolden, David S.; Grigorenko, Elena L.
2014-01-01
Although creativity has long been recognized as an important aspect of mathematical thinking, both for the advancement of the field and in students' developing expertise in mathematics, assessments of student creativity in that domain have been limited in number and focus. This article presents an assessment developed for creativity that…
The Value of Assessment and Accreditation for Hospitality and Tourism Graduate Education
ERIC Educational Resources Information Center
Roberson, Richard D., Jr.
2010-01-01
Graduate education in hospitality is a growing field with a number of different approaches and philosophies which inform the decisions and directions put forth by program administrators. A case study explains the methods and justifications used by one institution to conduct a self assessment to assure high quality graduate hospitality education.…
Improved measurement methods are needed to characterize dry deposition of sulfur and nitrogen compounds to assess ecosystem exposure to nutrients and acidifying compounds and to develop atmospheric deposition budgets in support of critical loads assessments. The purpose of this s...
ERIC Educational Resources Information Center
Kalz, Marco; Specht, Marcus
2014-01-01
This paper deals with the assessment of the crossdisciplinarity of technology-enhanced learning (TEL). Based on a general discussion of the concept interdisciplinarity and a summary of the discussion in the field, two empirical methods from scientometrics are introduced and applied. Science overlay maps and the Rao-Stirling diversity index are…
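The Rao-Stirling diversity index referenced above combines the proportion of a unit's output in each discipline with pairwise distances between disciplines. A minimal sketch of the computation, assuming the proportion vector and distance matrix are already derived from citation data (conventions differ on whether the sum runs over ordered or unordered pairs):

```python
import numpy as np

def rao_stirling(proportions, distances):
    """Rao-Stirling diversity: sum over discipline pairs of p_i * p_j * d_ij."""
    p = np.asarray(proportions, dtype=float)
    d = np.asarray(distances, dtype=float)
    # The diagonal contributes nothing because d_ii = 0, so summing over all
    # ordered pairs is equivalent to summing over i != j.
    return float(np.sum(np.outer(p, p) * d))

# Hypothetical example: shares of TEL publications in three disciplines and
# a distance matrix (e.g., 1 minus cosine similarity of citation profiles).
p = [0.5, 0.3, 0.2]
d = np.array([[0.0, 0.8, 0.6],
              [0.8, 0.0, 0.4],
              [0.6, 0.4, 0.0]])
print(rao_stirling(p, d))  # larger values indicate greater crossdisciplinarity
```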
ERIC Educational Resources Information Center
Alderson, J. Charles; Brunfaut, Tineke; Harding, Luke
2015-01-01
Diagnostic language assessment has received increased research interest in recent years, with particular attention on methods through which diagnostic information can be gleaned from standardized proficiency tests. However, diagnostic procedures in the broader sense have been inadequately theorized to date, with the result that there is still…
Global Assessment of Volcanic Debris Hazards from Space
NASA Technical Reports Server (NTRS)
Watters, Robert J.
2003-01-01
Hazard (slope stability) assessment for different sectors of volcano edifices was successfully obtained for volcanoes in North and South America. The assessment used Hyperion images to locate portions of the volcano that were hydrothermally altered to clay-rich rocks, with zones that were also rich in alunite and other minerals. The identified altered rock zones were field checked and sampled. The rock strength of these zones was calculated from the field and laboratory measurements. Volcano modeling utilizing the distinct element method and limit equilibrium technique, together with the calculated strength data, was used to assess stability and deformation of the edifice. Modeling results give indications of possible failure volumes, velocities and directions. The models show the crucial role hydrothermally weakened rock plays in reducing the strength of the volcano edifice, and demonstrate that such weak rock can be rapidly identified through remote sensing techniques. Volcanoes were assessed in the Cascade Range (USA), Mexico, and Chile (ongoing).
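For context, limit-equilibrium analyses of the kind mentioned above summarize edifice stability as a factor of safety, the ratio of available shear strength (here controlled by the weak, hydrothermally altered rock) to the shear stress required for equilibrium. A generic Mohr-Coulomb form, not the paper's specific formulation, is:

```latex
F \;=\; \frac{\tau_f}{\tau} \;=\; \frac{c' + \sigma'_n \tan\varphi'}{\tau},
\qquad F > 1 \ \text{(stable)}, \quad F \le 1 \ \text{(failure)}
```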
NASA Astrophysics Data System (ADS)
Fleishman, G. D.; Anfinogentov, S.; Loukitcheva, M.; Mysh'yakov, I.; Stupishin, A.
2017-12-01
Measuring and modeling the coronal magnetic field, especially above active regions (ARs), remains one of the central problems of solar physics, given that solar coronal magnetism is the key driver of all solar activity. Nowadays the coronal magnetic field is often modelled using methods of nonlinear force-free field reconstruction, whose accuracy has not yet been comprehensively assessed. Given that coronal magnetic probing is routinely unavailable, only morphological tests have been applied to evaluate the performance of the reconstruction methods, along with a few direct tests using an available semi-analytical force-free field solution. Here we report a detailed casting of various tools used for the nonlinear force-free field reconstruction, such as disambiguation methods, photospheric field preprocessing methods, and volume reconstruction methods in a 3D domain, using a 3D snapshot of the publicly available full-fledged radiative MHD model. We take advantage of the fact that from the realistic MHD model we know the magnetic field vector distribution in the entire 3D domain, which enables us to perform a "voxel-by-voxel" comparison of the restored magnetic field and the true magnetic field in the 3D model volume. Our tests show that the available disambiguation methods often fail in quiet-Sun areas, where the magnetic structure is dominated by small-scale magnetic elements, while they work well at the AR photosphere and (even better) chromosphere. The preprocessing of the photospheric magnetic field, although it does produce a more force-free boundary condition, also results in some effective 'elevation' of the magnetic field components. The effective 'elevation' height turns out to be different for the longitudinal and transverse components of the magnetic field, which results in a systematic error in absolute heights in the reconstructed magnetic data cube. The extrapolations performed starting from the actual AR photospheric magnetogram (i.e., without preprocessing) are free from this systematic error, while having other metrics either comparable to or only marginally worse than those estimated for extrapolations from the preprocessed magnetograms. This finding favors the use of extrapolations from the original photospheric magnetogram without preprocessing.
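A voxel-by-voxel comparison of this kind is usually summarized with standard field-agreement metrics; the sketch below shows two common ones (vector correlation and mean normalized vector error) and is an illustration only, not the specific metric set used in the study:

```python
import numpy as np

def vector_correlation(B_true, B_rec):
    """C_vec = sum_i B_i.b_i / sqrt(sum_i |B_i|^2 * sum_i |b_i|^2); equals 1
    for identical fields. Inputs have shape (nx, ny, nz, 3)."""
    dot = np.sum(B_true * B_rec)
    norm = np.sqrt(np.sum(B_true ** 2) * np.sum(B_rec ** 2))
    return float(dot / norm)

def mean_vector_error(B_true, B_rec):
    """Average over voxels of |b_i - B_i| / |B_i|; 0 for a perfect match."""
    num = np.linalg.norm(B_rec - B_true, axis=-1)
    den = np.maximum(np.linalg.norm(B_true, axis=-1), 1e-12)
    return float(np.mean(num / den))
```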
Meyer, Frans J C; Davidson, David B; Jakobus, Ulrich; Stuchly, Maria A
2003-02-01
A hybrid finite-element method (FEM)/method of moments (MoM) technique is employed for specific absorption rate (SAR) calculations in a human phantom in the near field of a typical group special mobile (GSM) base-station antenna. The MoM is used to model the metallic surfaces and wires of the base-station antenna, and the FEM is used to model the heterogeneous human phantom. The advantages of each of these frequency-domain techniques are, thus, exploited, leading to a highly efficient and robust numerical method for addressing this type of bioelectromagnetic problem. The basic mathematical formulation of the hybrid technique is presented. This is followed by a discussion of important implementation details, in particular the linear algebra routines for sparse, complex FEM matrices combined with dense MoM matrices. The implementation is validated by comparing results to MoM (surface equivalence principle implementation) and finite-difference time-domain (FDTD) solutions of human exposure problems. A comparison of the computational efficiency of the different techniques is presented. The FEM/MoM implementation is then used for whole-body and critical-organ SAR calculations in a phantom at different positions in the near field of a base-station antenna. This problem cannot, in general, be solved using the MoM or FDTD due to computational limitations. This paper shows that the specific hybrid FEM/MoM implementation is an efficient numerical tool for accurate assessment of human exposure in the near field of base-station antennas.
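The local SAR in such phantom calculations follows the standard definition from the internal electric field, tissue conductivity and mass density (with 1 g or 10 g mass averaging applied afterwards, per the relevant exposure standard):

```latex
\mathrm{SAR}(\mathbf{r}) \;=\; \frac{\sigma(\mathbf{r})\,\bigl|\mathbf{E}(\mathbf{r})\bigr|_{\mathrm{rms}}^{2}}{\rho(\mathbf{r})}
\qquad \bigl[\mathrm{W\,kg^{-1}}\bigr]
```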
Developing and validating a nutrition knowledge questionnaire: key methods and considerations.
Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina
2017-10-01
To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed, and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
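For step (vi), item analysis typically reports an item difficulty (proportion of correct answers) and an item discrimination (for example, the corrected item-total correlation). A minimal sketch under that assumption, for a binary 0/1 response matrix:

```python
import numpy as np

def item_analysis(responses):
    """Classical item analysis for an (n_respondents, n_items) 0/1 matrix.
    Returns per-item difficulty (proportion correct) and discrimination
    (corrected item-total correlation)."""
    X = np.asarray(responses, dtype=float)
    difficulty = X.mean(axis=0)
    total = X.sum(axis=1)
    discrimination = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        rest = total - X[:, j]                      # total score excluding item j
        discrimination[j] = np.corrcoef(X[:, j], rest)[0, 1]
    return difficulty, discrimination

# Hypothetical responses from five participants to three knowledge items.
resp = np.array([[1, 0, 1],
                 [1, 1, 1],
                 [0, 0, 1],
                 [1, 1, 0],
                 [0, 0, 0]])
difficulty, discrimination = item_analysis(resp)
```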
Lieberman, Harris R; Kramer, F Matthew; Montain, Scott J; Niro, Philip
2007-05-01
Limited opportunities to study human cognitive performance in non-laboratory, ambulatory situations exist. However, advances in technology make it possible to extend behavioral assessments to the field. One of the first devices to measure human behavior in the field was the wrist-worn actigraph. This device acquires minute-by-minute information on an individual's physical activity and can distinguish sleep from waking, the most basic aspect of behavior. Our laboratory developed a series of wrist-worn devices, not much larger than a watch, which assess reaction time, vigilance and memory. The devices concurrently assess motor activity with greater temporal resolution than standard actigraphs. They also continuously monitor multiple environmental variables including temperature, humidity, sound, and light. These monitors have been employed during training and simulated military operations to collect behavioral and environmental information that would typically be unavailable under such circumstances. Development of the vigilance monitor, and how each successive version extended capabilities of the device are described. Data from several studies are presented, including studies conducted in harsh field environments during a simulated infantry assault, an officer training course. The monitors simultaneously documented environmental conditions, patterns of sleep and activity and effects of nutritional manipulations on cognitive performance. They provide a new method to relate cognitive performance to real world environmental conditions and assess effects of various interventions on human behavior in the field. They can also monitor cognitive performance in real time, and if it is degraded, attempt to intervene to maintain
Determination of the maximum-depth to potential field sources by a maximum structural index method
NASA Astrophysics Data System (ADS)
Fedi, M.; Florio, G.
2013-01-01
A simple and fast determination of the limiting depth to the sources may represent a significant help to data interpretation. To this end we explore the possibility of determining those source parameters shared by all the classes of models fitting the data. One approach is to determine the maximum depth-to-source compatible with the measured data, by using for example the well-known Bott-Smith rules. These rules involve only the knowledge of the field and its horizontal gradient maxima, and are independent of the density contrast. Thanks to the direct relationship between structural index and depth to sources, we work out a simple and fast strategy to obtain the maximum depth by using semi-automated methods, such as Euler deconvolution or the depth-from-extreme-points (DEXP) method. The proposed method consists of estimating the maximum depth as the one obtained for the highest allowable value of the structural index (Nmax). Nmax may be easily determined, since it depends only on the dimensionality of the problem (2D/3D) and on the nature of the analyzed field (e.g., gravity field or magnetic field). We tested our approach on synthetic models against the results obtained by the classical Bott-Smith formulas and the results are in fact very similar, confirming the validity of this method. However, while the Bott-Smith formulas are restricted to the gravity field only, our method is also applicable to the magnetic field and to any derivative of the gravity and magnetic fields. Our method yields a useful criterion to assess the source model based on the (∂f/∂x)max/fmax ratio. The usefulness of the method in real cases is demonstrated for a salt wall in the Mississippi basin, where the estimation of the maximum depth agrees with the seismic information.
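For reference, Euler deconvolution rests on the homogeneity equation below, where (x0, y0, z0) is the source position, N the structural index and B a background level; the maximum-depth strategy described above amounts to solving it with N set to the highest admissible value Nmax for the field type considered:

```latex
(x - x_0)\frac{\partial f}{\partial x}
+ (y - y_0)\frac{\partial f}{\partial y}
+ (z - z_0)\frac{\partial f}{\partial z}
\;=\; N\,\bigl(B - f\bigr)
```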
NASA Astrophysics Data System (ADS)
Ibrahima, Fayadhoi; Meyer, Daniel; Tchelepi, Hamdi
2016-04-01
Because geophysical data are inexorably sparse and incomplete, stochastic treatments of simulated responses are crucial to explore possible scenarios and assess risks in subsurface problems. In particular, nonlinear two-phase flows in porous media are essential, yet challenging, in reservoir simulation and hydrology. Adding highly heterogeneous and uncertain input, such as the permeability and porosity fields, transforms the estimation of the flow response into a tough stochastic problem for which computationally expensive Monte Carlo (MC) simulations remain the preferred option. We propose an alternative approach to evaluate the probability distribution of the (water) saturation for the stochastic Buckley-Leverett problem when the probability distributions of the permeability and porosity fields are available. We give a computationally efficient and numerically accurate method to estimate the one-point probability density (PDF) and cumulative distribution functions (CDF) of the (water) saturation. The distribution method draws inspiration from a Lagrangian approach of the stochastic transport problem and expresses the saturation PDF and CDF essentially in terms of a deterministic mapping and the distribution and statistics of scalar random fields. In a large class of applications these random fields can be estimated at low computational costs (a few MC runs), thus making the distribution method attractive. Even though the method relies on a key assumption of fixed streamlines, we show that it performs well for high input variances, which is the case of interest. Once the saturation distribution is determined, any one-point statistics thereof can be obtained, especially the saturation average and standard deviation. Moreover, the probability of rare events and saturation quantiles (e.g. P10, P50 and P90) can be efficiently derived from the distribution method. These statistics can then be used for risk assessment, as well as data assimilation and uncertainty reduction in the prior knowledge of input distributions. We provide various examples and comparisons with MC simulations to illustrate the performance of the method.
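As a point of reference, the Monte Carlo baseline against which the distribution method is compared estimates the one-point saturation CDF and its quantiles empirically from repeated forward runs. A minimal sketch, where simulate_saturation is a hypothetical placeholder for a two-phase flow solver acting on sampled permeability and porosity fields:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_saturation(log_perm, porosity):
    """Placeholder forward model returning water saturation at one location
    and time for a single realization; a real study would run a two-phase
    (Buckley-Leverett) flow solver here."""
    return 1.0 / (1.0 + np.exp(-(log_perm.mean() - porosity.mean())))  # illustrative only

n_mc, n_cells = 2000, 100
saturation = np.empty(n_mc)
for i in range(n_mc):
    log_perm = rng.normal(0.0, 1.0, n_cells)    # sampled log-permeability field
    porosity = rng.uniform(0.1, 0.3, n_cells)   # sampled porosity field
    saturation[i] = simulate_saturation(log_perm, porosity)

# Empirical CDF and the P10/P50/P90 quantiles used for risk assessment.
s_sorted = np.sort(saturation)
cdf = np.arange(1, n_mc + 1) / n_mc
p10, p50, p90 = np.percentile(saturation, [10, 50, 90])
```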
Arienzo, Alyexandra; Sobze, Martin Sanou; Wadoum, Raoul Emeric Guetiya; Losito, Francesca; Colizzi, Vittorio; Antonini, Giovanni
2015-08-25
According to the World Health Organization (WHO) guidelines, "safe drinking-water must not represent any significant risk to health over a lifetime of consumption, including different sensitivities that may occur between life stages". Traditional methods of water analysis are usually complex, time consuming and require an appropriately equipped laboratory, specialized personnel and expensive instrumentation. The aim of this work was to apply an alternative method, the Micro Biological Survey (MBS), to analyse for contaminants in drinking water. Preliminary experiments were carried out to demonstrate the linearity and accuracy of the MBS method and to verify whether the evaluation of total coliforms in 1 mL of water is a sufficient parameter for a rough but accurate determination of water microbiological quality. The MBS method was then tested in the field to assess the microbiological quality of water sources in the city of Douala (Cameroon, Central Africa). Analyses were performed on both dug and drilled wells in different periods of the year. Results confirm that the MBS method appears to be a valid and accurate method to evaluate the microbiological quality of many water sources and that it can be of valuable aid in developing countries.
Pletzer, Belinda; M Ortner, Tuulia
2016-09-01
Personality assessment has been challenged by the fact that different assessment methods (implicit measures, behavioral measures and explicit rating scales) show little or no convergence in behavioral studies. In this neuroimaging study we address, for the first time, whether different assessment methods rely on separate or overlapping neuronal systems. Fifty-nine healthy adult participants completed two objective personality tests of risk propensity: the more implicit Balloon Analogue Risk Task (BART) and the more explicit Game of Dice Task (GDT). Significant differences in activation, as well as in connectivity patterns, were observed between the two tasks. In both tasks, risky decisions yielded significantly stronger activations than safe decisions in the bilateral caudate, as well as the bilateral insula. The finding of overlapping brain areas validates different assessment methods, despite their behavioral non-convergence. This suggests that neuroimaging can be an important tool of validation in the field of personality assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Gurkov, Anton; Shchapova, Ekaterina; Bedulina, Daria; Baduev, Boris; Borvinskaya, Ekaterina; Meglinski, Igor; Timofeyev, Maxim
2016-11-01
Remote in vivo scanning of physiological parameters is a major trend in the development of new tools for the fields of medicine and animal physiology. For this purpose, a variety of implantable optical micro- and nanosensors have been designed for potential medical applications. At the same time, the important area of environmental sciences has been neglected in the development of techniques for remote physiological measurements. In the field of environmental monitoring and related research, there is a constant demand for new effective and quick techniques for the stress assessment of aquatic animals, and the development of proper methods for remote physiological measurements in vivo may significantly increase the precision and throughput of analyses in this field. In the present study, we apply pH-sensitive microencapsulated biomarkers to remotely monitor the pH of haemolymph in vivo in endemic amphipods from Lake Baikal, and we compare the suitability of this technique for stress assessment with that of common biochemical methods. For the first time, we demonstrate the possibility of remotely detecting a change in a physiological parameter in an aquatic organism under ecologically relevant stressful conditions and show the applicability of techniques using microencapsulated biomarkers for remote physiological measurements in environmental monitoring.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Traunspurger, W.; Haitzer, M.; Hoess, S.
1997-02-01
The authors present a method using the free-living nematode Caenorhabditis elegans to assess toxicity in liquid medium and whole-sediment setups. Test duration is 72 h; endpoints are body length, number of eggs inside worms, percentage of gravid worms, and number of offspring per worm. The effect of CdCl2 on C. elegans in liquid-phase exposures is described as an example. Results from a field study with cadmium-polluted sediments from the River Elbe (Germany) suggest that nematodes may be useful organisms in assessing toxicity of sediments in the whole phase.
Assessment of PIV-based unsteady load determination of an airfoil with actuated flap
NASA Astrophysics Data System (ADS)
Sterenborg, J. J. H. M.; Lindeboom, R. C. J.; Simão Ferreira, C. J.; van Zuijlen, A. H.; Bijl, H.
2014-02-01
For complex experimental setups involving movable structures it is not trivial to directly measure unsteady loads. An alternative is to deduce unsteady loads indirectly from measured velocity fields using Noca's method. The ultimate aim is to use this method in future work to determine unsteady loads for fluid-structure interaction problems. The focus in this paper is first on the application and assessment of Noca's method for an airfoil with an oscillating trailing edge flap. To the best of our knowledge, Noca's method has not yet been applied to airfoils with moving control surfaces or fluid-structure interaction problems. In addition, wind tunnel corrections for this type of unsteady flow problem are considered.
An Automated Method for Navigation Assessment for Earth Survey Sensors Using Island Targets
NASA Technical Reports Server (NTRS)
Patt, F. S.; Woodward, R. H.; Gregg, W. W.
1997-01-01
An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalogue of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean colour sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.
Using IKONOS Imagery to Estimate Surface Soil Property Variability in Two Alabama Physiographies
NASA Technical Reports Server (NTRS)
Sullivan, Dana; Shaw, Joey; Rickman, Doug
2005-01-01
Knowledge of surface soil properties is used to assess past erosion and predict erodibility, determine nutrient requirements, and assess surface texture for soil survey applications. This study was designed to evaluate high resolution IKONOS multispectral data as a soil-mapping tool. Imagery was acquired over conventionally tilled fields in the Coastal Plain and Tennessee Valley physiographic regions of Alabama. Acquisitions were designed to assess the impact of surface crusting, roughness and tillage on our ability to depict soil property variability. Soils consisted mostly of fine-loamy, kaolinitic, thermic Plinthic Kandiudults at the Coastal Plain site and fine, kaolinitic, thermic Rhodic Paleudults at the Tennessee Valley site. Soils were sampled in 0.20 ha grids to a depth of 15 cm and analyzed for % sand (0.05-2 mm), silt (0.002-0.05 mm), clay (less than 0.002 mm), citrate dithionite extractable iron (Fe_d) and soil organic carbon (SOC). Four methods of evaluating variability in soil attributes were evaluated: 1) kriging of soil attributes, 2) co-kriging with soil attributes and reflectance data, 3) multivariate regression based on the relationship between reflectance and soil properties, and 4) fuzzy c-means clustering of reflectance data. Results indicate that co-kriging with remotely sensed data improved field scale estimates of surface SOC and clay content compared to kriging and regression methods. Fuzzy c-means worked best using RS data acquired over freshly tilled fields, reducing soil property variability within soil zones compared to field scale soil property variability.
Does bad inference drive out good?
Marozzi, Marco
2015-07-01
The (mis)use of statistics in practice is widely debated, and a field where the debate is particularly active is medicine. Many scholars emphasize that a large proportion of published medical research contains statistical errors. It has been noted that top class journals like Nature Medicine and The New England Journal of Medicine publish a considerable proportion of papers that contain statistical errors and poorly document the application of statistical methods. This paper joins the debate on the (mis)use of statistics in the medical literature. Even though the validation process of a statistical result may be quite elusive, a careful assessment of underlying assumptions is central in medicine as well as in other fields where a statistical method is applied. Unfortunately, a careful assessment of underlying assumptions is missing in many papers, including those published in top class journals. In this paper, it is shown that nonparametric methods are good alternatives to parametric methods when the assumptions for the latter ones are not satisfied. A key point to solve the problem of the misuse of statistics in the medical literature is that all journals have their own statisticians to review the statistical method/analysis section in each submitted paper. © 2015 Wiley Publishing Asia Pty Ltd.
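As a small illustration of the assumption-checking point, the sketch below contrasts a parametric two-sample t-test with its nonparametric counterpart on skewed data, where the normality assumption behind the t-test is questionable (hypothetical data, for illustration only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Two heavily skewed (log-normal) samples: the normality assumption is dubious.
group_a = rng.lognormal(mean=0.0, sigma=1.0, size=30)
group_b = rng.lognormal(mean=0.5, sigma=1.0, size=30)

t_stat, p_t = stats.ttest_ind(group_a, group_b, equal_var=False)             # Welch t-test
u_stat, p_u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")  # nonparametric

# Checking the assumption before trusting the parametric result.
p_normal_a = stats.shapiro(group_a).pvalue
p_normal_b = stats.shapiro(group_b).pvalue
print(p_t, p_u, p_normal_a, p_normal_b)
```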
Coecke, Sandra; Goldberg, Alan M; Allen, Sandra; Buzanska, Leonora; Calamandrei, Gemma; Crofton, Kevin; Hareng, Lars; Hartung, Thomas; Knaut, Holger; Honegger, Paul; Jacobs, Miriam; Lein, Pamela; Li, Abby; Mundy, William; Owen, David; Schneider, Steffen; Silbergeld, Ellen; Reum, Torsten; Trnovec, Tomas; Monnet-Tschudi, Florianne; Bal-Price, Anna
2007-01-01
This is the report of the first workshop on Incorporating In Vitro Alternative Methods for Developmental Neurotoxicity (DNT) Testing into International Hazard and Risk Assessment Strategies, held in Ispra, Italy, on 19–21 April 2005. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and jointly organized by ECVAM, the European Chemical Industry Council, and the Johns Hopkins University Center for Alternatives to Animal Testing. The primary aim of the workshop was to identify and catalog potential methods that could be used to assess how data from in vitro alternative methods could help to predict and identify DNT hazards. Working groups focused on two different aspects: a) details on the science available in the field of DNT, including discussions on the models available to capture the critical DNT mechanisms and processes, and b) policy and strategy aspects to assess the integration of alternative methods in a regulatory framework. This report summarizes these discussions and details the recommendations and priorities for future work. PMID:17589601
CART V: recent advancements in computer-aided camouflage assessment
NASA Astrophysics Data System (ADS)
Müller, Thomas; Müller, Markus
2011-05-01
In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was developed for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, and the evaluation via user-defined feature extractors, as well as methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements over the recent year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm is presented, based on template matching, to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
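The structural-inconspicuity idea builds on normalized template matching; the sketch below is a generic OpenCV illustration (not the CART implementation), scoring how well a chip around the target matches a background region that excludes it:

```python
import cv2
import numpy as np

def structural_match_score(background_gray, target_chip_gray):
    """Best normalized cross-correlation (TM_CCOEFF_NORMED) of the target
    chip over a background region; values near 1 mean the target's local
    structure is 'typical' of the background."""
    result = cv2.matchTemplate(background_gray, target_chip_gray,
                               cv2.TM_CCOEFF_NORMED)
    return cv2.minMaxLoc(result)[1]          # maximum correlation value

# Synthetic stand-in for one band of a multispectral frame.
rng = np.random.default_rng(0)
frame = rng.normal(128, 20, (240, 320))
frame[100:140, 200:240] += 30                # the "target" patch differs slightly
frame = frame.clip(0, 255).astype(np.uint8)

target_chip = frame[100:140, 200:240].copy()
background = frame[:, :180]                  # region known to exclude the target
score = structural_match_score(background, target_chip)
# High scores -> structurally inconspicuous (well camouflaged) in this band;
# low scores -> structurally conspicuous.
```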
Fatigue Life Assessment of 65Si7 Leaf Springs: A Comparative Study
Arora, Vinkel Kumar; Bhushan, Gian; Aggarwal, M. L.
2014-01-01
The experimental fatigue life prediction of leaf springs is a time-consuming process. Engineers working in the field of leaf springs therefore face a constant challenge in formulating alternate methods of fatigue life assessment. The work presented in this paper provides alternate methods for fatigue life assessment of leaf springs. A 65Si7 light commercial vehicle leaf spring is chosen for this study. The experimental fatigue life and load rate are determined on a full-scale leaf spring testing machine. Four alternate methods of fatigue life assessment are presented. First, the SAE spring design manual approach is used to establish the fatigue test stroke, and the fatigue life is predicted from the intersection of the maximum and initial stress. The second is a graphical method based on the modified Goodman criterion. In the third method, code is written in FORTRAN for fatigue life assessment based on an analytical technique. The fourth method uses computer-aided engineering (CAE) tools. The CAD model of the leaf spring has been prepared in SolidWorks and analyzed using ANSYS. Using CAE tools, suitable contact and meshing elements have been proposed. The method that provides a fatigue life closest to the experimental value while consuming the least time is recommended. PMID:27379327
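The graphical approach in the second method rests on the modified Goodman relation between the alternating stress σa and mean stress σm; in its usual form, with endurance limit Se, ultimate strength Sut and factor of safety n:

```latex
\frac{\sigma_a}{S_e} + \frac{\sigma_m}{S_{ut}} \;=\; \frac{1}{n}
```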
Field Evaluation of Fracture Control in Tunnel Blasting
DOT National Transportation Integrated Search
1979-12-01
The objective of this research was to implement fracture control procedures in a tunnel project and to assess the practicality, advantages, disadvantages, performance and cost effectiveness of fracture control methods against smooth blasting procedur...
Estepp, Justin R; Klosterman, Samantha L; Christensen, James C
2011-01-01
With increased attention toward physiological cognitive state assessment as a component in the larger field of applied neuroscience, the need to develop methods for robust, stable assessment of cognitive state has been expressed as critical to designing effective augmented human-machine systems. The technique of cognitive state assessment, as well as its benefits, has been demonstrated by many research groups. In an effort to move closer toward a realized system, efforts must now be focused on critical issues that remain unsolved, namely the instability of pattern classifiers over the course of hours and days. This work, as part of the Cognitive State Assessment Competition 2011, seeks to explore methods for 'learning' non-stationarity as a mitigation strategy, yielding more generalized patterns that are stable over time courses not widely discussed in the literature.
Gebresillasie, Sintayehu; Tadesse, Zerihun; Shiferaw, Ayalew; Yu, Sun N.; Stoller, Nicole E.; Zhou, Zhaoxia; Emerson, Paul M.; Gaynor, Bruce D.; Lietman, Thomas M.; Keenan, Jeremy D.
2016-01-01
Purpose Trachoma surveillance is most commonly performed by direct observation, usually by non-ophthalmologists using the World Health Organization (WHO) simplified grading system. However, conjunctival photographs may offer several benefits over direct clinical observation, including the potential for greater inter-rater agreement. This study assesses whether inter-rater agreement of trachoma grading differs when trained graders review conjunctival photographs versus when they perform conjunctival examinations in the field. Methods Three trained trachoma graders each performed an independent examination of the everted right tarsal conjunctiva of 269 children aged 0-9 years, and then reviewed photographs of these same conjunctivae in a random order. For each eye, the grader documented the presence or absence of follicular trachoma (TF) and intense trachomatous inflammation (TI) according to the WHO simplified grading system. Results Inter-rater agreement for grade of TF was significantly higher in the field (kappa coefficient, κ, 0.73; 95% confidence interval, CI, 0.67-0.80) than by photographic review (κ=0.55, 95% CI 0.49-0.63; difference in κ between field grading and photo grading 0.18, 95% CI 0.09-0.26). When field and photographic grades were each assessed as the consensus grade from the three graders, agreement between in-field and photographic graders was high for TF (κ=0.75, 95% CI 0.68-0.84). Conclusions In an area with hyperendemic trachoma, inter-rater agreement was lower for photographic assessment of trachoma than for in-field assessment. However, the trachoma grade reached by a consensus of photographic graders agreed well with the grade given by a consensus of in-field graders. PMID:26158573
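Agreement statistics of this kind are conventionally computed as kappa coefficients; a minimal sketch for one pair of gradings, using scikit-learn (hypothetical grades, for illustration only):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical TF grades for the same ten eyes (1 = TF present, 0 = absent),
# once from field examination and once from photographic review.
field_grades = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
photo_grades = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]

kappa = cohen_kappa_score(field_grades, photo_grades)
print(f"kappa = {kappa:.2f}")   # 1.0 = perfect agreement, 0 = chance-level
```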
Organizational readiness for implementing change: a psychometric assessment of a new measure
2014-01-01
Background Organizational readiness for change in healthcare settings is an important factor in successful implementation of new policies, programs, and practices. However, research on the topic is hindered by the absence of a brief, reliable, and valid measure. Until such a measure is developed, we cannot advance scientific knowledge about readiness or provide evidence-based guidance to organizational leaders about how to increase readiness. This article presents results of a psychometric assessment of a new measure called Organizational Readiness for Implementing Change (ORIC), which we developed based on Weiner’s theory of organizational readiness for change. Methods We conducted four studies to assess the psychometric properties of ORIC. In study one, we assessed the content adequacy of the new measure using quantitative methods. In study two, we examined the measure’s factor structure and reliability in a laboratory simulation. In study three, we assessed the reliability and validity of an organization-level measure of readiness based on aggregated individual-level data from study two. In study four, we conducted a small field study utilizing the same analytic methods as in study three. Results Content adequacy assessment indicated that the items developed to measure change commitment and change efficacy reflected the theoretical content of these two facets of organizational readiness and distinguished the facets from hypothesized determinants of readiness. Exploratory and confirmatory factor analysis in the lab and field studies revealed two correlated factors, as expected, with good model fit and high item loadings. Reliability analysis in the lab and field studies showed high inter-item consistency for the resulting individual-level scales for change commitment and change efficacy. Inter-rater reliability and inter-rater agreement statistics supported the aggregation of individual level readiness perceptions to the organizational level of analysis. Conclusions This article provides evidence in support of the ORIC measure. We believe this measure will enable testing of theories about determinants and consequences of organizational readiness and, ultimately, assist healthcare leaders to reduce the number of health organization change efforts that do not achieve desired benefits. Although ORIC shows promise, further assessment is needed to test for convergent, discriminant, and predictive validity. PMID:24410955
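Inter-item consistency of the kind reported above is commonly summarized with Cronbach's alpha; a minimal sketch of that statistic (the article's exact reliability computations are not reproduced here):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)        # variance of each item
    total_var = X.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point responses to a four-item change-commitment scale.
scores = np.array([[4, 5, 4, 4],
                   [3, 3, 4, 3],
                   [5, 5, 5, 4],
                   [2, 3, 2, 3],
                   [4, 4, 5, 5]])
print(round(cronbach_alpha(scores), 2))
```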
Brown, Gregory G; Anderson, Vicki; Bigler, Erin D; Chan, Agnes S; Fama, Rosemary; Grabowski, Thomas J; Zakzanis, Konstantine K
2017-11-01
The American Psychological Association (APA) celebrated its 125th anniversary in 2017. As part of this celebration, the APA journal Neuropsychology has published in its November 2017 issue 11 papers describing some of the advances in the field of neuropsychology over the past 25 years. The papers address three broad topics: assessment and intervention, brain imaging, and theory and methods. The papers describe the rise of new assessment and intervention technologies, the impact of evidence for neuroplasticity on neurorehabilitation, examples of the use of mathematical models of cognition to investigate latent neurobehavioral processes, the development of the field of neuropsychology in select international countries, the increasing sophistication of brain imaging methods, the recent evidence for localizationist and connectionist accounts of neurobehavioral functioning, the advances in neurobehavioral genomics, and descriptions of newly developed statistical models of longitudinal change. Together the papers convey evidence of the vibrant growth in the field of neuropsychology over the quarter century since APA's 100th anniversary in 1992. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Shoda, Megan E.; Nowell, Lisa H.; Stone, Wesley W.; Sandstrom, Mark W.; Bexfield, Laura M.
2018-04-02
In 2013, the U.S. Geological Survey National Water Quality Laboratory (NWQL) made a new method available for the analysis of pesticides in filtered water samples: laboratory schedule 2437. Schedule 2437 is an improvement on previous analytical methods because it determines the concentrations of 225 fungicides, herbicides, insecticides, and associated degradates in one method at similar or lower concentrations than previously available methods. Additionally, the pesticides included in schedule 2437 were strategically identified in a prioritization analysis that assessed likelihood of occurrence, prevalence of use, and potential toxicity. When the NWQL reports pesticide concentrations for analytes in schedule 2437, the laboratory also provides supplemental information useful to data users for assessing method performance and understanding data quality. That supplemental information is discussed in this report, along with an initial analysis of analytical recovery of pesticides in water-quality samples analyzed by schedule 2437 during 2013–2015. A total of 523 field matrix spike samples and their paired environmental samples and 277 laboratory reagent spike samples were analyzed for this report (1,323 samples total). These samples were collected in the field as part of the U.S. Geological Survey National Water-Quality Assessment groundwater and surface-water studies and as part of the NWQL quality-control program. This report reviews how pesticide samples are processed by the NWQL, addresses how to obtain all the data necessary to interpret pesticide concentrations, explains the circumstances that result in a reporting level change or the occurrence of a raised reporting level, and describes the calculation and assessment of recovery. This report also discusses reasons why a data user might choose to exclude data in an interpretive analysis and outlines the approach used to identify the potential for decreased data quality in the assessment of method recovery. The information provided in this report is essential to understanding pesticide data determined by schedule 2437 and should be reviewed before interpretation of these data.
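Matrix-spike recovery in programs like this is generally calculated by correcting the spiked-sample concentration for the concentration in the paired environmental sample and normalizing by the expected spike concentration; a generic form (the report's exact conventions, e.g. for censored results, may differ):

```latex
\text{Recovery (\%)} \;=\;
\frac{C_{\text{spiked sample}} - C_{\text{environmental sample}}}{C_{\text{expected spike}}}\times 100
```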
Tyree, Melvin T.; Engelbrecht, Bettina M.J.; Vargas, Gustavo; Kursar, Thomas A.
2003-01-01
Studies of the desiccation tolerance of the seedlings of five tropical trees were made on potted plants growing in a greenhouse. Pots were watered to field capacity and then dehydrated for 3 to 9 weeks to reach various visual wilting stages, from slightly wilted to dead. Saturated root hydraulic conductance was measured with a high-pressure flowmeter, and whole-stem hydraulic conductance was measured by a vacuum chamber method. Leaf punches (5.6-mm diameter) were harvested for measurement of leaf water potential by a thermocouple psychrometer method and for measurement of fresh and dry weight. In a parallel study, the same five species were studied in a field experiment in the understory of a tropical forest, where these species frequently germinate. Control seedlings were maintained in irrigated plots during a dry season, and experimental plants were grown in similar plots with rain exclusion shelters. Every 2 to 4 weeks, the seedlings were scored for wilt state and survivorship. After a 22-week drought, the dry plots were irrigated for several weeks to verify visual symptoms of death. The field trials were used to rank drought performance of species, and the greenhouse desiccation studies were used to determine the conditions of moribund plants. Our conclusion is that the desiccation tolerance of moribund plants correlated with field assessment of drought-performance for the five species (r² > 0.94). PMID:12857825
A technique for measuring petal gloss, with examples from the Namaqualand flora.
Whitney, Heather M; Rands, Sean A; Elton, Nick J; Ellis, Allan G
2012-01-01
The degree of floral gloss varies between species. However, little is known about this distinctive floral trait, even though it could be a key feature of floral biotic and abiotic interactions. One reason for the absence of knowledge is the lack of a simple, repeatable method of gloss measurement that can be used in the field to study floral gloss. A protocol is described for measuring gloss in petal samples collected in the field, using a glossmeter. Repeatability of the technique is assessed. We demonstrate a simple yet highly accurate and repeatable method that can easily be implemented in the field. We also highlight the huge variety of glossiness found within flowers and between species in a sample of spring-blooming flowers collected in Namaqualand, South Africa. We discuss the potential uses of this method and its applications for furthering studies in plant-pollinator interactions. We also discuss the potential functions of gloss in flowers.
Harwood, Amanda D; Landrum, Peter F; Weston, Donald P; Lydy, Michael J
2013-02-01
The presence of pyrethroids in both urban and agricultural sediments at levels lethal to invertebrates has been well documented. However, variations in bioavailability among sediments make accurate predictions of toxicity based on whole sediment concentrations difficult. A proposed solution to this problem is the use of bioavailability-based estimates, such as solid phase microextraction (SPME) fibers and Tenax beads. This study compared three methods to assess the bioavailability and ultimately toxicity of pyrethroid pesticides including field-deployed SPME fibers, laboratory-exposed SPME fibers, and a 24-h Tenax extraction. The objective of the current study was to compare the ability of these methods to quantify the bioavailable fraction of pyrethroids in contaminated field sediments that were toxic to benthic invertebrates. In general, Tenax proved a more sensitive method than SPME fibers and a correlation between Tenax extractable concentrations and mortality was observed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Quantitative Image Restoration in Bright Field Optical Microscopy.
Gutiérrez-Medina, Braulio; Sánchez Miranda, Manuel de Jesús
2017-11-07
Bright field (BF) optical microscopy is regarded as a poor method to observe unstained biological samples due to intrinsic low image contrast. We introduce quantitative image restoration in bright field (QRBF), a digital image processing method that restores out-of-focus BF images of unstained cells. Our procedure is based on deconvolution, using a point spread function modeled from theory. By comparing with reference images of bacteria observed in fluorescence, we show that QRBF faithfully recovers shape and enables quantification of the size of individual cells, even from a single input image. We applied QRBF in a high-throughput image cytometer to assess shape changes in Escherichia coli during hyperosmotic shock, finding size heterogeneity. We demonstrate that QRBF is also applicable to eukaryotic cells (yeast). Altogether, digital restoration emerges as a straightforward alternative to methods designed to generate contrast in BF imaging for quantitative analysis. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
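As a generic illustration of PSF-based deconvolution (not the authors' exact QRBF algorithm), the sketch below applies a regularized Wiener-style inverse filter in the Fourier domain to a synthetically blurred image:

```python
import numpy as np

def wiener_deconvolve(image, psf, k=1e-3):
    """Restore `image` blurred by `psf` (same shape, centered, sums to 1)
    with a regularized inverse filter; k controls noise amplification."""
    psf = psf / psf.sum()
    H = np.fft.fft2(np.fft.ifftshift(psf))        # move PSF center to the origin
    G = np.fft.fft2(image)
    restored = np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + k))
    return np.real(restored)

# Synthetic test: a rod-shaped "cell", Gaussian PSF, blur plus noise, restore.
rng = np.random.default_rng(1)
truth = np.zeros((128, 128)); truth[60:68, 50:80] = 1.0
x = np.arange(128) - 64
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / (2 * 3.0 ** 2))
psf /= psf.sum()
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))
blurred += 0.01 * rng.standard_normal(blurred.shape)
recovered = wiener_deconvolve(blurred, psf)
```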
General introduction for the “National Field Manual for the Collection of Water-Quality Data”
2018-02-28
BackgroundAs part of its mission, the U.S. Geological Survey (USGS) collects data to assess the quality of our Nation’s water resources. A high degree of reliability and standardization of these data are paramount to fulfilling this mission. Documentation of nationally accepted methods used by USGS personnel serves to maintain consistency and technical quality in data-collection activities. “The National Field Manual for the Collection of Water-Quality Data” (NFM) provides documented guidelines and protocols for USGS field personnel who collect water-quality data. The NFM provides detailed, comprehensive, and citable procedures for monitoring the quality of surface water and groundwater. Topics in the NFM include (1) methods and protocols for sampling water resources, (2) methods for processing samples for analysis of water quality, (3) methods for measuring field parameters, and (4) specialized procedures, such as sampling water for low levels of mercury and organic wastewater chemicals, measuring biological indicators, and sampling bottom sediment for chemistry. Personnel who collect water-quality data for national USGS programs and projects, including projects supported by USGS cooperative programs, are mandated to use protocols provided in the NFM per USGS Office of Water Quality Technical Memorandum 2002.13. Formal training, for example, as provided in the USGS class, “Field Water-Quality Methods for Groundwater and Surface Water,” and field apprenticeships supplement the guidance provided in the NFM and ensure that the data collected are high quality, accurate, and scientifically defensible.
Developing a regional canopy fuels assessment strategy using multi-scale lidar
Peterson, Birgit E.; Nelson, Kurtis
2011-01-01
Accurate assessments of canopy fuels are needed by fire scientists to understand fire behavior and to predict future fire occurrence. A key descriptor for canopy fuels is canopy bulk density (CBD). CBD is closely linked to the structure of the canopy; therefore, lidar measurements are particularly well suited to assessments of CBD. LANDFIRE scientists are exploring methods to integrate airborne and spaceborne lidar datasets into a national mapping effort. In this study, airborne lidar, spaceborne lidar, and field data are used to map CBD in the Yukon Flats Ecoregion, with the airborne lidar serving as a bridge between the field data and the spaceborne observations. The field-based CBD was positively correlated with airborne lidar observations (R²=0.78). Mapped values of CBD using the airborne lidar dataset were significantly correlated with spaceborne lidar observations when analyzed by forest type (R²=0.62, evergreen; R²=0.71, mixed). Though continued research is necessary to validate these results, they do support the feasibility of airborne and, most importantly, spaceborne lidar data for canopy fuels assessment.
Yu, Ying; Shen, Guofeng; Zhou, Yufeng; Bai, Jingfeng; Chen, Yazhu
2013-11-01
With the popularity of ultrasound therapy in clinics, characterization of the acoustic field is important not only to the tolerability and efficiency of ablation, but also for treatment planning. A quantitative method was introduced to assess the intensity distribution of a focused ultrasound beam using a hydrophone and an infrared camera with no prior knowledge of the acoustic and thermal parameters of the absorber or the configuration of the array elements. This method was evaluated in both theoretical simulations and experimental measurements. A three-layer model was developed to calculate the acoustic field in the absorber, the absorbed acoustic energy during the sonication and the consequent temperature elevation. Experiments were carried out to measure the acoustic pressure with the hydrophone and the temperature elevation with the infrared camera. The percentage differences between the derived results and the simulation are <4.1% for on-axis intensity and <21.1% for -6-dB beam width at heating times up to 360 ms in the focal region of three phased-array ultrasound transducers using two different absorbers. The proposed method is an easy, quick and reliable approach to calibrating focused ultrasound transducers with satisfactory accuracy. Copyright © 2013 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
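For background, the initial (pre-diffusion) heating rate in an absorber is classically proportional to the local intensity; because the assessment above only needs the normalized intensity distribution in a homogeneous absorber, the unknown constant 2α/(ρ c_p) cancels (a generic relation, not the paper's derivation):

```latex
\left.\frac{\partial T}{\partial t}\right|_{t\to 0} = \frac{2\,\alpha\,I(\mathbf{r})}{\rho\,c_p}
\quad\Longrightarrow\quad
\frac{I(\mathbf{r})}{I_{\max}} \approx
\frac{\left.\partial T/\partial t\right|_{t\to 0}(\mathbf{r})}
     {\max_{\mathbf{r}}\left.\partial T/\partial t\right|_{t\to 0}}
```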
NASA Astrophysics Data System (ADS)
Afach, S.; Ayres, N. J.; Ban, G.; Bison, G.; Bodek, K.; Chowdhuri, Z.; Daum, M.; Fertl, M.; Franke, B.; Griffith, W. C.; Grujić, Z. D.; Harris, P. G.; Heil, W.; Hélaine, V.; Kasprzak, M.; Kermaidic, Y.; Kirch, K.; Knowles, P.; Koch, H.-C.; Komposch, S.; Kozela, A.; Krempel, J.; Lauss, B.; Lefort, T.; Lemière, Y.; Mtchedlishvili, A.; Musgrave, M.; Naviliat-Cuncic, O.; Pendlebury, J. M.; Piegsa, F. M.; Pignol, G.; Plonka-Spehr, C.; Prashanth, P. N.; Quéméner, G.; Rawlik, M.; Rebreyend, D.; Ries, D.; Roccia, S.; Rozpedzik, D.; Schmidt-Wellenburg, P.; Severijns, N.; Thorne, J. A.; Weis, A.; Wursten, E.; Wyszynski, G.; Zejma, J.; Zenner, J.; Zsigmond, G.
2015-10-01
We describe a spin-echo method for ultracold neutrons (UCNs) confined in a precession chamber and exposed to a |B0| = 1 μT magnetic field. We have demonstrated that the analysis of UCN spin-echo resonance signals in combination with knowledge of the ambient magnetic field provides an excellent method by which to reconstruct the energy spectrum of a confined ensemble of neutrons. The method takes advantage of the relative dephasing of spins arising from a gravitationally induced striation of stored UCNs of different energies, and also permits an improved determination of the vertical magnetic-field gradient with an exceptional accuracy of 1.1 pT/cm. This novel combination of a well-known nuclear resonance method and gravitationally induced vertical striation is unique in the realm of nuclear and particle physics and should prove to be invaluable for the assessment of systematic effects in precision experiments such as searches for an electric dipole moment of the neutron or the measurement of the neutron lifetime.
Are rapid population estimates accurate? A field trial of two different assessment methods.
Grais, Rebecca F; Coulombier, Denis; Ampuero, Julia; Lucas, Marcelino E S; Barretto, Avertino T; Jacquier, Guy; Diaz, Francisco; Balandine, Serge; Mahoudeau, Claude; Brown, Vincent
2006-09-01
Emergencies resulting in large-scale displacement often lead to populations resettling in areas where basic health services and sanitation are unavailable. To plan relief-related activities quickly, rapid population size estimates are needed. The currently recommended Quadrat method estimates total population by extrapolating the average population size living in square blocks of known area to the total site surface. An alternative approach, the T-Square, provides a population estimate based on analysis of the spatial distribution of housing units taken throughout a site. We field tested both methods and validated the results against a census in Esturro Bairro, Beira, Mozambique. Compared to the census (population: 9,479), the T-Square yielded a better population estimate (9,523) than the Quadrat method (7,681; 95% confidence interval: 6,160-9,201), but was more difficult for field survey teams to implement. Although applicable only to similar sites, several general conclusions can be drawn for emergency planning.
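The Quadrat extrapolation step itself is arithmetically simple; a minimal sketch with hypothetical block counts and areas:

```python
# Quadrat method: extrapolate the mean population of sampled square blocks
# of known area to the total site surface.
block_counts = [34, 41, 28, 39, 45, 30, 37, 42]   # people counted per block (hypothetical)
block_area_m2 = 25.0 * 25.0                        # area of each square block
site_area_m2 = 150_000.0                           # total surface of the settlement

mean_density = (sum(block_counts) / len(block_counts)) / block_area_m2   # people per m^2
population_estimate = mean_density * site_area_m2
print(round(population_estimate))
```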
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robert P. Breckenridge; Maxine Dakins; Stephen Bunting
2011-09-01
In this study, the use of unmanned aerial vehicles (UAVs) as a quick and safe method for monitoring biotic resources was evaluated. Vegetation cover and the amount of bare ground are important factors in understanding the sustainability of many ecosystems and assessment of rangeland health. Methods that improve speed and cost efficiency could greatly improve how biotic resources are monitored on western lands. Sagebrush steppe ecosystems provide important habitat for a variety of species (including sage grouse and pygmy rabbit). Improved methods are needed to support monitoring these habitats because there are not enough resource specialists or funds available for more comprehensive ground evaluations. In this project, two UAV platforms, fixed wing and helicopter, were used to collect still-frame imagery to assess vegetation cover in sagebrush steppe ecosystems. This paper discusses the process for collecting and analyzing imagery from the UAVs to (1) estimate percent cover for six different vegetation types (shrub, dead shrub, grass, forb, litter, and bare ground) and (2) locate sage grouse using representative decoys. The field plots were located on the Idaho National Laboratory (INL) site west of Idaho Falls, Idaho, in areas with varying amounts and types of vegetation cover. A software program called SamplePoint was used along with visual inspection to evaluate percent cover for the six cover types. Results were compared against standard field measurements to assess accuracy. The comparison of fixed-wing and helicopter UAV technology against field estimates shows good agreement for the measurement of bare ground. This study shows that if a high degree of detail and data accuracy is desired, then a helicopter UAV may be a good platform to use. If the data collection objective is to assess broad-scale landscape level changes, then the collection of imagery with a fixed-wing system is probably more appropriate.
Nondestructive assessment of timber bridges using a vibration-based method
Xiping Wang; James P. Wacker; Robert J. Ross; Brian K. Brashaw
2005-01-01
This paper describes an effort to develop a global dynamic testing technique for evaluating the overall stiffness of timber bridge superstructures. A forced vibration method was used to measure the natural frequency of single-span timber bridges in the laboratory and field. An analytical model based on simple beam theory was proposed to represent the relationship...
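A simple-beam model of the kind referred to above links the measured fundamental frequency f1 to the global flexural stiffness EI; for a simply supported span of length L and mass per unit length m, the standard result (not necessarily the authors' exact formulation) is:

```latex
f_1 = \frac{\pi}{2L^{2}}\sqrt{\frac{EI}{m}}
\qquad\Longrightarrow\qquad
EI = \frac{4\,f_1^{2}\,L^{4}\,m}{\pi^{2}}
```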
Symposium Abstract: Exposure science has evolved from a time when the primary focus was on measurements of environmental and biological media and the development of enabling field and laboratory methods. The Total Exposure Assessment Method (TEAM) studies of the 1980s were class...
ERIC Educational Resources Information Center
Haas, Eric; Wilson, Glen Yahola; Cobb, Casey D.; Hyle, Adrienne E.; Jordan, Kitty; Kearney, Kerri S.
2007-01-01
Study Purpose: This article examines the influence of "Educational Administration Quarterly (EAQ)" on the scholarly literature in education during the 25-year period 1979 to 2003. This article continues part of the first critique of EAQ conducted by Roald Campbell in 1979. Study Methods: Two citation measures are used in this study to assess EAQ…
Michael C. Amacher; Katherine P. O' Neill
2004-01-01
Soil compaction is an important indicator of soil quality, yet few practical methods are available to quantitatively measure this variable. Although an assessment of the areal extent of soil compaction is included as part of the soil indicator portion of the Forest Inventory & Analysis (FIA) program, no quantitative measurement of the degree of soil compaction...
ERIC Educational Resources Information Center
Grønborg, Lisbeth
2013-01-01
This paper sheds light on how competence assessment takes place in the Danish Vocational and Educational Training System. It discusses how intentions formulated by the government have unintentional effects when implemented in practice. The qualitative methods used in this study consist of participant observations from my field study of dropouts in…
Motamedzade, Majid; Ashuri, Mohammad Reza; Golmohammadi, Rostam; Mahjub, Hossein
2011-06-13
Over the last decades, numerous observational methods have been developed to assess the risk factors of work-related musculoskeletal disorders (WMSDs). Rapid Entire Body Assessment (REBA) and Quick Exposure Check (QEC) are two general methods in this field. This study aimed to compare ergonomic risk assessment outputs from QEC and REBA in terms of agreement in the distribution of postural loading scores based on analysis of working postures. This cross-sectional study was conducted in an engine oil company in which 40 jobs were studied. All jobs were observed by a trained occupational health practitioner. Job information was collected to complete the ergonomic risk assessment tools, including QEC and REBA. The results revealed a significant correlation between the final scores (r = 0.731) and the action levels (r = 0.893) of the two applied methods. Comparison between the action levels and final scores of the two methods showed no significant difference among working departments. Most of the studied postures fell into the low- and moderate-risk levels in the QEC assessment (low risk = 20%, moderate risk = 50%, high risk = 30%) and in the REBA assessment (low risk = 15%, moderate risk = 60%, high risk = 25%). The two methods are significantly correlated: they agree strongly in identifying risky jobs and in determining the potential risk for incidence of WMSDs. Therefore, researchers may apply either method interchangeably for postural risk assessment in appropriate working environments.
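A minimal sketch of the agreement analysis (Spearman rank correlation between paired scores from the two tools) is shown below with made-up scores for eight jobs; the study's actual data are not reproduced here.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired final scores for the same jobs from the two observational tools
# (illustrative values only; the study reports r = 0.731 for final scores).
qec_scores  = np.array([32, 45, 51, 28, 60, 41, 55, 38])
reba_scores = np.array([ 4,  6,  8,  3,  9,  5,  8,  4])

rho, p_value = spearmanr(qec_scores, reba_scores)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.3f}")
```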
Linzalone, Nunzia; Bianchi, Fabrizio; Cervino, Marco; Cori, Liliana; De Gennaro, Gianluigi; Mangia, Cristina; Bustaffa, Elisa
2018-01-01
In Europe, Health Impact Assessment (HIA) is a consolidated practice aimed at predicting health impacts and supporting the preparation of plans and projects subject to authorization procedures. In Italy, further developments are needed to harmonize the practice and consolidate methodologies in order to extend the application of HIA to different fields. The recent HIA conducted in Val d'Agri (Basilicata) on the impacts of a crude oil first-treatment plant represents an opportunity to illustrate its tools, methods and fields of application. In this experience, participation methods in impact assessment were adapted to the context, emphasizing aspects of ethics, equity and democracy. Environmental and epidemiological studies were included in the Val d'Agri HIA in order to characterize the environment and assess the health status of the resident population. On the basis of the results, public health recommendations were elaborated and shared with stakeholders and with local and regional administrators. The experience in Val d'Agri offers elements of reflection on the potential of HIA at the local level to support public health and environmental control systems in the area, as well as planning based on preventive environmental and health impact assessment.
Assessment of technological level of stem cell research using principal component analysis.
Do Cho, Sung; Hwan Hyun, Byung; Kim, Jae Kyeom
2016-01-01
In general, technological levels have been assessed based on specialists' opinions through methods such as Delphi. In such cases, however, results can be significantly biased by the study design and the individual experts. In this study, therefore, scientific literature and patents were selected by means of analytic indexes for a statistical approach to the technical assessment of stem cell fields. The analytic indexes, namely the numbers and impact indexes of scientific publications and patents, were weighted based on principal component analysis and then summed into a single value. Technological obsolescence was calculated from the cited half-life of patents issued by the United States Patent and Trademark Office and was reflected in the technological level assessment. As a result, each nation's rank with respect to technological level was rated by the proposed method, and national strengths and weaknesses could be evaluated. Although our empirical research presents faithful results, further study is needed to compare the suggested method with existing methods.
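The weighting step can be illustrated with a hedged sketch: standardize the analytic indexes, weight them by their loadings on the first principal component, and sum them into a single score per nation. The indicator matrix and the weighting choice below are hypothetical; the paper's exact index construction (including the obsolescence adjustment) is not reproduced.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical indicator matrix: rows = nations, columns = analytic indexes
# (e.g. publication count, publication impact, patent count, patent impact).
X = np.array([
    [1200, 8.2, 300, 5.1],
    [ 950, 7.5, 410, 6.0],
    [ 400, 9.1, 120, 4.2],
    [ 300, 5.0,  90, 3.0],
], dtype=float)

Xz = StandardScaler().fit_transform(X)
pca = PCA().fit(Xz)

# Weight each standardized indicator by its loading on the first principal component
# and sum to a single technological-level score per nation.
weights = np.abs(pca.components_[0])
weights /= weights.sum()
scores = Xz @ weights
print(np.argsort(-scores))   # rank order of nations (highest composite score first)
```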
Assessment method of digital Chinese dance movements based on virtual reality technology
NASA Astrophysics Data System (ADS)
Feng, Wei; Shao, Shuyuan; Wang, Shumin
2008-03-01
Virtual reality has played an increasing role in areas such as medicine, architecture, aviation, engineering science and advertising. However, in the art fields, virtual reality is still in its infancy with respect to the representation of human movements. Based on the techniques of motion capture and the reuse of motion capture data in a virtual reality environment, this paper presents an assessment method to quantitatively evaluate dancers' basic arm position movements in traditional Chinese dance. The data for quantifying traits of dance motions are defined and measured on dances performed by an expert and two beginners. The results indicate that these data are useful for evaluating dance skill and distinctiveness, and that the proposed assessment method of digital Chinese dance movements based on virtual reality technology is valid and feasible.
Ahn, Eunjong; Kim, Hyunjun; Sim, Sung-Han; Shin, Sung Woo; Shin, Myoungsu
2017-01-01
Recently, self-healing technologies have emerged as a promising approach to extend the service life of social infrastructure in the field of concrete construction. However, current evaluations of the self-healing technologies developed for cementitious materials are mostly limited to lab-scale experiments to inspect changes in surface crack width (by optical microscopy) and permeability. Furthermore, there is a universal lack of unified test methods to assess the effectiveness of self-healing technologies. Particularly, with respect to the self-healing of concrete applied in actual construction, nondestructive test methods are required to avoid interrupting the use of the structures under evaluation. This paper presents a review of all existing research on the principles of ultrasonic test methods and case studies pertaining to self-healing concrete. The main objective of the study is to examine the applicability and limitation of various ultrasonic test methods in assessing the self-healing performance. Finally, future directions on the development of reliable assessment methods for self-healing cementitious materials are suggested. PMID:28772640
Liao, Wenjie; van der Werf, Hayo M G; Salmon-Monviola, Jordy
2015-09-15
One of the major challenges in environmental life cycle assessment (LCA) of crop production is the nonlinearity between nitrogen (N) fertilizer inputs and on-site N emissions resulting from complex biogeochemical processes. A few studies have addressed this nonlinearity by combining process-based N simulation models with LCA, but none accounted for nitrate (NO3⁻) flows across fields. In this study, we present a new method, TNT2-LCA, that couples the topography-based simulation of nitrogen transfer and transformation (TNT2) model with LCA, and compare the new method with a current LCA method based on a French life cycle inventory database. Application of the two methods to a case study of crop production in a catchment in France showed that, compared to the current method, TNT2-LCA allows delineation of more appropriate temporal limits when developing data for on-site N emissions associated with specific crops in this catchment. It also improves estimates of NO3⁻ emissions by better consideration of agricultural practices, soil-climatic conditions, and spatial interactions of NO3⁻ flows across fields, and by providing predicted crop yield. The new method presented in this study provides improved LCA of crop production at the catchment scale.
NASA Astrophysics Data System (ADS)
Inamori, Takaya; Otsuki, Kensuke; Sugawara, Yoshiki; Saisutjarit, Phongsatorn; Nakasuka, Shinichi
2016-11-01
This study proposes a novel method for three-axis attitude control using only magnetic torquers (MTQs). Previously, MTQs have been utilized for attitude control in many low Earth orbit satellites. Although MTQs are useful for achieving attitude control at low cost and high reliability without the need for propellant, these electromagnetic coils cannot be used to generate an attitude control torque about the geomagnetic field vector. Thus, conventional attitude control methods using MTQs assume the magnetic field changes in an orbital period so that the satellite can generate a required attitude control torque after waiting for a change in the magnetic field direction. However, in a near magnetic equatorial orbit, the magnetic field does not change in an inertial reference frame. Thus, satellites cannot generate a required attitude control torque in a single orbital period with only MTQs. This study proposes a method for achieving a rotation about the geomagnetic field vector by generating a torque that is perpendicular to it. First, this study shows that the three-axis attitude control using only MTQs is feasible with a two-step rotation. Then, the study proposes a method for controlling the attitude with the two-step rotation using a PD controller. Finally, the proposed method is assessed by examining the results of numerical simulations.
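For context, the projection property discussed above (a magnetorquer can realize only the component of a desired torque perpendicular to the geomagnetic field) can be sketched with a generic textbook PD dipole law; the gains and numbers are assumptions for illustration, and this is not the paper's two-step rotation controller.

```python
import numpy as np

def mtq_dipole_command(q_err_vec, omega, b_body, kp=1e-3, kd=5e-2):
    """One step of a generic PD magnetic-torquer law (illustrative, with assumed gains).

    q_err_vec : vector part of the attitude error quaternion (body frame)
    omega     : body angular rate [rad/s]
    b_body    : local geomagnetic field in the body frame [T]
    Returns the commanded magnetic dipole m [A*m^2]; the realized torque tau = m x B
    is the component of the desired torque perpendicular to B.
    """
    tau_des = -kp * np.asarray(q_err_vec) - kd * np.asarray(omega)
    b = np.asarray(b_body)
    m = np.cross(b, tau_des) / np.dot(b, b)   # maps tau_des onto the achievable plane
    return m

# Realized torque is always perpendicular to B, as the abstract discusses:
b = np.array([2e-5, 0.0, 1e-5])
m = mtq_dipole_command([0.05, -0.02, 0.01], [1e-3, 0.0, -2e-3], b)
print(np.cross(m, b))
```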
Alexanderian, Alen; Zhu, Liang; Salloum, Maher; Ma, Ronghui; Yu, Meilin
2017-09-01
In this study, statistical models are developed for modeling uncertain heterogeneous permeability and porosity in tumors, and the resulting uncertainties in pressure and velocity fields during an intratumoral injection are quantified using a nonintrusive spectral uncertainty quantification (UQ) method. Specifically, the uncertain permeability is modeled as a log-Gaussian random field, represented using a truncated Karhunen-Loève (KL) expansion, and the uncertain porosity is modeled as a log-normal random variable. The efficacy of the developed statistical models is validated by simulating the concentration fields with permeability and porosity of different uncertainty levels. The irregularity in the concentration field bears reasonable visual agreement with that in MicroCT images from experiments. The pressure and velocity fields are represented using polynomial chaos (PC) expansions to enable efficient computation of their statistical properties. The coefficients in the PC expansion are computed using a nonintrusive spectral projection method with the Smolyak sparse quadrature. The developed UQ approach is then used to quantify the uncertainties in the random pressure and velocity fields. A global sensitivity analysis is also performed to assess the contribution of individual KL modes of the log-permeability field to the total variance of the pressure field. It is demonstrated that the developed UQ approach can effectively quantify the flow uncertainties induced by uncertain material properties of the tumor.
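A minimal sketch of the permeability model, a log-Gaussian random field sampled through a truncated KL expansion, is given below. The exponential covariance, correlation length, mean log-permeability, and one-dimensional discretization are assumptions for illustration and stand in for the paper's tumor geometry and parameters.

```python
import numpy as np

# Assumed 1-D domain, exponential covariance, and truncation to 10 KL modes
n, L, corr_len, sigma = 200, 1.0, 0.2, 0.5
x = np.linspace(0.0, L, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Discrete KL modes = eigenpairs of the covariance matrix, largest eigenvalues first
eigval, eigvec = np.linalg.eigh(C)
idx = np.argsort(eigval)[::-1][:10]
lam, phi = eigval[idx], eigvec[:, idx]

rng = np.random.default_rng(1)
xi = rng.standard_normal(10)                       # independent standard normal modes
log_k = np.log(1e-14) + phi @ (np.sqrt(lam) * xi)  # assumed mean log-permeability
k = np.exp(log_k)                                  # one realization of the random field
print(k.min(), k.max())
```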
EPA Field Manual for Coral Reef Assessments
The Water Quality Research Program (WQRP) supports development of coral reef biological criteria. Research is focused on developing methods and tools to support implementation of legally defensible biological standards for maintaining biological integrity, which is protected by ...
Subsurface application enhances benefits of manure redistribution
USDA-ARS?s Scientific Manuscript database
Sustainable nutrient management requires redistribution of livestock manure from nutrient-excess areas to nutrient-deficit areas. Field experiments were conducted to assess agronomic and environmental effects of different poultry litter application methods (surface vs. subsurface) and timings (fall ...
Determining open cluster membership. A Bayesian framework for quantitative member classification
NASA Astrophysics Data System (ADS)
Stott, Jonathan J.
2018-01-01
Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
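As a hedged, one-dimensional illustration of the Bayesian membership idea (a cluster population plus a field population, and a posterior membership probability per star), consider the sketch below; the Gaussian population models, the single proper-motion coordinate, the prior, and all numbers are assumptions, not the paper's full joint astrometric-photometric treatment.

```python
import numpy as np
from scipy.stats import norm

def membership_probability(pm, pm_err, mu_cl, sig_cl, mu_fld, sig_fld, prior_cl=0.3):
    """Posterior probability of cluster membership from one proper-motion coordinate,
    assuming Gaussian cluster and field populations (hypothetical parameters)."""
    like_cl  = norm.pdf(pm, mu_cl,  np.hypot(sig_cl,  pm_err))   # cluster likelihood
    like_fld = norm.pdf(pm, mu_fld, np.hypot(sig_fld, pm_err))   # field likelihood
    return prior_cl * like_cl / (prior_cl * like_cl + (1 - prior_cl) * like_fld)

# Illustrative star: proper motion close to an assumed cluster mean
print(membership_probability(pm=-2.1, pm_err=0.3, mu_cl=-2.0, sig_cl=0.2,
                             mu_fld=0.0, sig_fld=5.0))
```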
Older Adults' Acceptance of Activity Trackers
Preusse, Kimberly C.; Mitzner, Tracy L.; Fausset, Cara Bailey; Rogers, Wendy A.
2016-01-01
Objective To assess the usability and acceptance of activity tracking technologies by older adults. Method First in our multi-method approach, we conducted heuristic evaluations of two activity trackers that revealed potential usability barriers to acceptance. Next, questionnaires and interviews were administered to 16 older adults (mean age = 70 years, SD = 3.09, range = 65-75) before and after a 28-day field study to understand facilitators and additional barriers to acceptance. These measurements were supplemented with diary and usage data and assessed if and why users overcame usability issues. Results The heuristic evaluation revealed usability barriers in System Status Visibility; Error Prevention; and Consistency and Standards. The field study revealed additional barriers (e.g., accuracy, format), and acceptance-facilitators (e.g., goal-tracking, usefulness, encouragement). Discussion The acceptance of wellness management technologies, such as activity trackers, may be increased by addressing acceptance-barriers during deployment (e.g., providing tutorials on features that were challenging, communicating usefulness). PMID:26753803
Input-variable sensitivity assessment for sediment transport relations
NASA Astrophysics Data System (ADS)
Fernández, Roberto; Garcia, Marcelo H.
2017-09-01
A methodology to assess input-variable sensitivity for sediment transport relations is presented. The Mean Value First Order Second Moment Method (MVFOSM) is applied to two bed load transport equations showing that it may be used to rank all input variables in terms of how their specific variance affects the overall variance of the sediment transport estimation. In sites where data are scarce or nonexistent, the results obtained may be used to (i) determine what variables would have the largest impact when estimating sediment loads in the absence of field observations and (ii) design field campaigns to specifically measure those variables for which a given transport equation is most sensitive; in sites where data are readily available, the results would allow quantifying the effect that the variance associated with each input variable has on the variance of the sediment transport estimates. An application of the method to two transport relations using data from a tropical mountain river in Costa Rica is implemented to exemplify the potential of the method in places where input data are limited. Results are compared against Monte Carlo simulations to assess the reliability of the method and validate its results. For both of the sediment transport relations used in the sensitivity analysis, accurate knowledge of sediment size was found to have more impact on sediment transport predictions than precise knowledge of other input variables such as channel slope and flow discharge.
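The MVFOSM idea can be sketched as follows: propagate input variances through first-order derivatives evaluated at the mean and rank the fractional contribution of each input to the output variance. The transport relation below is a simple placeholder (monotone in slope, discharge, and grain size), not one of the two bed load equations analyzed in the paper, and the means and standard deviations are illustrative.

```python
import numpy as np

def bedload_placeholder(slope, discharge, d50):
    """Placeholder transport relation; a real analysis would use a bed load equation
    such as Meyer-Peter and Mueller."""
    return 8.0 * (slope * discharge) ** 1.5 / np.sqrt(d50)

def mvfosm_contributions(f, means, sds, eps=1e-4):
    """MVFOSM: Var[f] ~ sum_i (df/dx_i)^2 * sd_i^2, gradients by central differences
    at the mean. Returns each input's fractional contribution to the output variance."""
    means, sds = np.asarray(means, float), np.asarray(sds, float)
    grad = np.zeros_like(means)
    for i in range(means.size):
        hi, lo = means.copy(), means.copy()
        hi[i] *= (1 + eps)
        lo[i] *= (1 - eps)
        grad[i] = (f(*hi) - f(*lo)) / (hi[i] - lo[i])
    contrib = (grad * sds) ** 2
    return contrib / contrib.sum()

# Inputs: slope, discharge, median grain size (illustrative means and standard deviations)
print(mvfosm_contributions(bedload_placeholder, means=[0.02, 15.0, 0.05], sds=[0.004, 3.0, 0.02]))
```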
Shur, P Z; Zaĭtseva, N V; Alekseev, V B; Shliapnikov, D M
2015-01-01
In accordance with international documents in the field of occupational safety and hygiene, the assessment and minimization of occupational risks is a key instrument for maintaining workers' health, and the methodology of occupational risk analysis is the instrument for implementing it. In the Russian Federation, the preconditions for a system of occupational risk assessment and management have been established. In accordance with ILO Conventions, the target of national (state) policy in the field of occupational safety can be defined as the prevention of accidents and injuries to health arising from or related to work, minimizing the causes of hazards inherent in the working environment as far as is reasonably practicable. The global trend of using the methodology of assessing and managing occupational risks to the life and health of citizens requires improvement of national policies in the field of occupational hygiene and safety. Achieving an acceptable level of occupational risk can be considered one of the main tasks in forming national policy in this field.
Assessing flood damage to agriculture using color infrared aerial photography
Anderson, William H.
1977-01-01
The rationale for using color-infrared (CIR) film to assist in assessing flood damage to agriculture is demonstrated using examples prepared from photographs acquired of the 1975 flood in the Red River Valley of North Dakota and Minnesota. Information concerning flood inundation boundaries, crop damage, soil erosion, sedimentation, and other similar general features and conditions was obtained through the interpretation of CIR aerial photographs. CIR aerial photographs can be used to help improve the estimates of potential remaining production on a field by field basis, owing to the increased accuracy obtained in determining the area component of crop production as compared to conventional ground sketching methods.
Accurate evaluation and analysis of functional genomics data and methods
Greene, Casey S.; Troyanskaya, Olga G.
2016-01-01
The development of technology capable of inexpensively performing large-scale measurements of biological systems has generated a wealth of data. Integrative analysis of these data holds the promise of uncovering gene function, regulation, and, in the longer run, understanding complex disease. However, their analysis has proved very challenging, as it is difficult to quickly and effectively assess the relevance and accuracy of these data for individual biological questions. Here, we identify biases that present challenges for the assessment of functional genomics data and methods. We then discuss evaluation methods that, taken together, begin to address these issues. We also argue that the funding of systematic data-driven experiments and of high-quality curation efforts will further improve evaluation metrics so that they more-accurately assess functional genomics data and methods. Such metrics will allow researchers in the field of functional genomics to continue to answer important biological questions in a data-driven manner. PMID:22268703
Assessing Cardiac Metabolism: A Scientific Statement From the American Heart Association.
Taegtmeyer, Heinrich; Young, Martin E; Lopaschuk, Gary D; Abel, E Dale; Brunengraber, Henri; Darley-Usmar, Victor; Des Rosiers, Christine; Gerszten, Robert; Glatz, Jan F; Griffin, Julian L; Gropler, Robert J; Holzhuetter, Hermann-Georg; Kizer, Jorge R; Lewandowski, E Douglas; Malloy, Craig R; Neubauer, Stefan; Peterson, Linda R; Portman, Michael A; Recchia, Fabio A; Van Eyk, Jennifer E; Wang, Thomas J
2016-05-13
In a complex system of interrelated reactions, the heart converts chemical energy to mechanical energy. Energy transfer is achieved through coordinated activation of enzymes, ion channels, and contractile elements, as well as structural and membrane proteins. The heart's needs for energy are difficult to overestimate. At a time when the cardiovascular research community is discovering a plethora of new molecular methods to assess cardiac metabolism, the methods remain scattered in the literature. The present statement on "Assessing Cardiac Metabolism" seeks to provide a collective and curated resource on methods and models used to investigate established and emerging aspects of cardiac metabolism. Some of those methods are refinements of classic biochemical tools, whereas most others are recent additions from the powerful tools of molecular biology. The aim of this statement is to be useful to many and to do justice to a dynamic field of great complexity. © 2016 American Heart Association, Inc.
ERIC Educational Resources Information Center
Trigg, Richard; Skevington, Suzanne M.; Jones, Roy W.
2007-01-01
Purpose: The study aim was to develop a measure of self-reported quality of life (QoL) for people with mild to moderate dementia based on their views--the Bath Assessment of Subjective Quality of Life in Dementia (BASQID). Design and Methods: We developed the measure through multiple stages. Two field tests of the measure (ns = 60 and 150)…
Yang, Qi; Meng, Fan-Rui; Bourque, Charles P-A; Zhao, Zhengyong
2017-09-08
Forest ecosite reflects the local site conditions that are meaningful to forest productivity as well as basic ecological functions. Field assessments of vegetation and soil types are often used to identify forest ecosites. However, the production of high-resolution ecosite maps for large areas from interpolating field data is difficult because of high spatial variation and associated costs and time requirements. Indices of soil moisture and nutrient regimes (i.e., SMR and SNR) introduced in this study reflect the combined effects of biogeochemical and topographic factors on forest growth. The objective of this research is to present a method for creating high-resolution forest ecosite maps based on computer-generated predictions of SMR and SNR for an area in Atlantic Canada covering about 4.3 × 10⁶ hectares (ha) of forestland. Field data from 1,507 forest ecosystem classification plots were used to assess the accuracy of the ecosite maps produced. Using model predictions of SMR and SNR alone, ecosite maps were 61 and 59% correct in identifying 10 Acadian- and Maritime-Boreal-region ecosite types, respectively. This method provides an operational framework for the production of high-resolution maps of forest ecosites over large areas without the need for data from expensive, supplementary field surveys.
Field evaluation of an avian risk assessment model
Vyas, N.B.; Spann, J.W.; Hulse, C.S.; Borges, S.L.; Bennett, R.S.; Torrez, M.; Williams, B.I.; Leffel, R.
2006-01-01
We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in the field. We tested technical-grade diazinon and its D.Z.N. 50W (50% diazinon active ingredient wettable powder) formulation on Canada goose (Branta canadensis) goslings. Brain acetylcholinesterase activity was measured, and the feathers and skin, feet, and gastrointestinal contents were analyzed for diazinon residues. The dose-response curves showed that diazinon was significantly more toxic to goslings in the outdoor test than in the laboratory tests. The deterministic risk assessment method identified the potential for risk to birds in general, but the factors associated with extrapolating from the laboratory to the field, and from the laboratory test species to other species, resulted in the underestimation of risk to the goslings. The present study indicates that laboratory-based risk quotients should be interpreted with caution.
Ranging methods for developing wellbores in subsurface formations
MacDonald, Duncan [Houston, TX]
2011-09-06
A method for forming two or more wellbores in a subsurface formation includes forming a first wellbore in the formation. A second wellbore is directionally drilled in a selected relationship relative to the first wellbore. At least one magnetic field is provided in the second wellbore using one or more magnets in the second wellbore located on a drilling string used to drill the second wellbore. At least one magnetic field is sensed in the first wellbore using at least two sensors in the first wellbore as the magnetic field passes by the at least two sensors while the second wellbore is being drilled. A position of the second wellbore is continuously assessed relative to the first wellbore using the sensed magnetic field. The direction of drilling of the second wellbore is adjusted so that the second wellbore remains in the selected relationship relative to the first wellbore.
The citation merit of scientific publications.
Crespo, Juan A; Ortuño-Ortín, Ignacio; Ruiz-Castillo, Javier
2012-01-01
We propose a new method to assess the merit of any set of scientific papers in a given field based on the citations they receive. Given a field and a citation impact indicator, such as the mean citation or the [Formula: see text]-index, the merit of a given set of [Formula: see text] articles is identified with the probability that a randomly drawn set of [Formula: see text] articles from a given pool of articles in that field has a lower citation impact according to the indicator in question. The method allows for comparisons between sets of articles of different sizes and fields. Using a dataset acquired from Thomson Scientific that contains the articles published in the periodical literature in the period 1998-2007, we show that the novel approach yields rankings of research units different from those obtained by a direct application of the mean citation or the [Formula: see text]-index.
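The merit probability can be approximated by Monte Carlo: draw many same-sized sets from the field's pool of articles and compute the fraction whose indicator value falls below the unit's value. The sketch below uses the mean citation as the indicator and a synthetic citation pool; it illustrates the idea rather than reproducing the paper's exact computation or data.

```python
import numpy as np

def citation_merit(unit_citations, field_pool, n_draws=10_000, seed=0):
    """Monte Carlo estimate of the probability that a random same-sized set of articles
    from the field pool has a lower mean citation than the unit's set (illustrative)."""
    rng = np.random.default_rng(seed)
    n = len(unit_citations)
    target = np.mean(unit_citations)
    draws = rng.choice(field_pool, size=(n_draws, n), replace=True)
    return np.mean(draws.mean(axis=1) < target)

# Hypothetical field pool and one research unit's citation counts
field_pool = np.random.default_rng(1).poisson(5, size=5000)
unit = np.array([12, 3, 25, 7, 0, 9])
print(citation_merit(unit, field_pool))
```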
Williams, Colin F.; Reed, Marshall J.; Mariner, Robert H.
2008-01-01
The U. S. Geological Survey (USGS) is conducting an updated assessment of geothermal resources in the United States. The primary method applied in assessments of identified geothermal systems by the USGS and other organizations is the volume method, in which the recoverable heat is estimated from the thermal energy available in a reservoir. An important focus in the assessment project is on the development of geothermal resource models consistent with the production histories and observed characteristics of exploited geothermal fields. The new assessment will incorporate some changes in the models for temperature and depth ranges for electric power production, preferred chemical geothermometers for estimates of reservoir temperatures, estimates of reservoir volumes, and geothermal energy recovery factors. Monte Carlo simulations are used to characterize uncertainties in the estimates of electric power generation. These new models for the recovery of heat from heterogeneous, fractured reservoirs provide a physically realistic basis for evaluating the production potential of natural geothermal reservoirs.
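A hedged sketch of a volume-method Monte Carlo is given below: thermal energy in place is reservoir volume times volumetric heat capacity times the temperature difference above a reference, and electric generation follows from assumed recovery and conversion factors over a plant life. All distributions and constants are illustrative assumptions, not the USGS assessment's values.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical input distributions for one identified geothermal system
volume_km3 = rng.triangular(2.0, 5.0, 10.0, N)      # reservoir volume (km^3)
temp_res_C = rng.normal(200.0, 15.0, N)              # reservoir temperature (deg C)
recovery   = rng.uniform(0.08, 0.20, N)              # geothermal energy recovery factor
efficiency = rng.uniform(0.10, 0.14, N)              # heat-to-power conversion efficiency
rho_c      = 2.7e6                                   # volumetric heat capacity, J/(m^3 K)
T_ref      = 30.0                                    # reference (rejection) temperature, deg C
life_s     = 30 * 365.25 * 24 * 3600                 # assumed 30-year plant life, seconds

thermal_J = rho_c * volume_km3 * 1e9 * (temp_res_C - T_ref)      # heat in place
power_MWe = thermal_J * recovery * efficiency / life_s / 1e6     # sustained electric power

print(np.percentile(power_MWe, [5, 50, 95]))   # uncertainty in estimated generation (MWe)
```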
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trott, Donna M.; LePage, Jane; Hebert, Vincent
A regional air assessment was performed to characterize volatile natural isothiocyanate (NITC) compounds in air during soil incorporation of mustard cover crops in Washington State. Field air sampling and analytical methods were developed specific to three NITCs known to be present in air at appreciable concentrations during/after field incorporation. The maximum observed concentrations in air for the allyl, benzyl, and phenethyl isothiocyanates were respectively 188, 6.1, and 0.7 µg m⁻³ during mustard incorporation. Based on limited inhalation toxicity information, airborne NITC concentrations did not appear to pose an acute human inhalation exposure concern to field operators and bystanders.
Advancements in the field of personality development.
De Fruyt, Filip; Van Leeuwen, Karla
2014-07-01
A summary is provided of what the fields of personality and developmental psychology have had to offer each other over the past decade, as reflected in the eleven contributions in this special issue. Strengths and opportunities to further advance the field are identified, including the extension of general trait models with maladaptive trait models, the use of alternative methods to assess personality, and the adoption of configural approaches to describe traits in individuals, beyond more traditional person-centered approaches. Copyright © 2014 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
Finite geometry effects of field-aligned currents
NASA Technical Reports Server (NTRS)
Fung, Shing F.; Hoffman, R. A.
1992-01-01
Results are presented of model calculations of the magnetic field produced by finite current regions that would be measured by a spaceborne magnetometer. Conditions were examined under which the infinite current sheet approximation can be applied to the calculation of the field-aligned current (FAC) density, using satellite magnetometer data. The accuracy of the three methods used for calculating the current sheet normal direction with respect to the spacecraft trajectory was assessed. It is shown that the model can be used to obtain the position and the orientation of the spacecraft trajectory through the FAC region.
Repeatability and validity of a field kit for estimation of cholinesterase in whole blood.
London, L; Thompson, M L; Sacks, S; Fuller, B; Bachmann, O M; Myers, J E
1995-01-01
OBJECTIVES--To evaluate a spectrophotometric field kit (Test-Mate-OP) for repeatability and validity in comparison with reference laboratory methods and to model its anticipated sensitivity and specificity based on these findings. METHODS--76 farm workers between the age of 20 and 55, of whom 30 were pesticide applicators exposed to a range of organophosphates in the preceding 10 days, had blood taken for plasma cholinesterase (PCE) and erythrocyte cholinesterase (ECE) measurement by field kit or laboratory methods. Paired blinded duplicate samples were taken from subgroups in the sample to assess repeatability of laboratory and field kit methods. Field kits were also used to test venous blood in one subgroup. The variance obtained for the field kit tests was then applied to two hypothetical scenarios that used published action guidelines to model the kit's sensitivity and specificity. RESULTS--Repeatability for PCE was much poorer and for ECE slightly poorer than that of laboratory measures. A substantial upward bias for field kit ECE relative to laboratory measurements was found. Sensitivity of the kit to a 40% drop in PCE was 67%, whereas that for ECE was 89%. Specificity of the kit with no change in mean of the population was 100% for ECE and 91% for PCE. CONCLUSION--Field kit ECE estimation seems to be sufficiently repeatable for surveillance activities, whereas PCE does not. Repeatability of both tests seems to be too low for use in epidemiological dose-response investigations. Further research is indicated to characterise the upward bias in ECE estimation on the kit. PMID:7697143
Martin, Jeffrey D.; Norman, Julia E.; Sandstrom, Mark W.; Rose, Claire E.
2017-09-06
U.S. Geological Survey monitoring programs extensively used two analytical methods, gas chromatography/mass spectrometry and liquid chromatography/mass spectrometry, to measure pesticides in filtered water samples during 1992–2012. In October 2012, the monitoring programs began using direct aqueous-injection liquid chromatography tandem mass spectrometry as a new analytical method for pesticides. The change in analytical methods, however, has the potential to inadvertently introduce bias in analysis of datasets that span the change. A field study was designed to document performance of the new method in a variety of stream-water matrices and to quantify any potential changes in measurement bias or variability that could be attributed to changes in analytical methods. The goals of the field study were to (1) summarize performance (bias and variability of pesticide recovery) of the new method in a variety of stream-water matrices; (2) compare performance of the new method in laboratory blank water (laboratory reagent spikes) to that in a variety of stream-water matrices; (3) compare performance (analytical recovery) of the new method to that of the old methods in a variety of stream-water matrices; (4) compare pesticide detections and concentrations measured by the new method to those of the old methods in a variety of stream-water matrices; (5) compare contamination measured by field blank water samples in old and new methods; (6) summarize the variability of pesticide detections and concentrations measured by the new method in field duplicate water samples; and (7) identify matrix characteristics of environmental water samples that adversely influence the performance of the new method. Stream-water samples and a variety of field quality-control samples were collected at 48 sites in the U.S. Geological Survey monitoring networks during June–September 2012. Stream sites were located across the United States and included sites in agricultural and urban land-use settings, as well as sites on major rivers. The results of the field study identified several challenges for the analysis and interpretation of data analyzed by both old and new methods, particularly when data span the change in methods and are combined for analysis of temporal trends in water quality. The main challenges identified are large (greater than 30 percent), statistically significant differences in analytical recovery, detection capability, and (or) measured concentrations for selected pesticides. These challenges are documented and discussed, but specific guidance or statistical methods to resolve these differences in methods are beyond the scope of the report. The results of the field study indicate that the implications of the change in analytical methods must be assessed individually for each pesticide and method. Understanding the possible causes of the systematic differences in concentrations between methods that remain after recovery adjustment might be necessary to determine how to account for the differences in data analysis. Because recoveries for each method are independently determined from separate reference standards and spiking solutions, the differences might be due to an error in one of the reference standards or solutions or some other basic aspect of standard procedure in the analytical process. Further investigation of the possible causes is needed, which will lead to specific decisions on how to compensate for these differences in concentrations in data analysis.
In the event that further investigations do not provide insight into the causes of systematic differences in concentrations between methods, the authors recommend continuing to collect and analyze paired environmental water samples by both old and new methods. This effort should be targeted to seasons, sites, and expected concentrations to supplement those concentrations already assessed and to compare the ongoing analytical recovery of old and new methods to those observed in the summer and fall of 2012.
Boll, Björn; Josse, Lena; Heubach, Anja; Hochenauer, Sophie; Finkler, Christof; Huwyler, Jörg; Koulov, Atanas V
2018-04-25
Asymmetric flow field-flow fractionation is a valuable tool for the characterization of protein aggregates in biotechnology owing to its broad size range and unique separation principle. However, in practice asymmetric flow field-flow fractionation is non-trivial to use due to the major deviations from theory and the influence on separation by various factors that are not fully understood. Here we report methods to assess the non-ideal effects that influence asymmetric flow field-flow fractionation separation and for the first time identify experimentally the main factors that impact it. Furthermore, we propose new approaches to minimize such non-ideal behavior, showing that by adjusting the mobile phase composition (pH and ionic strength) the resolution of asymmetric flow field-flow fractionation separation can be drastically improved. Additionally, we propose a best practice method for new proteins. This article is protected by copyright. All rights reserved.
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
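One of the simple-series approaches the review discusses, fitting a smooth learning curve to a single operator's consecutive operation times, can be sketched as below; the negative-exponential form and the simulated data are illustrative assumptions, not the review's or the trial's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(case, t_plateau, t_gain, rate):
    """Negative-exponential learning curve: operation time falls from an initial excess
    toward a plateau as case number increases."""
    return t_plateau + t_gain * np.exp(-rate * case)

# Simulated single-operator series (illustrative, not trial data)
cases = np.arange(1, 61)
rng = np.random.default_rng(3)
times = learning_curve(cases, 70, 60, 0.08) + rng.normal(0, 8, cases.size)

params, _ = curve_fit(learning_curve, cases, times, p0=[60, 50, 0.1])
t_plateau, t_gain, rate = params
print(f"plateau ~ {t_plateau:.0f} min, initial excess ~ {t_gain:.0f} min, rate {rate:.3f}")
```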
Rohner, Fabian; Kangambèga, Marcelline O.; Khan, Noor; Kargougou, Robert; Garnier, Denis; Sanou, Ibrahima; Ouaro, Bertine D.; Petry, Nicolai; Wirth, James P.; Jooste, Pieter
2015-01-01
Background Iodine deficiency has important health and development consequences and the introduction of iodized salt as national programs has been a great public health success in the past decades. To render national salt iodization programs sustainable and ensure adequate iodization levels, simple methods to quantitatively assess whether salt is adequately iodized are required. Several methods claim to be simple and reliable, and are available on the market or are in development. Objective This work has validated the currently available quantitative rapid test kits (quantRTK) in a comparative manner for both their laboratory performance and ease of use in field settings. Methods Laboratory performance parameters (linearity, detection and quantification limit, intra- and inter-assay imprecision) were conducted on 5 quantRTK. We assessed inter-operator imprecision using salt of different quality along with the comparison of 59 salt samples from across the globe; measurements were made both in a laboratory and a field setting by technicians and non-technicians. Results from the quantRTK were compared against iodometric titration for validity. An ‘ease-of-use’ rating system was developed to identify the most suitable quantRTK for a given task. Results Most of the devices showed acceptable laboratory performance, but for some of the devices, use by non-technicians revealed poorer performance when working in a routine manner. Of the quantRTK tested, the iCheck® and I-Reader® showed most consistent performance and ease of use, and a newly developed paper-based method (saltPAD) holds promise if further developed. Conclusions User- and field-friendly devices are now available and the most appropriate quantRTK can be selected depending on the number of samples and the budget available. PMID:26401655
Image analysis for skeletal evaluation of carpal bones
NASA Astrophysics Data System (ADS)
Ko, Chien-Chuan; Mao, Chi-Wu; Lin, Chi-Jen; Sun, Yung-Nien
1995-04-01
The assessment of bone age is an important field in pediatric radiology. It provides very important information for treatment and prediction of skeletal growth in a developing child. So far, various computerized algorithms for automatically assessing skeletal growth have been reported. Most of these methods attempt to analyze phalangeal growth. The most fundamental step in these automatic measurement methods is the image segmentation that separates bones from soft tissue and background. Automatic segmentation methods for hand radiographs can roughly be categorized into two main approaches: edge-based and region-based methods. This paper presents a region-based carpal-bone segmentation approach. It is organized into four stages: contrast enhancement, moment-preserving thresholding, morphological processing, and region-growing labeling.
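A hedged sketch of the four-stage pipeline is given below using common substitutes: adaptive histogram equalization for contrast enhancement, Otsu's method in place of the paper's moment-preserving thresholding, and connected-component labeling in place of region growing. The morphology parameters and the synthetic test image are assumptions.

```python
import numpy as np
from skimage import exposure, filters, morphology, measure

def segment_bones(gray_image):
    enhanced = exposure.equalize_adapthist(gray_image)           # 1. contrast enhancement
    binary = enhanced > filters.threshold_otsu(enhanced)         # 2. thresholding (Otsu here,
                                                                 #    not moment-preserving)
    closed = morphology.binary_closing(binary, morphology.disk(3))
    cleaned = morphology.remove_small_objects(closed, min_size=200)  # 3. morphological processing
    return measure.label(cleaned)                                # 4. connected-component labeling

# Synthetic example: two bright "bones" on a dark background
img = np.zeros((128, 128))
img[30:60, 30:60] = 0.8
img[70:110, 70:100] = 0.9
labels = segment_bones(img)
print(labels.max())   # number of segmented regions
```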
NASA Astrophysics Data System (ADS)
Castillo, Carlos; Pérez, Rafael
2017-04-01
The assessment of gully erosion volumes is essential for the quantification of soil losses derived from this relevant degradation process. Traditionally, 2D and 3D approaches have been applied for this purpose (Casalí et al., 2006). Although innovative 3D approaches have recently been proposed for gully volume quantification, a renewed interest can be found in the literature regarding the useful information that cross-section analysis still provides in gully erosion research. Moreover, the application of methods based on 2D approaches can be the most cost-effective approach in many situations such as preliminary studies with low accuracy requirements or surveys under time or budget constraints. The main aim of this work is to examine the key factors controlling volume error variability in 2D gully assessment by means of a stochastic experiment involving a Monte Carlo analysis over synthetic gully profiles in order to 1) contribute to a better understanding of the drivers and magnitude of uncertainty in 2D gully erosion surveys and 2) provide guidelines for optimal survey designs. Owing to the stochastic properties of error generation in 2D volume assessment, a statistical approach was followed to generate a large and significant set of gully reach configurations to evaluate quantitatively the influence of the main factors controlling the uncertainty of the volume assessment. For this purpose, a simulation algorithm in Matlab® code was written, involving the following stages: - Generation of synthetic gully area profiles with different degrees of complexity (characterized by the cross-section variability) - Simulation of field measurements characterised by a survey intensity and the precision of the measurement method - Quantification of the volume error uncertainty as a function of the key factors In this communication we will present the relationships between volume error and the studied factors and propose guidelines for 2D field surveys based on the minimal survey densities required to achieve a certain accuracy given the cross-sectional variability of a gully and the measurement method applied. References Casali, J., Loizu, J., Campo, M.A., De Santisteban, L.M., Alvarez-Mozos, J., 2006. Accuracy of methods for field assessment of rill and ephemeral gully erosion. Catena 67, 128-138. doi:10.1016/j.catena.2006.03.005
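For illustration, a stripped-down version of such a Monte Carlo experiment is sketched below: a densely sampled synthetic cross-sectional area profile is sub-sampled at a given survey intensity, corrupted with measurement error, and the resulting relative volume error is tabulated. The profile shape, error model, and mean-area extrapolation rule are my assumptions, not the authors' Matlab implementation.

```python
import numpy as np

def simulate_volume_error(true_areas, spacing_m, n_measured, sigma_area, n_sims=5000, seed=0):
    """Relative volume error (%) of a 2D survey that measures n_measured cross sections
    with area-measurement noise sigma_area and extrapolates by the mean area."""
    rng = np.random.default_rng(seed)
    true_volume = spacing_m * 0.5 * (true_areas[:-1] + true_areas[1:]).sum()  # trapezoidal rule
    length = spacing_m * (true_areas.size - 1)
    errors = np.empty(n_sims)
    for k in range(n_sims):
        idx = rng.choice(true_areas.size, size=n_measured, replace=False)
        measured = true_areas[idx] + rng.normal(0.0, sigma_area, n_measured)
        errors[k] = 100.0 * (measured.mean() * length - true_volume) / true_volume
    return errors

# Synthetic 100 m gully reach with variable cross-sectional area (m^2)
x = np.linspace(0.0, 100.0, 501)
areas = 1.5 + 0.8 * np.sin(2 * np.pi * x / 25.0) + 0.3 * np.sin(2 * np.pi * x / 7.0)
err = simulate_volume_error(areas, spacing_m=0.2, n_measured=10, sigma_area=0.1)
print(np.percentile(np.abs(err), [50, 90]))   # typical and near-worst-case absolute error
```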
Detecting glaucomatous change in visual fields: Analysis with an optimization framework.
Yousefi, Siamak; Goldbaum, Michael H; Varnousfaderani, Ehsan S; Belghith, Akram; Jung, Tzyy-Ping; Medeiros, Felipe A; Zangwill, Linda M; Weinreb, Robert N; Liebmann, Jeffrey M; Girkin, Christopher A; Bowd, Christopher
2015-12-01
Detecting glaucomatous progression is an important aspect of glaucoma management. The assessment of longitudinal series of visual fields, measured using Standard Automated Perimetry (SAP), is considered the reference standard for this effort. We seek efficient techniques for determining progression from longitudinal visual fields by formulating the problem as an optimization framework, learned from a population of glaucoma data. The longitudinal data from each patient's eye were used in a convex optimization framework to find a vector that is representative of the progression direction of the sample population, as a whole. Post-hoc analysis of longitudinal visual fields across the derived vector led to optimal progression (change) detection. The proposed method was compared to recently described progression detection methods and to linear regression of instrument-defined global indices, and showed slightly higher sensitivities at the highest specificities than other methods (a clinically desirable result). The proposed approach is simpler, faster, and more efficient for detecting glaucomatous changes, compared to our previously proposed machine learning-based methods, although it provides somewhat less information. This approach has potential application in glaucoma clinics for patient monitoring and in research centers for classification of study participants. Copyright © 2015 Elsevier Inc. All rights reserved.
Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.
Dasbach, Erik J; Elbasha, Elamin H
2017-07-01
Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
Statistical Techniques to Analyze Pesticide Data Program Food Residue Observations.
Szarka, Arpad Z; Hayworth, Carol G; Ramanarayanan, Tharacad S; Joseph, Robert S I
2018-06-26
The U.S. EPA conducts dietary-risk assessments to ensure that levels of pesticides on food in the U.S. food supply are safe. Often these assessments utilize conservative residue estimates, maximum residue levels (MRLs), and a high-end estimate derived from registrant-generated field-trial data sets. A more realistic estimate of consumers' pesticide exposure from food may be obtained by utilizing residues from food-monitoring programs, such as the Pesticide Data Program (PDP) of the U.S. Department of Agriculture. A substantial portion of food-residue concentrations in PDP monitoring programs are below the limits of detection (left-censored), which makes the comparison of regulatory-field-trial and PDP residue levels difficult. In this paper, we present a novel adaption of established statistical techniques, the Kaplan-Meier estimator (K-M), the robust regression on ordered statistic (ROS), and the maximum-likelihood estimator (MLE), to quantify the pesticide-residue concentrations in the presence of heavily censored data sets. The examined statistical approaches include the most commonly used parametric and nonparametric methods for handling left-censored data that have been used in the fields of medical and environmental sciences. This work presents a case study in which data of thiamethoxam residue on bell pepper generated from registrant field trials were compared with PDP-monitoring residue values. The results from the statistical techniques were evaluated and compared with commonly used simple substitution methods for the determination of summary statistics. It was found that the maximum-likelihood estimator (MLE) is the most appropriate statistical method to analyze this residue data set. Using the MLE technique, the data analyses showed that the median and mean PDP bell pepper residue levels were approximately 19 and 7 times lower, respectively, than the corresponding statistics of the field-trial residues.
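For the MLE approach, a minimal sketch for left-censored, lognormally distributed residues is given below: detected values contribute the log-density, and non-detects contribute the log-CDF at their reporting limit. The data and the lognormal assumption are illustrative; this is a generic censored-data MLE, not the paper's exact implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def censored_lognormal_mle(detected, detection_limits):
    """MLE for left-censored lognormal data: detections use the density, non-detects
    use the CDF at their detection limit. Returns the fitted mean and median."""
    log_det = np.log(detected)
    log_dl = np.log(detection_limits)

    def negloglik(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        ll = norm.logpdf(log_det, mu, sigma).sum()   # detected observations
        ll += norm.logcdf(log_dl, mu, sigma).sum()   # left-censored observations
        return -ll

    res = minimize(negloglik, x0=[np.mean(log_det), 0.0], method="Nelder-Mead")
    mu, sigma = res.x[0], np.exp(res.x[1])
    return np.exp(mu + sigma**2 / 2), np.exp(mu)     # lognormal mean, median

# Hypothetical residue data (ppm): 5 detections plus 15 non-detects below an LOD of 0.005
detected = np.array([0.012, 0.030, 0.008, 0.055, 0.021])
nondetect_limits = np.full(15, 0.005)
print(censored_lognormal_mle(detected, nondetect_limits))
```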
Meijer, Ellen; van Nes, Arie; Back, Willem; van der Staay, Franz Josef
2015-12-01
Lameness in pigs decreases animal welfare and economic profit for the farmer. An important reason for impaired welfare in lame animals is pain due to lameness. No direct measurement of pain is possible in animals, and methods to indirectly detect and quantify the amount of pain an animal is experiencing are urgently needed. In this study, two methods to assess pain associated with lameness in pigs were evaluated to determine if they were sensitive enough to detect a lameness reduction as an effect of an experimental analgesic medication. Asymmetry associated with lameness was objectively quantified using pressure mat kinetic parameters: peak vertical force (PVF), load rate (LR), vertical impulse (VI) and peak vertical pressure (PVP). Locomotor activity was assessed in an open field test. A dose of 0.04 mg/kg buprenorphine, a strong analgesic, was used to treat 10 lame pigs, while eight other lame pigs, treated with physiological saline solution, served as controls. Buprenorphine decreased lameness-associated asymmetry for pressure mat LR (P = 0.002), VI (P = 0.003) and PVP (P = 0.001) and increased activity of the lame pigs in the open field (P = 0.023), while saline-treated animals did not show any changes in asymmetry and became less active in the open field (P <0.001). It was concluded that measurement of gait asymmetry by pressure mat analysis and locomotor activity in an open field test are both sensitive enough to detect the analgesic effects of buprenorphine when used to treat moderate to severe clinical pain in a relatively small group of affected pigs. The methods used in this study may also provide promising additional tools for future research into early pain recognition and lameness treatment in pigs. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Fong, L. S.; Ambrose, R. F.
2017-12-01
Remote sensing is an excellent way to assess the changing condition of streams and wetlands. Several studies have measured large-scale changes in riparian condition indicators, but few have remotely applied multi-metric assessments on a finer scale to measure changes, such as those caused by restoration, in the condition of small riparian areas. We developed an aerial imagery assessment method (AIAM) that combines landscape, hydrology, and vegetation observations into one index describing overall ecological condition of non-confined streams. Verification of AIAM demonstrated that sites in good condition (as assessed on-site by the California Rapid Assessment Method) received high AIAM scores. (AIAM was not verified with poor condition sites.) Spearman rank correlation tests comparing AIAM and the field-based California Rapid Assessment Method (CRAM) results revealed that some components of the two methods were highly correlated. The application of AIAM is illustrated with time-series restoration trajectories of three southern California stream restoration projects aged 15 to 21 years. The trajectories indicate that the projects improved in condition in years following their restoration, with vegetation showing the most dynamic change over time. AIAM restoration trajectories also overlapped to different degrees with CRAM chronosequence restoration performance curves that demonstrate the hypothetical development of high-performing projects. AIAM has high potential as a remote ecological assessment method and effective tool to determine restoration trajectories. Ultimately, this tool could be used to further improve stream and wetland restoration management.
Banta, David; Jonsson, Egon; Childs, Paul
2009-07-01
The International Society for Technology Assessment in Health Care (ISTAHC) was formed in 1985. It grew out of the increasing awareness of the international dimensions of health technology assessment (HTA) and the need for new communication methods at the international level. The main function of ISTAHC was to present an annual conference, which gradually grew in size and also generally improved in quality from year to year. ISTAHC overextended itself financially early in the first decade of the 2000s and had to cease its existence. A new society, Health Technology Assessment international (HTAi), based on many of the same ideas and people, was established beginning in 2003. The two societies have played a large role in making the field of HTA visible to people around the world and in providing a forum for discussion on the methods and role of HTA.
[Methods for assessing the potential health risks of traces of pharmaceuticals in drinking water].
Kozísek, Frantisek; Jeligová, Hana
2012-01-01
Increasing consumption of pharmaceuticals also leads to a higher release of their non-metabolized residues into the environment, mostly the hydrosphere. Some of these substances may also reach treated drinking water. Although they are found only in traces, they cause public concern because they can represent non-targeted and unwanted medication. Toxicologists and public health authorities are called upon to assess the potential health risks carefully and to communicate the risk adequately to the public. As health risk assessment of environmental exposure to pharmaceuticals is a new field of expertise, its methodology has not yet been unified and standardized, but several different procedures have been proposed and used. The paper provides an overview of these methods.
Physical Processes for Driving Ionospheric Outflows in Global Simulations
NASA Technical Reports Server (NTRS)
Moore, Thomas Earle; Strangeway, Robert J.
2009-01-01
We review and assess the importance of processes thought to drive ionospheric outflows, linking them as appropriate to the solar wind and interplanetary magnetic field, and to the spatial and temporal distribution of their magnetospheric internal responses. These begin with the diffuse effects of photoionization and thermal equilibrium of the ionospheric topside, enhancing Jeans' escape, with ambipolar diffusion and acceleration. Auroral outflows begin with dayside reconnection and the resultant field-aligned currents and driven convection. These produce plasmaspheric plumes, collisional heating and wave-particle interactions, centrifugal acceleration, and auroral acceleration by parallel electric fields, including enhanced ambipolar fields from electron heating by precipitating particles. Observations and simulations show that solar wind energy dissipation into the atmosphere is concentrated by the geomagnetic field into auroral regions with an amplification factor of 10-100, enhancing heavy-species plasma and gas escape from gravity, and providing more current-carrying capacity. Internal plasmas thus enable electromagnetic driving via coupling to the plasma and neutral gas and, by extension, the entire body. We assess the importance of each of these processes in terms of local escape flux production as well as global outflow, and suggest methods for their implementation within multispecies global simulation codes. We complete the survey with an assessment of outstanding obstacles to this objective.
Jroundi, Imane; Belarbi, Abdellatif
2016-11-01
In 2010, Morocco launched a new field epidemiology training program to enhance the skills of health professionals in charge of epidemiological surveillance and outbreak investigation, including foodborne diseases, which represent a very substantial burden of disease. The aim was to apply an active learning method to teach outbreak investigation within a controlled environment to trainees of the field epidemiology training program at the Moroccan National School of Public Health. A scenario describing digestive symptoms evoking a restaurant-associated foodborne outbreak affecting the school staff was designed for the residents to investigate, to assess their organizational capacity and their application of all stages of an epidemiological investigation. Nine residents applied study design, database management and statistical analysis to investigate the foodborne outbreak, estimate attack rates, classify cases and controls, identify the contaminated foods and pathogens, and issue preventive recommendations for the control and prevention of further transmission. The residents' overall satisfaction with the learning method was 67%. A simulation of an outbreak investigation within an academic setting is an active learning method to be used in the curriculum for introducing residents of the field epidemiology program to the principles and practices of outbreak investigation before their involvement in a real situation.
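As a hedged illustration of the core calculation the residents performed, the sketch below computes food-specific attack rates and a risk ratio from a retrospective cohort table; the counts are invented, not taken from the exercise.

```python
# Food-specific attack rates and risk ratio from a 2x2 cohort table
def attack_rate(ill: int, total: int) -> float:
    return ill / total

ill_exposed, total_exposed = 18, 25        # ate the suspect dish
ill_unexposed, total_unexposed = 3, 30     # did not eat it

ar_exp = attack_rate(ill_exposed, total_exposed)
ar_unexp = attack_rate(ill_unexposed, total_unexposed)
risk_ratio = ar_exp / ar_unexp

print(f"Attack rate (exposed):   {ar_exp:.0%}")
print(f"Attack rate (unexposed): {ar_unexp:.0%}")
print(f"Risk ratio: {risk_ratio:.1f}")
```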
COMPARISON OF USEPA FIELD SAMPLING METHODS FOR BENTHIC MACROINVERTEBRATE STUDIES
Two U.S. Environmental Protection Agency (USEPA) macroinvertebrate sampling protocols were compared in the Mid-Atlantic Highlands region. The Environmental Monitoring and Assessment Program (EMAP) wadeable streams protocol results in a single composite sample from nine transects...
APPLYING TOXICITY IDENTIFICATION PROCEDURES TO FIELD COLLECTED SEDIMENTS
Identification of specific causes of sediment toxicity can allow for much more focused risk assessment and management decision making. We have been developing toxicity identification evaluation (TIE) methods for contaminated sediments and focusing on three toxicant groups (ammoni...
NASA Astrophysics Data System (ADS)
Gu, Hui-Wen; Zhang, Shan-Hui; Wu, Bai-Chun; Chen, Wu; Wang, Jing-Bo; Liu, Yang
2018-07-01
Oil-field wastewaters contain high levels of polycyclic aromatic hydrocarbons (PAHs), which have to be analyzed to assess the environmental effects before discharge. In this work, a green fluorimetric detection method that combines excitation-emission matrix (EEM) fluorescence with the parallel factor analysis (PARAFAC) algorithm was developed for the first time to achieve direct and simultaneous determination of six U.S. EPA PAHs in two different kinds of complex oil-field wastewaters. Owing to the distinctive "second-order advantage", neither time-consuming sample pretreatments nor toxic organic reagents were involved in the determination. By using the environmentally friendly "mathematical separation" of PARAFAC, satisfactory quantitative results and reasonable spectral profiles for the six PAHs were successfully extracted from the total EEM signals of oil-field wastewaters without the need for chromatographic separation. The limits of detection of the six PAHs were in the range of 0.09-0.72 ng mL-1, and the average spiked recoveries were between (89.4 ± 4.8)% and (109.1 ± 5.8)%, with average relative predictive errors <2.93%. To further confirm the accuracy of the proposed method, the same batch of oil-field wastewater samples was analyzed by the recognized GC-MS method. A t-test demonstrated that no significant differences exist between the quantitative results of the two methods. Given its advantages of being green, fast, low-cost, and highly sensitive, the proposed method is expected to serve as an appealing alternative for multi-residue analysis of overlapped PAHs in complex wastewater samples.
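A minimal sketch of the PARAFAC "mathematical separation" step, assuming the tensorly library and a placeholder EEM tensor (samples × excitation × emission); the rank selection and non-negativity constraints used in practice are omitted.

```python
# Three-way PARAFAC decomposition of a stack of excitation-emission matrices.
# The data are random placeholders standing in for a real EEM tensor.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

eem_tensor = tl.tensor(np.random.rand(12, 40, 60))   # 12 samples, 40 ex, 60 em
weights, factors = parafac(eem_tensor, rank=6, n_iter_max=200)

scores, excitation_loadings, emission_loadings = factors
# 'scores' (12 x 6) is proportional to analyte signal per sample; the loading
# matrices give the resolved excitation and emission spectral profiles.
print(scores.shape, excitation_loadings.shape, emission_loadings.shape)
```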
Tomography reconstruction methods for damage diagnosis of wood structure in construction field
NASA Astrophysics Data System (ADS)
Qiu, Qiwen; Lau, Denvid
2018-03-01
The structural integrity of wood building elements plays a critical role in public safety, which requires effective methods for diagnosing internal damage inside the wood body. Conventionally, non-destructive testing (NDT) methods such as X-ray computed tomography, thermography, radar imaging reconstruction, ultrasonic tomography, nuclear magnetic imaging techniques, and sonic tomography have been used to obtain information about the internal structure of wood. In this paper, the applications, advantages and disadvantages of these traditional tomography methods are reviewed. Additionally, the present article gives an overview of a recently developed tomography approach that relies on the use of mechanical and electromagnetic waves for assessing the structural integrity of wood buildings. This tomography reconstruction method is believed to provide a more accurate, reliable, and comprehensive assessment of wood structural integrity.
State of the art on nuclear heating in a mixed (n/γ) field in research reactors
NASA Astrophysics Data System (ADS)
Amharrak, H.; Salvo, J. Di; Lyoussi, A.; Carette, M.; Reynard-Carette, C.
2014-06-01
This article aims at inventorying the knowledge on nuclear heating measurements in a mixed (n,γ) field in low-power research reactors using ThermoLuminescent Detectors (TLDs), Optically Stimulated Luminescent Detectors (OSLDs) and Ionization Chambers. The difficulty in measuring a mixed (n,γ) field in a reactor configuration lies in quantifying the contribution of the gamma photons and neutrons to the full signal measured by these detectors. The algorithms and experimental protocols developed together with the calculation methods used to assess the contribution of the neutron dose to the total integrated dose as measured by these detectors will be described in this article. This 'inventory' will be used to summarize the best methods to be used in relation to the requirements.
NASA Astrophysics Data System (ADS)
Damianos, D.; Vitrant, G.; Lei, M.; Changala, J.; Kaminski-Cachopo, A.; Blanc-Pelissier, D.; Cristoloveanu, S.; Ionica, I.
2018-05-01
In this work, we investigate Second Harmonic Generation (SHG) as a non-destructive characterization method for Silicon-On-Insulator (SOI) materials. For thick SOI stacks, the SHG signal is related to the thickness variations of the different layers. However, in thin SOI films, the comparison between measurements and optical modeling suggests a supplementary SHG contribution attributed to the electric fields at the SiO2/Si interfaces. The impact of the electric field at each interface of the SOI on the SHG is assessed. The SHG technique can be used to evaluate interfacial electric fields and consequently interface charge density in SOI materials.
This paper summarizes and discusses recently available U.S. and European information on ammonia (NH3) emissions from swine farms and assesses the applicability for general use in the United States. The emission rates for the swine barns calculated by various methods show g...
Nondestructive assessment of single-span timber bridges using a vibration- based method
Xiping Wang; James P. Wacker; Angus M. Morison; John W. Forsman; John R. Erickson; Robert J. Ross
2005-01-01
This paper describes an effort to develop a global dynamic testing technique for evaluating the overall stiffness of timber bridge superstructures. A forced vibration method was used to measure the natural frequency of single-span timber bridges in the laboratory and field. An analytical model based on simple beam theory was proposed to represent the relationship...
ERIC Educational Resources Information Center
Luera, Gail; Murray, Kent
2016-01-01
A mixed methods research approach was used to investigate the impact of a geosciences research institute upon 62 science teachers' knowledge, beliefs, and teaching practices related to teaching the geosciences. Pre- and postinstitute quantitative and qualitative assessments revealed mixed results. Results of a quantitative measure found a…
A Comparative Study on Electronic versus Traditional Data Collection in a Special Education Setting
ERIC Educational Resources Information Center
Ruf, Hernan Dennis
2012-01-01
The purpose of the current study was to determine the efficiency of an electronic data collection method compared to a traditional paper-based method in the educational field, in terms of the accuracy of data collected and the time required to do it. In addition, data were collected to assess users' preference and system usability. The study…
USDA-ARS?s Scientific Manuscript database
Structure-from-motion (SfM) photogrammetry from unmanned aircraft system (UAS) imagery is an emerging tool for repeat topographic surveying of dryland erosion. These methods are particularly appealing due to the ability to cover large landscapes compared to field methods and at reduced costs and hig...
A Feasibility Study of Using ICT in Iranian Secondary Schools: The Case of Tehran Province
ERIC Educational Resources Information Center
Fathi Vajargah, Kourosh; Saadattlab, Ayat
2014-01-01
This research presents the results of a feasibility assessment on implementing ICT in Tehran high schools. Mixed method research (both qualitative and quantitative) was employed and due to the nature of research, data collection included two stages: library and field study. Using the cluster method with 362 subjects, data was collected using a…
ERIC Educational Resources Information Center
Fraser, Renee White; Shani, Hadasa
Intended to assist Agency for International Development (AID) officers, advisors, and health officials in incorporating health planning into national plans for economic development, this second of ten manuals in the International Health Planning Methods Series deals with assessment, planning, and evaluation in the field of environmental health.…
Kleinmann, Joachim U; Wang, Magnus
2017-09-01
Spatial behavior is of crucial importance for the risk assessment of pesticides and for the assessment of effects of agricultural practice or multiple stressors, because it determines field use, exposure, and recovery. Recently, population models have increasingly been used to understand the mechanisms driving risk and recovery or to conduct landscape-level risk assessments. To include spatial behavior appropriately in population models for use in risk assessments, a new method, "probabilistic walk," was developed, which simulates the detailed daily movement of individuals by taking into account food resources, vegetation cover, and the presence of conspecifics. At each movement step, animals decide where to move next based on probabilities determined from this information. The model was parameterized to simulate populations of brown hares (Lepus europaeus). A detailed validation of the model demonstrated that it can realistically reproduce various natural patterns of brown hare ecology and behavior. Simulated proportions of time animals spent in fields (PT values) were also comparable to field observations. It is shown that these important parameters for the risk assessment may, however, vary in different landscapes. The results demonstrate the value of using population models to reduce uncertainties in risk assessment and to better understand which factors determine risk in a landscape context. Environ Toxicol Chem 2017;36:2299-2307. © 2017 SETAC.
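A minimal sketch of one "probabilistic walk" movement step as described above, with invented weights and cell attributes; the published model's actual scoring rules are not reproduced here.

```python
# At each step the animal chooses among neighbouring cells with probabilities
# derived from food, cover and conspecific presence.
import random

def choose_next_cell(cells, w_food=1.0, w_cover=0.5, w_conspecific=-0.8):
    """cells: list of dicts with 'food', 'cover', 'conspecifics' in [0, 1]."""
    scores = [
        max(1e-6, 1.0 + w_food * c["food"] + w_cover * c["cover"]
            + w_conspecific * c["conspecifics"])
        for c in cells
    ]
    total = sum(scores)
    probs = [s / total for s in scores]
    return random.choices(range(len(cells)), weights=probs, k=1)[0]

neighbours = [
    {"food": 0.9, "cover": 0.2, "conspecifics": 0.0},   # open field, rich food
    {"food": 0.3, "cover": 0.8, "conspecifics": 0.1},   # hedge, good cover
    {"food": 0.1, "cover": 0.1, "conspecifics": 0.9},   # crowded cell
]
print("chosen neighbour index:", choose_next_cell(neighbours))
```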
Efficient method for assessing channel instability near bridges
Robinson, Bret A.; Thompson, R.E.
1993-01-01
Efficient methods for data collection and processing are required to complete channel-instability assessments at 5,600 bridge sites in Indiana at an affordable cost and within a reasonable time frame while maintaining the quality of the assessments. To provide this needed efficiency and quality control, a data-collection form was developed that specifies the data to be collected and the order of data collection. This form represents a modification of previous forms that grouped variables according to type rather than by order of collection. Assessments completed during two field seasons showed that greater efficiency was achieved by using a fill-in-the-blank form that organizes the data to be recorded in a specified order: in the vehicle, from the roadway, in the upstream channel, under the bridge, and in the downstream channel.
Glaucoma management in patients with osteo-odonto-keratoprosthesis (OOKP): the Singapore OOKP Study.
Kumar, Rajesh S; Tan, Donald T H; Por, Yong-Ming; Oen, Francis T; Hoh, Sek-Tien; Parthasarathy, Anand; Aung, Tin
2009-01-01
To report diagnostic modalities and treatment options for glaucoma in eyes with osteo-odonto keratoprosthesis (OOKP). Eyes that underwent OOKP were evaluated for glaucoma at the time of the first postoperative visit, then at 1 and 3 months after the procedure, and thereafter every 6 months. All eyes underwent stereo-biomicroscopic optic nerve head (ONH) assessment, kinetic (Goldmann perimetry) and automated static visual field testing, ONH photography, Heidelberg retina tomograph, scanning laser polarimetry (GDx), and optical coherence tomography. Treatment of glaucoma was also reviewed. The average follow-up period was 19.1 (range: 5 to 31) months. Of the 15 eyes that underwent OOKP, 5 eyes had preexisting glaucoma. None of the other 10 eyes developed glaucoma after OOKP. ONH photography and visual field testing were the most reliable methods to assess the status of the disease, whereas Heidelberg retina tomograph and optical coherence tomography could be performed with reasonable reproducibility and quality; GDx imaging was poor. All patients with glaucoma were treated with oral acetazolamide 500 mg twice a day. Transscleral cyclophotocoagulation was performed in 3 eyes at stage 2 of OOKP surgery. Progression of glaucoma was noted in 2 eyes on the basis of optic disc photographs and automated perimetry. Visual field testing and optic disc assessment with optic disc photographs seem to be effective methods to monitor eyes with OOKP for glaucoma. Treatment strategies include oral medications to lower intraocular pressure and cyclophotocoagulation.
Pitcher, T J; Lam, M E; Ainsworth, C; Martindale, A; Nakamura, K; Perry, R I; Ward, T
2013-10-01
This paper reports recent developments in Rapfish, a normative, scalable and flexible rapid appraisal technique that integrates both ecological and human dimensions to evaluate the status of fisheries in reference to a norm or goal. Appraisal status targets may be sustainability, compliance with a standard (such as the UN code of conduct for responsible fisheries) or the degree of progress in meeting some other goal or target. The method combines semi-quantitative (e.g. ecological) and qualitative (e.g. social) data via multiple evaluation fields, each of which is assessed through scores assigned to six to 12 attributes or indicators; the scoring method allows user flexibility to adopt a wide range of utility relationships. For assessing sustainability, six evaluation fields have been developed: ecological, technological, economic, social, ethical and institutional. Each field can be assessed directly with a set of scored attributes, or several of the fields can be dealt with in greater detail using nested subfields that themselves comprise multidimensional Rapfish assessments (e.g. the hierarchical institutional field encompasses both governance and management, including a detailed analysis of legality). The user has the choice of including all or only some of the available sustainability fields. For the attributes themselves, there will rarely be quantitative data, but scoring allows these items to be estimated. Indeed, within a normative framework, one important advantage of Rapfish is the transparency of the rigour, quality and replicability of the scores. The Rapfish technique employs a constrained multidimensional ordination that is scaled to situate data points within evaluation space. Within each evaluation field, results may be presented as a two-dimensional plot or in a one-dimensional rank order. Uncertainty is expressed through the probability distribution of Monte-Carlo simulations that use the confidence limits (C.L.) on each original observation. Overall results of the multidisciplinary analysis may be shown using kite diagrams that compare different locations, time periods (including future projections) and management scenarios, which make policy trade-offs explicit. These enhancements are now available in the R programming language and on an open website, where users can run Rapfish analyses by downloading the software or uploading their data to a user interface. © 2013 The Authors. Journal of Fish Biology © 2013 The Fisheries Society of the British Isles.
Graphic comparison of reserve-growth models for conventional oil and gas accumulations
Klett, T.R.
2003-01-01
The U.S. Geological Survey (USGS) periodically assesses crude oil, natural gas, and natural gas liquids resources of the world. The assessment procedure requires estimated recoverable oil and natural gas volumes (field size, cumulative production plus remaining reserves) in discovered fields. Because initial reserves are typically conservative, subsequent estimates increase through time as these fields are developed and produced. The USGS assessment of petroleum resources makes estimates, or forecasts, of the potential additions to reserves in discovered oil and gas fields resulting from field development, and it also estimates the potential fully developed sizes of undiscovered fields. The term "reserve growth" refers to the commonly observed upward adjustment of reserve estimates. Because such additions are related to increases in the total size of a field, the USGS uses field sizes to model reserve growth. Future reserve growth in existing fields is a major component of remaining U.S. oil and natural gas resources and has therefore become a necessary element of U.S. petroleum resource assessments. Past and currently proposed reserve-growth models compared herein aid in the selection of a suitable set of forecast functions to provide an estimate of potential additions to reserves from reserve growth in the ongoing National Oil and Gas Assessment Project (NOGA). Reserve growth is modeled by construction of a curve that represents annual fractional changes of recoverable oil and natural gas volumes (for fields and reservoirs), which provides growth factors. Growth factors are used to calculate forecast functions, which are sets of field- or reservoir-size multipliers. Comparisons of forecast functions were made based on datasets used to construct the models, field type, modeling method, and length of forecast span. Comparisons were also made between forecast functions based on field-level and reservoir-level growth, and between forecast functions based on older and newer data. The reserve-growth model used in the 1995 USGS National Assessment and the model currently used in the NOGA project provide forecast functions that yield similar estimates of potential additions to reserves. Both models are based on the Oil and Gas Integrated Field File from the Energy Information Administration (EIA), but different vintages of data (from 1977 through 1991 and 1977 through 1996, respectively). The model based on newer data can be used in place of the previous model, providing similar estimates of potential additions to reserves. Forecast functions for oil fields vary little from those for gas fields in these models; therefore, a single function may be used for both oil and gas fields, like that used in the USGS World Petroleum Assessment 2000. Forecast functions based on the field-level reserve-growth model derived from the NRG Associates databases (from 1982 through 1998) differ from those derived from EIA databases (from 1977 through 1996). However, the difference may not be enough to preclude the use of the forecast functions derived from NRG data in place of the forecast functions derived from EIA data. Should the model derived from NRG data be used, separate forecast functions for oil fields and gas fields must be employed. The forecast function for oil fields from the model derived from NRG data varies significantly from that for gas fields, and a single function for both oil and gas fields may not be appropriate.
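As a hedged illustration of how annual growth factors become a forecast function of field-size multipliers, the sketch below takes the cumulative product of assumed annual factors; the values are invented and are not the USGS or EIA figures.

```python
# Cumulative product of annual growth factors gives the multiplier by which
# an initial reserve estimate is expected to grow after n years.
import numpy as np

annual_growth_factors = np.array([1.12, 1.08, 1.06, 1.05, 1.04, 1.03, 1.02])

forecast_function = np.cumprod(annual_growth_factors)   # multiplier after year n
initial_field_size_mmboe = 50.0

for year, multiplier in enumerate(forecast_function, start=1):
    print(f"year {year}: multiplier {multiplier:.3f} -> "
          f"estimated size {initial_field_size_mmboe * multiplier:.1f} MMBOE")
```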
Hinderer, Svenja; Brauchle, Eva
2015-01-01
Current clinically applicable tissue and organ replacement therapies are limited in the field of cardiovascular regenerative medicine. The available options do not regenerate damaged tissues and organs, and, in the majority of the cases, show insufficient restoration of tissue function. To date, anticoagulant drug‐free heart valve replacements or growing valves for pediatric patients, hemocompatible and thrombus‐free vascular substitutes that are smaller than 6 mm, and stem cell‐recruiting delivery systems that induce myocardial regeneration are still only visions of researchers and medical professionals worldwide and far from being the standard of clinical treatment. The design of functional off‐the‐shelf biomaterials as well as automatable and up‐scalable biomaterial processing methods are the focus of current research endeavors and of great interest for fields of tissue engineering and regenerative medicine. Here, various approaches that aim to overcome the current limitations are reviewed, focusing on biomaterials design and generation methods for myocardium, heart valves, and blood vessels. Furthermore, novel contact‐ and marker‐free biomaterial and extracellular matrix assessment methods are highlighted. PMID:25778713
A new four-step hierarchy method for combined assessment of groundwater quality and pollution.
Zhu, Henghua; Ren, Xiaohua; Liu, Zhizheng
2017-12-28
A new four-step hierarchy method was constructed and applied to evaluate the groundwater quality and pollution of the Dagujia River Basin. The assessment index system is divided into four types: field test indices, common inorganic chemical indices, inorganic toxicology indices, and trace organic indices. Background values of the common inorganic chemical indices and inorganic toxicology indices were estimated with the cumulative-probability curve method, and the results showed that the background values of Mg2+ (51.1 mg L-1), total hardness (TH) (509.4 mg L-1), and NO3- (182.4 mg L-1) are all higher than the corresponding grade III values of the Quality Standard for Groundwater, indicating that they were poor indicators and therefore were not included in the groundwater quality assessment. The quality assessment results showed that the field test indices were mainly classified as grade II, accounting for 60.87% of wells sampled. The common inorganic chemical and inorganic toxicology indices were both mostly in the range of grade III, whereas the trace organic indices were predominantly classified as grade I. The variabilities and excess ratios of the indices were also calculated and evaluated. Spatial distributions showed that groundwater with poor quality indices was mainly located in the northeast of the basin, which was well connected with seawater intrusion. Additionally, the pollution assessment revealed that groundwater in well 44 was classified as "moderately polluted," wells 5 and 8 were "lightly polluted," and the other wells were classified as "unpolluted."
Shao, Feng; Li, Kemeng; Lin, Weisi; Jiang, Gangyi; Yu, Mei; Dai, Qionghai
2015-10-01
Quality assessment of 3D images encounters more challenges than its 2D counterparts. Directly applying 2D image quality metrics is not the solution. In this paper, we propose a new full-reference quality assessment for stereoscopic images by learning binocular receptive field properties to be more in line with human visual perception. To be more specific, in the training phase, we learn a multiscale dictionary from the training database, so that the latent structure of images can be represented as a set of basis vectors. In the quality estimation phase, we compute sparse feature similarity index based on the estimated sparse coefficient vectors by considering their phase difference and amplitude difference, and compute global luminance similarity index by considering luminance changes. The final quality score is obtained by incorporating binocular combination based on sparse energy and sparse complexity. Experimental results on five public 3D image quality assessment databases demonstrate that in comparison with the most related existing methods, the devised algorithm achieves high consistency with subjective assessment.
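An illustrative sketch, not the authors' exact formulation, of a similarity index between sparse coefficient vectors that rewards both amplitude agreement and matching sign (phase); the stabilizing constant and the synthetic coefficients are assumptions.

```python
import numpy as np

def sparse_feature_similarity(a: np.ndarray, b: np.ndarray, C: float = 1e-3) -> float:
    # Amplitude agreement per coefficient (SSIM-style ratio) gated by sign match
    amplitude = (2.0 * np.abs(a) * np.abs(b) + C) / (a ** 2 + b ** 2 + C)
    phase = (np.sign(a) == np.sign(b)).astype(float)
    return float(np.mean(amplitude * phase))

rng = np.random.default_rng(0)
ref_coeffs = rng.normal(size=256) * (rng.random(256) < 0.1)     # sparse reference
dist_coeffs = ref_coeffs + rng.normal(scale=0.05, size=256)     # distorted copy
print(f"similarity = {sparse_feature_similarity(ref_coeffs, dist_coeffs):.3f}")
```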
Risk assessment of tropical cyclone rainfall flooding in the Delaware River Basin
NASA Astrophysics Data System (ADS)
Lu, P.; Lin, N.; Smith, J. A.; Emanuel, K.
2016-12-01
Rainfall-induced inland flooding is a leading cause of death, injury, and property damage from tropical cyclones (TCs). In the context of climate change, it has been shown that extreme precipitation from TCs is likely to increase during the 21st century. Assessing the long-term risk of inland flooding associated with landfalling TCs is therefore an important task. Standard risk assessment techniques, which are based on observations from rain gauges and stream gauges, are not broadly applicable to TC induced flooding, since TCs are rare, extreme events with very limited historical observations at any specific location. Also, rain gauges and stream gauges can hardly capture the complex spatial variation of TC rainfall and flooding. Furthermore, the utility of historically based assessments is compromised by climate change. Regional dynamical downscaling models can resolve many features of TC precipitation. In terms of risk assessment, however, it is computationally demanding to run such models to obtain long-term climatology of TC induced flooding. Here we apply a computationally efficient climatological-hydrological method to assess the risk of inland flooding associated with landfalling TCs. It includes: 1) a deterministic TC climatology modeling method to generate large numbers of synthetic TCs with physically correlated characteristics (i.e., track, intensity, size) under observed and projected climates; 2) a simple physics-based tropical cyclone rainfall model which is able to simulate rainfall fields associated with each synthetic storm; 3) a hydrologic modeling system that takes in rainfall fields to simulate flood peaks over an entire drainage basin. We will present results of this method applied to the Delaware River Basin in the mid-Atlantic US.
Assessing Energy Efficiency Opportunities in US Industrial and Commercial Building Motor Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Prakash; Sheaffer, Paul; McKane, Aimee
2015-09-01
In 2002, the United States Department of Energy (USDOE) published an energy efficiency assessment of U.S. industrial sector motor systems titled United States Industrial Electric Motor Systems Market Opportunities Assessment. The assessment advanced motor system efficiency by providing a greater understanding of the energy consumption, use characteristics, and energy efficiency improvement potential of industrial sector motor systems in the U.S. Since 2002, regulations such as Minimum Energy Performance Standards, cost reductions for motor system components such as variable frequency drives, system-integrated motor-driven equipment, and awareness programs for motor system energy efficiency have changed the landscape of U.S. motor system energy consumption. To capture the new landscape, the USDOE has initiated a three-year Motor System Market Assessment (MSMA), led by Lawrence Berkeley National Laboratory (LBNL). The MSMA will assess the energy consumption, operational and maintenance characteristics, and efficiency improvement opportunity of U.S. industrial sector and commercial building motor systems. As part of the MSMA, a significant effort is currently underway to conduct field assessments of motor systems from a sample of facilities representative of U.S. commercial and industrial motor system energy consumption. The Field Assessment Plan used for these assessments builds on recent LBNL research presented at EEMODS 2011 and EEMODS 2013 using methods for characterizing and determining regional motor system energy efficiency opportunities. This paper provides an update on the development and progress of the MSMA, focusing on the Field Assessment Plan and the framework for assessing the global supply chain for emerging motors and drive technologies.
Destounis, Stamatia; Arieno, Andrea; Morgan, Renee; Roberts, Christina; Chan, Ariane
2017-01-01
Mammographic breast density (MBD) has been proven to be an important risk factor for breast cancer and an important determinant of mammographic screening performance. The measurement of density has changed dramatically since its inception. Initial qualitative measurement methods have been found to have limited consistency between readers, and in regards to breast cancer risk. Following the introduction of full-field digital mammography, more sophisticated measurement methodology is now possible. Automated computer-based density measurements can provide consistent, reproducible, and objective results. In this review paper, we describe various methods currently available to assess MBD, and provide a discussion on the clinical utility of such methods for breast cancer screening. PMID:28561776
Automated navigation assessment for earth survey sensors using island targets
NASA Technical Reports Server (NTRS)
Patt, Frederick S.; Woodward, Robert H.; Gregg, Watson W.
1997-01-01
An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalog of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean color sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.
Validation of Rapid Radiochemical Method for Californium ...
Technical Brief: In the event of a radiological/nuclear contamination incident, the response community would need tools and methodologies to rapidly assess the nature and extent of contamination. To characterize a radiologically contaminated outdoor area and to inform risk assessment, large numbers of environmental samples would be collected and analyzed over a short period of time. To address the challenge of quickly providing analytical results to the field, the U.S. EPA developed a robust analytical method. This method allows response officials to characterize contaminated areas and to assess the effectiveness of remediation efforts, both rapidly and accurately, in the intermediate and late phases of environmental cleanup. Improvement in sample processing and analysis leads to increased laboratory capacity to handle the analysis of a large number of samples following the intentional or unintentional release of a radiological/nuclear contaminant.
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
Validity and inter-observer reliability of subjective hand-arm vibration assessments.
Coenen, Pieter; Formanoy, Margriet; Douwes, Marjolein; Bosch, Tim; de Kraker, Heleen
2014-07-01
Exposure to mechanical vibrations at work (e.g., due to handling powered tools) is a potential occupational risk as it may cause upper extremity complaints. However, reliable and valid assessment methods for vibration exposure at work are lacking. Measuring hand-arm vibration objectively is often difficult and expensive, while the frequently used information provided by manufacturers lacks detail. Therefore, a subjective hand-arm vibration assessment method was tested for validity and inter-observer reliability. In an experimental protocol, sixteen tasks handling powered tools were executed by two workers. Hand-arm vibration was assessed subjectively by 16 observers according to the proposed subjective assessment method. As a gold-standard reference, hand-arm vibration was measured objectively using a vibration measurement device. Weighted κ's were calculated to assess validity, and intra-class correlation coefficients (ICCs) were calculated to assess inter-observer reliability. Inter-observer reliability of the subjective assessments, depicting the agreement among observers, can be expressed by an ICC of 0.708 (0.511-0.873). The validity of the subjective assessments compared to the gold-standard reference can be expressed by a weighted κ of 0.535 (0.285-0.785). In addition, the percentage of exact agreement of the subjective assessment compared to the objective measurement was relatively low (i.e., 52% of all tasks). This study shows that subjectively assessed hand-arm vibrations are fairly reliable among observers and moderately valid. This assessment method is a first attempt to use subjective risk assessments of hand-arm vibration. Although this assessment method could benefit from future improvement, it can be of use in future studies and in field-based ergonomic assessments. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
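A sketch of the two agreement statistics reported above, using standard tools and hypothetical ordinal ratings: a linearly weighted Cohen's kappa for validity and a one-way ICC(1,1) computed from its ANOVA definition for inter-observer reliability.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Linearly weighted kappa: one observer's ratings vs. the instrument-based
# exposure categories (hypothetical ordinal values, not the study data).
objective_ref = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]
observer_a    = [0, 1, 2, 3, 3, 1, 1, 2, 2, 1]
kappa_w = cohen_kappa_score(objective_ref, observer_a, weights="linear")
print(f"weighted kappa = {kappa_w:.3f}")

# One-way random-effects ICC(1,1) from an observers-by-tasks rating matrix
ratings = np.array([[0, 1, 2, 3], [1, 1, 2, 3], [0, 2, 3, 3]], dtype=float)  # 3 observers x 4 tasks
n_obs, n_tasks = ratings.shape
task_means = ratings.mean(axis=0)
msb = n_obs * np.var(task_means, ddof=1)                               # between-task mean square
msw = np.sum((ratings - task_means) ** 2) / (n_tasks * (n_obs - 1))    # within-task mean square
icc_1_1 = (msb - msw) / (msb + (n_obs - 1) * msw)
print(f"ICC(1,1) = {icc_1_1:.3f}")
```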
2015-06-07
"Field-Portable Gas Chromatograph-Mass Spectrometer." Forensic Toxicol, 2006, 24, 17-22. Smith, P. "Person-Portable Gas Chromatography: Rapid Temperature…" …bench-top Gas Chromatograph-Mass Spectrometer (GC-MS) system (ISQ). Nine sites were sampled and analyzed for compounds using Environmental Protection… extraction methods for Liquid Chromatography-MS (LC-MS). Additionally, TD is approximately 1000X more sensitive and requires minimal sample preparation.
The Use of Terrestrial Laser Scanning for Determining the Driver’s Field of Vision
Zemánek, Tomáš; Cibulka, Miloš; Skoupil, Jaromír
2017-01-01
Terrestrial laser scanning (TLS) is currently one of the most progressively developed methods for obtaining information about objects and phenomena. This paper assesses the possibilities of TLS for determining the driver's field of vision when operating agricultural and forest machines with movable and immovable components, in comparison to the method of using two point light sources to create shade images according to ISO (International Organization for Standardization) 5721-1. Using the TLS method represents a minimum time saving of 55% or more, depending on project complexity. The values of shading ascertained using the shadow cast by the point light sources are generally overestimated and more distorted for small cabin structural components. The disadvantage of the TLS method is the scanner's sensitivity to a soiled or scratched cabin windscreen and to glass transparency impaired by heavy tinting. PMID:28902177
Muhs, Daniel
2017-01-01
Dune fields of Quaternary age occupy large areas of the world's arid and semiarid regions. Despite this, there has been surprisingly little work done on understanding dune sediment provenance, in part because many techniques are time-consuming, prone to operator error, experimental, highly specialized, expensive, or require sophisticated instrumentation. Provenance of dune sand using K/Rb and K/Ba values in K-feldspar in aeolian sands of the arid and semiarid regions of North America is tested here. Results indicate that K/Rb and K/Ba can distinguish different river sands that are sediment sources for dunes and dune fields themselves have distinctive K/Rb and K/Ba compositions. Over the Basin and Range and Great Plains regions of North America, the hypothesized sediment sources of dune fields are reviewed and assessed using K/Rb and K/Ba values in dune sands and in hypothesized source sediments. In some cases, the origins of dunes assessed in this manner are consistent with previous studies and in others, dune fields are found to have a more complex origin than previously thought. Use of K/Rb and K/Ba for provenance studies is a robust method that is inexpensive, rapid, and highly reproducible. It exploits one of the most common minerals found in dune sand, K-feldspar. The method avoids the problem of using simple concentrations of key elements that may be subject to interpretative bias due to changes in mineralogical maturity of Quaternary dune fields that occur over time.
RESULTS OF APPLYING TOXICITY IDENTIFICATION PROCEDURES TO FIELD COLLECTED SEDIMENTS
Identification of specific causes of sediment toxicity can allow for much more focused risk assessment and management decision making. We have been developing toxicity identification evaluation TIE) methods for contaminated sediments and are focusing on three toxicant groups (amm...
On-Tree Mango Fruit Size Estimation Using RGB-D Images
Wang, Zhenglin; Verma, Brijesh
2017-01-01
In-field mango fruit sizing is useful for estimation of fruit maturation and size distribution, informing the decision to harvest, harvest resourcing (e.g., tray insert sizes), and marketing. In-field machine vision imaging has been used for fruit count, but assessment of fruit size from images also requires estimation of camera-to-fruit distance. Low-cost examples of three technologies for assessment of camera-to-fruit distance were evaluated: an RGB-D (depth) camera, a stereo vision camera and a Time of Flight (ToF) laser rangefinder. The RGB-D camera was recommended on cost and performance, although it functioned poorly in direct sunlight. The RGB-D camera was calibrated, and depth information was matched to the RGB image. To detect fruit, cascade detection with a histogram of oriented gradients (HOG) feature was used; then Otsu's method, followed by color thresholding in the CIE L*a*b* color space, was applied to remove background objects (leaves, branches etc.). A one-dimensional (1D) filter was developed to remove the fruit pedicels, and an ellipse fitting method was employed to identify well-separated fruit. Finally, fruit lineal dimensions were calculated using the RGB-D depth information, fruit image size and the thin lens formula. Root Mean Square Errors (RMSE) of 4.9 and 4.3 mm were achieved for estimated fruit length and width, respectively, relative to manual measurement, for which repeated human measures were characterized by a standard deviation of 1.2 mm. In conclusion, the RGB-D method for rapid in-field mango fruit size estimation is practical in terms of cost and ease of use, but cannot be used in direct intense sunshine. We believe this work represents the first practical implementation of machine vision fruit sizing in the field, with practicality gauged in terms of cost and simplicity of operation. PMID:29182534
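A minimal sketch of the final sizing step, converting a fruit's pixel extent to a lineal dimension with the thin-lens relation and the per-pixel RGB-D depth; the camera parameters and pixel measurement below are placeholders, not the study's calibration.

```python
# object size = image size on sensor * (object distance / focal length)
def fruit_dimension_mm(extent_px: float, depth_mm: float,
                       focal_length_mm: float, pixel_pitch_mm: float) -> float:
    image_size_mm = extent_px * pixel_pitch_mm          # size on the sensor
    return image_size_mm * depth_mm / focal_length_mm   # back-projected size

length = fruit_dimension_mm(extent_px=140, depth_mm=900,
                            focal_length_mm=3.6, pixel_pitch_mm=0.0028)
print(f"estimated fruit length ≈ {length:.1f} mm")
```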
NASA Technical Reports Server (NTRS)
Aljabri, Abdullah S.
1988-01-01
High speed subsonic transports powered by advanced propellers provide significant fuel savings compared to turbofan powered transports. Unfortunately, however, propfans must operate in aircraft-induced nonuniform flow fields which can lead to high blade cyclic stresses, vibration and noise. To optimize the design and installation of these advanced propellers, therefore, detailed knowledge of the complex flow field is required. As part of the NASA Propfan Test Assessment (PTA) program, a 1/9 scale semispan model of the Gulfstream II propfan test-bed aircraft was tested in the NASA-Lewis 8 x 6 supersonic wind tunnel to obtain propeller flow field data. Detailed radial and azimuthal surveys were made to obtain the total pressure in the flow and the three components of velocity. Data were acquired for Mach numbers ranging from 0.6 to 0.85. Analytical predictions were also made using a subsonic panel method, QUADPAN. Comparison of wind-tunnel measurements and analytical predictions shows good agreement throughout the Mach range.
Bitton, Gabriel; Koopman, Ben
1982-01-01
A method was developed to assess the activity of filamentous bacteria in activated sludge. It involves the incubation of activated sludge with 2(p-iodophenyl)-3-(p-nitrophenyl)-5-phenyl tetrazolium chloride followed by staining with malachite green. Both cells and 2(p-iodophenyl)-3-(p-nitrophenyl)-5-phenyl tetrazolium chloride-formazan crystals can be observed in prepared specimens by using bright-field microscopy. This procedure allowed us to distinguish between inactive and actively metabolizing filaments after chlorine application to control the bulking of activated sludge. PMID:16345999
Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.
2005-01-01
Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In contrast, FEMA Flood Insurance Rate Maps (FIRMs) based on the FAN model predict uniformly high flood risk across the study areas without regard for small-scale topography and surficial geology. © 2005 Geological Society of America.
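As a hedged illustration of the flow relation in the hydraulic model described above, the sketch below applies Manning's equation V = (1/n)·R^(2/3)·S^(1/2) to a wide, shallow cell where the hydraulic radius is approximated by the flow depth; the values are illustrative only.

```python
def manning_velocity(depth_m: float, slope: float, n: float) -> float:
    """Mean velocity (m/s) for a wide shallow flow cell (R ~ depth), SI units."""
    return (1.0 / n) * depth_m ** (2.0 / 3.0) * slope ** 0.5

def unit_discharge(depth_m: float, slope: float, n: float) -> float:
    """Discharge per unit width (m^2/s) = depth * velocity."""
    return depth_m * manning_velocity(depth_m, slope, n)

print(f"q ≈ {unit_discharge(depth_m=0.3, slope=0.01, n=0.035):.3f} m²/s per metre width")
```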
Quinn, Terence J; Livingstone, Iain; Weir, Alexander; Shaw, Robert; Breckenridge, Andrew; McAlpine, Christine; Tarbert, Claire M
2018-01-01
Visual impairment affects up to 70% of stroke survivors. We designed an app (StrokeVision) to facilitate screening for common post-stroke visual issues (acuity, visual fields, and visual inattention). We sought to describe the test time, feasibility, acceptability, and accuracy of our app-based digital visual assessments against (a) current methods used for bedside screening and (b) gold-standard measures. Patients were prospectively recruited from acute stroke settings. Index tests were app-based assessments of fields and inattention performed by a trained researcher. We compared against usual clinical screening practice of visual fields to confrontation, including inattention assessment (simultaneous stimuli). We also compared the app to gold-standard assessments of formal kinetic perimetry (Goldmann or Octopus visual field assessment) and pencil-and-paper tests of inattention (Albert's, Star Cancellation, and Line Bisection). Results of inattention and field tests were adjudicated by a specialist neuro-ophthalmologist. All assessors were masked to each other's results. Participants and assessors graded acceptability using a bespoke scale that ranged from 0 (completely unacceptable) to 10 (perfect acceptability). Of 48 stroke survivors recruited, the complete battery of index and reference tests for fields was successfully completed in 45. Similar acceptability scores were observed for app-based [assessor median score 10 (IQR: 9-10); patient 9 (IQR: 8-10)] and traditional bedside testing [assessor 10 (IQR: 9-10); patient 10 (IQR: 9-10)]. Median test time was longer for app-based testing [combined time to completion of all digital tests 420 s (IQR: 390-588)] when compared with conventional bedside testing [70 s (IQR: 40-70)], but shorter than gold-standard testing [1,260 s (IQR: 1005-1,620)]. Compared with gold-standard assessments, usual screening practice demonstrated 79% sensitivity and 82% specificity for detection of a stroke-related field defect. This compares with 79% sensitivity and 88% specificity for the StrokeVision digital assessment. StrokeVision shows promise as a screening tool for visual complications in the acute phase of stroke. The app is at least as good as usual screening and offers other functionality that may make it attractive for use in acute stroke. https://ClinicalTrials.gov/ct2/show/NCT02539381.
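A short sketch of how the screening-accuracy figures quoted above are obtained from a 2×2 table of index-test results against the gold-standard adjudication; the counts are hypothetical.

```python
# Sensitivity and specificity from a 2x2 confusion table
def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int):
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=11, fp=4, fn=3, tn=27)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```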
Field application of pathogen detection technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straub, Tim M.; Call, Douglas R.; Bruckner-Lea, Cindy J.
Over the last 10 years there has been a significant increase in commercial products designed for field-based detection of microbial pathogens. This is due, in part, to the anthrax attacks in the United States in 2001, and the need for first responders to quickly identify the composition of suspected white powders and other potential biothreats. Demand for rapid detection is also driven by the need to ensure safe food, water, and environmental systems. From a technology perspective, rapid identification methods have largely capitalized on PCR and other molecular recognition techniques that can be deployed as robust field instrumentation. Examples of the relevant needs include the ability to: 1) declare a water distribution system free of microbial pathogens after a pipe/main break repair; 2) assess risks of contamination such as when produce production and processing plants are located near concentrated animal feeding operations; 3) evaluate the safety of ready-to-eat products; 4) determine the extent of potential serious disease outbreaks in remote and/or disaster stricken areas where access to clinical laboratories is not an immediate option; and 5) quickly assess credible biological terrorism events. Many of the principles underlying rapid detection methods are derived from methods for environmental microbiology, but there is a dearth of literature describing and evaluating field-based detection systems. Thus, the aims of this chapter are to: 1) summarize the different kinds of commercially available sampling kits and field-based biological detectors; 2) highlight some of the continued challenges of sample preparation to stimulate new research towards minimizing the impact of inhibitors on PCR-based detection systems; 3) describe our general rationale and statistically-based approach for instrument evaluation; 4) provide statistical and spatial guidelines for developing valid sampling plans; and 5) summarize some current needs and emerging technologies. This information is presented both to highlight the state of the field, and to also highlight major questions that students may wish to consider investigating further. Where possible we will cite studies that have been conducted and published either in traditional peer-reviewed or other literature (e.g., AOAC International Methods).
NASA Astrophysics Data System (ADS)
Wakila, M. H.; Saepuloh, A.; Heriawan, M. N.; Susanto, A.
2016-09-01
Geothermal exploration and production are currently being intensively conducted at certain areas in Indonesia such as the Wayang Windu Geothermal Field (WWGF) in West Java, Indonesia. The WWGF covers a wide area of about 40 km2. An accurate method to map the distribution of heterogeneous minerals is necessary for wide areas such as the WWGF. Mineral mapping is an important method in geothermal exploration to determine the distribution of minerals that indicate the surface manifestations of a geothermal system. This study is aimed at determining the most precise and accurate methods for mineral mapping at a geothermal field. Field measurements were performed to assess the accuracy of three proposed methods: 1) Minimum Noise Fraction (MNF), a linear transformation method to eliminate the correlation among the spectral bands and to reduce the noise in the data; 2) Pixel Purity Index (PPI), a method designed to find the most extreme spectral pixels and their characteristics due to end-member mixing; 3) Spectral Angle Mapper (SAM), an image classification technique that measures the spectral similarity between an unknown object and a spectral reference in n dimensions. The output of these methods was the occurrence and distribution of minerals. The performance of each mapping method was analyzed based on the ground truth data. Among the three proposed methods, SAM classification is the most appropriate and accurate for mapping the spatial distribution of alteration minerals.
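A minimal sketch of the SAM decision rule referred to above: each pixel spectrum is assigned to the end-member with the smallest spectral angle. The spectra below are random placeholders for real image and library spectra.

```python
import numpy as np

def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
    # Angle (radians) between the pixel spectrum and a reference spectrum
    cos_theta = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

rng = np.random.default_rng(1)
pixel_spectrum = rng.random(50)                       # one pixel, 50 bands
endmembers = {"kaolinite": rng.random(50), "alunite": rng.random(50)}

angles = {name: spectral_angle(pixel_spectrum, ref) for name, ref in endmembers.items()}
print("best match:", min(angles, key=angles.get), angles)
```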
Gulf Coast Disaster Management: Forest Damage Detection and Carbon Flux Estimation
NASA Astrophysics Data System (ADS)
Maki, A. E.; Childs, L. M.; Jones, J.; Matthews, C.; Spindel, D.; Batina, M.; Malik, S.; Allain, M.; Brooks, A. O.; Brozen, M.; Chappell, C.; Frey, J. W.
2008-12-01
Along the Gulf Coast and Eastern Seaboard, tropical storms and hurricanes annually cause defoliation and deforestation among coastal forests. After a severe storm clears, there is an urgent need to assess impacts on timber resources for targeting state and national resources to assist in recovery. It is important to identify damaged areas following the storm, due to their increased probability of fire risk, as well as their effect upon the carbon budget. Better understanding and management of the immediate and future effects on the carbon cycle in the coastal forest ecosystem is especially important. Current methods of detection involve assessment through ground-based field surveys, aerial surveys, computer modeling of meteorological data, space-borne remote sensing, and Forest Inventory and Analysis field plots. Introducing remotely sensed data from NASA and NASA-partnered Earth Observation Systems (EOS), this project seeks to improve the current methodology and focuses on a need for methods that are more synoptic than field surveys and more closely linked to the phenomenology of tree loss and damage than passive remote sensing methods. The primary concentration is on the utilization of Ice, Cloud, and land Elevation Satellite (ICESat) Geoscience Laser Altimeter System (GLAS) data products to detect changes in forest canopy height as an indicator of post-hurricane forest disturbances. By analyzing ICESat data over areas affected by Hurricane Katrina, this study shows that ICESat is a useful method of detecting canopy height change, though further research is needed in mixed forest areas. Other EOS utilized in this study include Landsat, the Moderate Resolution Imaging Spectroradiometer (MODIS), and the NASA verified and validated international Advanced Wide Field Sensor (AWiFS). This study addresses how NASA could apply ICESat data to contribute to an improved method of detecting hurricane-caused forest damage in coastal areas, and thus to pinpoint areas more susceptible to fire damage and subsequent loss of carbon sequestration.
Qualitative risk assessment during polymer mortar test specimens preparation - methods comparison
NASA Astrophysics Data System (ADS)
Silva, F.; Sousa, S. P. B.; Arezes, P.; Swuste, P.; Ribeiro, M. C. S.; Baptista, J. S.
2015-05-01
Polymer binder modification with inorganic nanomaterials (NM) could be a potential and efficient solution to control the matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposures can occur all along the life cycle of a NM and of "nanoproducts", from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory-scale production process was divided into 3 main phases (pre-production, production and post-production), which allowed testing the assessment methods in different situations. The risk assessment involved in the manufacturing process of PM was made using qualitative analyses based on: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); the Guidance working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro, Italy method (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. It was verified that the different methods applied produce different final results. In phases 1 and 3 the risk tends to be classified as medium-high, while for phase 2 the more common result is a medium level. It is necessary to improve the use of qualitative methods by defining narrow criteria for method selection for each assessed situation, bearing in mind that uncertainties are also a relevant factor when dealing with risk in the nanotechnology field.
Lavoie, Jacques; Marchand, Geneviève; Cloutier, Yves; Lavoué, Jérôme
2011-08-01
Dust accumulation in the components of heating, ventilation, and air-conditioning (HVAC) systems is a potential source of contaminants. To date, very little information is available on recognized methods for assessing dust buildup in these systems. The few existing methods are either objective in nature, involving numerical values, or subjective in nature, based on experts' judgments. An earlier project aimed at assessing different methods of sampling dust in ducts was carried out in the laboratories of the Institut de recherche Robert-Sauvé en santé et en sécurité du travail (IRSST). This laboratory study showed that all the sampling methods were practicable, provided that a specific surface-dust cleaning initiation criterion was used for each method. However, these conclusions were reached under ideal conditions in a laboratory using a reference dust. The objective of the present study was to validate these laboratory results in the field. To this end, the laboratory sampling templates were replicated in real ducts and the three sampling methods (the IRSST method, the method of the U.S. organization National Air Duct Cleaners Association [NADCA], and that of the French organization Association pour la Prévention et l'Étude de la Contamination [ASPEC]) were used simultaneously in a statistically representative number of systems. The air return and supply ducts were also compared. Cleaning initiation criteria under real conditions were found to be 6.0 mg/100 cm² using the IRSST method, 2.0 mg/100 cm² using the NADCA method, and 23 mg/100 cm² using the ASPEC method. In the laboratory study, the criteria for the same methods were 6.0 mg/100 cm² for the IRSST method, 2.0 mg/100 cm² for the NADCA method, and 3.0 mg/100 cm² for the ASPEC method. The laboratory criteria for the IRSST and NADCA methods were therefore validated in the field; the ASPEC criterion was the only one to change. The ASPEC method therefore allows for the most accurate evaluation of dust accumulation in HVAC ductwork, and we recommend using it to objectively assess dust accumulation levels in HVAC ductwork.
Rapid in situ assessment for predicting soil quality using an algae-soaked disc seeding assay.
Nam, Sun-Hwa; Moon, Jongmin; Kim, Shin Woong; Kim, Hakyeong; Jeong, Seung-Woo; An, Youn-Joo
2017-11-16
The soil quality of remediated land is altered, and such land consequently exerts unexpected biological effects on terrestrial organisms. Field evaluation of such land should therefore be conducted using biological indicators. Algae are a promising new biological indicator since they are a food source for organisms at higher soil trophic levels and are easily sampled from soil. Field evaluation of soil characteristics is preferable to laboratory testing because many biological effects cannot be duplicated in laboratory evaluations. Herein, we describe a convenient and rapid algae-soaked disc seeding assay for assessing soil quality in the field based on soil algae. Collection of the algae is easy and rapid, and the method predicts the short-term quality of contaminated, remediated, and amended farm and paddy soils. The algae-soaked disc seeding assay has yet to be extensively evaluated, and the method cannot be applied to loamy sand soil in in situ evaluations. Nevertheless, the assay is recommended for prediction of soil quality in in situ evaluations because it reflects variations in the environment, and it will help to develop management strategies for in situ evaluation.
Cao, Wenlong; Vaddella, Venkata; Biswas, Sagor; Perkins, Katherine; Clay, Cameron; Wu, Tong; Zheng, Yawen; Ndegwa, Pius; Pandey, Pramod
2016-11-01
Vermicomposting (VC) has proven to be a promising method for treating garden, household, and municipal wastes. Although VC has been used extensively for converting wastes into fertilizers, the survival of pathogens such as Escherichia coli (E. coli) during this process is not well documented. In this study, both lab- and field-scale experiments were conducted to assess the impact of earthworms on reducing E. coli concentrations during VC of food waste. In addition, other pertinent parameters such as temperature, carbon and nitrogen content, moisture content, pH, volatile solids, micronutrients (P, K, Ca, Mg, and S), and heavy metals (Zn, Mn, Fe, and Cu) were monitored during the study. The lab- and field-scale experiments were conducted for 107 and 103 days, respectively. The carbon to nitrogen ratio (C/N) decreased by 54% in the lab-scale study and by 36% in the field study. Results showed that VC was not significantly effective in reducing E. coli levels in food waste under either lab- or field-scale settings.
Ahmed, Mavra; Mandic, Iva; Lou, Wendy; Goodman, Len; Jacobs, Ira; L'Abbé, Mary R
2017-02-27
The collection of accurate dietary intakes using traditional dietary assessment methods (e.g., food records) from military personnel is challenging due to the demanding physiological and psychological conditions of training or operations. In addition, these methods are burdensome, time consuming, and prone to measurement errors. Adopting smart-phone/tablet technology could overcome some of these barriers. The objective was to assess the validity of a tablet app, modified to contain detailed nutritional composition data, in comparison to a measured food intake/waste method. A sample of Canadian Armed Forces personnel, randomized to either a tablet app ( n = 9) or a weighed food record (wFR) ( n = 9), recorded the consumption of standard military rations for a total of 8 days. Compared to the gold standard measured food intake/waste method, the difference in mean energy intake was small (-73 kcal/day for tablet app and -108 kcal/day for wFR) ( p > 0.05). Repeated Measures Bland-Altman plots indicated good agreement for both methods (tablet app and wFR) with the measured food intake/waste method. These findings demonstrate that the tablet app, with added nutritional composition data, is comparable to the traditional dietary assessment method (wFR) and performs satisfactorily in relation to the measured food intake/waste method to assess energy, macronutrient, and selected micronutrient intakes in a sample of military personnel.
Quantifying Uncertainty in Near Surface Electromagnetic Imaging Using Bayesian Methods
NASA Astrophysics Data System (ADS)
Blatter, D. B.; Ray, A.; Key, K.
2017-12-01
Geoscientists commonly use electromagnetic methods to image the Earth's near surface. Field measurements of EM fields are made (often with the aid of an artificial EM source) and then used to infer near-surface electrical conductivity via a process known as inversion. In geophysics, the standard inversion tool kit is robust and can provide an estimate of the Earth's near-surface conductivity that is both geologically reasonable and compatible with the measured field data. However, standard inverse methods struggle to provide a sense of the uncertainty in the estimate they provide. This is because the task of finding an Earth model that explains the data to within measurement error is non-unique - there are many such models - but the standard methods provide only one "answer." An alternative method, known as Bayesian inversion, seeks to explore the full range of Earth model parameters that can adequately explain the measured data, rather than attempting to find a single, "ideal" model. Bayesian inverse methods can therefore provide a quantitative assessment of the uncertainty inherent in trying to infer near-surface conductivity from noisy, measured field data. This study applies a Bayesian inverse method (trans-dimensional Markov chain Monte Carlo) to transient airborne EM data previously collected over Taylor Valley - one of the McMurdo Dry Valleys in Antarctica. Our results confirm the reasonableness of previous estimates (made using standard methods) of near-surface conductivity beneath Taylor Valley, and they quantify the uncertainty associated with those estimates, demonstrating that Bayesian inverse methods can attach quantitative uncertainty to estimates of near-surface conductivity.
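To make the sampling idea concrete, the following is a minimal sketch of a fixed-dimension Metropolis-Hastings sampler for a toy layered-conductivity model. The trans-dimensional machinery and the real airborne EM forward solver of the study are replaced by a hypothetical forward_model placeholder, and all parameter values are illustrative assumptions.

```python
# Minimal sketch of fixed-dimension Metropolis-Hastings sampling for a toy
# layered-conductivity model.  forward_model() is a hypothetical stand-in for
# a transient airborne EM solver; the study's trans-dimensional moves
# (adding/removing layers) are omitted for brevity.
import numpy as np

rng = np.random.default_rng(0)

DEPTH_WEIGHTS = np.array([[0.6, 0.3, 0.1],
                          [0.3, 0.5, 0.2],
                          [0.1, 0.3, 0.6]])   # hypothetical depth sensitivities

def forward_model(log_sigma):
    # Linear mixing of layer conductivities: a stand-in for a real EM solver.
    return DEPTH_WEIGHTS @ np.exp(log_sigma)

true_model = np.log(np.array([0.01, 0.1, 0.02]))           # S/m, three layers
noise_sd = 0.005
d_obs = forward_model(true_model) + rng.normal(0.0, noise_sd, true_model.size)

def log_likelihood(m):
    r = d_obs - forward_model(m)
    return -0.5 * np.sum((r / noise_sd) ** 2)

def log_prior(m):
    # Uniform prior on log-conductivity within broad bounds.
    inside = np.all((m > np.log(1e-4)) & (m < np.log(10.0)))
    return 0.0 if inside else -np.inf

n_steps, step = 20000, 0.1
m = np.log(np.full(3, 0.05))                                # starting model
log_post = log_likelihood(m) + log_prior(m)
samples = np.empty((n_steps, m.size))
for i in range(n_steps):
    m_prop = m + rng.normal(0.0, step, m.size)              # random-walk proposal
    lp = log_likelihood(m_prop) + log_prior(m_prop)
    if np.log(rng.uniform()) < lp - log_post:               # Metropolis acceptance
        m, log_post = m_prop, lp
    samples[i] = m

# Posterior percentiles quantify the uncertainty in each layer's conductivity.
burn = n_steps // 2
lo_ci, med, hi_ci = np.percentile(np.exp(samples[burn:]), [2.5, 50, 97.5], axis=0)
print("median [S/m]:", np.round(med, 4))
print("95% credible interval:", np.round(lo_ci, 4), "-", np.round(hi_ci, 4))
```

The point of the exercise is the last two lines: instead of a single "best" conductivity model, the chain yields a distribution from which credible intervals can be read off directly.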
Detection and Assessment of Wood Decay in Glulam Beams Using a Decay Rate Approach: A Review
C. Adam Senalik
2013-01-01
A glulam beam is subjected to X-ray computed tomography and acousto-ultrasonic measurements to detect and assess wood decay. A glulam beam without visible indications of wood decay was taken from field use. A modified impulse-echo technique is employed as an inspection method requiring access to only one side of the beam. It is observed that decay-rate analysis of the...
Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael
2015-01-21
Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50-μL aliquots of these, along with parasite-negative human blood controls (0 parasites/μL), were air-dried in specimen tubes and their reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory on RDTs stored at the reference laboratory was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and the health facilities were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24 due to errors in interpreting faint test lines. The DTS method can be used under field conditions to supplement other RDT QC methods and to assess health worker proficiency in Ethiopia and possibly other malaria-endemic countries.
Chmitorz, A; Kunzler, A; Helmreich, I; Tüscher, O; Kalisch, R; Kubiak, T; Wessa, M; Lieb, K
2018-02-01
Psychological resilience refers to the phenomenon that many people are able to adapt to the challenges of life and maintain mental health despite exposure to adversity. This has stimulated research on training programs to foster psychological resilience. We evaluated the concepts, methods and designs of 43 randomized controlled trials, published between 1979 and 2014, which assessed the efficacy of such training programs, and we propose standards for future intervention research based on recent developments in the field. We found that concepts, methods and designs in current resilience intervention studies are of limited use for properly assessing the efficacy of interventions to foster resilience. Major problems are the use of definitions of resilience as a trait or a composite of resilience factors, the use of unsuited assessment instruments, and inappropriate study designs. To overcome these challenges, we propose 1) an outcome-oriented definition of resilience, 2) an outcome-oriented assessment of resilience as change in mental health in relation to stressor load, and 3) methodological standards for suitable study designs of future intervention studies. Our proposals may contribute to an improved quality of resilience intervention studies and may stimulate further progress in this growing research field. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Urban Land Cover Mapping Accuracy Assessment - A Cost-benefit Analysis Approach
NASA Astrophysics Data System (ADS)
Xiao, T.
2012-12-01
One of the most important components in urban land cover mapping is mapping accuracy assessment. Many statistical models have been developed to help design sampling schemes based on both accuracy and confidence levels. It is intuitive that an increased number of samples increases the accuracy as well as the cost of an assessment. Understanding cost and sample size is therefore crucial for implementing efficient and effective field data collection, yet few studies have included a cost calculation component as part of the assessment. In this study, a cost-benefit sampling analysis model was created by combining sample size design and sampling cost calculation. The sampling cost included transportation cost, field data collection cost, and laboratory data analysis cost. Simple Random Sampling (SRS) and Modified Systematic Sampling (MSS) methods were used to design sample locations and to extract land cover data in ArcGIS. High-resolution land cover data layers of Denver, CO and Sacramento, CA, street networks, and parcel GIS data layers were used to test and verify the model. The relationship between cost and accuracy was used to determine the effectiveness of each sampling method. The results of this study can be applied to other environmental studies that require spatial sampling.
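As a concrete illustration of the cost-benefit idea, the sketch below couples a simple-random-sampling precision calculation with a per-sample cost model. The cost figures, expected accuracy and target precision are hypothetical assumptions, not values from the study.

```python
# Illustrative sketch of a cost-benefit sample-size calculation for a simple
# random sampling (SRS) accuracy assessment.  All cost figures and the target
# precision are hypothetical.
import math

cost_transport_per_site = 15.0   # travel to each sample location
cost_field_per_sample   = 10.0   # field data collection
cost_lab_per_sample     = 5.0    # laboratory / photo-interpretation analysis

def total_cost(n):
    return n * (cost_transport_per_site + cost_field_per_sample + cost_lab_per_sample)

def ci_half_width(p, n, z=1.96):
    """95% confidence-interval half-width for overall accuracy p under SRS."""
    return z * math.sqrt(p * (1.0 - p) / n)

target_half_width = 0.05   # want accuracy known to within +/- 5 points
p_expected = 0.85          # anticipated overall map accuracy

n = 10
while ci_half_width(p_expected, n) > target_half_width:
    n += 1

print(f"n = {n} samples, cost = ${total_cost(n):,.0f}, "
      f"+/- {ci_half_width(p_expected, n):.3f}")
```

Plotting total_cost(n) against ci_half_width(p_expected, n) over a range of n reproduces the basic cost-versus-accuracy trade-off the study exploits.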
NASA Astrophysics Data System (ADS)
Salinas, F. S.; Lancaster, J. L.; Fox, P. T.
2009-06-01
Transcranial magnetic stimulation (TMS) delivers highly localized brain stimulations via non-invasive externally applied magnetic fields. This non-invasive, painless technique provides researchers and clinicians with a unique tool capable of stimulating both the central and peripheral nervous systems. However, a complete analysis of the macroscopic electric fields produced by TMS has not yet been performed. In this paper, we addressed the importance of the secondary E-field created by surface charge accumulation during TMS using the boundary element method (BEM). 3D models were developed using simple head geometries in order to test the model and compare it with measured values. The effects of tissue geometry, size and conductivity were also investigated. Finally, a realistically shaped head model was used to assess the effect of multiple surfaces on the total E-field. Secondary E-fields have the greatest impact at areas in close proximity to each tissue layer. Throughout the head, the secondary E-field magnitudes typically range from 20% to 35% of the primary E-field's magnitude. The direction of the secondary E-field was generally in opposition to the primary E-field; however, for some locations, this was not the case (i.e. going from high to low conductivity tissues). These findings show that realistically shaped head geometries are important for accurate modeling of the total E-field.
Innovative Techniques for Evaluating Behavioral Nutrition Interventions
Laugero, Kevin D; Cunningham, Brian T; Lora, Karina R; Reicks, Marla
2017-01-01
Assessing outcomes and the impact of behavioral nutrition interventions has remained challenging because of the lack of methods available beyond traditional nutrition assessment tools and techniques. With the current high global obesity and related chronic disease rates, novel methods to evaluate the impact of behavioral nutrition-based interventions are much needed. The objective of this narrative review is to describe and review the current status of knowledge as it relates to 4 different innovative methods or tools to assess behavioral nutrition interventions. Methods reviewed include 1) the assessment of stress and stress responsiveness to enhance the evaluation of nutrition interventions, 2) eye-tracking technology in nutritional interventions, 3) smartphone biosensors to assess nutrition and health-related outcomes, and 4) skin carotenoid measurements to assess fruit and vegetable intake. Specifically, the novel use of functional magnetic resonance imaging, by characterizing the brain’s responsiveness to an intervention, can help researchers develop programs with greater efficacy. Similarly, if eye-tracking technology can enable researchers to get a better sense of how participants view materials, the materials may be better tailored to create an optimal impact. The latter 2 techniques reviewed, smartphone biosensors and methods to detect skin carotenoids, can provide the research community with portable, effective, unbiased ways to assess dietary intake and quality in the field. The information gained from using these types of methodologies can improve the efficacy and assessment of behavior-based nutrition interventions. PMID:28096132
DOT National Transportation Integrated Search
2012-04-01
This paper presents a description of efforts to disseminate findings from the Phase I study (SPR-2244), provides examples of applied maturity testing and temperature monitoring in Connecticut, reviews several State Highway Agency protocols for using ...
DNA-BASED METHODS FOR MONITORING INVASIVE SPECIES: A REVIEW AND PROSPECTUS
The recent explosion of interest in DNA-based tools for species identification has prompted widespread speculation on the future availability of inexpensive, rapid and accurate means of identifying specimens and assessing biodiversity. One applied field that may benefit dramatic...
CONCEPTS AND APPROACHES FOR THE BIOASSESSMENT OF NON-WADEABLE STREAMS AND RIVERS
This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provide...
Passive field reflectance measurements
NASA Astrophysics Data System (ADS)
Weber, Christian; Schinca, Daniel C.; Tocho, Jorge O.; Videla, Fabian
2008-10-01
The results of reflectance measurements performed with a three-band passive radiometer with independent channels for solar irradiance reference are presented. Comparative operation between the traditional method, which uses downward-looking field and reference white-panel measurements, and the new approach, involving duplicated downward- and upward-looking spectral channels (each upward-looking channel with its own diffuser), is analyzed. The results indicate that the latter method performs in very good agreement with the standard method and is more suitable for passive sensors under rapidly changing atmospheric conditions (such as clouds, dust, mist, smog and other scatterers), since a more reliable synchronous recording of reference and incident light is achieved. In addition, having separate channels for the reference and the signal allows better balancing of the amplifier gains for each spectral channel. We show results for the normalized difference vegetation index (NDVI) obtained during 2004-2007 field experiments on weed detection in soybean stubble and fertilizer-level assessment in wheat. The method may be used to refine sensor-based nitrogen fertilizer rate recommendations and to determine suitable zones for herbicide applications.
Afach, S; Ayres, N J; Ban, G; Bison, G; Bodek, K; Chowdhuri, Z; Daum, M; Fertl, M; Franke, B; Griffith, W C; Grujić, Z D; Harris, P G; Heil, W; Hélaine, V; Kasprzak, M; Kermaidic, Y; Kirch, K; Knowles, P; Koch, H-C; Komposch, S; Kozela, A; Krempel, J; Lauss, B; Lefort, T; Lemière, Y; Mtchedlishvili, A; Musgrave, M; Naviliat-Cuncic, O; Pendlebury, J M; Piegsa, F M; Pignol, G; Plonka-Spehr, C; Prashanth, P N; Quéméner, G; Rawlik, M; Rebreyend, D; Ries, D; Roccia, S; Rozpedzik, D; Schmidt-Wellenburg, P; Severijns, N; Thorne, J A; Weis, A; Wursten, E; Wyszynski, G; Zejma, J; Zenner, J; Zsigmond, G
2015-10-16
We describe a spin-echo method for ultracold neutrons (UCNs) confined in a precession chamber and exposed to a |B0|=1 μT magnetic field. We have demonstrated that the analysis of UCN spin-echo resonance signals in combination with knowledge of the ambient magnetic field provides an excellent method by which to reconstruct the energy spectrum of a confined ensemble of neutrons. The method takes advantage of the relative dephasing of spins arising from a gravitationally induced striation of stored UCNs of different energies, and also permits an improved determination of the vertical magnetic-field gradient with an exceptional accuracy of 1.1 pT/cm. This novel combination of a well-known nuclear resonance method and gravitationally induced vertical striation is unique in the realm of nuclear and particle physics and should prove to be invaluable for the assessment of systematic effects in precision experiments such as searches for an electric dipole moment of the neutron or the measurement of the neutron lifetime.
Safety assessment of a shallow foundation using the random finite element method
NASA Astrophysics Data System (ADS)
Zaskórski, Łukasz; Puła, Wojciech
2015-04-01
A complex structure of soil and its random character make soil modeling a cumbersome task. Heterogeneity of soil has to be considered even within a homogeneous layer, so estimating the shear strength parameters of soil for a geotechnical analysis causes many problems. Applicable standards (Eurocode 7) do not present an explicit method for evaluating characteristic values of soil parameters; only general guidelines on how these values should be estimated can be found. Hence many approaches to the assessment of characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β. Several approaches to the estimation of characteristic values of soil properties were compared by evaluating the values of reliability index β that can be achieved by applying each of them. The method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for fluctuation scales, and the method included in Eurocode 7 were examined. Design values of the bearing capacity based on these approaches were compared with the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of the foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM was introduced by Griffiths and Fenton (1993); it combines the deterministic finite element method, random field theory and Monte Carlo simulations. Random field theory allows the random character of soil parameters to be considered within a homogeneous soil layer; for this purpose a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of a given area. RFEM was applied to estimate which theoretical probability distribution best fits the empirical probability distribution of bearing capacity, based on 3000 realizations. The fitted probability distribution was then used to compute design values of the bearing capacity and the related reliability indices β. The analyses were carried out for a cohesive soil; hence the friction angle and the cohesion were defined as random parameters and characterized by two-dimensional random fields. The friction angle was described by a bounded distribution, as it varies within a limited range, while a lognormal distribution was applied for the cohesion. Other properties - Young's modulus, Poisson's ratio and unit weight - were assumed to be deterministic because they have negligible influence on the stochastic bearing capacity. Griffiths D. V., & Fenton G. A. (1993). Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6), 577-587.
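To show how a reliability index can be extracted from stochastic bearing-capacity output, the following is a much-simplified Monte Carlo sketch. It uses independent random variables (lognormal cohesion, bounded friction angle) and the classical bearing-capacity formula rather than the spatially correlated random fields and finite elements of RFEM, and all soil and loading values are hypothetical.

```python
# Simplified sketch of a Monte Carlo reliability estimate for a strip footing.
# Independent random variables and the Terzaghi/Vesic bearing-capacity factors
# replace the study's spatially correlated random fields and finite elements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_sim = 100_000

# Cohesion c [kPa]: lognormal; friction angle phi: bounded (scaled beta, 20-30 deg).
c   = rng.lognormal(mean=np.log(20.0), sigma=0.3, size=n_sim)
phi = np.deg2rad(20.0 + 10.0 * rng.beta(4.0, 4.0, size=n_sim))

gamma, B, D = 18.0, 1.0, 0.5      # unit weight [kN/m3], footing width, depth [m]

# Bearing-capacity factors.
Nq = np.exp(np.pi * np.tan(phi)) * np.tan(np.pi / 4 + phi / 2) ** 2
Nc = (Nq - 1.0) / np.tan(phi)
Ng = 2.0 * (Nq + 1.0) * np.tan(phi)

q_ult = c * Nc + gamma * D * Nq + 0.5 * gamma * B * Ng   # ultimate capacity [kPa]

q_design = 400.0                  # hypothetical design bearing pressure [kPa]
p_f = np.mean(q_ult < q_design)   # probability of failure
beta = -stats.norm.ppf(p_f) if 0.0 < p_f < 1.0 else float("inf")

print(f"P_f = {p_f:.4f}, reliability index beta = {beta:.2f}")
```

In the actual RFEM workflow the histogram of q_ult realizations would first be fitted with a theoretical distribution, and the design value and β would be read from that fit rather than from raw counts.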
Hu, Ning; Fang, Jiaru; Zou, Ling; Wan, Hao; Pan, Yuxiang; Su, Kaiqi; Zhang, Xi; Wang, Ping
2016-10-01
Cell-based bioassays are an effective method for assessing compound toxicity via cell viability, but traditional label-based methods miss much information about cell growth because they rely on endpoint detection, and higher throughput is needed to obtain dynamic information. Cell-based biosensor methods can dynamically and continuously monitor cell viability; however, this dynamic information is often ignored or seldom utilized in toxin and drug assessment. Here, we report a highly efficient, high-content cytotoxicity recording method based on dynamic and continuous cell-based impedance biosensor technology. Dynamic cell viability, inhibition ratio and growth rate were derived from the dynamic response curves of the cell-based impedance biosensor. The results showed that the biosensor responds in a dose-dependent manner to the diarrhetic shellfish toxin okadaic acid, based on analysis of dynamic cell viability and cell growth status. Moreover, the throughput of dynamic cytotoxicity assessment was compared between cell-based biosensor methods and label-based endpoint methods. The cell-based impedance biosensor provides a flexible, cost- and label-efficient platform for cell viability assessment in shellfish toxin screening.
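The sketch below shows one common way such metrics can be derived from continuous impedance (cell index) curves: control-normalised viability, its complement as an inhibition ratio, and a log-linear growth-rate fit. The curves are simulated and these particular definitions are assumptions for illustration, not necessarily the formulas used in the study.

```python
# Illustrative derivation of dynamic cell viability, inhibition ratio and
# growth rate from simulated impedance (cell index) curves.
import numpy as np

t_h = np.arange(0.0, 48.0, 0.5)                       # time points [h]

def logistic(t, k, t50, top):
    return top / (1.0 + np.exp(-k * (t - t50)))

ci_control = logistic(t_h, 0.25, 12.0, 8.0)           # untreated wells
ci_treated = logistic(t_h, 0.18, 14.0, 5.0)           # toxin-exposed wells

dynamic_viability = ci_treated / np.maximum(ci_control, 1e-9)
inhibition_ratio  = 1.0 - dynamic_viability

# Growth rate from a log-linear fit over the exponential phase (4-16 h).
mask = (t_h >= 4) & (t_h <= 16)
rate_control = np.polyfit(t_h[mask], np.log(ci_control[mask]), 1)[0]
rate_treated = np.polyfit(t_h[mask], np.log(ci_treated[mask]), 1)[0]

print(f"dynamic viability at 24 h: {dynamic_viability[t_h == 24.0][0]:.2f}")
print(f"growth rate [1/h]: control {rate_control:.3f}, treated {rate_treated:.3f}")
```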
On assessing bioequivalence and interchangeability between generics based on indirect comparisons.
Zheng, Jiayin; Chow, Shein-Chung; Yuan, Mengdie
2017-08-30
As more and more generics become available in the marketplace, safety and efficacy concerns may arise from the interchangeable use of approved generics. However, bioequivalence assessment among generics of the same innovative drug product is not required for regulatory approval, and in practice approved generics are often used interchangeably without any mechanism of safety monitoring. In this article, we propose several methods, based on indirect comparisons, for assessing bioequivalence and interchangeability between generics. The applicability of the methods and the underlying similarity assumptions are discussed, as well as the inappropriateness of directly adopting the adjusted indirect comparison to the comparison of generics. In addition, extensions are given to address important topics in clinical trials for bioequivalence assessment, for example multiple comparisons and simultaneous testing of bioequivalence among three generics. Extensive simulation studies were conducted to investigate the performance of the proposed methods. Studies of malaria generics and HIV/AIDS generics prequalified by the WHO were used as real examples to demonstrate the use of the methods. Copyright © 2017 John Wiley & Sons, Ltd.
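For readers unfamiliar with the mechanics, the sketch below shows the basic arithmetic of an adjusted indirect comparison between two generics that were each compared only with the innovator, followed by the usual 80-125% check on the log(AUC) scale. The numbers are hypothetical, and this naive approach is precisely the one the article cautions against adopting directly.

```python
# Sketch of an adjusted indirect comparison of two generics (T1, T2) via a
# common reference R, on the log(AUC) scale, with a 90% CI checked against
# the conventional 80-125% bioequivalence limits.  Values are hypothetical.
import math

# Estimated mean differences vs. the reference and their standard errors,
# e.g. from two independent crossover bioequivalence studies (log scale).
d_T1R, se_T1R = 0.03, 0.035
d_T2R, se_T2R = -0.02, 0.040

# Indirect estimate of T1 vs. T2 and its standard error.
d_T1T2 = d_T1R - d_T2R
se_T1T2 = math.sqrt(se_T1R**2 + se_T2R**2)

# 90% confidence interval (normal approximation) and the usual BE limits.
z = 1.645
lo, hi = d_T1T2 - z * se_T1T2, d_T1T2 + z * se_T1T2
lim = math.log(1.25)

print(f"T1 vs T2 ratio: {math.exp(lo):.3f} - {math.exp(hi):.3f}")
print("within 80-125% limits:", lo > -lim and hi < lim)
```

Note how the indirect standard error is the sum of both study variances; this variance inflation is one reason the article argues that naive indirect comparison needs modification before it can support claims about generic-to-generic interchangeability.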
Effects of skeletal muscle anisotropy on induced currents from low-frequency magnetic fields
NASA Astrophysics Data System (ADS)
Tachas, Nikolaos J.; Samaras, Theodoros; Baskourelos, Konstantinos; Sahalos, John N.
2009-12-01
Studies which take into account the anisotropy of tissue dielectric properties for the numerical assessment of induced currents from low-frequency magnetic fields are scarce. In the present study, we compare the induced currents in two anatomical models, using the impedance method. In the first model, we assume that all tissues have isotropic conductivity, whereas in the second one, we assume anisotropic conductivity for the skeletal muscle. Results show that tissue anisotropy should be taken into account when investigating the exposure to low-frequency magnetic fields, because it leads to higher induced current values.
NASA Technical Reports Server (NTRS)
Schmitt, Jeff G.; Stahnke, Brian
2017-01-01
This report describes test results from an assessment of the acoustically treated 9x15 Foot Low Speed Wind Tunnel at the NASA Glenn Research Center in Cleveland, Ohio in July of 2016. The tests were conducted in accordance with the recently adopted international standard ISO 26101-2012 on Qualification of Free Field Test Environments. This method involves moving a microphone relative to a source and comparing the sound pressure level versus distance measurements with theoretical inverse square law spreading.
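The comparison with inverse-square-law spreading amounts to a simple calculation along each microphone traverse, sketched below. The distances, levels and tolerance are invented example values, not the measured data or the tolerances tabulated in ISO 26101.

```python
# Sketch of the free-field qualification idea: compare measured sound pressure
# levels along a traverse with theoretical inverse-square-law spreading and
# report the deviations.  The tolerance is an assumed example value.
import numpy as np

r = np.array([1.0, 1.5, 2.0, 3.0, 4.0, 6.0, 8.0])                # mic distances [m]
spl_meas = np.array([94.0, 90.6, 88.1, 84.4, 82.0, 78.6, 76.3])  # measured dB

spl_theory = spl_meas[0] - 20.0 * np.log10(r / r[0])             # inverse square law
deviation = spl_meas - spl_theory

tolerance_db = 1.5                                               # assumed limit
print("deviations [dB]:", np.round(deviation, 2))
print("qualifies as free field:", bool(np.all(np.abs(deviation) <= tolerance_db)))
```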
Kinoshita, Rintaro; Moebius-Clune, Bianca N.; van Es, Harold M.; Hively, W. Dean; Bilgilis, A. Volkan
2012-01-01
Visible and near-infrared reflectance spectroscopy (VNIRS) is a rapid and nondestructive method that can predict multiple soil properties simultaneously, but its application in multidimensional soil quality (SQ) assessment in the tropics still needs to be further assessed. In this study, VNIRS (350–2500 nm) was employed to analyze 227 air-dried soil samples of Ultisols from a soil chronosequence in western Kenya and assess 16 SQ indicators. Partial least squares regression (PLSR) was validated using the full-site cross-validation method by grouping samples from each farm or forest site. The most suitable models successfully predicted SQ indicators (R² ≥ 0.80; ratio of performance to deviation [RPD] ≥ 2.00) including soil organic matter (OMLOI), active C, Ca, cation exchange capacity (CEC), and clay. Moderately well predicted indicators (0.50 ≤ R² < 0.80; 1.40 ≤ RPD < 2.00) included permanent wilting point (Θpwp) and field capacity (Θfc). Poorly predicted indicators (R² < 0.50; RPD < 1.40) were EC, S, P, available water capacity (AWC), K, Zn, and penetration resistance. Combining VNIRS with selected field- and laboratory-measured SQ indicator values increased predictability. Furthermore, VNIRS showed moderate to substantial agreement in predicting interpretive SQ scores and a composite soil quality index (CSQI), especially when combined with directly measured SQ indicator values. In conclusion, VNIRS has good potential for low-cost, rapid assessment of physical and biological SQ indicators, but conventional soil chemical tests may need to be retained to provide comprehensive SQ assessments.
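The core of the chemometric workflow (fit a PLSR model, cross-validate, report R² and RPD) can be sketched as follows. The spectra and reference values are synthetic stand-ins, and plain 10-fold cross-validation is used instead of the study's full-site grouping.

```python
# Minimal sketch of a PLSR workflow for predicting a soil property from VNIR
# spectra, reporting R^2 and RPD (SD of reference values / RMSE).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(3)
n_samples, n_bands = 227, 215                # e.g. 350-2500 nm resampled

# Synthetic spectra driven by a few latent soil factors plus noise.
latent = rng.normal(size=(n_samples, 3))
loadings = rng.normal(size=(3, n_bands))
X = latent @ loadings + 0.05 * rng.normal(size=(n_samples, n_bands))
y = latent @ np.array([2.0, -1.0, 0.5]) + 0.3 * rng.normal(size=n_samples)

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

rmse = np.sqrt(np.mean((y - y_cv) ** 2))
r2 = 1.0 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
rpd = y.std(ddof=1) / rmse                   # ratio of performance to deviation

print(f"R2 = {r2:.2f}, RMSE = {rmse:.2f}, RPD = {rpd:.2f}")
```

The RPD thresholds quoted in the abstract (2.00 and 1.40) are applied to exactly this kind of cross-validated RPD value.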
Rice, Karen C.; Bricker, Owen P.
1991-01-01
The report describes the results of a study to assess the sensitivity of streams to acidic deposition in Charles and Anne Arundel Counties, Maryland using a geology-based method. Water samples were collected from streams in July and August 1988 when streams were at base-flow conditions. Eighteen water samples collected from streams in Charles County, and 17 water samples from streams in Anne Arundel County were analyzed in the field for pH, specific conductance, and acid-neutralizing capacity (ANC); 8 water samples from streams in Charles County were analyzed in the laboratory for chloride and sulfate concentrations. The assessment revealed that streams in these counties are sensitive to acidification by acidic deposition.
Hoffmann, Sebastian; Hartung, Thomas; Stephens, Martin
Evidence-based toxicology (EBT) was introduced independently by two groups in 2005, in the context of toxicological risk assessment and causation, as well as based on parallels between the evaluation of test methods in toxicology and the evidence-based assessment of diagnostic tests in medicine. The role model of evidence-based medicine (EBM) motivated both proposals and guided the evolution of EBT, with systematic reviews and evidence quality assessment in particular attracting considerable attention in toxicology. Regarding test assessment, in the search for solutions to various problems related to validation, such as the imperfection of the reference standard or the challenge of comprehensively evaluating tests, the field of Diagnostic Test Assessment (DTA) was identified as a potential resource. Since DTA is an EBM discipline, test method assessment/validation became one of the main drivers spurring the development of EBT. In the context of pathway-based toxicology, EBT approaches, given their objectivity, transparency and consistency, have been proposed for carrying out a (retrospective) mechanistic validation. In summary, implementation of more evidence-based approaches may provide the tools necessary to adapt the assessment/validation of toxicological test methods and testing strategies to face the challenges of toxicology in the twenty-first century.
Assessment Methods of an Undergraduate Psychiatry Course at a Saudi University
Amr, Mostafa; Amin, Tarek
2012-01-01
Objectives: In Arab countries there are few studies on assessment methods in the field of psychiatry. The objective of this study was to assess the outcomes of different forms of psychiatric course assessment among fifth-year medical students at King Faisal University, Saudi Arabia. Methods: We examined the performance of 110 fifth-year medical students through objective structured clinical examinations (OSCE), traditional oral clinical examinations (TOCE), portfolios, multiple choice questions (MCQ), and a written examination. Results: The score ranges in the TOCE, OSCE, portfolio, and MCQ were 32–50, 7–15, 5–10 and 22–45, respectively. In regression analysis, there was a significant correlation between the OSCE and all other forms of psychiatry examination, except for the MCQ marks. The OSCE accounted for 65.1% of the variance in total clinical marks and 31.5% of the final marks (P = 0.001), while the TOCE alone accounted for 74.5% of the variance in the clinical scores. Conclusions: This study demonstrates consistency among the assessment methods used in the psychiatry course, particularly for the clinical component, in an integrated manner. This information would be useful for future developments in undergraduate teaching. PMID:22548141
NASA Astrophysics Data System (ADS)
Liu, H. T.; Buck, J. W.; Germain, A. C.; Hinchee, M. E.; Solt, T. S.; Leroy, G. M.; Srnsky, R. A.
1988-09-01
The effects of upwind turbine wakes on the performance of a FloWind 17-m vertical-axis wind turbine (VAWT) were investigated through a series of field experiments conducted at the FloWind wind farm on Cameron Ridge, Tehachapi, California. From the field measurements, we derived the velocity and power/energy deficits under various turbine on/off configurations. Much information was provided to characterize the structure of VAWT wakes and to assess their effects on the performance of downwind turbines. A method to estimate the energy deficit was developed based on the measured power deficit and the wind speed distributions. This method may be adopted for other turbine types and sites. Recommendations are made for optimizing wind farm design and operations, as well as for wind energy management.
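A common way to turn a measured power deficit into an energy deficit is to weight it by the site wind-speed distribution, as sketched below. The power curve, deficit profile and Weibull parameters are hypothetical illustrations, not the Cameron Ridge measurements.

```python
# Sketch of estimating an annual energy deficit by weighting a power-deficit
# curve with a Weibull wind-speed distribution.  All values are hypothetical.
import numpy as np

v = np.arange(4.0, 26.0, 1.0)                        # wind-speed bins [m/s]

def power_curve(v, rated=300.0, v_rated=13.0, v_cut_in=4.0):
    """Hypothetical free-stream power curve [kW] for a 17-m VAWT."""
    p = rated * ((v - v_cut_in) / (v_rated - v_cut_in)) ** 3
    return np.clip(p, 0.0, rated)

# Fractional power deficit of the waked turbine, largest near 9 m/s here.
power_deficit_frac = 0.25 * np.exp(-0.5 * ((v - 9.0) / 3.0) ** 2)

# Weibull wind-speed probability for each bin (k = shape, c = scale).
k, c = 2.0, 8.0
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))
pdf /= pdf.sum()                                     # normalise over the bins

p_free = power_curve(v)
energy_deficit = np.sum(p_free * power_deficit_frac * pdf) / np.sum(p_free * pdf)
print(f"estimated annual energy deficit: {energy_deficit:.1%}")
```

The same weighting logic applies to any turbine type and site once a power-deficit curve and a wind-speed distribution are available, which is what makes the approach transferable.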
Bogers, Rik P; Bolte, John F B; Houtveen, Jan H; Lebret, Erik; van Strien, Rob T; Schipper, C Maarten A; Alkadhimi, Mehdi; Baliatsas, Christos; van Kamp, Irene
2013-01-01
Introduction Idiopathic Environmental Intolerance (IEI) attributed to electromagnetic fields (EMF) refers to self-reported sensitivity mainly characterised by the attribution of non-specific physical symptoms to low-level EMF exposure emitted from sources such as mobile phones. Scientific studies have not provided evidence for the existence of IEI-EMF, but these studies did not resemble the real-life situation or suffered from poor exposure characterisation and biased recall of health symptoms. To improve existing methods for the study of IEI-EMF, an Ecological Momentary Assessment (EMA) study is designed. Methods and analysis The study is an EMA study in which respondents carry personal exposure metres (exposimeters) that measure radiofrequency (RF) EMF, with frequent assessment of health symptoms and perceived EMF exposure through electronic diary registration during five consecutive days. Participants will be a selection from an epidemiological study who report to be sensitive to RF EMF. The exposimeters measure electric field strength in 12 frequency bands. Diary questions include the occurrence and severity of 10 non-specific physical symptoms, mood states and perceived exposure to (sources of) EMF. The relationship of actual and perceived EMF exposure and mood with non-specific physical symptoms will be analysed using multilevel regression analysis with time-shift models. Discussion The study has several advantages over previous studies, including assessment of personal EMF exposure and non-specific physical symptoms by an ecological method with a minimised chance of recall bias. The within-person design reduces confounding by time-stable factors (eg, personal characteristics). In the conduct of the study and the analysis and interpretation of its outcomes, some methodological issues including a high participant burden, reactivity, compliance to the study protocol and the potential of chance findings due to multiple statistical testing will be accounted for and limited as much as possible. PMID:23988360
NASA Astrophysics Data System (ADS)
Michalski, Krzysztof A.; Lin, Hung-I.
2018-01-01
Second-order asymptotic formulas for the electromagnetic fields of a horizontal electric dipole over an imperfectly conducting half-space are derived using the modified saddle-point method. Application examples are presented for ordinary and plasmonic media, and the accuracy of the new formulation is assessed by comparisons with two alternative state-of-the-art theories and with the rigorous results of numerical integration.
Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity
Krasteva, Vessela TZ; Papazov, Sava P; Daskalov, Ivan K
2003-01-01
Background Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the induced currents distribution by external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. Comparative study for the real non-homogeneous structure with anisotropic conductivities of the tissues and a mock homogeneous media is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion The model developed could be useful in theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors in applying homogeneous domain modeling rather than real non-homogeneous biological structures are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium. PMID:14693034
Spatial and spatiotemporal pattern analysis of coconut lethal yellowing in Mozambique.
Bonnot, F; de Franqueville, H; Lourenço, E
2010-04-01
Coconut lethal yellowing (LY) is caused by a phytoplasma and is a major threat for coconut production throughout its growing area. Incidence of LY was monitored visually on every coconut tree in six fields in Mozambique for 34 months. Disease progress curves were plotted and average monthly disease incidence was estimated. Spatial patterns of disease incidence were analyzed at six assessment times. Aggregation was tested by the coefficient of spatial autocorrelation of the beta-binomial distribution of diseased trees in quadrats. The binary power law was used as an assessment of overdispersion across the six fields. Spatial autocorrelation between symptomatic trees was measured by the BB join count statistic based on the number of pairs of diseased trees separated by a specific distance and orientation, and tested using permutation methods. Aggregation of symptomatic trees was detected in every field in both cumulative and new cases. Spatiotemporal patterns were analyzed with two methods. The proximity of symptomatic trees at two assessment times was investigated using the spatiotemporal BB join count statistic based on the number of pairs of trees separated by a specific distance and orientation and exhibiting the first symptoms of LY at the two times. The semivariogram of times of appearance of LY was calculated to characterize how the lag between times of appearance of LY was related to the distance between symptomatic trees. Both statistics were tested using permutation methods. A tendency for new cases to appear in the proximity of previously diseased trees and a spatially structured pattern of times of appearance of LY within clusters of diseased trees were detected, suggesting secondary spread of the disease.
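The logic of the join-count test can be illustrated with a small permutation sketch on a simulated planting grid. The layout and disease incidence are simulated, rook adjacency replaces the study's distance and orientation classes, and permutation of tree labels provides the reference distribution.

```python
# Permutation test for the BB (diseased-diseased) join count on a simulated
# rectangular planting grid with rook adjacency.
import numpy as np

rng = np.random.default_rng(5)
rows, cols = 20, 30
status = (rng.random((rows, cols)) < 0.15).astype(int)   # 1 = symptomatic tree

def bb_join_count(grid):
    """Number of adjacent (rook) pairs in which both trees are diseased."""
    horiz = np.sum(grid[:, :-1] * grid[:, 1:])
    vert = np.sum(grid[:-1, :] * grid[1:, :])
    return horiz + vert

observed = bb_join_count(status)

n_perm = 2000
flat = status.ravel()
perm_counts = np.empty(n_perm)
for i in range(n_perm):
    perm = rng.permutation(flat).reshape(rows, cols)      # random relabelling
    perm_counts[i] = bb_join_count(perm)

p_value = (np.sum(perm_counts >= observed) + 1) / (n_perm + 1)
print(f"BB joins = {observed}, permutation p = {p_value:.3f}")
```

A small p-value indicates more diseased-diseased neighbour pairs than expected under random labelling, i.e. spatial aggregation of symptomatic trees.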
Injection of thermal and suprathermal seed particles into coronal shocks of varying obliquity
NASA Astrophysics Data System (ADS)
Battarbee, M.; Vainio, R.; Laitinen, T.; Hietala, H.
2013-10-01
Context. Diffusive shock acceleration in the solar corona can accelerate solar energetic particles to very high energies. Acceleration efficiency is increased by entrapment through self-generated waves, which is highly dependent on the amount of accelerated particles. This, in turn, is determined by the efficiency of particle injection into the acceleration process. Aims: We present an analysis of the injection efficiency at coronal shocks of varying obliquity. We assessed injection through reflection and downstream scattering, including the effect of a cross-shock potential. Both quasi-thermal and suprathermal seed populations were analysed. We present results on the effect of cross-field diffusion downstream of the shock on the injection efficiency. Methods: Using analytical methods, we present applicable injection speed thresholds that were compared with both semi-analytical flux integration and Monte Carlo simulations, which do not resort to binary thresholds. Shock-normal angle θBn and shock-normal velocity Vs were varied to assess the injection efficiency with respect to these parameters. Results: We present evidence of a significant bias of thermal seed particle injection at small shock-normal angles. We show that downstream isotropisation methods affect the θBn-dependence of this result. We show a non-negligible effect caused by the cross-shock potential, and that the effect of downstream cross-field diffusion is highly dependent on boundary definitions. Conclusions: Our results show that for Monte Carlo simulations of coronal shock acceleration a full distribution function assessment with downstream isotropisation through scatterings is necessary to realistically model particle injection. Based on our results, seed particle injection at quasi-parallel coronal shocks can result in significant acceleration efficiency, especially when combined with varying field-line geometry. Appendices are available in electronic form at http://www.aanda.org
Aguirre, Erik; Arpón, Javier; Azpilicueta, Leire; López, Peio; de Miguel, Silvia; Ramos, Victoria; Falcone, Francisco
2014-12-01
In this article, the impact of topology as well as morphology of a complex indoor environment such as a commercial aircraft in the estimation of dosimetric assessment is presented. By means of an in-house developed deterministic 3D ray-launching code, estimation of electric field amplitude as a function of position for the complete volume of a commercial passenger airplane is obtained. Estimation of electromagnetic field exposure in this environment is challenging, due to the complexity and size of the scenario, as well as to the large metallic content, giving rise to strong multipath components. By performing the calculation with a deterministic technique, the complete scenario can be considered with an optimized balance between accuracy and computational cost. The proposed method can aid in the assessment of electromagnetic dosimetry in the future deployment of embarked wireless systems in commercial aircraft.
Kotsis, Ioannis; Kontoes, Charalabos; Paradissis, Dimitrios; Karamitsos, Spyros; Elias, Panagiotis; Papoutsis, Ioannis
2008-06-10
The primary objective of this paper is the evaluation of the InSAR derived displacement field caused by the 07/09/1999 Athens earthquake, using as reference an external data source provided by terrestrial surveying along the Mornos river open aqueduct. To accomplish this, a processing chain to render comparable the leveling measurements and the interferometric derived measurements has been developed. The distinct steps proposed include a solution for reducing the orbital and atmospheric interferometric fringes and an innovative method to compute the actual InSAR estimated vertical ground subsidence, for direct comparison with the leveling data. Results indicate that the modeled deformation derived from a series of stacked interferograms, falls entirely within the confidence interval assessed for the terrestrial surveying data.
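The final comparison step can be sketched as follows: stacked-interferogram line-of-sight (LOS) displacements are converted to vertical subsidence (assuming purely vertical motion) and checked against levelling benchmarks along the aqueduct. All numbers are hypothetical, and orbital and atmospheric corrections are assumed to have been applied already.

```python
# Sketch of comparing InSAR-derived vertical displacements with levelling.
import numpy as np

incidence_deg = 23.0                                  # assumed look angle
los_mm = np.array([-12.4, -9.8, -6.1, -3.0, -1.2])    # stacked InSAR LOS [mm]
insar_vertical = los_mm / np.cos(np.radians(incidence_deg))  # pure vertical motion

leveling_mm = np.array([-13.5, -10.1, -7.0, -3.6, -1.0])     # terrestrial survey
leveling_sigma = 2.0                                  # 1-sigma accuracy [mm]

residual = insar_vertical - leveling_mm
within_ci = np.abs(residual) <= 1.96 * leveling_sigma
print(np.round(residual, 1), "all within 95% CI:", bool(within_ci.all()))
```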
Tardaguila, Javier; Fernández-Novales, Juan; Gutiérrez, Salvador; Diago, Maria Paz
2017-08-01
Until now, the majority of methods employed to assess grapevine water status have been destructive, time-intensive, and costly, and they provide information on only a limited number of samples, which reduces their ability to reveal within-field water status variability. The goal of this work was to evaluate the capability of non-invasive, portable near-infrared (NIR) spectroscopy, acquired in the field, to assess grapevine water status in diverse varieties grown under different environmental conditions, in a fast and reliable way. The research was conducted 2 weeks before harvest in 2012, in two commercial vineyards planted with eight different varieties. Spectral measurements were acquired in the field on the adaxial and abaxial sides of 160 individual leaves (20 leaves per variety) using a commercially available handheld spectrophotometer (1600-2400 nm). Principal component analysis (PCA) and modified partial least squares (MPLS) were used to interpret the spectra and to develop reliable prediction models for stem water potential (Ψs) (cross-validation correlation coefficient (rcv) ranged from 0.77 to 0.93, and standard error of cross validation (SECV) ranged from 0.10 to 0.23) and leaf relative water content (RWC) (rcv ranged from 0.66 to 0.81, and SECV between 1.93 and 3.20). The performance differences between models built from abaxial- and adaxial-acquired spectra are also discussed. The capability of non-invasive NIR spectroscopy to reliably assess grapevine water status under field conditions was demonstrated. This technique can be a suitable and promising tool to appraise within-field variability of plant water status, helpful for defining optimised irrigation strategies in the wine industry. © 2017 Society of Chemical Industry.
Bai, Mingsian R; Wen, Jheng-Ciang; Hsu, Hoshen; Hua, Yi-Hsin; Hsieh, Yu-Hao
2014-10-01
A sound reconstruction system is proposed for audio reproduction with extended sweet spot and reduced reflections. An equivalent source method (ESM)-based sound field synthesis (SFS) approach, with the aid of dark zone minimization is adopted in the study. Conventional SFS that is based on the free-field assumption suffers from synthesis error due to boundary reflections. To tackle the problem, the proposed system utilizes convex optimization in designing array filters with both reproduction performance and acoustic contrast taken into consideration. Control points are deployed in the dark zone to minimize the reflections from the walls. Two approaches are employed to constrain the pressure and velocity in the dark zone. Pressure matching error (PME) and acoustic contrast (AC) are used as performance measures in simulations and experiments for a rectangular loudspeaker array. Perceptual Evaluation of Audio Quality (PEAQ) is also used to assess the audio reproduction quality. The results show that the pressure-constrained (PC) method yields better acoustic contrast, but poorer reproduction performance than the pressure-velocity constrained (PVC) method. A subjective listening test also indicates that the PVC method is the preferred method in a live room.
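A stripped-down version of the ESM idea is sketched below: equivalent monopole sources are driven to match a target pressure over bright-zone control points while a quadratic penalty suppresses dark-zone pressure. The geometry, frequency and penalty weight are arbitrary, and this regularised least-squares form stands in for the study's convex-optimisation design with explicit pressure or velocity constraints.

```python
# Simplified ESM-based sound field synthesis with dark-zone minimisation via
# a quadratic penalty (regularised least squares), evaluated by pressure
# matching error (PME) and acoustic contrast.
import numpy as np

rng = np.random.default_rng(7)
c, f = 343.0, 1000.0
k = 2 * np.pi * f / c

def green(src, rcv):
    """Free-field monopole transfer matrix between source and receiver points."""
    d = np.linalg.norm(rcv[:, None, :] - src[None, :, :], axis=2)
    return np.exp(-1j * k * d) / (4 * np.pi * d)

src = np.column_stack([np.linspace(-1.0, 1.0, 16), np.zeros(16)])      # line array
bright = np.column_stack([rng.uniform(-0.3, 0.3, 25), rng.uniform(1.0, 1.6, 25)])
dark = np.column_stack([rng.uniform(-0.3, 0.3, 25), rng.uniform(2.2, 2.8, 25)])

G_b, G_d = green(src, bright), green(src, dark)
p_target = np.exp(-1j * k * bright[:, 1])          # plane-wave-like target field

lam = 1.0                                          # dark-zone penalty weight
A = G_b.conj().T @ G_b + lam * (G_d.conj().T @ G_d)
q = np.linalg.solve(A + 1e-9 * np.eye(len(src)), G_b.conj().T @ p_target)

pme = np.linalg.norm(G_b @ q - p_target) / np.linalg.norm(p_target)
contrast_db = 10 * np.log10(np.mean(np.abs(G_b @ q) ** 2) /
                            np.mean(np.abs(G_d @ q) ** 2))
print(f"PME = {pme:.2f}, acoustic contrast = {contrast_db:.1f} dB")
```

Increasing the penalty weight trades reproduction accuracy for contrast, which mirrors the PC-versus-PVC trade-off reported in the study, albeit in a much cruder form.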
Antoninka, Anita; Bowker, Matthew A.; Chuckran, Peter; Barger, Nicole N.; Reed, Sasha C.; Belnap, Jayne
2017-01-01
Aims: Biological soil crusts (biocrusts) are soil-surface communities in drylands, dominated by cyanobacteria, mosses, and lichens. They provide key ecosystem functions by increasing soil stability and influencing soil hydrologic, nutrient, and carbon cycles. Because of this, methods to reestablish biocrusts in damaged drylands are needed. Here we test the reintroduction of field-collected vs. greenhouse-cultured biocrusts for rehabilitation. Methods: We collected biocrusts for 1) direct reapplication, and 2) artificial cultivation under varying hydration regimes. We added field-collected and cultivated biocrusts (with and without hardening treatments) to bare field plots and monitored establishment. Results: Both field-collected and cultivated cyanobacteria increased cover dramatically during the experimental period. Cultivated biocrusts established more rapidly than field-collected biocrusts, attaining ~82% cover in only one year, but addition of field-collected biocrusts led to higher species richness, biomass (as assessed by chlorophyll a) and level of development. Mosses and lichens did not establish well in either case, but late successional cover was affected by hardening and culture conditions. Conclusions: This study provides further evidence that it is possible to culture biocrust components from later successional materials and reestablish cultured organisms in the field. However, more research is needed into effective reclamation techniques.
Fujii, Takuro; Taguchi, Yoshihiro; Saiki, Toshiharu; Nagasaka, Yuji
2011-01-01
We have developed a novel nanoscale temperature-measurement method using fluorescence in the near-field called fluorescence near-field optics thermal nanoscopy (Fluor-NOTN). Fluor-NOTN enables the temperature distributions of nanoscale materials to be measured in vivo/in situ. The proposed method measures temperature by detecting the temperature dependent fluorescence lifetimes of Cd/Se quantum dots (QDs). For a high-sensitivity temperature measurement, the auto-fluorescence generated from a fiber probe should be reduced. In order to decrease the noise, we have fabricated a novel near-field optical-fiber probe by fusion-splicing a photonic crystal fiber (PCF) and a conventional single-mode fiber (SMF). The validity of the novel fiber probe was assessed experimentally by evaluating the auto-fluorescence spectra of the PCF. Due to the decrease of auto-fluorescence, a six- to ten-fold increase of S/N in the near-field fluorescence lifetime detection was achieved with the newly fabricated fusion-spliced near-field optical fiber probe. Additionally, the near-field fluorescence lifetime of the quantum dots was successfully measured by the fabricated fusion-spliced near-field optical fiber probe at room temperature, and was estimated to be 10.0 ns.
NASA Astrophysics Data System (ADS)
Oettl, Dietmar; Kropsch, Michael; Mandl, Michael
2018-05-01
The assessment of odour annoyance varies vastly among countries, even within the European Union. Using so-called odour-hour frequencies offers the possibility of applying either dispersion models or field inspections, which are generally assumed to be equivalent. In this study, odour hours based on field inspections according to the European standard EN 16841-1 (2017) in the vicinity of a pig-fattening farm have been compared with modelled ones using the Lagrangian particle model GRAL, which uses odour-concentration variances to compute odour hours as recently proposed by Oettl and Ferrero (2017). Using a threshold of 1 ou m-3 (ou = odour units) for triggering odour hours in the model, as prescribed by the German guideline for odour assessment, led to reasonable agreement between the two methodologies. It is pointed out that the individual odour sensitivity of the qualified panel members who carry out field inspections is of crucial importance for selecting a proper odour-hour model. Statistical analysis of a large body of data from dynamic olfactometry (EN 13725, 2003), covering a wide range of odorants, suggests that the method prescribed in Germany for modelling odour hours is likely to result in an overestimation, and hence equivalence with field inspections is not given. The dataset is freely available on request.
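One way to use modelled hourly means and concentration variances for odour-hour counting is sketched below: instantaneous concentrations are assumed lognormal, and an hour is flagged as an odour hour when the 1 ou/m3 threshold is exceeded for at least 10% of that hour. The lognormal assumption and the 10% criterion are illustration choices, not necessarily the exact formulation of Oettl and Ferrero (2017) or the German guideline.

```python
# Illustrative odour-hour counting from modelled hourly mean concentrations
# and concentration variances, assuming lognormal instantaneous concentrations.
import numpy as np
from scipy.stats import lognorm

def odour_hour(mean_c, var_c, threshold=1.0, exceed_frac=0.10):
    """True if the instantaneous concentration is expected to exceed the
    threshold for at least `exceed_frac` of the hour."""
    mean_c = max(mean_c, 1e-12)
    sigma2 = np.log(1.0 + var_c / mean_c**2)   # lognormal parameters matched
    mu = np.log(mean_c) - 0.5 * sigma2         # to the modelled mean/variance
    p_exceed = lognorm.sf(threshold, s=np.sqrt(sigma2), scale=np.exp(mu))
    return p_exceed >= exceed_frac

# Hypothetical hourly model output at one receptor: (mean [ou/m3], variance).
hours = [(0.3, 0.4), (0.8, 1.5), (1.6, 0.9), (0.1, 0.02)]
flags = [odour_hour(m, v) for m, v in hours]
print(flags, f"odour-hour frequency = {100.0 * sum(flags) / len(flags):.0f}%")
```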
High-speed engine/component performance assessment using exergy and thrust-based methods
NASA Technical Reports Server (NTRS)
Riggins, D. W.
1996-01-01
This investigation summarizes a comparative study of two high-speed engine performance assessment techniques based on energy (available work) and thrust-potential (thrust availability). Simple flow-fields utilizing Rayleigh heat addition and one-dimensional flow with friction are used to demonstrate the fundamental inability of conventional energy techniques to predict engine component performance, aid in component design, or accurately assess flow losses. The use of the thrust-based method on these same examples demonstrates its ability to yield useful information in all these categories. Energy and thrust are related and discussed from the stand-point of their fundamental thermodynamic and fluid dynamic definitions in order to explain the differences in information obtained using the two methods. The conventional definition of energy is shown to include work which is inherently unavailable to an aerospace Brayton engine. An engine-based energy is then developed which accurately accounts for this inherently unavailable work; performance parameters based on this quantity are then shown to yield design and loss information equivalent to the thrust-based method.
QUANTITATIVE PESTICIDE EXPOSURE ASSESSMENT OF CHILDREN LIVING IN AN AGRICULTURAL COMMUNITY
In support of planning efforts for the National Children's Study, we conducted a pilot study to test field methods characterizing pesticide exposures to 20 farmworker children aged 6-24 months living in the Salinas Valley, Monterey County, California. Sample collection included d...
DOT National Transportation Integrated Search
1992-01-01
This study was conducted to assess an FAA program to hire former military air traffic control specialists to enter ATC field training directly without first attending the Academy screening program. Selection of military controllers was based on meeti...
Transdisciplinarity in Research: Perspectives of Early Career Faculty
ERIC Educational Resources Information Center
Moore, Megan; Martinson, Melissa L.; Nurius, Paula S.; Kemp, Susan P.
2018-01-01
Background: Early career faculty experiences and perspectives on transdisciplinary research are important yet understudied. Methods: Assistant professors at 50 top-ranked social work programs completed an online survey assessing perspectives on the salience of transdisciplinary training in their field, obstacles to or negative impacts of…
Four New Course Competencies for Majors.
ERIC Educational Resources Information Center
Van Leuven, Jim
1999-01-01
Notes changes in the past decade in the field of public relations. Proposes four new required core competencies for all undergraduate public-relations majors in programs housed in journalism/mass-communication units. Articulates these regarding appropriate outcomes, pedagogies, and assessment methods. Notes special considerations for small,…
DOT National Transportation Integrated Search
1982-03-01
The important mechanical processes which influence the ballast physical state in track are tamping, crib and shoulder compaction and train traffic. Three methods of assessing physical state were used at four railroad sites to obtain needed data on th...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wroe, Andrew; Centre for Medical Radiation Physics, University of Wollongong, Wollongong; Clasie, Ben
2009-01-01
Purpose: Microdosimetric measurements were performed at Massachusetts General Hospital, Boston, MA, to assess the dose equivalent external to passively delivered proton fields for various clinical treatment scenarios. Methods and Materials: Treatment fields evaluated included a prostate cancer field, cranial and spinal medulloblastoma fields, ocular melanoma field, and a field for an intracranial stereotactic treatment. Measurements were completed with patient-specific configurations of clinically relevant treatment settings using a silicon-on-insulator microdosimeter placed on the surface of and at various depths within a homogeneous Lucite phantom. The dose equivalent and average quality factor were assessed as a function of both lateral displacement from the treatment field edge and distance downstream of the beam's distal edge. Results: Dose-equivalent value range was 8.3-0.3 mSv/Gy (2.5-60-cm lateral displacement) for a typical prostate cancer field, 10.8-0.58 mSv/Gy (2.5-40-cm lateral displacement) for the cranial medulloblastoma field, 2.5-0.58 mSv/Gy (5-20-cm lateral displacement) for the spinal medulloblastoma field, and 0.5-0.08 mSv/Gy (2.5-10-cm lateral displacement) for the ocular melanoma field. Measurements of external field dose equivalent for the stereotactic field case showed differences as high as 50% depending on the modality of beam collimation. Average quality factors derived from this work ranged from 2-7, with the value dependent on the position within the phantom in relation to the primary beam. Conclusions: This work provides a valuable and clinically relevant comparison of the external field dose equivalents for various passively scattered proton treatment fields.
Multiple stressor effects in relation to declining amphibian populations
Linder, Greg L.; Krest, Sherry K.; Sparling, Donald; Little, E.
2003-01-01
Original research discusses the protocols and approaches to studying the effects of multiple environmental stressors on amphibian populations and gives new perspectives on this complicated subject. This publication integrates a variety of stressors that can act in concert and may ultimately cause a decline in amphibian populations. Sixteen peer-reviewed papers cover the following areas. Toxicity Assessment examines methods ranging from long-established laboratory approaches for evaluating adverse chemical effects on amphibians to methods that link chemicals in surface waters, sediments, and soils with adverse effects observed among amphibians in the field. Field and Laboratory Studies illustrates studies in the evaluation of multiple stressor effects that may lead to declining amphibian populations; a range of laboratory and field studies of chemicals such as herbicides, insecticides, chlorinated organic compounds, metals, and complex mixtures is also included. Causal Analysis demonstrates the range of tools currently available for evaluating "cause-effect" relationships between environmental stressors and declining amphibian populations. Audience: this publication is a must-have for scientists and resource management professionals from diverse fields, including ecotoxicology, chemistry, ecology, field biology, conservation biology, and natural resource management.
Dannenberg, Andrew L; Bhatia, Rajiv; Cole, Brian L; Dora, Carlos; Fielding, Jonathan E; Kraft, Katherine; McClymont-Peace, Diane; Mindell, Jennifer; Onyekere, Chinwe; Roberts, James A; Ross, Catherine L; Rutt, Candace D; Scott-Samuel, Alex; Tilson, Hugh H
2006-02-01
Health impact assessment (HIA) methods are used to evaluate the impact on health of policies and projects in community design, transportation planning, and other areas outside traditional public health concerns. At an October 2004 workshop, domestic and international experts explored issues associated with advancing the use of HIA methods by local health departments, planning commissions, and other decisionmakers in the United States. Workshop participants recommended conducting pilot tests of existing HIA tools, developing a database of health impacts of common projects and policies, developing resources for HIA use, building workforce capacity to conduct HIAs, and evaluating HIAs. HIA methods can influence decisionmakers to adjust policies and projects to maximize benefits and minimize harm to the public's health.
Optical Molecular Imaging for Diagnosing Intestinal Diseases
Kim, Sang-Yeob
2013-01-01
Real-time visualization of the molecular signature of cells can be achieved with advanced targeted imaging techniques using molecular probes and fluorescence endoscopy. This molecular optical imaging in gastrointestinal endoscopy is promising for improving the detection of neoplastic lesions, their characterization for patient stratification, and the assessment of their response to molecular targeted therapy and radiotherapy. In inflammatory bowel disease, this method can be used to detect dysplasia in the presence of background inflammation and to visualize inflammatory molecular targets for assessing disease severity and prognosis. Several preclinical and clinical trials have applied this method in endoscopy; however, this field has just started to evolve. Hence, many problems have yet to be solved to enable the clinical application of this novel method. PMID:24340254
Hyperactivity and Motoric Activity in ADHD: Characterization, Assessment, and Intervention
Gawrilow, Caterina; Kühnhausen, Jan; Schmid, Johanna; Stadler, Gertraud
2014-01-01
The aim of the present literature review is threefold. (1) We will review theories, models, and studies on symptomatic hyperactivity and motoric activity in attention-deficit/hyperactivity disorder (ADHD). (2) Another focus will be on assessment methods that have been proven to be effective in the detection of hyperactivity and motoric activity in children, adolescents, and adults with and without ADHD and emerging areas of research in the field of ADHD. We will compare subjective methods (i.e., rating scales) and objective methods (i.e., accelerometers). (3) Finally, physical activity intervention studies aiming at a modification of activity and overactive behavior will be summarized that seem to be promising candidates for alleviating hyperactivity symptoms in children, adolescents, and adults with ADHD. PMID:25506329
Sannino, Anna; Romeo, Stefania; Scarfì, Maria Rosaria; Massa, Rita; d’Angelo, Raffaele; Petrillo, Antonella; Cerciello, Vincenzo; Fusco, Roberta; Zeni, Olga
2017-01-01
Magnetic resonance imaging (MRI) has evolved rapidly over the past few decades as one of the most flexible tools in medical research and diagnostic imaging. MRI facilities are important sources of multiple exposure to electromagnetic fields for both patients and health-care staff, due to the presence of electromagnetic fields of multiple frequency ranges, different temporal variations, and field strengths. Due to the increasing use and technological advancements of MRI systems, clearer insights into exposure assessment and a better understanding of possible harmful effects due to long-term exposures are highly needed. In the present exploratory study, exposure assessment and biomonitoring of MRI workers at the Radio-diagnostics Unit of the National Cancer Institute of Naples "Pascale Foundation" (Naples, Italy) have been carried out. In particular, exposure to the MRI static magnetic field (SMF) has been evaluated by means of personal monitoring, while an application tool has been developed to provide an estimate of motion-induced, time-varying electric fields. Measurement results have highlighted a high day-to-day and worker-to-worker variability of the exposure to the SMF, which strongly depends on the characteristics of the environment and on personal behaviors, and the developed application tool can be adopted as an easy-to-use tool for rapid and qualitative evaluation of motion-induced, time-varying electric field exposure. Regarding biomonitoring, the 24 workers of the Radio-diagnostics Unit were enrolled to evaluate both spontaneous and mitomycin C-induced chromosomal fragility in human peripheral blood lymphocytes, by means of the cytokinesis-block micronucleus assay. The study subjects were 12 MRI workers, representative of different professional categories, as the exposed group, and 12 workers with no MRI exposure history, as the reference group. The results show a high worker-to-worker variability for both field exposure assessment and biomonitoring, as well as several critical issues and practical considerations to be addressed in this type of investigation. The procedures for risk assessment and biomonitoring proposed here can be used to inform future research in this field, which will require a refinement of exposure assessment methods and an enlargement of the number of subjects enrolled in the biomonitoring study to gain robust statistics and reliable results. PMID:29326919
NASA Astrophysics Data System (ADS)
Duval, Rodolphe; Fauchard, Cyrille; Antoine, Raphael
2014-05-01
We study the influence of the topography of a levee on the electric and magnetic signals obtained with the radio-magnetotelluric (RMT) method and the Slingram method, respectively. For the RMT method, field measurements have been modelled with commercial finite-element software (the AC/DC and Radio-Frequency modules of Comsol Multiphysics). A levee situated in Orléans (France) along the Loire river has been considered in order to design a model taking into account the skin depth and the incident wavelength. The effect of the incident electromagnetic field direction has been assessed with two different incident wave directions: BBC 5 from Salford (UK) and France-Inter from Allouis (France). The simulations highlight the three-dimensional effects of the topography on the apparent resistivity observed on the crest of the levee, depending on the incident field direction. For the Slingram method, the magnetic field has been simulated using the AC/DC module of Comsol. The ratio of the primary magnetic field to the secondary one, received in a loop, is determined above a straight levee. The study aims to show the various responses obtained as a function of both vertical and horizontal coil configurations. We show that the signal also depends on the topography and on the correct alignment of the coils with respect to the levee stretch direction. In this study, a buried gas pipe is also characterized by the two methods. Numerical modelling of 3D electromagnetic effects on geophysical signals helps to interpret the field measurements and offers stakeholders an optimized methodology for geophysical surveys on levees.
[Research progress on ergonomics assessment and measurement methods for push-pull behavior].
Zhao, Yan; Li, Dongxu; Guo, Shengpeng
2011-10-01
Pushing and pulling (P&P) is a common operating mode of operators' physical work and plays an important role in the evaluation of human behavioral health and operational performance. At present, there are many research methods for P&P, and this article presents a state-of-the-art review of the classification of P&P research methods, the various impact factors in P&P programs, the technical details of internal/external P&P force measurement and evaluation, the limitations of current research methods, and future developments in the ergonomics field.
Evaluation in context: ATC automation in the field
NASA Technical Reports Server (NTRS)
Harwood, Kelly; Sanford, Beverly
1994-01-01
The process for incorporating advanced technologies into complex aviation systems is as important as the final product itself. This paper describes a process that is currently being applied to the development and assessment of an advanced ATC automation system, CTAS. The key element of the process is field exposure early in the system development cycle. The process deviates from current established practices of system development -- where field testing is an implementation endpoint -- and has been deemed necessary by the FAA for streamlining development and bringing system functions to a level of stability and usefulness. Methods and approaches for field assessment are borrowed from human factors engineering, cognitive engineering, and usability engineering and are tailored for the constraints of an operational ATC environment. To date, the focus has been on the qualitative assessment of the match between TMA capabilities and the context for their use. Capturing the users' experience with the automation tool and understanding tool use in the context of the operational environment is important, not only for developing a tool that is an effective problem-solving instrument but also for defining meaningful operational requirements. Such requirements form the basis for certifying the safety and efficiency of the system. CTAS is the first U.S. advanced ATC automation system of its scope and complexity to undergo this field development and assessment process. With the rapid advances in aviation technologies and our limited understanding of their impact on system performance, it is time we opened our eyes to new possibilities for developing, validating, and ultimately certifying complex aviation systems.
A polyvalent harmonic coil testing method for small-aperture magnets
NASA Astrophysics Data System (ADS)
Arpaia, Pasquale; Buzio, Marco; Golluccio, Giancarlo; Walckiers, Louis
2012-08-01
A method to characterize permanent and fast-pulsed iron-dominated magnets with small apertures is presented. The harmonic coil measurement technique is enhanced specifically for small-aperture magnets by (1) in situ calibration, to cope with search-coil production inaccuracies, (2) rotating the magnet around its axis, to correct systematic effects, and (3) measuring magnetic fluxes with stationary coils at different angular positions, to measure fast-pulsed magnets. This method allows a quadrupole magnet for particle accelerators to be characterized completely by assessing the multipole field components, the magnetic axis position, and the field direction. The paper first highlights the metrological problems arising from testing small-aperture magnets, then illustrates the basic ideas of the proposed method and the architecture of the corresponding measurement system. Finally, experimental validation results are shown for small-aperture permanent and fast-ramped quadrupole magnets for the new linear accelerator Linac4 at CERN (European Organization for Nuclear Research).
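As a hedged sketch of the harmonic analysis underlying rotating-coil measurements, the snippet below takes a flux signal sampled versus rotation angle and returns its Fourier coefficients, which (up to coil sensitivity factors) are proportional to the multipole components. Calibration, axis finding, and the stationary-coil variant described in the paper are omitted; all names and the test signal are illustrative.

```python
import numpy as np

def multipole_harmonics(flux, n_max=10):
    """Return the complex Fourier coefficients of the coil flux versus
    rotation angle; up to coil sensitivity factors these are proportional
    to the normal/skew multipole components of the field."""
    flux = np.asarray(flux, dtype=float)
    coeffs = np.fft.rfft(flux) / len(flux)
    return coeffs[1:n_max + 1]   # skip the DC term

# Illustrative signal: dominant quadrupole (n = 2) with a small sextupole.
theta = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
flux = 1.0 * np.cos(2 * theta) + 0.01 * np.cos(3 * theta + 0.3)
harm = multipole_harmonics(flux, n_max=6)
print("amplitudes relative to the quadrupole term:", np.abs(harm) / np.abs(harm[1]))
```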
Field Analysis of Microbial Contamination Using Three Molecular Methods in Parallel
NASA Technical Reports Server (NTRS)
Morris, H.; Stimpson, E.; Schenk, A.; Kish, A.; Damon, M.; Monaco, L.; Wainwright, N.; Steele, A.
2010-01-01
Advanced technologies capable of detecting microbial contamination remain an integral tool for the next stage of space agency proposed exploration missions. To maintain a clean, operational spacecraft environment with minimal potential for forward contamination, such technology is a necessity, particularly the ability to analyze samples near the point of collection and in real time, both for conducting biological scientific experiments and for performing routine monitoring operations. Multiple molecular methods for detecting microbial contamination are available, but many are either too large or not validated for use on spacecraft. Two methods, the adenosine triphosphate (ATP) and Limulus Amebocyte Lysate (LAL) assays, have been approved by the NASA Planetary Protection Office for the assessment of microbial contamination on spacecraft surfaces. We present the first parallel field analysis of microbial contamination pre- and post-cleaning using these two methods as well as universal primer-based polymerase chain reaction (PCR).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Greg D.; Evans, Nathan R.; Pearson, Walter H.
2001-10-30
The primary goal of our research this spring/summer was to refine techniques and examine scenarios under which a standardized protocol could be applied to assess juvenile coho salmon (O. kisutch) passage through road culverts. Field evaluations focused on capture-mark-recapture methods that allowed analysis of fish movement patterns, estimates of culvert passability, and potential identification of cues inducing these movements. At this stage, 0+ age coho salmon fry 30 mm to 65 mm long (fork length) were the species and age class of interest. Ultimately, the protocol will provide rapid, statistically rigorous methods for trained personnel to perform standardized biological assessments of culvert passability for a number of juvenile salmon species. Questions to be addressed by the research include the following: Do hydraulic structures such as culverts restrict habitat for juvenile salmonids? How do existing culverts and retrofits perform relative to juvenile salmonid passage? Do some culvert characteristics and hydraulic conditions provide better passage than others? Does the culvert represent a barrier to certain size classes of fish? Recommendations addressed issues of study site selection, initial capture, marking, recapture/observations, and estimating movement.
Dietary assessment in elderly people: experiences gained from studies in the Netherlands.
de Vries, J H M; de Groot, L C P G M; van Staveren, W A
2009-02-01
In selecting a dietary assessment method, several aspects such as the aim of the study and the characteristics of the target population should be taken into account. In elderly people, diminished functionality and cognitive decline may hamper dietary assessment and require tailored approaches to assess dietary intake. The objective of this paper is to summarize our experience in dietary assessment in a number of different studies in population groups over 65 years of age in the Netherlands, and to discuss this experience in the perspective of other nutrition surveys in the elderly. In longitudinal studies, we applied a modified dietary history; in clinical nursing home studies, trained staff observed and recorded food consumption; and in a controlled trial in healthy elderly men, we used a food frequency questionnaire (FFQ). For all methods applied in the community-dwelling elderly people, validation studies showed a similar underestimation of intake of 10-15% compared with the reference value. In the care-depending elderly, the underestimation was less: 5% according to an observational method. The methods varied widely in the resources required, including burden to the participants, field staff and finances. For effective dietary assessment in older adults, the major challenge will be to distinguish between those elderly who are able to respond correctly to the less intensive methods, such as 24-h recalls or FFQ, and those who are not able to respond to these methods and require adapted techniques, for example, observational records.
Patching, Geoffrey R.; Rahm, Johan; Jansson, Märit; Johansson, Maria
2017-01-01
Accurate assessment of people’s preferences for different outdoor lighting applications is increasingly considered important in the development of new urban environments. Here a new method of random environmental walking is proposed to complement current methods of assessing urban lighting applications, such as self-report questionnaires. The procedure involves participants repeatedly walking between different lighting applications by random selection of a lighting application and preferred choice or by random selection of a lighting application alone. In this manner, participants are exposed to all lighting applications of interest more than once and participants’ preferences for the different lighting applications are reflected in the number of times they walk to each lighting application. On the basis of an initial simulation study, to explore the feasibility of this approach, a comprehensive field test was undertaken. The field test included random environmental walking and collection of participants’ subjective ratings of perceived pleasantness (PP), perceived quality, perceived strength, and perceived flicker of four lighting applications. The results indicate that random environmental walking can reveal participants’ preferences for different lighting applications that, in the present study, conformed to participants’ ratings of PP and perceived quality of the lighting applications. As a complement to subjectively stated environmental preferences, random environmental walking has the potential to expose behavioral preferences for different lighting applications. PMID:28337163
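The walking procedure above can be illustrated with a heavily simplified simulation: at each step one lighting application is offered at random and the participant walks to it only if they rate it at least as highly as their current location, so that visit counts end up reflecting preferences. The decision rule, scores, and names below are assumptions for illustration, not the protocol used in the field test.

```python
import numpy as np

def simulate_walks(preference_scores, n_steps=100, seed=0):
    """Simplified simulation: at each step one application is drawn at
    random and compared against the participant's current location; the
    participant walks to whichever they rate higher. Returns visit counts."""
    rng = np.random.default_rng(seed)
    scores = np.asarray(preference_scores, dtype=float)
    visits = np.zeros(len(scores), dtype=int)
    current = rng.integers(len(scores))
    for _ in range(n_steps):
        offered = rng.integers(len(scores))
        current = offered if scores[offered] >= scores[current] else current
        visits[current] += 1
    return visits

# Four hypothetical lighting applications; higher score = more preferred.
print(simulate_walks([0.2, 0.9, 0.5, 0.4]))
```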
NASA Astrophysics Data System (ADS)
Gurk, M.; Bosch, F. P.; Tougiannidis, N.
2013-04-01
Common studies on the static electric field distribution over a conductivity anomaly use the self-potential method. However, this method is time-consuming and requires nonpolarizable electrodes to be placed in the ground. Moreover, the information gained by this method is restricted to the horizontal variations of the electric field. To overcome the limitations of the self-potential technique, we conducted a field experiment using a non-conventional technique to assess the static electric field over a conductivity anomaly. We use two metallic potential probes arranged on an insulated boom with a separation of 126 cm. When placed into the electric field of the free air, a surface charge is induced on each probe, which tries to equalize with the potential of the surrounding atmosphere. The use of a plasma source at both probes facilitated continuous and quicker measurement of the electric field in the air. The present study shows the first experimental measurements with a modified potential probe (MPP) technique along a 600-meter-long transect to demonstrate the general feasibility of this method for studying the static electric field distribution over shallow conductivity anomalies. Field measurements were carried out on a test site on top of the Bramsche Massif near Osnabrück (Northwest Germany) to benefit from a variety of available near-surface data over an almost vertical conductivity anomaly. High-resolution self-potential data served in a numerical analysis to estimate the expected individual components of the electric field vector. During the experiment we found more anomalies in the vertical and horizontal components of the electric field than self-potential anomalies. These contrasting findings are successfully cross-validated with conventional near-surface geophysical methods. Among these methods, we used self-potential, radiomagnetotelluric, electric resistivity tomography, and induced polarization data to derive 2D conductivity models of the subsurface in order to infer the geometrical properties and the origin of the conductivity anomaly in the survey area. The presented study demonstrates the feasibility of electric field measurements in free air to detect and study near-surface conductivity anomalies. Variations in Ez correlate well with the conductivity distribution obtained from resistivity methods. Compared to the self-potential technique, continuous free-air measurements of the electric field are quicker and offer better lateral resolution, combined with the unique ability to analyze vertical components of the electric field, which are of particular importance for detecting lateral conductivity contrasts. Mapping Ez in free air is a good tool to precisely map lateral changes of the electric field distribution in areas where SP generation fails. The MPP technique also offers interesting applications in other geophysical techniques, e.g. in time-domain electromagnetics, DC resistivity, and IP. With this method we were able to reveal a ca. 150 m broad zone of enhanced electric field strength.
Ogourtsova, Tatiana; Archambault, Philippe S; Lamontagne, Anouk
2017-11-07
Hemineglect, defined as a failure to attend to the contralesional side of space, is a prevalent and disabling post-stroke deficit. Conventional hemineglect assessments lack sensitivity as they contain mainly non-functional tasks performed in near-extrapersonal space, using static, two-dimensional methods. This is of concern given that hemineglect is a strong predictor for functional deterioration, limited post-stroke recovery, and difficulty in community reintegration. With the emerging field of virtual reality, several virtual tools have been proposed and have reported better sensitivity in neglect-related deficits detection than conventional methods. However, these and future virtual reality-based tools are yet to be implemented in clinical practice. The present study aimed to explore the barriers/facilitators perceived by clinicians in the use of virtual reality for hemineglect assessment; and to identify features of an optimal virtual assessment. A qualitative descriptive process, in the form of focus groups, self-administered questionnaire and individual interviews was used. Two focus groups (n = 11 clinicians) were conducted and experts in the field (n = 3) were individually interviewed. Several barriers and facilitators, including personal, institutional, client suitability, and equipment factors, were identified. Clinicians and experts in the field reported numerous features for the virtual tool optimization. Factors identified through this study lay the foundation for the development of a knowledge translation initiative towards an implementation of a virtual assessment for hemineglect. Addressing the identified barriers/facilitators during implementation and incorporating the optimal features in the design of the virtual assessment could assist and promote its eventual adoption in clinical settings. Implications for rehabilitation A multimodal and active knowledge translation intervention built on the presently identified modifiable factors is suggested to be implemented to support the clinical integration of a virtual reality-based assessment for post-stroke hemineglect. To amplify application and usefulness of a virtual-reality based tool in the assessment of post-stroke hemineglect, optimal features identified in the present study should be incorporated in the design of such technology.
Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally
2015-01-01
Background Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision–analysis components. Methods for addressing data gaps remain an issue. Discussion Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778
Performance evaluation of infrared imaging system in field test
NASA Astrophysics Data System (ADS)
Wang, Chensheng; Guo, Xiaodong; Ren, Tingting; Zhang, Zhi-jie
2014-11-01
Infrared imaging systems have been applied widely in both military and civilian fields. Since infrared imagers come in various types with different parameters, system manufacturers and customers have a strong demand for evaluating the performance of IR imaging systems with a standard tool or platform. Since the first-generation IR imagers were developed, the standard method for assessing performance has been the MRTD and related improved methods, which are not well suited to current linear-scanning imagers or 2D staring imagers based on FPA detectors. To address this problem, this paper describes an evaluation method based on the triangle orientation discrimination (TOD) metric, which is considered an effective and emerging method for evaluating the overall performance of EO systems. To realize the evaluation in field tests, an experimental instrument was developed. Considering the importance of the operational environment, the field test was carried out in a practical atmospheric environment. The tested imagers include a panoramic imaging system and staring imaging systems with different optics and detector parameters (both cooled and uncooled). After describing the instrument and the experimental setup, the experimental results are presented and the target range performance is analyzed and discussed. In the data analysis, the range prediction values obtained from the TOD method, the MRTD method, and the practical experiment are compared and discussed. The experimental results prove the effectiveness of this evaluation tool, and it can be taken as a platform to provide a uniform performance prediction reference.
Description and evaluation of a peracetic acid air sampling and analysis method.
Nordling, John; Kinsky, Owen R; Osorio, Magdalena; Pechacek, Nathan
2017-12-01
Peracetic acid (PAA) is a corrosive chemical with a pungent odor that is used extensively in occupational settings and poses various health hazards to exposed workers. Currently, there is no US government agency-recommended method that can be applied universally for the sampling and analysis of PAA. Legacy methods for determining airborne PAA vapor levels frequently suffered from cross-reactivity with other chemicals, particularly hydrogen peroxide (H2O2). Therefore, to remove the confounding factor of cross-reactivity, a new viable, sensitive method was developed for assessing PAA exposure levels, based on the differential reaction kinetics of PAA with methyl p-tolyl sulfide (MTS), relative to H2O2, to preferentially form methyl p-tolyl sulfoxide (MTSO). By quantifying the MTSO concentration produced in the liquid capture solution from an air sampler, using an internal standard, and applying the reaction stoichiometry of PAA and MTS, the original airborne concentration of PAA is determined. After refining this liquid-trap high-performance liquid chromatography (HPLC) method in the laboratory, it was tested in five workplace settings where PAA products were used. PAA levels ranged from the detection limit of 0.013 parts per million (ppm) to 0.4 ppm. The results indicate a viable and potentially dependable method for assessing the concentrations of PAA vapors under occupational exposure scenarios, although only a small number of field measurements were taken while field-testing this method. However, the low limit of detection and precision offered by this method make it a strong candidate for further testing and validation to expand the uses of this liquid-trap HPLC method.
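A hedged sketch of the back-calculation step follows, assuming a 1:1 reaction stoichiometry between PAA and MTS (one mole of MTSO formed per mole of PAA captured), ideal-gas behaviour at 25 C, and illustrative values for the capture-solution and air-sample volumes; none of the numbers or names come from the paper.

```python
# Back-calculate an airborne PAA concentration from the MTSO measured
# in the liquid capture solution (hypothetical example values).
MTSO_MOLAR_MASS_G_PER_MOL = 154.23   # methyl p-tolyl sulfoxide
MOLAR_VOLUME_L_PER_MOL = 24.45       # ideal gas at 25 C, 1 atm

def airborne_paa_ppm(mtso_ug_per_ml, solution_ml, air_volume_l):
    """Assumes 1 mol MTSO formed per mol PAA captured (1:1 stoichiometry)."""
    mtso_mol = mtso_ug_per_ml * solution_ml * 1e-6 / MTSO_MOLAR_MASS_G_PER_MOL
    paa_mol = mtso_mol                       # 1:1 assumption
    paa_gas_l = paa_mol * MOLAR_VOLUME_L_PER_MOL
    return 1e6 * paa_gas_l / air_volume_l    # ppm by volume

print(f"{airborne_paa_ppm(mtso_ug_per_ml=0.5, solution_ml=10.0, air_volume_l=60.0):.3f} ppm")
```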
Östlund, Karl; Samuelsson, Christer; Mattsson, Sören; Rääf, Christopher L
2017-02-01
The peak-to-valley (PTV) method was investigated experimentally by comparing PTV ratios for three HPGe detectors, with complementary Monte Carlo simulations of scatter in air for larger source-detector distances. The measured PTV ratios for 137Cs in air were similar for the three detectors for incident angles between 0 and 90°. The study indicated that the PTV method can differentiate between surface and shallow-depth sources if the detector field of view is limited to a radius of less than 3.5 m. Copyright © 2016 Elsevier Ltd. All rights reserved.
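As a hedged illustration of how a peak-to-valley ratio can be computed from a gamma-ray spectrum, the sketch below sums counts in a 662 keV full-energy-peak window and divides by counts in a valley window below the peak. The energy windows and the synthetic spectrum are illustrative and not those used in the paper.

```python
import numpy as np

def peak_to_valley(counts, energies_kev, peak_window, valley_window):
    """Ratio of summed counts in the full-energy-peak window to summed
    counts in a valley window of the spectrum."""
    counts = np.asarray(counts, dtype=float)
    energies_kev = np.asarray(energies_kev, dtype=float)
    in_peak = (energies_kev >= peak_window[0]) & (energies_kev <= peak_window[1])
    in_valley = (energies_kev >= valley_window[0]) & (energies_kev <= valley_window[1])
    return counts[in_peak].sum() / counts[in_valley].sum()

# Illustrative spectrum: flat Compton continuum plus a 662 keV photopeak.
energies = np.arange(0.0, 800.0, 1.0)
spectrum = 50.0 + 5000.0 * np.exp(-0.5 * ((energies - 661.7) / 1.5) ** 2)
print(f"PTV ratio: {peak_to_valley(spectrum, energies, (655, 668), (600, 640)):.1f}")
```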
Correa-de-Araujo, Rosaly; Harris-Love, Michael O; Miljkovic, Iva; Fragala, Maren S; Anthony, Brian W; Manini, Todd M
2017-01-01
A growing body of scientific literature suggests that not only changes in skeletal muscle mass, but also other factors underpinning muscle quality, play a role in the decline in skeletal muscle function and impaired mobility associated with aging. A symposium on muscle quality and the need for standardized assessment was held on April 28, 2016 at the International Conference on Frailty and Sarcopenia Research in Philadelphia, Pennsylvania. The purpose of this symposium was to provide a venue for basic science and clinical researchers and expert clinicians to discuss muscle quality in the context of skeletal muscle function deficit and other aging-related muscle dysfunctions. The present article provides an expanded introduction concerning the emerging definitions of muscle quality and a potential framework for scientific inquiry within the field. Changes in muscle tissue composition, based on excessive levels of inter- and intra-muscular adipose tissue and intramyocellular lipids, have been found to adversely impact metabolism and peak force generation. However, methods to easily and rapidly assess muscle tissue composition in multiple clinical settings and with minimal patient burden are needed. Diagnostic ultrasound and other assessment methods continue to be developed for characterizing muscle pathology, and enhanced sonography using sensors to provide user feedback and improve reliability is currently the subject of ongoing investigation and development. In addition, measures of relative muscle force such as specific force or grip strength adjusted for body size have been proposed as methods to assess changes in muscle quality. Furthermore, performance-based assessments of muscle power via timed tests of function and body size estimates, which are associated with lower extremity muscle strength, may be responsive to age-related changes in muscle quality. Future aims include reaching consensus on the definition and standardized assessments of muscle quality, and providing recommendations to address critical clinical and technology research gaps within the field.
Collins, Sarah
2016-01-01
Due to rapid advances in technology, HIT competencies for nursing leaders require frequent attention and updating from experts in the field to ensure relevance to nursing leaders' work. This workshop will target nursing informatics researchers and leaders to: 1) learn methods and findings from a study validating a Self-Assessment Scale for Nursing Informatics Competencies for Nurse Leaders, 2) generate awareness of the Self-Assessment scale, 3) discuss strategies for maintenance of competencies over time, and 4) identify strategies to engage nursing leaders in this pursuit.
Generation of Particles and Seeding
NASA Technical Reports Server (NTRS)
Meyers, James F.
1991-01-01
One of the most important elements in laser velocimetry, yet the most neglected, is the small particle embedded in the flow field that scatters the light necessary to make velocity measurements. An attempt to remove the confusion in choosing a seeding method by assessing many of the techniques currently used is presented. Their characteristics and typical limitations imposed by various applications are outlined. The ramifications of these methods on measurement accuracy are addressed.
Andrew N. Gray; Thomas R. Whittier; David L. Azuma
2014-01-01
A substantial portion of the carbon (C) emitted by human activity is apparently being stored in forest ecosystems in the Northern Hemisphere, but the magnitude and cause are not precisely understood. Current official estimates of forest C flux are based on a combination of field measurements and other methods. The goal of this study was to improve on existing methods...
Analysis of citation networks as a new tool for scientific research
Vasudevan, R. K.; Ziatdinov, M.; Chen, C.; ...
2016-12-06
The rapid growth of scientific publications necessitates new methods to understand the direction of scientific research within fields of study, ascertain the importance of particular groups, authors, or institutions, compute metrics that can determine the importance (centrality) of particular seminal papers, and provide insight into the social (collaboration) networks that are present. We present one such method based on analysis of citation networks, using the freely available CiteSpace Program. We use citation network analysis on three examples, including a single material that has been widely explored in the last decade (BiFeO3), two small subfields with a minimal number of authors (flexoelectricity and Kitaev physics), and a much wider field with thousands of publications pertaining to a single technique (scanning tunneling microscopy). Interpretation of the analysis and key insights into the fields, such as whether the fields are experiencing resurgence or stagnation, are discussed, and author or collaboration networks that are prominent are determined. Such methods represent a paradigm shift in our way of dealing with the large volume of scientific publications and could change the way literature searches and reviews are conducted, as well as how the impact of specific work is assessed.
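The paper uses the CiteSpace program; as a hedged illustration of the underlying idea only, the sketch below builds a small directed citation graph with networkx and computes betweenness centrality and in-degree as simple proxies for pivotal and highly cited papers. The edge list is invented.

```python
import networkx as nx

# Directed citation graph: an edge A -> B means paper A cites paper B.
# The edge list below is purely illustrative.
citations = [
    ("P3", "P1"), ("P3", "P2"), ("P4", "P3"),
    ("P5", "P3"), ("P5", "P2"), ("P6", "P4"), ("P6", "P5"),
]
G = nx.DiGraph(citations)

# Betweenness centrality is one common proxy for "pivotal" papers that
# bridge otherwise separate parts of the literature.
centrality = nx.betweenness_centrality(G)
for paper, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(paper, round(score, 3))

# In-degree counts how often each paper is cited within the sampled corpus.
print(dict(G.in_degree()))
```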
Intact skull chronic windows for mesoscopic wide-field imaging in awake mice
Silasi, Gergely; Xiao, Dongsheng; Vanni, Matthieu P.; Chen, Andrew C. N.; Murphy, Timothy H.
2016-01-01
Background: Craniotomy-based window implants are commonly used for microscopic imaging in head-fixed rodents; however, their field of view is typically small and incompatible with mesoscopic functional mapping of cortex. New Method: We describe a reproducible and simple procedure for chronic through-bone wide-field imaging in awake head-fixed mice, providing stable optical access for chronic imaging over large areas of the cortex for months. Results: The preparation is produced by applying clear-drying dental cement to the intact mouse skull, followed by a glass coverslip to create a partially transparent imaging surface. The surgery takes about 30 minutes. A single set-screw provides a stable means of attachment for mesoscale assessment without obscuring the cortical field of view. Comparison with Existing Methods: We demonstrate the utility of this method by showing seed-pixel functional connectivity maps generated from spontaneous cortical activity of GCaMP6 signals in both awake and anesthetized mice. Conclusions: We propose that the intact skull preparation described here may be used for most longitudinal studies that do not require micron-scale resolution and where cortical neural or vascular signals are recorded with intrinsic sensors. PMID:27102043
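A hedged sketch of a seed-pixel functional connectivity map follows: the Pearson correlation between the time course of a chosen seed pixel and that of every other pixel in a wide-field image stack. Array shapes, the random test data, and all names are assumptions for illustration, not the authors' analysis pipeline.

```python
import numpy as np

def seed_pixel_map(stack, seed_row, seed_col):
    """stack: (time, rows, cols) array of dF/F frames. Returns the map of
    Pearson correlations between the seed pixel's time course and every
    other pixel's time course."""
    t, r, c = stack.shape
    data = stack.reshape(t, r * c)
    data = data - data.mean(axis=0)
    seed = data[:, seed_row * c + seed_col]
    num = data.T @ seed
    denom = np.linalg.norm(data, axis=0) * np.linalg.norm(seed) + 1e-12
    return (num / denom).reshape(r, c)

# Illustrative stack of spontaneous-activity frames.
rng = np.random.default_rng(2)
frames = rng.standard_normal((600, 64, 64))
corr_map = seed_pixel_map(frames, seed_row=32, seed_col=32)
print(corr_map.shape, corr_map[32, 32])   # the seed correlates perfectly with itself
```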
Numerical analysis of biosonar beamforming mechanisms and strategies in bats.
Müller, Rolf
2010-09-01
Beamforming is critical to the function of most sonar systems. The conspicuous noseleaf and pinna shapes in bats suggest that beamforming mechanisms based on diffraction of the outgoing and incoming ultrasonic waves play a major role in bat biosonar. Numerical methods can be used to investigate the relationships between baffle geometry, acoustic mechanisms, and resulting beampatterns. Key advantages of numerical approaches are: efficient, high-resolution estimation of beampatterns, spatially dense predictions of near-field amplitudes, and the malleability of the underlying shape representations. A numerical approach that combines near-field predictions based on a finite-element formulation for harmonic solutions to the Helmholtz equation with a free-field projection based on the Kirchhoff integral to obtain estimates of the far-field beampattern is reviewed. This method has been used to predict physical beamforming mechanisms such as frequency-dependent beamforming with half-open resonance cavities in the noseleaf of horseshoe bats and beam narrowing through extension of the pinna aperture with skin folds in false vampire bats. The fine structure of biosonar beampatterns is discussed for the case of the Chinese noctule and methods for assessing the spatial information conveyed by beampatterns are demonstrated for the brown long-eared bat.
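For orientation, the two ingredients named above are harmonic solutions of the Helmholtz equation in the near field and a Kirchhoff(-Helmholtz) integral projection to the far field. Standard textbook forms are shown below; sign conventions vary, and this is not the paper's exact formulation.

```latex
% Helmholtz equation for the time-harmonic acoustic pressure p with
% wavenumber k = \omega / c:
\[
  \nabla^2 p + k^2 p = 0 .
\]
% Kirchhoff-Helmholtz integral projecting the pressure and its normal
% derivative on a closed surface S to an exterior field point r, with
% free-space Green's function G(r, r') = e^{ik|r - r'|} / (4\pi |r - r'|):
\[
  p(\mathbf{r}) \;=\; \oint_S \left[
      p(\mathbf{r}')\,\frac{\partial G(\mathbf{r},\mathbf{r}')}{\partial n'}
      \;-\; G(\mathbf{r},\mathbf{r}')\,\frac{\partial p(\mathbf{r}')}{\partial n'}
  \right] \mathrm{d}S' .
\]
```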
Description and evaluation of an interference assessment for a slotted-wall wind tunnel
NASA Technical Reports Server (NTRS)
Kemp, William B., Jr.
1991-01-01
A wind-tunnel interference assessment method applicable to test sections with discrete finite-length wall slots is described. The method is based on high order panel method technology and uses mixed boundary conditions to satisfy both the tunnel geometry and wall pressure distributions measured in the slotted-wall region. Both the test model and its sting support system are represented by distributed singularities. The method yields interference corrections to the model test data as well as surveys through the interference field at arbitrary locations. These results include the equivalent of tunnel Mach calibration, longitudinal pressure gradient, tunnel flow angularity, wall interference, and an inviscid form of sting interference. Alternative results which omit the direct contribution of the sting are also produced. The method was applied to the National Transonic Facility at NASA Langley Research Center for both tunnel calibration tests and tests of two models of subsonic transport configurations.
Javelot, Hervé; Spadazzi, Anne; Weiner, Luisa; Garcia, Sonia; Gentili, Claudio; Kosel, Markus; Bertschy, Gilles
2014-01-01
This paper reviews what we know about prediction in relation to mood disorders from the perspective of clinical, biological, and physiological markers. It then also presents how information and communication technologies have developed in the field of mood disorders, from the first steps, for example, the transition from paper and pencil to more sophisticated methods, to the development of ecological momentary assessment methods and, more recently, wearable systems. These recent developments have paved the way for the use of integrative approaches capable of assessing multiple variables. The PSYCHE project stands for Personalised monitoring SYstems for Care in mental HEalth. PMID:25050321
Sport Fields as Potential Catalysts for Physical Activity in the Neighbourhood
Cutumisu, Nicoleta; Spence, John C.
2012-01-01
Physical activity is associated with access to recreational facilities such as sports fields. Because it is not clear whether objectively- or subjectively-assessed access to facilities exerts a stronger influence on physical activity, we investigated the association between the objective and perceived accessibility of sport fields and the levels of self-reported physical activity among adults in Edmonton, Canada. A sample of 2879 respondents was surveyed regarding their socio-demographics, health status, self-efficacy, levels of physical activity, as well as their perceptions of built environment in relation to physical activity. Neighbourhood-level data were obtained for each respondent based on their residence. Accessibility to facilities was assessed using the enhanced Two-Step Floating Catchment Area method. Geographic Information Systems were employed. A logistic regression was performed to predict physical activity using individual- and neighbourhood-level variables. Women, older individuals, and individuals with higher educational attainment were less likely to be physically active. Also, individuals with higher self-efficacy and higher objectively-assessed access to facilities were more likely to be physically active. Interventions that integrate provision of relevant programs for various population groups and of improved recreational facilities may contribute to sport fields becoming catalysts for physical activity by generating movement both on the site and in the neighbourhood. PMID:22470293
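The abstract names the enhanced Two-Step Floating Catchment Area method; as a hedged sketch of the generic two-step idea only, the snippet below computes facility supply-to-demand ratios with a distance-decay weight (step 1) and then sums the weighted ratios reachable from each neighbourhood (step 2). The Gaussian decay, catchment size, and all data are invented, not the study's parameters.

```python
import numpy as np

def e2sfca(distance_km, supply, demand, catchment_km=5.0, decay=1.0):
    """distance_km: (n_neighbourhoods, n_facilities) travel distances.
    supply: capacity per facility; demand: population per neighbourhood.
    Returns an accessibility score per neighbourhood (E2SFCA-style,
    with a Gaussian distance-decay weight; parameters are illustrative)."""
    w = np.exp(-(distance_km / decay) ** 2)
    w[distance_km > catchment_km] = 0.0
    # Step 1: facility-level supply-to-demand ratios with weighted demand.
    ratios = supply / (w.T @ demand + 1e-12)
    # Step 2: sum weighted ratios over facilities reachable from each area.
    return w @ ratios

dist = np.array([[1.0, 4.0], [3.0, 2.0], [6.0, 1.5]])
print(e2sfca(dist, supply=np.array([2.0, 1.0]), demand=np.array([500.0, 800.0, 300.0])))
```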
Sheehy, Susanne H; Douglas, Alexander D; Draper, Simon J
2013-09-01
In the absence of any highly effective vaccine candidate against Plasmodium falciparum malaria, it remains imperative for the field to pursue all avenues that may lead to the successful development of such a formulation. The development of a subunit vaccine targeting the asexual blood-stage of Plasmodium falciparum malaria infection has proven particularly challenging with only limited success to date in clinical trials. However, only a fraction of potential blood-stage vaccine antigens have been evaluated as targets, and a number of new promising candidate antigen formulations and delivery platforms are approaching clinical development. It is therefore essential that reliable and sensitive methods of detecting, or ruling out, even modest efficacy of blood-stage vaccines in small clinical trials be established. In this article we evaluate the challenges facing blood-stage vaccine developers, assess the appropriateness and limitations of various in vivo approaches for efficacy assessment and suggest future directions for the field.
Lee, Sangyeol; Reinhardt, Joseph M; Cattin, Philippe C; Abràmoff, Michael D
2010-08-01
Fundus camera imaging of the retina is widely used to diagnose and manage ophthalmologic disorders including diabetic retinopathy, glaucoma, and age-related macular degeneration. Retinal images typically have a limited field of view, and multiple images can be joined together using an image registration technique to form a montage with a larger field of view. A variety of methods for retinal image registration have been proposed, but evaluating such methods objectively is difficult due to the lack of a reference standard for the true alignment of the individual images that make up the montage. A method of generating simulated retinal images by modeling the geometric distortions due to the eye geometry and the image acquisition process is described in this paper. We also present a validation process that can be used for any retinal image registration method by tracing through the distortion path and assessing the geometric misalignment in the coordinate system of the reference standard. The proposed method can be used to perform an accuracy evaluation over the whole image, so that distortion in the non-overlapping regions of the montage components can be easily assessed. We demonstrate the technique by generating test image sets with a variety of overlap conditions and compare the accuracy of several retinal image registration models. Copyright 2010 Elsevier B.V. All rights reserved.
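A hedged sketch of the evaluation idea follows: push a grid of points through a known ground-truth mapping and through an estimated registration transform, then report the geometric misalignment (RMS error) in the reference coordinate system. Both transforms here are simple affine matrices for illustration; the paper's eye-geometry distortion model is not reproduced.

```python
import numpy as np

def rms_misalignment(points, true_transform, estimated_transform):
    """points: (n, 2) array. Transforms are 3x3 homogeneous matrices.
    Returns the RMS distance between ground-truth and estimated mappings."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    p_true = (homo @ true_transform.T)[:, :2]
    p_est = (homo @ estimated_transform.T)[:, :2]
    return np.sqrt(np.mean(np.sum((p_true - p_est) ** 2, axis=1)))

# Evaluation grid spanning a hypothetical 512 x 512 image.
grid = np.stack(np.meshgrid(np.linspace(0, 512, 9), np.linspace(0, 512, 9)), -1).reshape(-1, 2)
T_true = np.array([[1.00, 0.01, 5.0], [-0.01, 1.00, -3.0], [0.0, 0.0, 1.0]])
T_est = np.array([[1.00, 0.00, 4.0], [0.00, 1.00, -2.5], [0.0, 0.0, 1.0]])
print(f"RMS misalignment: {rms_misalignment(grid, T_true, T_est):.2f} px")
```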
Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine
2015-01-01
Background: In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods: Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results: With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion: A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allows for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping. PMID:25767402
To assess the ambient concentration levels of the six criteria air pollutants regulated by National Ambient Air Quality Standards (NAAQS), the U.S. Environmental Protection Agency (EPA) developed a systematic framework of: (a) field measurements of ambient air pollutant levels ...
Accuracy of the NDI Wave Speech Research System
ERIC Educational Resources Information Center
Berry, Jeffrey J.
2011-01-01
Purpose: This work provides a quantitative assessment of the positional tracking accuracy of the NDI Wave Speech Research System. Method: Three experiments were completed: (a) static rigid-body tracking across different locations in the electromagnetic field volume, (b) dynamic rigid-body tracking across different locations within the…
Community duplicate diet methodology: A new tool for estimating dietary exposure to pesticides
An observational field study was conducted to assess the feasibility of a community duplicate diet collection method; a dietary monitoring procedure that is population-based. The purpose was to establish an alternative procedure to duplicate diet sampling that would be more effi...
Effects of field storage method on E. coli concentrations measured in storm water runoff
USDA-ARS?s Scientific Manuscript database
Storm water runoff is increasingly assessed for fecal indicator organisms (e.g., Escherichia coli, E. coli) and its impact on contact recreation. Concurrently, use of autosamplers along with logistic, economic, technical, and personnel barriers are challenging conventional protocols for sample hold...
Assessment protocols of maximum oxygen consumption in young people with Down syndrome--a review.
Seron, Bruna Barboza; Greguol, Márcia
2014-03-01
Maximum oxygen consumption is considered the gold standard measure of cardiorespiratory fitness. Young people with Down syndrome (DS) present low values of this indicator compared to their peers without disabilities and to young people with an intellectual disability but without DS. The use of reliable and valid assessment methods provides more reliable results for the diagnosis of cardiorespiratory fitness and the response of this variable to exercise. The aim of the present study was to review the literature on the assessment protocols used to measure maximum oxygen consumption in children and adolescents with Down syndrome, with emphasis on the protocols used, the validation process, and their feasibility. The search was carried out in eight electronic databases: Scopus, Medline-PubMed, Web of Science, SportDiscus, CINAHL, Academic Search Premier, SciELO, and Lilacs. The inclusion criteria were: (a) articles which assessed VO2peak and/or VO2max (independent of the validation method), (b) samples composed of children and/or adolescents with Down syndrome, (c) participants up to 20 years old, and (d) studies performed after 1990. Fifteen studies were selected and, of these, 11 measured VO2peak using tests performed in a laboratory, 2 used field tests, and the remaining 2 used both laboratory and field tests. The majority of the selected studies used maximal tests and conducted familiarization sessions. All the studies took into account the clinical conditions that could hamper testing or endanger the individuals. However, a large number of studies used tests which had not been specifically validated for the evaluated population. Finally, the search emphasized the small number of studies which use field tests to evaluate oxygen consumption. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Tweedt, Daniel L.
2014-01-01
Computational Aerodynamic simulations of a 1215 ft/sec tip speed transonic fan system were performed at five different operating points on the fan operating line, in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating points simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, which for this model did not include a split flow path with core and bypass ducts. As a result, it was only necessary to adjust fan rotational speed in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measurement values. Computed blade row flow fields at all fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the flow fields at all operating conditions reveals no excessive boundary layer separations or related secondary-flow problems.
Scalar flux modeling in turbulent flames using iterative deconvolution
NASA Astrophysics Data System (ADS)
Nikolaou, Z. M.; Cant, R. S.; Vervisch, L.
2018-04-01
In the context of large eddy simulations, deconvolution is an attractive alternative for modeling the unclosed terms appearing in the filtered governing equations. Such methods have been used in a number of studies for non-reacting and incompressible flows; however, their application in reacting flows is limited in comparison. Deconvolution methods originate from clearly defined operations, and in theory they can be used in order to model any unclosed term in the filtered equations including the scalar flux. In this study, an iterative deconvolution algorithm is used in order to provide a closure for the scalar flux term in a turbulent premixed flame by explicitly filtering the deconvoluted fields. The assessment of the method is conducted a priori using a three-dimensional direct numerical simulation database of a turbulent freely propagating premixed flame in a canonical configuration. In contrast to most classical a priori studies, the assessment is more stringent as it is performed on a much coarser mesh which is constructed using the filtered fields as obtained from the direct simulations. For the conditions tested in this study, deconvolution is found to provide good estimates both of the scalar flux and of its divergence.
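To make the procedure concrete, the following is a minimal one-dimensional sketch of Van Cittert-type iterative deconvolution followed by explicit filtering of the deconvolved fields, evaluated a priori against the exact subgrid scalar flux on synthetic data. The Gaussian filter width, iteration count, and synthetic velocity and scalar fields are illustrative assumptions, not the configuration used in the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def G(f, sigma=4.0):
    """LES-like filter: Gaussian filtering with periodic boundaries (illustrative choice)."""
    return gaussian_filter1d(f, sigma, mode="wrap")

def deconvolve(f_bar, n_iter=10, sigma=4.0):
    """Van Cittert-type iteration: phi_{k+1} = phi_k + (f_bar - G(phi_k))."""
    phi = f_bar.copy()
    for _ in range(n_iter):
        phi = phi + (f_bar - G(phi, sigma))
    return phi

# Synthetic stand-ins for DNS velocity u and scalar c on a periodic 1D grid
x = np.linspace(0, 2 * np.pi, 512, endpoint=False)
rng = np.random.default_rng(0)
u = np.sin(3 * x) + 0.3 * rng.standard_normal(x.size)
c = 0.5 * (1 + np.tanh(4 * np.cos(x)))

u_bar, c_bar = G(u), G(c)

# Deconvolve the filtered fields, then filter their product explicitly
u_star, c_star = deconvolve(u_bar), deconvolve(c_bar)
flux_model = G(u_star * c_star) - u_bar * c_bar   # modelled subgrid scalar flux
flux_exact = G(u * c) - u_bar * c_bar             # exact (a priori) subgrid scalar flux

corr = np.corrcoef(flux_model, flux_exact)[0, 1]
print(f"a priori correlation between modelled and exact flux: {corr:.3f}")
```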
Hinderer, Svenja; Brauchle, Eva; Schenke-Layland, Katja
2015-11-18
Current clinically applicable tissue and organ replacement therapies are limited in the field of cardiovascular regenerative medicine. The available options do not regenerate damaged tissues and organs, and, in the majority of the cases, show insufficient restoration of tissue function. To date, anticoagulant drug-free heart valve replacements or growing valves for pediatric patients, hemocompatible and thrombus-free vascular substitutes that are smaller than 6 mm, and stem cell-recruiting delivery systems that induce myocardial regeneration are still only visions of researchers and medical professionals worldwide and far from being the standard of clinical treatment. The design of functional off-the-shelf biomaterials as well as automatable and up-scalable biomaterial processing methods are the focus of current research endeavors and of great interest for fields of tissue engineering and regenerative medicine. Here, various approaches that aim to overcome the current limitations are reviewed, focusing on biomaterials design and generation methods for myocardium, heart valves, and blood vessels. Furthermore, novel contact- and marker-free biomaterial and extracellular matrix assessment methods are highlighted. © 2015 The Authors. Published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantitative Assessment of Fat Levels in Caenorhabditis elegans Using Dark Field Microscopy
Fouad, Anthony D.; Pu, Shelley H.; Teng, Shelly; Mark, Julian R.; Fu, Moyu; Zhang, Kevin; Huang, Jonathan; Raizen, David M.; Fang-Yen, Christopher
2017-01-01
The roundworm Caenorhabditis elegans is widely used as a model for studying conserved pathways for fat storage, aging, and metabolism. The most broadly used methods for imaging fat in C. elegans require fixing and staining the animal. Here, we show that dark field images acquired through an ordinary light microscope can be used to estimate fat levels in worms. We define a metric based on the amount of light scattered per area, and show that this light scattering metric is strongly correlated with worm fat levels as measured by Oil Red O (ORO) staining across a wide variety of genetic backgrounds and feeding conditions. Dark field imaging requires no exogenous agents or chemical fixation, making it compatible with live worm imaging. Using our method, we track fat storage with high temporal resolution in developing larvae, and show that fat storage in the intestine increases in at least one burst during development. PMID:28404661
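A hedged sketch of the kind of light-scattering metric described above: background-subtracted dark-field intensity summed over a worm mask and normalised by the mask area. The rectangular mask and fixed background level below are placeholders for a real segmentation and calibration step.

```python
import numpy as np

def scattering_metric(dark_field_img, background_level, worm_mask):
    """
    Light scattered per unit area: background-subtracted dark-field intensity
    summed over the worm body, normalised by the body area (pixel count).
    `worm_mask` is a boolean array marking worm pixels; how it is obtained
    (thresholding, outlining) is left open here.
    """
    signal = np.clip(dark_field_img.astype(float) - background_level, 0, None)
    area = worm_mask.sum()
    return signal[worm_mask].sum() / area if area else np.nan

# Toy example with a synthetic image and a rectangular "worm" mask
rng = np.random.default_rng(1)
img = rng.integers(0, 40, size=(200, 300))   # dim background speckle
img[80:120, 50:250] += 120                   # brighter scattering body
mask = np.zeros(img.shape, dtype=bool)
mask[80:120, 50:250] = True

print(f"scattered light per area: {scattering_metric(img, background_level=20, worm_mask=mask):.1f}")
```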
NASA Technical Reports Server (NTRS)
Huang, Dong; Yang, Wenze; Tan, Bin; Rautiainen, Miina; Zhang, Ping; Hu, Jiannan; Shabanov, Nikolay V.; Linder, Sune; Knyazikhin, Yuri; Myneni, Ranga B.
2006-01-01
The validation of moderate-resolution satellite leaf area index (LAI) products such as those operationally generated from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor data requires reference LAI maps developed from field LAI measurements and fine-resolution satellite data. Errors in field measurements and satellite data determine the accuracy of the reference LAI maps. This paper describes a method by which reference maps of known accuracy can be generated with knowledge of errors in fine-resolution satellite data. The method is demonstrated with data from an international field campaign in a boreal coniferous forest in northern Sweden, and Enhanced Thematic Mapper Plus images. The reference LAI map thus generated is used to assess modifications to the MODIS LAI/fPAR algorithm recently implemented to derive the next generation of the MODIS LAI/fPAR product for this important biome type.
Ion and impurity transport in turbulent, anisotropic magnetic fields
NASA Astrophysics Data System (ADS)
Negrea, M.; Petrisor, I.; Isliker, H.; Vogiannou, A.; Vlahos, L.; Weyssow, B.
2011-08-01
We investigate ion and impurity transport in turbulent, possibly anisotropic, magnetic fields. The turbulent magnetic field is modeled as a correlated stochastic field, with Gaussian distribution function and prescribed spatial auto-correlation function, superimposed onto a strong background field. The (running) diffusion coefficients of ions are determined in the three-dimensional environment, using two alternative methods, the semi-analytical decorrelation trajectory (DCT) method, and test-particle simulations. In a first step, the results of the test-particle simulations are compared with and used to validate the results obtained from the DCT method. For this purpose, a drift approximation was made in slab geometry, and relatively good qualitative agreement between the DCT method and the test-particle simulations was found. In a second step, the ion species He, Be, Ne and W, all assumed to be fully ionized, are considered under ITER-like conditions, and the scaling of their diffusivities is determined with respect to varying levels of turbulence (varying Kubo number), varying degrees of anisotropy of the turbulent structures and atomic number. In a third step, the test-particle simulations are repeated without drift approximation, directly using the Lorentz force, first in slab geometry, in order to assess the finite Larmor radius effects, and second in toroidal geometry, to account for the geometric effects. It is found that both effects are important, most prominently the effects due to toroidal geometry and the diffusivities are overestimated in slab geometry by an order of magnitude.
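The correlated stochastic field described above can be illustrated with a simple spectral-filtering construction: white noise shaped in Fourier space to impose a Gaussian autocorrelation with different correlation lengths in the two directions, superimposed on a uniform background field. The grid size, turbulence level, and correlation lengths below are illustrative, not the ITER-like parameters of the study.

```python
import numpy as np

def gaussian_random_field(n, lx, ly, seed=0):
    """
    Zero-mean Gaussian random field with an anisotropic Gaussian autocorrelation,
    built by filtering white noise in Fourier space (correlation lengths lx, ly
    in grid units are illustrative choices).
    """
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None] * 2 * np.pi
    ky = np.fft.fftfreq(n)[None, :] * 2 * np.pi
    envelope = np.exp(-0.25 * ((kx * lx) ** 2 + (ky * ly) ** 2))  # Gaussian spectrum
    noise = rng.standard_normal((n, n))
    field = np.fft.ifft2(np.fft.fft2(noise) * envelope).real
    return field / field.std()

n = 256
b0 = 1.0      # strong, uniform background field (arbitrary units)
delta = 0.1   # relative turbulence level
bx = delta * b0 * gaussian_random_field(n, lx=10.0, ly=30.0, seed=1)  # anisotropic structures
by = delta * b0 * gaussian_random_field(n, lx=10.0, ly=30.0, seed=2)
bz = b0 + delta * b0 * gaussian_random_field(n, lx=10.0, ly=30.0, seed=3)

print("rms perpendicular perturbation / background:", np.hypot(bx, by).std() / b0)
```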
A Review of Numerical Simulation and Analytical Modeling for Medical Devices Safety in MRI
Kabil, J.; Belguerras, L.; Trattnig, S.; Pasquier, C.; Missoffe, A.
2016-01-01
Objectives: To review past and present challenges and ongoing trends in numerical simulation for MRI (Magnetic Resonance Imaging) safety evaluation of medical devices. Methods: A wide literature review of numerical and analytical simulation of simple and complex medical devices in MRI electromagnetic fields shows the evolution of the field over time and a growing concern for MRI safety over the years. Major issues and achievements are described, as well as current trends and perspectives in this research field. Results: Numerical simulation of medical devices is constantly evolving, supported by now well-established calculation methods. Implants with simple geometry can often be simulated in a computational human model, but one remaining issue is the experimental validation of these human models. A major concern is assessing RF heating on implants too complex to be simulated in the traditional way, such as pacemaker leads. Thus, ongoing research focuses on alternative hybrid methods, both numerical and experimental, for example the transfer function method. For the static and gradient fields, analytical models can be used for dimensioning simple implant shapes, but they are limited for complex geometries that cannot be studied with simplifying assumptions. Conclusions: Numerical simulation is an essential tool for MRI safety testing of medical devices. The main issues remain the accuracy of simulations compared with real life and the study of complex devices; but as the research field is constantly evolving, some promising ideas are now under investigation to take up these challenges. PMID:27830244
Pecly, José Otavio Goulart
2018-01-01
The alternative use of effluent turbidity to determine the dilution field of a domestic marine outfall located off the city of Rio de Janeiro was evaluated through field work comprising fluorescent dye tracer injection and tracking with simultaneous monitoring of sea water turbidity. A preliminary laboratory assessment was carried out with a sample of the outfall effluent whose turbidity was measured by the nephelometric method before and during a serial dilution process. During the field campaign, the dye tracer was monitored with field fluorometers and the turbidity was observed with an optical backscattering sensor interfaced to an OEM data acquisition system. About 4,000 samples were gathered, covering an area of 3 km × 3 km near the outfall diffusers. At the far field - where a drift towards the coastline was observed - the effluent plume was adequately labeled by the dye tracer. The turbidity plume was biased due to the high and variable background turbidity of sea water. After processing the turbidity dataset with a baseline detrending method, the plume presented high correlation with the dye tracer plume drawn on the near dilution field. However, dye tracer remains more robust than effluent turbidity.
Patlovich, Scott J; Emery, Robert J; Whitehead, Lawrence W; Brown, Eric L; Flores, Rene
2015-03-01
Because the origins of the biological safety profession are rooted in the control and prevention of laboratory-associated infections, the vocation focuses primarily on the safe handling of specimens within the laboratory. But in many cases, the specimens and samples handled in the lab are originally collected in the field where a broader set of possible exposure considerations may be present, each with varying degrees of controllability. The failure to adequately control the risks associated with collecting biological specimens in the field may result in illness or injury, and could have a direct impact on laboratory safety, if infectious specimens were packaged or transported inappropriately, for example. This study developed a web-based survey distributed to practicing biological safety professionals to determine the prevalence of and extent to which biological safety programs consider and evaluate field collection activities. In cases where such issues were considered, the data collected characterize the types of controls and methods of oversight at the institutional level that are employed. Sixty-one percent (61%) of the survey respondents indicated that research involving the field collection of biological specimens is conducted at their institutions. A majority (79%) of these field collection activities occur at academic institutions. Twenty-seven percent (27%) of respondents indicated that their safety committees do not consider issues related to biological specimens collected in the field, and only 25% with an oversight committee charged to review field collection protocols have generated a field research-specific risk assessment form to facilitate the assembly of pertinent information for a project risk assessment review. The results also indicated that most biosafety professionals (73% overall; 71% from institutions conducting field collection activities) have not been formally trained on the topic, but many (64% overall; 87% from institutions conducting field collection activities) indicated that training on field research safety issues would be helpful, and even more (71% overall; 93% from institutions conducting field collection activities) would consider participation in such a training course. Results obtained from this study can be used to develop a field research safety toolkit and associated training curricula specifically targeted to biological safety professionals.
Hein, Andreas; Steen, Enno-Edzard; Thiel, Andreas; Hülsken-Giesler, Manfred; Wist, Thorben; Helmer, Axel; Frenken, Thomas; Isken, Melvin; Schulze, Gisela C; Remmers, Hartmut
2014-01-01
This article describes the results of field studies performed over a period between five months and 24 months. The objectives of these studies were to collect long-term real-life data to evaluate how these data can be mapped to items on standardized assessment tests and which presentation method is most suitable to inform caregivers about critical situations and changes in health or care needs. A Home-monitoring system which uses modern sensor technologies was developed for and used in these field studies. It was installed in living environments of seven people (three who were not in need of care, two in need of care, and two with mental disabilities). The data were generated by sensor data acquisition and questionnaire reporting. Four types of data analysis and representation were evaluated to support caregivers. Results show that sensor data can be used to determine information directly or indirectly, which can be mapped to relevant assessment items and presented with different degrees of granularity. It is also feasible to determine and present additional information of potential interest which cannot be directly mapped to any assessment item. Sensor data can also be displayed in a live view. This live data representation led to a decrease in the caregivers' workload when assessed according to the German version of the Perceived Stress Questionnaire.
Novel dietary intake assessment in populations with poor literacy.
Subasinghe, Asvini K; Thrift, Amanda G; Evans, Roger G; Arabshahi, Simin; Suresh, Oduru; Kartik, Kamakshi; Kalyanram, Kartik; Walker, Karen Z
2016-01-01
Cultural and/or environmental barriers make the assessment of dietary intake in rural populations challenging. We aimed to assess the accuracy of a meal recall questionnaire, adapted for use with impoverished South Indian populations living in rural areas. Dietary data collected by recall versus weighed meals were compared. Data were obtained from 45 adults aged 19-85 years, living in rural Andhra Pradesh, who were recruited by convenience sampling. Weighed meal records (WMRs) were conducted in the household by a researcher aided by a trained field worker. The following day, field workers conducted a recall interview with the same participant. Eight life-size photographs of portions of South Indian foods were created to aid each participant's recall, and a database of nutrients was developed to calculate nutrient intake. Pearson correlations were used to assess the strength of associations between intake of energy and nutrients calculated from meal recalls versus WMRs. Least products regression was conducted to examine fixed and proportional bias. Bland-Altman plots were constructed to measure systematic or differential bias. Significant correlations were observed between estimates for energy and nutrients obtained by the two methods (r2=0.19-0.67, p<0.001). No systematic bias was detected by Bland-Altman plots. The recall method underestimated the intake of protein and fat in a manner proportional to the level of intake. Our culturally adapted meal recall questionnaire provides an accurate measure for assessment of the intake of energy, macronutrients and some micronutrients in rural Indian populations.
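A minimal sketch of the Bland-Altman comparison used in such an analysis, applied to synthetic recall and weighed-record energy intakes; the proportional under-reporting built into the synthetic data only mimics the pattern reported for protein and fat, and all numbers are invented.

```python
import numpy as np

def bland_altman(recall, weighed):
    """Mean bias and 95% limits of agreement between two intake estimates."""
    diff = np.asarray(recall, float) - np.asarray(weighed, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Synthetic energy intakes (kcal) for 45 participants, for illustration only
rng = np.random.default_rng(42)
weighed = rng.normal(2000, 400, size=45)
recall = 0.9 * weighed + rng.normal(0, 200, size=45)   # proportional under-reporting

r = np.corrcoef(recall, weighed)[0, 1]
bias, lo, hi = bland_altman(recall, weighed)
print(f"Pearson r = {r:.2f}; bias = {bias:.0f} kcal; 95% LoA = [{lo:.0f}, {hi:.0f}] kcal")
```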
The Technology Crisis in Neuropsychology.
Miller, Justin B; Barr, William B
2017-08-01
Neuropsychology has become reliant on outdated and labor-intensive methods of data collection that are slow, highly inefficient, and expensive, and provide relatively data-poor estimates of human behavior despite rapid technological advances in most other fields of medicine. Here we present a brief historical overview of current testing practices in an effort to frame the current crisis, followed by an overview of different settings in which technology can and should be integrated. Potential benefits of laboratory-based assessments, remote assessments, and passive, high-frequency data collection tools rooted in technology are discussed, along with several relevant examples and how these technologies might be deployed. Broader issues of data security and privacy are discussed, as well as additional considerations to be addressed within each setting. Some of the historical barriers to adoption of technology are also presented, along with a brief discussion of the remaining uncertainties. While by no means intended as a comprehensive review or prescriptive roadmap, our goal is to show that there are a tremendous number of advantages to technologically driven data collection methods, and that technology should be embraced by the field. Our predictions are that the comprehensive assessments of the future will likely entail a combination of lab-based assessments, remote assessments, and passive data capture, and leading the development of these efforts will cement the role of neuropsychology at the forefront of cognitive and behavioral science. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Gender Development Research in Sex Roles: Historical Trends and Future Directions.
Zosuls, Kristina M; Miller, Cindy Faith; Ruble, Diane N; Martin, Carol Lynn; Fabes, Richard A
2011-06-01
The late 1960s through the 1970s marked an important turning point in the field of gender research, including theory and research in gender development. The establishment of Sex Roles in 1975 as a forum for this research represented an important milestone in the field. In this article, we celebrate the 35th anniversary of Sex Roles and, in particular, its contributions to the field of research on children's and adolescents' gender development. We examine the trends in research on gender development published in Sex Roles since its inception and use this analysis as a vehicle for exploring how the field has grown and evolved over the past few decades. We begin with a brief review of the history of this field of research since 1975. Then, we present a descriptive assessment of articles published on gender development in Sex Roles over time, and link this assessment to general trends that have occurred in the study of gender development over the past 35 years. We conclude with a discussion of future directions for the field of gender development. In particular, we highlight areas in which the journal could play a role in promoting more diversity in topics, methods, and ages employed in gender development research.
Electromagnetic Field Assessment as a Smart City Service: The SmartSantander Use-Case
Diez, Luis; Agüero, Ramón; Muñoz, Luis
2017-01-01
Despite the increasing presence of wireless communications in everyday life, there exist some voices raising concerns about their adverse effects. One particularly relevant example is the potential impact of the electromagnetic field they induce on the population’s health. Traditionally, very specialized methods and devices (dosimetry) have been used to assess the strength of the E-field, with the main objective of checking whether it respects the corresponding regulations. In this paper, we propose a complete novel approach, which exploits the functionality leveraged by a smart city platform. We deploy a number of measuring probes, integrated as sensing devices, to carry out a characterization embracing large areas, as well as long periods of time. This unique platform has been active for more than one year, generating a vast amount of information. We process such information, and the obtained results validate the whole methodology. In addition, we discuss the variation of the E-field caused by cellular networks, considering additional information, such as usage statistics. Finally, we establish the exposure that can be attributed to the base stations within the scenario under analysis. PMID:28561783
Knowledge transfer on complex social interventions in public health: a scoping study.
Dagenais, Christian; Malo, Marie; Robert, Émilie; Ouimet, Mathieu; Berthelette, Diane; Ridde, Valéry
2013-01-01
Scientific knowledge can help develop interventions that improve public health. The objectives of this review are (1) to describe the status of research on knowledge transfer strategies in the field of complex social interventions in public health and (2) to identify priorities for future research in this field. A scoping study is an exploratory study. After searching databases of bibliographic references and specialized periodicals, we summarized the relevant studies using a predetermined assessment framework. In-depth analysis focused on the following items: types of knowledge transfer strategies, fields of public health, types of publics, types of utilization, and types of research specifications. From the 1,374 references identified, we selected 26 studies. The strategies targeted mostly administrators of organizations and practitioners. The articles generally dealt with instrumental utilization and most often used qualitative methods. In general, the bias risk for the studies is high. Researchers need to consider the methodological challenges in this field of research in order to improve assessment of more complex knowledge transfer strategies (when they exist), not just diffusion/dissemination strategies and conceptual and persuasive utilization.
Assessment of anxiety in open field and elevated plus maze using infrared thermography.
Lecorps, Benjamin; Rödel, Heiko G; Féron, Christophe
2016-04-01
Due to their direct inaccessibility, affective states are classically assessed by gathering concomitant physiological and behavioral measures. Although such a dual approach to assess emotional states is frequently used in different species including humans, the invasiveness of procedures for physiological recordings particularly in smaller-sized animals strongly restricts their application. We used infrared thermography, a non-invasive method, to assess physiological arousal during open field and elevated plus maze tests in mice. By measuring changes in surface temperature indicative of the animals' emotional response, we aimed to improve the inherently limited and still controversial information provided by behavioral parameters commonly used in these tests. Our results showed significant and consistent thermal responses during both tests, in accordance with classical physiological responses occurring in stressful situations. Besides, we found correlations between these thermal responses and the occurrence of anxiety-related behaviors. Furthermore, initial temperatures measured at the start of each procedure (open field, elevated plus maze), which can be interpreted as a measure of the animals' initial physiological arousal, predicted the levels of activity and of anxiety-related behaviors displayed during the tests. Our results stress the strong link between physiological correlates of emotions and behaviors expressed during unconditioned fear tests. Copyright © 2016 Elsevier Inc. All rights reserved.
Toward a More Systematic Assessment of Smoking: Development of a Smoking Module for PROMIS®
Tucker, Joan S.; Shadel, William G.; Stucky, Brian D.; Cai, Li
2012-01-01
Introduction: The aim of the PROMIS® Smoking Initiative is to develop, evaluate, and standardize item banks to assess cigarette smoking behavior and biopsychosocial constructs associated with smoking for both daily and non-daily smokers. Methods: We used qualitative methods to develop the item pool (following the PROMIS® approach: e.g., literature search, “binning and winnowing” of items, and focus groups and cognitive interviews to finalize wording and format), and quantitative methods (e.g., factor analysis) to develop the item banks. Results: We considered a total of 1622 extant items, and 44 new items for inclusion in the smoking item banks. A final set of 277 items representing 11 conceptual domains was selected for field testing in a national sample of smokers. Using data from 3021 daily smokers in the field test, an iterative series of exploratory factor analyses and project team discussions resulted in six item banks: Positive Consequences of Smoking (40 items), Smoking Dependence/Craving (55 items), Health Consequences of Smoking (26 items), Psychosocial Consequences of Smoking (37 items), Coping Aspects of Smoking (30 items), and Social Factors of Smoking (23 items). Conclusions: Inclusion of a smoking domain in the PROMIS® framework will standardize measurement of key smoking constructs using state-of-the-art psychometric methods, and make them widely accessible to health care providers, smoking researchers and the large community of researchers using PROMIS® who might not otherwise include an assessment of smoking in their design. Next steps include reducing the number of items in each domain, conducting confirmatory analyses, and duplicating the process for non-daily smokers. PMID:22770824
The Contributions and Prospects of Goal Orientation Theory
ERIC Educational Resources Information Center
Kaplan, Avi; Maehr, Martin L.
2007-01-01
In the last two decades, goal orientation theory has become an important perspective in the field of achievement motivation, and particularly in academic motivation. However, as research in the theory has proliferated, the use of multiple methods to assess goal orientations seems to have contributed to theoretical vagueness, especially with regard…
A Game of Thrones: Organising and Legitimising Knowledge through PISA Research
ERIC Educational Resources Information Center
Mølstad, Christina E.; Pettersson, Daniel; Forsberg, Eva
2017-01-01
This study investigates knowledge structures and scientific communication using bibliometric methods to explore scientific knowledge production and dissemination. The aim is to develop knowledge about this growing field by investigating studies using international large-scale assessment (ILSA) data, with a specific focus on those using Programme…
USDA-ARS?s Scientific Manuscript database
Background/Question/Methods: Soil and site stability are key attributes of assessing the health of dryland landscapes because these lands are susceptible to high rates of wind- and water-caused erosion. Field techniques for measuring and monitoring soil erosion in drylands are often labor intensive...
Decision making in prioritization of required operational capabilities
NASA Astrophysics Data System (ADS)
Andreeva, P.; Karev, M.; Kovacheva, Ts.
2015-10-01
The paper describes an expert heuristic approach to prioritization of required operational capabilities in the field of defense. Based on expert assessment and by application of the method of Analytical Hierarchical Process, a methodology for their prioritization has been developed. It has been applied to practical simulation decision making games.
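As an illustration of the Analytical Hierarchical Process step, the sketch below derives priority weights from a reciprocal pairwise-comparison matrix via the principal eigenvector and checks consistency; the comparison values and the tabulated random indices are illustrative rather than taken from the methodology described above.

```python
import numpy as np

def ahp_weights(pairwise):
    """
    Priority weights from a reciprocal pairwise-comparison matrix using the
    principal eigenvector (Saaty's method); also returns the consistency ratio.
    """
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)   # commonly tabulated random indices
    return w, ci / ri

# Illustrative comparison of three capabilities (values are hypothetical judgements)
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
w, cr = ahp_weights(pairwise)
print("priorities:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```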
Near-infrared hyperspectral imaging for detecting Aflatoxin B1 of maize kernels
USDA-ARS?s Scientific Manuscript database
The feasibility of detecting the Aflatoxin B1 in maize kernels inoculated with Aspergillus flavus conidia in the field was assessed using near-infrared hyperspectral imaging technique. After pixel-level calibration, wavelength dependent offset, the masking method was adopted to reduce the noise and ...
Universal Design: Implications for Computing Education
ERIC Educational Resources Information Center
Burgstahler, Sheryl
2011-01-01
Universal design (UD), a concept that grew from the field of architecture, has recently emerged as a paradigm for designing instructional methods, curriculum, and assessments that are welcoming and accessible to students with a wide range of characteristics, including those related to race, ethnicity, native language, gender, age, and disability.…
Field testing a mobile inelastic neutron scattering system to measure soil carbon
USDA-ARS?s Scientific Manuscript database
Cropping history in conjunction with soil management practices can have a major impact on the amount of organic carbon (C) stored in soil. Current methods of assessing soil C based on soil coring and subsequent processing procedures prior to laboratory analysis are labor intensive and time consuming...
THE DISTRIBUTION AND SOLID-PHASE SPECIATION OF AS IN IRON-BASED TREATMENT MEDIA
Arsenic concentrations (Total Recoverable As by EPA Method 3051) and solid-phase speciation (by X-ray Absorption Near-Edge Spectroscopy-XANES) were assessed as a function of depth through Fe-media beds for two commercially available products from pilot-scale field tests. These r...
ERIC Educational Resources Information Center
Deveaugh-Geiss, Joseph; March, John; Shapiro, Mark; Andreason, Paul J.; Emslie, Graham; Ford, Lisa M.; Greenhill, Laurence; Murphy, Dianne; Prentice, Ernest; Roberts, Rosemary; Silva, Susan; Swanson, James M.; van Zwieten-Boot, Barbara; Vitiello, Benedetto; Wagner, Karen Dineen; Mangum, Barry
2006-01-01
Objective: To give academic researchers, government officials, and industry scientists an opportunity to assess the state of pediatric psychopharmacology and identify challenges facing professionals in the field. Method: Increased federal spending and the introduction of pediatric exclusivity led to large increases in pediatric psychopharmacology…
One proposed method for reducing exposure to mobile-source air pollution is the construction or preservation of vegetation barriers between major roads and nearby populations. This study combined stationary and mobile monitoring approaches to determine the effects of an existing,...
Marx, Sabrina; Hämmerle, Martin; Klonner, Carolin; Höfle, Bernhard
2016-01-01
The integration of local agricultural knowledge deepens the understanding of complex phenomena such as the association between climate variability, crop yields and undernutrition. Participatory Sensing (PS) is a concept that enables laypeople to easily gather geodata with standard low-cost mobile devices, offering new and efficient opportunities for agricultural monitoring. This study presents a methodological approach for crop height assessment based on PS. In-field crop height variations of a maize field in Heidelberg, Germany, are gathered with smartphones and handheld GPS devices by 19 participants. The comparison of crop height values measured by the participants to reference data based on terrestrial laser scanning (TLS) results in R2 = 0.63 for the handheld GPS devices and R2 = 0.24 for the smartphone-based approach. RMSE for the comparison between crop height models (CHM) derived from PS and TLS data is 10.45 cm (GPS devices) and 14.69 cm (smartphones). Furthermore, the results indicate that incorporating participants' cognitive abilities in the data collection process potentially improves the quality of the data captured with the PS approach. The proposed PS methods provide a foundation for collecting agricultural parameters at the field level by involving local people. Combined with other methods such as remote sensing, PS opens new perspectives to support agricultural development.
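A minimal sketch of the agreement statistics reported above (R2 against the TLS reference and RMSE between crop height values), run on synthetic maize heights; the scatter levels are invented and only mimic the qualitative GPS-versus-smartphone difference.

```python
import numpy as np

def agreement_stats(measured, reference):
    """Squared Pearson correlation (R2) and RMSE of measured vs. reference heights (cm)."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    r2 = np.corrcoef(measured, reference)[0, 1] ** 2
    rmse = np.sqrt(np.mean((measured - reference) ** 2))
    return r2, rmse

# Synthetic maize crop heights (cm) standing in for participant (PS) and TLS values
rng = np.random.default_rng(7)
tls = rng.uniform(150, 250, size=100)                 # reference heights
ps_gps = tls + rng.normal(0, 11, size=100)            # GPS-device-like scatter
ps_phone = tls + rng.normal(0, 16, size=100)          # larger smartphone scatter

for label, ps in [("handheld GPS", ps_gps), ("smartphone", ps_phone)]:
    r2, rmse = agreement_stats(ps, tls)
    print(f"{label}: R2 = {r2:.2f}, RMSE = {rmse:.1f} cm")
```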
Cervantes, Jose A; Costello, Collin M; Maarouf, Melody; McCrary, Hilary C; Zeitouni, Nathalie C
2017-09-01
A realistic model for the instruction of basic dermatologic procedural skills was developed, while simultaneously increasing medical student exposure to the field of dermatology. The primary purpose of the authors' study was to evaluate the utilization of a fresh-tissue cadaver model (FTCM) as a method for the instruction of common dermatologic procedures. The authors' secondary aim was to assess students' perceived clinical skills and overall perception of the field of dermatology after the lab. Nineteen first- and second-year medical students were pre- and post-tested on their ability to perform punch and excisional biopsies on a fresh-tissue cadaver. Students were then surveyed on their experience. Assessment of the cognitive knowledge gain and technical skills revealed a statistically significant improvement in all categories (p < .001). An analysis of the survey demonstrated that 78.9% were more interested in selecting dermatology as a career and 63.2% of participants were more likely to refer their future patients to a Mohs surgeon. An FTCM is a viable method for the instruction and training of dermatologic procedures. In addition, the authors conclude that an FTCM provides realistic instruction for common dermatologic procedures and enhances medical students' early exposure and interest in the field of dermatology.
Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources
Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.
2009-01-01
The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
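The core PTHA bookkeeping can be sketched as follows: given Poissonian annual rates for a few sources and the peak amplitude each produces at a site, build the annual exceedance probability and read off the 100-year (1%) and 500-year (0.2%) amplitudes. The source rates and amplitudes below are hypothetical and only qualitatively echo the far-field/local split described above.

```python
import numpy as np

# Hypothetical sources: annual occurrence rate and modelled peak amplitude (m) at one site
sources = [
    {"name": "far-field A",      "rate": 1 / 300, "amp": 3.5},
    {"name": "far-field B",      "rate": 1 / 150, "amp": 2.0},
    {"name": "local subduction", "rate": 1 / 400, "amp": 10.5},
]

def exceedance_prob(amplitude, sources):
    """Annual probability that `amplitude` is exceeded, assuming Poisson occurrence."""
    rate = sum(s["rate"] for s in sources if s["amp"] > amplitude)
    return 1.0 - np.exp(-rate)

def design_amplitude(p_target, sources, n_grid=400):
    """Largest amplitude whose annual exceedance probability still meets the target."""
    grid = np.linspace(0.0, max(s["amp"] for s in sources), n_grid)
    meeting = [a for a in grid if exceedance_prob(a, sources) >= p_target]
    return max(meeting) if meeting else 0.0

for p, label in [(0.01, "100-year (1% annual)"), (0.002, "500-year (0.2% annual)")]:
    print(f"{label}: ~{design_amplitude(p, sources):.1f} m")
```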
Brown, Richard J C; Beccaceci, Sonya; Butterfield, David M; Quincey, Paul G; Harris, Peter M; Maggos, Thomas; Panteliadis, Pavlos; John, Astrid; Jedynska, Aleksandra; Kuhlbusch, Thomas A J; Putaud, Jean-Philippe; Karanasiou, Angeliki
2017-10-18
The European Committee for Standardisation (CEN) Technical Committee 264 'Air Quality' has recently produced a standard method for the measurements of organic carbon and elemental carbon in PM 2.5 within its working group 35 in response to the requirements of European Directive 2008/50/EC. It is expected that this method will be used in future by all Member States making measurements of the carbonaceous content of PM 2.5 . This paper details the results of a laboratory and field measurement campaign and the statistical analysis performed to validate the standard method, assess its uncertainty and define its working range to provide clarity and confidence in the underpinning science for future users of the method. The statistical analysis showed that the expanded combined uncertainty for transmittance protocol measurements of OC, EC and TC is expected to be below 25%, at the 95% level of confidence, above filter loadings of 2 μg cm -2 . An estimation of the detection limit of the method for total carbon was 2 μg cm -2 . As a result of the laboratory and field measurement campaign the EUSAAR2 transmittance measurement protocol was chosen as the basis of the standard method EN 16909:2017.
Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M
2014-01-01
Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.
NASA Astrophysics Data System (ADS)
Ranzato, Laura; Barausse, Alberto; Mantovani, Alice; Pittarello, Alberto; Benzo, Maurizio; Palmeri, Luca
2012-12-01
Unpleasant odors are a major cause of public complaints concerning air quality and represent a growing social problem in industrialized countries. However, the assessment of odor pollution is still regarded as a difficult task, because olfactory nuisance can be caused by many different chemical compounds, often found in hard-to-detect concentrations, and the perception of odors is influenced by subjective thresholds; moreover, the impact of odor sources on air quality is mediated by complex atmospheric dispersion processes. The development of standardized assessment approaches to odor pollution and proper international regulatory tools are urgently needed. In particular, comparisons of the methodologies commonly used nowadays to assess odor impacts on air quality are required. Here, we assess the olfactory nuisance caused by an anaerobic treatment plant for municipal solid waste by means of two alternative techniques: the field inspection procedure and the atmospheric dispersion model CALPUFF. Our goal was to compare rigorously their estimates of odor nuisance, both qualitatively (spatial extent of odor impact) and quantitatively (intensity of odor nuisance). To define the impact of odors, we referred to the German standards, based on the frequency of odor episodes in terms of odor hours. We report a satisfying, although not perfect agreement between the estimates provided by the two techniques. For example, they assessed similar spatial extents of odor pollution, but different frequencies of odor episodes in locations where the odor nuisance was highest. The comparison highlights strengths and weaknesses for both approaches. CALPUFF is a cheaper methodology which can be used predictively, but fugitive emissions are difficult to model reliably, because of uncertainty regarding timing, location and emission rate. Field inspection takes into account the role of human perception, but unlike the model it does not always characterize precisely the extent of the odor nuisance caused by a single source when other odors are present, because only the most unpleasant odor is reported. We conclude that these two assessment methods provide reasonable estimates of odor nuisance.
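A minimal sketch of the odour-hour bookkeeping used when post-processing dispersion-model output: count the fraction of hours in which the modelled concentration exceeds an assessment threshold. The threshold and the synthetic concentration series are illustrative placeholders; the applicable German guideline defines the exact criterion.

```python
import numpy as np

def odour_hour_frequency(hourly_conc, threshold=0.25):
    """
    Fraction of hours counted as 'odour hours': an hour is counted when the
    modelled concentration exceeds the assessment threshold (the value used
    here is illustrative only).
    """
    hourly_conc = np.asarray(hourly_conc, dtype=float)
    return (hourly_conc > threshold).mean()

# One year of synthetic hourly odour concentrations (ou/m3) at a receptor
rng = np.random.default_rng(3)
conc = rng.lognormal(mean=-3.0, sigma=1.2, size=8760)
print(f"odour-hour frequency: {100 * odour_hour_frequency(conc):.1f}% of hours")
```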
Organizational readiness for implementing change: a psychometric assessment of a new measure.
Shea, Christopher M; Jacobs, Sara R; Esserman, Denise A; Bruce, Kerry; Weiner, Bryan J
2014-01-10
Organizational readiness for change in healthcare settings is an important factor in successful implementation of new policies, programs, and practices. However, research on the topic is hindered by the absence of a brief, reliable, and valid measure. Until such a measure is developed, we cannot advance scientific knowledge about readiness or provide evidence-based guidance to organizational leaders about how to increase readiness. This article presents results of a psychometric assessment of a new measure called Organizational Readiness for Implementing Change (ORIC), which we developed based on Weiner's theory of organizational readiness for change. We conducted four studies to assess the psychometric properties of ORIC. In study one, we assessed the content adequacy of the new measure using quantitative methods. In study two, we examined the measure's factor structure and reliability in a laboratory simulation. In study three, we assessed the reliability and validity of an organization-level measure of readiness based on aggregated individual-level data from study two. In study four, we conducted a small field study utilizing the same analytic methods as in study three. Content adequacy assessment indicated that the items developed to measure change commitment and change efficacy reflected the theoretical content of these two facets of organizational readiness and distinguished the facets from hypothesized determinants of readiness. Exploratory and confirmatory factor analysis in the lab and field studies revealed two correlated factors, as expected, with good model fit and high item loadings. Reliability analysis in the lab and field studies showed high inter-item consistency for the resulting individual-level scales for change commitment and change efficacy. Inter-rater reliability and inter-rater agreement statistics supported the aggregation of individual level readiness perceptions to the organizational level of analysis. This article provides evidence in support of the ORIC measure. We believe this measure will enable testing of theories about determinants and consequences of organizational readiness and, ultimately, assist healthcare leaders to reduce the number of health organization change efforts that do not achieve desired benefits. Although ORIC shows promise, further assessment is needed to test for convergent, discriminant, and predictive validity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, B.G.J.; Grindrod, P.
Her Majesty's Inspectorate of Pollution (HMIP) of the United Kingdom has developed a procedure for the post-closure assessment of the underground disposal of radioactive waste. In this paper the method of using theory and ideas from the mathematical sciences for assessment is described. The system simulation methodology seeks to discover key combinations of processes or effects which may yield behaviour of interest by sampling across functional and parametric uncertainties, and treating the systems within a probabilistic framework. This paper also discusses how the HMIP assessment methodology has been presented, independent of any current application, for review by leading scientists who are independent of the performance assessment field.
ERIC Educational Resources Information Center
Huang, Jie; Francis, Andrea P.; Carr, Thomas H.
2008-01-01
A quantitative method is introduced for detecting and correcting artifactual signal changes in BOLD time series data arising from the magnetic field warping caused by motion of the articulatory apparatus when speaking aloud, with extensions to detection of subvocal articulatory activity during silent reading. Whole-head images allow the large,…
Bardhan, Jaydeep P
2008-10-14
The importance of molecular electrostatic interactions in aqueous solution has motivated extensive research into physical models and numerical methods for their estimation. The computational costs associated with simulations that include many explicit water molecules have driven the development of implicit-solvent models, with generalized-Born (GB) models among the most popular of these. In this paper, we analyze a boundary-integral equation interpretation for the Coulomb-field approximation (CFA), which plays a central role in most GB models. This interpretation offers new insights into the nature of the CFA, which traditionally has been assessed using only a single point charge in the solute. The boundary-integral interpretation of the CFA allows the use of multiple point charges, or even continuous charge distributions, leading naturally to methods that eliminate the interpolation inaccuracies associated with the Still equation. This approach, which we call boundary-integral-based electrostatic estimation by the CFA (BIBEE/CFA), is most accurate when the molecular charge distribution generates a smooth normal displacement field at the solute-solvent boundary, and CFA-based GB methods perform similarly. Conversely, both methods are least accurate for charge distributions that give rise to rapidly varying or highly localized normal displacement fields. Supporting this analysis are comparisons of the reaction-potential matrices calculated using GB methods and boundary-element-method (BEM) simulations. An approximation similar to BIBEE/CFA exhibits complementary behavior, with superior accuracy for charge distributions that generate rapidly varying normal fields and poorer accuracy for distributions that produce smooth fields. This approximation, BIBEE by preconditioning (BIBEE/P), essentially generates initial guesses for preconditioned Krylov-subspace iterative BEMs. Thus, iterative refinement of the BIBEE/P results recovers the BEM solution; excellent agreement is obtained in only a few iterations. The boundary-integral-equation framework may also provide a means to derive rigorous results explaining how the empirical correction terms in many modern GB models significantly improve accuracy despite their simple analytical forms.
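For orientation, a common statement of the Coulomb-field approximation for the effective Born radius and of the Still interpolation formula that most GB models build on is reproduced below; prefactor and sign conventions differ between papers, so this is a reference sketch rather than the exact expressions analyzed in the study.

```latex
% Coulomb-field approximation for the effective Born radius R_i of charge i
% (integral over the solvent region exterior to the solute surface):
\frac{1}{R_i} \;=\; \frac{1}{4\pi}\int_{\text{solvent}} \frac{\mathrm{d}V}{\lvert \mathbf{r}-\mathbf{r}_i\rvert^{4}}

% Still interpolation formula used by generalized-Born models to combine the radii:
\Delta G_{\text{pol}} \;\approx\; -\frac{1}{2}\left(\frac{1}{\epsilon_{\text{in}}}-\frac{1}{\epsilon_{\text{out}}}\right)
\sum_{i,j}\frac{q_i\,q_j}{\sqrt{\,r_{ij}^{2}+R_iR_j\exp\!\bigl(-r_{ij}^{2}/(4R_iR_j)\bigr)}}
```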
Development of a method for personal, spatiotemporal exposure assessment.
Adams, Colby; Riggs, Philip; Volckens, John
2009-07-01
This work describes the development and evaluation of a high resolution, space and time-referenced sampling method for personal exposure assessment to airborne particulate matter (PM). This method integrates continuous measures of personal PM levels with the corresponding location-activity (i.e. work/school, home, transit) of the subject. Monitoring equipment include a small, portable global positioning system (GPS) receiver, a miniature aerosol nephelometer, and an ambient temperature monitor to estimate the location, time, and magnitude of personal exposure to particulate matter air pollution. Precision and accuracy of each component, as well as the integrated method performance were tested in a combination of laboratory and field tests. Spatial data was apportioned into pre-determined location-activity categories (i.e. work/school, home, transit) with a simple, temporospatially-based algorithm. The apportioning algorithm was extremely effective with an overall accuracy of 99.6%. This method allows examination of an individual's estimated exposure through space and time, which may provide new insights into exposure-activity relationships not possible with traditional exposure assessment techniques (i.e., time-integrated, filter-based measurements). Furthermore, the method is applicable to any contaminant or stressor that can be measured on an individual with a direct-reading sensor.
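A hedged sketch of a temporospatial apportioning rule of the kind described: each time-stamped GPS/PM record is assigned to home, work/school, or transit from its distance to known anchor coordinates and its speed. The coordinates, radius, and speed cut-off below are hypothetical placeholders, not the study's algorithm parameters.

```python
import math

HOME = (40.5853, -105.0844)   # hypothetical anchor coordinates (lat, lon)
WORK = (40.5734, -105.0865)
RADIUS_M = 100                # illustrative "at location" radius
SPEED_TRANSIT = 1.5           # m/s; above this, treat the record as transit

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a))

def classify(point, speed_ms):
    """Assign a location-activity category to one time-stamped GPS/PM record."""
    if haversine_m(point, HOME) < RADIUS_M and speed_ms < SPEED_TRANSIT:
        return "home"
    if haversine_m(point, WORK) < RADIUS_M and speed_ms < SPEED_TRANSIT:
        return "work/school"
    return "transit"

print(classify((40.58525, -105.08445), speed_ms=0.2))   # -> home
print(classify((40.5800, -105.0850), speed_ms=6.0))     # -> transit
```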
Nannoni, Francesco; Protano, Giuseppe
2016-10-15
A biogeochemistry field study was conducted in the Siena urban area (Italy) with the main objective of establishing the relationship between available amounts of heavy metals in soil assessed by a chemical method (soil fractionation) and bioavailability assessed by a biological method (bioaccumulation in earthworm tissues). The total content of traffic-related (Cd, Cu, Pb, Sb, Zn) and geogenic (Co, Cr, Ni, U) heavy metals in uncontaminated and contaminated soils and their concentrations in soil fractions and earthworms were used for this purpose. The bioavailability of heavy metals assessed by earthworms did not always match the availability defined by soil fractionation. Earthworms were a good indicator to assess the bioavailability of Pb and Sb in soil, while due to physiological mechanisms of regulation and excretion, Cd, Cu and Zn tissue levels in these invertebrates gave misleading estimates of their bioavailable pool. No relationship was identified between chemical and biological availability for the geogenic heavy metals, characterized by a narrow range of total contents in soil. The study highlighted that chemical and biological methods should be combined to provide more complete information about heavy element bioavailability in soils. Copyright © 2016 Elsevier B.V. All rights reserved.
Aligning the 3Rs with new paradigms in the safety assessment of chemicals.
Burden, Natalie; Mahony, Catherine; Müller, Boris P; Terry, Claire; Westmoreland, Carl; Kimber, Ian
2015-04-01
There are currently several factors driving a move away from the reliance on in vivo toxicity testing for the purposes of chemical safety assessment. Progress has started to be made in the development and validation of non-animal methods. However, recent advances in the biosciences provide exciting opportunities to accelerate this process and to ensure that the alternative paradigms for hazard identification and risk assessment deliver lasting 3Rs benefits, whilst improving the quality and relevance of safety assessment. The NC3Rs, a UK-based scientific organisation which supports the development and application of novel 3Rs techniques and approaches, held a workshop recently which brought together over 20 international experts in the field of chemical safety assessment. The aim of this workshop was to review the current scientific, technical and regulatory landscapes, and to identify key opportunities towards reaching these goals. Here, we consider areas where further strategic investment will need to be focused if significant impact on 3Rs is to be matched with improved safety science, and why the timing is right for the field to work together towards an environment where we no longer rely on whole animal data for the accurate safety assessment of chemicals.
NEXUS: tracing the cosmic web connection
NASA Astrophysics Data System (ADS)
Cautun, Marius; van de Weygaert, Rien; Jones, Bernard J. T.
2013-02-01
We introduce the NEXUS algorithm for the identification of cosmic web environments: clusters, filaments, walls and voids. This is a multiscale and automatic morphological analysis tool that identifies all the cosmic structures in a scale free way, without preference for a certain size or shape. We develop the NEXUS method to incorporate the density, tidal field, velocity divergence and velocity shear as tracers of the cosmic web. We also present the NEXUS+ procedure which, taking advantage of a novel filtering of the density in logarithmic space, is very successful at identifying the filament and wall environments in a robust and natural way. To assess the algorithms we apply them to an N-body simulation. We find that all methods correctly identify the most prominent filaments and walls, while there are differences in the detection of the more tenuous structures. In general, the structures traced by the density and tidal fields are clumpier and more rugged than those present in the velocity divergence and velocity shear fields. We find that the NEXUS+ method captures much better the filamentary and wall networks and is successful in detecting even the fainter structures. We also confirm the efficiency of our methods by examining the dark matter particle and halo distributions.
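The morphological classification underlying such web finders can be illustrated with a single-scale Hessian-eigenvalue criterion on a smoothed density field (three negative eigenvalues for nodes, two for filaments, one for walls, none for voids). This is only a simplified stand-in for the multiscale NEXUS/NEXUS+ formalism, run here on an invented toy density cube.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_environments(density, smooth_sigma=2.0):
    """
    Single-scale morphology proxy: classify each cell by the signs of the
    eigenvalues of the Hessian of the smoothed log-density field
    (3 negative eigenvalues: node, 2: filament, 1: wall, 0: void).
    """
    f = gaussian_filter(np.log10(density + 1e-3), smooth_sigma)
    grads = np.gradient(f)
    hess = np.empty(f.shape + (3, 3))
    for i in range(3):
        gi = np.gradient(grads[i])
        for j in range(3):
            hess[..., i, j] = gi[j]
    n_neg = (np.linalg.eigvalsh(hess) < 0).sum(axis=-1)
    return np.array(["void", "wall", "filament", "node"])[n_neg]

# Toy density cube: uniform background plus one overdense tube along the x-axis
rho = np.ones((64, 64, 64))
rho[:, 30:34, 30:34] += 20.0
labels = hessian_environments(rho)
print("cells flagged as filament:", int((labels == "filament").sum()))
```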
Methods to control ectomycorrhizal colonization: effectiveness of chemical and physical barriers.
Teste, François P; Karst, Justine; Jones, Melanie D; Simard, Suzanne W; Durall, Daniel M
2006-12-01
We conducted greenhouse experiments using Douglas-fir (Pseudotsuga menziesii var. glauca) seedlings where chemical methods (fungicides) were used to prevent ectomycorrhizal colonization of single seedlings or physical methods (mesh barriers) were used to prevent formation of mycorrhizal connections between neighboring seedlings. These methods were chosen for their ease of application in the field. We applied the fungicides, Topas (nonspecific) and Senator (ascomycete specific), separately and in combination at different concentrations and application frequencies to seedlings grown in unsterilized forest soils. Additionally, we assessed the ability of hyphae to penetrate mesh barriers of various pore sizes (0.2, 1, 20, and 500 microm) to form mycorrhizas on roots of neighboring seedlings. Ectomycorrhizal colonization was reduced by approximately 55% with the application of Topas at 0.5 g l(-1). Meshes with pore sizes of 0.2 and 1 microm were effective in preventing the formation of mycorrhizas via hyphal growth across the mesh barriers. Hence, meshes in this range of pore sizes could also be used to prevent the formation of common mycorrhizal networks in the field. Depending on the ecological question of interest, Topas or the employment of mesh with pore sizes <1 microm are suitable for restricting mycorrhization in the field.
Nuclear model calculations and their role in space radiation research
NASA Technical Reports Server (NTRS)
Townsend, L. W.; Cucinotta, F. A.; Heilbronn, L. H.
2002-01-01
Proper assessments of spacecraft shielding requirements and concomitant estimates of risk to spacecraft crews from energetic space radiation require accurate, quantitative methods of characterizing the compositional changes in these radiation fields as they pass through thick absorbers. These quantitative methods are also needed for characterizing accelerator beams used in space radiobiology studies. Because of the impracticality/impossibility of measuring these altered radiation fields inside critical internal body organs of biological test specimens and humans, computational methods rather than direct measurements must be used. Since composition changes in the fields arise from nuclear interaction processes (elastic, inelastic and breakup), knowledge of the appropriate cross sections and spectra must be available. Experiments alone cannot provide the necessary cross section and secondary particle (neutron and charged particle) spectral data because of the large number of nuclear species and wide range of energies involved in space radiation research. Hence, nuclear models are needed. In this paper current methods of predicting total and absorption cross sections and secondary particle (neutrons and ions) yields and spectra for space radiation protection analyses are reviewed. Model shortcomings are discussed and future needs presented. © 2002 COSPAR. Published by Elsevier Science Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, G.; Aspinall, M. D.; Ma, X.; Joyce, M. J.
2009-08-01
The discrimination of neutron and γ-ray events in an organic scintillator has been investigated by using a method based on an artificial neural network (ANN). Voltage pulses arising from an EJ-301 organic liquid scintillation detector in a mixed radiation field have been recorded with a fast digital sampling oscilloscope. Piled-up events have been disentangled using a pile-up management unit based on a fitting method. Each individual pulse has subsequently been sent to a discrimination unit which discriminates neutron and γ-ray events with a method based on an artificial neural network. This discrimination technique has been verified by the corresponding mixed-field data assessed by time of flight (TOF). It is shown that the characterization of the neutrons and photons achieved by the discrimination method based on the ANN is consistent with that afforded by TOF. This approach enables events that are often as a result of scattering or pile-up to be identified and returned to the data set and affords digital discrimination of mixed radiation fields in a broad range of environments on the basis of training obtained with a single TOF dataset.
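As an illustration of ANN-based pulse-shape discrimination in general (not the network architecture or detector data used in this work), the sketch below trains a one-hidden-layer network on two shape features extracted from synthetic scintillator pulses whose slow-component fractions differ between the two event classes.

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_pulse(kind, n=120):
    """Synthetic scintillator pulse: fast component plus a slower tail.
    'neutron' pulses get a heavier slow component than 'gamma' pulses (illustrative values)."""
    t = np.arange(n)
    slow = 0.30 if kind == "neutron" else 0.10
    pulse = (1 - slow) * np.exp(-t / 5.0) + slow * np.exp(-t / 40.0)
    return pulse + 0.01 * rng.standard_normal(n)

def features(p):
    """Two shape features: tail-to-total charge ratio and normalised peak height."""
    total = p.sum()
    return np.array([p[20:].sum() / total, p.max() / total])

# Labelled training set of alternating neutron (1) and gamma (0) pulses
X = np.array([features(synth_pulse(k)) for k in ["neutron", "gamma"] * 500])
y = np.array([1.0, 0.0] * 500)

# One-hidden-layer network trained by full-batch gradient descent on cross-entropy
W1 = rng.standard_normal((2, 8)) * 0.5; b1 = np.zeros(8)
W2 = rng.standard_normal(8) * 0.5;      b2 = 0.0
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    grad_out = (p - y) / len(y)                     # dL/dz for cross-entropy loss
    W2 -= 0.5 * h.T @ grad_out; b2 -= 0.5 * grad_out.sum()
    grad_h = np.outer(grad_out, W2) * (1 - h ** 2)  # backpropagate through tanh layer
    W1 -= 0.5 * X.T @ grad_h;  b1 -= 0.5 * grad_h.sum(axis=0)

pred = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5
print("training accuracy:", (pred == (y == 1)).mean())
```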