Electrofishing effort requirements for estimating species richness in the Kootenai River, Idaho
Watkins, Carson J.; Quist, Michael C.; Shepard, Bradley B.; Ireland, Susan C.
2016-01-01
This study was conducted on the Kootenai River, Idaho, to provide insight into the sampling requirements needed to optimize future monitoring of fish assemblage responses to habitat rehabilitation. Our objective was to define the electrofishing effort (m) needed to have a 95% probability of sampling 50, 75, and 100% of the observed species richness and to evaluate the relative influence of depth, velocity, and instream woody cover on sample size requirements. Side-channel habitats required more sampling effort to achieve 75 and 100% of the total species richness than main-channel habitats. The sampling effort required to have a 95% probability of sampling 100% of the species richness was 1100 m for main-channel sites and 1400 m for side-channel sites. We hypothesized that the difference in sampling requirements was largely due to differences in habitat characteristics and species richness between main- and side-channel habitats. In general, main-channel habitats had lower species richness than side-channel habitats. Habitat characteristics (i.e., depth, current velocity, and woody instream cover) were not related to sample size requirements. Our guidelines will improve sampling efficiency during monitoring efforts in the Kootenai River and provide insight into sampling designs for other large western river systems where electrofishing is used to assess fish assemblages.
We empirically examined the sampling effort required to adequately represent species richness and proportionate abundance when backpack electrofishing western Oregon streams. When sampling, we separately recorded data for each habitat unit. In data analyses, we repositioned each...
Sampling effort needed to estimate condition and species richness in the Ohio River, USA
The level of sampling effort required to characterize fish assemblage condition in a river for the purposes of bioassessment may be estimated via different approaches. However, the goal with any approach is to determine the minimum level of effort necessary to reach some specific...
Sampling effort and estimates of species richness based on prepositioned area electrofisher samples
Bowen, Z.H.; Freeman, Mary C.
1998-01-01
Estimates of species richness based on electrofishing data are commonly used to describe the structure of fish communities. One electrofishing method for sampling riverine fishes that has become popular in the last decade is the prepositioned area electrofisher (PAE). We investigated the relationship between sampling effort and fish species richness at seven sites in the Tallapoosa River system, USA, based on 1,400 PAE samples collected during 1994 and 1995. First, we estimated species richness at each site using the first-order jackknife and compared observed values for species richness and jackknife estimates of species richness to estimates based on historical collection data. Second, we used a permutation procedure and nonlinear regression to examine rates of species accumulation. Third, we used regression to predict the number of PAE samples required to collect the jackknife estimate of species richness at each site during 1994 and 1995. We found that jackknife estimates of species richness generally were less than or equal to estimates based on historical collection data. The relationship between PAE electrofishing effort and species richness in the Tallapoosa River was described by a positive asymptotic curve, as found in other studies using different electrofishing gears in wadable streams. Results from nonlinear regression analyses indicated that rates of species accumulation were variable among sites and between years. Across sites and years, predictions of sampling effort required to collect jackknife estimates of species richness suggested that doubling sampling effort (to 200 PAEs) would typically increase observed species richness by not more than six species. However, sampling effort beyond about 60 PAE samples typically increased observed species richness by < 10%. We recommend using historical collection data in conjunction with a preliminary sample size of at least 70 PAE samples to evaluate estimates of species richness in medium-sized rivers.
Seventy PAE samples should provide enough information to describe the relationship between sampling effort and species richness and thus facilitate evaluation of a sampling effort.
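The first-order jackknife named in the record above has a simple closed form, S_jack = S_obs + f1(n - 1)/n, where f1 is the number of species detected in exactly one of the n sample units. A minimal sketch (the sample data below are hypothetical, for illustration only):

```python
# First-order jackknife estimator of species richness:
# S_jack = S_obs + f1 * (n - 1) / n, where f1 counts species that
# occur in exactly one of the n sample units ("uniques").

def jackknife1(samples):
    """samples: list of sets, one set of detected species per sample unit."""
    n = len(samples)
    observed = set().union(*samples)
    counts = {sp: sum(sp in s for s in samples) for sp in observed}
    f1 = sum(1 for c in counts.values() if c == 1)
    return len(observed) + f1 * (n - 1) / n

# Hypothetical data: 3 PAE samples, 4 observed species; "c" and "d"
# each occur in only one sample, so f1 = 2.
samples = [{"a", "b"}, {"a", "c"}, {"a", "b", "d"}]
print(jackknife1(samples))  # 4 + 2 * 2/3 ≈ 5.33
```

Because the estimator only adds a fraction of the "uniques", it stays at or near the observed richness once most species have been seen repeatedly, which is consistent with the asymptotic accumulation curves the study describes.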
Sample Size and Allocation of Effort in Point Count Sampling of Birds in Bottomland Hardwood Forests
Winston P. Smith; Daniel J. Twedt; Robert J. Cooper; David A. Wiedenfeld; Paul B. Hamel; Robert P. Ford
1995-01-01
To examine sample size requirements and optimum allocation of effort in point count sampling of bottomland hardwood forests, we computed minimum sample sizes from variation recorded during 82 point counts (May 7-May 16, 1992) from three localities containing three habitat types across three regions of the Mississippi Alluvial Valley (MAV). Also, we estimated the effect...
Electrofishing effort required to estimate biotic condition in southern Idaho Rivers
Maret, Terry R.; Ott, Douglas S.; Herlihy, Alan T.
2007-01-01
An important issue surrounding biomonitoring in large rivers is the minimum sampling effort required to collect an adequate number of fish for accurate and precise determinations of biotic condition. During the summer of 2002, we sampled 15 randomly selected large-river sites in southern Idaho to evaluate the effects of sampling effort on an index of biotic integrity (IBI). Boat electrofishing was used to collect sample populations of fish in river reaches representing 40 and 100 times the mean channel width (MCW; wetted channel) at base flow. Minimum sampling effort was assessed by comparing the relation between reach length sampled and change in IBI score. Thirty-two species of fish in the families Catostomidae, Centrarchidae, Cottidae, Cyprinidae, Ictaluridae, Percidae, and Salmonidae were collected. Of these, 12 alien species were collected at 80% (12 of 15) of the sample sites; alien species represented about 38% of all species (N = 32) collected during the study. A total of 60% (9 of 15) of the sample sites had poor IBI scores. A minimum reach length of about 36 times MCW was determined to be sufficient for collecting an adequate number of fish for estimating biotic condition based on an IBI score. For most sites, this equates to collecting 275 fish at a site. Results may be applicable to other semiarid, fifth-order through seventh-order rivers sampled during summer low-flow conditions.
The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival
Kordjazi, Ziya; Frusher, Stewart; Buxton, Colin; Gardner, Caleb; Bird, Tomas
2016-01-01
Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000 lobsters) were created by random subsampling from each annual survey. This process of random subsampling was also used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30 and 50 traps per day) and four levels of the number of sampling-days (2, 4, 6 and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters or 50 traps used on four consecutive sampling-days was required for obtaining precise survival estimations for males and females, separately. Reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was sufficient for management of the fishery. PMID:26990561
Active learning based segmentation of Crohn's disease from abdominal MRI.
Mahapatra, Dwarikanath; Vos, Franciscus M; Buhmann, Joachim M
2016-05-01
This paper proposes a novel active learning (AL) framework, and combines it with semi-supervised learning (SSL) for segmenting Crohn's disease (CD) tissues from abdominal magnetic resonance (MR) images. Robust fully supervised learning (FSL) based classifiers require large amounts of labeled data covering different disease severities. Obtaining such data is time consuming and requires considerable expertise. SSL methods use a few labeled samples, and leverage the information from many unlabeled samples to train an accurate classifier. AL queries labels of the most informative samples and maximizes gain from the labeling effort. Our primary contribution is in designing a query strategy that combines novel context information with classification uncertainty and feature similarity. Combining SSL and AL gives a robust segmentation method that: (1) optimally uses few labeled samples and many unlabeled samples; and (2) requires lower training time. Experimental results show our method achieves higher segmentation accuracy than FSL methods with fewer samples and reduced training effort. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Approximate sample sizes required to estimate length distributions
Miranda, L.E.
2007-01-01
The sample sizes required to estimate fish length were determined by bootstrapping from reference length distributions. Depending on population characteristics and species-specific maximum lengths, 1-cm length-frequency histograms required 375-1,200 fish to estimate within 10% with 80% confidence, 2.5-cm histograms required 150-425 fish, proportional stock density required 75-140 fish, and mean length required 75-160 fish. In general, smaller species, smaller populations, populations with higher mortality, and simpler length statistics required fewer samples. Indices that require low sample sizes may be suitable for monitoring population status, and when large changes in length are evident, additional sampling effort may be allocated to more precisely define length status with more informative estimators. © Copyright by the American Fisheries Society 2007.
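The bootstrapping approach described above can be sketched as follows: resample a reference length distribution at increasing sample sizes until the estimated statistic (here, mean length) falls within 10% of the reference value with 80% confidence. The reference distribution, step size, and replicate count below are hypothetical, not the study's actual values.

```python
import random

def required_n(reference, rel_error=0.10, confidence=0.80, n_boot=500, seed=1):
    """Smallest sample size (in steps of 10) whose bootstrap mean falls
    within rel_error of the reference mean with the given confidence."""
    rng = random.Random(seed)
    true_mean = sum(reference) / len(reference)
    for n in range(10, len(reference), 10):
        hits = 0
        for _ in range(n_boot):
            boot = [rng.choice(reference) for _ in range(n)]
            if abs(sum(boot) / n - true_mean) <= rel_error * true_mean:
                hits += 1
        if hits / n_boot >= confidence:
            return n
    return None

# Hypothetical reference population: lengths in mm
rng = random.Random(0)
reference = [rng.gauss(300, 80) for _ in range(2000)]
print(required_n(reference))
```

Tightening the relative-error tolerance or raising the confidence level increases the returned sample size, mirroring the study's finding that finer length statistics (e.g., 1-cm histograms) demand far more fish than coarse indices.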
Sampling bee communities using pan traps: alternative methods increase sample size
USDA-ARS's Scientific Manuscript database
Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...
How do feelings influence effort? An empirical study of entrepreneurs' affect and venture effort.
Foo, Maw-Der; Uy, Marilyn A; Baron, Robert A
2009-07-01
How do feelings influence the effort of entrepreneurs? To obtain data on this issue, the authors implemented experience sampling methodology in which 46 entrepreneurs used cell phones to provide reports on their affect, future temporal focus, and venture effort twice daily for 24 days. Drawing on the affect-as-information theory, the study found that entrepreneurs' negative affect directly predicts entrepreneurs' effort toward tasks that are required immediately. Results were consistent for within-day and next-day time lags. Extending the theory, the study found that positive affect predicts venture effort beyond what is immediately required and that this relationship is mediated by future temporal focus. The mediating effects were significant only for next-day outcomes. Implications of findings on the nature of the affect-effort relationship for different time lags are discussed.
Al-Chokhachy, R.; Budy, P.; Conner, M.
2009-01-01
Using empirical field data for bull trout (Salvelinus confluentus), we evaluated the trade-off between power and sampling effort-cost using Monte Carlo simulations of commonly collected mark-recapture-resight and count data, and we estimated the power to detect changes in abundance across different time intervals. We also evaluated the effects of monitoring different components of a population and stratification methods on the precision of each method. Our results illustrate substantial variability in the relative precision, cost, and information gained from each approach. While grouping estimates by age or stage class substantially increased the precision of estimates, spatial stratification of sampling units resulted in limited increases in precision. Although mark-resight methods allowed for estimates of abundance versus indices of abundance, our results suggest snorkel surveys may be a more affordable monitoring approach across large spatial scales. Detecting a 25% decline in abundance after 5 years was not possible, regardless of technique (power = 0.80), without high sampling effort (48% of study site). Detecting a 25% decline was possible after 15 years, but still required high sampling efforts. Our results suggest detecting moderate changes in abundance of freshwater salmonids requires considerable resource and temporal commitments and highlight the difficulties of using abundance measures for monitoring bull trout populations.
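The Monte Carlo power analysis idea in the record above can be illustrated with a toy simulation: generate noisy abundance indices under a known decline, fit a log-linear trend, and count how often the decline is detected. All population and error parameters below are hypothetical; the study's actual simulations were built on bull trout field data.

```python
import math, random

def power_to_detect(decline=0.25, years=15, cv=0.3, n_sims=1000, seed=42):
    """Fraction of simulated monitoring series in which a log-linear
    decline is detected (one-sided test, alpha = 0.05)."""
    rng = random.Random(seed)
    r = math.log(1 - decline) / years  # annual trend on the log scale
    detected = 0
    for _ in range(n_sims):
        t_vals = list(range(years + 1))
        y = [r * t + rng.gauss(0, cv) for t in t_vals]  # log abundance index
        n = len(t_vals)
        tbar, ybar = sum(t_vals) / n, sum(y) / n
        sxx = sum((t - tbar) ** 2 for t in t_vals)
        slope = sum((t - tbar) * (yi - ybar) for t, yi in zip(t_vals, y)) / sxx
        resid = [yi - ybar - slope * (t - tbar) for t, yi in zip(t_vals, y)]
        se = math.sqrt(sum(e * e for e in resid) / (n - 2) / sxx)
        if slope + 1.645 * se < 0:  # reject "no decline" at alpha = 0.05
            detected += 1
    return detected / n_sims

print(power_to_detect())
```

Shortening the series or increasing the observation error (cv) drops the power sharply, which is the qualitative pattern the study reports: moderate declines are essentially undetectable over 5 years and demand high effort even over 15.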
The cost of model reference adaptive control - Analysis, experiments, and optimization
NASA Technical Reports Server (NTRS)
Messer, R. S.; Haftka, R. T.; Cudney, H. H.
1993-01-01
In this paper, the performance of Model Reference Adaptive Control (MRAC) is studied in numerical simulations and verified experimentally with the objective of understanding how differences between the plant and the reference model affect the control effort. MRAC is applied analytically and experimentally to a single degree of freedom system and analytically to a MIMO system with controlled differences between the model and the plant. It is shown that the control effort is sensitive to differences between the plant and the reference model. The effects of increased damping in the reference model are considered, and it is shown that requiring the controller to provide increased damping actually decreases the required control effort when differences between the plant and reference model exist. This result is useful because a natural first attempt to counteract the increased control effort due to differences between the plant and reference model would be to require less damping; however, this would actually increase the control effort. Optimization of weighting matrices is shown to help reduce the increase in required control effort. However, it was found that eventually the optimization resulted in a design that required an extremely high sampling rate for successful realization.
The Earth Microbiome Project and modeling the planet's microbial potential (Invited)
NASA Astrophysics Data System (ADS)
Gilbert, J. A.
2013-12-01
The understanding of Earth's climate and ecology requires multiscale observations of the biosphere, of which microbial life is a major component. However, acquiring and processing physical samples of soil, water, and air at the spatial and temporal resolution needed to capture the immense variation in microbial dynamics would require a herculean effort and immense financial resources, dwarfing even the most ambitious projects to date. To overcome this hurdle we created the Earth Microbiome Project (EMP), a crowd-sourced effort to acquire physical samples from researchers around the world that are, importantly, contextualized with physical, chemical, and biological data detailing the environmental properties of each sample at the location and time it was acquired. The EMP leverages these existing efforts to target a systematic analysis of microbial taxonomic and functional dynamics across a vast array of environmental parameter gradients. The EMP captures the environmental gradients, location, time, and sampling protocol information about every sample donated by our valued collaborators. Physical samples are then processed using a standardized DNA extraction, PCR, and shotgun sequencing protocol to generate comparable data regarding the microbial community structure and function in each sample. To date we have processed >17,000 samples from 40 different biomes. One of the key goals of the EMP is to map the spatiotemporal variability of microbial communities to capture the changes in important functional processes that need to be appropriately expressed in models to provide reliable forecasts of ecosystem phenotype across our changing planet. This is essential if we are to develop economically sound strategies to be good stewards of our Earth.
The EMP recognizes that environments comprise complex sets of interdependent parameters and that the development of useful predictive computational models of both terrestrial and atmospheric systems requires recognition and accommodation of sources of uncertainty.
Flight program language requirements. Volume 2: Requirements and evaluations
NASA Technical Reports Server (NTRS)
1972-01-01
The efforts and results are summarized for a study to establish requirements for a flight programming language for future onboard computer applications. Several different languages were available as potential candidates for future NASA flight programming efforts. The study centered around an evaluation of the four most pertinent existing aerospace languages. Evaluation criteria were established, and selected kernels from the current Saturn 5 and Skylab flight programs were used as benchmark problems for sample coding. An independent review of the language specifications incorporated anticipated future programming requirements into the evaluation. A set of detailed language requirements was synthesized from these activities. The details of program language requirements and of the language evaluations are described.
A standard sampling protocol to assess the fish assemblages and abundances in large, coldwater rivers is most accurate and precise if consistent gears and levels of effort are used at each site. This requires thorough crew training, quality control audits, and replicate sampling...
Modular biowaste monitoring system
NASA Technical Reports Server (NTRS)
Fogal, G. L.
1975-01-01
The objective of the Modular Biowaste Monitoring System Program was to generate and evaluate hardware for supporting shuttle life science experimental and diagnostic programs. An initial conceptual design effort established requirements and defined an overall modular system for the collection, measurement, sampling and storage of urine and feces biowastes. This conceptual design effort was followed by the design, fabrication and performance evaluation of a flight prototype model urine collection, volume measurement and sampling capability. No operational or performance deficiencies were uncovered as a result of the performance evaluation tests.
Creel survey sampling designs for estimating effort in short-duration Chinook salmon fisheries
McCormick, Joshua L.; Quist, Michael C.; Schill, Daniel J.
2013-01-01
Chinook Salmon Oncorhynchus tshawytscha sport fisheries in the Columbia River basin are commonly monitored using roving creel survey designs and require precise, unbiased catch estimates. The objective of this study was to examine the relative bias and precision of total catch estimates using various sampling designs to estimate angling effort under the assumption that mean catch rate was known. We obtained information on angling populations based on direct visual observations of portions of Chinook Salmon fisheries in three Idaho river systems over a 23-d period. Based on the angling population, Monte Carlo simulations were used to evaluate the properties of effort and catch estimates for each sampling design. All sampling designs evaluated were relatively unbiased. Systematic random sampling (SYS) resulted in the most precise estimates. The SYS and simple random sampling designs had mean square error (MSE) estimates that were generally half of those observed with cluster sampling designs. The SYS design was more efficient (i.e., higher accuracy per unit cost) than a two-cluster design. Increasing the number of clusters available for sampling within a day decreased the MSE of estimates of daily angling effort, but the MSE of total catch estimates was variable depending on the fishery. The results of our simulations provide guidelines on the relative influence of sample sizes and sampling designs on parameters of interest in short-duration Chinook Salmon fisheries.
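The design comparison in the record above can be illustrated with a toy simulation: estimate total daily angling effort from a few instantaneous counts under simple random versus systematic sampling, then compare mean square error across many simulated days. The diurnal effort curve, noise level, and count schedule below are hypothetical, not the study's.

```python
import math, random

def simulate_day(n_periods=48, seed=None):
    """Anglers present per half-hour period: a mid-day peak plus noise."""
    rng = random.Random(seed)
    return [max(0.0, 40 * math.sin(math.pi * t / n_periods) + rng.gauss(0, 5))
            for t in range(n_periods)]

def estimate_total(day, picks):
    """Expand the sampled counts to a total-effort estimate."""
    return sum(day[i] for i in picks) * len(day) / len(picks)

def mse(design, n_days=2000, n_counts=4, seed=7):
    rng = random.Random(seed)
    errs = []
    for _ in range(n_days):
        day = simulate_day(seed=rng.random())
        if design == "srs":  # simple random sample of count times
            picks = rng.sample(range(len(day)), n_counts)
        else:  # systematic: random start, evenly spaced counts
            step = len(day) // n_counts
            picks = range(rng.randrange(step), len(day), step)
        errs.append(estimate_total(day, picks) - sum(day))
    return sum(e * e for e in errs) / len(errs)

# Systematic sampling spreads counts across the diurnal curve, so its
# expected MSE is lower here, consistent with the study's finding.
print(mse("sys") < mse("srs"))
```

The advantage of systematic sampling in this sketch comes from the smooth within-day trend: evenly spaced counts cannot all land in the low-effort morning or the mid-day peak, while a simple random sample can.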
GY SAMPLING THEORY IN ENVIRONMENTAL STUDIES 2: SUBSAMPLING ERROR MEASUREMENTS
Sampling can be a significant source of error in the measurement process. The characterization and cleanup of hazardous waste sites require data that meet site-specific levels of acceptable quality if scientifically supportable decisions are to be made. In support of this effort,...
Applying Active Learning to Assertion Classification of Concepts in Clinical Text
Chen, Yukun; Mani, Subramani; Xu, Hua
2012-01-01
Supervised machine learning methods for clinical natural language processing (NLP) research require a large number of annotated samples, which are very expensive to build because of the involvement of physicians. Active learning, an approach that actively samples from a large pool, provides an alternative solution. Its major goal in classification is to reduce the annotation effort while maintaining the quality of the predictive model. However, few studies have investigated its uses in clinical NLP. This paper reports an application of active learning to a clinical text classification task: to determine the assertion status of clinical concepts. The annotated corpus for the assertion classification task in the 2010 i2b2/VA Clinical NLP Challenge was used in this study. We implemented several existing and newly developed active learning algorithms and assessed their uses. The outcome is reported in the global ALC score, based on the Area under the average Learning Curve of the AUC (Area Under the Curve) score. Results showed that when the same number of annotated samples was used, active learning strategies could generate better classification models (best ALC = 0.7715) than the passive learning method (random sampling) (ALC = 0.7411). Moreover, to achieve the same classification performance, active learning strategies required fewer samples than the random sampling method. For example, to achieve an AUC of 0.79, the random sampling method used 32 samples, while our best active learning algorithm required only 12 samples, a reduction of 62.5% in manual annotation effort. PMID:22127105
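The core of pool-based active learning with uncertainty sampling, the baseline strategy family evaluated above (the paper's own query strategy additionally uses context and similarity information), can be sketched with a toy one-dimensional classifier. The classifier, oracle, and data below are all hypothetical, for illustration only.

```python
def train_threshold(labeled):
    """Toy 1-D classifier: threshold halfway between the class means."""
    xs0 = [x for x, y in labeled if y == 0]
    xs1 = [x for x, y in labeled if y == 1]
    return (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2

def uncertainty_sampling(pool, oracle, n_queries):
    """Repeatedly label the pool point closest to the decision boundary."""
    pool = sorted(pool)
    labeled = [(pool[0], oracle(pool[0])), (pool[-1], oracle(pool[-1]))]
    unlabeled = pool[1:-1]
    for _ in range(n_queries):
        t = train_threshold(labeled)
        x = min(unlabeled, key=lambda v: abs(v - t))  # most uncertain point
        unlabeled.remove(x)
        labeled.append((x, oracle(x)))
    return train_threshold(labeled)

# Hypothetical oracle with true class boundary at 0.6
oracle = lambda x: int(x > 0.6)
pool = [i / 100 for i in range(101)]
t = uncertainty_sampling(pool, oracle, n_queries=10)
print(round(t, 2))  # lands near the true boundary after few queries
```

Random sampling would spend most of its label budget on easy points far from the boundary; querying the most uncertain point concentrates annotation effort where it changes the model, which is the mechanism behind the sample savings reported above.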
Influence of Sampling Effort on the Estimated Richness of Road-Killed Vertebrate Wildlife
NASA Astrophysics Data System (ADS)
Bager, Alex; da Rosa, Clarissa A.
2011-05-01
Road-killed mammals, birds, and reptiles were collected weekly from highways in southern Brazil in 2002 and 2005. The objective was to assess variation in estimates of road-kill impacts on species richness produced by different sampling efforts, and to provide information to aid in the experimental design of future sampling. Richness observed in weekly samples was compared with sampling for different periods. In each period, the list of road-killed species was evaluated based on estimates the community structure derived from weekly samplings, and by the presence of the ten species most subject to road mortality, and also of threatened species. Weekly samples were sufficient only for reptiles and mammals, considered separately. Richness estimated from the biweekly samples was equal to that found in the weekly samples, and gave satisfactory results for sampling the most abundant and threatened species. The ten most affected species showed constant road-mortality rates, independent of sampling interval, and also maintained their dominance structure. Birds required greater sampling effort. When the composition of road-killed species varies seasonally, it is necessary to take biweekly samples for a minimum of one year. Weekly or more-frequent sampling for periods longer than two years is necessary to provide a reliable estimate of total species richness.
Peterson, J.; Dunham, J.B.
2003-01-01
Effective conservation efforts for at-risk species require knowledge of the locations of existing populations. Species presence can be estimated directly by conducting field-sampling surveys or alternatively by developing predictive models. Direct surveys can be expensive and inefficient, particularly for rare and difficult-to-sample species, and models of species presence may produce biased predictions. We present a Bayesian approach that combines sampling and model-based inferences for estimating species presence. The accuracy and cost-effectiveness of this approach were compared to those of sampling surveys and predictive models for estimating the presence of the threatened bull trout (Salvelinus confluentus) via simulation with existing models and empirical sampling data. Simulations indicated that a sampling-only approach would be the most effective and would result in the lowest presence and absence misclassification error rates for three thresholds of detection probability. When sampling effort was considered, however, the combined approach resulted in the lowest error rates per unit of sampling effort. Hence, lower probability-of-detection thresholds can be specified with the combined approach, resulting in lower misclassification error rates and improved cost-effectiveness.
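In its simplest form, combining a model-based prior with survey data reduces to a Bayesian update of the presence probability after repeated non-detections. A sketch under that simplification (the numbers are hypothetical, and the paper's actual models are richer):

```python
def posterior_presence(psi, p, k):
    """P(species present | k surveys with no detections).

    psi : prior (e.g., model-based) probability of presence
    p   : per-survey detection probability given presence
    """
    miss = (1 - p) ** k  # P(no detections in k surveys | present)
    return psi * miss / (psi * miss + (1 - psi))

# Prior 0.5, detection probability 0.4 per survey, 3 surveys, no detections:
print(round(posterior_presence(0.5, 0.4, 3), 3))  # 0.178
```

Each additional empty survey shrinks the posterior, so a site can be declared "probably absent" at a chosen error threshold with fewer visits when the model-based prior is already low, which is the cost-effectiveness mechanism of the combined approach.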
Innovative Ways for Information Transfer in Biobanking
ERIC Educational Resources Information Center
Macheiner, Tanja; Huppertz, Berthold; Sargsyan, Karine
2013-01-01
Purpose: Biobanks are collections of biological samples (e.g. tissue samples and body fluids) and their associated data intended for various approaches in medical research. The field of biobanking evolves rapidly as an interdisciplinary branch of research and requires educational efforts to provide skilled experts in Europe and beyond. New ways in…
This multi-year pilot study evaluated a proposed field method for its effectiveness in the collection of a benthic macroinvertebrate sample adequate for use in the condition assessment of streams and rivers in the Neuquén Province, Argentina. A total of 13 sites, distribut...
Jensen, Pamela C.; Purcell, Maureen K.; Morado, J. Frank; Eckert, Ginny L.
2012-01-01
The Alaskan red king crab (Paralithodes camtschaticus) fishery was once one of the most economically important single-species fisheries in the world, but is currently depressed. This fishery would benefit from improved stock assessment capabilities. Larval crab distribution is patchy temporally and spatially, requiring extensive sampling efforts to locate and track larval dispersal. Large-scale plankton surveys are generally cost prohibitive because of the effort required for collection and the time and taxonomic expertise required to sort samples to identify plankton individually via light microscopy. Here, we report the development of primers and a dual-labeled probe for use in a DNA-based real-time polymerase chain reaction assay targeting the red king crab mitochondrial gene cytochrome oxidase I for the detection of red king crab larval DNA in plankton samples. The assay allows identification of plankton samples containing crab larval DNA and provides an estimate of DNA copy number present in a sample without sorting the plankton sample visually. The assay was tested on DNA extracted from whole red king crab larvae and plankton samples seeded with whole larvae, and it detected DNA copies equivalent to 1/10,000th of a larva and 1 crab larva/5 mL sieved plankton, respectively. The real-time polymerase chain reaction assay can be used to screen plankton samples for larvae in a fraction of the time required for traditional microscopic methods, which offers advantages for stock assessment methodologies for red king crab as well as a rapid and reliable method to assess abundance of red king crab larvae as needed to improve the understanding of life history and population processes, including larval population dynamics.
Accounting for Incomplete Species Detection in Fish Community Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Orth, Dr. Donald J; Jager, Yetta
2013-01-01
Riverine fish assemblages are heterogeneous and difficult to characterize with a one-size-fits-all approach to sampling. Furthermore, detecting changes in fish assemblages over time requires accounting for variation in sampling designs. We present a modeling approach that permits heterogeneous sampling by accounting for site and sampling covariates (including method) in a model-based framework for estimation (versus a sampling-based framework). We snorkeled during three surveys and electrofished during a single survey in a suite of delineated habitats stratified by reach type. We developed single-species occupancy models to determine covariates influencing patch occupancy and species detection probabilities, whereas community occupancy models estimated species richness in light of incomplete detection. For most species, information-theoretic criteria showed higher support for models that included patch size and reach as covariates of occupancy. In addition, models including patch size and sampling method as covariates of detection probability also had higher support. Detection probability estimates for snorkeling surveys were higher for larger non-benthic species, whereas electrofishing was more effective at detecting smaller benthic species. The number of sites and sampling occasions required to accurately estimate occupancy varied among fish species. For rare benthic species, our results suggested that a higher number of occasions, and especially the addition of electrofishing, may be required to improve detection probabilities and obtain accurate occupancy estimates. Community models suggested that richness was 41% higher than the number of species actually observed, and the addition of an electrofishing survey increased estimated richness by 13%. These results can inform future fish assemblage monitoring efforts through sampling design decisions such as site selection (e.g., stratifying based on patch size) and determining the effort required (e.g., number of sites versus occasions).
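The trade-off between the number of sampling occasions and per-survey detection probability can be sketched with the standard cumulative-detection relation, P(detected at least once) = 1 − (1 − p)^K. The detection probabilities below are hypothetical illustrations, not values estimated in the study:

```python
import math

def occasions_needed(p_detect, target=0.95):
    """Number of survey occasions K such that an occupied patch is detected
    at least once with probability >= target, assuming independent occasions
    with a constant per-survey detection probability p_detect."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_detect))

# Hypothetical per-survey detection probabilities (illustrative only)
print(occasions_needed(0.6))   # e.g., a large non-benthic species by snorkeling -> 4
print(occasions_needed(0.2))   # e.g., a small benthic species -> 14
```

The steep increase from 4 to 14 occasions as p falls illustrates why rare benthic species may require many more occasions, or a more effective gear, to reach the same confidence of detection.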
Burdick, Summer M.; Wilkens, Alexander X.; VanderKooi, Scott P.
2008-01-01
We continued sampling juvenile suckers in 2006 as part of an effort to develop bioenergetics models for juvenile Lost River and shortnose suckers. This study required us to collect fish to determine growth rates and energy content of juvenile suckers. We followed the sampling protocols and methods described by Hendrixson et al. (2007b) to maintain continuity and facilitate comparisons with data collected in recent years, but sampled at a reduced level of effort compared to previous years (approximately one-third) due to limited funding. Here we present a summary of catch data collected in 2006. Bioenergetics models will be reported separately.
Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects
Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.
2012-01-01
Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305
Optimal sampling for radiotelemetry studies of spotted owl habitat and home range.
Andrew B. Carey; Scott P. Horton; Janice A. Reid
1989-01-01
Radiotelemetry studies of spotted owl (Strix occidentalis) ranges and habitat use must be designed efficiently to estimate the parameters needed for a sample of individuals sufficient to describe the population. Independent data are required by analytical methods and provide the greatest return of information per unit of effort. We examined time series of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maisel, B.E.; Hunt, G.T.; Devaney, R.J. Jr.
EPA's Brownfields Economic Redevelopment Initiative has sparked renewal of industrial and commercial parcels otherwise idled or under-utilized because of real or perceived environmental contamination. In certain cases, restoring such parcels to productive economic use requires a redevelopment effort that protects human health and welfare by minimizing offsite migration of environmental contaminants during cleanup, demolition, and remediation activities. To support these objectives, an air monitoring program is often required as an integral element of a comprehensive brownfields redevelopment effort. This paper presents a strategic framework for the design and execution of an ambient air monitoring program in support of a brownfields remediation effort ongoing in Lawrence, MA. Based on site characterization, the program included sample collection and laboratory analysis of ambient air samples for polychlorinated biphenyls (PCBs), polychlorinated dibenzodioxins and polychlorinated dibenzofurans (PCDDs/PCDFs), total suspended particulate (TSP), inhalable particulate (PM10), and lead. The program included four monitoring phases, identified as background, wintertime, demolition/remediation, and post-demolition. Air sampling occurred over a 16-month period during 1996-97, during which time nine sampling locations were used to produce approximately 1,500 ambient air samples. Following strict data review and validation procedures, ambient air data interpretation focused on the following: evaluation of upwind/downwind sample pairs, comparison of ambient levels to existing regulatory standards, relation of ambient levels to data reported in the open literature, determination of normal seasonal variations in the existing background burden, and comparison of ambient levels measured during site activity to background levels.
Preti, Emanuele; Richetin, Juliette; Suttora, Chiara; Pisani, Alberto
2016-04-30
Dysfunctions in social cognition characterize personality disorders. However, mixed results have emerged from the literature on emotion processing: Borderline Personality Disorder (BPD) traits have been associated with enhanced emotion recognition, with impairments, and with functioning equal to that of controls. These apparent contradictions might result from the complexity of the emotion recognition tasks used and from individual differences in impulsivity and effortful control. We conducted a study in a sample of undergraduate students (n=80), assessing BPD traits with an emotion recognition task that requires the processing of either visual information alone or both visual and acoustic information. We also measured individual differences in impulsivity and effortful control. Results demonstrated the moderating role of some components of impulsivity and effortful control on the ability of BPD traits to predict anger and happiness recognition. We organize the discussion around the interaction between different components of regulatory functioning and task complexity for a better understanding of emotion recognition in BPD samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Baele, Guy; Lemey, Philippe; Vansteelandt, Stijn
2013-03-06
Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model's marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. We here assess the original 'model-switch' path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model's marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. 
Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation.
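As a concrete illustration of the stepping-stone idea (a sketch only, not the context-dependent evolutionary models or software used in the study), the code below estimates the log marginal likelihood of a toy beta-binomial model whose exact answer is known analytically. The power posteriors are conjugate here, so they can be sampled exactly; the temperature spacing follows the commonly recommended Beta(0.3, 1) quantiles. All numerical settings are illustrative assumptions:

```python
import numpy as np
from math import lgamma, log

rng = np.random.default_rng(42)

def log_lik(p, s, n):
    # Log-likelihood of s successes in n trials (binomial coefficient omitted
    # consistently in both the estimator and the exact answer)
    return s * np.log(p) + (n - s) * np.log1p(-p)

def stepping_stone_log_ml(s, n, K=50, draws=20_000):
    """Stepping-stone estimate of the log marginal likelihood under a Beta(1,1)
    prior: sum over stones of log E_{beta_k}[ L^(beta_{k+1} - beta_k) ]."""
    betas = np.linspace(0.0, 1.0, K + 1) ** (1 / 0.3)  # Beta(0.3, 1) quantile spacing
    log_ml = 0.0
    for bk, bk1 in zip(betas[:-1], betas[1:]):
        # Power posterior is conjugate: Beta(1 + bk*s, 1 + bk*(n - s))
        p = rng.beta(1 + bk * s, 1 + bk * (n - s), size=draws)
        r = (bk1 - bk) * log_lik(p, s, n)
        m = r.max()
        log_ml += m + log(np.mean(np.exp(r - m)))   # numerically stable log-mean-exp
    return log_ml

def exact_log_ml(s, n):
    # Analytic marginal likelihood: Beta function B(s+1, n-s+1)
    return lgamma(s + 1) + lgamma(n - s + 1) - lgamma(n + 2)

s, n = 7, 20
est = stepping_stone_log_ml(s, n)
```

A log Bayes factor between two such models would be the difference of two of these estimates; the paper's point is that estimating that difference directly along one path avoids compounding the errors of two independent marginal likelihood estimates.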
Risk analysis of earth return options for the Mars rover/sample return mission
NASA Technical Reports Server (NTRS)
1988-01-01
Four options for return of a Mars surface sample to Earth were studied to estimate the risk of mission failure and the risk of a sample container breach that might result in the release of Martian life forms, should such exist, into the Earth's biosphere. The probabilities calculated refer only to the time period from the last midcourse correction burn to possession of the sample on Earth. Two extreme views characterize this subject. In one view, there is no life on Mars; therefore there is no significant risk, and no serious effort is required to deal with back contamination. In the other view, public safety overrides any desire to return Martian samples, and any risk of damaging contamination greater than zero is unacceptable. Zero risk is very expensive to achieve and may prevent the mission as currently envisioned from taking place. The major conclusion is that the risk of sample container breach can be reduced to a very low number within the framework of the mission as now envisioned, but significant expense and effort above those currently planned are needed. There are benefits to the public that warrant some risk. Martian life, if it exists, will be a major discovery. If it does not exist, there is no risk.
300 Area TEDF NPDES Permit Compliance Monitoring Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loll, C.M.
1994-10-13
This monitoring plan describes the activities and methods that will be employed at the 300 Area Treated Effluent Disposal Facility (TEDF) in order to ensure compliance with the National Pollutant Discharge Elimination System (NPDES) permit. Included in this document are a brief description of the project, the specifics of the sampling effort, including the physical location and frequency of sampling, the support required for sampling, and the Quality Assurance (QA) protocols to be followed in the sampling procedures.
Intelligent Sampling of Hazardous Particle Populations in Resource-Constrained Environments
NASA Astrophysics Data System (ADS)
McCollough, J. P.; Quinn, J. M.; Starks, M. J.; Johnston, W. R.
2017-10-01
Sampling of anomaly-causing space environment drivers is necessary for both real-time operations and satellite design efforts, and optimizing measurement sampling helps minimize resource demands. Relating these measurements to spacecraft anomalies requires the ability to resolve spatial and temporal variability in the energetic charged particle hazard of interest. Here we describe a method for sampling particle fluxes informed by magnetospheric phenomenology so that, along a given trajectory, the variations from both temporal dynamics and spatial structure are adequately captured while minimizing oversampling. We describe the coordinates, sampling method, and specific regions and parameters employed. We compare resulting sampling cadences with data from spacecraft spanning the regions of interest during a geomagnetically active period, showing that the algorithm retains the gross features necessary to characterize environmental impacts on space systems in diverse orbital regimes while greatly reducing the amount of sampling required. This enables sufficient environmental specification within a resource-constrained context, such as limited telemetry bandwidth, processing requirements, and timeliness.
NASA Technical Reports Server (NTRS)
Fries, M. D.; Fries, W. D.; McCubbin, F. M.; Zeigler, R. A.
2018-01-01
Mars Sample Return (MSR) requires strict organic contamination control (CC) and contamination knowledge (CK) as outlined by the Mars 2020 Organic Contamination Panel (OCP). This includes a need to monitor surficial organic contamination to ng/sq. cm sensitivity. Archiving and maintaining this degree of surface cleanliness may be difficult but has been achieved. MSR's CK effort will be very important because all returned samples will be studied thoroughly and in minute detail. Consequently, accurate CK must be collected and characterized to best interpret scientific results from the returned samples. The CK data are required not only to make accurate measurements and interpretations for carbon-depleted martian samples, but also to strengthen the validity of science investigations performed on the samples. The Opera instrument prototype is intended to fulfill a CC/CK role in the assembly, cleaning, and overall contamination history of hardware used in the MSR effort, from initial hardware assembly through post-flight sample curation. Opera is intended to monitor particulate and organic contamination using quartz crystal microbalances (QCMs) in a self-contained, cleanroom-compliant portable package. The Opera prototype is in initial development and is currently capable of approximately 100 ng/sq. cm organic contamination sensitivity, with additional development planned to achieve 1 ng/sq. cm. The prototype was funded by the 2017 NASA Johnson Space Center Innovation Charge Account (ICA), which provides funding for small, short-term projects.
James T. Peterson; Jason Dunham
2003-01-01
Effective conservation efforts for at-risk species require knowledge of the locations of existing populations. Species presence can be estimated directly by conducting field-sampling surveys or alternatively by developing predictive models. Direct surveys can be expensive and inefficient, particularly for rare and difficult-to-sample species, and models of species...
Precision and relative effectiveness of a purse seine for sampling age-0 river herring in lakes
Devine, Matthew T.; Roy, Allison; Whiteley, Andrew R.; Gahagan, Benjamin I.; Armstrong, Michael P.; Jordaan, Adrian
2018-01-01
Stock assessments for anadromous river herring, collectively Alewife Alosa pseudoharengus and Blueback Herring A. aestivalis, lack adequate demographic information, particularly with respect to early life stages. Although sampling adult river herring is increasingly common throughout their range, currently no standardized, field‐based, analytical methods exist for estimating juvenile abundance in freshwater lakes. The objective of this research was to evaluate the relative effectiveness and sampling precision of a purse seine for estimating densities of age‐0 river herring in freshwater lakes. We used a purse seine to sample age‐0 river herring in June–September 2015 and June–July 2016 in 16 coastal freshwater lakes in the northeastern USA. Sampling effort varied from two seine hauls to more than 50 seine hauls per lake. Catch rates were highest in June and July, and sampling precision was maximized in July. Sampling at night (versus day) in open water (versus littoral areas) was most effective for capturing newly hatched larvae and juveniles up to ca. 100 mm TL. Bootstrap simulation results indicated that sampling precision of CPUE estimates increased with sampling effort, and there was a clear threshold beyond which increased effort resulted in negligible increases in precision. The effort required to produce precise CPUE estimates, as determined by the CV, was dependent on lake size; river herring densities could be estimated with up to 10 purse‐seine hauls (one to two nights) in a small lake (<50 ha) and 15–20 hauls (two to three nights) in a large lake (>50 ha). Fish collection techniques using a purse seine as described in this paper are likely to be effective for estimating recruit abundance of river herring in freshwater lakes across their range.
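The bootstrap logic behind the precision-versus-effort threshold can be sketched in a few lines: resample per-haul catches, compute the CV of mean CPUE for a given number of hauls, and watch the gains diminish. The catch vector below is illustrative, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical age-0 river herring catches (fish per purse-seine haul) in one lake
catches = np.array([12, 0, 45, 8, 23, 3, 60, 17, 5, 31, 2, 19, 40, 9, 14, 27])

def cpue_cv(catches, n_hauls, n_boot=5_000):
    """Bootstrap coefficient of variation of mean CPUE for n_hauls hauls,
    resampling observed hauls with replacement."""
    sims = rng.choice(catches, size=(n_boot, n_hauls), replace=True)
    means = sims.mean(axis=1)
    return float(means.std(ddof=1) / means.mean())

for n in (2, 5, 10, 20):
    print(n, round(cpue_cv(catches, n), 3))
```

Because the bootstrap SE shrinks roughly with the square root of effort, doubling hauls past the threshold buys little extra precision, which is the basis for the haul recommendations in the abstract.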
Rosenberger, Amanda E.; Dunham, Jason B.
2005-01-01
Estimation of fish abundance in streams using the removal model or the Lincoln-Petersen mark-recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of rainbow trout Oncorhynchus mykiss in central Idaho streams. For one-, two-, three-, and four-pass sampling effort in closed sites, we evaluated the influences of fish size and habitat characteristics on sampling efficiency and the accuracy of removal abundance estimates. We also examined the use of models to generate unbiased estimates of fish abundance through adjustment of total catch or biased removal estimates. Our results suggested that the assumptions of the mark-recapture model were satisfied and that abundance estimates based on this approach were unbiased. In contrast, the removal model assumptions were not met. Decreasing sampling efficiencies over removal passes resulted in underestimated population sizes and overestimates of sampling efficiency. This bias decreased, but was not eliminated, with increased sampling effort. Biased removal estimates based on different levels of effort were highly correlated with each other but were less correlated with unbiased mark-recapture estimates. Stream size decreased sampling efficiency, and stream size and instream wood increased the negative bias of removal estimates. We found that reliable estimates of population abundance could be obtained from models of sampling efficiency for different levels of effort. Validation of abundance estimates requires extra attention to routine sampling considerations but can help fisheries biologists avoid pitfalls associated with biased data and facilitate standardized comparisons among studies that employ different sampling methods.
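The bias mechanism described here, declining capture efficiency across removal passes, can be illustrated with a small simulation. The abundance and pass efficiencies below are hypothetical, and the estimators are the textbook two-pass removal and Chapman mark-recapture formulas rather than the models fitted in the study:

```python
import numpy as np

rng = np.random.default_rng(11)

N = 200                 # true abundance (hypothetical)
P1, P2 = 0.50, 0.35     # capture efficiency declines on the second pass

def mean_removal_estimate(n_sim=20_000):
    """Two-pass removal estimator N_hat = c1^2 / (c1 - c2), which assumes
    constant efficiency across passes and is biased low when efficiency drops."""
    c1 = rng.binomial(N, P1, n_sim)
    c2 = rng.binomial(N - c1, P2)
    ok = c1 > c2
    return float((c1[ok] ** 2 / (c1[ok] - c2[ok])).mean())

def mean_chapman_estimate(n_sim=20_000):
    """Chapman's mark-recapture estimator, robust to the efficiency change
    because marked and unmarked fish share the same second-pass efficiency."""
    m = rng.binomial(N, P1, n_sim)       # pass 1: marked and released
    recap = rng.binomial(m, P2)          # pass 2: marked fish recaptured
    new = rng.binomial(N - m, P2)        # pass 2: unmarked fish captured
    c = recap + new
    return float(((m + 1) * (c + 1) / (recap + 1) - 1).mean())
```

Under these settings the removal estimator averages well below the true abundance while the Chapman estimator stays close to it, mirroring the contrast the authors report between the two approaches.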
López, Iago; Alvarez, César; Gil, José L; Revilla, José A
2012-11-30
Data on the 95th and 90th percentiles of bacteriological quality indicators are used to classify bathing waters in Europe, according to the requirements of Directive 2006/7/EC. However, percentile values, and consequently the classification of bathing waters, depend on both sampling effort and sample size, which may undermine an appropriate assessment of bathing water classification. To analyse the influence of sampling effort and sample size on water classification, a bootstrap approach was applied to 55 bacteriological quality datasets from several beaches in the Balearic Islands (Spain). Our results show that the probability of failing the regulatory standards of the Directive is high when sample size is low, due to higher variability in percentile values. For instance, 49% of the bathing waters reaching an "Excellent" classification (95th percentile of Escherichia coli under 250 cfu/100 ml) can fail the "Excellent" regulatory standard due to sampling strategy when 23 samples per season are considered. This percentage increases to 81% when 4 samples per season are considered. "Good" regulatory standards can also be failed in bathing waters with an "Excellent" classification as a result of these sampling strategies. The variability in percentile values may affect bathing water classification and is critical for the appropriate design and implementation of bathing water Quality Monitoring and Assessment Programs. Hence, the variability of percentile values should be taken into account by authorities if adequate management of these areas is to be achieved. Copyright © 2012 Elsevier Ltd. All rights reserved.
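The sample-size effect can be sketched with a Monte Carlo simulation of the Directive-style percentile, which Annex II of 2006/7/EC computes from the mean and standard deviation of log10 counts as 10**(mean + 1.65*sd). The distribution parameters below are hypothetical, chosen so the water genuinely qualifies as "Excellent":

```python
import numpy as np

rng = np.random.default_rng(3)

LOG_MU, LOG_SD = 2.0, 0.2    # hypothetical log10 E. coli distribution at a beach
THRESHOLD = 250.0            # "Excellent" limit cited in the abstract, cfu/100 ml
TRUE_P95 = 10 ** (LOG_MU + 1.65 * LOG_SD)   # below 250, so truly "Excellent"

def fail_probability(n_samples, n_sim=20_000):
    """Probability that a season of n_samples yields a 95th-percentile
    estimate above THRESHOLD under the Directive-style log-normal formula."""
    logs = rng.normal(LOG_MU, LOG_SD, size=(n_sim, n_samples))
    p95 = 10 ** (logs.mean(axis=1) + 1.65 * logs.std(axis=1, ddof=1))
    return float(np.mean(p95 > THRESHOLD))

for n in (4, 23):
    print(n, fail_probability(n))
```

With only 4 samples per season the percentile estimator is far noisier, so a genuinely "Excellent" water fails the standard much more often, which is the direction of the effect reported in the abstract.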
Sporadic sampling, not climatic forcing, drives observed early hominin diversity.
Maxwell, Simon J; Hopley, Philip J; Upchurch, Paul; Soligo, Christophe
2018-05-08
The role of climate change in the origin and diversification of early hominins is hotly debated. Most accounts of early hominin evolution link observed fluctuations in species diversity to directional shifts in climate or periods of intense climatic instability. None of these hypotheses, however, has tested whether observed diversity patterns are distorted by variation in the quality of the hominin fossil record. Here, we present a detailed examination of early hominin diversity dynamics, including both taxic and phylogenetically corrected diversity estimates. Unlike past studies, we compare these estimates to sampling metrics for rock availability (hominin-, primate-, and mammal-bearing formations) and collection effort, to assess the geological and anthropogenic controls on the sampling of the early hominin fossil record. Taxic diversity, primate-bearing formations, and collection effort show strong positive correlations, demonstrating that observed patterns of early hominin taxic diversity can be explained by temporal heterogeneity in fossil sampling rather than by genuine evolutionary processes. Peak taxic diversity at 1.9 million years ago (Ma) is a sampling artifact, merely reflecting maximal rock availability and collection effort. In contrast, phylogenetic diversity estimates imply peak diversity at 2.4 Ma and show little relation to sampling metrics. We find that apparent relationships between early hominin diversity and indicators of climatic instability are, in fact, driven largely by variation in suitable rock exposure and collection effort. Our results suggest that significant improvements in the quality of the fossil record are required before the role of climate in hominin evolution can be reliably determined. Copyright © 2018 the Author(s). Published by PNAS.
Analyzing and Predicting Effort Associated with Finding and Fixing Software Faults
NASA Technical Reports Server (NTRS)
Hamill, Maggie; Goseva-Popstojanova, Katerina
2016-01-01
Context: Software developers spend a significant amount of time fixing faults. However, few papers have addressed the actual effort needed to fix software faults. Objective: The objective of this paper is twofold: (1) analysis of the effort needed to fix software faults and how it was affected by several factors and (2) prediction of the level of fix implementation effort based on the information provided in software change requests. Method: The work is based on data related to 1200 failures, extracted from the change tracking system of a large NASA mission. The analysis includes descriptive and inferential statistics. Predictions are made using three supervised machine learning algorithms and three sampling techniques aimed at addressing the imbalanced data problem. Results: Our results show that (1) 83% of the total fix implementation effort was associated with only 20% of the failures. (2) Both safety-critical failures and post-release failures required three times more effort to fix than their non-critical and pre-release counterparts, respectively. (3) Failures with fixes spread across multiple components or across multiple types of software artifacts required more effort; the spread across artifacts was more costly than the spread across components. (4) Surprisingly, some types of faults associated with later life-cycle activities did not require significant effort. (5) The level of fix implementation effort was predicted with 73% overall accuracy using the original, imbalanced data. Using oversampling techniques improved the overall accuracy up to 77%. More importantly, oversampling significantly improved the prediction of the high effort level, from 31% to around 85%.
Conclusions: This paper shows the importance of tying software failures to changes made to fix all associated faults, in one or more software components and/or in one or more software artifacts, and the benefit of studying how the spread of faults and other factors affect the fix implementation effort.
A standardized sampling protocol for channel catfish in prairie streams
Vokoun, Jason C.; Rabeni, Charles F.
2001-01-01
Three alternative gears—an AC electrofishing raft, bankpoles, and a 15-hoop-net set—were used in a standardized manner to sample channel catfish Ictalurus punctatus in three prairie streams of varying size in three seasons. We compared these gears as to time required per sample, size selectivity, mean catch per unit effort (CPUE) among months, mean CPUE within months, effect of fluctuating stream stage, and sensitivity to population size. According to these comparisons, the 15-hoop-net set used during stable water levels in October had the most desirable characteristics. Using our catch data, we estimated the precision of CPUE and size structure by varying sample sizes for the 15-hoop-net set. We recommend that 11–15 repetitions of the 15-hoop-net set be used for most management activities. This standardized basic unit of effort will increase the precision of estimates and allow better comparisons among samples as well as increased confidence in management decisions.
Quist, M.C.; Gerow, K.G.; Bower, M.R.; Hubert, W.A.
2006-01-01
Native fishes of the upper Colorado River basin (UCRB) have declined in distribution and abundance due to habitat degradation and interactions with nonnative fishes. Consequently, monitoring populations of both native and nonnative fishes is important for the conservation of native species. We used data collected from Muddy Creek, Wyoming (2003-2004), to compare sample size estimates under a random and a fixed-site sampling design for monitoring changes in catch per unit effort (CPUE) of native bluehead suckers Catostomus discobolus, flannelmouth suckers C. latipinnis, roundtail chub Gila robusta, and speckled dace Rhinichthys osculus, as well as nonnative creek chub Semotilus atromaculatus and white suckers C. commersonii. When one-pass backpack electrofishing was used, detection of 10% or 25% changes in CPUE (fish/100 m) at 60% statistical power required 50-1,000 randomly sampled reaches among species, regardless of sampling design. However, use of a fixed-site sampling design with 25-50 reaches greatly enhanced the ability to detect changes in CPUE. The addition of seining did not appreciably reduce the required effort. When detection of 25-50% changes in CPUE of native and nonnative fishes is acceptable, we recommend establishment of 25-50 fixed reaches sampled by one-pass electrofishing in Muddy Creek. Because Muddy Creek has habitat and fish assemblages characteristic of other headwater streams in the UCRB, our results are likely to apply to many other streams in the basin. Copyright by the American Fisheries Society 2006.
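Why detecting a small proportional change in a highly variable CPUE can demand hundreds of random reaches can be sketched with the standard normal-approximation sample-size formula for a two-sample comparison of means. This is a back-of-envelope substitute for the authors' power analysis, and the CV value is a hypothetical stand-in for real catch variability:

```python
import math

def reaches_needed(cv, prop_change, power=0.60, alpha=0.05):
    """Reaches per survey needed for a two-sample comparison of mean CPUE to
    detect a proportional change with the given power, using the standard
    normal-approximation formula n = 2 * ((z_a + z_b) * CV / change)^2."""
    z_alpha = 1.960                       # two-sided alpha = 0.05
    z_beta = {0.60: 0.253, 0.80: 0.842}[power]
    return math.ceil(2 * ((z_alpha + z_beta) * cv / prop_change) ** 2)

# Highly variable catches (hypothetical CV = 1.5) make small changes costly to detect
print(reaches_needed(1.5, 0.25))   # detect a 25% change in CPUE
print(reaches_needed(1.5, 0.50))   # detect a 50% change in CPUE
```

The quadratic dependence on CV/change is the key point: halving the detectable change quadruples the required effort, which is why relaxing the target to 25-50% changes (or reducing variance with fixed sites) brings requirements down to a practical number of reaches.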
Development of Oxidation Resistant Coatings on GRCop-84 Substrates by Cold Spray Process
NASA Technical Reports Server (NTRS)
Karthikeyan, J.
2007-01-01
GRCop-84, a Cu-Cr-Nb alloy, has been developed for rocket engine liner applications. For maximum life, additional oxidation protection is required to prevent blanching. NiCrAlY was identified as a suitable coating, and efforts were initiated to develop suitable coating techniques. Cold spray is one technique under consideration. Efforts at ASB Industries to produce dense, adherent coatings are detailed. The work culminated in the production of samples for testing at NASA Glenn Research Center.
Flight program language requirements. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
1972-01-01
Government-sponsored study and development efforts were directed toward design and implementation of high level programming languages suitable for future aerospace applications. The study centered around an evaluation of the four most pertinent existing aerospace languages. Evaluation criteria were established, and selected kernels from the current Saturn 5 and Skylab flight programs were used as benchmark problems for sample coding. An independent review of the language specifications incorporated anticipated future programming requirements into the evaluation. A set of language requirements was synthesized from these activities.
Warner, David M.; Claramunt, Randall M.; Schaeffer, Jeffrey S.; Yule, Daniel L.; Hrabik, Tom R.; Peintka, Bernie; Rudstam, Lars G.; Holuszko, Jeffrey D.; O'Brien, Timothy P.
2012-01-01
Because it is not possible to identify species with echosounders alone, trawling is widely used to collect the species and size composition data needed to allocate acoustic fish density estimates to species or size groups. In the Laurentian Great Lakes, data from midwater trawls are commonly used for such allocations. However, there are no rules for how much midwater trawling effort is required to adequately describe the species and size composition of the pelagic fish communities in these lakes, so the balance between acoustic sampling effort and trawling effort has been unguided. We used midwater trawl data collected between 1986 and 2008 in lakes Michigan and Huron and a variety of analytical techniques to develop guidance for appropriate levels of trawl effort. We used multivariate regression trees and resampling techniques to (i) identify factors that influence species and size composition of the pelagic fish communities in these lakes, (ii) identify stratification schemes for the two lakes, (iii) determine whether there was a relationship between uncertainty in catch composition and the number of tows made, and (iv) predict the number of tows required to reach desired uncertainty targets. We found that the depth occupied by fish below the surface was the most influential explanatory variable. Catch composition varied between lakes at depths <38.5 m below the surface, but not at depths ≥38.5 m below the surface. Year, latitude, and bottom depth influenced catch composition in the near-surface waters of Lake Michigan, while only year was important for Lake Huron surface waters. There was an inverse relationship between RSE [relative standard error = 100 × (SE/mean)] and the number of tows made for the proportions of the different size and species groups.
We found for the fifth (Lake Huron) and sixth (Lake Michigan) largest lakes in the world, 15–35 tows were adequate to achieve target RSEs (15% and 30%) for ubiquitous species, but rarer species required much higher, and at times, impractical effort levels to reach these targets.
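The RSE criterion above, RSE = 100 × (SE/mean), and its inverse relationship with the number of tows can be illustrated with a short resampling sketch; the per-tow catch proportions below are hypothetical values, not the authors' data:

```python
import random
import statistics

def rse(values):
    """Relative standard error: 100 * (SE / mean), as defined in the abstract."""
    m = statistics.mean(values)
    se = statistics.stdev(values) / len(values) ** 0.5
    return 100 * se / m

def bootstrap_rse(tow_proportions, n_tows, n_boot=1000, seed=1):
    """RSE of the mean catch proportion when only n_tows tows are made,
    approximated by resampling tows with replacement."""
    rng = random.Random(seed)
    means = [statistics.mean(rng.choices(tow_proportions, k=n_tows))
             for _ in range(n_boot)]
    return 100 * statistics.stdev(means) / statistics.mean(means)

# Hypothetical per-tow catch proportions for one ubiquitous species
props = [0.20, 0.35, 0.30, 0.25, 0.40, 0.15, 0.30, 0.28, 0.33, 0.22]
for n in (5, 15, 35):
    print(n, round(bootstrap_rse(props, n), 1))
```

As in the study, the bootstrap RSE falls as tow count rises, so a target RSE (15% or 30%) translates directly into a required number of tows.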
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, H.C.
1998-07-01
The Idaho National Engineering and Environmental Laboratory (INEEL) has several permitted treatment, storage and disposal facilities. The INEEL Sample Management Office (SMO) conducts all analysis subcontracting activities for Department of Energy Environmental Management programs at the INEEL. In this role, the INEEL SMO has had the opportunity to subcontract the analyses of various wastes (including ash from an interim status incinerator) requesting a target analyte list equivalent to the constituents listed in 40 Code of Federal Regulations. These analyses are required to ensure that treated wastes do not contain underlying hazardous constituents (UHC) at concentrations greater than the universal treatment standards (UTS) prior to land disposal. The INEEL SMO has conducted a good-faith effort by negotiating with several commercial laboratories to identify the lowest possible quantitation and detection limits that can be achieved for the organic UHC analytes. The results of this negotiating effort have been the discovery that no single laboratory (currently under subcontract with the INEEL SMO) can achieve a detection level that is within an order of magnitude of the UTS for all organic parameters on a clean sample matrix (e.g., sand). This does not mean that there is no laboratory that can achieve the order of magnitude requirements for all organic UHCs on a clean sample matrix. The negotiations held to date indicate that it is likely that no laboratory can achieve the order of magnitude requirements for a difficult sample matrix (e.g., an incinerator ash). The authors suggest that the regulation needs to be revised to address the disparity between what is achievable in the laboratory and the regulatory levels required by the UTS.
Silvia, Paul J; Beaty, Roger E; Nusbaum, Emily C; Eddington, Kari M; Kwapil, Thomas R
2014-10-01
Executive approaches to creativity emphasize that generating creative ideas can be hard and requires mental effort. Few studies, however, have examined effort-related physiological activity during creativity tasks. Using motivational intensity theory as a framework, we examined predictors of effort-related cardiac activity during a creative challenge. A sample of 111 adults completed a divergent thinking task. Sympathetic (PEP and RZ) and parasympathetic (RSA and RMSSD) outcomes were assessed using impedance cardiography. As predicted, people with high creative achievement (measured with the Creative Achievement Questionnaire) showed significantly greater increases in sympathetic activity from baseline to task, reflecting higher effort. People with more creative achievements generated ideas that were significantly more creative, and creative performance correlated marginally with PEP and RZ. The results support the view that creative thought can be a mental challenge. Copyright © 2014 Elsevier B.V. All rights reserved.
2017-09-01
information provided from the GED (Peggy Harris, personal communication ), but unprecedented high rainfall (including 20” of rain April 29–30, 2014...Alexandria, VA, by the Energy and Environmental Sustainability Branch (71760) of the Advanced Systems and Applied Sciences Division (71700), Space ...sampling and analyses. These challenges include a high level of effort or difficulty required to (1) measure MC at very low (ng/L) concentrations; (2
Hakjun Rhee; Randy B. Foltz; James L. Fridley; Finn Krogstad; Deborah S. Page-Dumroese
2014-01-01
Measurement of particle-size distribution (PSD) of soil with large-sized particles (e.g., 25.4 mm diameter) requires a large sample and numerous particle-size analyses (PSAs). A new method is needed that would reduce time, effort, and cost for PSAs of the soil and aggregate material with large-sized particles. We evaluated a nested method for sampling and PSA by...
A review on setting appropriate reach length for biological assessment of boatable rivers
Researchers working on boatable rivers are presented with the task of selecting an appropriate stream length, or reach length, from which data will be collected. Ideally, the sampling effort applied is the minimum that will allow stated objectives to be addressed as required by a...
50 CFR 18.118 - What are the mitigation, monitoring, and reporting requirements?
Code of Federal Regulations, 2014 CFR
2014-10-01
... monitoring and research efforts will employ rigorous study designs and sampling protocols in order to provide... mitigation measures for offshore seismic surveys. Any offshore exploration activity expected to include the... 1 µPa. (ii) Ramp-up procedures. For all seismic surveys, including airgun testing, use the following...
50 CFR 18.118 - What are the mitigation, monitoring, and reporting requirements?
Code of Federal Regulations, 2013 CFR
2013-10-01
... monitoring and research efforts will employ rigorous study designs and sampling protocols in order to provide... mitigation measures for offshore seismic surveys. Any offshore exploration activity expected to include the... 1 µPa. (ii) Ramp-up procedures. For all seismic surveys, including airgun testing, use the following...
Removing the "high" from the highways : the impact of Virginia's efforts to combat drug-related DUI.
DOT National Transportation Integrated Search
1992-01-01
Beginning on April 1, 1988, a revision to Virginia law gave police officers the authority to require an individual suspected of drug-related driving under the influence (DUI) to submit a blood sample to be tested for drugs. Concurrent with the implem...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corbett, J.E.
1996-02-01
This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review.
NASA Technical Reports Server (NTRS)
Fries, M. D.; Allen, C. C.; Calaway, M. J.; Evans, C. A.; Stansbery, E. K.
2015-01-01
Curation of NASA's astromaterials sample collections is a demanding and evolving activity that supports valuable science from NASA missions for generations, long after the samples are returned to Earth. For example, NASA continues to loan hundreds of Apollo program samples to investigators every year and those samples are often analyzed using instruments that did not exist at the time of the Apollo missions themselves. The samples are curated in a manner that minimizes overall contamination, enabling clean, new high-sensitivity measurements and new science results over 40 years after their return to Earth. As our exploration of the Solar System progresses, upcoming and future NASA sample return missions will return new samples with stringent contamination control, sample environmental control, and Planetary Protection requirements. Therefore, an essential element of a healthy astromaterials curation program is a research and development (R&D) effort that characterizes and employs new technologies to maintain current collections and enable new missions - an Advanced Curation effort. JSC's Astromaterials Acquisition & Curation Office is continually performing Advanced Curation research, identifying and defining knowledge gaps about research, development, and validation/verification topics that are critical to support current and future NASA astromaterials sample collections. The following are highlighted knowledge gaps and research opportunities.
Hugo, Sanet; Altwegg, Res
2017-09-01
Using the Southern African Bird Atlas Project (SABAP2) as a case study, we examine the possible determinants of spatial bias in volunteer sampling effort and how well such biased data represent environmental gradients across the area covered by the atlas. For each province in South Africa, we used generalized linear mixed models to determine the combination of variables that explain spatial variation in sampling effort (number of visits per 5' × 5' grid cell, or "pentad"). The explanatory variables were distance to major road and exceptional birding locations or "sampling hubs," percentage cover of protected, urban, and cultivated area, and the climate variables mean annual precipitation, winter temperatures, and summer temperatures. Further, we used the climate variables and plant biomes to define subsets of pentads representing environmental zones across South Africa, Lesotho, and Swaziland. For each environmental zone, we quantified sampling intensity, and we assessed sampling completeness with species accumulation curves fitted to the asymptotic Lomolino model. Sampling effort was highest close to sampling hubs, major roads, urban areas, and protected areas. Cultivated area and the climate variables were less important. Further, environmental zones were not evenly represented by current data, and the zones varied in the amount of sampling required to represent the species that are present. SABAP2 volunteers' preferences in birding locations cause spatial bias in the dataset that should be taken into account when analyzing these data. Large parts of South Africa remain underrepresented, which may restrict the kind of ecological questions that may be addressed. However, sampling bias may be reduced by directing volunteers toward undersampled regions while taking into account volunteer preferences.
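The species accumulation curves used above to assess sampling completeness can be computed by averaging cumulative richness over random orderings of the visits; a minimal sketch with hypothetical pentad visit lists (fitting an asymptotic model such as the Lomolino curve would be a further step on top of this):

```python
import random

def accumulation_curve(visits, n_perm=200, seed=0):
    """Mean cumulative species richness vs. number of visits,
    averaged over random orderings of the visit list."""
    rng = random.Random(seed)
    n = len(visits)
    totals = [0.0] * n
    for _ in range(n_perm):
        order = visits[:]
        rng.shuffle(order)
        seen = set()
        for i, species_seen in enumerate(order):
            seen |= species_seen
            totals[i] += len(seen)
    return [t / n_perm for t in totals]

# Hypothetical visit records for one pentad: species seen per visit
visits = [{"A", "B"}, {"B", "C"}, {"A", "C", "D"}, {"B"}, {"D", "E"}]
curve = accumulation_curve(visits)
print([round(c, 2) for c in curve])
```

A curve still rising steeply at the last visit signals an undersampled zone; a flattening curve approaches the asymptote that the Lomolino fit estimates.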
Pearson, Kristen Nicole; Kendall, William L.; Winkelman, Dana L.; Persons, William R.
2016-01-01
A key component of many monitoring programs for special status species involves capture and handling of individuals as part of capture-recapture efforts for tracking population health and demography. Minimizing negative impacts from sampling, such as through reduced handling, helps prevent monitoring efforts from themselves harming the species. Using simulation analyses, we found that long-term population monitoring techniques requiring physical capture (i.e. hoop-net sampling) can be reduced and supplemented with passive detections (i.e. PIT tag antenna array detections) without negatively affecting estimates of adult humpback chub (HBC; Gila cypha) survival (S) and skipped spawning probabilities (γ″ = a spawner transitions to a skipped spawner; γ′ = a skipped spawner remains a skipped spawner). Based on our findings of the array's in situ detection efficiency (0.42), estimability of such demographic parameters would improve over hoop-netting alone. In addition, the array provides insight into HBC population dynamics and movement patterns outside of traditional sampling periods. However, given the current timing of sampling efforts, spawner abundance estimates were negatively biased when hoop-netting was reduced, suggesting not all spawning HBC are present during the current sampling events. Despite this, our findings demonstrate that PIT tag antenna arrays, even with moderate potential detectability, may allow for reduced handling of special status species while also offering potentially more efficient monitoring strategies, especially if ideal timing of sampling can be determined.
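Assuming the two detection methods act independently, the gain from supplementing physical captures with antenna detections follows from a simple complement rule. The 0.42 array efficiency is the value reported above; the hoop-net capture probability below is a hypothetical illustration:

```python
def combined_detection(p_hoop, p_array=0.42):
    """Probability a tagged fish is detected by at least one method,
    assuming the two detection processes are independent."""
    return 1 - (1 - p_hoop) * (1 - p_array)

# Even a much-reduced hoop-net capture probability (hypothetical 0.20)
# yields a substantially higher overall detection probability once
# passive array detections are added.
print(round(combined_detection(0.20), 3))  # → 0.536
```

This is why even "moderate" array detectability (0.42) improves parameter estimability relative to hoop-netting alone.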
1984-02-01
measurable impact if changed. The following items were included in the sample: * Mark Zero Items -Low demand insurance items which represent about three...R&D efforts reviewed. The resulting assessment highlighted the generic enabling technologies and cross- cutting R&D projects required to focus current...supplied by spot buys, and which may generate Navy Inventory Control Numbers (NICN). Random samples of data were extracted from the Master Data File ( MDF
NASA Astrophysics Data System (ADS)
Wason, H.; Herrmann, F. J.; Kumar, R.
2016-12-01
Current efforts towards dense shot (or receiver) sampling and full azimuthal coverage to produce high resolution images have led to the deployment of multiple source vessels (or streamers) across marine survey areas. Densely sampled marine seismic data acquisition, however, is expensive, and hence necessitates the adoption of sampling schemes that save acquisition costs and time. Compressed sensing is a sampling paradigm that aims to reconstruct a signal--that is sparse or compressible in some transform domain--from relatively fewer measurements than required by the Nyquist sampling criterion. Leveraging ideas from the field of compressed sensing, we show how marine seismic acquisition can be set up as a compressed sensing problem. A step ahead from multi-source seismic acquisition is simultaneous source acquisition--an emerging technology that is stimulating both geophysical research and commercial efforts--where multiple source arrays/vessels fire shots simultaneously, resulting in better coverage in marine surveys. Following the design principles of compressed sensing, we propose a pragmatic simultaneous time-jittered time-compressed marine acquisition scheme where single or multiple source vessels sail across an ocean-bottom array firing airguns at jittered times and source locations, resulting in better spatial sampling and faster acquisition. Our acquisition is low cost since our measurements are subsampled. Simultaneous source acquisition generates data with overlapping shot records, which need to be separated for further processing. We can reconstruct conventional seismic data from the jittered data with high quality, and we demonstrate successful recovery by sparsity promotion. In contrast to random (sub)sampling, acquisition via jittered (sub)sampling helps in controlling the maximum gap size, which is a practical requirement of wavefield reconstruction with localized sparsifying transforms.
We illustrate our results with simulations of simultaneous time-jittered marine acquisition for 2D and 3D ocean-bottom cable survey.
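The gap-size control that distinguishes jittered from fully random subsampling can be sketched as follows (an illustrative scheme, not the authors' acquisition design): picking one shot position per consecutive bin bounds the largest gap at 2 × (subsampling factor) − 1, whereas fully random subsampling has no such bound.

```python
import random

def jittered_sample(n_total, subsample_factor, seed=7):
    """Pick one position uniformly at random inside each consecutive bin
    of width subsample_factor.  The gap between successive picks can
    never exceed 2 * subsample_factor - 1 positions."""
    rng = random.Random(seed)
    step = subsample_factor
    picks = []
    for start in range(0, n_total, step):
        picks.append(start + rng.randrange(min(step, n_total - start)))
    return picks

picks = jittered_sample(100, 4)
gaps = [b - a for a, b in zip(picks, picks[1:])]
print(len(picks), max(gaps))
```

Bounded gaps matter because localized sparsifying transforms (e.g., curvelets) cannot interpolate across arbitrarily long holes in the wavefield.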
Russo, Laura; Park, Mia; Gibbs, Jason; Danforth, Bryan
2015-01-01
Bees are important pollinators of agricultural crops, and bee diversity has been shown to be closely associated with pollination, a valuable ecosystem service. Higher functional diversity and species richness of bees have been shown to lead to higher crop yield. Bees simultaneously represent a mega-diverse taxon that is extremely challenging to sample thoroughly and an important group to understand because of pollination services. We sampled bees visiting apple blossoms in 28 orchards over 6 years. We used species rarefaction analyses to test for the completeness of sampling and the relationship between species richness and sampling effort, orchard size, and percent agriculture in the surrounding landscape. We performed more than 190 h of sampling, collecting 11,219 specimens representing 104 species. Despite the sampling intensity, we captured <75% of expected species richness at more than half of the sites. For most of these, the variation in bee community composition between years was greater than among sites. Species richness was influenced by percent agriculture, orchard size, and sampling effort, but we found no factors explaining the difference between observed and expected species richness. Competition between honeybees and wild bees did not appear to be a factor, as we found no correlation between honeybee and wild bee abundance. Our study shows that the pollinator fauna of agroecosystems can be diverse and challenging to thoroughly sample. We demonstrate that there is high temporal variation in community composition and that sites vary widely in the sampling effort required to fully describe their diversity. In order to maximize pollination services provided by wild bee species, we must first accurately estimate species richness. For researchers interested in providing this estimate, we recommend multiyear studies and rarefaction analyses to quantify the gap between observed and expected species richness. PMID:26380684
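The gap between observed and expected richness described above is commonly quantified with nonparametric asymptotic estimators such as Chao1; a minimal sketch with hypothetical per-species abundance counts (not necessarily the estimator the authors used):

```python
from collections import Counter

def chao1(abundances):
    """Chao1 asymptotic richness estimate: S_obs + f1^2 / (2 * f2),
    where f1 and f2 count singleton and doubleton species."""
    s_obs = len(abundances)
    f = Counter(abundances)
    f1, f2 = f[1], f[2]
    if f2 == 0:
        # bias-corrected form used when there are no doubletons
        return s_obs + f1 * (f1 - 1) / 2
    return s_obs + f1 * f1 / (2 * f2)

# Hypothetical specimen counts per bee species at one orchard
counts = [40, 22, 10, 5, 2, 2, 1, 1, 1]
est = chao1(counts)
print(round(est, 2), round(len(counts) / est, 2))  # → 11.25 0.8
```

The ratio S_obs/Chao1 is one way to express the "<75% of expected species richness" sampling-completeness criterion used in the study.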
Pinsent, Amy; Blake, Isobel M; White, Michael T; Riley, Steven
2014-08-01
Both high and low pathogenic subtype A avian influenza remain ongoing threats to the commercial poultry industry globally. The emergence of a novel low pathogenic H7N9 lineage in China presents itself as a new concern to both human and animal health and may necessitate additional surveillance in commercial poultry operations in affected regions. Sampling data were simulated using a mechanistic model of H7N9 influenza transmission within commercial poultry barns together with a stochastic observation process. Parameters were estimated using maximum likelihood. We assessed the probability of detecting an outbreak at time of slaughter using both real-time polymerase chain reaction (rt-PCR) and a hemagglutinin inhibition assay (HI assay) before considering more intense sampling prior to slaughter. The day of virus introduction and R0 were estimated jointly from weekly flock sampling data. For scenarios where R0 was known, we estimated the day of virus introduction into a barn under different sampling frequencies. If birds were tested at time of slaughter, there was a higher probability of detecting evidence of an outbreak using an HI assay compared to rt-PCR, except when the virus was introduced <2 weeks before time of slaughter. Prior to the initial detection of infection, N_sample = 50 (1%) of birds were sampled on a weekly basis, but after infection was detected, N_sample = 2000 birds (40%) were sampled to estimate both parameters. We accurately estimated the day of virus introduction in isolation with weekly and 2-weekly sampling. A strong sampling effort would be required to infer both the day of virus introduction and R0. Such a sampling effort would not be required to estimate the day of virus introduction alone once R0 was known, and sampling N_sample = 50 birds in the flock on a weekly or 2-weekly basis would be sufficient.
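The weekly sample of 50 birds can be related to outbreak detectability through the standard freedom-from-infection calculation; this is a deliberate simplification of the authors' mechanistic transmission model, assuming a large flock and a fixed within-flock prevalence:

```python
import math

def p_detect(n_sampled, prevalence):
    """Probability that at least one sampled bird is positive,
    approximating sampling with replacement in a large flock."""
    return 1 - (1 - prevalence) ** n_sampled

def n_for_detection(prevalence, confidence=0.95):
    """Smallest sample size giving at least the required
    probability of detecting the infection."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

print(round(p_detect(50, 0.10), 3))  # detection probability at 10% prevalence
print(n_for_detection(0.10))         # birds needed for 95% confidence
```

At 10% prevalence, 50 birds per week gives near-certain detection, which is why such modest routine effort suffices until finer parameter estimation (R0) demands much larger samples.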
Struwe, Weston B; Agravat, Sanjay; Aoki-Kinoshita, Kiyoko F; Campbell, Matthew P; Costello, Catherine E; Dell, Anne; Feizi, Ten; Haslam, Stuart M; Karlsson, Niclas G; Khoo, Kay-Hooi; Kolarich, Daniel; Liu, Yan; McBride, Ryan; Novotny, Milos V; Packer, Nicolle H; Paulson, James C; Rapp, Erdmann; Ranzinger, Rene; Rudd, Pauline M; Smith, David F; Tiemeyer, Michael; Wells, Lance; York, William S; Zaia, Joseph; Kettner, Carsten
2016-09-01
The minimum information required for a glycomics experiment (MIRAGE) project was established in 2011 to provide guidelines to aid in data reporting from all types of experiments in glycomics research including mass spectrometry (MS), liquid chromatography, glycan arrays, data handling and sample preparation. MIRAGE is a concerted effort of the wider glycomics community that considers the adaptation of reporting guidelines as an important step towards critical evaluation and dissemination of datasets as well as broadening of experimental techniques worldwide. The MIRAGE Commission published reporting guidelines for MS data and here we outline guidelines for sample preparation. The sample preparation guidelines include all aspects of sample generation, purification and modification from biological and/or synthetic carbohydrate material. The application of MIRAGE sample preparation guidelines will lead to improved recording of experimental protocols and reporting of understandable and reproducible glycomics datasets. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Wang, P. T.
2015-12-01
Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. Hydrogeological properties are assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Statistical sampling therefore plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional LULHS simulations were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
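A minimal sketch of the two ingredients of LULHS for two variables: stratified (Latin hypercube) marginals, and correlation imposed through a triangular (Cholesky/LU-type) factor of the correlation matrix. The target correlation and sample size below are arbitrary illustrations, not values from the study:

```python
import random
import statistics

def lhs_uniform(n, seed=3):
    """Latin hypercube sample on (0, 1): one draw per stratum, shuffled."""
    rng = random.Random(seed)
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    return u

def correlated_normals(n, rho, seed=3):
    """Two standard-normal LHS columns with target correlation rho,
    combined via the lower-triangular factor of [[1, rho], [rho, 1]]."""
    inv = statistics.NormalDist().inv_cdf
    z1 = [inv(u) for u in lhs_uniform(n, seed)]
    z2 = [inv(u) for u in lhs_uniform(n, seed + 1)]
    c = (1 - rho * rho) ** 0.5  # factor L = [[1, 0], [rho, c]]
    return z1, [rho * a + c * b for a, b in zip(z1, z2)]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

x, y = correlated_normals(500, 0.8)
print(round(corr(x, y), 2))
```

Because each stratum contributes exactly one draw, the marginals converge with far fewer realizations than simple random sampling, which is the efficiency gain the abstract reports.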
Accuracy of remotely sensed data: Sampling and analysis procedures
NASA Technical Reports Server (NTRS)
Congalton, R. G.; Oderwald, R. G.; Mead, R. A.
1982-01-01
A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given, along with a listing of the computer program written to implement these techniques. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is presented, as are the resulting error matrices from the mapping effort of the San Juan National Forest. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is described, and a method is proposed for determining the reliability of change detection between two maps of the same area produced at different times.
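A common starting point for the sample size requirements mentioned above is the binomial formula for estimating a proportion to a given precision; the full procedures reviewed here are multinomial, so treat this as a simplified stand-in with illustrative inputs:

```python
import math

def accuracy_sample_size(p_expected, half_width, z=1.96):
    """Binomial sample size for estimating a map-accuracy proportion
    to within +/- half_width at the confidence implied by z."""
    return math.ceil(z * z * p_expected * (1 - p_expected) / half_width ** 2)

# e.g. expected 85% overall accuracy, +/- 5% precision, 95% confidence
print(accuracy_sample_size(0.85, 0.05))  # → 196
```

Note the quadratic cost of precision: halving the allowable half-width quadruples the number of reference samples required.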
Ichthyoplankton abundance and variance in a large river system: concerns for long-term monitoring
Holland-Bartels, Leslie E.; Dewey, Michael R.; Zigler, Steven J.
1995-01-01
System-wide spatial patterns of ichthyoplankton abundance and variability were assessed in the upper Mississippi and lower Illinois rivers to address the experimental design and statistical confidence in density estimates. Ichthyoplankton was sampled from June to August 1989 in primary milieus (vegetated and non-vegetated backwaters and impounded areas, main channels and main channel borders) in three navigation pools (8, 13 and 26) of the upper Mississippi River and in a downstream reach of the Illinois River. Ichthyoplankton densities varied among stations of similar aquatic landscapes (milieus) more than among subsamples within a station. An analysis of sampling effort indicated that the collection of single samples at many stations in a given milieu type is statistically and economically preferable to the collection of multiple subsamples at fewer stations. Cluster analyses also revealed that stations only generally grouped by their preassigned milieu types. Pilot studies such as this can define station groupings and sources of variation beyond an a priori habitat classification. Thus the minimum intensity of sampling required to achieve a desired statistical confidence can be identified before implementing monitoring efforts.
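The single-samples-at-many-stations recommendation can be checked against the standard two-stage sampling variance formula; the variance components below are hypothetical, with the key assumption (matching the abstract's finding) that among-station variance dominates within-station variance:

```python
def var_of_mean(var_between, var_within, n_stations, n_subsamples):
    """Variance of the estimated milieu mean under two-stage sampling:
    n_stations primary units, n_subsamples replicates per station."""
    return var_between / n_stations + var_within / (n_stations * n_subsamples)

# Hypothetical components: stations differ far more than subsamples do
vb, vw = 4.0, 1.0
many_stations = var_of_mean(vb, vw, n_stations=12, n_subsamples=1)  # 12 samples
few_stations = var_of_mean(vb, vw, n_stations=4, n_subsamples=3)    # 12 samples
print(round(many_stations, 3), round(few_stations, 3))  # → 0.417 1.083
```

For the same total of 12 samples, spreading effort across stations cuts the variance of the mean by more than half, because extra subsamples only shrink the (already small) within-station term.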
ERIC Educational Resources Information Center
Adams, Troy B.; Colner, Willa
2008-01-01
Few college students meet fruit and vegetable intake recommended requirements, and most receive no information from their institutions about this issue. The avoidable disease burden among students is large, the necessary information infrastructure exists, and "Healthy People 2010" objectives indicate efforts should be taken to increase intake.…
Same day prediction of fecal indicator bacteria (FIB) concentrations and bather protection from the risk of exposure to pathogens are two important goals of implementing a modeling program at recreational beaches. Sampling efforts for modelling applications can be expensive and t...
Flipping the Audience Script: An Activity That Integrates Research and Audience Analysis
ERIC Educational Resources Information Center
Lam, Chris; Hannah, Mark A.
2016-01-01
This article describes a flipped classroom activity that requires students to integrate research and audience analysis. The activity uses Twitter as a data source. In the activity, students identify a sample, collect customer tweets, and analyze the language of the tweets in an effort to construct knowledge about an audience's values, needs, and…
Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology
NASA Technical Reports Server (NTRS)
Mandic, Milan; Acikmese, Behcet; Blackmore, Lars
2011-01-01
The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). 
The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal effort by indicating problems and/or benefits of different approaches and designs.
Sizirici, Banu; Tansel, Berrin
2015-04-01
Monitoring contaminant concentrations in groundwater near closed municipal solid waste landfills requires a long-term monitoring program, which can demand significant investment in monitoring effort. The groundwater monitoring data from a closed landfill in Florida were analyzed to reduce the monitoring effort. The available groundwater monitoring data (collected over 20 years) were analyzed (i.e., type, concentration and detection level) to identify trends in contaminant concentrations and the spatial mobility characteristics of groundwater (i.e., groundwater direction, retardation characteristics of contaminants, groundwater well depth, subsoil characteristics), and to identify critical monitoring locations. Among the 7 groundwater monitoring well clusters (totaling 22 wells) at the landfill, the data from two monitoring well clusters (totaling 7 wells) located along the direction of groundwater flow showed similarities (the highest concentrations and the same contaminants). These wells were used to assess the transport characteristics of the contaminants. Some parameters (e.g., iron, sodium, ammonia as N, chlorobenzene, 1,4-dichlorobenzene) showed decreasing trends in the groundwater due to soil absorption and retardation. Metals were retarded by ion exchange, and their concentrations increased with depth, indicating that the soil had reached breakthrough over time. Soil depth did not have a significant effect on the concentrations of volatile organic contaminants. Based on the analyses, selective groundwater monitoring modifications were developed for effective monitoring to acquire data from the most critical locations which may be impacted by leachate mobility. The adjustments in the sampling strategy reduced the amount of data collected by as much as 97.7% (i.e., total number of parameters monitored).
Effective groundwater sampling strategies can save time, effort and monitoring costs while improving the quality of sample handling and data analyses for better utilization of post closure monitoring funds. Copyright © 2015 Elsevier Ltd. All rights reserved.
Fraga, Rafael de; Stow, Adam J; Magnusson, William E; Lima, Albertina P
2014-01-01
Studies leading to decision-making for environmental licensing often fail to provide accurate estimates of diversity. Measures of snake diversity are regularly obtained to assess development impacts in the rainforests of the Amazon Basin, but this taxonomic group may be subject to poor detection probabilities. Recently, the Brazilian government tried to standardize sampling designs by the implementation of a system (RAPELD) to quantify biological diversity using spatially-standardized sampling units. Consistency in sampling design allows the detection probabilities to be compared among taxa, and sampling effort and associated cost to be evaluated. The cost effectiveness of detecting snakes has received no attention in Amazonia. Here we tested the effects of reducing sampling effort on estimates of species densities and assemblage composition. We identified snakes in seven plot systems, each standardised with 14 plots. The 250 m long centre line of each plot followed an altitudinal contour. Surveys were repeated four times in each plot and detection probabilities were estimated for the 41 species encountered. Reducing the number of observations, or the size of the sampling modules, caused significant loss of information on species densities and local patterns of variation in assemblage composition. We estimated the cost to find a snake as $ 120 U.S., but general linear models indicated the possibility of identifying differences in assemblage composition for half the overall survey costs. Decisions to reduce sampling effort depend on the importance of lost information to target-issues, and may not be the preferred option if there is the potential for identifying individual snake species requiring specific conservation actions. However, in most studies of human disturbance on species assemblages, it is likely to be more cost-effective to focus on other groups of organisms with higher detection probabilities.
Mallott, Elizabeth K; Garber, Paul A; Malhi, Ripan S
2017-02-01
Invertebrate foraging strategies in nonhuman primates often require complex extractive foraging or prey detection techniques. As these skills take time to master, juveniles may have reduced foraging efficiency or concentrate their foraging efforts on easier to acquire prey than adults. We use DNA barcoding, behavioral observations, and ecological data to assess age-based differences in invertebrate prey foraging strategies in a group of white-faced capuchins (Cebus capucinus) in northeastern Costa Rica. Invertebrate availability was monitored using canopy traps and sweep netting. Fecal samples were collected from adult female, adult male, and juvenile white-faced capuchins (n = 225). COI mtDNA sequences were compared with known sequences in GenBank and the Barcode of Life Database. Frequencies of Lepidoptera and Hymenoptera consumption were higher in juveniles than in adults. A significantly smaller proportion of juvenile fecal samples contained Gryllidae and Cercopidae sequences, compared with adults (0% and 4.2% vs. 4.6% and 12.5%), and a significantly larger proportion contained Tenthredinidae, Culicidae, and Crambidae (5.6%, 9.7%, and 5.6% vs. 1.3%, 0.7%, and 1.3%). Juveniles spent significantly more time feeding and foraging than adults, and focused their foraging efforts on prey that require different skills to capture or extract. Arthropod availability was not correlated with foraging efficiency, and the rate of consumption of specific orders of invertebrates was not correlated with the availability of those same taxa. Our data support the hypothesis that juveniles are concentrating their foraging efforts on different prey than adults, potentially focusing their foraging efforts on more easily acquired types of prey. © 2016 Wiley Periodicals, Inc.
Budimlija, Zoran M; Prinz, Mechthild K; Zelson-Mundorff, Amy; Wiersema, Jason; Bartelink, Eric; MacKinnon, Gaille; Nazzaruolo, Bianca L; Estacio, Sheila M; Hennessey, Michael J; Shaler, Robert C
2003-06-01
To present individual body identification efforts, as part of the World Trade Center (WTC) mass disaster identification project. More than 500 samples were tested by using polymerase chain reaction (PCR) amplification and short tandem repeat (STR) typing. The extent to which the remains were fragmented and affected by taphonomic factors complicated the identification project. Anthropologists reviewed 19,000 samples, and detected inconsistencies in 69, which were further split into 239 new cases and re-sampled by DNA specialists. The severity and nature of the disaster required an interdisciplinary effort. DNA profiling of 500 samples was successful in 75% of the cases. All discrepancies, which occurred between bone and tissue samples taken from the same body part, were resolved by re-sampling and re-testing of preferably bone tissue. Anthropologists detected inconsistencies in 69 cases, which were then split into 239 new cases. Out of 125 "split" cases, 65 were excluded from their original case. Of these 65 cases, 37 did not match any profiles in M-FISys, probably because profiles were incomplete or no exemplar for the victim was available. Out of the 60 remains not excluded from their original case, 30 were partial profiles and did not reach the statistical requirement to match their original case, because the population frequency of the DNA profile had to be ≤1 in 10⁹ for men and ≤1 in 10⁸ for women. Due to transfer of soft tissue and other commingling of remains, DNA testing alone would have led to problems had only soft tissue been tested. This was one of the reasons that forensic anthropologists were needed to evaluate the consistency between all linked body parts. Especially in disasters with a high potential for commingling, the described anthropological review process should be part of the investigation.
Ellison, Laura E.; Lukacs, Paul M.
2014-01-01
Concern for migratory tree-roosting bats in North America has grown because of possible population declines from wind energy development. This concern has driven interest in estimating population-level changes. Mark-recapture methodology is one possible analytical framework for assessing bat population changes, but sample size requirements to produce reliable estimates have not been estimated. To illustrate the sample sizes necessary for a mark-recapture-based monitoring program we conducted power analyses using a statistical model that allows reencounters of live and dead marked individuals. We ran 1,000 simulations for each of five broad sample size categories in a Burnham joint model, and then compared the proportion of simulations in which 95% confidence intervals overlapped between and among years for a 4-year study. Additionally, we conducted sensitivity analyses of sample size to various capture probabilities and recovery probabilities. More than 50,000 individuals per year would need to be captured and released to accurately determine 10% and 15% declines in annual survival. To detect more dramatic declines of 33% or 50% in survival over four years, sample sizes of 25,000 or 10,000 per year, respectively, would be sufficient. Sensitivity analyses reveal that increasing recovery of dead marked individuals may be more valuable than increasing capture probability of marked individuals. Because of the extraordinary effort that would be required, we advise caution should such a mark-recapture effort be initiated because of the difficulty in attaining reliable estimates. We make recommendations for what techniques show the most promise for mark-recapture studies of bats because some techniques violate the assumptions of mark-recapture methodology when used to mark bats.
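The overlap-of-confidence-intervals approach described above can be illustrated with a deliberately simplified sketch. This is not the Burnham joint live-dead model the authors used: sample sizes are scaled down for illustration, survival is estimated naively from resightings under a known capture probability, and the function name and numbers are hypothetical.

```python
import random

def simulate_survival_ci(n_marked, survival, capture_p, n_sims=200, seed=1):
    """Crude simulation-based 95% interval for annual survival: release
    n_marked individuals, apply annual survival, resight survivors with
    probability capture_p, and estimate survival as
    resights / (n_marked * capture_p)."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_sims):
        survivors = sum(rng.random() < survival for _ in range(n_marked))
        resighted = sum(rng.random() < capture_p for _ in range(survivors))
        estimates.append(resighted / (n_marked * capture_p))
    estimates.sort()
    return estimates[int(0.025 * n_sims)], estimates[int(0.975 * n_sims)]

# Do the 95% intervals for stable vs. halved survival overlap?
# (Numbers are scaled down from the paper's sample sizes for speed.)
ci_stable = simulate_survival_ci(2000, survival=0.80, capture_p=0.20)
ci_declined = simulate_survival_ci(2000, survival=0.40, capture_p=0.20)
overlap = ci_stable[0] <= ci_declined[1] and ci_declined[0] <= ci_stable[1]
```

Non-overlapping intervals are the simulation's signal that a decline of this magnitude is detectable at the given sample size; the paper's conclusion is that for small declines (10-15%) the required releases grow past 50,000 per year.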
Fischer, Jesse R.; Quist, Michael C.
2014-01-01
All freshwater fish sampling methods are biased toward particular species, sizes, and sexes and are further influenced by season, habitat, and fish behavior changes over time. However, little is known about gear-specific biases for many common fish species because few multiple-gear comparison studies exist that have incorporated seasonal dynamics. We sampled six lakes and impoundments representing a diversity of trophic and physical conditions in Iowa, USA, using multiple gear types (i.e., standard modified fyke net, mini-modified fyke net, sinking experimental gill net, bag seine, benthic trawl, boat-mounted electrofisher used diurnally and nocturnally) to determine the influence of sampling methodology and season on fisheries assessments. Specifically, we describe the influence of season on catch per unit effort, proportional size distribution, and the number of samples required to obtain 125 stock-length individuals for 12 species of recreational and ecological importance. Mean catch per unit effort generally peaked in the spring and fall as a result of increased sampling effectiveness in shallow areas and seasonal changes in habitat use (e.g., movement offshore during summer). Mean proportional size distribution decreased from spring to fall for white bass Morone chrysops, largemouth bass Micropterus salmoides, bluegill Lepomis macrochirus, and black crappie Pomoxis nigromaculatus, suggesting selectivity for large and presumably sexually mature individuals in the spring and summer. Overall, the mean number of samples required to sample 125 stock-length individuals was minimized in the fall with sinking experimental gill nets, a boat-mounted electrofisher used at night, and standard modified nets for 11 of the 12 species evaluated. 
Our results provide fisheries scientists with relative comparisons between several recommended standard sampling methods and illustrate the effects of seasonal variation on estimates of population indices that will be critical to the future development of standardized sampling methods for freshwater fish in lentic ecosystems.
Investigations related to evaluation of ultramicrofluorometer
NASA Technical Reports Server (NTRS)
Whitcomb, B.
1981-01-01
High resolution emission and excitation fluorescent spectra were obtained for several samples in an effort to determine the optimum operational design for the instrument. The instrument was used to determine the characteristics a sample must have in order to be detected, and in so doing, several different sample preparation techniques were considered. Numerous experiments were performed to determine the capabilities of the instrument with regard to the detection of suitably prepared virus specimens. Significant results were obtained in several areas. The fluorescent spectra indicated that substantial changes in the laser might be used advantageously to greatly improve the performance of the instrument. In the existing configuration, the instrument was shown to be capable of detecting the presence of suitably prepared virus samples.
The potential for chemical evolution on Titan
NASA Technical Reports Server (NTRS)
Beauchamp, P. M.; Lunine, J. I.; Welch, C.
2002-01-01
Sampling of organics to determine oxygen content, extent of acetylene polymerization, existence of chiral molecules and enantiomeric excesses, and searches for specific polymer products, would be of interest in assessing how organic chemistry evolves toward biochemistry. Such efforts would require fairly sophisticated chemical analyses from landed missions. This paper examines this chemistry and the potential instruments that could distinguish chemical evolution.
ERIC Educational Resources Information Center
Austin, Gregory; Skager, Rodney
The California Student Substance Use Survey marks a milestone in the state's efforts to monitor, understand, and prevent adolescent substance use and abuse. Chapter 1 presents the methodology. This survey follows a shift in California policy to a written parental consent requirement. Sample characteristics, consent procedures, and methods of data…
Estimation of real-time N load in surface water using dynamic data driven application system
Y. Ouyang; S.M. Luo; L.H. Cui; Q. Wang; J.E. Zhang
2011-01-01
Agricultural, industrial, and urban activities are the major sources for eutrophication of surface water ecosystems. Currently, determination of nutrients in surface water is primarily accomplished by manually collecting samples for laboratory analysis, which requires at least 24 h. In other words, little to no effort has been devoted to monitoring real-time variations...
Chen, Chen-Yueh
2014-06-01
This study investigated the relationship between parents' passion for sport/exercise and children's self- and task-perceptions in sport and exercise. Paired samples of 312 children, 312 fathers, and 312 mothers were collected using two-stage sampling; parents were classified based on their passion for sport and exercise as high concordance if both parents had a high passion for sport and exercise, low concordance if neither parent had a passion for sport and exercise, or discordant. Intrinsic interest value, attainment value/importance, extrinsic utility value, ability/expectancy, task difficulty, and required effort were measured, as well as harmonious and obsessive passion. Children's self- and task-perceptions in sport and exercise were examined with respect to parents' passion for sport and exercise. The results of the study indicated that children of parents with high concordance in harmonious passion for sport and exercise scored higher on intrinsic interest value, attainment value/importance, extrinsic utility value, ability/expectancy, task difficulty, and required effort in sport and exercise than counterparts with discordant and low concordance parents. Similar patterns were found for obsessive passion in parents.
Gilliland, Jason; Clark, Andrew F; Kobrzynski, Marta; Filler, Guido
2015-07-01
Childhood obesity is a critical public health matter associated with numerous pediatric comorbidities. Local-level data are required to monitor obesity and to help administer prevention efforts when and where they are most needed. We hypothesized that samples of children visiting hospital clinics could provide representative local population estimates of childhood obesity using data from 2007 to 2013. Such data might provide more accurate, timely, and cost-effective obesity estimates than national surveys. Results revealed that our hospital-based sample could not serve as a population surrogate. Further research is needed to confirm this finding.
NASA Technical Reports Server (NTRS)
Brucker, G. J.
1971-01-01
The effort reported here presents data on lithium properties in bulk silicon samples before and after irradiation, providing the analytical information required to characterize the interactions of lithium with radiation-induced defects in silicon. A model of the damage and recovery mechanisms in irradiated lithium-containing solar cells is developed, based on measurements of the Hall coefficient and resistivity of samples irradiated by 1-MeV electrons. Experiments on bulk samples included Hall coefficient and resistivity measurements taken as a function of: (1) bombardment temperature, (2) resistivity, (3) fluence, (4) oxygen concentration, and (5) annealing time at temperatures from 300 to 373 K.
Cooper, R.J.; Mordecai, Rua S.; Mattsson, B.G.; Conroy, M.J.; Pacifici, K.; Peterson, J.T.; Moore, C.T.
2008-01-01
We describe a survey design and field protocol for the Ivory-billed Woodpecker (Campephilus principalis) search effort that will: (1) allow estimation of occupancy, use, and detection probability for habitats at two spatial scales within the bird's former range, (2) assess relationships between occupancy, use, and habitat characteristics at those scales, (3) eventually allow the development of a population viability model that depends on patch occupancy instead of difficult-to-measure demographic parameters, and (4) be adaptive, allowing newly collected information to update the above models and search locations. The approach features random selection of patches to be searched from a sampling frame stratified and weighted by patch quality, and requires multiple visits per patch. It is adaptive within a season in that increased search activity is allowed in and around locations of strong visual and/or aural evidence, and adaptive among seasons in that habitat associations allow modification of stratum weights. This statistically rigorous approach is an improvement over simply visiting the "best" habitat in an ad hoc fashion because we can learn from prior effort and modify the search accordingly. Results from the 2006-07 search season indicate weak relationships between occupancy and habitat (although we suggest modifications of habitat measurement protocols), and a very low detection probability, suggesting more visits per patch are required. Sample size requirements will be discussed.
Silva, Déborah R O; Ligeiro, Raphael; Hughes, Robert M; Callisto, Marcos
2016-06-01
Taxonomic richness is one of the most important measures of biological diversity in ecological studies, including those with stream macroinvertebrates. However, it is impractical to measure the true richness of any site directly by sampling. Our objective was to evaluate the effect of sampling effort on estimates of macroinvertebrate family and Ephemeroptera, Plecoptera, and Trichoptera (EPT) genera richness at two scales: basin and stream site. In addition, we tried to determine which environmental factors at the site scale most influenced the amount of sampling effort needed. We sampled 39 sites in the Cerrado biome (neotropical savanna). In each site, we obtained 11 equidistant samples of the benthic assemblage and multiple physical habitat measurements. The observed basin-scale richness achieved a consistent estimation from Chao 1, Jack 1, and Jack 2 richness estimators. However, at the site scale, there was a constant increase in the observed number of taxa with increased number of samples. Models that best explained the slope of site-scale sampling curves (representing the necessity of greater sampling effort) included metrics that describe habitat heterogeneity, habitat structure, anthropogenic disturbance, and water quality, for both macroinvertebrate family and EPT genera richness. Our results demonstrate the importance of considering basin- and site-scale sampling effort in ecological surveys and that taxa accumulation curves and richness estimators are good tools for assessing sampling efficiency. The physical habitat explained a significant amount of the sampling effort needed. Therefore, future studies should explore the possible implications of physical habitat characteristics when developing sampling objectives, study designs, and calculating the needed sampling effort.
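The Chao 1 estimator named above has a simple closed form built from the counts of taxa seen exactly once (singletons, f1) and exactly twice (doubletons, f2). A minimal sketch with hypothetical abundance counts; the bias-corrected variant is assumed when no doubletons occur:

```python
def chao1(counts):
    """Chao 1 lower-bound richness estimate from per-taxon abundances:
    S_chao1 = S_obs + f1^2 / (2 * f2), where f1 and f2 are the numbers
    of taxa observed exactly once and exactly twice."""
    counts = [c for c in counts if c > 0]
    s_obs = len(counts)
    f1 = sum(1 for c in counts if c == 1)
    f2 = sum(1 for c in counts if c == 2)
    if f2 > 0:
        return s_obs + f1 * f1 / (2 * f2)
    return s_obs + f1 * (f1 - 1) / 2  # bias-corrected form when f2 == 0

# Hypothetical site: 8 observed taxa, 3 singletons, 2 doubletons
abundances = [10, 6, 4, 2, 2, 1, 1, 1]
estimate = chao1(abundances)  # 8 + 3^2 / (2 * 2) = 10.25
```

Many singletons relative to doubletons push the estimate well above the observed richness, which is exactly the situation the authors describe at the site scale, where observed richness kept climbing with added samples.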
Invited article: Neurology education research.
Stern, Barney J; Lowenstein, Daniel H; Schuh, Lori A
2008-03-11
There is a need to rigorously study the neurologic education of medical students, neurology residents, and neurologists to determine the effectiveness of our educational efforts. We review the status of neurologic education research as it pertains to the groups of interest. We identify opportunities and impediments for education research. The introduction of the Accreditation Council for Graduate Medical Education core competencies, the Accreditation Council of Continuing Medical Education requirement to link continuing medical education to improved physician behavior and patient care, and the American Board of Medical Specialties/American Board of Psychiatry and Neurology-mandated maintenance of certification program represent research opportunities. Challenges include numerous methodologic issues such as definition of the theoretical framework of the study, adequate sample size ascertainment, and securing research funding. State-of-the-art education research will require multidisciplinary research teams and innovative funding strategies. The central goal of all concerned should be defining educational efforts that improve patient outcomes.
A sampling design framework for monitoring secretive marshbirds
Johnson, D.H.; Gibbs, J.P.; Herzog, M.; Lor, S.; Niemuth, N.D.; Ribic, C.A.; Seamans, M.; Shaffer, T.L.; Shriver, W.G.; Stehman, S.V.; Thompson, W.L.
2009-01-01
A framework for a sampling plan for monitoring marshbird populations in the contiguous 48 states is proposed here. The sampling universe is the breeding habitat (i.e. wetlands) potentially used by marshbirds. Selection protocols would be implemented within large geographical strata, such as Bird Conservation Regions. Site selection would be done using a two-stage cluster sample. Primary sampling units (PSUs) would be land areas, such as legal townships, and would be selected by a procedure such as systematic sampling. Secondary sampling units (SSUs) would be wetlands or portions of wetlands in the PSUs. SSUs would be selected by a randomized spatially balanced procedure. For analysis, the use of a variety of methods is encouraged as a means of increasing confidence in the conclusions that may be reached. Additional effort will be required to work out details and implement the plan.
Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long
2015-01-01
The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and ¹³¹I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed to evaluate the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required for reducing measurement uncertainty in the emergency soil sampling protocol proposed here. Copyright © 2014 Elsevier Ltd. All rights reserved.
Redmond, Daniel P; Chiew, Yeong Shiong; Major, Vincent; Chase, J Geoffrey
2016-09-23
Monitoring of respiratory mechanics is required for guiding patient-specific mechanical ventilation settings in critical care. Many models of respiratory mechanics perform poorly in the presence of variable patient effort. Typical modelling approaches either attempt to mitigate the effect of the patient effort on the airway pressure waveforms, or attempt to capture the size and shape of the patient effort. This work analyses a range of methods to identify respiratory mechanics in volume controlled ventilation modes when there is patient effort. The models are compared using four datasets, each with a sample of 30 breaths before, and 2-3 minutes after, sedation has been administered. The sedation will reduce patient efforts, but the underlying pulmonary mechanical properties are unlikely to change during this short time. Model identified parameters from breathing cycles with patient effort are compared to breathing cycles that do not have patient effort. All models have advantages and disadvantages, so model selection may be specific to the respiratory mechanics application. However, in general, the combined method of iterative interpolative pressure reconstruction and stacking multiple consecutive breaths together has the best performance over the datasets. The variability of identified elastance when there is patient effort is the lowest with this method, and there is little systematic offset in identified mechanics when sedation is administered. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
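The elastance identification discussed above is commonly based on the single-compartment equation of motion, P(t) = E·V(t) + R·Q(t) + P0, fitted by least squares. A self-contained sketch on synthetic passive-breath data; this is an assumed baseline model for illustration, not any of the paper's effort-handling methods, and all numbers are hypothetical:

```python
def fit_single_compartment(pressure, flow, volume):
    """Least-squares fit of P(t) = E*V(t) + R*Q(t) + P0.
    Solves the 3x3 normal equations by Gaussian elimination."""
    rows = [[v, q, 1.0] for v, q in zip(volume, flow)]
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    atb = [sum(r[i] * p for r, p in zip(rows, pressure)) for i in range(3)]
    m = [row + [rhs] for row, rhs in zip(ata, atb)]
    for col in range(3):                      # forward elimination w/ pivoting
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0] * 3                             # back substitution
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x  # [elastance E, resistance R, offset pressure P0]

# Synthetic passive breath: E = 25 cmH2O/L, R = 10 cmH2O.s/L, PEEP = 5
t = [i * 0.02 for i in range(100)]
flow = [0.5 if ti < 1.0 else -0.5 for ti in t]   # L/s, square-wave flow
vol = [min(ti, 2.0 - ti) * 0.5 for ti in t]      # integrated volume, L
pressure = [25 * v + 10 * q + 5 for v, q in zip(vol, flow)]
E, R, P0 = fit_single_compartment(pressure, flow, vol)
```

On noiseless passive data the fit recovers the parameters exactly; the difficulty the paper addresses is that spontaneous patient effort adds an unmodelled pressure term that corrupts this identification.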
Aerial survey methodology for bison population estimation in Yellowstone National Park
Hess, Steven C.
2002-01-01
I developed aerial survey methods for statistically rigorous bison population estimation in Yellowstone National Park to support sound resource management decisions and to understand bison ecology. Survey protocols, data recording procedures, a geographic framework, and seasonal stratifications were based on field observations from February 1998-September 2000. The reliability of this framework and strata were tested with long-term data from 1970-1997. I simulated different sample survey designs and compared them to high-effort censuses of well-defined large areas to evaluate effort, precision, and bias. Sample survey designs require much effort and extensive information on the current spatial distribution of bison and therefore do not offer any substantial reduction in time and effort over censuses. I conducted concurrent ground surveys, or 'double sampling' to estimate detection probability during aerial surveys. Group size distribution and habitat strongly affected detection probability. In winter, 75% of the groups and 92% of individual bison were detected on average from aircraft, while in summer, 79% of groups and 97% of individual bison were detected. I also used photography to quantify the bias due to counting large groups of bison accurately and found that undercounting increased with group size and could reach 15%. I compared survey conditions between seasons and identified optimal time windows for conducting surveys in both winter and summer. These windows account for the habitats and total area bison occupy, and group size distribution. Bison became increasingly scattered over the Yellowstone region in smaller groups and more occupied unfavorable habitats as winter progressed. Therefore, the best conditions for winter surveys occur early in the season (Dec-Jan). In summer, bison were most spatially aggregated and occurred in the largest groups by early August. 
Low variability between surveys and high detection probability provide population estimates with an overall coefficient of variation of approximately 8% and have high power for detecting trends in population change. I demonstrated how population estimates from winter and summer can be integrated into a comprehensive monitoring program to estimate annual growth rates, overall winter mortality, and an index of calf production, requiring about 30 hours of flight per year.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crockett, C.S.; Haas, C.N.
1996-11-01
Due to current proposed regulations requiring monitoring for protozoans and demonstration of adequate protozoan removal depending on source water concentrations detected, many utilities are considering or are engaged in protozoan monitoring activities within their watershed so that proper watershed management and treatment modifications can reduce their impact on drinking water safety and quality. However, due to the difficulties associated with the current analytical methods and sample collection, many sampling efforts collect data that cannot be interpreted or lack the tools to interpret the information obtained. Therefore, it is necessary to determine how to develop an effective sampling program tailored to a utility's specific needs to provide interpretable data and develop tools for evaluating such data. The following case study describes the process in which a utility learned how to collect and interpret monitoring data for their specific needs and provides concepts and tools which other utilities can use to aid in their own macro and microwatershed management efforts.
Rew, Mary Beth; Robbins, Jooke; Mattila, David; Palsbøll, Per J; Bérube, Martine
2011-04-01
Genetic identification of individuals is now commonplace, enabling the application of tagging methods to elusive species or species that cannot be tagged by traditional methods. A key aspect is determining the number of loci required to ensure that different individuals have non-matching multi-locus genotypes. Closely related individuals are of particular concern because of elevated matching probabilities caused by their recent co-ancestry. This issue may be addressed by increasing the number of loci to a level where full siblings (the relatedness category with the highest matching probability) are expected to have non-matching multi-locus genotypes. However, increasing the number of loci to meet this "full-sib criterion" greatly increases the laboratory effort, which in turn may increase the genotyping error rate resulting in an upward-biased mark-recapture estimate of abundance as recaptures are missed due to genotyping errors. We assessed the contribution of false matches from close relatives among 425 maternally related humpback whales, each genotyped at 20 microsatellite loci. We observed a very low (0.5-4%) contribution to falsely matching samples from pairs of first-order relatives (i.e., parent and offspring or full siblings). The main contribution to falsely matching individuals from close relatives originated from second-order relatives (e.g., half siblings), which was estimated at 9%. In our study, the total number of observed matches agreed well with expectations based upon the matching probability estimated for unrelated individuals, suggesting that the full-sib criterion is overly conservative, and would have required a 280% relative increase in effort. We suggest that, under most circumstances, the overall contribution to falsely matching samples from close relatives is likely to be low, and hence applying the full-sib criterion is unnecessary. 
In those cases where close relatives may present a significant issue, such as unrepresentative sampling, we propose three different genotyping strategies requiring only a modest increase in effort, which will greatly reduce the number of false matches due to the presence of related individuals.
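The matching probabilities discussed above are usually quantified as probabilities of identity (PI) computed from allele frequencies and multiplied across loci. A sketch using the standard formulas for unrelated individuals and for full siblings (the most conservative relatedness class); the loci and allele frequencies below are hypothetical, and the paper's exact estimators may differ:

```python
def pid_unrelated(freqs):
    """Per-locus probability of identity for unrelated individuals:
    PI = 2*(sum p_i^2)^2 - sum p_i^4."""
    s2 = sum(p * p for p in freqs)
    s4 = sum(p ** 4 for p in freqs)
    return 2 * s2 * s2 - s4

def pid_fullsib(freqs):
    """Per-locus probability of identity for full siblings:
    PIsib = 0.25 + 0.5*s2 + 0.5*s2^2 - 0.25*s4."""
    s2 = sum(p * p for p in freqs)
    s4 = sum(p ** 4 for p in freqs)
    return 0.25 + 0.5 * s2 + 0.5 * s2 * s2 - 0.25 * s4

def multilocus(per_locus):
    """Multiply per-locus PIs, assuming independent loci."""
    out = 1.0
    for v in per_locus:
        out *= v
    return out

# Ten hypothetical loci, each with five equally frequent alleles
loci = [[0.2] * 5] * 10
pi_unrelated = multilocus([pid_unrelated(f) for f in loci])
pi_fullsib = multilocus([pid_fullsib(f) for f in loci])
```

The full-sibling PI is orders of magnitude larger than the unrelated PI at the same number of loci, which is why designing to the "full-sib criterion" demands so many more loci, and why the authors argue it is usually unnecessary.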
Lizoňová, Zuzana; Horsák, Michal
2017-04-01
Ecological studies of peatland testate amoebae are generally based on totals of 150 individuals per sample. However, the suitability of this standard has never been assessed for alkaline habitats such as spring fens. We explored the differences in testate amoeba diversity between Sphagnum and brown-moss microhabitats at a mire site with a highly diversified moss layer which reflects the small-scale heterogeneity in groundwater chemistry. Relationships between sampling efficiency and sample completeness were explored using individual-based species accumulation curves and the effort required to gain an extra species was assessed. Testate amoeba diversity differed substantially between microhabitats, with brown mosses hosting on average twice as many species and requiring greater shell totals to reach comparable sample analysis efficiency as for Sphagnum. Thus, for samples from alkaline conditions an increase in shell totals would be required and even an overall doubling up to 300 individuals might be considered for reliable community description. Our small-scale data are likely not robust enough to provide an ultimate solution for the optimization of shell totals. However, the results proved that testate amoebae communities from acidic and alkaline environments differ sharply in both species richness and composition and they might call for different methodological approaches. Copyright © 2017 Elsevier GmbH. All rights reserved.
Rapid Assessment of Contaminants and Interferences in Mass Spectrometry Data Using Skyline
NASA Astrophysics Data System (ADS)
Rardin, Matthew J.
2018-04-01
Proper sample preparation in proteomic workflows is essential to the success of modern mass spectrometry experiments. Complex workflows often require reagents which are incompatible with MS analysis (e.g., detergents) necessitating a variety of sample cleanup procedures. Efforts to understand and mitigate sample contamination are a continual source of disruption with respect to both time and resources. To improve the ability to rapidly assess sample contamination from a diverse array of sources, I developed a molecular library in Skyline for rapid extraction of contaminant precursor signals using MS1 filtering. This contaminant template library is easily managed and can be modified for a diverse array of mass spectrometry sample preparation workflows. Utilization of this template allows rapid assessment of sample integrity and indicates potential sources of contamination.
New laser materials for laser diode pumping
NASA Technical Reports Server (NTRS)
Jenssen, H. P.
1990-01-01
The potential advantages of laser diode pumped solid state lasers are many with high overall efficiency being the most important. In order to realize these advantages, the solid state laser material needs to be optimized for diode laser pumping and for the particular application. In the case of the Nd laser, materials with a longer upper level radiative lifetime are desirable. This is because the laser diode is fundamentally a cw source, and to obtain high energy storage, a long integration time is necessary. Fluoride crystals are investigated as host materials for the Nd laser and also for IR laser transitions in other rare earths, such as the 2 micron Ho laser and the 3 micron Er laser. The approach is to investigate both known crystals, such as BaY2F8, as well as new crystals such as NaYF8. Emphasis is on the growth and spectroscopy of BaY2F8; these two efforts proceed in parallel. The growth effort is aimed at establishing conditions for obtaining large, high quality boules for laser samples. This requires numerous experimental growth runs; however, from these runs, samples suitable for spectroscopy become available.
Cocco, Arturo; Serra, Giuseppe; Lentini, Andrea; Deliperi, Salvatore; Delrio, Gavino
2015-09-01
The within- and between-plant distribution of the tomato leafminer, Tuta absoluta (Meyrick), was investigated in order to define action thresholds based on leaf infestation and to propose enumerative and binomial sequential sampling plans for pest management applications in protected crops. The pest spatial distribution was aggregated between plants, and median leaves were the most suitable sample to evaluate the pest density. Action thresholds of 36 and 48%, 43 and 56% and 60 and 73% infested leaves, corresponding to economic thresholds of 1 and 3% damaged fruits, were defined for tomato cultivars with big, medium and small fruits respectively. Green's method was a more suitable enumerative sampling plan as it required a lower sampling effort. Binomial sampling plans needed lower average sample sizes than enumerative plans to make a treatment decision, with probabilities of error of <0.10. The enumerative sampling plan required 87 or 343 leaves to estimate the population density in extensive or intensive ecological studies respectively. Binomial plans would be more practical and efficient for control purposes, needing average sample sizes of 17, 20 and 14 leaves to take a pest management decision in order to avoid fruit damage higher than 1% in cultivars with big, medium and small fruits respectively. © 2014 Society of Chemical Industry.
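Enumerative plans of the kind attributed to Green above are typically built on Taylor's power law (variance = a·meanᵇ). A minimal sketch of the two standard formulas, using hypothetical coefficients a and b since the abstract does not report them:

```python
import math

def enumerative_sample_size(mean, a, b, D=0.25):
    """Samples needed for fixed relative precision D (SE/mean), using
    Taylor's power law variance = a * mean**b:
    n = a * mean**(b - 2) / D**2."""
    return math.ceil(a * mean ** (b - 2) / D ** 2)

def greens_stop_line(n, a, b, D=0.25):
    """Green's sequential stop line: the cumulative count after n samples
    at which sampling can stop with precision D:
    T_n = (D**2 / a)**(1 / (b - 2)) * n**((b - 1) / (b - 2))."""
    return (D ** 2 / a) ** (1.0 / (b - 2)) * n ** ((b - 1) / (b - 2))

# Illustrative coefficients for an aggregated population (hypothetical)
n_required = enumerative_sample_size(mean=5.0, a=2.0, b=1.5)
```

With an aggregated distribution (b > 1), the stop-line approach lets sampling terminate as soon as the running total crosses the line, which is why it needs fewer leaves on average than a fixed-size plan.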
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, T.
The Nuclear Forensics Analysis Center (NFAC) is part of Savannah River National Laboratory (SRNL) and is one of only two USG National Laboratories accredited to perform nuclear forensic analyses to the requirements of ISO 17025. SRNL NFAC is capable of analyzing nuclear and radiological samples from bulk material to ultra-trace samples. NFAC provides analytical support to the FBI's Radiological Evidence Examination Facility (REEF), which is located within SRNL. REEF gives the FBI the capability to perform traditional forensics on material that is radiological and/or is contaminated. SRNL is engaged in research and development efforts to improve the USG technical nuclear forensics capabilities. Research includes improving predictive signatures and developing a database containing comparative samples.
NASA Astrophysics Data System (ADS)
Beddow, B.; Roberts, C.; Rankin, J.; Bloch, A.; Peizer, J.
1981-01-01
The National Accident Sampling System (NASS) is described. The study area discussed is one of the original ten sites selected for NASS implementation. In addition to collecting data from the field, the original ten sites address questions of feasibility of the plan, projected results of the data collection effort, and specific operational topics, e.g., team size, sampling requirements, training approaches, quality control procedures, and field techniques. Activities and results of the first three years of the project, for both major tasks (establishment and operation) are addressed. Topics include: study area documentation; team description, function and activities; problems and solutions; and recommendations.
Kille, Sabrina; Acevedo-Rocha, Carlos G; Parra, Loreto P; Zhang, Zhi-Gang; Opperman, Diederik J; Reetz, Manfred T; Acevedo, Juan Pablo
2013-02-15
Saturation mutagenesis probes define sections of the vast protein sequence space. However, even if randomization is limited this way, the combinatorial numbers problem is severe. Because diversity is created at the codon level, codon redundancy is a crucial factor determining the necessary effort for library screening. Additionally, due to the probabilistic nature of the sampling process, oversampling is required to ensure library completeness as well as a high probability to encounter all unique variants. Our trick employs a special mixture of three primers, creating a degeneracy of 22 unique codons coding for the 20 canonical amino acids. Therefore, codon redundancy and subsequent screening effort is significantly reduced, and a balanced distribution of codon per amino acid is achieved, as demonstrated exemplarily for a library of cyclohexanone monooxygenase. We show that this strategy is suitable for any saturation mutagenesis methodology to generate less-redundant libraries.
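The screening-effort saving of 22 unique codons versus the 32 of conventional NNK degeneracy can be illustrated with the standard oversampling calculation, under the idealizing assumption that all codons are equiprobable:

```python
import math

def clones_for_coverage(num_codons, p=0.95):
    """Transformants to screen so that any given codon is observed with
    probability p, assuming all codons are equiprobable (an idealization):
    N = ln(1 - p) / ln(1 - 1/num_codons)."""
    return math.ceil(math.log(1 - p) / math.log(1 - 1 / num_codons))

nnk_effort = clones_for_coverage(32)    # conventional NNK: 32 codons
trick_effort = clones_for_coverage(22)  # 22-codon primer mixture
```

For a single saturated position at 95% confidence this gives roughly 95 clones for NNK versus 65 for the 22-codon mixture; the advantage compounds multiplicatively when several positions are randomized simultaneously.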
Issues in subject recruitment and retention with pregnant and parenting substance-abusing women.
Howard, J; Beckwith, L
1996-01-01
To advance knowledge about the treatment of addiction among pregnant women and other women of childbearing age, investigators must adhere to the requirements of a strict experimental research design while concurrently providing clinical services. This means that researchers must address a variety of difficult questions, including the following: Was the sample large enough? Were the criteria for subject inclusion and exclusion well defined? Did the process of recruitment result in a sample that could be generalized to a larger population, or was the sample biased in some way? Was assignment to groups clearly random? What was the attrition rate? Was attrition the same in both experimental and comparison groups? Did baseline measures collect enough information to permit a description of the facts that were associated with attrition in each group? Was the attrition rate so high that the retained sample had special characteristics? If so, what were these features? This chapter highlights several problems related to these questions, describes the difficulties that investigators have faced in meeting clinical and research challenges to date, and suggests strategies for overcoming some obstacles. In establishing the Perinatal-20 project, the National Institute on Drug Abuse took an informed first step in organizing a substantial research effort to investigate treatment modalities that incorporate services specific to the needs of substance-abusing women who have children. This initial effort has resulted in a beginning knowledge base that can be used to refine and expand future treatment efforts. Even the issue of the "study unit" for this population is evolving. Today's researchers are attempting to determine whether the mother alone or the mother along with her dependent children constitutes the study unit. 
This question also has led professionals in the field to examine a range of specific outcome priorities, and investigators just now are beginning to determine exactly what needs to be evaluated in gauging the effectiveness of treatment. Is success measured on the basis of the woman's progress with abstinence alone, or does it also include her role with her children? Is it determined on the basis of her relationship with her children or the children's growth and development? Compared with providing services for and studying single adult subjects, developing treatment for women and their children presents researchers with a more complex task and requires expanded clinical services (Gallagher 1990, pp. 540-559). As in most fields of study, initial research data in substance abuse treatment for pregnant and parenting women are derived from samples of convenience, as described above. To put this information in perspective, future research will require a wider and more representative spectrum of the population. Furthermore, tensions between clinical needs and research requirements must be considered in advance, and methods for relaxing these tensions will be critical to the success of future efforts. For example, members of both the research and clinical staff teams must be absolutely clear about the study design and the requirements of reliable research. Where possible, potential ambiguities about group assignment, project services, subjects' responsibilities, and so forth must be incorporated into subject consent forms so that the subjects also are apprised of potential problems and their solutions. A final caution to future investigators is to be aware of the economic, physical, and personnel limitations of the range of treatment services that can be provided in a research demonstration study involving this population. 
Because of these limitations and the extensive range of services the subjects of the studies require, treatment components must be discrete and carefully defined to prevent programs from becoming impractically diverse and unclear. Research goals must be attainable and measurable.(ABSTRACT TRUNCATED)
Goeman, Valerie R; Tinkler, Stacy H; Hammac, G Kenitra; Ruple, Audrey
2018-04-01
Environmental surveillance for Salmonella enterica can be used for early detection of contamination; thus routine sampling is an integral component of infection control programs in hospital environments. At the Purdue University Veterinary Teaching Hospital (PUVTH), the technique regularly employed in the large animal hospital for sample collection uses sterile gauze sponges for environmental sampling, which has proven labor-intensive and time-consuming. Alternative sampling methods use Swiffer brand electrostatic wipes for environmental sample collection, which are reportedly effective and efficient. It was hypothesized that use of Swiffer wipes for sample collection would be more efficient and less costly than the use of gauze sponges. A head-to-head comparison between the 2 sampling methods was conducted in the PUVTH large animal hospital and relative agreement, cost-effectiveness, and sampling efficiency were compared. There was fair agreement in culture results between the 2 sampling methods, but Swiffer wipes required less time and less physical effort to collect samples and were more cost-effective.
Jokerst, Jesse V.; Floriano, Pierre N.; Christodoulides, Nicolaos; Simmons, Glennon W.; McDevitt, John T.
2010-01-01
Recent humanitarian efforts have led to the widespread release of antiretroviral drugs for the treatment of the more than 33 million HIV afflicted people living in resource-scarce settings. Here, the enumeration of CD4+ T lymphocytes is required to establish the level at which the immune system has been compromised. The gold standard method used in developed countries, based on flow cytometry, though widely accepted and accurate, is precluded from widespread use in resource-scarce settings due to its high expense, high technical requirements, difficulty in operation-maintenance and the lack of portability for these sophisticated laboratory-confined systems. As part of continuing efforts to develop practical diagnostic instrumentation, the integration of semiconductor nanocrystals (quantum dots, QDs) into a portable microfluidic-based lymphocyte capture and detection device is completed. This integrated system is capable of isolating and counting selected lymphocyte sub-populations (CD3+CD4+) from whole blood samples. By combining the unique optical properties of the QDs with the sample handling capabilities and cost effectiveness of novel microfluidic systems, a practical, portable lymphocyte measurement modality that correlates nicely with flow cytometry (R2 = 0.97) has been developed. This QD-based system reduces the optical requirements significantly relative to molecular fluorophores and the mini-CD4 counting device is projected to be suitable for use in both point-of-need and resource-scarce settings. PMID:19023471
VAN Kesteren, F; Mastin, A; Torgerson, P R; Mytynova, Bermet; Craig, P S
2017-09-01
Echinococcosis is a re-emerging zoonotic disease in Kyrgyzstan. In 2012, an echinococcosis control scheme was started that included dosing owned dogs in the Alay Valley, Kyrgyzstan with praziquantel. Control programmes require large investments of money and resources; as such it is important to evaluate how well these are meeting their targets. However, problems associated with echinococcosis control schemes include remoteness and semi-nomadic customs of affected communities, and lack of resources. These same problems apply to control scheme evaluations, and quick and easy assessment tools are highly desirable. Lot quality assurance sampling was used to assess the impact of approximately 2 years of echinococcosis control in the Alay valley. A pre-intervention coproELISA prevalence was established, and a 75% threshold for dosing compliance was set based on previous studies. Ten communities were visited in 2013 and 2014, with 18-21 dogs sampled per community, and questionnaires administered to dog owners. After 21 months of control efforts, 8/10 communities showed evidence of reaching the 75% praziquantel dosing target, although only 3/10 showed evidence of a reduction in coproELISA prevalence. This is understandable, since years of sustained control are required to effectively control echinococcosis, and efforts in the Alay valley should be and are being continued.
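Lot quality assurance sampling as used above reduces to a binomial decision rule applied per community. A hedged sketch (the decision threshold d below is illustrative, not the study's published rule):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def lqas_error_probs(n, d, p_low, p_high):
    """For the LQAS rule 'classify as compliant if >= d of n sampled dogs
    were dosed': alpha = P(failing a community whose true coverage is
    p_high), beta = P(passing a community whose true coverage is p_low)."""
    alpha = binom_cdf(d - 1, n, p_high)
    beta = 1 - binom_cdf(d - 1, n, p_low)
    return alpha, beta

# e.g., 19 dogs per community against the 75% dosing-compliance target
alpha, beta = lqas_error_probs(19, 15, 0.5, 0.75)
```

The appeal for remote communities is exactly what the abstract notes: a fixed small sample (18–21 dogs here) and a simple count-based rule give a classification with known error probabilities, without estimating coverage precisely.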
Wilson, Keithia L; Charker, Jill; Lizzio, Alf; Halford, Kim; Kimlin, Siobhan
2005-09-01
It is widely believed that satisfying couple relationships require work by the partners. The authors equated the concept of work to relationship self-regulation and developed a scale to assess this construct. A factor analysis of the scale in a sample of 187 newlywed couples showed it comprised 2 factors of relationship strategies and effort. The factor structure was replicated in an independent sample of 97 newlywed couples. In both samples the scale had good internal consistency and high convergent validity between self- and partner-report forms. Self-regulation accounted for substantial variance in relationship satisfaction in both newlywed samples and in a 3rd sample of 61 long-married couples. The self-regulation and satisfaction association was independent of mood or self-report common method variance. (c) 2005 APA, all rights reserved
A Ground Systems Template for Remote Sensing Systems
NASA Astrophysics Data System (ADS)
McClanahan, Timothy P.; Trombka, Jacob I.; Floyd, Samuel R.; Truskowski, Walter; Starr, Richard D.; Clark, Pamela E.; Evans, Larry G.
2002-10-01
Spaceborne remote sensing using gamma and X-ray spectrometers requires particular attention to the design and development of reliable systems. These systems must ensure the scientific requirements of the mission within the challenging technical constraints of operating instrumentation in space. The Near Earth Asteroid Rendezvous (NEAR) spacecraft included X-ray and gamma-ray spectrometers (XGRS), whose mission was to map the elemental chemistry of the 433 Eros asteroid. A remote sensing system template, similar to a blackboard systems approach used in artificial intelligence, was identified in which the spacecraft, instrument, and ground system were designed and developed to monitor and adapt to evolving mission requirements in a complicated operational setting. Systems were developed for ground tracking of instrument calibration, instrument health, data quality, orbital geometry, and solar flux, as well as models of the asteroid's surface characteristics, requiring an intensive human effort. In the future, missions such as the Autonomous Nano-Technology Swarm (ANTS) program will have to rely heavily on automation to collectively encounter and sample asteroids in the outer asteroid belt. Using similar instrumentation, ANTS will require information similar to data collected by the NEAR X-ray/Gamma-Ray Spectrometer (XGRS) ground system for science and operations management. The NEAR XGRS systems will be studied to identify the equivalent subsystems that may be automated for ANTS. The effort will also investigate the possibility of applying blackboard style approaches to automated decision making required for ANTS.
Sample size and allocation of effort in point count sampling of birds in bottomland hardwood forests
Smith, W.P.; Twedt, D.J.; Cooper, R.J.; Wiedenfeld, D.A.; Hamel, P.B.; Ford, R.P.; Ralph, C. John; Sauer, John R.; Droege, Sam
1995-01-01
To examine sample size requirements and optimum allocation of effort in point count sampling of bottomland hardwood forests, we computed minimum sample sizes from variation recorded during 82 point counts (May 7-May 16, 1992) from three localities containing three habitat types across three regions of the Mississippi Alluvial Valley (MAV). Also, we estimated the effect of increasing the number of points or visits by comparing results of 150 four-minute point counts obtained from each of four stands on Delta Experimental Forest (DEF) during May 8-May 21, 1991 and May 30-June 12, 1992. For each stand, we obtained bootstrap estimates of mean cumulative number of species each year from all possible combinations of six points and six visits. ANOVA was used to model cumulative species as a function of number of points visited, number of visits to each point, and interaction of points and visits. There was significant variation in numbers of birds and species between regions and localities (nested within region); neither habitat, nor the interaction between region and habitat, was significant. For α = 0.05 and α = 0.10, minimum sample size estimates (per factor level) varied by orders of magnitude depending upon the observed or specified range of desired detectable difference. For observed regional variation, 20 and 40 point counts were required to accommodate variability in total individuals (MSE = 9.28) and species (MSE = 3.79), respectively, whereas a detectable difference of ±25 percent of the mean could be achieved with five counts per factor level. Sample sizes sufficient to detect actual differences for the Wood Thrush (Hylocichla mustelina) were >200, whereas the Prothonotary Warbler (Protonotaria citrea) required <10 counts. Differences in mean cumulative species were detected among number of points visited and among number of visits to a point. In the lower MAV, mean cumulative species increased with each added point through five points and with each additional visit through four visits. 
Although no interaction was detected between number of points and number of visits, when paired reciprocals were compared, more points invariably yielded a significantly greater cumulative number of species than more visits to a point. Still, 36 point counts per stand during each of two breeding seasons detected only 52 percent of the known available species pool in DEF.
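The abstract does not print its minimum-sample-size formula; the standard two-sample calculation using the reported MSE values can be sketched as follows (z values assume a two-sided α = 0.05 and 80% power; the detectable difference of 2.0 is illustrative):

```python
import math

def min_sample_size(mse, delta, z_alpha=1.96, z_beta=0.84):
    """Replicates per factor level to detect a difference delta between two
    means with error variance MSE:
    n = 2 * (z_alpha + z_beta)**2 * MSE / delta**2."""
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * mse / delta ** 2)

# Reported MSEs: total individuals (9.28) and species (3.79)
counts_individuals = min_sample_size(9.28, 2.0)
counts_species = min_sample_size(3.79, 2.0)
```

The inverse-square dependence on delta is what drives the orders-of-magnitude spread the authors report: halving the desired detectable difference quadruples the required number of counts.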
NASA Technical Reports Server (NTRS)
Scott, Elaine P.
1994-01-01
Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which consequently necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach. The idea here is to first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology utilized is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of the thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and then the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found. 
Another effort was related to the development of the experimental techniques. Initial experiments required a resistance heater placed between two samples. The design was modified such that the heater was placed on the surface of only one sample, as would be necessary in the analysis of built up structures. Experiments using the modified technique were conducted on the composite sample used previously at different temperatures. The results were within 5 percent of those found using two samples. Finally, an initial heat transfer analysis, including conduction, convection and radiation components, was completed on a titanium sandwich structural sample. Experiments utilizing this sample are currently being designed and will be used to first estimate the material's effective thermal conductivity and later to determine the properties associated with each individual heat transfer component.
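As an illustration of the statistically based estimation approach described above (not the authors' actual model), a thermal property of a semi-infinite solid under constant surface heat flux can be recovered from transient surface-temperature data by least squares; here the fitted property is the effusivity e = sqrt(k·ρ·c):

```python
import math

def surface_temp_rise(q, effusivity, t):
    """Semi-infinite solid under constant surface heat flux q (W/m^2):
    dT(t) = 2 * q * sqrt(t / pi) / e, where e = sqrt(k * rho * c)."""
    return 2 * q * math.sqrt(t / math.pi) / effusivity

def estimate_effusivity(q, times, temps):
    """Least-squares fit of dT = slope * sqrt(t) through the origin, then
    invert slope = 2 * q / (e * sqrt(pi)) for the effusivity e."""
    x = [math.sqrt(t) for t in times]
    slope = sum(xi * yi for xi, yi in zip(x, temps)) / sum(xi * xi for xi in x)
    return 2 * q / (slope * math.sqrt(math.pi))
```

Real structures require a full conduction model and simultaneous estimation of conductivity and heat capacity, with confidence regions as in the abstract; this sketch shows only the core idea of fitting a mathematical model to transient data.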
Advances in arsenic biosensor development--a comprehensive review.
Kaur, Hardeep; Kumar, Rabindra; Babu, J Nagendra; Mittal, Sunil
2015-01-15
Biosensors are analytical devices having high sensitivity, portability, small sample requirement and ease of use for qualitative and quantitative monitoring of various analytes of human importance. Arsenic (As), owing to its widespread presence in nature and high toxicity to living creatures, requires frequent determination in water, soil, agricultural and food samples. The present review is an effort to highlight the various advancements made so far in the development of arsenic biosensors based either on recombinant whole cells or on certain arsenic-binding oligonucleotides or proteins. The role of futuristic approaches like surface plasmon resonance (SPR) and aptamer technology has also been discussed. The biomethods employed and their general mechanisms, advantages and limitations in relevance to arsenic biosensors developed so far are intended to be discussed in this review. Copyright © 2014 Elsevier B.V. All rights reserved.
Grizzle, R E; Ward, L G; Fredriksson, D W; Irish, J D; Langan, R; Heinig, C S; Greene, J K; Abeels, H A; Peter, C R; Eberhardt, A L
2014-11-15
The seafloor at an open ocean finfish aquaculture facility in the western Gulf of Maine, USA was monitored from 1999 to 2008 by sampling sites inside a predicted impact area modeled by oceanographic conditions and fecal and food settling characteristics, and nearby reference sites. Univariate and multivariate analyses of benthic community measures from box core samples indicated minimal or no significant differences between impact and reference areas. These findings resulted in development of an adaptive monitoring protocol involving initial low-cost methods that required more intensive and costly efforts only when negative impacts were initially indicated. The continued growth of marine aquaculture is dependent on further development of farming methods that minimize negative environmental impacts, as well as effective monitoring protocols. Adaptive monitoring protocols, such as the one described herein, coupled with mathematical modeling approaches, have the potential to provide effective protection of the environment while minimizing monitoring effort and costs. Copyright © 2014 Elsevier Ltd. All rights reserved.
Method to determine 226Ra in small sediment samples by ultralow background liquid scintillation.
Sanchez-Cabeza, Joan-Albert; Kwong, Laval Liong Wee; Betti, Maria
2010-08-15
(210)Pb dating of sediment cores is a widely used tool to reconstruct ecosystem evolution and historical pollution during the last century. Although (226)Ra can be determined by gamma spectrometry, this method shows severe limitations, among them sample size requirements and long counting times. In this work, we propose a new strategy based on the analysis of (210)Pb through (210)Po in equilibrium by alpha spectrometry, followed by the determination of (226)Ra (base or supported (210)Pb) without any further chemical purification by liquid scintillation and with a higher sample throughput. Although gamma spectrometry might still be required to determine (137)Cs as an independent tracer, the effort can then be focused only on those sections dated around 1963, when maximum activities are expected. In this work, we optimized the counting conditions, calibrated the system for changing quenching, and described the new method to determine (226)Ra in small sediment samples, after (210)Po determination, allowing a more precise determination of excess (210)Pb ((210)Pb(ex)). The method was validated with reference materials IAEA-384, IAEA-385, and IAEA-313.
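The (210)Pb(ex) quantity and a decay-based age follow directly from the measurements above; a minimal sketch (the constant-initial-concentration age model used here is one common choice, not necessarily the authors'):

```python
import math

PB210_HALF_LIFE_Y = 22.3                       # years
DECAY_CONST = math.log(2) / PB210_HALF_LIFE_Y  # per year

def excess_pb210(total_pb210, ra226):
    """Unsupported (excess) 210Pb: total 210Pb activity minus the
    226Ra-supported activity, in the same units (e.g., Bq/kg)."""
    return total_pb210 - ra226

def layer_age(surface_excess, layer_excess):
    """Age (years) of a layer under a constant-initial-concentration model:
    t = ln(A_surface / A_layer) / lambda."""
    return math.log(surface_excess / layer_excess) / DECAY_CONST
```

Because ages enter through the logarithm of the excess activity, the precision of the (226)Ra (supported) term directly limits the precision of (210)Pb(ex), which is why the paper's improved (226)Ra determination matters for the chronology.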
Fernando, Rohan L; Cheng, Hao; Golden, Bruce L; Garrick, Dorian J
2016-12-08
Two types of models have been used for single-step genomic prediction and genome-wide association studies that include phenotypes from both genotyped animals and their non-genotyped relatives. The two types are breeding value models (BVM) that fit breeding values explicitly and marker effects models (MEM) that express the breeding values in terms of the effects of observed or imputed genotypes. MEM can accommodate a wider class of analyses, including variable selection or mixture model analyses. The order of the equations that need to be solved and the inverses required in their construction vary widely, and thus the computational effort required depends upon the size of the pedigree, the number of genotyped animals and the number of loci. We present computational strategies to avoid storing large, dense blocks of the MME that involve imputed genotypes. Furthermore, we present a hybrid model that fits a MEM for animals with observed genotypes and a BVM for those without genotypes. The hybrid model is computationally attractive for pedigree files containing millions of animals with a large proportion of those being genotyped. We demonstrate the practicality on both the original MEM and the hybrid model using real data with 6,179,960 animals in the pedigree with 4,934,101 phenotypes and 31,453 animals genotyped at 40,214 informative loci. To complete a single-trait analysis on a desk-top computer with four graphics cards required about 3 h using the hybrid model to obtain both preconditioned conjugate gradient solutions and 42,000 Markov chain Monte-Carlo (MCMC) samples of breeding values, which allowed making inferences from posterior means, variances and covariances. The MCMC sampling required one quarter of the effort when the hybrid model was used compared to the published MEM. We present a hybrid model that fits a MEM for animals with genotypes and a BVM for those without genotypes. 
Its practicality and considerable reduction in computing effort was demonstrated. This model can readily be extended to accommodate multiple traits, multiple breeds, maternal effects, and additional random effects such as polygenic residual effects.
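The preconditioned conjugate gradient solutions mentioned above can be sketched generically; this Jacobi-preconditioned CG is a textbook version applied to a small symmetric positive definite system, not the authors' implementation of the mixed-model equations:

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=1000):
    """Jacobi-preconditioned conjugate gradient for A x = b, with A
    symmetric positive definite (standing in for a mixed-model
    coefficient matrix)."""
    m_inv = 1.0 / np.diag(A)  # Jacobi preconditioner: inverse of diag(A)
    x = np.zeros_like(b)
    r = b - A @ x             # residual
    z = m_inv * r             # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = m_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

In single-step analyses at the scale reported (millions of animals), the key to practicality is never forming the dense imputed-genotype blocks explicitly; the matrix-vector products inside this loop are instead assembled from sparse and low-rank pieces.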
Mars Sample Handling Protocol Workshop Series: Workshop 2
NASA Technical Reports Server (NTRS)
Rummel, John D. (Editor); Acevedo, Sara E. (Editor); Kovacs, Gregory T. A. (Editor); Race, Margaret S. (Editor); DeVincenzi, Donald L. (Technical Monitor)
2001-01-01
Numerous NASA reports and studies have identified Planetary Protection (PP) as an important part of any Mars sample return mission. The mission architecture, hardware, on-board experiments, and related activities must be designed in ways that prevent both forward- and back-contamination and also ensure maximal return of scientific information. A key element of any PP effort for sample return missions is the development of guidelines for containment and analysis of returned sample(s). As part of that effort, NASA and the Space Studies Board (SSB) of the National Research Council (NRC) have each assembled experts from a wide range of scientific fields to identify and discuss issues pertinent to sample return. In 1997, the SSB released its report on recommendations for handling and testing of returned Mars samples. In particular, the NRC recommended that: a) samples returned from Mars by spacecraft should be contained and treated as potentially hazardous until proven otherwise, and b) rigorous physical, chemical, and biological analyses [should] confirm that there is no indication of the presence of any exogenous biological entity. Also in 1997, a Mars Sample Quarantine Protocol workshop was convened at NASA Ames Research Center to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent 'uncontrolled release' of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. In 1999, a study by NASA's Mars Sample Handling and Requirements Panel (MSHARP) addressed three other specific areas in anticipation of returning samples from Mars: 1) sample collection and transport back to Earth; 2) certification of the samples as non-hazardous; and 3) sample receiving, curation, and distribution. 
To further refine the requirements for sample hazard testing and the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened an additional series of workshops beginning in March 2000. The overall objective of these workshops was to develop comprehensive protocols to assess whether the returned materials contain any biological hazards, and to safeguard the purity of the samples from possible terrestrial contamination. This document is the report of the second Workshop in the Series. The information herein will ultimately be integrated into a final document reporting the proceedings of the entire Workshop Series along with additional information and recommendations.
Hughes, Sarah E; Hutchings, Hayley A; Rapport, Frances L; McMahon, Catherine M; Boisvert, Isabelle
2018-02-08
Individuals with hearing loss often report a need for increased effort when listening, particularly in challenging acoustic environments. Despite audiologists' recognition of the impact of listening effort on individuals' quality of life, there are currently no standardized clinical measures of listening effort, including patient-reported outcome measures (PROMs). To generate items and content for a new PROM, this qualitative study explored the perceptions, understanding, and experiences of listening effort in adults with severe-profound sensorineural hearing loss before and after cochlear implantation. Three focus groups (1 to 3) were conducted. Purposive sampling was used to recruit 17 participants from a cochlear implant (CI) center in the United Kingdom. The participants included adults (n = 15, mean age = 64.1 years, range 42 to 84 years) with acquired severe-profound sensorineural hearing loss who satisfied the UK's national candidacy criteria for cochlear implantation and their normal-hearing significant others (n = 2). Participants were CI candidates who used hearing aids (HAs) and were awaiting CI surgery or CI recipients who used a unilateral CI or a CI and contralateral HA (CI + HA). Data from a pilot focus group conducted with 2 CI recipients were included in the analysis. The data, verbatim transcripts of the focus group proceedings, were analyzed qualitatively using constructivist grounded theory (GT) methodology. A GT of listening effort in cochlear implantation was developed from participants' accounts. The participants provided rich, nuanced descriptions of the complex and multidimensional nature of their listening effort. Interpreting and integrating these descriptions through GT methodology, listening effort was described as the mental energy required to attend to and process the auditory signal, as well as the effort required to adapt to, and compensate for, a hearing loss. 
Analyses also suggested that listening effort for most participants was motivated by a need to maintain a sense of social connectedness (i.e., the subjective awareness of being in touch with one's social world). Before implantation, low social connectedness in the presence of high listening effort encouraged self-alienating behaviors and resulted in social isolation with adverse effects for participants' well-being and quality of life. A CI moderated but did not remove the requirement for listening effort. Listening effort, in combination with the improved auditory signal supplied by the CI, enabled most participants to listen and communicate more effectively. These participants reported a restored sense of social connectedness and an acceptance of the continued need for listening effort. Social connectedness, effort-reward balance, and listening effort as a multidimensional phenomenon were the core constructs identified as important to participants' experiences and understanding of listening effort. The study's findings suggest: (1) perceived listening effort is related to social and psychological factors and (2) these factors may influence how individuals with hearing loss report on the actual cognitive processing demands of listening. These findings provide evidence in support of the Framework for Understanding Effortful Listening, a heuristic that describes listening effort as a function of both motivation and demands on cognitive capacity. This GT will inform item development and establish the content validity for a new PROM for measuring listening effort.
Statistical methods for identifying and bounding a UXO target area or minefield
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinstry, Craig A.; Pulsipher, Brent A.; Gilbert, Richard O.
2003-09-18
The sampling unit for minefield or UXO area characterization is typically represented by a geographical block or transect swath that lends itself to characterization by geophysical instrumentation such as mobile sensor arrays. New spatially based statistical survey methods and tools, more appropriate for these unique sampling units, have been developed and implemented at PNNL (Visual Sample Plan software, ver. 2.0) with support from the US Department of Defense. Though originally developed to support UXO detection and removal efforts, these tools may also be used in current form or adapted to support demining efforts and aid in the development of new sensors and detection technologies by explicitly incorporating both sampling and detection error in performance assessments. These tools may be used to (1) determine transect designs for detecting and bounding target areas of critical size, shape, and density of detectable items of interest with a specified confidence probability, (2) evaluate the probability that target areas of a specified size, shape and density have not been missed by a systematic or meandering transect survey, and (3) support post-removal verification by calculating the number of transects required to achieve a specified confidence probability that no UXO or mines have been missed.
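The traversal probability underlying uses (1) and (2) can be illustrated with a simple geometric sketch. Assuming parallel transects of swath width w spaced s apart and a circular target area of radius r (all values below are hypothetical; the VSP software handles more general shapes and also models detection error, which this sketch ignores), the chance that at least one transect crosses the target is approximately:

```python
def traversal_probability(target_radius: float, swath_width: float,
                          transect_spacing: float) -> float:
    """Approximate probability that a circular target area is traversed
    by at least one of a set of parallel transects.

    A transect of swath width w intersects a circle of radius r whenever
    the transect centerline falls within r + w/2 of the circle's center,
    so the effective detection band is (2*r + w) wide out of each spacing
    interval s. Detection along the transect is assumed perfect.
    """
    band = 2 * target_radius + swath_width
    return min(1.0, band / transect_spacing)

# Hypothetical design: 10 m target radius, 2 m swath, 50 m spacing
p = traversal_probability(10.0, 2.0, 50.0)   # (2*10 + 2) / 50 = 0.44

# Spacing needed so the target is traversed with probability >= 0.95
spacing_95 = (2 * 10.0 + 2.0) / 0.95         # roughly 23 m
```

Inverting the same relation, as in the last line, is how a required confidence probability translates into a transect spacing.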
Miller, Brett; McCardle, Peggy
2011-01-01
Continued progress in language and learning disabilities (LDs) research requires a renewed focus on issues of etiology. Genetics research forms a central tenet of such an agenda and is critical in clarifying relationships among oral language development, acquisition of literacy and mathematics, executive function skills, and comorbid conditions. For progress to be made, diversified efforts must continue to emphasize molecular and behavioral genetics (including quantitative genetics) approaches, in concert with multi-disciplinary and multi-modal projects, to provide an integrated understanding of the behavioral and biological manifestations of language and learning disabilities. Critically, increased efforts to include ethnic, socio-economic, and linguistically diverse participant samples across a range of developmental stages are required to meet the public health needs of learners in the US and across the world. Taken together, this body of work will continue to enhance our understanding of LDs and help us move toward a truly prevention-based approach to language and learning disabilities.
Long Term Resource Monitoring Program procedures: fish monitoring
Ratcliff, Eric N.; Glittinger, Eric J.; O'Hara, T. Matt; Ickes, Brian S.
2014-01-01
This manual constitutes the second revision of the U.S. Army Corps of Engineers’ Upper Mississippi River Restoration-Environmental Management Program (UMRR-EMP) Long Term Resource Monitoring Program (LTRMP) element Fish Procedures Manual. The original (1988) manual merged and expanded on ideas and recommendations related to Upper Mississippi River fish sampling presented in several early documents. The first revision to the manual was made in 1995 reflecting important protocol changes, such as the adoption of a stratified random sampling design. The 1995 procedures manual has been an important document through the years and has been cited in many reports and scientific manuscripts. The resulting data collected by the LTRMP fish component represent the largest dataset on fish within the Upper Mississippi River System (UMRS) with more than 44,000 collections of approximately 5.7 million fish. The goal of this revision of the procedures manual is to document changes in LTRMP fish sampling procedures since 1995. Refinements to sampling methods become necessary as monitoring programs mature. Possible refinements are identified through field experiences (e.g., sampling techniques and safety protocols), data analysis (e.g., planned and studied gear efficiencies and reallocations of effort), and technological advances (e.g., electronic data entry). Other changes may be required because of financial necessity (i.e., unplanned effort reductions). This version of the LTRMP fish monitoring manual describes the most current (2014) procedures of the LTRMP fish component.
ERIC Educational Resources Information Center
Wertz, Richard D.; And Others
In an effort to elicit student attitudes concerning residence hall living on campus a questionnaire was designed and administered to a random sample of 1,100 resident students at the University of South Carolina. The survey instrument consisted of a set of sixteen statements that required an "is" and a "should be" response. The…
Minority carrier diffusion lengths and absorption coefficients in silicon sheet material
NASA Technical Reports Server (NTRS)
Dumas, K. A.; Swimm, R. T.
1980-01-01
Most of the methods which have been developed for the measurement of the minority carrier diffusion length of silicon wafers require that the material have either a Schottky or an ohmic contact. The surface photovoltage (SPV) technique is an exception. The SPV technique could, therefore, become a valuable diagnostic tool in connection with current efforts to develop low-cost processes for the production of solar cells. The technique depends on a knowledge of the optical absorption coefficient. The considered investigation is concerned with a reevaluation of the absorption coefficient as a function of silicon processing. A comparison of absorption coefficient values showed these values to be relatively consistent from sample to sample, and independent of the sample growth method.
Hashimoto, Yuichiro
2017-01-01
The development of a robust ionization source using the counter-flow APCI, miniature mass spectrometer, and an automated sampling system for detecting explosives are described. These development efforts using mass spectrometry were made in order to improve the efficiencies of on-site detection in areas such as security, environmental, and industrial applications. A development team, including the author, has struggled for nearly 20 years to enhance the robustness and reduce the size of mass spectrometers to meet the requirements needed for on-site applications. This article focuses on the recent results related to the detection of explosive materials where automated particle sampling using a cyclone concentrator permitted the inspection time to be successfully reduced to 3 s. PMID:28337396
Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J
2001-08-01
The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that contrary to prediction strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.
Pre-Mission Input Requirements to Enable Successful Sample Collection by A Remote Field/EVA Team
NASA Technical Reports Server (NTRS)
Cohen, B. A.; Lim, D. S. S.; Young, K. E.; Brunner, A.; Elphic, R. E.; Horne, A.; Kerrigan, M. C.; Osinski, G. R.; Skok, J. R.; Squyres, S. W.;
2016-01-01
The FINESSE (Field Investigations to Enable Solar System Science and Exploration) team, part of the Solar System Exploration Virtual Institute (SSERVI), is a field-based research program aimed at generating strategic knowledge in preparation for human and robotic exploration of the Moon, near-Earth asteroids, Phobos and Deimos, and beyond. In contrast to other technology-driven NASA analog studies, the FINESSE WCIS activity is science-focused and, moreover, is sampling-focused with the explicit intent to return the best samples for geochronology studies in the laboratory. We used the FINESSE field excursion to the West Clearwater Lake Impact structure (WCIS) as an opportunity to test factors related to sampling decisions. We examined the in situ sample characterization and real-time decision-making process of the astronauts, with a guiding hypothesis that pre-mission training that included detailed background information on the analytical fate of a sample would better enable future astronauts to select samples that would best meet science requirements. We conducted three tests of this hypothesis over several days in the field. Our investigation was designed to document processes, tools and procedures for crew sampling of planetary targets. This was not meant to be a blind, controlled test of crew efficacy, but rather an effort to explicitly recognize the relevant variables that enter into sampling protocol and to be able to develop recommendations for crew and backroom training in future endeavors.
DeVries, R. J.; Hann, D. A.; Schramm, H.L.
2015-01-01
This study evaluated the effects of environmental parameters on the probability of capturing endangered pallid sturgeon (Scaphirhynchus albus) using trotlines in the lower Mississippi River. Pallid sturgeon were sampled by trotlines year round from 2008 to 2011. A logistic regression model indicated water temperature (T; P < 0.01) and depth (D; P = 0.03) had significant effects on capture probability (Y = −1.75 − 0.06T + 0.10D). Habitat type, surface current velocity, river stage, stage change and non-sturgeon bycatch were not significant predictors (P = 0.26–0.63). Although pallid sturgeon were caught throughout the year, the model predicted that sampling should focus on times when the water temperature is less than 12°C and in deeper water to maximize capture probability; these water temperature conditions commonly occur during November to March in the lower Mississippi River. Further, the significant effect of water temperature which varies widely over time, as well as water depth indicate that any efforts to use the catch rate to infer population trends will require the consideration of temperature and depth in standardized sampling efforts or adjustment of estimates.
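The fitted model reported above (Y = −1.75 − 0.06T + 0.10D) can be evaluated directly to compare capture probabilities across conditions. The sketch below simply plugs the published coefficients into the logistic function; the example temperature and depth values are hypothetical, and depth is assumed to be in meters:

```python
import math

def capture_probability(temp_c: float, depth_m: float) -> float:
    """Predicted pallid sturgeon trotline capture probability from the
    reported logistic regression: logit(p) = -1.75 - 0.06*T + 0.10*D."""
    logit = -1.75 - 0.06 * temp_c + 0.10 * depth_m
    return 1.0 / (1.0 + math.exp(-logit))

# Cold, deep water (10 C, 15 m) vs. warm, shallow water (25 C, 5 m)
p_cold_deep = capture_probability(10.0, 15.0)     # ~0.30
p_warm_shallow = capture_probability(25.0, 5.0)   # ~0.06
```

The direction of the comparison matches the paper's guidance to sample below 12°C and in deeper water to maximize capture probability.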
Vizentin-Bugoni, Jeferson; Maruyama, Pietro K; Debastiani, Vanderlei J; Duarte, L da S; Dalsgaard, Bo; Sazima, Marlies
2016-01-01
Virtually all empirical ecological interaction networks to some extent suffer from undersampling. However, how limitations imposed by sampling incompleteness affect our understanding of ecological networks is still poorly explored, which may hinder further advances in the field. Here, we use a plant-hummingbird network with unprecedented sampling effort (2716 h of focal observations) from the Atlantic Rainforest in Brazil, to investigate how sampling effort affects the description of network structure (i.e. widely used network metrics) and the relative importance of distinct processes (i.e. species abundances vs. traits) in determining the frequency of pairwise interactions. By dividing the network into time slices representing a gradient of sampling effort, we show that quantitative metrics, such as interaction evenness, specialization (H2 '), weighted nestedness (wNODF) and modularity (Q; QuanBiMo algorithm) were less biased by sampling incompleteness than binary metrics. Furthermore, the significance of some network metrics changed along the sampling effort gradient. Nevertheless, the higher importance of traits in structuring the network was apparent even with small sampling effort. Our results (i) warn against using very poorly sampled networks as this may bias our understanding of networks, both their patterns and structuring processes, (ii) encourage the use of quantitative metrics little influenced by sampling when performing spatio-temporal comparisons and (iii) indicate that in networks strongly constrained by species traits, such as plant-hummingbird networks, even small sampling is sufficient to detect their relative importance for the frequencies of interactions. Finally, we argue that similar effects of sampling are expected for other highly specialized subnetworks. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.
The RBANS Effort Index: base rates in geriatric samples.
Duff, Kevin; Spering, Cynthia C; O'Bryant, Sid E; Beglinger, Leigh J; Moser, David J; Bayless, John D; Culp, Kennith R; Mold, James W; Adams, Russell L; Scott, James G
2011-01-01
The Effort Index (EI) of the RBANS was developed to assist clinicians in discriminating patients who demonstrate good effort from those with poor effort. However, there are concerns that older adults might be unfairly penalized by this index, which uses uncorrected raw scores. Using five independent samples of geriatric patients with a broad range of cognitive functioning (e.g., cognitively intact, nursing home residents, probable Alzheimer's disease), base rates of failure on the EI were calculated. In cognitively intact and mildly impaired samples, few older individuals were classified as demonstrating poor effort (e.g., 3% in cognitively intact). However, in the more severely impaired geriatric patients, over one third had EI scores that fell above suggested cutoff scores (e.g., 37% in nursing home residents, 33% in probable Alzheimer's disease). In the cognitively intact sample, older and less educated patients were more likely to have scores suggestive of poor effort. Education effects were observed in three of the four clinical samples. Overall cognitive functioning was significantly correlated with EI scores, with poorer cognition being associated with greater suspicion of low effort. The current results suggest that age, education, and level of cognitive functioning should be taken into consideration when interpreting EI results and that significant caution is warranted when examining EI scores in elders suspected of having dementia.
Sampling procedures for throughfall monitoring: A simulation study
NASA Astrophysics Data System (ADS)
Zimmermann, Beate; Zimmermann, Alexander; Lark, Richard Murray; Elsenbeer, Helmut
2010-01-01
What is the most appropriate sampling scheme to estimate event-based average throughfall? A satisfactory answer to this seemingly simple question has yet to be found, a failure which we attribute to previous efforts' dependence on empirical studies. Here we try to answer this question by simulating stochastic throughfall fields based on parameters for statistical models of large monitoring data sets. We subsequently sampled these fields with different sampling designs and variable sample supports. We evaluated the performance of a particular sampling scheme with respect to the uncertainty of possible estimated means of throughfall volumes. Even for a relative error limit of 20%, an impractically large number of small, funnel-type collectors would be required to estimate mean throughfall, particularly for small events. While stratification of the target area is not superior to simple random sampling, cluster random sampling involves the risk of being less efficient. A larger sample support, e.g., the use of trough-type collectors, considerably reduces the necessary sample sizes and eliminates the sensitivity of the mean to outliers. Since the gain in time associated with the manual handling of troughs versus funnels depends on the local precipitation regime, the employment of automatically recording clusters of long troughs emerges as the most promising sampling scheme. Even so, a relative error of less than 5% appears out of reach for throughfall under heterogeneous canopies. We therefore suspect a considerable uncertainty of input parameters for interception models derived from measured throughfall, in particular, for those requiring data of small throughfall events.
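The trade-off between collector type and required sample size described above follows from the standard sample-size relation for estimating a mean, n ≈ (z·CV/e)², where CV is the coefficient of variation among individual collectors and e the acceptable relative error. The CV values below are hypothetical, chosen only to illustrate why small funnels (high CV) demand impractically many collectors while large-support troughs need far fewer:

```python
import math

def collectors_needed(cv: float, rel_error: float, z: float = 1.96) -> int:
    """Approximate number of independent collectors needed so the sample
    mean falls within +/- rel_error of the true mean with ~95% confidence
    (normal approximation; ignores spatial autocorrelation)."""
    return math.ceil((z * cv / rel_error) ** 2)

# Hypothetical CVs: small funnels ~1.0, long troughs ~0.4
funnels_20pct = collectors_needed(cv=1.0, rel_error=0.20)   # 97 funnels
troughs_20pct = collectors_needed(cv=0.4, rel_error=0.20)   # 16 troughs
funnels_5pct = collectors_needed(cv=1.0, rel_error=0.05)    # 1537 funnels
```

The last line shows why a 5% relative error is out of reach in practice under heterogeneous canopies: the required sample size grows with the inverse square of the error limit.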
Goal Based Testing: A Risk Informed Process
NASA Technical Reports Server (NTRS)
Everline, Chester; Smith, Clayton; Distefano, Sal; Goldin, Natalie
2014-01-01
A process for life demonstration testing is developed, which can reduce the number of resources required by conventional sampling theory while still maintaining the same degree of rigor and confidence level. This process incorporates state-of-the-art probabilistic thinking and is consistent with existing NASA guidance documentation. This view of life testing changes the paradigm of testing a system for many hours to show confidence that a system will last for the required number of years to one that focuses efforts and resources on exploring how the system can fail at end-of-life and building confidence that the failure mechanisms are understood and well mitigated.
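For context, the conventional sampling-theory baseline that this process improves upon can be sketched with the standard zero-failure (success-run) binomial relation, n = ln(1 − C)/ln(R): demonstrating reliability R at confidence C requires n unit-lifetimes of testing with no failures. The reliability and confidence values below are hypothetical:

```python
import math

def zero_failure_sample_size(reliability: float, confidence: float) -> int:
    """Number of units (or unit-lifetimes) that must be tested with zero
    failures to demonstrate `reliability` at the given `confidence`
    level, from the binomial success-run relation (1 - C) = R**n."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Demonstrating 95% reliability at 90% confidence the conventional way:
n = zero_failure_sample_size(0.95, 0.90)   # 45 failure-free unit-lifetimes
```

The steep resource cost implied by such numbers is what motivates shifting effort toward understanding and mitigating end-of-life failure mechanisms instead.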
NASA Technical Reports Server (NTRS)
Kiusalaas, J.; Reddy, G. B.
1977-01-01
A finite element program is presented for computer-automated, minimum weight design of elastic structures with constraints on stresses (including local instability criteria) and displacements. Volume 1 of the report contains the theoretical and user's manual of the program. Sample problems and the listing of the program are included in Volumes 2 and 3. The element subroutines are organized so as to facilitate additions and changes by the user. As a result, a relatively minor programming effort would be required to make DESAP 1 into a special purpose program to handle the user's specific design requirements and failure criteria.
Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.
1981-01-01
Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
Ten-minute analysis of drugs and metabolites in saliva by surface-enhanced Raman spectroscopy
NASA Astrophysics Data System (ADS)
Shende, Chetan; Inscore, Frank; Maksymiuk, Paul; Farquharson, Stuart
2005-11-01
Rapid analysis of drugs in emergency room overdose patients is critical to selecting appropriate medical care. Saliva analysis has long been considered an attractive alternative to blood plasma analysis for this application. However, current clinical laboratory analysis methods involve extensive sample extraction followed by gas chromatography and mass spectrometry, and typically require as much as one hour to perform. In an effort to overcome this limitation we have been investigating metal-doped sol-gels to both separate drugs and their metabolites from saliva and generate surface-enhanced Raman spectra. We have incorporated the sol-gel in a disposable lab-on-a-chip format, and generally no more than a drop of sample is required. The detailed molecular vibrational information allows chemical identification, while the increase in Raman scattering by six orders of magnitude or more allows detection of microg/mL concentrations. Measurements of cocaine, its metabolite benzoylecgonine, and several barbiturates are presented.
Development of composite calibration standard for quantitative NDE by ultrasound and thermography
NASA Astrophysics Data System (ADS)
Dayal, Vinay; Benedict, Zach G.; Bhatnagar, Nishtha; Harper, Adam G.
2018-04-01
Inspection of aircraft components for damage utilizing ultrasonic Non-Destructive Evaluation (NDE) is a time intensive endeavor. Additional time spent during aircraft inspections translates to added cost to the company performing them, and as such, reducing this expenditure is of great importance. There is also great variance in the calibration samples from one entity to another due to a lack of a common calibration set. By characterizing damage types, we can condense the required calibration sets and reduce the time required to perform calibration while also providing procedures for the fabrication of these standard sets. We present here our effort to fabricate composite samples with known defects and quantify the size and location of defects, such as delaminations, and impact damage. Ultrasonic and Thermographic images are digitally enhanced to accurately measure the damage size. Ultrasonic NDE is compared with thermography.
Ferriere, Michael; Van Ness, Brian
2013-01-01
The NCI funded cooperative group cancer clinical trial system develops experimental therapies and often collects patient samples for correlative research. The Cooperative Group Bank (CGB) system maintains biobanks with a current policy not to return research results to individuals. An online survey was created, and 10 directors of CGBs completed the surveys asking about understanding and attitudes in changing policies to consider return of Incidental Findings (IFs) and Individual Research Results (IRRs) of health significance. The potential impact of the ten consensus recommendations of Wolf et al. presented in this issue are examined. Re-identification of samples is often not problematic; however, changes to the current banking and clinical trial systems would require significant effort to fulfill an obligation of recontact of subjects. Additional resources, as well as a national advisory board would be required to standardize implementation. PMID:22382800
Revised ground-water monitoring compliance plan for the 300 area process trenches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schalla, R.; Aaberg, R.L.; Bates, D.J.
1988-09-01
This document contains ground-water monitoring plans for process-water disposal trenches located on the Hanford Site. These trenches, designated the 300 Area Process Trenches, have been used since 1973 for disposal of water that contains small quantities of both chemicals and radionuclides. The ground-water monitoring plans contained herein represent revision and expansion of an effort initiated in June 1985. At that time, a facility-specific monitoring program was implemented at the 300 Area Process Trenches as part of a regulatory compliance effort for hazardous chemicals being conducted on the Hanford Site. This monitoring program was based on the ground-water monitoring requirements for interim-status facilities, which are those facilities that do not yet have final permits, but are authorized to continue interim operations while engaged in the permitting process. The applicable monitoring requirements are described in the Resource Conservation and Recovery Act (RCRA), 40 CFR 265.90 of the federal regulations, and in WAC 173-303-400 of Washington State's regulations (Washington State Department of Ecology 1986). The program implemented for the process trenches was designed to be an alternate program, which is required instead of the standard detection program when a facility is known or suspected to have contaminated the ground water in the uppermost aquifer. The plans for the program, contained in a document prepared by the US Department of Energy (USDOE) in 1985, called for monthly sampling of 14 of the 37 existing monitoring wells at the 300 Area plus the installation and sampling of 2 new wells. 27 refs., 25 figs., 15 tabs.
Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers
Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.
2018-01-01
Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. 
Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which maximizes the use of available resources. Increased implementation of approaches that consider detection error promote ecological advancements and conservation and management decisions that are better informed.
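The replicate-haul recommendations above follow from the cumulative detection relation P(detected in at least one of n hauls) = 1 − (1 − p)ⁿ, assuming independent hauls. The per-haul detection probabilities below are hypothetical, chosen only to reproduce the reported ranges:

```python
import math

def hauls_required(p_per_haul: float, target: float = 0.95) -> int:
    """Seine hauls needed for cumulative detection probability to reach
    `target`, assuming independent hauls with a constant per-haul
    detection probability."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_per_haul))

# A moderately detectable species (hypothetical p = 0.40 per haul)
common = hauls_required(0.40)   # 6 hauls, within the reported 2-6 range
# A poorly detected species such as the Arkansas River Shiner
# (hypothetical p = 0.20 per haul)
rare = hauls_required(0.20)     # 14 hauls, i.e., "more than 10"
```

Because per-haul detection probability varies with depth, clarity, and temperature, the required effort under unfavorable sampling conditions rises quickly as p falls.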
Diurnal and Reproductive Stage-Dependent Variation of Parental Behaviour in Captive Zebra Finches
Morvai, Boglárka; Nanuru, Sabine; Mul, Douwe; Kusche, Nina; Milne, Gregory; Székely, Tamás; Komdeur, Jan; Miklósi, Ádám
2016-01-01
Parental care plays a key role in ontogeny, life-history trade-offs, sexual selection and intra-familial conflict. Studies focusing on understanding causes and consequences of variation in parental effort need to quantify parental behaviour accurately. The applied methods are, however, diverse even for a given species and type of parental effort, and rarely validated for accuracy. Here we focus on variability of parental behaviour from a methodological perspective to investigate the effect of different samplings on various estimates of parental effort. We used nest box cameras in a captive breeding population of zebra finches, Taeniopygia guttata, a widely used model system of sexual selection, intra-familial dynamics and parental care. We investigated diurnal and reproductive stage-dependent variation in parental effort (including incubation, brooding, nest attendance and number of feedings) based on 12h and 3h continuous video-recordings taken at various reproductive stages. We then investigated whether shorter (1h) sampling periods provided comparable estimates of overall parental effort and division of labour to those of longer (3h) sampling periods. Our study confirmed female-biased division of labour during incubation, and showed that the difference between female and male effort diminishes with advancing reproductive stage. We found individually consistent parental behaviours within given days of incubation and nestling provisioning. Furthermore, parental behaviour was consistent over the different stages of incubation; however, only female brooding was consistent over nestling provisioning. Parental effort during incubation did not predict parental effort during nestling provisioning. Our analyses revealed that 1h sampling may be influenced heavily by stochastic and diurnal variation. We suggest that a single longer sampling period (3h) may provide a consistent and accurate estimate for overall parental effort during incubation in zebra finches.
Due to the large within-individual variation, we suggest repeated longer sampling over the reproductive stage may be necessary for accurate estimates of parental effort post-hatching. PMID:27973549
Maret, Terry R.; Ott, D.S.
2004-01-01
width was determined to be sufficient for collecting an adequate number of fish to estimate species richness and evaluate biotic integrity. At most sites, about 250 fish were needed to effectively represent 95 percent of the species present. Fifty-three percent of the sites assessed, using an IBI developed specifically for large Idaho rivers, received scores of less than 50, indicating poor biotic integrity.
Advanced Extra-Vehicular Activity Pressure Garment Requirements Development
NASA Technical Reports Server (NTRS)
Ross, Amy; Aitchison, Lindsay; Rhodes, Richard
2015-01-01
The NASA Johnson Space Center advanced pressure garment technology development team is addressing requirements development for exploration missions. Lessons learned from the Z-2 high fidelity prototype development have reiterated that clear low-level requirements and verification methods reduce risk to the government, improve efficiency in pressure garment design efforts, and enable the government to be a smart buyer. The expectation is to provide requirements at the specification level that are validated so that their impact on pressure garment design is understood. Additionally, the team will provide defined verification protocols for the requirements. However, in reviewing exploration space suit high level requirements there are several gaps in the team's ability to define and verify related lower level requirements. This paper addresses the efforts in requirement areas such as mobility/fit/comfort and environmental protection (dust, radiation, plasma, secondary impacts) to determine the method by which the requirements can be defined and use of those methods for verification. Gaps exist at various stages. In some cases component level work is underway, but no system level effort has begun; in other cases no effort has been initiated to close the gap. Status of on-going efforts and potential approaches to open gaps are discussed.
Bryce, Thomas N.; Dijkers, Marcel P.
2015-01-01
Background: Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. Objective: To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. Methods: A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort were quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Results: Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. Conclusion: This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device. PMID:26364280
An Internationally Coordinated Science Management Plan for Samples Returned from Mars
NASA Astrophysics Data System (ADS)
Haltigin, T.; Smith, C. L.
2015-12-01
Mars Sample Return (MSR) remains a high priority of the planetary exploration community. Such an effort will undoubtedly be too large for any individual agency to conduct on its own, and thus will require extensive global cooperation. To help prepare for an eventual MSR campaign, the International Mars Exploration Working Group (IMEWG) chartered the international Mars Architecture for the Return of Samples (iMARS) Phase II working group in 2014, consisting of representatives from 17 countries and agencies. The overarching task of the team was to provide recommendations for progressing towards campaign implementation, including a proposed science management plan. Building upon the iMARS Phase I (2008) outcomes, the Phase II team proposed the development of an International MSR Science Institute as part of the campaign governance, centering its deliberations around four themes: Organization: including an organizational structure for the Institute that outlines roles and responsibilities of key members and describes sample return facility requirements; Management: presenting issues surrounding scientific leadership, defining guidelines and assumptions for Institute membership, and proposing a possible funding model; Operations & Data: outlining a science implementation plan that details the preliminary sample examination flow, sample allocation process, and data policies; and Curation: introducing a sample curation plan that comprises sample tracking and routing procedures, sample sterilization considerations, and long-term archiving recommendations. This work presents a summary of the group's activities, findings, and recommendations, highlighting the role of international coordination in managing the returned samples.
Urbanization and the Carbon Cycle: Synthesis of Ongoing Research
NASA Astrophysics Data System (ADS)
Gurney, K. R.; Duren, R. M.; Hutyra, L.; Ehleringer, J. R.; Patarasuk, R.; Song, Y.; Huang, J.; Davis, K.; Kort, E. A.; Shepson, P. B.; Turnbull, J. C.; Lauvaux, T.; Rao, P.; Eldering, A.; Miller, C. E.; Wofsy, S.; McKain, K.; Mendoza, D. L.; Lin, J. C.; Sweeney, C.; Miles, N. L.; Richardson, S.; Cambaliza, M. O. L.
2015-12-01
Given the explosive growth in urbanization and its dominant role in current and future global greenhouse gas emissions, urban areas have received increasing research attention from the carbon cycle science community. The emerging focus is driven by the increasingly dense atmospheric observing capabilities - ground and space-based - in addition to the rising profile of cities within international climate change policymaking. Dominated by anthropogenic emissions, urban carbon cycle research requires a cross-disciplinary perspective with contributions from disciplines such as engineering, economics, social theory, and atmospheric science. We review the recent results from a sample of the active urban carbon research efforts including the INFLUX experiment (Indianapolis), the Megacity carbon project (Los Angeles), Salt Lake City, and Boston. Each of these efforts represents a unique approach in pursuit of different scientific and policy questions and assists in setting priorities for future research. From top-down atmospheric measurement systems to bottom-up estimation, these research efforts offer a view of the challenges and opportunities in urban carbon cycle research.
Generic particulate-monitoring system for retrofit to Hanford exhaust stacks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camman, J.W.; Carbaugh, E.H.
1982-11-01
Evaluations of 72 sampling and monitoring systems were performed at Hanford as the initial phase of a program to upgrade such systems. Each evaluation included determination of theoretical sampling efficiencies for particle sizes ranging from 0.5 to 10 micrometers aerodynamic equivalent diameter, addressing anisokinetic bias, sample transport line losses, and collector device efficiency. Upgrades needed to meet current Department of Energy guidance for effluent sampling and monitoring were identified, and a cost for each upgrade was estimated. A relative priority for each system's upgrade was then established based on evaluation results, current operational status, and future plans for the facility being exhausted. Common system upgrade requirements led to the development of a generic design for common components of an exhaust stack sampling and monitoring system for airborne radioactive particulates. The generic design consists of commercially available off-the-shelf components to the extent practical and will simplify future stack sampling and monitoring system design, fabrication, and installation efforts. Evaluation results and their significance to system upgrades are emphasized. A brief discussion of the analytical models used and experience to date with the upgrade program is included. Development of the generic stack sampling and monitoring system design is outlined. Generic system design features and limitations are presented. Requirements for generic system retrofitting to existing exhaust stacks are defined and benefits derived from generic system application are discussed.
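As an illustrative sketch of the per-particle-size evaluation described above (hypothetical stage efficiencies, not Hanford data), the theoretical overall sampling efficiency at each aerodynamic diameter can be taken as the product of the aspiration (anisokinetic), transport-line, and collector-device efficiencies:

```python
import math

def overall_efficiency(*stages):
    """Theoretical overall sampling efficiency per particle size:
    the product of the stage efficiencies at each aerodynamic
    equivalent diameter (micrometers)."""
    sizes = stages[0].keys()
    return {d: math.prod(stage[d] for stage in stages) for d in sizes}

# Hypothetical stage efficiencies at 0.5 and 10 micrometers
aspiration = {0.5: 0.98, 10: 0.75}   # anisokinetic bias
transport  = {0.5: 0.95, 10: 0.60}   # transport line losses
collector  = {0.5: 0.99, 10: 0.99}   # collector device efficiency
print(overall_efficiency(aspiration, transport, collector))
```

Larger particles typically lose efficiency in the transport line, which is why such evaluations are reported across the full 0.5-10 micrometer range.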
The RBANS Effort Index: Base rates in geriatric samples
Duff, Kevin; Spering, Cynthia C.; O’Bryant, Sid E.; Beglinger, Leigh J.; Moser, David J.; Bayless, John D.; Culp, Kennith R.; Mold, James W.; Adams, Russell L.; Scott, James G.
2011-01-01
The Effort Index (EI) of the RBANS was developed to assist clinicians in discriminating patients who demonstrate good effort from those with poor effort. However, there are concerns that older adults might be unfairly penalized by this index, which uses uncorrected raw scores. Using five independent samples of geriatric patients with a broad range of cognitive functioning (e.g., cognitively intact, nursing home residents, probable Alzheimer’s disease), base rates of failure on the EI were calculated. In cognitively intact and mildly impaired samples, few older individuals were classified as demonstrating poor effort (e.g., 3% in cognitively intact). However, in the more severely impaired geriatric patients, over one third had EI scores that fell above suggested cut-off scores (e.g., 37% in nursing home residents, 33% in probable Alzheimer’s disease). In the cognitively intact sample, older and less educated patients were more likely to have scores suggestive of poor effort. Education effects were observed in 3 of the 4 clinical samples. Overall cognitive functioning was significantly correlated with EI scores, with poorer cognition being associated with greater suspicion of low effort. The current results suggest that age, education, and level of cognitive functioning should be taken into consideration when interpreting EI results and that significant caution is warranted when examining EI scores in elders suspected of having dementia. PMID:21390895
Mulder, Leontine; van der Molen, Renate; Koelman, Carin; van Leeuwen, Ester; Roos, Anja; Damoiseaux, Jan
2018-05-01
ISO 15189:2012 requires validation of methods used in the medical laboratory, and lists a series of performance parameters for consideration to include. Although these performance parameters are feasible for clinical chemistry analytes, application in the validation of autoimmunity tests is a challenge. Lack of gold standards or reference methods in combination with the scarcity of well-defined diagnostic samples of patients with rare diseases make validation of new assays difficult. The present manuscript describes the initiative of Dutch medical immunology laboratory specialists to combine efforts and perform multi-center validation studies of new assays in the field of autoimmunity. Validation data and reports are made available to interested Dutch laboratory specialists. Copyright © 2018 Elsevier B.V. All rights reserved.
Survey methods for assessing land cover map accuracy
Nusser, S.M.; Klaas, E.E.
2003-01-01
The increasing availability of digital photographic materials has fueled efforts by agencies and organizations to generate land cover maps for states, regions, and the United States as a whole. Regardless of the information sources and classification methods used, land cover maps are subject to numerous sources of error. In order to understand the quality of the information contained in these maps, it is desirable to generate statistically valid estimates of accuracy rates describing misclassification errors. We explored a full sample survey framework for creating accuracy assessment study designs that balance statistical and operational considerations in relation to study objectives for a regional assessment of GAP land cover maps. We focused not only on appropriate sample designs and estimation approaches, but also on aspects of the data collection process, such as gaining cooperation of land owners and using pixel clusters as an observation unit. The approach was tested in a pilot study to assess the accuracy of Iowa GAP land cover maps. A stratified two-stage cluster sampling design addressed sample size requirements for land covers and the need for geographic spread while minimizing operational effort. Recruitment methods used for private land owners yielded high response rates, minimizing a source of nonresponse error. Collecting data for a 9-pixel cluster centered on the sampled pixel was simple to implement, and provided better information on rarer vegetation classes as well as substantial gains in precision relative to observing data at a single pixel.
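The accuracy rates discussed above can be sketched minimally (hypothetical labels, Python): overall accuracy is the fraction of sampled pixels whose map class matches the reference class observed in the field, and user's accuracy is the same fraction computed within each mapped class.

```python
def accuracy_rates(map_labels, ref_labels):
    """Overall accuracy and per-class user's accuracy from paired
    map/reference class labels at sampled pixels."""
    pairs = list(zip(map_labels, ref_labels))
    overall = sum(m == r for m, r in pairs) / len(pairs)
    users = {}
    for c in set(map_labels):
        mapped = [(m, r) for m, r in pairs if m == c]
        users[c] = sum(m == r for m, r in mapped) / len(mapped)
    return overall, users

# Hypothetical field reference data for four sampled pixels
overall, users = accuracy_rates(
    ["forest", "forest", "water", "crop"],
    ["forest", "crop", "water", "crop"])
print(overall)          # 0.75
print(users["forest"])  # 0.5
```

A design-based assessment would additionally weight each sampled pixel by its inclusion probability under the stratified two-stage design; the sketch above assumes equal weights.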
Strotman, Lindsay N; Lin, Guangyun; Berry, Scott M; Johnson, Eric A; Beebe, David J
2012-09-07
Extraction and purification of DNA is a prerequisite to detection and analytical techniques. While DNA sample preparation methods have improved over the last few decades, current methods are still time consuming and labor intensive. Here we demonstrate a technology termed IFAST (Immiscible Filtration Assisted by Surface Tension), which relies on immiscible phase filtration to reduce the time and effort required to purify DNA. IFAST replaces the multiple wash and centrifugation steps required by traditional DNA sample preparation methods with a single step. To operate, DNA from lysed cells is bound to paramagnetic particles (PMPs) and drawn through an immiscible fluid phase barrier (i.e. oil) by an external handheld magnet. Purified DNA is then eluted from the PMPs. Here, detection of Clostridium botulinum type A (BoNT/A) in food matrices (milk, orange juice), a bioterrorism concern, was used as a model system to establish IFAST's utility in detection assays. Data validated that the DNA purified by IFAST was functional as a qPCR template to amplify the bont/A gene. The sensitivity limit of IFAST was comparable to the commercially available Invitrogen ChargeSwitch® method. Notably, pathogen detection via IFAST required only 8.5 μL of sample and was accomplished in five-fold less time. The simplicity, rapidity and portability of IFAST offer significant advantages when compared to existing DNA sample preparation methods.
Early detection monitoring for larval dreissenid mussels: How much plankton sampling is enough?
Counihan, Timothy D.; Bollens, Stephen M.
2017-01-01
The development of quagga and zebra mussel (dreissenids) monitoring programs in the Pacific Northwest provides a unique opportunity to evaluate a regional invasive species detection effort early in its development. Recent studies suggest that the ecological and economic costs of a dreissenid infestation in the Pacific Northwest of the USA would be significant. Consequently, efforts are underway to monitor for the presence of dreissenids. However, assessments of whether these efforts provide for early detection are lacking. We use information collected from 2012 to 2014 to characterize the development of larval dreissenid monitoring programs in the states of Idaho, Montana, Oregon, and Washington in the context of introduction and establishment risk. We also estimate the effort needed for high-probability detection of rare planktonic taxa in four Columbia and Snake River reservoirs and assess whether the current level of effort provides for early detection. We found that the effort expended to monitor for dreissenid mussels increased substantially from 2012 to 2014, that efforts were distributed across risk categories ranging from high to very low, and that substantial gaps in our knowledge of both introduction and establishment risk exist. The estimated volume of filtered water required to fully census planktonic taxa or to provide high-probability detection of rare taxa was high for the four reservoirs examined. We conclude that the current level of effort expended does not provide for high-probability detection of larval dreissenids or other planktonic taxa when they are rare in these reservoirs. We discuss options to improve early detection capabilities.
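The "how much sampling is enough" question above has a simple first-order form. As an illustrative sketch (not the authors' model), if each plankton sample independently detects a rare taxon with probability p, the effort required for a target detection probability follows from P(at least one detection) = 1 - (1 - p)^n:

```python
import math

def samples_for_detection(p_per_sample, target=0.95):
    """Number of independent samples needed so that the probability
    of at least one detection reaches `target`, given a per-sample
    detection probability `p_per_sample`."""
    return math.ceil(math.log(1 - target) / math.log(1 - p_per_sample))

# If a single tow detects rare dreissenid veligers with probability 0.05,
# high-probability (95%) detection needs:
print(samples_for_detection(0.05))  # 59 tows
```

The steep growth of the required n as p shrinks is why the authors find that current effort levels fall short for rare planktonic taxa.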
Control and Effort Costs Influence the Motivational Consequences of Choice
Sullivan-Toole, Holly; Richey, John A.; Tricomi, Elizabeth
2017-01-01
The act of making a choice, apart from any outcomes the choice may yield, has, paradoxically, been linked to both the enhancement and the detriment of intrinsic motivation. Research has implicated two factors in potentially mediating these contradictory effects: the personal control conferred by a choice and the costs associated with a choice. Across four experiments, utilizing a physical effort task disguised as a simple video game, we systematically varied costs across two levels of physical effort requirements (Low-Requirement, High-Requirement) and control over effort costs across three levels of choice (Free-Choice, Restricted-Choice, and No-Choice) to disambiguate how these factors affect the motivational consequences of choosing within an effortful task. Together, our results indicated that, in the face of effort requirements, illusory control alone may not sufficiently enhance perceptions of personal control to boost intrinsic motivation; rather, the experience of actual control may be necessary to overcome effort costs and elevate performance. Additionally, we demonstrated that conditions of illusory control, while otherwise unmotivating, can, through association with the experience of free choice, be transformed to have a positive effect on motivation. PMID:28515705
Sampling effort affects multivariate comparisons of stream assemblages
Cao, Y.; Larsen, D.P.; Hughes, R.M.; Angermeier, P.L.; Patton, T.M.
2002-01-01
Multivariate analyses are used widely for determining patterns of assemblage structure, inferring species-environment relationships and assessing human impacts on ecosystems. The estimation of ecological patterns often depends on sampling effort, so the degree to which sampling effort affects the outcome of multivariate analyses is a concern. We examined the effect of sampling effort on site and group separation, which was measured using a mean similarity method. Two similarity measures, the Jaccard Coefficient and Bray-Curtis Index, were investigated with 1 benthic macroinvertebrate and 2 fish data sets. Site separation was significantly improved with increased sampling effort because the similarity between replicate samples of a site increased more rapidly than between sites. Similarly, the faster increase in similarity between sites of the same group than between sites of different groups caused clearer separation between groups. The strength of site and group separation completely stabilized only when the mean similarity between replicates reached 1. These results are applicable to commonly used multivariate techniques such as cluster analysis and ordination because these multivariate techniques start with a similarity matrix. Completely stable outcomes of multivariate analyses are not feasible. Instead, we suggest 2 criteria for estimating the stability of multivariate analyses of assemblage data: 1) mean within-site similarity across all sites compared, indicating sample representativeness, and 2) the SD of within-site similarity across sites, measuring sample comparability.
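A minimal sketch of the two similarity measures named above (toy species-count vectors, Python): the Jaccard coefficient uses presence-absence only, while Bray-Curtis uses abundances, which is why the two can respond differently to added sampling effort.

```python
def jaccard(a, b):
    """Jaccard coefficient on presence-absence data (1 = identical)."""
    present_a = {i for i, x in enumerate(a) if x > 0}
    present_b = {i for i, x in enumerate(b) if x > 0}
    union = present_a | present_b
    return len(present_a & present_b) / len(union) if union else 1.0

def bray_curtis(a, b):
    """Bray-Curtis similarity on abundance data (1 = identical)."""
    shared = sum(min(x, y) for x, y in zip(a, b))
    total = sum(a) + sum(b)
    return 2 * shared / total if total else 1.0

# Two replicate samples from the same site: counts of 5 species
rep1 = [12, 0, 3, 5, 0]
rep2 = [10, 1, 4, 5, 0]
print(jaccard(rep1, rep2))      # 0.75: 3 shared species of 4 observed
print(bray_curtis(rep1, rep2))  # 0.9
```

Within-site similarity computed this way, averaged over replicates, is exactly the quantity the authors track as sampling effort increases.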
IT solutions for privacy protection in biobanking.
Eder, J; Gottweis, H; Zatloukal, K
2012-01-01
Biobanks containing human biological samples and associated data are key resources for the advancement of medical research. Efficient access to samples and data increases competitiveness in medical research, reduces effort and time for achieving scientific results and promotes scientific progress. In order to address upcoming health challenges, there is increasing need for transnational collaboration. This requires innovative solutions improving interoperability of biobanks in fields such as sample and data management as well as governance including ethical and legal frameworks. In this context, rights and expectations of donors to determine the usage of their biological material and data and to ensure their privacy have to be observed. We discuss the benefits of biobanks, the needs to support medical research and the societal demands and regulations, in particular, securing the rights of donors and present IT solutions that allow both to maintain the security of personal data and to increase the efficiency of access to data in biobanks. Disclosure filters are discussed as a strategy to combine European public expectations concerning informed consent with the requirements of biobank research. Copyright © 2012 S. Karger AG, Basel.
Aging and cardiovascular complexity: effect of the length of RR tachograms
Nagaraj, Nithin
2016-01-01
As we age, our hearts undergo changes that result in a reduction in complexity of physiological interactions between different control mechanisms. This results in a potential risk of cardiovascular diseases which are the number one cause of death globally. Since cardiac signals are nonstationary and nonlinear in nature, complexity measures are better suited to handle such data. In this study, three complexity measures are used, namely Lempel–Ziv complexity (LZ), Sample Entropy (SampEn) and Effort-To-Compress (ETC). We determined the minimum length of RR tachogram required for characterizing complexity of healthy young and healthy old hearts. All the three measures indicated significantly lower complexity values for older subjects than younger ones. However, the minimum length of heart-beat interval data needed differs for the three measures, with LZ and ETC needing as low as 10 samples, whereas SampEn requires at least 80 samples. Our study indicates that complexity measures such as LZ and ETC are good candidates for the analysis of cardiovascular dynamics since they are able to work with very short RR tachograms. PMID:27957395
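Of the three complexity measures, Sample Entropy is straightforward to sketch. Below is a minimal, unoptimized implementation (assuming the common convention m = 2 and r = 0.2·SD; this is an illustration, not the code used in the study):

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn = -ln(A/B), where B counts template-pair matches of
    length m and A of length m+1 (self-matches excluded).
    Simplified sketch: uses all available windows at each length."""
    if r is None:
        mean = sum(x) / len(x)
        sd = (sum((v - mean) ** 2 for v in x) / len(x)) ** 0.5
        r = 0.2 * sd  # tolerance: 20% of the series SD

    def match_pairs(length):
        windows = [x[i:i + length] for i in range(len(x) - length + 1)]
        pairs = 0
        for i in range(len(windows)):
            for j in range(i + 1, len(windows)):
                if max(abs(a - b) for a, b in zip(windows[i], windows[j])) <= r:
                    pairs += 1
        return pairs

    b, a = match_pairs(m), match_pairs(m + 1)
    return -math.log(a / b) if a and b else float("inf")

# A perfectly alternating RR series is highly regular:
print(sample_entropy([1, 2, 1, 2, 1, 2, 1, 2]))  # ln(1.5) ≈ 0.405
```

The study's finding that SampEn needs at least ~80 samples reflects the pair-counting above: with very short tachograms, A often reaches zero and the estimate becomes undefined.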
A Diaper Pad for Diaper-Based Urine Collection and Colorimetric Screening of Urinary Biomarkers.
Karlsen, Haakon; Dong, Tao; Suo, Zhenhe
2018-05-01
The high prevalence of urinary tract infection in aging adults is a challenging aspect of geriatric care. Incontinence and cognitive/functional impairment make collection of urine samples difficult and often require either catheterization for sample collection, which is a risk factor for infections, or more lenient criteria for initiating antibiotic treatment. We report the development of a diaper inlay with absorbent materials, a superabsorbent polymer-based valve, and chemical reaction pads for rapid screening of urinary tract infection in incontinent, diaper-wearing elderly recipients of home care services. The developed diaper inlay was capable of collecting, isolating, and analyzing samples and of retaining results for > 8 h. The diaper inlay can therefore be compatible with the diaper changing routines of nurses in home care services, without requiring much time or effort. A nurse can insert a diaper inlay in a diaper and the results can be recorded during a later diaper change. Although the research focuses on tools for home care services, the nursing home sector has similar problems and may benefit from technological development for rapid screening to avoid unnecessary catheterization and overuse of antibiotics.
NASA Technical Reports Server (NTRS)
Allton, J. H.; Zeigler, R. A.; Calaway, M. J.
2016-01-01
The Lunar Receiving Laboratory (LRL) was planned and constructed in the 1960s to support the Apollo program in the context of landing on the Moon and safely returning humans. The enduring science return from that effort is a result of careful curation of planetary materials. Technical decisions for the first facility included sample handling environment (vacuum vs inert gas), and instruments for making basic sample assessment, but the most difficult decision, and most visible, was stringent biosafety vs ultra-clean sample handling. Biosafety required handling of samples in negative pressure gloveboxes and rooms for containment and use of sterilizing protocols and animal/plant models for hazard assessment. Ultra-clean sample handling worked best in positive pressure nitrogen environment gloveboxes in positive pressure rooms, using cleanable tools of tightly controlled composition. The requirements for these two objectives were so different, that the solution was to design and build a new facility for specific purpose of preserving the scientific integrity of the samples. The resulting Lunar Curatorial Facility was designed and constructed, from 1972-1979, with advice and oversight by a very active committee comprised of lunar sample scientists. The high precision analyses required for planetary science are enabled by stringent contamination control of trace elements in the materials and protocols of construction (e.g., trace element screening for paint and flooring materials) and the equipment used in sample handling and storage. As other astromaterials, especially small particles and atoms, were added to the collections curated, the technical tension between particulate cleanliness and organic cleanliness was addressed in more detail. Techniques for minimizing particulate contamination in sample handling environments use high efficiency air filtering techniques typically requiring organic sealants which offgas. 
Protocols for reducing adventitious carbon on sample handling surfaces often generate particles. Further work is needed to achieve both minimal particulate and adventitious carbon contamination. This paper will discuss these facility topics and others in the historical context of nearly 50 years' curation experience for lunar rocks and regolith, meteorites, cosmic dust, comet particles, solar wind atoms, and asteroid particles at Johnson Space Center.
Embedding clinical interventions into observational studies
Newman, Anne B.; Avilés-Santa, M. Larissa; Anderson, Garnet; Heiss, Gerardo; Howard, Wm. James; Krucoff, Mitchell; Kuller, Lewis H.; Lewis, Cora E.; Robinson, Jennifer G.; Taylor, Herman; Treviño, Roberto P.; Weintraub, William
2017-01-01
Novel approaches to observational studies and clinical trials could improve the cost-effectiveness and speed of translation of research. Hybrid designs that combine elements of clinical trials with observational registries or cohort studies should be considered as part of a long-term strategy to transform clinical trials and epidemiology, adapting to the opportunities of big data and the challenges of constrained budgets. Important considerations include study aims, timing, breadth and depth of the existing infrastructure that can be leveraged, participant burden, likely participation rate and available sample size in the cohort, required sample size for the trial, and investigator expertise. Community engagement and stakeholder (including study participants) support are essential for these efforts to succeed. PMID:26611435
An adaptive two-stage sequential design for sampling rare and clustered populations
Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.
2008-01-01
How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
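The allocation idea above can be sketched with hypothetical numbers (this is an illustration of the general principle, not the authors' estimator): after an initial subsample in each primary unit, remaining second-stage effort is directed toward units whose initial counts suggest higher abundance.

```python
def allocate_second_stage(initial_counts, extra_units, threshold=1):
    """Adaptive second-stage allocation: primary units whose initial
    subsample count meets `threshold` share the extra secondary-unit
    effort in proportion to their counts. Simple rounding is used,
    so totals may be off by a unit or two."""
    eligible = {i: c for i, c in enumerate(initial_counts) if c >= threshold}
    alloc = {i: 0 for i in range(len(initial_counts))}
    total = sum(eligible.values())
    if total == 0:
        return alloc  # nothing detected: no adaptive effort triggered
    for i, c in eligible.items():
        alloc[i] = round(extra_units * c / total)
    return alloc

# Initial counts in 5 primary units; 20 extra secondary units to place
print(allocate_second_stage([0, 4, 1, 0, 5], 20))
# {0: 0, 1: 8, 2: 2, 3: 0, 4: 10}
```

Units with zero initial counts receive no extra effort, which is how the design concentrates sampling on the rare, clustered portion of the population.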
Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements
NASA Astrophysics Data System (ADS)
Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga
The increased awareness of the non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic approach to the early requirements-based effort estimation, based on Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
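The paper's pairing of a functional size measurement with a linear regression technique can be sketched as follows (hypothetical project data; single-variable ordinary least squares, whereas the paper's model also incorporates non-functional requirements):

```python
def fit_effort_model(size, effort):
    """Ordinary least-squares fit of effort = a + b * functional_size."""
    n = len(size)
    mx, my = sum(size) / n, sum(effort) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(size, effort))
         / sum((x - mx) ** 2 for x in size))
    a = my - b * mx
    return a, b

# Hypothetical historical projects: functional size points vs person-hours
sizes = [120, 250, 310, 480, 600]
efforts = [800, 1500, 2000, 3100, 3900]
a, b = fit_effort_model(sizes, efforts)
print(round(a + b * 400))  # predicted effort for a 400-point project
```

In the paper's approach, the non-functional requirements identified via the ontology would adjust the size input (or add regressors) before a fit like this is made.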
Minimizing Experimental Setup Time and Effort at APS beamline 1-ID through Instrumentation Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benda, Erika; Almer, Jonathan; Kenesei, Peter
2016-01-01
Sector 1-ID at the APS accommodates a number of different experimental techniques in the same spatial envelope of the E-hutch end station. These include high-energy small and wide angle X-ray scattering (SAXS and WAXS), high-energy diffraction microscopy (HEDM, both near and far field modes) and high-energy X-ray tomography. These techniques are frequently combined to allow the users to obtain multimodal data, often attaining 1 μm spatial resolution and <0.05° angular resolution. Furthermore, these techniques are utilized while the sample is thermo-mechanically loaded to mimic real operating conditions. The instrumentation required for each of these techniques and environments has been designed and configured in a modular way with a focus on stability and repeatability between changeovers. This approach allows the end station to be more versatile, capable of collecting multi-modal data in-situ while reducing the time and effort typically required for setup and alignment, resulting in more efficient beam time use. Key instrumentation design features and layout of the end station are presented.
Forsthoefel, David J; Waters, Forrest A; Newmark, Phillip A
2014-12-21
Efforts to elucidate the cellular and molecular mechanisms of regeneration have required the application of methods to detect specific cell types and tissues in a growing cohort of experimental animal models. For example, in the planarian Schmidtea mediterranea, substantial improvements to nucleic acid hybridization and electron microscopy protocols have facilitated the visualization of regenerative events at the cellular level. By contrast, immunological resources have been slower to emerge. Specifically, the repertoire of antibodies recognizing planarian antigens remains limited, and a more systematic approach is needed to evaluate the effects of processing steps required during sample preparation for immunolabeling. To address these issues and to facilitate studies of planarian digestive system regeneration, we conducted a monoclonal antibody (mAb) screen using phagocytic intestinal cells purified from the digestive tracts of living planarians as immunogens. This approach yielded ten antibodies that recognized intestinal epitopes, as well as markers for the central nervous system, musculature, secretory cells, and epidermis. In order to improve signal intensity and reduce non-specific background for a subset of mAbs, we evaluated the effects of fixation and other steps during sample processing. We found that fixative choice, treatments to remove mucus and bleach pigment, as well as methods for tissue permeabilization and antigen retrieval profoundly influenced labeling by individual antibodies. These experiments led to the development of a step-by-step workflow for determining optimal specimen preparation for labeling whole planarians as well as unbleached histological sections. We generated a collection of monoclonal antibodies recognizing the planarian intestine and other tissues; these antibodies will facilitate studies of planarian tissue morphogenesis. 
We also developed a protocol for optimizing specimen processing that will accelerate future efforts to generate planarian-specific antibodies, and to extend functional genetic studies of regeneration to post-transcriptional aspects of gene expression, such as protein localization or modification. Our efforts demonstrate the importance of systematically testing multiple approaches to species-specific idiosyncrasies, such as mucus removal and pigment bleaching, and may serve as a template for the development of immunological resources in other emerging model organisms.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-18
... improvements to the design and management of survey and estimation methods used to produce marine recreational... to these effort surveys provides for continued pilot testing of effort sampling designs that use both... NMFS determine what specific sampling design to use in MRIP effort surveys on the Atlantic and Gulf...
Sampling procedures for inventory of commercial volume tree species in Amazon Forest.
Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R
2017-01-01
The spatial distribution of tropical tree species can affect the consistency of estimators in commercial forest inventories; appropriate sampling procedures are therefore required to survey species with different spatial patterns in the Amazon Forest. The present study evaluates conventional sampling procedures and introduces adaptive cluster sampling for volumetric inventories of Amazonian tree species, under the hypotheses that density, spatial distribution, and zero-plots affect the consistency of the estimators, and that adaptive cluster sampling yields more accurate volumetric estimates. We used data from a census carried out in the Jamari National Forest, Brazil, where trees with diameters of 40 cm or more were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling, and adaptive cluster sampling, and the accuracy of the volumetric estimates and the presence of zero-plots were evaluated. The sampling procedures were affected by the low density of trees and the large number of zero-plots; the adaptive clusters concentrated the sampling effort in plots containing trees and thus assembled more representative samples for estimating commercial volume.
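Adaptive cluster sampling, which the study introduces for these inventories, expands the sample around any plot that meets a trigger condition, concentrating effort where trees actually occur. A minimal sketch of the mechanism (the grid layout, rook neighbourhood, and trigger threshold are illustrative assumptions, not the authors' field protocol):

```python
def adaptive_cluster_sample(grid, initial, condition):
    """Adaptive cluster sampling on a grid of plots.

    grid: dict mapping (row, col) -> plot value (e.g., merchantable stems)
    initial: plot coordinates drawn by simple random sampling
    condition: a plot triggers neighbourhood expansion if condition(value)
    Returns the full set of sampled plot coordinates.
    """
    sampled = set(initial)
    frontier = [p for p in sampled if condition(grid[p])]
    while frontier:
        r, c = frontier.pop()
        # expand to the four rook neighbours of any triggering plot
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if nb in grid and nb not in sampled:
                sampled.add(nb)
                if condition(grid[nb]):
                    frontier.append(nb)
    return sampled

# Toy 4x4 grid: stems concentrated in two adjacent plots, many zero-plots
grid = {(r, c): 0 for r in range(4) for c in range(4)}
grid[(1, 1)] = 5
grid[(1, 2)] = 3
cluster = adaptive_cluster_sample(grid, [(1, 1), (3, 3)], lambda v: v > 0)
```

The zero-plot at (3, 3) stays a single sampled unit, while the occupied plot at (1, 1) grows into a network covering the whole patch. Unbiased volume estimation from such a sample requires design-based weighting over networks (e.g., Hansen-Hurwitz or Horvitz-Thompson estimators), which this sketch omits.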
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1973-01-01
The advanced technology requirements for an advanced high speed commercial transport engine are presented. The results of the phase 3 effort cover the requirements and objectives for future aircraft propulsion systems. These requirements reflect the results of the Task 1 and 2 efforts and serve as a baseline for future evaluations, specification development efforts, contract/purchase agreements, and operational plans for future subsonic commercial engines. This report is divided into five major sections: (1) management objectives for commercial propulsion systems, (2) performance requirements for commercial transport propulsion systems, (3) design criteria for future transport engines, (4) design requirements for powerplant packages, and (5) testing.
A call for transparent reporting to optimize the predictive value of preclinical research
Landis, Story C.; Amara, Susan G.; Asadullah, Khusru; Austin, Chris P.; Blumenstein, Robi; Bradley, Eileen W.; Crystal, Ronald G.; Darnell, Robert B.; Ferrante, Robert J.; Fillit, Howard; Finkelstein, Robert; Fisher, Marc; Gendelman, Howard E.; Golub, Robert M.; Goudreau, John L.; Gross, Robert A.; Gubitz, Amelie K.; Hesterlee, Sharon E.; Howells, David W.; Huguenard, John; Kelner, Katrina; Koroshetz, Walter; Krainc, Dimitri; Lazic, Stanley E.; Levine, Michael S.; Macleod, Malcolm R.; McCall, John M.; Moxley, Richard T.; Narasimhan, Kalyani; Noble, Linda J.; Perrin, Steve; Porter, John D.; Steward, Oswald; Unger, Ellis; Utz, Ursula; Silberberg, Shai D.
2012-01-01
The US National Institute of Neurological Disorders and Stroke convened major stakeholders in June 2012 to discuss how to improve the methodological reporting of animal studies in grant applications and publications. The main workshop recommendation is that at a minimum studies should report on sample-size estimation, whether and how animals were randomized, whether investigators were blind to the treatment, and the handling of data. We recognize that achieving a meaningful improvement in the quality of reporting will require a concerted effort by investigators, reviewers, funding agencies and journal editors. Requiring better reporting of animal studies will raise awareness of the importance of rigorous study design to accelerate scientific progress. PMID:23060188
Advanced sampling techniques for hand-held FT-IR instrumentation
NASA Astrophysics Data System (ADS)
Arnó, Josep; Frunzi, Michael; Weber, Chris; Levy, Dustin
2013-05-01
FT-IR spectroscopy is the technology of choice to identify solid and liquid phase unknown samples. The challenging ConOps in emergency response and military field applications require a significant redesign of the stationary FT-IR bench-top instruments typically used in laboratories. Specifically, field portable units require high levels of resistance against mechanical shock and chemical attack, ease of use in restrictive gear, extreme reliability, quick and easy interpretation of results, and reduced size. In the last 20 years, FT-IR instruments have been re-engineered to fit in small suitcases for field portable use and recently further miniaturized for handheld operation. This article introduces the HazMatID™ Elite, an FT-IR instrument designed to balance the portability advantages of a handheld device with the performance challenges associated with miniaturization. In this paper, special focus will be given to the HazMatID Elite's sampling interfaces, optimized to collect and interrogate different types of samples: accumulated material using the on-board ATR press, dispersed powders using the ClearSampler™ tool, and the touch-to-sample sensor for direct liquid sampling. The application of the novel sample swipe accessory (ClearSampler) to collect material from surfaces will be discussed in some detail. The accessory was tested and evaluated for the detection of explosive residues before and after detonation. Experimental results derived from these investigations will be described in an effort to outline the advantages of this technology over existing sampling methods.
Sofaer, Helen R.; Jarnevich, Catherine S.
2017-01-01
Aim: The distributions of exotic species reflect patterns of human-mediated dispersal, species climatic tolerances and a suite of other biotic and abiotic factors. The relative importance of each of these factors will shape how the spread of exotic species is affected by ongoing economic globalization and climate change. However, patterns of trade may be correlated with variation in scientific sampling effort globally, potentially confounding studies that do not account for sampling patterns. Location: Global. Time period: Museum records, generally from the 1800s up to 2015. Major taxa studied: Plant species exotic to the United States. Methods: We used data from the Global Biodiversity Information Facility (GBIF) to summarize the number of plant species with exotic occurrences in the United States that also occur in each other country world-wide. We assessed the relative importance of trade and climatic similarity for explaining variation in the number of shared species while evaluating several methods to account for variation in sampling effort among countries. Results: Accounting for variation in sampling effort reversed the relative importance of trade and climate for explaining numbers of shared species. Trade was strongly correlated with the number of exotic plant species shared between the United States and other countries before, but not after, accounting for sampling variation among countries. Conversely, accounting for sampling effort strengthened the relationship between climatic similarity and species sharing. Using the number of records as a measure of sampling effort provided a straightforward approach for the analysis of occurrence data, whereas species richness estimators and rarefaction were less effective at removing sampling bias. Main conclusions: Our work provides support for broad-scale climatic limitation on the distributions of exotic species, illustrates the need to account for variation in sampling effort in large biodiversity databases, and highlights the difficulty in inferring causal links between the economic drivers of invasion and global patterns of exotic species occurrence.
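The richness estimators and rarefaction compared above against raw record counts are standard corrections for uneven sampling effort. A minimal sketch of the two (classic Chao1 and hypergeometric rarefaction, in stdlib Python; this is not the authors' code):

```python
from math import comb

def chao1(abundances):
    """Classic Chao1 richness estimate.

    S_chao1 = S_obs + f1^2 / (2 * f2), where f1 and f2 are the numbers
    of species observed exactly once and exactly twice.
    """
    counts = [a for a in abundances if a > 0]
    s_obs = len(counts)
    f1 = sum(1 for a in counts if a == 1)
    f2 = sum(1 for a in counts if a == 2)
    if f2 == 0:
        # bias-corrected variant avoids division by zero
        return s_obs + f1 * (f1 - 1) / 2
    return s_obs + f1 * f1 / (2 * f2)

def rarefied_richness(abundances, n):
    """Expected species richness in a random subsample of n records
    (classical hypergeometric rarefaction)."""
    total = sum(abundances)
    return sum(1 - comb(total - a, n) / comb(total, n)
               for a in abundances if a > 0)
```

For an abundance vector [4, 3, 2, 1, 1], Chao1 adds f1²/(2·f2) = 2 undetected species to the 5 observed; rarefaction instead standardizes every country to the same number of records before comparing richness.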
Microplastic pollution in the Northeast Atlantic Ocean: validated and opportunistic sampling.
Lusher, Amy L; Burke, Ann; O'Connor, Ian; Officer, Rick
2014-11-15
Levels of marine debris, including microplastics, are largely undocumented in the Northeast Atlantic Ocean. Broad-scale monitoring efforts are required to understand the distribution, abundance and ecological implications of microplastic pollution. A method of continuous sampling was developed to be conducted in conjunction with a wide range of vessel operations to maximise vessel time. Transects covering a total of 12,700 km were sampled through continuous monitoring of open-ocean sub-surface water, resulting in 470 samples. Items classified as potential plastics were identified in 94% of samples. A total of 2315 particles were identified; 89% were less than 5 mm in length, classifying them as microplastics. Average plastic abundance in the Northeast Atlantic was calculated as 2.46 particles m⁻³. This is the first report to demonstrate the ubiquitous nature of microplastic pollution in the Northeast Atlantic Ocean and to present a potential method for standardised monitoring of microplastic pollution. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
O'Connor, Brian; Hernandez, Deborah; Hornsby, Linda; Brown, Maria; Horton-Mullins, Kathryn
2017-01-01
NASA's Sample Cartridge Assembly (SCA) project is responsible for designing and validating a payload that contains materials research samples in a sealed environment. The SCA will be heated in the European Space Agency's (ESA) Low Gradient Furnace (LGF) that is housed inside the Materials Science Research Rack (MSRR) located on the International Space Station (ISS). The first Principal Investigator (PI) to utilize the SCA will focus on Gravitational Effects on Distortion in Sintering (GEDS) research. This paper will give a summary of the design and development test effort for the GEDS SCA and will discuss the role of thermal analysis in developing test profiles to meet the science and engineering requirements. Lessons learned will be reviewed and salient design features that may differ for each PI will be discussed.
Active Learning Using Hint Information.
Li, Chun-Liang; Ferng, Chun-Sung; Lin, Hsuan-Tien
2015-08-01
The abundance of real-world data and limited labeling budget calls for active learning, an important learning paradigm for reducing human labeling efforts. Many recently developed active learning algorithms consider both uncertainty and representativeness when making querying decisions. However, exploiting representativeness with uncertainty concurrently usually requires tackling sophisticated and challenging learning tasks, such as clustering. In this letter, we propose a new active learning framework, called hinted sampling, which takes both uncertainty and representativeness into account in a simpler way. We design a novel active learning algorithm within the hinted sampling framework with an extended support vector machine. Experimental results validate that the novel active learning algorithm can result in a better and more stable performance than that achieved by state-of-the-art algorithms. We also show that the hinted sampling framework allows improving another active learning algorithm designed from the transductive support vector machine.
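The "uncertainty" ingredient of hinted sampling — querying the point the current model is least certain about — reduces to a few lines; the linear scorer below is an illustrative stand-in for the extended support vector machine, and the representativeness hints are not shown:

```python
def uncertainty_query(unlabeled, score):
    """Return the unlabeled point closest to the decision boundary,
    i.e., with the smallest absolute decision value. This is classic
    uncertainty sampling; hinted sampling additionally uses 'hint'
    points to keep queries representative of the data distribution.
    """
    return min(unlabeled, key=lambda x: abs(score(x)))

# Illustrative 1-D pool with a linear decision function f(x) = x - 0.5
pool = [0.0, 0.4, 0.9]
query = uncertainty_query(pool, lambda x: x - 0.5)
```

Here the point 0.4 is queried because its decision value (-0.1) is nearest to zero, so labeling it is expected to be most informative for the current boundary.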
Sample Return Propulsion Technology Development Under NASA's ISPT Project
NASA Technical Reports Server (NTRS)
Anderson, David J.; Dankanich, John; Hahne, David; Pencil, Eric; Peterson, Todd; Munk, Michelle M.
2011-01-01
In 2009, the In-Space Propulsion Technology (ISPT) program was tasked to start development of propulsion technologies that would enable future sample return missions. Sample return missions can be quite varied, from collecting and bringing back samples of comets or asteroids, to soil, rocks, or atmosphere from planets or moons. As a result, ISPT's propulsion technology development needs are also broad, and include: 1) Sample Return Propulsion (SRP), 2) Planetary Ascent Vehicles (PAV), 3) Multi-mission technologies for Earth Entry Vehicles (MMEEV), and 4) systems/mission analysis and tools focused on sample return propulsion. The SRP area includes electric propulsion for sample return and low-cost Discovery-class missions, and propulsion systems for Earth Return Vehicles (ERV), including transfer stages to the destination. Initially the SRP effort will transition ongoing work on a High-Voltage Hall Accelerator (HIVHAC) thruster into developing a full HIVHAC system. SRP will also leverage recent lightweight propellant-tank advancements and develop flight-qualified propellant tanks with direct applicability to the Mars Sample Return (MSR) mission and with general applicability to all future planetary spacecraft. ISPT's previous aerocapture efforts will merge with earlier Earth Entry Vehicle developments to form the starting point for the MMEEV effort. The first task under the Planetary Ascent Vehicles (PAV) effort is the development of a Mars Ascent Vehicle (MAV). The new MAV effort will leverage past MAV analysis and technology developments from the Mars Technology Program (MTP) and previous MSR studies. This paper will describe the state of the ISPT project's propulsion technology development for future sample return missions.
Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K
2011-12-01
Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.
40 CFR 33.302 - Are there any additional contract administration requirements?
Code of Federal Regulations, 2014 CFR
2014-07-01
... ENVIRONMENTAL PROTECTION AGENCY PROGRAMS Good Faith Efforts § 33.302 Are there any additional contract... faith efforts described in § 33.301 if soliciting a replacement subcontractor. (d) A recipient must require its prime contractor to employ the six good faith efforts described in § 33.301 even if the prime...
40 CFR 33.302 - Are there any additional contract administration requirements?
Code of Federal Regulations, 2012 CFR
2012-07-01
... ENVIRONMENTAL PROTECTION AGENCY PROGRAMS Good Faith Efforts § 33.302 Are there any additional contract... faith efforts described in § 33.301 if soliciting a replacement subcontractor. (d) A recipient must require its prime contractor to employ the six good faith efforts described in § 33.301 even if the prime...
40 CFR 33.302 - Are there any additional contract administration requirements?
Code of Federal Regulations, 2013 CFR
2013-07-01
... ENVIRONMENTAL PROTECTION AGENCY PROGRAMS Good Faith Efforts § 33.302 Are there any additional contract... faith efforts described in § 33.301 if soliciting a replacement subcontractor. (d) A recipient must require its prime contractor to employ the six good faith efforts described in § 33.301 even if the prime...
40 CFR 33.302 - Are there any additional contract administration requirements?
Code of Federal Regulations, 2011 CFR
2011-07-01
... ENVIRONMENTAL PROTECTION AGENCY PROGRAMS Good Faith Efforts § 33.302 Are there any additional contract... faith efforts described in § 33.301 if soliciting a replacement subcontractor. (d) A recipient must require its prime contractor to employ the six good faith efforts described in § 33.301 even if the prime...
Spatially resolved δ13C analysis using laser ablation isotope ratio mass spectrometry
NASA Astrophysics Data System (ADS)
Moran, J.; Riha, K. M.; Nims, M. K.; Linley, T. J.; Hess, N. J.; Nico, P. S.
2014-12-01
Inherent geochemical, organic matter, and microbial heterogeneity over small spatial scales can complicate studies of carbon dynamics through soils. Stable isotope analysis has a strong history of helping track substrate turnover, delineate rhizosphere activity zones, and identify transitions in vegetation cover, but most traditional isotope approaches are limited in spatial resolution by a combination of physical separation techniques (manual dissection) and IRMS instrument sensitivity. We coupled laser ablation sampling with isotope measurement via IRMS to enable spatially resolved analysis over solid surfaces. Once a targeted sample region is ablated, the resulting particulates are entrained in a helium carrier gas and passed through a combustion reactor where carbon is converted to CO2. Cryotrapping of the resulting CO2 enables a reduction in carrier-gas flow, which improves overall measurement sensitivity versus traditional high-flow sample introduction. Currently we perform sample analysis at 50 μm resolution, require 65 ng C per analysis, and achieve measurement precision consistent with other continuous-flow techniques. We will discuss applications of the laser ablation IRMS (LA-IRMS) system to microbial communities and fish ecology studies to demonstrate the merits of this technique and how similar analytical approaches can be transitioned to soil systems. Preliminary efforts at analyzing soil samples will be used to highlight strengths and limitations of the LA-IRMS approach, paying particular attention to sample preparation requirements, spatial resolution, sample analysis time, and the types of questions most conducive to analysis via LA-IRMS.
Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2015-01-01
Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.
MicroRaman measurements for nuclear fuel reprocessing applications
Casella, Amanda; Lines, Amanda; Nelson, Gilbert; ...
2016-12-01
Treatment and reuse of used nuclear fuel is a key component in closing the nuclear fuel cycle. Solvent extraction reprocessing methods that have been developed contain various steps tailored to the separation of specific radionuclides, which are highly dependent upon solution properties. The instrumentation used to monitor these processes must be robust, require little or no maintenance, and be able to withstand harsh environments such as high radiation fields and aggressive chemical matrices. Our group has been investigating the use of optical spectroscopy for the on-line monitoring of actinides, lanthanides, and acid strength within fuel reprocessing streams. This paper will focus on the development and application of a new MicroRaman probe for on-line, real-time monitoring of U(VI), nitrate ion, and nitric acid in solutions relevant to used nuclear fuel reprocessing. Previous research has successfully demonstrated the applicability on the macroscopic scale, using sample probes requiring larger solution volumes. In an effort to minimize waste and reduce dose to personnel, we have modified this technique to allow measurement at the microfluidic scale using a Raman microprobe. Under the current sampling environment, Raman samples typically require upwards of 10 mL. Using the new sampling system, we can sample volumes of 10 μL or less, a reduction of over 1,000-fold in sample size. Finally, this paper will summarize our current work in this area, including comparisons between the macroscopic and microscopic probes for detection limits, optimized channel focusing, and application in a flow cell with varying levels of HNO3 and UO2(NO3)2.
NASA Astrophysics Data System (ADS)
Chan, V. S.; Wong, C. P. C.; McLean, A. G.; Luo, G. N.; Wirth, B. D.
2013-10-01
The Xolotl code under development by PSI-SciDAC will enhance predictive modeling capability of plasma-facing materials under burning plasma conditions. The availability and application of experimental data to compare to code-calculated observables are key requirements to validate the breadth and content of physics included in the model and ultimately gain confidence in its results. A dedicated effort has been in progress to collect and organize a) a database of relevant experiments and their publications as previously carried out at sample exposure facilities in US and Asian tokamaks (e.g., DIII-D DiMES, and EAST MAPES), b) diagnostic and surface analysis capabilities available at each device, and c) requirements for future experiments with code validation in mind. The content of this evolving database will serve as a significant resource for the plasma-material interaction (PMI) community. Work supported in part by the US Department of Energy under GA-DE-SC0008698, DE-AC52-07NA27344 and DE-AC05-00OR22725.
Monitoring Acidophilic Microbes with Real-Time Polymerase Chain Reaction (PCR) Assays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank F. Roberto
2008-08-01
Many techniques that are used to characterize and monitor microbial populations associated with sulfide mineral bioleaching require the cultivation of the organisms on solid or liquid media. Chemolithotrophic species, such as Acidithiobacillus ferrooxidans and Leptospirillum ferrooxidans, or thermophilic chemolithotrophs, such as Acidianus brierleyi and Sulfolobus solfataricus, can grow quite slowly, requiring weeks to complete efforts to identify and quantify these microbes associated with bioleach samples. Real-time PCR (polymerase chain reaction) assays, in which DNA targets are amplified in the presence of fluorescent oligonucleotide primers, allowing the monitoring and quantification of the amplification reactions as they progress, provide a means of rapidly detecting the presence of microbial species of interest and their relative abundance in a sample. This presentation will describe the design and use of such assays to monitor acidophilic microbes in the environment and in bioleaching operations. These assays provide results within 2-3 hours and can detect fewer than 100 individual microbial cells.
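Quantification in real-time PCR is conventionally done by reading a threshold cycle (Ct) against a standard curve fitted to known dilutions. A minimal sketch of that conversion (the slope and intercept values are illustrative, not from this assay):

```python
def copies_from_ct(ct, slope, intercept):
    """Estimate starting template copies from a qPCR threshold cycle,
    using a standard-curve fit: Ct = slope * log10(copies) + intercept.
    """
    return 10 ** ((ct - intercept) / slope)

# Illustrative standard curve; a slope near -3.32 corresponds to
# ~100% amplification efficiency (template doubles each cycle)
slope, intercept = -3.32, 37.0
copies = copies_from_ct(20.4, slope, intercept)
```

With these assumed curve parameters, a sample crossing threshold at cycle 20.4 corresponds to roughly 10⁵ starting copies; earlier threshold cycles imply proportionally more template.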
Fiscal Year 2016 Revegetation Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordstrom, Jenifer B.
This report summarizes the Fiscal Year (FY) 2016 Revegetation Assessment by Battelle Energy Alliance, LLC. This assessment was conducted to document revegetation efforts at Idaho National Laboratory and to verify that restoration of disturbed vegetation and soil at various locations occurs as required. This report provides the following information for projects at Idaho National Laboratory completed during FY 2016 that were identified during the National Environmental Policy Act review process as having the potential to disturb soils or vegetation: 1) a summary of all projects identified as having the potential to require revegetation efforts, and 2) a summary of the site disturbance and restoration efforts of each project. For FY 2016, one project required revegetation and sagebrush restoration. For other projects, implementation of best management practices minimized impacts to vegetation, and revegetation efforts were not required.
Wuellner, Sara; Phipps, Polly
2018-05-01
Accuracy of the Bureau of Labor Statistics Survey of Occupational Injuries and Illnesses (SOII) data is dependent on employer compliance with workplace injury and illness recordkeeping requirements. Characterization of employer recordkeeping can inform efforts to improve the data. We interviewed representative samples of SOII respondents from four states to identify common recordkeeping errors and to assess employer characteristics associated with limited knowledge of the recordkeeping requirements and noncompliant practices. Less than half of the establishments required to maintain OSHA injury and illness records reported doing so. Few establishments knew to omit cases limited to diagnostic services (22%) or to count unscheduled weekend days as missed work (27%). No single state or establishment characteristic was consistently associated with better or worse recordkeeping. Many employers possess a limited understanding of workplace injury recordkeeping requirements, potentially leading them to over-report minor incidents and under-report missed-work cases. © 2018 Wiley Periodicals, Inc.
Greater effort increases perceived value in an invertebrate.
Czaczkes, Tomer J; Brandstetter, Birgit; di Stefano, Isabella; Heinze, Jürgen
2018-05-01
Expending effort is generally considered to be undesirable. However, both humans and vertebrates will work for a reward they could also get for free. Moreover, cues associated with high-effort rewards are preferred to low-effort associated cues. Many explanations for these counterintuitive findings have been suggested, including cognitive dissonance (self-justification) or a greater contrast in state (e.g., energy or frustration level) before and after an effort-linked reward. Here, we test whether effort expenditure also increases perceived value in ants, using both classical cue-association methods and pheromone deposition, which correlates with perceived value. In 2 separate experimental setups, we show that pheromone deposition is higher toward the reward that requires more effort: 47% more pheromone deposition was performed for rewards reached via a vertical runway (high effort) compared with ones reached via a horizontal runway (low effort), and deposition rates were 28% higher on rough (high effort) versus smooth (low effort) runways. Using traditional cue-association methods, 63% of ants trained on different surface roughness, and 70% of ants trained on different runway elevations, preferred the high-effort related cues on a Y maze. Finally, pheromone deposition to feeders requiring memorization of one path bifurcation was up to 29% higher than to an identical feeder requiring no learning. Our results suggest that effort affects value perception in ants. This effect may stem from a cognitive process, which monitors the change in a generalized hedonic state before and after reward. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Kintrup, J; Wünsch, G
2001-11-01
The capability of sewer slime to accumulate heavy metals from municipal wastewater can be exploited to identify the sources of sewage sludge pollution. Previous investigations of sewer slime looked for a few elements only and could, therefore, not account for deviations in the enrichment efficiency of the slime or for irregularities from sampling. Results of ICP-MS multi-element determinations were analyzed by multivariate statistical methods. A new dimensionless characteristic, the "sewer slime impact", is proposed, which is zero for unloaded samples. Patterns expressed in this data format specifically extract the information required to identify the type of pollution and the polluter more quickly and with less effort and cost than hitherto.
Comparison of Impurities in Charcoal Sorbents Found by Neutron Activation Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doll, Charles G.; Finn, Erin C.; Cantaloub, Michael G.
2013-01-01
Neutron activation of gas samples in a reactor often requires a medium to retain sufficient amounts of the gas for analysis. Charcoal is commonly used to adsorb gas and hold it for activation; however, the amount of activated sodium in the charcoal after irradiation swamps most signals of interest. Neutron activation analysis (NAA) was performed on several commonly available charcoal samples in an effort to determine the activation background. The results for several elements, including the dominant sodium, are reported. ECN charcoal was found to have the lowest elemental background, containing sodium at 2.65 ± 0.05 ppm, as well as trace levels of copper and tungsten.
Electrochemical Biosensors for Rapid Detection of Foodborne Salmonella: A Critical Overview
Cinti, Stefano; Volpe, Giulia; Piermarini, Silvia; Delibato, Elisabetta; Palleschi, Giuseppe
2017-01-01
Salmonella has been the primary cause of food poisoning in many countries for over 100 years. Its detection is still primarily based on traditional microbiological culture methods, which are labor-intensive, extremely time consuming, and not suitable for testing a large number of samples. Accordingly, great efforts continue to be made to develop rapid, sensitive and specific methods that are easy to use and suitable for multi-sample analysis. Biosensor-based technology has all the potentialities to meet these requirements. In this paper, we review the features of the electrochemical immunosensors, genosensors, aptasensors and phagosensors developed in the last five years for Salmonella detection, focusing on the critical aspects of their application in food analysis. PMID:28820458
Embedding clinical interventions into observational studies.
Newman, Anne B; Avilés-Santa, M Larissa; Anderson, Garnet; Heiss, Gerardo; Howard, Wm James; Krucoff, Mitchell; Kuller, Lewis H; Lewis, Cora E; Robinson, Jennifer G; Taylor, Herman; Treviño, Roberto P; Weintraub, William
2016-01-01
Novel approaches to observational studies and clinical trials could improve the cost-effectiveness and speed of translation of research. Hybrid designs that combine elements of clinical trials with observational registries or cohort studies should be considered as part of a long-term strategy to transform clinical trials and epidemiology, adapting to the opportunities of big data and the challenges of constrained budgets. Important considerations include study aims, timing, breadth and depth of the existing infrastructure that can be leveraged, participant burden, likely participation rate and available sample size in the cohort, required sample size for the trial, and investigator expertise. Community engagement and stakeholder (including study participants) support are essential for these efforts to succeed.
Cross-Sectional HIV Incidence Estimation in HIV Prevention Research
Brookmeyer, Ron; Laeyendecker, Oliver; Donnell, Deborah; Eshleman, Susan H.
2013-01-01
Accurate methods for estimating HIV incidence from cross-sectional samples would have great utility in prevention research. This report describes recent improvements in cross-sectional methods that substantially increase their accuracy. These improvements are based on the use of multiple biomarkers to identify recent HIV infections. These multi-assay algorithms (MAAs) use assays in a hierarchical approach for testing that minimizes the effort and cost of incidence estimation. These MAAs do not require mathematical adjustments for accurate estimation of incidence rates in study populations in the year prior to sample collection. MAAs provide a practical, accurate, and cost-effective approach for cross-sectional HIV incidence estimation that can be used for HIV prevention research and global epidemic monitoring.
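The hierarchical testing idea behind MAAs lends itself to a small sketch. The following is a minimal illustration, assuming hypothetical assay names, thresholds, and mean-window duration; none of these values come from the report, and the real algorithms use specific validated assays.

```python
# Hypothetical sketch of a hierarchical multi-assay algorithm (MAA) for
# cross-sectional incidence estimation. Assay names, thresholds, and the
# mean-window constant are illustrative assumptions, not study values.

def classify_recent(sample, screen_threshold=1.5, confirm_threshold=80.0):
    """Hierarchical testing: run the cheap screening assay first and only
    run the confirmatory assay when the screen suggests a recent infection."""
    if sample["avidity_index"] >= screen_threshold:
        return False  # screen negative: established infection, no further assays
    # Confirmatory assay only for screen-positives, minimizing effort and cost.
    return sample["viral_load"] >= confirm_threshold

def incidence_estimate(samples, mean_window_years=0.5):
    """Incidence ~ (fraction classified recent) / (mean duration of the
    'recent' state); no further mathematical adjustment, per the abstract."""
    n_recent = sum(classify_recent(s) for s in samples)
    return n_recent / (len(samples) * mean_window_years)

cohort = (
    [{"avidity_index": 0.8, "viral_load": 500.0}] * 3    # recent infections
    + [{"avidity_index": 2.0, "viral_load": 900.0}] * 97  # established infections
)
print(incidence_estimate(cohort))  # 3 recent / (100 * 0.5 yr) = 0.06 per person-year
```

The hierarchical structure means most samples stop after the first, cheapest assay, which is the cost-saving property the abstract highlights.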
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-19
... protect the waterways, waterway users, and vessels from hazards associated with intensive fish sampling... sampling efforts will include the setting of nets throughout this portion of the Chicago Sanitary and Ship Canal. The purpose of this sampling is to provide essential information in connection with efforts to...
ERIC Educational Resources Information Center
Zullig, Keith J.; Ward, Rose Marie; King, Keith A.; Patton, Jon M.; Murray, Karen A.
2009-01-01
The purpose of this investigation was to assess the reliability and validity of eight developmental asset measures among a stratified, random sample (N = 540) of college students to guide health promotion efforts. The sample was randomly split to produce exploratory and confirmatory samples for factor analysis using principal axis factoring and…
Work stress in radiologists. A pilot study.
Magnavita, N; Fileni, A; Magnavita, G; Mammi, F; Mirk, P; Roccia, K; Bergamaschi, A
2008-04-01
We studied occupational stress and its psychosocial effects in a sample of Italian radiologists and radiotherapists: Radiologists and radiotherapists attending two medical conferences were invited to complete a questionnaire comprising four sections investigating the risk of occupational stress (organisational discomfort, Karasek's Job Content Questionnaire, Siegrist's Effort-Reward Imbalance, Warr's Job Satisfaction) and four sections investigating the health effects of such stress (Goldberg's Anxiety and Depression Scales, General Health Questionnaire, Lifestyles Questionnaire). Radiologists and radiotherapists generally expressed high levels of control, reward and satisfaction. However, 38.5% complained of severe organisational discomfort, 24% reported job strain, 28% reported effort/reward imbalance and 25% were dissatisfied. Female radiologists and radiotherapists showed higher levels of organisational discomfort than their male colleagues. Younger and less experienced radiologists and radiotherapists had higher strain scores than their older and more experienced colleagues. A significant correlation was observed between stress predictors and the effects of stress on health, including depression and anxiety, psychological distress and unhealthy lifestyles. Radiologists and radiotherapists are exposed to major occupational stress factors, and a significant percentage of them suffer from workplace stress. A special effort is required to prevent this condition.
Barcoding of live human PBMC for multiplexed mass cytometry
Mei, Henrik E.; Leipold, Michael D.; Schulz, Axel Ronald; Chester, Cariad; Maecker, Holden T.
2014-01-01
Mass cytometry is developing as a means of multiparametric single cell analysis. Here, we present an approach to barcoding separate live human PBMC samples for combined preparation and acquisition on a CyTOF® instrument. Using six different anti-CD45 antibody (Ab) conjugates labeled with Pd104, Pd106, Pd108, Pd110, In113, and In115, respectively, we barcoded up to 20 samples with unique combinations of exactly three different CD45 Ab tags. Cell events carrying more or fewer than three different tags were excluded from analyses during Boolean data deconvolution, allowing for precise sample assignment and the electronic removal of cell aggregates. Data from barcoded samples matched data from corresponding individually stained and acquired samples, at cell event recoveries similar to individual sample analyses. The approach greatly reduces technical noise, minimizes unwanted cell doublet events in mass cytometry data, and reduces wet work and antibody consumption. It also eliminates sample-to-sample carryover and the need to clean the instrument between samples, thereby effectively reducing overall instrument runtime. Hence, CD45 barcoding improves the accuracy of mass cytometric immunophenotyping studies, thus supporting biomarker discovery efforts, and should be applicable to fluorescence flow cytometry as well.
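The 6-choose-3 barcoding scheme and the Boolean deconvolution rule described above can be sketched directly. The mass tags and the "exactly three tags" rule follow the abstract; the positivity cutoff and event values are illustrative assumptions.

```python
# Sketch of the 6-choose-3 CD45 barcoding scheme: C(6,3) = 20 unique barcodes,
# matching the "up to 20 samples" in the abstract. The intensity cutoff used
# to call a tag positive is an illustrative assumption.
from itertools import combinations

TAGS = ("Pd104", "Pd106", "Pd108", "Pd110", "In113", "In115")

# Each sample gets a unique combination of exactly three of the six tags.
BARCODES = {combo: i for i, combo in enumerate(combinations(TAGS, 3))}

def deconvolute(event, cutoff=10.0):
    """Assign a cell event to a sample index, or return None for aggregates
    or poorly stained cells (more or fewer than three positive tags)."""
    positive = tuple(t for t in TAGS if event.get(t, 0.0) > cutoff)
    if len(positive) != 3:
        return None  # doublet/aggregate or incomplete staining: discard
    return BARCODES.get(positive)

print(len(BARCODES))  # 20
print(deconvolute({"Pd104": 50.0, "Pd106": 42.0, "Pd108": 33.0}))  # 0
print(deconvolute({"Pd104": 50.0, "Pd106": 42.0}))  # None (only two tags)
```

Requiring exactly three positive tags is what electronically removes cell aggregates: a doublet of two differently barcoded cells carries four or more tags and is discarded.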
Barnett, J Matthew; Yu, Xiao-Ying; Recknagle, Kurtis P; Glissmeyer, John A
2016-11-01
A planned laboratory space and exhaust system modification to the Pacific Northwest National Laboratory Material Science and Technology Building indicated that a new evaluation of the mixing at the air sampling system location would be required for compliance to ANSI/HPS N13.1-2011. The modified exhaust system would add a third fan, thereby increasing the overall exhaust rate out the stack, thus voiding the previous mixing study. Prior to modifying the radioactive air emissions exhaust system, a three-dimensional computational fluid dynamics computer model was used to evaluate the mixing at the sampling system location. Modeling of the original three-fan system indicated that not all mixing criteria could be met. A second modeling effort was conducted with the addition of an air blender downstream of the confluence of the three fans, which then showed satisfactory mixing results. The final installation included an air blender, and the exhaust system underwent full-scale tests to verify velocity, cyclonic flow, gas, and particulate uniformity. The modeling results and those of the full-scale tests show agreement between each of the evaluated criteria. The use of a computational fluid dynamics code was an effective aid in the design process and allowed the sampling system to remain in its original location while still meeting the requirements for sampling at a well mixed location.
VALIDATION OF STANDARD ANALYTICAL PROTOCOL FOR ...
There is a growing concern with the potential for terrorist use of chemical weapons to cause civilian harm. In the event of an actual or suspected outdoor release of chemically hazardous material in a large area, the extent of contamination must be determined. This requires a system with the ability to prepare and quickly analyze a large number of contaminated samples for the traditional chemical agents, as well as numerous toxic industrial chemicals. Liquid samples (both aqueous and organic), solid samples (e.g., soil), vapor samples (e.g., air) and mixed state samples, all ranging from household items to deceased animals, may require some level of analyses. To meet this challenge, the U.S. Environmental Protection Agency (U.S. EPA) National Homeland Security Research Center, in collaboration with experts from across U.S. EPA and other Federal Agencies, initiated an effort to identify analytical methods for the chemical and biological agents that could be used to respond to a terrorist attack or a homeland security incident. U.S. EPA began development of standard analytical protocols (SAPs) for laboratory identification and measurement of target agents in case of a contamination threat. These methods will be used to help assist in the identification of existing contamination, the effectiveness of decontamination, as well as clearance for the affected population to reoccupy previously contaminated areas. One of the first SAPs developed was for the determin
Wilson, Christina R; Mulligan, Christopher C; Strueh, Kurt D; Stevenson, Gregory W; Hooser, Stephen B
2014-05-01
Desorption electrospray ionization mass spectrometry (DESI-MS) is an emerging analytical technique that permits the rapid and direct analysis of biological or environmental samples under ambient conditions. Highlighting the versatility of this technique, DESI-MS has been used for the rapid detection of illicit drugs, chemical warfare agents, agricultural chemicals, and pharmaceuticals from a variety of sample matrices. In diagnostic veterinary toxicology, analyzing samples using traditional analytical instrumentation typically includes extensive sample extraction procedures, which can be time consuming and labor intensive. Therefore, efforts to expedite sample analyses are a constant goal for diagnostic toxicology laboratories. In the current report, DESI-MS was used to directly analyze stomach contents from a dog exposed to the organophosphate insecticide terbufos. The total DESI-MS analysis time required to confirm the presence of terbufos and diagnose organophosphate poisoning in this case was approximately 5 min. This highlights the potential of this analytical technique in the field of veterinary toxicology for the rapid diagnosis and detection of toxicants in biological samples. © 2014 The Author(s).
Effects of Radiation and Long-Term Thermal Cycling on EPC 1001 Gallium Nitride Transistors
NASA Technical Reports Server (NTRS)
Patterson, Richard L.; Scheick, Leif; Lauenstein, Jean-Marie; Casey, Megan; Hammoud, Ahmad
2012-01-01
Electronics designed for use in NASA space missions are required to work efficiently and reliably under harsh environment conditions. These include radiation, extreme temperatures, and thermal cycling, to name a few. Data obtained on long-term thermal cycling of new un-irradiated and irradiated samples of EPC1001 gallium nitride enhancement-mode transistors are presented. This work was done through a collaborative effort including GRC, GSFC, and JPL in support of the NASA Electronic Parts and Packaging (NEPP) Program.
40 CFR 33.301 - What does this subpart require?
Code of Federal Regulations, 2014 CFR
2014-07-01
... AGENCY PROGRAMS Good Faith Efforts § 33.301 What does this subpart require? A recipient, including one... good faith efforts whenever procuring construction, equipment, services and supplies under an EPA...
40 CFR 33.301 - What does this subpart require?
Code of Federal Regulations, 2013 CFR
2013-07-01
... AGENCY PROGRAMS Good Faith Efforts § 33.301 What does this subpart require? A recipient, including one... good faith efforts whenever procuring construction, equipment, services and supplies under an EPA...
40 CFR 33.301 - What does this subpart require?
Code of Federal Regulations, 2012 CFR
2012-07-01
... AGENCY PROGRAMS Good Faith Efforts § 33.301 What does this subpart require? A recipient, including one... good faith efforts whenever procuring construction, equipment, services and supplies under an EPA...
40 CFR 33.301 - What does this subpart require?
Code of Federal Regulations, 2011 CFR
2011-07-01
... AGENCY PROGRAMS Good Faith Efforts § 33.301 What does this subpart require? A recipient, including one... good faith efforts whenever procuring construction, equipment, services and supplies under an EPA...
Beam Stability R&D for the APS MBA Upgrade
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sereno, Nicholas S.; Arnold, Ned D.; Bui, Hanh D.
2015-01-01
Beam diagnostics required for the APS multi-bend achromat (MBA) are driven by ambitious beam stability requirements. The major AC stability challenge is to correct rms beam motion to 10% of the rms beam size at the insertion device source points from 0.01 to 1000 Hz. The vertical plane represents the biggest challenge for AC stability, which is required to be 400 nm rms for a 4-micron vertical beam size. In addition to AC stability, long-term drift over a period of seven days is required to be 1 micron or less. Major diagnostics R&D components include improved rf beam position processing using commercially available FPGA-based BPM processors; new X-ray beam position monitors based on hard X-ray fluorescence from copper and Compton scattering off diamond; mechanical motion sensing to detect and correct long-term vacuum chamber drift; a new feedback system featuring a tenfold increase in sampling rate; and a several-fold increase in the number of fast correctors and BPMs in the feedback algorithm. Feedback system development represents a major effort, and we are pursuing a novel algorithm that integrates orbit correction for both slow and fast correctors down to DC simultaneously. Finally, a new data acquisition system (DAQ) is being developed to simultaneously acquire streaming data from all diagnostics as well as the feedback processors for commissioning and fault diagnosis. Results of studies and the design effort are reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Negrete, Oscar A.; Branda, Catherine; Hardesty, Jasper O. E.
In the response to and recovery from a critical homeland security event involving deliberate or accidental release of biological agents, initial decontamination efforts are necessarily followed by tests for the presence of residual live virus or bacteria. Such 'clearance sampling' should be rapid and accurate, to inform decision makers as they take appropriate action to ensure the safety of the public and of operational personnel. However, the current protocol for clearance sampling is extremely time-intensive and costly, and requires significant amounts of laboratory space and capacity. Detection of residual live virus is particularly problematic and time-consuming, as it requires evaluation of replication potential within a eukaryotic host such as chicken embryos. The intention of this project was to develop a new method for clearance sampling, by leveraging Sandia's expertise in the biological and material sciences to create a C. elegans-based foam that could be applied directly to the entire contaminated area for quick and accurate detection of any residual live virus by means of a fluorescent signal. Such a novel technology for rapid, on-site detection of live virus would greatly interest the DHS, DoD, and EPA, and holds broad commercial potential, especially with regard to the transportation industry.
Lessons learned from comparing molecular dynamics engines on the SAMPL5 dataset.
Shirts, Michael R; Klein, Christoph; Swails, Jason M; Yin, Jian; Gilson, Michael K; Mobley, David L; Case, David A; Zhong, Ellen D
2017-01-01
We describe our efforts to prepare common starting structures and models for the SAMPL5 blind prediction challenge. We generated the starting input files and single configuration potential energies for the host-guest in the SAMPL5 blind prediction challenge for the GROMACS, AMBER, LAMMPS, DESMOND and CHARMM molecular simulation programs. All conversions were fully automated from the originally prepared AMBER input files using a combination of the ParmEd and InterMol conversion programs. We find that the energy calculations for all molecular dynamics engines for this molecular set agree to better than 0.1 % relative absolute energy for all energy components, and in most cases an order of magnitude better, when reasonable choices are made for different cutoff parameters. However, there are some surprising sources of statistically significant differences. Most importantly, different choices of Coulomb's constant between programs are one of the largest sources of discrepancies in energies. We discuss the measures required to get good agreement in the energies for equivalent starting configurations between the simulation programs, and the energy differences that occur when simulations are run with program-specific default simulation parameter values. Finally, we discuss what was required to automate this conversion and comparison.
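The cross-engine agreement criterion used above (energy components matching to better than 0.1% relative absolute energy) can be sketched as a simple comparison. Engine names and energy values below are illustrative, not numbers from the SAMPL5 study.

```python
# Minimal sketch of the cross-engine energy comparison: check that each
# energy component agrees between two engines to better than 0.1% relative
# absolute difference. All names and numbers here are illustrative.

def relative_abs_diff_percent(a, b):
    """Relative absolute difference, in percent, against the mean magnitude."""
    return 200.0 * abs(a - b) / (abs(a) + abs(b))

def engines_agree(components_a, components_b, tolerance_percent=0.1):
    """True if every shared energy component is within the tolerance."""
    return all(
        relative_abs_diff_percent(components_a[k], components_b[k]) < tolerance_percent
        for k in components_a
    )

# Single-configuration energy components (kJ/mol) from two engines.
gromacs = {"bond": 1503.2, "angle": 870.4, "lj": -2201.7, "coulomb": -9875.3}
amber   = {"bond": 1503.3, "angle": 870.4, "lj": -2201.9, "coulomb": -9875.1}
print(engines_agree(gromacs, amber))  # True: all components within 0.1%
```

As the abstract notes, small constant choices matter at this tolerance: two engines using slightly different values of Coulomb's constant will fail the electrostatic-component check even for identical configurations.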
Mars rover sample return: An exobiology science scenario
NASA Technical Reports Server (NTRS)
Rosenthal, D. A.; Sims, M. H.; Schwartz, Deborah E.; Nedell, S. S.; Mckay, Christopher P.; Mancinelli, Rocco L.
1988-01-01
A mission designed to collect and return samples from Mars will provide information regarding its composition, history, and evolution. At the same time, a sample return mission poses a technical challenge: sophisticated, semi-autonomous, robotic spacecraft systems must be developed in order to carry out complex operations at the surface of a very distant planet. An interdisciplinary effort was conducted to consider how such a mission can be realistically structured to maximize the planetary science return. The focus was to concentrate on a particular set of scientific objectives (exobiology), to determine the instrumentation and analyses required to search for biological signatures, and to evaluate what analyses and decision making can be effectively performed by the rover in order to minimize the overhead of constant communication between Mars and the Earth. Investigations were also begun in the area of machine vision to determine whether layered sedimentary structures can be recognized autonomously, and preliminary results are encouraging.
Harmonisation of microbial sampling and testing methods for distillate fuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, G.C.; Hill, E.C.
1995-05-01
Increased incidence of microbial infection in distillate fuels has led to a demand for organisations such as the Institute of Petroleum to propose standards for microbiological quality, based on numbers of viable microbial colony-forming units. Variations in quality requirements and in the spoilage significance of contaminating microbes, plus a tendency for temporal and spatial changes in the distribution of microbes, make such standards difficult to implement. The problem is compounded by a diversity in the procedures employed for sampling and testing for microbial contamination and in the interpretation of the data obtained. The following paper reviews these problems and describes the efforts of the Institute of Petroleum Microbiology Fuels Group to address these issues, in particular to bring about harmonisation of sampling and testing methods. The benefits and drawbacks of available test methods, both laboratory-based and on-site, are discussed.
IOTA: the array controller for a gigapixel OTCCD camera for Pan-STARRS
NASA Astrophysics Data System (ADS)
Onaka, Peter; Tonry, John; Luppino, Gerard; Lockhart, Charles; Lee, Aaron; Ching, Gregory; Isani, Sidik; Uyeshiro, Robin
2004-09-01
The Pan-STARRS project has undertaken an ambitious effort to develop a completely new array controller architecture that is fundamentally driven by the requirements of the large 1-gigapixel, low-noise, high-speed OTCCD mosaic as well as the size, power, and weight restrictions of the Pan-STARRS telescope. The result is a very small form factor, next-generation controller scalable building block with 1-Gigabit Ethernet interfaces that will be assembled into a system reading out 512 outputs at ~1-megapixel sample rates on each output. The paper also discusses critical technology and fabrication techniques such as greater-than-1-MHz analog-to-digital converters (ADCs), multiple fast sampling and digital calculation of multiple correlated samples (DMCS), ball grid array (BGA) packaged circuits, and LINUX running on embedded field-programmable gate arrays (FPGAs) with hard-core microprocessors for the prototype currently being developed.
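The "digital calculation of multiple correlated samples (DMCS)" named in this abstract can be illustrated with a hedged sketch, assuming it follows the usual multiple-correlated-sampling pattern: with a fast ADC, each pixel's reset and signal levels are each digitized several times and averaged digitally before subtraction, reducing read noise relative to a single correlated double sample. Sample counts and values below are illustrative, not details of the IOTA controller.

```python
# Hedged sketch of digital multiple correlated sampling: average several fast
# ADC samples of the reset level and of the signal level, then subtract, so
# uncorrelated ADC noise averages down while the correlated baseline cancels.

def dmcs_pixel(reset_samples, signal_samples):
    """Digitally averaged correlated sampling for one pixel readout."""
    reset = sum(reset_samples) / len(reset_samples)
    signal = sum(signal_samples) / len(signal_samples)
    return signal - reset  # pixel value with the baseline offset removed

# Four fast samples of each level (ADC counts); the noise averages out.
print(dmcs_pixel([1001, 999, 1002, 998], [1502, 1498, 1501, 1499]))  # 500.0
```

Averaging N samples of each level reduces the uncorrelated ADC noise contribution by roughly a factor of sqrt(N), which is why greater-than-1-MHz ADCs are paired with this scheme.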
Enhancing conflict competency.
Waite, Roberta; McKinney, Nicole S
2014-01-01
Professional nurses are taking on leadership roles in diverse healthcare teams. Development of conflict competence is essential, yet requires self-awareness and deliberate effort, including heightened awareness of one's preferred conflict style and cognizance of the implications of overusing and/or underusing these styles. A pre-post survey design (N = 14) used a paired-sample t-test; paired-sample correlations and an overview of the paired-sample test are reported. Students gained self-awareness about their preferred conflict style, recognized that each conflict style has its utility depending on the situation, and demonstrated a difference in their most frequently used style. Limited data exist on conflict behavior styles among pre-licensure nursing students; however, students can influence their own environments (either causing or fueling conflict situations) through their personal conflict-handling styles. Early development of these skills can raise awareness and cultivate ease in the management of conflict within varied settings.
Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Nikbay, Melike; Heeg, Jennifer
2017-01-01
This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the Semi-Span Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced-order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
Design of partially supervised classifiers for multispectral image data
NASA Technical Reports Server (NTRS)
Jeon, Byeungwoo; Landgrebe, David
1993-01-01
A partially supervised classification problem is addressed, in which the class definition and corresponding training samples are provided a priori for only one particular class. In practical applications of pattern classification techniques, a frequently observed characteristic is the heavy, often nearly impossible, requirement for representative prior statistical characteristics of all classes in a given data set. Considering the effort in both time and manpower required to assemble a well-defined, exhaustive list of classes with a corresponding representative set of training samples, this 'partially' supervised capability would be very desirable, assuming adequate classifier performance can be obtained. Two different classification algorithms are developed that achieve simplicity in classifier design by reducing the requirement for prior statistical information without sacrificing significant classification capability. The first is based on optimal significance testing, where the optimal acceptance probability is estimated directly from the data set. In the second approach, partially supervised classification is treated as a problem of unsupervised clustering with initially one known cluster or class. A weighted unsupervised clustering procedure is developed to automatically define other classes and estimate their class statistics. The operational simplicity thus realized should make these partially supervised classification schemes very viable tools in pattern classification.
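The first approach above (significance testing against the statistics of the single known class) can be illustrated in one dimension. This is a hypothetical, simplified stand-in for the multispectral case: the 99% acceptance probability (z < 2.576) is an assumed fixed value, whereas the paper estimates the optimal acceptance probability from the data.

```python
# One-dimensional sketch of a one-class classifier built from training
# samples of the single known class: accept a value unless it is rejected
# by a Gaussian significance test. The significance level is assumed,
# not estimated from the data as in the paper's method.
from statistics import mean, stdev

def train(known_class_samples):
    """Estimate the known class's statistics from its training samples."""
    return mean(known_class_samples), stdev(known_class_samples)

def classify(x, mu, sigma, z_critical=2.576):
    """True if x is accepted as the known class at the assumed 99% level."""
    return abs(x - mu) / sigma < z_critical

mu, sigma = train([10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7])
print(classify(10.1, mu, sigma))  # True: in-class value accepted
print(classify(25.0, mu, sigma))  # False: far-off value rejected
```

Everything rejected by the test falls into an implicit "other" category, which is exactly what lets the classifier operate with training data for only one class.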
NASA Technical Reports Server (NTRS)
McCubbin, Francis M.; Zeigler, Ryan A.
2017-01-01
The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10F JSC is charged with curation of all extraterrestrial material under NASA control, including future NASA missions. The Directive goes on to define Curation as including documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Here we briefly describe NASA's astromaterials collections and our ongoing efforts related to enhancing the utility of our current collections as well as our efforts to prepare for future sample return missions. We collectively refer to these efforts as advanced curation.
NASA Technical Reports Server (NTRS)
McCubbin, F. M.; Evans, C. A.; Fries, M. D.; Harrington, A. D.; Regberg, A. B.; Snead, C. J.; Zeigler, R. A.
2017-01-01
The Astromaterials Acquisition and Curation Office (henceforth referred to herein as NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10F, JSC is charged with curation of all extraterrestrial material under NASA control, including future NASA missions. The Directive goes on to define Curation as including documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Here we briefly describe NASA's astromaterials collections and our ongoing efforts related to enhancing the utility of our current collections as well as our efforts to prepare for future sample return missions. We collectively refer to these efforts as advanced curation.
Goldmon, Moses; Roberson, James T; Carey, Tim; Godley, Paul; Howard, Daniel L; Boyd, Carlton; Ammerman, Alice
2008-01-01
This article describes the Carolina-Shaw Partnership for the Elimination of Health Disparities efforts to engage a diverse group of Black churches in a sustainable network. We sought to develop a diverse network of 25 churches to work with the Carolina-Shaw Partnership to develop sustainable health disparities research, education, and intervention initiatives. Churches were selected based on location, pastoral buy-in, and capacity to engage. A purposive sampling technique was applied. (1) Collecting information on the location and characteristics of churches helps to identify and recruit churches that possess the desired qualities and characteristics. (2) The process used to identify, recruit, and select churches is time intensive. (3) The time, energy, and effort required managing an inter-institutional partnership and engage churches in health disparities research and interventions lends itself to sustainability. The development of a sustainable network of churches could lead to successful health disparities initiatives.
Software Certification and Software Certificate Management Systems
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2005-01-01
Incremental certification and re-certification of code as it is developed and modified is a prerequisite for applying modern, evolutionary development processes, which are especially relevant for NASA. For example, the Columbia Accident Investigation Board (CAIB) report concluded there is "the need for improved and uniform statistical sampling, audit, and certification processes". Also, re-certification time has been a limiting factor in making changes to Space Shuttle code close to launch time. This is likely to be an even bigger problem with the rapid turnaround required in developing NASA's replacement for the Space Shuttle, the Crew Exploration Vehicle (CEV). Hence, intelligent development processes are needed which place certification at the center of development. If certification tools provide useful information, such as estimated time and effort, they are more likely to be adopted. The ultimate impact of such a tool will be reduced effort and increased reliability.
NASA Technical Reports Server (NTRS)
Baker, G. R.; Fethe, T. P.
1975-01-01
Research in the application of remotely sensed data from LANDSAT or other airborne platforms to the efficient management of a large timber based forest industry was divided into three phases: (1) establishment of a photo/ground sample correlation, (2) investigation of techniques for multi-spectral digital analysis, and (3) development of a semi-automated multi-level sampling system. To properly verify results, three distinct test areas were selected: (1) Jacksonville Mill Region, Lower Coastal Plain, Flatwoods, (2) Pensacola Mill Region, Middle Coastal Plain, and (3) Mississippi Mill Region, Middle Coastal Plain. The following conclusions were reached: (1) the probability of establishing an information base suitable for management requirements through a photo/ground double sampling procedure, alleviating the ground sampling effort, is encouraging, (2) known classification techniques must be investigated to ascertain the level of precision possible in separating the many densities involved, and (3) the multi-level approach must be related to an information system that is executable and feasible.
NASA Astrophysics Data System (ADS)
Ryan, A. J.; Christensen, P. R.
2016-12-01
Laboratory measurements have been necessary to interpret thermal data of planetary surfaces for decades. We present a novel radiometric laboratory method to determine temperature-dependent thermal conductivity of complex regolith simulants under high vacuum and across a wide range of temperatures. Here, we present our laboratory method, strategy, and initial results. This method relies on radiometric temperature measurements instead of contact measurements, eliminating the need to disturb the sample with thermal probes. We intend to determine the conductivity of grains that are up to 2 cm in diameter and to parameterize the effects of angularity, sorting, layering, composition, and cementation. These results will support the efforts of the OSIRIS-REx team in selecting a site on asteroid Bennu that is safe and meets grain size requirements for sampling. Our system consists of a cryostat vacuum chamber with an internal liquid nitrogen dewar. A granular sample is contained in a cylindrical cup that is 4 cm in diameter and 1 to 6 cm deep. The surface of the sample is exposed to vacuum and is surrounded by a black liquid nitrogen cold shroud. Once the system has equilibrated at 80 K, the base of the sample cup is rapidly heated to 450 K. An infrared camera observes the sample from above to monitor its temperature change over time. We have built a time-dependent finite element model of the experiment in COMSOL Multiphysics. Boundary temperature conditions and all known material properties (including surface emissivities) are included to replicate the experiment as closely as possible. The Optimization module in COMSOL is specifically designed for parameter estimation. Sample thermal conductivity is assumed to be a quadratic or cubic polynomial function of temperature. We thus use gradient-based optimization methods in COMSOL to vary the polynomial coefficients in an effort to reduce the least squares error between the measured and modeled sample surface temperature.
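The inverse step the abstract describes (varying the polynomial coefficients of k(T) to minimize the least-squares error against the measured surface-temperature data) can be sketched in miniature. The sketch below is not the authors' COMSOL workflow: it replaces the finite-element forward model with the polynomial itself and fits synthetic data, purely to illustrate gradient-based least-squares recovery of the coefficients. All numeric values are made up.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "measured" conductivity curve generated from known
# coefficients, standing in for data inferred from the IR camera.
T_grid = np.linspace(80.0, 450.0, 50)        # temperatures (K)
true_coeffs = [0.002, 1e-5, 2e-8]            # k(T) = a + b*T + c*T^2 (W/m/K), illustrative

def conductivity(coeffs, T):
    """Quadratic polynomial model of thermal conductivity k(T)."""
    a, b, c = coeffs
    return a + b * T + c * T**2

measured_k = conductivity(true_coeffs, T_grid)

def residuals(coeffs):
    # Misfit between the model and the "measured" curve; in the real
    # experiment this misfit is evaluated through the forward FE model.
    return conductivity(coeffs, T_grid) - measured_k

# Gradient-based least-squares fit of the polynomial coefficients.
fit = least_squares(residuals, x0=[0.001, 0.0, 0.0])
```

In the actual experiment each residual evaluation requires a full transient finite-element solve, which is why a dedicated optimization module is used rather than a direct polynomial fit.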
Perspectives in marine science: A European point of view
NASA Astrophysics Data System (ADS)
Barthel, K.-G.
1995-03-01
Marine research has always been a field in science which was particularly open to, and at times dependent on, international cooperation, and this has become even more obvious during the last decade when issues of global change became central to any discussion. The global nature of scientific and other problems required the development of new concepts and led to the establishment of new structures in research, coordination and funding on an international level. In Europe the 12-member European Community often served as a nucleus for larger networks and initiatives (e.g. COST, EUREKA), and in 1989 the EC itself launched a specific programme on marine research and technology (MAST). The various initiatives are not meant to replace national efforts but to complement them, where added value arises from international cooperation, e.g. in global programmes like IGBP, WCRP and their various core projects. The focus of support for these programmes through international funding agencies and networks is not primarily on additional research money but more on structural support and coordination. In contrast, the MAST targeted projects on the North Atlantic margin and the Mediterranean also receive substantial basic support, and are designed to fill gaps left by other international research projects. Both EC and other projects profit from the coordinating measures offered by the EC Commission. A more efficient use of facilities (research vessels, special equipment) can be achieved by having central information services. Well-integrated international projects also require additional efforts in standardization of instrumentation, methods and units, with respect to sampling, sample processing and data treatment. Furthermore, the scope of the task to tackle questions of global change demands the development of new technologies like ROVs, biosensors, automated sample and data acquisition and treatment, etc.
Full exploitation of the results in scientific, political and economical terms is only facilitated through special concertation. Mathematical modelling of ecosystems is still in its infancy and needs further cooperative development in order to provide tools for forecasting and management. Finally, training and exchange of personnel on a European, and possibly wider, level needs to be intensified in order to meet the requirements of modern science.
The Challenging Pupil in the Classroom: Child Effects on Teachers
Houts, Renate M.; Caspi, Avshalom; Pianta, Robert C.; Arseneault, Louise; Moffitt, Terrie E.
2012-01-01
Teaching children requires effort and some children naturally require more effort than others. This study tests whether teacher effort devoted to individual children varies as a function of children’s personal characteristics. Using a nation-wide longitudinal study of twins followed between ages 5-12 years, we asked teachers about the effort they invested in each child enrolled in our study. We found that teacher effort was a function of heritable child characteristics; that children’s challenging behavior assessed at age 5 predicted teacher effort at age 12; and that challenging child behavior and teacher effort share common etiology in children’s genes. While child effects accounted for a significant proportion of variance in teacher effort, we also found variation that could not be attributed to children’s behavior. Treating children with challenging behavior and enhancing teachers’ skills in behavior management could increase the time and energy teachers have to deliver curriculum in their classrooms. PMID:21078897
Peek, Monica E; Wilson, Shannon C; Bussey-Jones, Jada; Lypson, Monica; Cordasco, Kristina; Jacobs, Elizabeth A; Bright, Cedric; Brown, Arleen F
2012-06-01
To characterize national physician organizations' efforts to reduce health disparities and identify organizational characteristics associated with such efforts. This cross-sectional study was conducted between September 2009 and June 2010. The authors used two-sample t tests and chi-square tests to compare the proportion of organizations with disparity-reducing activities between different organizational types (e.g., primary care versus subspecialty organizations, small [<1,000 members] versus large [>5,000 members]). Inclusion criteria required physician organizations to be (1) focused on physicians, (2) national in scope, and (3) membership based. The number of activities per organization ranged from 0 to 22. Approximately half (53%) of organizations had 0 or 1 disparity-reducing activities. Organizational characteristics associated with having at least 1 disparity-reducing effort included membership size (88% of large groups versus 58% of small groups had at least 1 activity; P = .004) and the presence of a health disparities committee (95% versus 59%; P < .001). Primary care (versus subspecialty) organizations and racial/ethnic minority physician organizations were more likely to have disparity-reducing efforts, although findings were not statistically significant. Common themes addressed by activities were health care access, health care disparities, workforce diversity, and language barriers. Common strategies included education of physicians/trainees and patients/general public, position statements, and advocacy. Despite the national priority to eliminate health disparities, more than half of national physician organizations are doing little to address this problem. Primary care and minority physician organizations, and those with disparities committees, may provide leadership to extend the scope of disparity-reduction efforts.
40 CFR 33.302 - Are there any additional contract administration requirements?
Code of Federal Regulations, 2010 CFR
2010-07-01
... ENVIRONMENTAL PROTECTION AGENCY PROGRAMS Good Faith Efforts § 33.302 Are there any additional contract... require its prime contractor to employ the six good faith efforts described in § 33.301 even if the prime... the subcontract for any reason, the recipient must require the prime contractor to employ the six good...
Sensor Technologies for Particulate Detection and Characterization
NASA Technical Reports Server (NTRS)
Greenberg, Paul S.
2008-01-01
Planned Lunar missions have resulted in renewed attention to problems attributable to fine particulates. While the difficulties experienced during the sequence of Apollo missions did not prove critical in all cases, the comparatively long duration of impending missions may present a different situation. This situation creates the need for a spectrum of particulate sensing technologies. From a fundamental perspective, an improved understanding of the properties of the dust fraction is required. Described here is laboratory-based reference instrumentation for the measurement of fundamental particle size distribution (PSD) functions from 2.5 nanometers to 20 micrometers. Concomitant efforts for separating samples into fractional size bins are also presented. A requirement also exists for developing mission compatible sensors. Examples include provisions for air quality monitoring in spacecraft and remote habitation modules. Required sensor attributes such as low mass, volume, and power consumption, autonomy of operation, and extended reliability cannot be accommodated by existing technologies.
Cognitive dissonance in children: justification of effort or contrast?
Alessandri, Jérôme; Darcheville, Jean-Claude; Zentall, Thomas R
2008-06-01
Justification of effort is a form of cognitive dissonance in which the subjective value of an outcome is directly related to the effort that went into obtaining it. However, it is likely that in social contexts (such as the requirements for joining a group) an inference can be made (perhaps incorrectly) that an outcome that requires greater effort to obtain in fact has greater value. Here we present evidence that a cognitive dissonance effect can be found in children under conditions that offer better control for the social value of the outcome. This effect is quite similar to contrast effects that recently have been studied in animals. We suggest that contrast between the effort required to obtain the outcome and the outcome itself provides a more parsimonious account of this phenomenon and perhaps other related cognitive dissonance phenomena as well. Research will be needed to identify cognitive dissonance processes that are different from contrast effects of this kind.
Qi, Xingliang; Zhang, Jing; Liu, Yapeng; Ji, Shuang; Chen, Zheng; Sluiter, Judith K; Deng, Huihua
2014-04-01
The present study aims to investigate the relationship between effort-reward imbalance and hair cortisol concentration among teachers to examine whether hair cortisol can be a biomarker of chronic work stress. Hair samples were collected from 39 female teachers from three kindergartens. Cortisol was extracted from the hair samples with methanol, and cortisol concentrations were measured with high performance liquid chromatography-tandem mass spectrometry. Work stress was measured using the effort-reward imbalance scale. The ratio of effort to reward showed significantly positive association with hair cortisol concentration. The cortisol concentration in the system increases with the effort-reward imbalance. Measurement of hair cortisol can become a useful biomarker of chronic work stress. Copyright © 2014 Elsevier Inc. All rights reserved.
ELECTROFISHING DISTANCE AND NUMBER OF SPECIES COLLECTED FROM THREE RAFTABLE WESTERN RIVERS
A key issue in assessing a fish assemblage at a site is determining a sufficient sampling effort to adequately represent the species in an assemblage. Inadequate effort produces considerable noise in multiple samples at the site or under-represents the species present. Excessiv...
Estimating abundance of adult striped bass in reservoirs using mobile hydroacoustics
Hightower, Joseph E.; Taylor, J. Christopher; Degan, Donald J.
2013-01-01
Hydroacoustic surveys have proven valuable for estimating reservoir forage fish abundance but are more challenging for adult predators such as striped bass Morone saxatilis. Difficulties in assessing striped bass in reservoirs include their low density and the inability to distinguish species with hydroacoustic data alone. Despite these difficulties, mobile hydroacoustic surveys have potential to provide useful data for management because of the large sample volume compared to traditional methods such as gill netting and the ability to target specific areas where striped bass are aggregated. Hydroacoustic estimates of reservoir striped bass have been made using mobile surveys, with data analysis using a threshold for target strength in order to focus on striped bass-sized targets, and auxiliary sampling with nets to obtain species composition. We provide recommendations regarding survey design, based in part on simulations that provide insight on the level of effort that would be required to achieve reasonable estimates of abundance. Future surveys may be able to incorporate telemetry or other sonar techniques such as side-scan or multibeam in order to focus survey efforts on productive habitats (within lake and vertically). However, species apportionment will likely remain the main source of error, and we see no hydroacoustic system on the horizon that will identify fish by species at the spatial and temporal scale required for most reservoir surveys. In situations where species composition can be reliably assessed using traditional gears, abundance estimates from hydroacoustic methods should be useful to fishery managers interested in developing harvest regulations, assessing survival of stocked juveniles, identifying seasonal aggregations, and examining predator–prey balance.
Ensemble Sampling vs. Time Sampling in Molecular Dynamics Simulations of Thermal Conductivity
Gordiz, Kiarash; Singh, David J.; Henry, Asegun
2015-01-29
In this report we compare time sampling and ensemble averaging as two different methods available for phase space sampling. For the comparison, we calculate thermal conductivities of solid argon and silicon structures, using equilibrium molecular dynamics. We introduce two different schemes for the ensemble averaging approach, and show that both can reduce the total simulation time as compared to time averaging. It is also found that velocity rescaling is an efficient mechanism for phase space exploration. Although our methodology is tested using classical molecular dynamics, the ensemble generation approaches may find their greatest utility in computationally expensive simulations such as first principles molecular dynamics. For such simulations, where each time step is costly, time sampling can require long simulation times because each time step must be evaluated sequentially and therefore phase space averaging is achieved through sequential operations. On the other hand, with ensemble averaging, phase space sampling can be achieved through parallel operations, since each ensemble is independent. For this reason, particularly when using massively parallel architectures, ensemble sampling can result in much shorter simulation times and exhibits similar overall computational effort.
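The trade-off the abstract describes — one long sequential trajectory versus many short, independent (and therefore parallelizable) runs — can be illustrated with a toy stand-in for the simulation output. The noisy signal below is an assumption for illustration, not an actual MD observable:

```python
import numpy as np

# Stand-in "instantaneous" signal whose long-run average we want,
# playing the role of samples contributing to a conductivity estimate.
def sample_run(n_steps, rng):
    return rng.normal(loc=5.0, scale=1.0, size=n_steps)

# Time sampling: one long trajectory; steps are inherently sequential.
time_avg = sample_run(100_000, np.random.default_rng(0)).mean()

# Ensemble sampling: many short, independent runs started from
# different phase-space points (here, different seeds). Each run could
# execute on a separate node, so wall-clock time shrinks even though
# the total computational effort is similar.
ensemble_avgs = [sample_run(1_000, np.random.default_rng(s)).mean()
                 for s in range(100)]
ensemble_avg = np.mean(ensemble_avgs)
```

Both estimators draw the same total number of samples and converge to the same value; only the wall-clock structure of the work differs.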
Kane, Michael J; Brown, Leslie H; McVay, Jennifer C; Silvia, Paul J; Myin-Germeys, Inez; Kwapil, Thomas R
2007-07-01
An experience-sampling study of 124 undergraduates, pretested on complex memory-span tasks, examined the relation between working memory capacity (WMC) and the experience of mind wandering in daily life. Over 7 days, personal digital assistants signaled subjects eight times daily to report immediately whether their thoughts had wandered from their current activity, and to describe their psychological and physical context. WMC moderated the relation between mind wandering and activities' cognitive demand. During challenging activities requiring concentration and effort, higher-WMC subjects maintained on-task thoughts better, and mind-wandered less, than did lower-WMC subjects. The results were therefore consistent with theories of WMC emphasizing the role of executive attention and control processes in determining individual differences and their cognitive consequences.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
...The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a pre-clearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and continuing collections of information in accordance with the Paperwork Reduction Act of 1995, 44 U.S.C. 3506(c)(2)(A). This program helps to assure that requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration (MSHA) is soliciting comments concerning the proposed information collection for updating Radiation Sampling and Exposure Records.
Power of sign surveys to monitor population trend
Kendall, Katherine C.; Metzgar, Lee H.; Patterson, David A.; Steele, Brian M.
1992-01-01
The urgent need for an effective monitoring scheme for grizzly bear (Ursus arctos) populations led us to investigate the effort required to detect changes in populations of low-density dispersed animals, using sign (mainly scats and tracks) they leave on trails. We surveyed trails in Glacier National Park for bear tracks and scats during five consecutive years. Using these data, we modeled the occurrence of bear sign on trails, then estimated the power of various sampling schemes. Specifically, we explored the power of bear sign surveys to detect a 20% decline in sign occurrence. Realistic sampling schemes appear feasible if the density of sign is high enough, and we provide guidelines for designs with adequate replication to monitor long-term trends of dispersed populations using sign occurrences on trails.
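The kind of power calculation described above can be sketched with a Monte Carlo simulation. Everything numeric here is a hypothetical: the per-segment detection probability, the one-sided two-proportion z-test, and the segment counts are illustrative, not the study's actual survey design or occurrence model.

```python
import numpy as np

rng = np.random.default_rng(42)

def survey_power(p_before, decline, n_segments, n_sims=2000):
    """Monte Carlo power of a one-sided two-proportion z-test to detect
    a relative decline in per-segment probability of finding sign."""
    p_after = p_before * (1 - decline)
    z_crit = 1.645                              # one-sided, alpha = 0.05
    hits = 0
    for _ in range(n_sims):
        x1 = rng.binomial(n_segments, p_before)  # sign detections, survey 1
        x2 = rng.binomial(n_segments, p_after)   # sign detections, survey 2
        p1, p2 = x1 / n_segments, x2 / n_segments
        pooled = (x1 + x2) / (2 * n_segments)
        se = np.sqrt(2 * pooled * (1 - pooled) / n_segments)
        if se > 0 and (p1 - p2) / se > z_crit:
            hits += 1
    return hits / n_sims

# Power to detect a 20% decline grows with survey effort
# (number of trail segments checked per survey).
low_effort = survey_power(0.5, 0.20, n_segments=100)
high_effort = survey_power(0.5, 0.20, n_segments=800)
```

The same loop structure extends naturally to the correlated, trail-level occurrence models the authors fit; only the data-generating step inside the loop changes.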
Surface-enhanced Raman sensor for trace chemical detection in water
NASA Astrophysics Data System (ADS)
Lee, Vincent Y.; Farquharson, Stuart; Rainey, Petrie M.
1999-11-01
Surface-enhanced Raman spectroscopy (SERS) promises to be one of the most sensitive methods for chemical detection and in recent years SERS has been used for chemical, biochemical, environmental, and physiological applications. A variety of methods using various media (electrodes, colloids, and substrates) have been successfully developed to enhance Raman signals by six orders of magnitude and more. However, SERS has not become a routine analytical technique because these methods are unable to provide quantitative measurements. This is largely due to the inability to fabricate a sampling medium that provides reversible chemical adsorption, analysis-to-analysis reproducibility, unrestricted solution requirements (reagent concentration and pH) or sample phase (liquid or solid). In an effort to overcome these restrictions, we have developed metal-doped sol-gels to provide surface-enhancement of Raman scattering.
Shelton, Larry R.
1994-01-01
The U.S. Geological Survey's National Water-Quality Assessment program includes extensive data-collection efforts to assess the quality of the Nation's streams. These studies require analyses of stream samples for major ions, nutrients, sediments, and organic contaminants. For the information to be comparable among studies in different parts of the Nation, consistent procedures specifically designed to produce uncontaminated samples for trace analysis in the laboratory are critical. This field guide describes the standard procedures for collecting and processing samples for major ions, nutrients, organic contaminants, sediment, and field analyses of conductivity, pH, alkalinity, and dissolved oxygen. Samples are collected and processed using modified and newly designed equipment made of Teflon to avoid contamination, including nonmetallic samplers (D-77 and DH-81) and a Teflon sample splitter. Field solid-phase extraction procedures developed to process samples for organic constituent analyses produce an extracted sample with stabilized compounds for more accurate results. Improvements to standard operational procedures include the use of processing chambers and capsule filtering systems. A modified collecting and processing procedure for organic carbon is designed to avoid contamination from equipment cleaned with methanol. Quality assurance is maintained by strict collecting and processing procedures, replicate sampling, equipment blank samples, and a rigid cleaning procedure using detergent, hydrochloric acid, and methanol.
Peixoto, Roberta B.; Machado-Silva, Fausto; Marotta, Humberto; Enrich-Prast, Alex; Bastviken, David
2015-01-01
Inland waters (lakes, rivers and reservoirs) are now understood to contribute large amounts of methane (CH4) to the atmosphere. However, fluxes are poorly constrained and there is a need for improved knowledge on spatiotemporal variability and on ways of optimizing sampling efforts to yield representative emission estimates for different types of aquatic ecosystems. Low-latitude floodplain lakes and wetlands are among the most high-emitting environments, and here we provide a detailed investigation of spatial and day-to-day variability in a shallow floodplain lake in the Pantanal in Brazil over a five-day period. CH4 flux was dominated by frequent and ubiquitous ebullition. A strong but predictable spatial variability (decreasing flux with increasing distance to the shore or to littoral vegetation) was found, and this pattern can be addressed by sampling along transects from the shore to the center. Although no distinct day-to-day variability was found, a significant increase in flux was identified from measurement day 1 to measurement day 5, which was likely attributable to a simultaneous increase in temperature. Our study demonstrates that representative emission assessments require consideration of spatial variability, but also that spatial variability patterns are predictable for lakes of this type and may therefore be addressed through limited sampling efforts if designed properly (e.g., fewer chambers may be used if organized along transects). Such optimized assessments of spatial variability are beneficial by allowing more of the available sampling resources to focus on assessing temporal variability, thereby improving overall flux assessments. PMID:25860229
On the importance of incorporating sampling weights in ...
Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey design requirements for occupancy models focus on the number of sample units and the pattern of revisits to a sample unit within a season. We focus on the sampling design or how the sample units are selected in geographic space (e.g., stratified, simple random, unequal probability, etc.). In a probability design, each sample unit has a sample weight which quantifies the number of sample units it represents in the finite (oftentimes areal) sampling frame. We demonstrate the importance of including sampling weights in occupancy model estimation when the design is not a simple random sample or equal probability design. We assume a finite areal sampling frame as proposed for a national bat monitoring program. We compare several unequal and equal probability designs and varying sampling intensity within a simulation study. We found the traditional single season occupancy model produced biased estimates of occupancy and lower confidence interval coverage rates compared to occupancy models that accounted for the sampling design. We also discuss how our findings inform the analyses proposed for the nascent North American Bat Monitoring Program and other collaborative synthesis efforts that propose h
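Why ignoring the design biases the estimate can be shown with a minimal sketch. The made-up finite frame below has high-quality habitat cells that are both more likely to be occupied and more heavily sampled, so the naive sample mean overstates occupancy; weighting each sampled unit by w = 1/π (its inverse inclusion probability, a Hájek-style estimator) corrects this. Detection is assumed perfect here, so no occupancy model is fit; only the design effect is illustrated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Finite areal frame: occupancy indicator for each cell, correlated
# with a habitat covariate (illustrative numbers).
N = 100_000
habitat = rng.uniform(0, 1, N)
occupied = (rng.uniform(0, 1, N) < 0.2 + 0.6 * habitat).astype(float)
true_occupancy = occupied.mean()

# Unequal-probability design: high-habitat cells are oversampled.
pi = 0.01 + 0.09 * habitat                 # inclusion probabilities
sampled = rng.uniform(0, 1, N) < pi

# Naive estimator ignores the design and is biased high, because the
# sample over-represents cells that tend to be occupied.
naive = occupied[sampled].mean()

# Design-based estimator: sampling weight = 1 / inclusion probability.
weights = 1.0 / pi[sampled]
weighted = np.sum(weights * occupied[sampled]) / np.sum(weights)
```

The same inverse-probability weighting carries into likelihood-based occupancy estimation, which is the setting the abstract addresses.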
Error-associated behaviors and error rates for robotic geology
NASA Technical Reports Server (NTRS)
Anderson, Robert C.; Thomas, Geb; Wagner, Jacob; Glasgow, Justin
2004-01-01
This study explores human error as a function of the decision-making process. One of many models for human decision-making is Rasmussen's decision ladder [9]. The decision ladder identifies the multiple tasks and states of knowledge involved in decision-making. The tasks and states of knowledge can be classified by the level of cognitive effort required to make the decision, leading to the skill, rule, and knowledge taxonomy (Rasmussen, 1987). Skill based decisions require the least cognitive effort and knowledge based decisions require the greatest cognitive effort. Errors can occur at any of the cognitive levels.
Duarte, Jaime E; Gebrekristos, Berkenesh; Perez, Sergi; Rowe, Justin B; Sharp, Kelli; Reinkensmeyer, David J
2013-06-01
Robotic devices can modulate success rates and required effort levels during motor training, but it is unclear how this affects performance gains and motivation. Here we present results from training unimpaired humans in a virtual golf-putting task, and training spinal cord injured (SCI) rats in a grip strength task using robotically modulated success rates and effort levels. Robotic assistance in golf practice increased trainees' feelings of competence, and, paradoxically, increased their sense of effort, even though it had mixed effects on learning. Reducing effort during a grip strength training task led rats with SCI to practice the task more frequently. However, the more frequent practice of these rats did not cause them to exceed the strength gains achieved by rats that exercised less often at higher required effort levels. These results show that increasing success and decreasing effort with robots increases motivation, but has mixed effects on performance gains.
Barcoding of live human peripheral blood mononuclear cells for multiplexed mass cytometry.
Mei, Henrik E; Leipold, Michael D; Schulz, Axel Ronald; Chester, Cariad; Maecker, Holden T
2015-02-15
Mass cytometry is developing as a means of multiparametric single-cell analysis. In this study, we present an approach to barcoding separate live human PBMC samples for combined preparation and acquisition on a cytometry by time of flight instrument. Using six different anti-CD45 Ab conjugates labeled with Pd104, Pd106, Pd108, Pd110, In113, and In115, respectively, we barcoded up to 20 samples with unique combinations of exactly three different CD45 Ab tags. Cell events carrying more than or less than three different tags were excluded from analyses during Boolean data deconvolution, allowing for precise sample assignment and the electronic removal of cell aggregates. Data from barcoded samples matched data from corresponding individually stained and acquired samples, at cell event recoveries similar to individual sample analyses. The approach greatly reduced technical noise and minimizes unwanted cell doublet events in mass cytometry data, and it reduces wet work and Ab consumption. It also eliminates sample-to-sample carryover and the requirement of instrument cleaning between samples, thereby effectively reducing overall instrument runtime. Hence, CD45 barcoding facilitates accuracy of mass cytometric immunophenotyping studies, thus supporting biomarker discovery efforts, and it should be applicable to fluorescence flow cytometry as well. Copyright © 2015 by The American Association of Immunologists, Inc.
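The barcode arithmetic is simple combinatorics: choosing exactly three of the six mass tags yields C(6,3) = 20 unique codes, which is where the 20-sample capacity comes from. The sketch below illustrates the coding and the Boolean exclusion rule; the `assign` helper is illustrative, not the authors' deconvolution software.

```python
from itertools import combinations

# The six CD45 conjugate tags used in the barcoding scheme.
tags = ["Pd104", "Pd106", "Pd108", "Pd110", "In113", "In115"]

# Each sample gets a unique combination of exactly three tags:
# C(6, 3) = 20 possible barcodes, matching the 20-sample capacity.
barcodes = list(combinations(tags, 3))

def assign(event_tags):
    """Debarcoding rule: keep only events positive for exactly three
    known tags and map the tag set to its sample index; events with
    more tags (aggregates) or fewer (poorly stained) are excluded."""
    observed = tuple(sorted(set(event_tags), key=tags.index))
    return barcodes.index(observed) if observed in barcodes else None
```

Requiring exactly three positive tags is what lets the Boolean deconvolution drop cell-cell doublets, since an aggregate of two differently barcoded cells carries more than three tags.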
Noise Control in Space Shuttle Orbiter
NASA Technical Reports Server (NTRS)
Goodman, Jerry R.
2009-01-01
Acoustic limits in habitable space enclosures are required to ensure crew safety, comfort, and habitability. Noise control is implemented to ensure compliance with the acoustic requirements. The purpose of this paper is to describe problems with establishing acoustic requirements and noise control efforts, and present examples of noise control treatments and design applications used in the Space Shuttle Orbiter. Included is the need to implement the design discipline of acoustics early in the design process, and noise control throughout a program to ensure that limits are met. The use of dedicated personnel to provide expertise and oversight of acoustic requirements and noise control implementation has shown to be of value in the Space Shuttle Orbiter program. It is concluded that to achieve acceptable and safe noise levels in the crew habitable space, early resolution of acoustic requirements and implementation of effective noise control efforts are needed. Management support of established acoustic requirements and noise control efforts is essential.
Towards a Comparative Index of Seaport Climate-Risk: Development of Indicators from Open Data
NASA Astrophysics Data System (ADS)
McIntosh, R. D.; Becker, A.
2016-02-01
Seaports represent an example of coastal infrastructure that is at once critical to global trade, constrained to the land-sea interface, and exposed to weather and climate hazards. Seaports face impacts associated with projected changes in sea level, sedimentation, ocean chemistry, wave dynamics, temperature, precipitation, and storm frequency and intensity. Port decision-makers have the responsibility to enhance resilience against these impacts. At the multi-port (regional or national) scale, policy-makers must prioritize adaptation efforts to maximize the efficiency of limited physical and financial resources. Prioritization requires comparing across seaports, and comparison requires a standardized assessment method, but efforts to date have either been limited in scope to exposure-only assessments or limited in scale to evaluate one port in isolation from a system of ports. In order to better understand the distribution of risk across ports and to inform transportation resilience policy, we are developing a comparative assessment method to measure the relative climate-risk faced by a sample of ports. Our mixed-methods approach combines a quantitative, data-driven, indicator-based assessment with qualitative data collected via expert-elicitation. In this presentation, we identify and synthesize over 120 potential risk indicators from open data sources. Indicators represent exposure, sensitivity, and adaptive capacity for a pilot sample of 20 ports. Our exploratory data analysis, including Principal Component Analysis, uncovered sources of variance between individual ports and between indicators. Next steps include convening an expert panel representing the perspectives of multiple transportation system agencies to find consensus on a suite of robust indicators and metrics for maritime freight node climate risk assessment. The index will be refined based on expert feedback, the sample size expanded, and additional indicators sought from closed data sources. 
Developing standardized indicators from available data is an essential step in risk assessment, as robust indicators can help policy-makers monitor resilience strategy implementation, target and justify resource expenditure for adaptation schemes, communicate adaptation to stakeholders, and benchmark progress.
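The exploratory analysis described above (standardizing indicators, then applying Principal Component Analysis to find the sources of variance among ports) can be sketched in a few lines. This is a minimal illustration with synthetic data: the matrix dimensions (20 ports x 6 indicators) and values are assumptions standing in for the study's indicator set, not its actual data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the indicator matrix: 20 ports x 6 indicators.
# The real study draws on 120+ indicators from open data sources.
X = rng.normal(size=(20, 6))

# Standardize each indicator (z-scores) so differing units cannot dominate.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via singular value decomposition of the standardized matrix.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
explained_variance_ratio = S ** 2 / np.sum(S ** 2)

# Scores: each port's coordinates along the principal components,
# showing which ports account for most of the variance.
scores = Z @ Vt.T

print(explained_variance_ratio.round(3))
```

Inspecting the explained-variance ratios indicates how many components are needed to summarize the indicator suite; the scores can then be used to compare ports along those components.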
Perspectives on a Policy That Never Was: Trying To Enhance Multiculturalism in a University Setting.
ERIC Educational Resources Information Center
Greenwald, Beatrice
This paper discusses the failure of the University of Washington to formulate a policy regarding the establishment of a Cultural and Ethnic Diversity (CED) course requirement for undergraduates despite nine years of efforts to do so, tracing the efforts to establish a CED requirement, along with the arguments for and against such a requirement. It…
ERIC Educational Resources Information Center
US Government Accountability Office, 2016
2016-01-01
When the Individuals with Disabilities Education Act (IDEA) was reauthorized in 2004, it included provisions to reduce administrative and paperwork requirements to address concerns about burden. The Government Accountability Office (GAO) was asked to review federal efforts to reduce burden related to meeting IDEA requirements for educating…
Beck, Adam W; Lombardi, Joseph V; Abel, Dorothy B; Morales, J Pablo; Marinac-Dabic, Danica; Wang, Grace; Azizzadeh, Ali; Kern, John; Fillinger, Mark; White, Rodney; Cronenwett, Jack L; Cambria, Richard P
2017-05-01
United States Food and Drug Administration (FDA)-mandated postapproval studies have long been a mainstay of the continued evaluation of high-risk medical devices after initial marketing approval; however, these studies often present challenges related to patient/physician recruitment and retention. Retrospective single-center studies also do not fully represent the spectrum of real-world performance, nor are they likely to have a sufficiently large sample size to detect important signals. In recent years, The FDA Center for Devices and Radiological Health has been promoting the development and use of patient registries to advance infrastructure and methodologies for medical device investigation. The FDA 2012 document, "Strengthening the National System for Medical Device Post-market Surveillance," highlighted registries as a core foundational infrastructure when linked to other complementary data sources, including embedded unique device identification. The Vascular Quality Initiative (VQI) thoracic endovascular aortic repair for type B aortic dissection project is an innovative method of using quality improvement registries to meet the needs of device evaluation after market approval. Here we report the organization and background of this project and highlight the innovation facilitated by collaboration of physicians, the FDA, and device manufacturers. This effort used an existing national network of VQI participants to capture patients undergoing thoracic endovascular aortic repair for acute type B aortic dissection within a registry that aligns with standard practice and existing quality efforts. The VQI captures detailed patient, device, and procedural data for consecutive eligible cases under the auspices of a Patient Safety Organization (PSO). Patients were divided into a 5-year follow-up group (200 acute; 200 chronic dissections) and a 1-year follow-up group (100 acute; 100 chronic).
The 5-year cohort required additional imaging details, and the 1-year group required standard VQI registry data entry. The sample size of patients in each of the 5-year acute and chronic dissection arms was achieved within 24 months of project initiation, and data capture for the 1-year follow-up group is also nearly complete. Data completeness and follow-up have been excellent, and the two FDA-approved devices for dissection are equally represented. Although the completeness of long-term follow-up is yet to be determined, the rapidity of data collection supports the use of this construct for device assessment after market approval. The alignment of this effort with routine clinical practice and ongoing quality improvement initiatives is critical and has required minimal additional effort by practitioners, thus facilitating patient inclusion. Importantly, the success and development of this unique project has helped inform FDA strategy for future device evaluation after market approval. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Variance in binary stellar population synthesis
NASA Astrophysics Data System (ADS)
Breivik, Katelyn; Larson, Shane L.
2016-03-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations in less than a week, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
Studying Variance in the Galactic Ultra-compact Binary Population
NASA Astrophysics Data System (ADS)
Larson, Shane L.; Breivik, Katelyn
2017-01-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations on week-long timescales, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
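The idea behind the rapid Monte Carlo approach in the two abstracts above (many cheap realizations of a population model expose the simulation-to-simulation variance that a single high-fidelity run hides) can be illustrated with a toy sketch. The distributions and detection cut below are illustrative assumptions, not the binary-evolution model used in the studies.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_population(n_binaries, rng):
    """One Monte Carlo realization of a toy compact-binary population.
    Orbital periods are drawn from an assumed log-uniform prior; the
    summary statistic is the count of short-period (detectable) systems."""
    log_p = rng.uniform(np.log10(0.001), np.log10(1.0), size=n_binaries)
    periods = 10 ** log_p                  # periods in days (toy units)
    return np.sum(periods < 0.01)          # systems in the short-period band

# Thousands of cheap realizations quantify the variance between
# simulated populations drawn from the same underlying model.
counts = np.array([simulate_population(5000, rng) for _ in range(2000)])
print(counts.mean(), counts.std())
```

The spread of `counts` across realizations is exactly the per-model variance the abstracts describe: a single expensive simulation yields one draw from this distribution, while the rapid technique maps out the whole distribution.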
Lead poisoning in precious metals refinery assayers: a failure to notify workers at high risk.
Kern, D G
1994-05-01
Lead poisoning in a precious metals refinery fire assayer and a routine OSHA inspection prompted an investigation of the index facility, a survey of the industry, and efforts to notify assayers of this previously unrecognized hazard. Air and blood samples were obtained at the index facility. Management personnel from all fire assay laboratories in Rhode Island and southern Massachusetts were interviewed. The industry's trade association, OSHA, NIOSH, trade unions, and the media were asked to assist in a nationwide notification effort. Assayers at the index facility had excessive exposures to lead due to an age-old, lead-based assaying method that remains the industry gold standard. Blood lead levels of the three assayers (mean 61.3 micrograms/dl, range 48-86 micrograms/dl) were considerably higher than those of 16 other refinery workers (mean 27.4 micrograms/dl, range 13-49 micrograms/dl). The industry survey revealed inadequate knowledge of both the lead hazard and the applicability of the OSHA lead standard. Notification efforts failed in large part due to economic obstacles. The notification of workers at high risk of lead exposure and the eradication of occupational lead poisoning will require greater attention to economic forces.
Panico, James; Healey, E Charles
2009-04-01
To determine how text type, topic familiarity, and stuttering frequency influence listener recall, comprehension, and perceived mental effort. Sixty adults listened to familiar and unfamiliar narrative and expository texts produced with 0%, 5%, 10%, and 15% stuttering. Participants listened to 4 experimental text samples at only 1 stuttering frequency. After hearing the text samples, each listener performed a free recall task, answered cued recall questions, answered story comprehension questions, and rated their perceived mental effort. Free and cued recall as well as story comprehension scores were higher for narrative than for expository texts. Free and cued recall scores were better for familiar than for unfamiliar stories, although topic familiarity did not affect story comprehension scores. Samples with all levels of stuttering resulted in higher mental effort ratings for both text types and topic familiarities. Stuttering has a greater influence on listener recall and comprehension for narrative than for expository texts. Topic familiarity affects free and cued recall but has no influence on story comprehension. Regardless of the amount of stuttering, mental effort was high for both text types and levels of familiarity.
Determining the disease management process for epileptic patients: A qualitative study.
Hosseini, Nazafarin; Sharif, Farkhondeh; Ahmadi, Fazlollah; Zare, Mohammad
2016-01-01
Epilepsy exposes patients to many physical, social, and emotional challenges. Thus, it seems to portray a complex picture and needs holistic care. Medical treatment and the psychosocial aspects of epilepsy remain central to managing and improving the patient's quality of life through team efforts. Some studies have shown the dimensions of self-management, but the disease management process of epilepsy patients, especially in Iran, is not clear. This study aimed to determine the disease management process in patients with epilepsy in Iran. This qualitative, grounded theory study was conducted from January 2009 to February 2012 in Isfahan city (Iran). Thirty-two participants were recruited by goal-oriented and snowball sample selection and theoretical sampling methods. After conducting a total of 43 in-depth interviews with the participants, the researchers reached data saturation. Data were analyzed using the Strauss and Corbin method. With a focus on the disease management process, the researchers found three main themes and seven sub-themes as a psychosocial process (PSP). The main themes were: perception of threat to self-identity, effort to preserve self-identity, and burnout. The psychosocial aspect of the disease generated one main variable, "the perception of identity loss," and one central variable, "searching for self-identity." Participants attributed threat to self-identity and burnout to the way their disease was managed, requiring efforts to preserve their identity. Recommendations consist of support programs and strategies to improve the public perception of epilepsy in Iran, help patients accept their condition and preserve self-identity, and most importantly, enhance the medical management of epilepsy.
Reiser, Vladimír; Smith, Ryan C; Xue, Jiyan; Kurtz, Marc M; Liu, Rong; Legrand, Cheryl; He, Xuanmin; Yu, Xiang; Wong, Peggy; Hinchcliffe, John S; Tanen, Michael R; Lazar, Gloria; Zieba, Renata; Ichetovkin, Marina; Chen, Zhu; O'Neill, Edward A; Tanaka, Wesley K; Marton, Matthew J; Liao, Jason; Morris, Mark; Hailman, Eric; Tokiwa, George Y; Plump, Andrew S
2011-11-01
With expanding biomarker discovery efforts and increasing costs of drug development, it is critical to maximize the value of mass-limited clinical samples. The main limitation of available methods is the inability to isolate and analyze, from a single sample, molecules requiring incompatible extraction methods. Thus, we developed a novel semiautomated method for tissue processing and tissue milling and division (TMAD). We used a SilverHawk atherectomy catheter to collect atherosclerotic plaques from patients requiring peripheral atherectomy. Tissue preservation by flash freezing was compared with immersion in RNAlater®, and tissue grinding by traditional mortar and pestle was compared with TMAD. Comparators were protein, RNA, and lipid yield and quality. Reproducibility of analyte yield from aliquots of the same tissue sample processed by TMAD was also measured. The quantity and quality of biomarkers extracted from tissue prepared by TMAD was at least as good as that extracted from tissue stored and prepared by traditional means. TMAD enabled parallel analysis of gene expression (quantitative reverse-transcription PCR, microarray), protein composition (ELISA), and lipid content (biochemical assay) from as little as 20 mg of tissue. The mean correlation was r = 0.97 in molecular composition (RNA, protein, or lipid) between aliquots of individual samples generated by TMAD. We also demonstrated that it is feasible to use TMAD in a large-scale clinical study setting. The TMAD methodology described here enables semiautomated, high-throughput sampling of small amounts of heterogeneous tissue specimens by multiple analytical techniques with generally improved quality of recovered biomolecules.
Vassena, Eliana; Gerrits, Robin; Demanet, Jelle; Verguts, Tom; Siugzdaite, Roma
2018-04-26
Preparing for a mentally demanding task calls upon cognitive and motivational resources. The underlying neural implementation of these mechanisms is receiving growing attention because of its implications for professional, social, and medical contexts. While several fMRI studies converge in assigning a crucial role to a cortico-subcortical network including Anterior Cingulate Cortex (ACC) and striatum, the involvement of Dorsolateral Prefrontal Cortex (DLPFC) during mental effort anticipation has yet to be replicated. This study was designed to target DLPFC contribution to anticipation of a difficult task using functional Near Infrared Spectroscopy (fNIRS), as a more cost-effective tool measuring cortical hemodynamics. We adapted a validated mental effort task, where participants performed easy and difficult mental calculation, and measured DLPFC activity during the anticipation phase. As hypothesized, DLPFC activity increased during anticipation of a hard task as compared to an easy task. Besides replicating previous fMRI work, these results establish fNIRS as an effective tool to investigate cortical contributions to anticipation of effortful behavior. This is especially useful if one requires testing large samples (e.g., to target individual differences), populations with contraindication for functional MRI (e.g., infants or patients with metal implants), or subjects in more naturalistic environments (e.g., work or sport). Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Rapid extraction and assay of uranium from environmental surface samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Christopher A.; Chouyyok, Wilaiwan; Speakman, Robert J.
Extraction methods enabling faster removal and concentration of uranium compounds for improved trace and low-level assay are demonstrated for standard surface sampling material in support of nuclear safeguards efforts, health monitoring, and other nuclear analysis applications. A key problem with the existing surface sampling swipes is the requirement for complete digestion of the sample and sampling matrix. This is a time-consuming and labour-intensive process that limits laboratory throughput, elevates costs, and increases background levels. Various extraction methods are explored for their potential to quickly and efficiently remove different chemical forms of uranium from standard surface sampling material. A combination of carbonate and peroxide solutions is shown to give the most rapid and complete form of uranyl compound extraction and dissolution. This rapid extraction process is demonstrated to be compatible with standard inductively coupled plasma mass spectrometry methods for uranium isotopic assay as well as screening techniques such as x-ray fluorescence. The general approach described has application beyond uranium to other analytes of nuclear forensic interest (e.g., rare earth elements and plutonium) as well as heavy metals for environmental and industrial hygiene monitoring.
Testing and modeling of PBX-9501 shock initiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lam, Kim; Foley, Timothy; Novak, Alan
2010-01-01
This paper describes an ongoing effort to develop a detonation sensitivity test for PBX-9501 that is suitable for studying pristine and damaged HE. The approach involves testing and comparing the sensitivities of HE pressed to various densities and those of pre-damaged samples with similar porosities. The ultimate objectives are to understand the response of pre-damaged HE to shock impacts and to develop practical computational models for use in system analysis codes for HE safety studies. Computer simulation with the CTH shock physics code is used to aid the experimental design and analyze the test results. In the calculations, initiation and growth or failure of detonation are modeled with the empirical HVRB model. The historical LANL SSGT and LSGT were reviewed, and it was determined that a new, modified gap test should be developed to satisfy the current requirements. In the new test, the donor/spacer/acceptor assembly is placed in a holder that is designed to work with fixtures for pre-damaging the acceptor sample. CTH simulations were made of the gap test with PBX-9501 samples pressed to three different densities. The calculated sensitivities were validated by test observations. The agreement between the computed and experimental critical gap thicknesses, ranging from 9 to 21 mm under various test conditions, is well within 1 mm. These results show that numerical modeling is a valuable complement to the experimental efforts in studying and understanding shock initiation of PBX-9501.
Peek, Monica E.; Wilson, Shannon C.; Bussey-Jones, Jada; Lypson, Monica; Cordasco, Kristina; Jacobs, Elizabeth A.; Bright, Cedric; Brown, Arleen F.
2012-01-01
Purpose To characterize national physician organizations’ efforts to reduce health disparities and identify organizational characteristics associated with such efforts. Method This cross-sectional study was conducted between September 2009 and June 2010. The authors used two-sample t tests and chi-square tests to compare the proportion of organizations with disparity-reducing activities between different organizational types (e.g., primary care versus subspecialty organizations, small [<1,000 members] versus large [>5,000 members]). Inclusion criteria required physician organizations to be (1) focused on physicians, (2) national in scope, and (3) membership based. Results The number of activities per organization ranged from 0 to 22. Approximately half (53%) of organizations had 0 or 1 disparity-reducing activities. Organizational characteristics associated with having at least 1 disparity-reducing effort included membership size (88% of large groups versus 58% of small groups had at least 1 activity; P = .004) and the presence of a health disparities committee (95% versus 59%; P < .001). Primary care (versus subspecialty) organizations and racial/ethnic minority physician organizations were more likely to have disparity-reducing efforts, although findings were not statistically significant. Common themes addressed by activities were health care access, health care disparities, workforce diversity, and language barriers. Common strategies included education of physicians/trainees and patients/general public, position statements, and advocacy. Conclusions Despite the national priority to eliminate health disparities, more than half of national physician organizations are doing little to address this problem. Primary care and minority physician organizations, and those with disparities committees, may provide leadership to extend the scope of disparity-reduction efforts. PMID:22534593
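The chi-square comparison reported above (e.g., 88% of large versus 58% of small organizations with at least one disparity-reducing activity) can be reproduced in miniature. The cell counts below are assumptions shaped to match the reported proportions (50 organizations per group is a hypothetical size, not the study's actual denominators).

```python
import numpy as np

# Illustrative 2x2 table: rows are organization size, columns are
# whether the organization had >=1 disparity-reducing activity.
#                  >=1 activity  none
observed = np.array([[44, 6],     # large organizations (88% of 50)
                     [29, 21]])   # small organizations (58% of 50)

# Pearson chi-square test of independence, computed directly:
# expected counts come from the row and column margins.
row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row * col / observed.sum()
chi2 = ((observed - expected) ** 2 / expected).sum()
print(f"chi2 = {chi2:.2f}")  # 1 degree of freedom for a 2x2 table
```

A chi-square statistic this large on 1 degree of freedom corresponds to P < .01, consistent with the direction of the P = .004 result the abstract reports.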
Salamone, John D; Correa, Merce; Yohn, Samantha; Lopez Cruz, Laura; San Miguel, Noemi; Alatorre, Luisa
2016-06-01
This review paper is focused upon the involvement of mesolimbic dopamine (DA) and related brain systems in effort-based processes. Interference with DA transmission affects instrumental behavior in a manner that interacts with the response requirements of the task, such that rats with impaired DA transmission show a heightened sensitivity to ratio requirements. Impaired DA transmission also affects effort-related choice behavior, which is assessed by tasks that offer a choice between a preferred reinforcer that has a high work requirement vs. a less preferred reinforcer that can be obtained with minimal effort. Rats and mice with impaired DA transmission reallocate instrumental behavior away from food-reinforced tasks with high response costs, and show increased selection of low reinforcement/low cost options. Tests of effort-related choice have been developed into models of pathological symptoms of motivation that are seen in disorders such as depression and schizophrenia. These models are being employed to explore the effects of conditions associated with various psychopathologies, and to assess drugs for their potential utility as treatments for effort-related symptoms. Studies of the pharmacology of effort-based choice may contribute to the development of treatments for symptoms such as psychomotor slowing, fatigue or anergia, which are seen in depression and other disorders. Copyright © 2016 Elsevier B.V. All rights reserved.
Carvalho, W D; Adania, C H; Esbérard, C E L
2013-02-01
Sampling allows assessing the impact of human activities on mammal communities. It is also possible to assess the accuracy of different sampling methods, especially when the sampling effort is similar. The present study aimed at comparing two mammalian surveys carried out over a three-year interval, in terms of sampling effort, capture success, abundance of domestic dogs, impact of human activities, and relative biomass using camera traps, in the Serra do Japi Biological Reserve and surroundings, located in Jundiaí, state of São Paulo, southeastern Brazil. The total richness recorded was 13 species, one domestic and 12 wild mammals. Sampling effort in both surveys was similar, but capture success and number of captures differed. The abundance of wild mammals and dogs also differed between surveys. There was a highly significant correlation between abundance of wild mammals and capture effort for the survey performed in 2006/2007, but not for the survey performed in 2009/2010. The difference between samples may be related to human disturbance, since the number of domestic mammals photographed was higher in the second survey, three years after the first survey. Despite being a reserve, the area is still under pressure from urbanization, biological invasion, environmental degradation, and hunting, which may reduce the abundance of wild mammals.
Estimating rates of local species extinction, colonization and turnover in animal communities
Nichols, James D.; Boulinier, T.; Hines, J.E.; Pollock, K.H.; Sauer, J.R.
1998-01-01
Species richness has been identified as a useful state variable for conservation and management purposes. Changes in richness over time provide a basis for predicting and evaluating community responses to management, to natural disturbance, and to changes in factors such as community composition (e.g., the removal of a keystone species). Probabilistic capture-recapture models have been used recently to estimate species richness from species count and presence-absence data. These models do not require the common assumption that all species are detected in sampling efforts. We extend this approach to the development of estimators useful for studying the vital rates responsible for changes in animal communities over time: rates of local species extinction, turnover, and colonization. Our approach to estimation is based on capture-recapture models for closed animal populations that permit heterogeneity in detection probabilities among the different species in the sampled community. We have developed a computer program, COMDYN, to compute many of these estimators and associated bootstrap variances. Analyses using data from the North American Breeding Bird Survey (BBS) suggested that the estimators performed reasonably well. We recommend estimators based on probabilistic modeling for future work on community responses to management efforts as well as on basic questions about community dynamics.
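The core idea above (estimating richness without assuming every species is detected, with bootstrap variances) can be sketched with one classical heterogeneity-tolerant estimator. This uses the first-order jackknife (Burnham & Overton) rather than the specific models implemented in COMDYN, and the detection history and detection probability are illustrative assumptions, not BBS data.

```python
import numpy as np

rng = np.random.default_rng(7)

def jackknife_richness(detections):
    """First-order jackknife estimator of species richness, a standard
    way to relax the assumption that all species are detected.
    Rows are candidate species, columns are sampling occasions."""
    detections = detections[detections.sum(axis=1) > 0]  # species actually seen
    s_obs, k = detections.shape
    f1 = int(np.sum(detections.sum(axis=1) == 1))  # seen on exactly one occasion
    return s_obs + f1 * (k - 1) / k

# Toy detection history: 12 candidate species x 5 occasions, each species
# detected on each occasion with assumed probability 0.4.
dm = (rng.random((12, 5)) < 0.4).astype(int)

estimate = jackknife_richness(dm)

# Bootstrap the occasions to attach a variance, in the spirit of COMDYN.
boot = [jackknife_richness(dm[:, rng.integers(0, dm.shape[1], size=5)])
        for _ in range(500)]
se = float(np.std(boot))
print(round(estimate, 1), round(se, 2))
```

The estimate always lies at or above the observed species count, with the correction driven by the number of species detected only once, which is where undetected species leave their statistical footprint.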
Is psychology suffering from a replication crisis? What does "failure to replicate" really mean?
Maxwell, Scott E; Lau, Michael Y; Howard, George S
2015-09-01
Psychology has recently been viewed as facing a replication crisis because efforts to replicate past study findings frequently do not show the same result. Often, the first study showed a statistically significant result but the replication does not. Questions then arise about whether the first study results were false positives, and whether the replication study correctly indicates that there is truly no effect after all. This article suggests these so-called failures to replicate may not be failures at all, but rather are the result of low statistical power in single replication studies, and the result of failure to appreciate the need for multiple replications in order to have enough power to identify true effects. We provide examples of these power problems and suggest some solutions using Bayesian statistics and meta-analysis. Although the need for multiple replication studies may frustrate those who would prefer quick answers to psychology's alleged crisis, the large sample sizes typically needed to provide firm evidence will almost always require concerted efforts from multiple investigators. As a result, it remains to be seen how many of the recently claimed failures to replicate will be supported or instead may turn out to be artifacts of inadequate sample sizes and single study replications. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
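The power problem described above can be made concrete with a back-of-the-envelope calculation. This is a normal-approximation sketch of two-sample test power, not the Bayesian or meta-analytic machinery the article advocates; the effect size and sample sizes are illustrative assumptions.

```python
from statistics import NormalDist

def replication_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample test for a
    standardized effect size d, via the normal approximation."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)
    z_effect = d / ((2.0 / n_per_group) ** 0.5)  # effect in standard-error units
    return (1 - nd.cdf(z_alpha - z_effect)) + nd.cdf(-z_alpha - z_effect)

# With a modest true effect (d = 0.3) and 50 subjects per group, a single
# replication is badly underpowered, so a nonsignificant result is weak
# evidence against the original finding; adequate power needs ~4x the n.
print(round(replication_power(0.3, 50), 2))   # well under 0.5
print(round(replication_power(0.3, 200), 2))  # above the conventional 0.8
```

This is exactly the article's point: a "failure to replicate" at n = 50 is the expected outcome most of the time even when the effect is real, which is why firm answers require larger samples or multiple pooled replications.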
Code of Federal Regulations, 2013 CFR
2013-07-01
... SUPPLIES OF DRINKING WATER § 214.9 Requirements. Providing emergency supplies of clean drinking water... met. (b) The extent of state and local efforts to provide clean drinking water and their capability to do so. Corps efforts to provide temporary supplies of drinking water must be limited to measures...
Digital Management and Curation of the National Rock and Ore Collections at NMNH, Smithsonian
NASA Astrophysics Data System (ADS)
Cottrell, E.; Andrews, B.; Sorensen, S. S.; Hale, L. J.
2011-12-01
The National Museum of Natural History, Smithsonian Institution, is home to the world's largest curated rock collection. The collection houses 160,680 physical rock and ore specimen lots ("samples"), all of which already have a digital record that can be accessed by the public through a searchable web interface (http://collections.mnh.si.edu/search/ms/). In addition, there are 66 accessions pending that, when catalogued, will add approximately 60,000 specimen lots. NMNH's collections are digitally managed on the KE EMu platform, which has emerged as the premier system for managing collections in natural history museums worldwide. In 2010 the Smithsonian released an ambitious 5-year Digitization Strategic Plan. In Mineral Sciences, new digitization efforts in the next five years will focus on integrating various digital resources for volcanic specimens. EMu sample records will link to the corresponding records for physical eruption information housed within the database of Smithsonian's Global Volcanism Program (GVP). Linkages are also planned between our digital records and geochemical databases (like EarthChem or PetDB) maintained by third parties. We anticipate that these linkages will increase the use of NMNH collections as well as engender new scholarly directions for research. Another large project the museum is currently undertaking involves the integration of the functionality of in-house designed Transaction Management software with the EMu database. This will allow access to the details (borrower, quantity, date, and purpose) of all loans of a given specimen through its catalogue record. We hope this will enable cross-referencing and fertilization of research ideas while avoiding duplicate efforts. While these digitization efforts are critical, we propose that the greatest challenge to sample curation is not posed by digitization and that a global sample registry alone will not ensure that samples are available for reuse.
We suggest instead that the ability of the Earth science community to identify and preserve important collections and make them available for future study is limited by personnel and space resources from the level of the individual PI to the level of national facilities. Moreover, when it comes to specimen "estate planning," the cultural attitudes of scientists, institutions, and funding agencies are often inadequate to provide for long-term specimen curation - even if specimen discovery is enabled by digital registry. Timely access to curated samples requires that adequate resources be devoted to the physical care of specimens (facilities) and to the personnel costs associated with curation - from the conservation, storage, and inventory management of specimens, to the dispersal of samples for research, education, and exhibition.
Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M
2012-03-09
A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing, software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, in addition to drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno 2008. We cite previous work under each category and then provide sub-requirements under each category, and provide example of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.
Monitoring benthic algal communities: A comparison of targeted and coefficient sampling methods
Edwards, Matthew S.; Tinker, M. Tim
2009-01-01
Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy, and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that the two methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; Targeted Sampling requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limiting, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.
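The coefficient approach can be sketched as follows. The cover classes below are hypothetical illustrations, not the actual scale used in the study: each standardized quadrat is scored with a coefficient representing a range of bottom cover, and range midpoints yield a site-level abundance estimate.

```python
# Hypothetical cover-class coefficients: each code maps to a (low, high)
# percent-cover range; the range midpoint serves as the quadrat's estimate.
COVER_CLASSES = {
    0: (0.0, 0.0),     # absent
    1: (0.0, 5.0),     # trace
    2: (5.0, 25.0),
    3: (25.0, 50.0),
    4: (50.0, 75.0),
    5: (75.0, 100.0),
}

def midpoint(code):
    low, high = COVER_CLASSES[code]
    return (low + high) / 2.0

def mean_cover(codes):
    """Estimate mean percent cover for a site from per-quadrat codes."""
    return sum(midpoint(c) for c in codes) / len(codes)
```

For example, `mean_cover([0, 5])` averages the midpoints 0.0 and 87.5 to give 43.75% cover; the coarseness of these midpoints is one plausible reason the method underestimates variability when abundances are low.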
Antipoaching standards in onshore hydrocarbon concessions drawn from a Central African case study.
Vanthomme, Hadrien P A; Tobi, Elie; Todd, Angelique F; Korte, Lisa; Alonso, Alfonso
2017-06-01
Unsustainable hunting outside protected areas is threatening tropical biodiversity worldwide and requires conservationists to engage increasingly in antipoaching activities. Following the example of ecocertified logging companies, we argue that other extractive industries managing large concessions should engage in antipoaching activities as part of their environmental management plans. Onshore hydrocarbon concessions should also adopt antipoaching protocols as a standard because they represent a biodiversity threat comparable to logging. We examined the spatiotemporal patterns of small- and large-mammal poaching in an onshore oil concession in Gabon, Central Africa, with a Bayesian occupancy model based on signs of poaching collected from 2010 to 2015 on antipoaching patrols. Patrol locations were initially determined based on local intelligence and past patrol successes (adaptive management) and subsequently with a systematic sampling of the concession. We generated maps of poaching probability in the concession and determined the temporal trends of this threat over 5 years. The spatiotemporal patterns of large- and small-mammal poaching differed throughout the concession, and likely these groups will need different management strategies. By elucidating the relationship between site-specific sampling effort and detection probability, the Bayesian method allowed us to set goals for future antipoaching patrols. Our results indicate that a combination of systematic sampling and adaptive management data is necessary to infer spatiotemporal patterns with the statistical method we used. 
On the basis of our case study, we recommend that hydrocarbon companies interested in implementing efficient antipoaching activities in their onshore concessions lay the foundation of long-needed industry standards by: adequately measuring antipoaching effort; mixing adaptive management and balanced sampling; setting goals for antipoaching effort; pairing patrols with large-mammal monitoring; supporting antipoaching patrols across the landscape; restricting access to their concessions; performing random searches for bushmeat and mammal products at points of entry; controlling urban and agricultural expansion; supporting bushmeat alternatives; and supporting land-use planning. Published 2016. This article is a U.S. Government work and is in the public domain in the USA. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
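The relationship between site-specific sampling effort and detection probability noted above can be illustrated with a standard occupancy-model identity. This is a sketch under the assumption of independent patrol visits with a constant per-visit detection probability; the numbers are illustrative, not taken from the study.

```python
import math

def cumulative_detection_prob(p_per_visit, n_visits):
    # Probability of detecting poaching sign at least once across
    # n independent patrol visits to an occupied site.
    return 1.0 - (1.0 - p_per_visit) ** n_visits

def visits_needed(p_per_visit, target):
    # Minimum number of visits for the cumulative detection
    # probability to reach the target level (e.g. 0.95).
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_per_visit))
```

For instance, with a per-visit detection probability of 0.5, five patrol visits are needed to reach 95% cumulative detection, which is the kind of effort goal the abstract describes setting for future patrols.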
New technologies to detect and monitor Phytophthora ramorum in plant, soil, and water samples
Paul Russell; Nathan McOwen; Robert Bohannon
2013-01-01
The focus of our research efforts has been to develop methods to quickly identify plants, soil, and water samples infested with Phytophthora spp., and to rapidly confirm the findings using novel isothermal DNA technologies suitable for field use. These efforts have led to the development of a rapid Immunostrip® that reliably detects...
Insufficient sampling to identify species affected by turbine collisions
Beston, Julie A.; Diffendorfer, James E.; Loss, Scott
2015-01-01
We compared the number of avian species detected and the sampling effort during fatality monitoring at 50 North American wind facilities. Facilities with short intervals between sampling events and high effort detected more species, but many facilities appeared undersampled. Species accumulation curves for 2 wind facilities studied for more than 1 year had yet to reach an asymptote. The monitoring effort that is typically invested is likely inadequate to identify all of the species killed by wind turbines. This may understate impacts for rare species of conservation concern that collide infrequently with turbines but suffer disproportionate consequences from those fatalities. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
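The species accumulation curves referenced above can be approximated by sample-based rarefaction. A minimal sketch follows; the detection list is illustrative, not data from any monitored facility.

```python
import random

def accumulation_curve(detections, trials=200, seed=42):
    """Mean species richness as a function of the number of carcass
    detections subsampled, averaged over random subsamples."""
    rng = random.Random(seed)
    n = len(detections)
    curve = []
    for k in range(1, n + 1):
        richness = [len(set(rng.sample(detections, k))) for _ in range(trials)]
        curve.append(sum(richness) / trials)
    return curve

# If the curve is still rising at k = n (no asymptote), the species
# list is likely incomplete and more monitoring effort is warranted.
```

Each carcass record contributes its species identity; a curve that has not flattened by the final detection is the pattern the authors observed at the two multi-year facilities.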
Zhou, Qing; Lengua, Liliana J.; Wang, Yun
2014-01-01
The relations of parents’ and teachers’ reports of temperament anger-irritability, positive emotionality, and effortful control (attention focusing and inhibitory control) to children’s externalizing and internalizing problems were examined in Chinese (N = 382) and U.S. (N = 322) samples of school-age children. Results suggested that in both cultures, low effortful control and high anger–irritability were associated with high externalizing problems, although the relations were stronger in the Chinese sample than in the U.S. sample. Low positive emotionality was associated with high internalizing problems in both cultures. However, high positive emotionality was associated with noncomorbid externalizing problems (teachers’ reports) in the Chinese sample but not in the U.S. sample. These findings suggest that there are considerable cross-cultural similarities in the temperament-adjustment associations, although some cross-cultural differences might exist. Implications of the findings for the detection and intervention of adjustment problems in Chinese children are discussed. PMID:19413428
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-17
...The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a pre-clearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and continuing collections of information in accordance with the Paperwork Reduction Act of 1995 [44 U.S.C. 3506(c)(2)(A)]. This program helps to assure that requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration (MSHA) is soliciting comments concerning the extension of the information collection for Radiation Sampling and Exposure Records, 30 CFR 57.5037 and 57.5040.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-21
...The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a pre-clearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and continuing collections of information in accordance with the Paperwork Reduction Act of 1995 [44 U.S.C. 3506(c)(2)(A)]. This program helps to assure that requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration (MSHA) is soliciting comments concerning the extension of the information collection for Radiation Sampling and Exposure Records, 30 CFR 57.5037 and 57.5040.
Double sampling to estimate density and population trends in birds
Bart, Jonathan; Earnst, Susan L.
2002-01-01
We present a method for estimating density of nesting birds based on double sampling. The approach involves surveying a large sample of plots using a rapid method such as uncorrected point counts, variable circular plot counts, or the recently suggested double-observer method. A subsample of those plots is also surveyed using intensive methods to determine actual density. The ratio of the mean count on those plots (using the rapid method) to the mean actual density (as determined by the intensive searches) is used to adjust results from the rapid method. The approach works well when results from the rapid method are highly correlated with actual density. We illustrate the method with three years of shorebird surveys from the tundra in northern Alaska. In the rapid method, surveyors covered ~10 ha/h and surveyed each plot a single time. The intensive surveys involved three thorough searches, required ~3 h/ha, and took 20% of the study effort. Surveyors using the rapid method detected an average of 79% of birds present. That detection ratio was used to convert the index obtained in the rapid method into an essentially unbiased estimate of density. Trends estimated from several years of data would also be essentially unbiased. Other advantages of double sampling are that (1) the rapid method can be changed as new methods become available, (2) domains can be compared even if detection rates differ, (3) total population size can be estimated, and (4) valuable ancillary information (e.g. nest success) can be obtained on intensive plots with little additional effort. We suggest that double sampling be used to test the assumption that rapid methods, such as variable circular plot and double-observer methods, yield density estimates that are essentially unbiased. The feasibility of implementing double sampling in a range of habitats needs to be evaluated.
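The ratio adjustment described above can be sketched in a few lines. This is a minimal illustration of the estimator with made-up numbers, not the study's data.

```python
def mean(xs):
    return sum(xs) / len(xs)

def double_sample_density(rapid_all, rapid_sub, actual_sub):
    """Adjust rapid-method counts using the detection ratio estimated
    on the intensively surveyed subsample of plots."""
    # Detection ratio: mean rapid count vs. mean actual density
    # on the plots surveyed by both methods.
    ratio = mean(rapid_sub) / mean(actual_sub)
    # Essentially unbiased density estimate for the full rapid sample.
    return mean(rapid_all) / ratio
```

With the paper's 79% detection ratio, a mean rapid count of 7.9 birds per plot would be adjusted upward to an estimated density of 10 birds per plot.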
Information Requirements for Integrating Spatially Discrete, Feature-Based Earth Observations
NASA Astrophysics Data System (ADS)
Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Hsu, L.; Song, L.; Zaslavsky, I.; Valentine, D. L.
2014-12-01
Several cyberinfrastructures have emerged for sharing observational data collected at densely sampled and/or highly instrumented field sites. These include the CUAHSI Hydrologic Information System (HIS), the Critical Zone Observatory Integrated Data Management System (CZOData), the Integrated Earth Data Applications (IEDA) and EarthChem system, and the Integrated Ocean Observing System (IOOS). These systems rely on standard data encodings and, in some cases, standard semantics for classes of geoscience data. Their focus is on sharing data on the Internet via web services in domain specific encodings or markup languages. While they have made progress in making data available, it still takes investigators significant effort to discover and access datasets from multiple repositories because of inconsistencies in the way domain systems describe, encode, and share data. Yet, there are many scenarios that require efficient integration of these data types across different domains. For example, understanding a soil profile's geochemical response to extreme weather events requires integration of hydrologic and atmospheric time series with geochemical data from soil samples collected over various depth intervals from soil cores or pits at different positions on a landscape. Integrated access to and analysis of data for such studies are hindered because common characteristics of data, including time, location, provenance, methods, and units are described differently within different systems. Integration requires syntactic and semantic translations that can be manual, error-prone, and lossy. We report information requirements identified as part of our work to define an information model for a broad class of earth science data - i.e., spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples. 
We sought to answer the question: "What information must accompany observational data for them to be archivable and discoverable within a publication system as well as interpretable once retrieved from such a system for analysis and (re)use?" We also describe development of multiple functional schemas (i.e., physical implementations for data storage, transfer, and archival) for the information model that capture the requirements reported here.
Monitoring trends in bat populations of the United States and territories: Problems and prospects
O'Shea, T.J.; Bogan, M. A.
2003-01-01
Bats are ecologically and economically important mammals. The life histories of bats (particularly their low reproductive rates and the need for some species to gather in large aggregations at limited numbers of roosting sites) make their populations vulnerable to declines. Many of the species of bats in the United States (U.S.) and territories are categorized as endangered or threatened, have been candidates for such categories, or are considered species of concern. The importance and vulnerability of bat populations makes monitoring trends in their populations a goal for their future management. However, scientifically rigorous monitoring of bat populations requires well-planned, statistically defensible efforts. This volume reports findings of an expert workshop held to examine the topic of monitoring populations of bats. The workshop participants included leading experts in sampling and analysis of wildlife populations, as well as experts in the biology and conservation of bats. Findings are reported in this volume under two sections. Part I of the report presents contributed papers that provide overviews of past and current efforts at monitoring trends in populations of bats in the U.S. and territories. These papers consider current techniques and problems, and summarize what is known about the status and trends in populations of selected groups of bats. The contributed papers in Part I also include a description of the monitoring program developed for bat populations in the United Kingdom, a critique of monitoring programs in wildlife in general with recommendations for survey and sampling strategies, and a compilation and analysis of existing data on trends in bats of the U.S. and territories. Efforts directed at monitoring bat populations are piecemeal and have shortcomings. 
In Part II of the report, the workshop participants provide critical analyses of these problems and develop recommendations for improving methods, defining objectives and priorities, gaining mandates, and enhancing information exchange to facilitate future efforts for monitoring trends in U.S. bat populations.
Discriminative stimuli that follow a delay have added value for pigeons.
DiGian, Kelly A; Friedrich, Andrea M; Zentall, Thomas R
2004-10-01
Clement, Feltus, Kaiser, and Zentall (2000) reported that pigeons prefer discriminative stimuli that require greater effort (more pecks) to obtain over those that require less effort. In the present experiment, we examined two variables associated with this phenomenon. First, we asked whether delay of reinforcement, presumably a relatively aversive event similar to effort, would produce similar effects. Second, we asked whether the stimulus preference produced by a prior relatively aversive event depends on its anticipation. Anticipation of delay was accomplished by signaling its occurrence. Results indicated that delays can produce preferences similar to those produced by increased effort, but only if the delays are signaled.
Practical application of opt-out recruitment methods in two health services research studies.
Miller, Christopher J; Burgess, James F; Fischer, Ellen P; Hodges, Deborah J; Belanger, Lindsay K; Lipschitz, Jessica M; Easley, Siena R; Koenig, Christopher J; Stanley, Regina L; Pyne, Jeffrey M
2017-04-14
Participant recruitment is an ongoing challenge in health research. Recruitment may be especially difficult for studies of access to health care because, even among those who are in care, people using services least often also may be hardest to contact and recruit. Opt-out recruitment methods (in which potential participants are given the opportunity to decline further contact about the study (opt out) following an initial mailing, and are then contacted directly if they have not opted out within a specified period) can be used for such studies. However, there is a dearth of literature on the effort needed for effective opt-out recruitment. In this paper we describe opt-out recruitment procedures for two studies on access to health care within the U.S. Department of Veterans Affairs. We report resource requirements for recruitment efforts (number of opt-out packets mailed and number of phone calls made). We also compare the characteristics of study participants to potential participants via t-tests, Fisher's exact tests, and chi-squared tests. Recruitment rates for our two studies were 12 and 21%, respectively. Across multiple study sites, we had to send between 4.3 and 9.2 opt-out packets to recruit one participant. The number of phone calls required to arrive at a final status for each potentially eligible Veteran (i.e. study participation or the termination of recruitment efforts) averaged 2.9 and 6.1 in the two studies, respectively. Study participants differed as expected from the population of potentially eligible Veterans based on planned oversampling of certain subpopulations. The final samples of participants did not differ statistically from those who were mailed opt-out packets, with one exception: in one of our two studies, participants had higher rates of mental health service use in the past year than did those mailed opt-out packets (64 vs. 47%). Our results emphasize the practicality of using opt-out methods for studies of access to health care. 
Despite the benefits of these methods, opt-out alone may be insufficient to eliminate non-response bias on key variables. Researchers will need to balance considerations of sample representativeness and feasibility when designing studies investigating access to care.
Advanced Curation: Solving Current and Future Sample Return Problems
NASA Technical Reports Server (NTRS)
Fries, M.; Calaway, M.; Evans, C.; McCubbin, F.
2015-01-01
Advanced Curation is a wide-ranging and comprehensive research and development effort at NASA Johnson Space Center that identifies and remediates sample related issues. For current collections, Advanced Curation investigates new cleaning, verification, and analytical techniques to assess their suitability for improving curation processes. Specific needs are also assessed for future sample return missions. For each need, a written plan is drawn up to achieve the requirement. The plan draws upon current Curation practices, input from Curators, the analytical expertise of the Astromaterials Research and Exploration Science (ARES) team, and suitable standards maintained by ISO, IEST, NIST and other institutions. Additionally, new technologies are adopted on the basis of need and availability. Implementation plans are tested using customized trial programs with statistically robust courses of measurement, and are iterated if necessary until an implementable protocol is established. Upcoming and potential NASA missions such as OSIRIS-REx, the Asteroid Retrieval Mission (ARM), sample return missions in the New Frontiers program, and Mars sample return (MSR) all feature new difficulties and specialized sample handling requirements. The Mars 2020 mission in particular poses a suite of challenges since the mission will cache martian samples for possible return to Earth. In anticipation of future MSR, the following problems are among those under investigation: What is the most efficient means to achieve the less than 1.0 ng/sq cm total organic carbon (TOC) cleanliness required for all sample handling hardware? How do we maintain and verify cleanliness at this level? The Mars 2020 Organic Contamination Panel (OCP) predicts that organic carbon, if present, will be present at the "one to tens" of ppb level in martian near-surface samples. The same samples will likely contain wt% perchlorate salts, or approximately 1,000,000x as much perchlorate oxidizer as organic carbon. 
The chemical kinetics of this reaction are poorly understood at present under the conditions of cached or curated martian samples. Among other parameters, what is the maximum temperature allowed during storage in order to preserve native martian organic compounds for analysis? What is the best means to collect headspace gases from cached martian (and other) samples? This gas will contain not only martian atmosphere but also off-gassed volatiles from the cached solids.
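The "approximately 1,000,000x" figure above follows directly from unit conversion. A quick sanity check with illustrative values (1 wt% perchlorate and 10 ppb organic carbon, both chosen from the ranges quoted in the abstract, not measured values):

```python
# 1 wt% is 1 part in 100 by mass; 1 ppb is 1 part in 1e9 by mass,
# so 1 wt% corresponds to 1e9 / 100 = 1e7 ppb.
ppb_per_wt_percent = 1e9 / 100

perchlorate_ppb = 1.0 * ppb_per_wt_percent  # assume ~1 wt% perchlorate salts
organic_carbon_ppb = 10.0                   # "one to tens" of ppb; take 10 ppb

oxidizer_to_organic = perchlorate_ppb / organic_carbon_ppb
print(oxidizer_to_organic)  # 1000000.0
```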
NASA Astrophysics Data System (ADS)
Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland
2016-08-01
In recent years, the systems engineering field has been coming to terms with a paradigm change in the approach to complexity management. Different strategies have been proposed to cope with highly interrelated systems, systems of systems, and collaborative system engineering, and a significant effort is being invested in standardization and ontology definition. In particular, Model Based System Engineering (MBSE) introduces methodologies for systematic system definition, development, validation, deployment, operation, and decommissioning, based on logical and visual relationship mapping rather than traditional 'document based' information management. Practical implementation in real large-scale projects is not uniform across fields. In space science missions, usage has been limited to subsystems or sample projects, with modeling in many instances performed 'a posteriori'. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty of demonstrating their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities and an analysis of the benefits and limitations identified, in particular to support requirement break-down and allocation and verification planning at mission level.
Hadj-Hammou, Jeneen; Loiselle, Steven; Ophof, Daniel; Thornhill, Ian
2017-01-01
While the role of citizen science in engaging the public and providing large-scale datasets has been demonstrated, the nature of and potential for this science to supplement environmental monitoring efforts by government agencies has not yet been fully explored. To this end, the present study investigates the complementarity of a citizen science programme to agency monitoring of water quality. The Environment Agency (EA) is the governmental public body responsible for, among other duties, managing and monitoring water quality and water resources in England. FreshWater Watch (FWW) is a global citizen science project that supports community monitoring of freshwater quality. FWW and EA data were assessed for their spatio-temporal complementarity by comparing the geographical and seasonal coverage of nitrate (N-NO3) sampling across the River Thames catchment by the respective campaigns between spring 2013 and winter 2015. The analysis reveals that FWW citizen science-collected data complements EA data by filling gaps in spatial and temporal coverage as well as gaps in waterbody type and size. In addition, partial spatio-temporal overlap in sampling efforts by the two actors is discovered, but EA sampling is found to be more consistent than FWW sampling. Statistical analyses indicate that regardless of broader geographical overlap in sampling effort, FWW sampling sites are associated with a lower stream order and water bodies of smaller surface areas than EA sampling sites. FWW also samples more still-water body sites than the EA. As a possible result of such differences in sampling tendencies, nitrate concentrations, a measure of water quality, are lower for FWW sites than EA sites. These findings strongly indicate that citizen science has clear potential to complement agency monitoring efforts by generating information on freshwater ecosystems that would otherwise be under reported.
NASA Astrophysics Data System (ADS)
Walker, J. D.; Ash, J. M.; Bowring, J.; Bowring, S. A.; Deino, A. L.; Kislitsyn, R.; Koppers, A. A.
2009-12-01
One of the most onerous tasks in rigorous development of data reporting and databases for geochronological and thermochronological studies is to fully capture all of the metadata needed to completely document both the analytical work as well as the interpretation effort. This information is available in the data reduction programs used by researchers, but has proven difficult to harvest into either publications or databases. For this reason, the EarthChem and EARTHTIME efforts are collaborating to foster the next generation of data management and discovery for age information by integrating data reporting with data reduction. EarthChem is a community-driven effort to facilitate the discovery, access, and preservation of geochemical data of all types and to support research and enable new and better science. EARTHTIME is also a community-initiated project whose aim is to foster the next generation of high-precision geochronology and thermochronology. In addition, collaboration with the CRONUS effort for cosmogenic radionuclides is in progress. EarthChem workers have met with groups working on the Ar-Ar, U-Pb, and (U-Th)/He systems to establish data reporting requirements as well as XML schemas to be used for transferring data from reduction programs to database. At present, we have prototype systems working for the U-Pb_Redux, ArArCalc, MassSpec, and Helios programs. In each program, the user can select to upload data and metadata to the GEOCHRON system hosted at EarthChem. There are two additional requirements for upload. The first is having a unique identifier (IGSN) obtained either manually or via web services contained within the reduction program from the SESAR system. The second is that the user selects whether the sample is to be available for discovery (public) or remain hidden (private). Search for data at the GEOCHRON portal can be done using age, method, mineral, or location parameters. 
Data can be downloaded in the full XML format for ingestion back into the reduction program or as abbreviated tables.
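The two upload requirements described above (a valid IGSN and an explicit public/private choice) amount to a small pre-upload validation step. The sketch below illustrates that check; the field names (`igsn`, `visibility`, `method`) are hypothetical and not taken from the actual GEOCHRON schema.

```python
def validate_upload(record):
    """Return a list of problems with a candidate upload record.

    An empty list means the record satisfies the two stated upload
    requirements plus a basic method tag. Field names are illustrative
    only, not the real GEOCHRON/SESAR schema.
    """
    problems = []
    if not record.get("igsn"):
        problems.append("missing IGSN (obtain one from SESAR first)")
    if record.get("visibility") not in ("public", "private"):
        problems.append("visibility must be 'public' or 'private'")
    if not record.get("method"):
        problems.append("missing analytical method (e.g. 'U-Pb', 'Ar-Ar')")
    return problems
```

A reduction program would run such a check before attempting the web-service upload, so the user is prompted for an IGSN or a visibility choice instead of receiving a server-side rejection.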
Hot Isostatic Pressing of Engineered Forms of I-AgZ
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jubin, Robert Thomas; Watkins, Thomas R.; Bruffey, Stephanie H.
Hot isostatic pressing (HIP) is being considered for direct conversion of 129I-bearing materials to a radiological waste form. The removal of volatile radioactive 129I from the off-gas of a nuclear fuel reprocessing facility will be necessary to comply with regulatory requirements regarding reprocessing facilities sited within the United States, and any iodine-containing media or solid sorbents generated by off-gas abatement will require disposal. Zeolite minerals such as silver-exchanged mordenite (AgZ) have been studied as potential iodine sorbents and will contain 129I as chemisorbed AgI. Oak Ridge National Laboratory (ORNL) has conducted several recent studies on the HIP of both iodine-loaded AgZ (I-AgZ) and other iodine-bearing zeolite minerals. The goal of these research efforts is to achieve a stable, highly leach-resistant material that is reduced in volume as compared to bulk I-AgZ. Through the use of HIP, it may be possible to achieve this with the addition of little or no additional materials (waste formers). Other goals are that the waste form be tolerant of high temperatures and pressures and not chemically hazardous, and that the process generate minimal secondary waste. This document describes the preparation of 27 samples that are distinct from previous efforts in that they are prepared exclusively with an engineered form of AgZ that is manufactured using a binder. Iodine was incorporated solely by chemisorption. This base material is expected to be more representative of an operational system than were samples prepared previously with pure minerals.
Interactive Electronic Storybooks for Kindergartners to Promote Vocabulary Growth
ERIC Educational Resources Information Center
Smeets, Daisy J. H.; Bus, Adriana G.
2012-01-01
The goals of this study were to examine (a) whether extratextual vocabulary instructions embedded in electronic storybooks facilitated word learning over reading alone and (b) whether instructional formats that required children to invest more effort were more effective than formats that required less effort. A computer-based "assistant" was added…
34 CFR 461.42 - What is the maintenance of effort requirement?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false What is the maintenance of effort requirement? 461.42 Section 461.42 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION STATE-ADMINISTERED BASIC GRANT...
34 CFR 461.42 - What is the maintenance of effort requirement?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 3 2011-07-01 2011-07-01 false What is the maintenance of effort requirement? 461.42 Section 461.42 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION STATE-ADMINISTERED BASIC GRANT...
34 CFR 461.42 - What is the maintenance of effort requirement?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 34 Education 3 2014-07-01 2014-07-01 false What is the maintenance of effort requirement? 461.42 Section 461.42 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION STATE-ADMINISTERED BASIC GRANT...
34 CFR 461.42 - What is the maintenance of effort requirement?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 34 Education 3 2012-07-01 2012-07-01 false What is the maintenance of effort requirement? 461.42 Section 461.42 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION STATE-ADMINISTERED BASIC GRANT...
34 CFR 461.42 - What is the maintenance of effort requirement?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 34 Education 3 2013-07-01 2013-07-01 false What is the maintenance of effort requirement? 461.42 Section 461.42 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION STATE-ADMINISTERED BASIC GRANT...
TOPEX Project Radar Altimeter Development Requirements and Specifications, Version 6.0
NASA Technical Reports Server (NTRS)
Rossi, Laurence C.
2003-01-01
This document provides the guidelines by which the TOPEX Radar Altimeter hardware development effort for the TOPEX flight project shall be implemented and conducted. The conduct of this activity shall take maximum advantage of the efforts expended during the TOPEX Radar Altimeter Advanced Technology Model development program and other related Radar Altimeter development efforts. This document complies with the TOPEX Project Office document 633-420 (D-2218), entitled, "TOPEX Project Requirements and Constraints for the NASA Radar Altimeter" dated December 1987.
Second-order contrast based on the expectation of effort and reinforcement.
Clement, Tricia S; Zentall, Thomas R
2002-01-01
Pigeons prefer signals for reinforcement that require greater effort (or time) to obtain over those that require less effort to obtain (T. S. Clement, J. Feltus, D. H. Kaiser, & T. R. Zentall, 2000). Preference was attributed to contrast (or to the relatively greater improvement in conditions) produced by the appearance of the signal when it was preceded by greater effort. In Experiment 1, the authors of the present study demonstrated that the expectation of greater effort was sufficient to produce such a preference (a second-order contrast effect). In Experiments 2 and 3, low versus high probability of reinforcement was substituted for high versus low effort, respectively, with similar results. In Experiment 3, the authors found that the stimulus preference could be attributed to positive contrast (when the discriminative stimuli represented an improvement in the probability of reinforcement) and perhaps also negative contrast (when the discriminative stimuli represented reduction in the probability of reinforcement).
Interpreting and Reporting Radiological Water-Quality Data
McCurdy, David E.; Garbarino, John R.; Mullin, Ann H.
2008-01-01
This document provides information to U.S. Geological Survey (USGS) Water Science Centers on interpreting and reporting radiological results for samples of environmental matrices, most notably water. The information provided is intended to be broadly useful throughout the United States, but scientists who work at sites containing radioactive hazardous wastes should consult additional sources for more detailed information. The document is largely based on recognized national standards and guidance documents for radioanalytical sample processing, most notably the Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP), and on documents published by the U.S. Environmental Protection Agency and the American National Standards Institute. It does not include discussion of standard USGS practices, including field quality-control sample analysis, interpretive report policies, and related issues, all of which shall always be included in any effort by the Water Science Centers. The use of 'shall' in this report signifies a policy requirement of the USGS Office of Water Quality.
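One of the core decisions in reporting radiological data of the kind MARLAP covers is whether a measured net count is a real detection or blank noise. A common Currie-style decision rule, sketched here as a minimal illustration (the USGS document may prescribe different coverage factors or counting geometries), compares the net count against a critical level computed from the blank:

```python
import math

def critical_level(blank_counts, k=1.645):
    """Currie-style critical level for a net count when the sample and
    blank are counted for equal times: Lc = k * sqrt(2 * B).
    k = 1.645 gives roughly a 5% false-detection rate."""
    return k * math.sqrt(2.0 * blank_counts)

def detected(gross_counts, blank_counts, k=1.645):
    """Report a detection only when the net count exceeds Lc."""
    return (gross_counts - blank_counts) > critical_level(blank_counts, k)
```

With a blank of 100 counts, the critical level is about 23 counts, so a gross count of 110 would be reported as "not detected" while 150 would qualify as a detection.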
1998-09-25
The Food and Drug Administration (FDA) is proposing to amend its regulations regarding the collection of twice the quantity of food, drug, or cosmetic estimated to be sufficient for analysis. This action increases the dollar amount that FDA will consider to determine whether to routinely collect a reserve sample of a food, drug, or cosmetic product in addition to the quantity sufficient for analysis. Experience has demonstrated that the current dollar amount does not adequately cover the cost of most quantities sufficient for analysis plus reserve samples. This proposed rule is a companion to the direct final rule published elsewhere in this issue of the Federal Register. This action is part of FDA's continuing effort to achieve the objectives of the President's "Reinventing Government" initiative, and it is intended to reduce the burden of unnecessary regulations on food, drugs, and cosmetics without diminishing the protection of the public health.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zolnierczuk, Piotr A; Vacaliuc, Bogdan; Sundaram, Madhan
The Liquids Reflectometer instrument installed at the Spallation Neutron Source (SNS) enables observations of chemical kinetics, solid-state reactions and phase-transitions of thin film materials at both solid and liquid surfaces. Effective measurement of these behaviors requires each sample to be calibrated dynamically using the neutron beam and the data acquisition system in a feedback loop. Since the SNS is an intense neutron source, the time needed to perform the measurement can be the same as the alignment process, leading to a labor-intensive operation that is exhausting to users. An update to the instrument control system, completed in March 2013, implemented the key features of automated sample alignment and robot-driven sample management, allowing for unattended operation over extended periods, lasting as long as 20 hours. We present a case study of the effort, detailing the mechanical, electrical and software modifications that were made as well as the lessons learned during the integration, verification and testing process.
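The abstract does not describe the alignment algorithm itself, but the essence of a beam-feedback alignment loop can be sketched generically: step a motor through candidate sample positions, measure the detector response at each, and move to the position of maximum response. The function below is a hypothetical stand-in for that loop, not the actual SNS control-system code.

```python
def align_sample(positions, measure):
    """Scan candidate sample positions, measure the beam response at
    each, and return the position with maximum response - a minimal
    stand-in for an automated alignment feedback loop. `measure` is
    any callable mapping position -> detector signal."""
    best_pos, best_val = None, float("-inf")
    for z in positions:
        val = measure(z)  # e.g. detector counts at sample height z
        if val > best_val:
            best_pos, best_val = z, val
    return best_pos
```

In a real instrument the scan would be refined iteratively (coarse then fine steps) and the "measure" call would drive the data acquisition system; the sketch only captures the argmax step.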
Estimates of population change in selected species of tropical birds using mark-recapture data
Brawn, J.; Nichols, J.D.; Hines, J.E.; Nesbitt, J.
2000-01-01
The population biology of tropical birds is known for only a small sample of species, especially in the Neotropics. Robust estimates of parameters such as survival rate and finite rate of population change (λ) are crucial for conservation purposes and useful for studies of avian life histories. We used methods developed by Pradel (1996, Biometrics 52:703-709) to estimate λ for 10 species of tropical forest lowland birds using data from a long-term (>20 yr) banding study in Panama. These species constitute an ecologically and phylogenetically diverse sample. We present these estimates and explore whether they are consistent with what we know from selected studies of banded birds and from 5 yr of estimating nesting success (i.e., an important component of λ). A major goal of these analyses is to assess whether the mark-recapture methods generate more reliable and reasonably precise estimates of population change than traditional methods that require more sampling effort.
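As we understand Pradel (1996), the temporal-symmetry model obtains λ from two estimable quantities, apparent survival and seniority, without requiring a direct census:

```latex
\lambda_t \;=\; \frac{\phi_t}{\gamma_{t+1}},
```

where $\phi_t$ is the probability that an animal alive at occasion $t$ survives and remains in the population until $t+1$, and $\gamma_{t+1}$ is the seniority probability that an animal present at $t+1$ was already in the population at $t$. Both are estimated from the capture histories (the latter by modeling the histories in reverse time), which is why the method needs only banding data rather than the extra sampling effort of traditional census-based approaches.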
Cognitive Effort Requirements in Recall, Recognition, and Lexical Decision
1985-05-01
…that the amount of integrative processing required for an item is a function of its preexisting or baseline familiarity level. Low frequency words… (Lockhart, Craik, & Jacoby, 1976). In the present study, increased effort, and possibly increased distinctiveness, does not influence hit rates, which are… …ing of items. Second, a lexical decision task, which does not require elaborative processing, leads to an overall poor level of recall. Furthermore…
2016-08-01
ARMY TRAINING: Efforts to Adjust Training Requirements Should Consider the Use of Virtual Training Devices. What GAO Found: In 2010, the Army began modifying its training priorities and goals to… until fiscal year 2017. The Army has taken some steps to improve the integration of virtual training devices into operational training, but gaps in…
49 CFR Appendix A to Part 26 - Guidance Concerning Good Faith Efforts
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 1 2010-10-01 2010-10-01 false Guidance Concerning Good Faith Efforts A Appendix... A to Part 26—Guidance Concerning Good Faith Efforts I. When, as a recipient, you establish a... efforts to meet the DBE contract requirements. We emphasize, however, that your determination concerning...
Measuring and Modeling Change in Examinee Effort on Low-Stakes Tests across Testing Occasions
ERIC Educational Resources Information Center
Sessoms, John; Finney, Sara J.
2015-01-01
Because schools worldwide use low-stakes tests to make important decisions, value-added indices computed from test scores must accurately reflect student learning, which requires equal test-taking effort across testing occasions. Evaluating change in effort assumes effort is measured equivalently across occasions. We evaluated the longitudinal…
Emerging Definition of Next-Generation of Aeronautical Communications
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.
2006-01-01
Aviation continues to experience rapid growth. In regions such as the United States and Europe air traffic congestion is constraining operations, leading to major new efforts to develop methodologies and infrastructures to enable continued aviation growth through transformational air traffic management systems. Such a transformation requires better communications linking airborne and ground-based elements. Technologies for next-generation communications, the required capacities, frequency spectrum of operation, network interconnectivity, and global interoperability are now receiving increased attention. A number of major planning and development efforts have taken place or are in process now to define the transformed airspace of the future. These activities include government and industry led efforts in the United States and Europe, and by international organizations. This paper will review the features, approaches, and activities of several representative planning and development efforts, and identify the emerging global consensus on requirements of next generation aeronautical communications systems for air traffic control.
Rehabilitation robotics: an academic engineer perspective.
Krebs, Hermano I
2011-01-01
In this paper, we present a retrospective review of our efforts to revolutionize the way physical medicine is practiced by developing and deploying rehabilitation robots. We present a sample of our clinical results with well over 600 stroke patients, both inpatients and outpatients. We discuss the different robots developed at our laboratory over the past 20 years and their unique characteristics. All are configured both to deliver reproducible interactive therapy and also to measure outcomes with minimal encumbrance, thus providing critical measurement tools to help unravel the key remaining question: what constitutes "best practice"? While success to date indicates that this therapeutic application of robots has opened an emerging new frontier in physical medicine and rehabilitation, the barrier to further progress lies not in developing new hardware but rather in finding the most effective way to enhance neuro-recovery. We close this manuscript discussing some of the tools required for advancing the effort beyond the present state to what we believe will be the central feature of research during the next 10 years.
Causal attribution for success and failure in mathematics among MDAB pre-diploma students
NASA Astrophysics Data System (ADS)
Maidinsah, Hamidah; Embong, Rokiah; Wahab, Zubaidah Abd
2014-07-01
The Program Mengubah Destini Anak Bangsa (MDAB) is a pre-diploma programme catering to SPM school leavers who do not meet the minimum requirement to enter any of UiTM's diploma programmes. The study aims to evaluate the perceptions of MDAB students toward the main causal attribution factors underlying students' success and failure in mathematics. The research sample comprised 482 students from five UiTM branch campuses. The research instrument was a GALUS questionnaire consisting of 36 items based on the Weiner Attribution Theory. The four causal attribution factors evaluated for success and failure are ability, effort, question difficulty, and environment. The GALUS reliability index was 0.93. The research found that effort appears to be the main causal attribution factor in students' success and failure in mathematics, followed by environment, question difficulty, and ability. High-achieving students strongly agreed that the ability factor influenced their success, while low-achieving students strongly agreed that all attributing factors influenced their failures in mathematics.
Dunker, Kristine J.; Sepulveda, Adam; Massengill, Robert L.; Olsen, Jeffrey B.; Russ, Ora L.; Wenburg, John K.; Antonovich, Anton
2016-01-01
Determining the success of invasive species eradication efforts is challenging because populations at very low abundance are difficult to detect. Environmental DNA (eDNA) sampling has recently emerged as a powerful tool for detecting rare aquatic animals; however, detectable fragments of DNA can persist over time despite absence of the targeted taxa and can therefore complicate eDNA sampling after an eradication event. This complication is a large concern for fish eradication efforts in lakes since killed fish can sink to the bottom and slowly decay. DNA released from these carcasses may remain detectable for long periods. Here, we evaluated the efficacy of eDNA sampling to detect invasive Northern pike (Esox lucius) following piscicide eradication efforts in southcentral Alaskan lakes. We used field observations and experiments to test the sensitivity of our Northern pike eDNA assay and to evaluate the persistence of detectable DNA emitted from Northern pike carcasses. We then used eDNA sampling and traditional sampling (i.e., gillnets) to test for presence of Northern pike in four lakes subjected to a piscicide-treatment designed to eradicate this species. We found that our assay could detect an abundant, free-roaming population of Northern pike and could also detect low-densities of Northern pike held in cages. For these caged Northern pike, probability of detection decreased with distance from the cage. We then stocked three lakes with Northern pike carcasses and collected eDNA samples 7, 35 and 70 days post-stocking. We detected DNA at 7 and 35 days, but not at 70 days. Finally, we collected eDNA samples ~ 230 days after four lakes were subjected to piscicide-treatments and detected Northern pike DNA in 3 of 179 samples, with a single detection at each of three lakes, though we did not catch any Northern pike in gillnets. 
Taken together, we found that eDNA can help to inform eradication efforts if used in conjunction with multiple lines of inquiry and sampling is delayed long enough to allow full degradation of DNA in the water.
ERIC Educational Resources Information Center
Silva, Kassondra M.; Spinrad, Tracy L.; Eisenberg, Nancy; Sulik, Michael J.; Valiente, Carlos; Huerta, Snjezana; Edwards, Alison; Eggum, Natalie D.; Kupfer, Anne S.; Lonigan, Christopher J.; Phillips, Beth M.; Wilson, Shauna B.; Clancy-Menchetti, Jeanine; Landry, Susan H.; Swank, Paul R.; Assel, Michael A.; Taylor, Heather B.
2011-01-01
Research Findings: The purpose of this study was to examine the relations of children's effortful control and quality of relationships with teachers to school attitudes longitudinally in an ethnically diverse and economically disadvantaged sample. Data were collected as part of a larger intervention project during mid-fall, winter, and late spring…
Monitoring and control technologies for bioregenerative life support systems/CELSS
NASA Technical Reports Server (NTRS)
Knott, William M.; Sager, John C.
1991-01-01
The development of a Controlled Ecological Life Support System (CELSS) will require NASA to develop innovative monitoring and control technologies to operate the different components of the system. Primary effort over the past three to four years has been directed toward the development of technologies to operate a biomass production module. Computer hardware and software required to operate, collect, and summarize environmental data for a large plant growth chamber facility were developed and refined. Sensors and controls required to collect information on such physical parameters as relative humidity, temperature, irradiance, pressure, and gases in the atmosphere; and pH, dissolved oxygen, fluid flow rates, and electrical conductivity in the nutrient solutions are being developed and tested. Technologies required to produce high artificial irradiance for plant growth and those required to collect and transport natural light into a plant growth chamber are also being evaluated. Significant effort was directed toward the development and testing of a membrane nutrient delivery system. Technologies required to manipulate, seed, and harvest crops, and to determine plant health before stress impacts plant productivity, are also being researched. Tissue culture technologies are being developed for use in management and propagation of crop plants. Though previous efforts have focused on development of technologies required to operate a biomass production module for a CELSS, current efforts are expanding to include technologies required to operate modules such as food preparation, biomass processing, and resource (waste) recovery which are integral parts of the CELSS.
Efficient Simulation of Tropical Cyclone Pathways with Stochastic Perturbations
NASA Astrophysics Data System (ADS)
Webber, R.; Plotkin, D. A.; Abbot, D. S.; Weare, J.
2017-12-01
Global Climate Models (GCMs) are known to statistically underpredict intense tropical cyclones (TCs) because they fail to capture the rapid intensification and high wind speeds characteristic of the most destructive TCs. Stochastic parametrization schemes have the potential to improve the accuracy of GCMs. However, current analysis of these schemes through direct sampling is limited by the computational expense of simulating a rare weather event at fine spatial gridding. The present work introduces a stochastically perturbed parametrization tendency (SPPT) scheme to increase simulated intensity of TCs. We adapt the Weighted Ensemble algorithm to simulate the distribution of TCs at a fraction of the computational effort required in direct sampling. We illustrate the efficiency of the SPPT scheme by comparing simulations at different spatial resolutions and stochastic parameter regimes. Stochastic parametrization and rare event sampling strategies have great potential to improve TC prediction and aid understanding of tropical cyclogenesis. Since rising sea surface temperatures are postulated to increase the intensity of TCs, these strategies can also improve predictions about climate change-related weather patterns. The rare event sampling strategies used in the current work are not only a novel tool for studying TCs, but they may also be applied to sampling any range of extreme weather events.
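The Weighted Ensemble algorithm mentioned above concentrates simulation effort on rare outcomes by splitting and merging weighted trajectories while conserving total probability. The following is a minimal sketch of the per-bin resampling step only (trajectory propagation and binning by a progress coordinate are omitted); it is a generic illustration, not the authors' implementation.

```python
import random

def resample_bin(walkers, target):
    """Resample one bin of weighted walkers to exactly `target` walkers.

    walkers: non-empty list of (position, weight) pairs.
    Merging keeps one of the two lightest walkers (chosen with
    probability proportional to weight) and combines their weights;
    splitting duplicates the heaviest walker at half weight. Total
    weight is conserved exactly, which is the statistical guarantee
    that makes Weighted Ensemble an unbiased rare-event sampler.
    """
    walkers = list(walkers)
    while len(walkers) > target:
        walkers.sort(key=lambda w: w[1])          # lightest first
        (x1, w1), (x2, w2) = walkers[0], walkers[1]
        keep = x1 if random.random() < w1 / (w1 + w2) else x2
        walkers = [(keep, w1 + w2)] + walkers[2:]
    while len(walkers) < target:
        walkers.sort(key=lambda w: w[1])
        x, w = walkers.pop()                      # heaviest walker
        walkers += [(x, w / 2), (x, w / 2)]
    return walkers
```

Because weight is conserved, averages over the ensemble remain unbiased estimates of the true TC distribution, while the walker count in rare-intensity bins stays high enough to resolve the distribution tail at a fraction of the direct-sampling cost.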
Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y
2003-01-01
Assessing the functioning and the performance of urban drainage systems on both rainfall-event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
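Point iv) above, sizing the number of rainfall events to sample for a target uncertainty, can be approximated with the standard normal sample-size formula. This is a simplified sketch under a normal approximation with a known coefficient of variation, not the paper's full uncertainty-propagation method:

```python
import math

def events_needed(cv, target_rel_uncertainty, z=1.96):
    """Number of rainfall events to sample so the mean performance
    indicator reaches roughly the target relative uncertainty at 95%
    confidence: n = (z * cv / e)^2, where cv is the event-to-event
    coefficient of variation of the indicator (normal approximation)."""
    return math.ceil((z * cv / target_rel_uncertainty) ** 2)
```

For example, with a 30% event-to-event coefficient of variation (comparable to the TSS-load uncertainties reported above), reaching a 10% relative uncertainty on the annual mean requires on the order of 35 sampled events, while relaxing the target to 20% drops the requirement to about 9 events, illustrating why sampling more events is the first lever for reducing annual-efficiency uncertainty.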
Hyperspectral anomaly detection using Sony PlayStation 3
NASA Astrophysics Data System (ADS)
Rosario, Dalton; Romano, João; Sepulveda, Rene
2009-05-01
We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel, random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.
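To make "anomaly detection without prior background spectra" concrete, the classic global RX detector scores each pixel by its Mahalanobis distance from scene statistics estimated from the image itself. The sketch below is a plain serial reference implementation of RX for illustration; the PS3 HADA is a different, parallel random-sampling algorithm, and its details are not given in the abstract.

```python
import numpy as np

def rx_scores(pixels, eps=1e-6):
    """Global RX anomaly scores for an (n_pixels, n_bands) array.

    Each pixel's score is its squared Mahalanobis distance from the
    scene mean under the scene covariance; both are estimated from the
    data, so no prior clutter spectra are needed. `eps` regularizes
    the covariance so it stays invertible."""
    mu = pixels.mean(axis=0)
    centered = pixels - mu
    cov = np.cov(pixels, rowvar=False) + eps * np.eye(pixels.shape[1])
    cov_inv = np.linalg.inv(cov)
    # quadratic form (x - mu)^T C^{-1} (x - mu) for every pixel at once
    return np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
```

Pixels whose spectra deviate strongly from the background statistics receive large scores; thresholding the scores yields the anomaly map. The per-pixel quadratic form is embarrassingly parallel, which is the property that made the Cell processor's SPE cores attractive for this workload.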
Improving size estimates of open animal populations by incorporating information on age
Manly, Bryan F.J.; McDonald, Trent L.; Amstrup, Steven C.; Regehr, Eric V.
2003-01-01
Around the world, a great deal of effort is expended each year to estimate the sizes of wild animal populations. Unfortunately, population size has proven to be one of the most intractable parameters to estimate. The capture-recapture estimation models most commonly used (of the Jolly-Seber type) are complicated and require numerous, sometimes questionable, assumptions. The derived estimates usually have large variances and lack consistency over time. In capture–recapture studies of long-lived animals, the ages of captured animals can often be determined with great accuracy and relative ease. We show how to incorporate age information into size estimates for open populations, where the size changes through births, deaths, immigration, and emigration. The proposed method allows more precise estimates of population size than the usual models, and it can provide these estimates from two sample occasions rather than the three usually required. Moreover, this method does not require specialized programs for capture-recapture data; researchers can derive their estimates using the logistic regression module in any standard statistical package.
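The closing remark, that the estimates can be derived with an ordinary logistic regression module, suggests a two-step recipe: model each animal's capture probability as a logistic function of age, then expand each captured animal by the inverse of its probability (a Horvitz-Thompson-type estimator). This is a hedged sketch of that general idea, not the authors' exact estimator; the coefficients are assumed to come from a previously fitted logistic regression.

```python
import math

def capture_prob(age, b0, b1):
    """Logistic model for capture probability as a function of age
    (b0, b1 are assumed to be fitted regression coefficients)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * age)))

def population_estimate(ages_of_captured, b0, b1):
    """Horvitz-Thompson-style size estimate: each captured animal of
    known age stands in for 1 / p(age) animals in the population."""
    return sum(1.0 / capture_prob(a, b0, b1) for a in ages_of_captured)
```

For instance, if the fitted model gives every animal a capture probability of 0.5, then 50 captures imply an estimated population of about 100. Because only capture probabilities and one season's captures are needed, this style of estimator can work from two sampling occasions rather than the three required by Jolly-Seber-type models.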
Sampling and analyses plan for tank 103 at the 219-S waste handling facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
FOWLER, K.D.
1999-06-23
This document describes the sampling and analysis activities associated with taking a Resource Conservation and Recovery Act (RCRA) protocol sample of the waste from Tank 103 at the 219-S Waste Handling Facility treatment, storage, and/or disposal (TSD) unit at the 222-S Laboratory complex. This sampling and analysis is required based on negotiations between the State of Washington Department of Ecology (Ecology) and the Department of Energy, Richland Operations (RL), in letters concerning TPA Change Form M-32-98-01. In a letter from George H. Sanders, RL, to Moses N. Jaraysi, Ecology, dated January 28, 1999, it was noted that "Prior to the Tank 103 waste inventory transfer, a RCRA protocol sample of the waste will be obtained and tested for the constituents contained on the Part A, Form 3 Permit Application for the 219-S Waste Handling Facility." In the April 2, 1999 letter from Brenda L. Becher-Khaleel, Ecology, to James E. Rasmussen, RL, and William O. Adair, FDH, Ecology states that the purpose of these analyses is to provide information and justification for leaving Tank 103 in an isolated condition in the 219-S TSD unit until facility closure. The data may also be used at some future date in making decisions regarding closure methodology for Tank 103. Ecology also notes that As Low As Reasonably Achievable (ALARA) concerns may force deviations from some SW-846 protocols. Every effort will be made to accommodate requirements as specified. Deviations from SW-846 will be documented in accordance with HASQARD.
Time-Lapse Electrical Geophysical Monitoring of Amendment-Based Biostimulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Timothy C.; Versteeg, Roelof; Day-Lewis, Frederick D.
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.
Time-lapse electrical geophysical monitoring of amendment-based biostimulation
Johnson, Timothy C.; Versteeg, Roelof J.; Day-Lewis, Frederick D.; Major, William; Lane, John W.
2015-01-01
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation.
Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.
The study on knowledge transferring incentive for information system requirement development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yang
2015-03-10
Information system requirement development is a process in which users share and transfer knowledge. However, developing tacit requirements is a central problem in this process, because such requirements are difficult to encode, express, and communicate. Knowledge fusion and cooperative effort are needed to uncover tacit requirements. Against this background, this paper seeks the rules governing the dynamic evolution of effort by software developers and users by building an evolutionary game model under an incentive system, and provides an in-depth discussion of the results.
Achieving excellence--creating customer passion.
Scheuing, E E
1999-08-01
Customers are the lifeblood of any organization. Without them, it loses its meaning and purpose. Customers provide incentive, vitality, and growth. Serving them well requires a customer-focused culture and a customer-friendly system. It also requires unrelenting effort toward continuous improvement, but the rewards are well worth the effort: unflinching customer loyalty, sustainable growth, and impressive performance.
Assignment Choice: Do Students Choose Briefer Assignments or Finishing What They Started?
ERIC Educational Resources Information Center
Hawthorn-Embree, Meredith L.; Skinner, Christopher H.; Parkhurst, John; O'Neil, Michael; Conley, Elisha
2010-01-01
Academic skill development requires engagement in effortful academic behaviors. Although students may be more likely to choose to engage in behaviors that require less effort, they also may be motivated to complete assignments that they have already begun. Seventh-grade students (N = 88) began a mathematics computation worksheet, but were stopped…
34 CFR 403.182 - What is the maintenance of fiscal effort requirement?
Code of Federal Regulations, 2010 CFR
2010-07-01
34 Education 3 2010-07-01: What is the maintenance of fiscal effort requirement? Section 403.182, Education Regulations of the Offices of the Department of Education (Continued), OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION, STATE VOCATIONAL AND APPLIED TECHNOLOGY...
Prediction of Stereochemistry using Q2MM
2016-01-01
Conspectus: The standard method of screening ligands for selectivity in asymmetric, transition metal-catalyzed reactions requires experimental testing of hundreds of ligands from ligand libraries. This “trial and error” process is costly in terms of time as well as resources and, in general, is scientifically and intellectually unsatisfying as it reveals little about the underlying mechanism behind the selectivity. The accurate computational prediction of stereoselectivity in enantioselective catalysis requires adequate conformational sampling of the selectivity-determining transition state but has to be fast enough to compete with experimental screening techniques to be useful for the synthetic chemist. Although electronic structure calculations are accurate and general, they are too slow to allow for sampling or fast screening of ligand libraries. The combined requirements can be fulfilled by using appropriately fitted transition state force fields (TSFFs) that represent the transition state as a minimum and allow fast conformational sampling using Monte Carlo. Quantum-guided molecular mechanics (Q2MM) is an automated force field parametrization method that generates accurate, reaction-specific TSFFs by fitting the functional form of an arbitrary force field using only electronic structure calculations by minimization of an objective function. A key feature that distinguishes the Q2MM method from many other automated parametrization procedures is the use of the Hessian matrix in addition to geometric parameters and relative energies. This alleviates the known problems of overfitting of TSFFs. After validation of the TSFF by comparison to electronic structure results for a test set and available experimental data, the stereoselectivity of a reaction can be calculated by summation over the Boltzmann-averaged relative energies of the conformations leading to the different stereoisomers.
The Q2MM method has been applied successfully to perform virtual ligand screens on a range of transition metal-catalyzed reactions that are important from both an industrial and an academic perspective. In this Account, we provide an overview of the continued improvement of the prediction of stereochemistry using Q2MM-derived TSFFs using four examples from different stages of development: (i) Pd-catalyzed allylation, (ii) OsO4-catalyzed asymmetric dihydroxylation (AD) of alkenes, (iii) Rh-catalyzed hydrogenation of enamides, and (iv) Ru-catalyzed hydrogenation of ketones. In the current form, correlation coefficients of 0.8–0.9 between calculated and experimental ee values are typical for a wide range of substrate–ligand combinations, and suitable ligands can be predicted for a given substrate with ∼80% accuracy. Although the generation of a TSFF requires an initial effort and will therefore be most useful for widely used reactions that require frequent screening campaigns, the method allows for a rapid virtual screen of large ligand libraries to focus experimental efforts on the most promising substrate–ligand combinations. PMID:27064579
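The final scoring step described above, Boltzmann-averaging the conformers that lead to each stereoisomer, is simple enough to sketch. The energies and stereo labels below are hypothetical, not taken from any Q2MM study; the sketch only illustrates how relative transition-state energies from a force-field conformational search convert into a predicted enantiomeric excess.

```python
import math

R_KCAL = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.15          # temperature, K

def boltzmann_ee(conformers):
    """conformers: (relative TS energy in kcal/mol, 'R' or 'S') pairs.
    Sums Boltzmann weights per stereoisomer and returns ee = (R - S)/(R + S)."""
    w = {"R": 0.0, "S": 0.0}
    for energy, label in conformers:
        w[label] += math.exp(-energy / (R_KCAL * T))
    return (w["R"] - w["S"]) / (w["R"] + w["S"])

# Hypothetical TSFF conformer energies from a Monte Carlo search
ee = boltzmann_ee([(0.0, "R"), (0.8, "S"), (1.2, "R"), (1.5, "S")])
```

Note that because the weights are exponential in energy, an energy gap of only ~1 kcal/mol between the lowest R- and S-leading conformers already dominates the predicted selectivity.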
Work Personality, Work Engagement, and Academic Effort in a Group of College Students
ERIC Educational Resources Information Center
Strauser, David R.; O'Sullivan, Deirdre; Wong, Alex W. K.
2012-01-01
The authors investigated the relationship between the variables of work engagement, developmental work personality, and academic effort in a sample of college students. This study provides evidence for the hypothesized positive relationship between academic effort, engagement, and work personality. When gender was controlled, the Work Tasks…
ASTM and ASME-BPE Standards--Complying with the Needs of the Pharmaceutical Industry.
Huitt, William M
2011-01-01
Designing and building a pharmaceutical facility requires the owner, engineer of record, and constructor to be knowledgeable with regard to the industry codes and standards that apply to this effort. Up until 1997 there were no industry standards directed at the needs and requirements of the pharmaceutical industry. Prior to that time it was a patchwork effort at resourcing and adopting nonpharmaceutical-related codes and standards and then modifying them in order to meet the more stringent requirements of the Food and Drug Administration (FDA). In 1997 the American Society of Mechanical Engineers (ASME) published the first Bioprocessing Equipment (BPE) Standard. Through harmonization efforts this relatively new standard has brought together, scrutinized, and refined industry-accepted methodologies together with FDA compliance requirements, and has established an American National Standard that provides a comprehensive set of standards that are integral to the pharmaceutical industry. This article describes various American National Standards, including those developed and published by the American Society for Testing and Materials (ASTM), and how they apply to the pharmaceutical industry. It goes on to discuss the harmonization effort that takes place between the various standards developers in an attempt to prevent conflicts and omissions between the many standards. Also included are examples of tables and figures taken from the ASME-BPE Standard. These examples provide the reader with insight into the relevant content of the ASME-BPE Standard. In its initial development and ongoing maintenance, the BPE Standard committee works with other American National Standards developers to harmonize the many standards associated with the design, engineering, and construction of bioprocessing facilities. This harmonization effort has established a comprehensive set of standards for the betterment of the pharmaceutical industry at large. This effort is, and will remain, very important as technology, along with new and improved products and processes, evolves into the future.
Using larval fish community structure to guide long-term monitoring of fish spawning activity
Pritt, Jeremy J.; Roseman, Edward F.; Ross, Jason E.; DeBruyne, Robin L.
2015-01-01
Larval fishes provide a direct indication of spawning activity and may therefore be useful for long-term monitoring efforts in relation to spawning habitat restoration. However, larval fish sampling can be time intensive and costly. We sought to understand the spatial and temporal structure of larval fish communities in the St. Clair–Detroit River system, Michigan–Ontario, to determine whether targeted larval fish sampling can be made more efficient for long-term monitoring. We found that larval fish communities were highly nested, with lower river segments and late-spring samples containing the highest genus richness of larval fish. We created four sampling scenarios for each river system: (1) using all available data, (2) limiting temporal sampling to late spring, (3) limiting spatial sampling to lower river segments only, and (4) limiting both spatial and temporal sampling. By limiting the spatial extent of sampling to lower river sites and/or limiting the temporal extent to the late-spring period, we found that effort could be reduced by more than 50% while maintaining over 75% of the observed and estimated total genus richness. Similarly, limiting the sampling effort to lower river sites and/or the late-spring period maintained between 65% and 93% of the observed richness of lithophilic-spawning genera and invasive genera. In general, community composition remained consistent among sampling scenarios. Targeted sampling offers a lower-cost alternative to exhaustive spatial and temporal sampling and may be more readily incorporated into long-term monitoring.
Hosking, Jay G; Cocker, Paul J; Winstanley, Catharine A
2014-06-01
Personal success often requires the choice to expend greater effort for larger rewards, and deficits in such effortful decision making accompany a number of illnesses including depression, schizophrenia, and attention-deficit/hyperactivity disorder. Animal models have implicated brain regions such as the basolateral amygdala (BLA) and anterior cingulate cortex (ACC) in physical effort-based choice, but disentangling the unique contributions of these two regions has proven difficult, and effort demands in industrialized society are predominantly cognitive in nature. Here we utilize the rodent cognitive effort task (rCET), a modification of the five-choice serial reaction-time task, wherein animals can choose to expend greater visuospatial attention to obtain larger sucrose rewards. Temporary inactivation (via baclofen-muscimol) of BLA and ACC showed dissociable effects: BLA inactivation caused hard-working rats to 'slack off' and 'slacker' rats to work harder, whereas ACC inactivation caused all animals to reduce willingness to expend mental effort. Furthermore, BLA inactivation increased the time needed to make choices, whereas ACC inactivation increased motor impulsivity. These data illuminate unique contributions of BLA and ACC to effort-based decision making, and imply overlapping yet distinct circuitry for cognitive vs physical effort. Our understanding of effortful decision making may therefore require expanding our models beyond purely physical costs.
Okayasu, Hiromasa; Brown, Alexandra E; Nzioki, Michael M; Gasasira, Alex N; Takane, Marina; Mkanda, Pascal; Wassilak, Steven G F; Sutter, Roland W
2014-11-01
To assess the quality of supplementary immunization activities (SIAs), the Global Polio Eradication Initiative (GPEI) has used cluster lot quality assurance sampling (C-LQAS) methods since 2009. However, since the inception of C-LQAS, questions have been raised about the optimal balance between operational feasibility and precision of classification of lots to identify areas with low SIA quality that require corrective programmatic action. To determine if an increased precision in classification would result in differential programmatic decision making, we conducted a pilot evaluation in 4 local government areas (LGAs) in Nigeria with an expanded LQAS sample size of 16 clusters (instead of the standard 6 clusters) of 10 subjects each. The results showed greater heterogeneity between clusters than the assumed standard deviation of 10%, ranging from 12% to 23%. Comparing the distribution of 4-outcome classifications obtained from all possible combinations of 6-cluster subsamples to the observed classification of the 16-cluster sample, we obtained an exact match in classification in 56% to 85% of instances. We concluded that the 6-cluster C-LQAS provides acceptable classification precision for programmatic action. Considering the greater resources required to implement an expanded C-LQAS, the improvement in precision was deemed insufficient to warrant the effort. Published by Oxford University Press on behalf of the Infectious Diseases Society of America 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
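The subsample comparison in this evaluation, classifying every possible 6-cluster subset of the 16-cluster sample and checking agreement with the full-sample classification, can be sketched as a brute-force enumeration. The per-cluster counts and pass threshold below are invented for illustration, and the toy rule is binary rather than the 4-outcome decision table the GPEI actually uses.

```python
import math
from itertools import combinations

# Hypothetical vaccinated counts out of 10 children in each of 16 clusters
clusters = [9, 10, 8, 7, 10, 9, 6, 8, 9, 10, 7, 8, 9, 10, 8, 9]

def lot_passes(sample, n_per_cluster=10, threshold=0.90):
    # Toy binary rule: pass if pooled coverage meets the threshold
    # (the real C-LQAS uses a 4-outcome decision table)
    return sum(sample) / (len(sample) * n_per_cluster) >= threshold

full = lot_passes(clusters)                  # classification from all 16 clusters
n_subsets = math.comb(16, 6)                 # 8008 possible 6-cluster subsamples
matches = sum(lot_passes(s) == full for s in combinations(clusters, 6))
agreement = matches / n_subsets              # share of subsamples that agree
```

With real data, an `agreement` in the 0.56-0.85 range observed in the pilot would be interpreted exactly as the abstract does: the smaller design misclassifies some lots, and the question is whether that imprecision changes programmatic action.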
Silveira, Mason M; Tremblay, Melanie; Winstanley, Catharine A
2018-04-24
Cognitive effort is a ubiquitous process, yet surprisingly little is known about the brain mechanisms responsible for evaluating it. Here, we utilize the rat Cognitive Effort Task (rCET) to probe the striatum's role in deciding between options that vary in the amount of cognitive effort required for success. In the rCET, animals choose to perform either an easy trial, in which the attentional demand is low but the potential reward is small, or a difficult trial which is more attentionally demanding but can yield twice the sugar pellets. Twenty-six male Long Evans rats were trained on the rCET and the effects of pharmacologically inactivating the dorsomedial striatum (DMS) and core region of the nucleus accumbens were determined. Temporary inactivation of the DMS decreased all animals' choice of the high-effort, high-reward option, impaired attentional accuracy, and robustly increased premature responding without impairing general indices of motor ability. The DMS therefore appears necessary for the integration of cognitive signals required for optimal performance. In stark contrast, following temporary inactivation of the ventral striatum, subjects were fundamentally unable to perform the task, as reflected by a drastic decrease in the number of trials initiated and an increase in omitted responses. Together, these data suggest the striatum is likely part of a larger cortico-limbic-striatal network whose function is to optimize decisions requiring cognitive effort costs, at least in the attentional domain, and that striatal subregions have dissociable roles in the adjudication and application of this form of cognitive effort. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.
2012-12-01
Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water-use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined the recent advances of α-cellulose extraction in plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.
NASA Astrophysics Data System (ADS)
Bar-Cohen, Yoseph; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Lindsey, Cameron; Kutzer, Thomas; Salazar, Eduardo
2018-03-01
The return of samples back to Earth in future missions would require protection of our planet from the risk of bringing uncontrolled biological materials back with the samples. This protection would require "breaking the chain of contact" (BTC), where any returned material reaching Earth for further analysis would have to be sealed inside a container with extremely high confidence. Therefore, the acquired samples would need to be contained while destroying any potential biological materials that may contaminate the external surface of the container. A novel process that could be used to contain returning samples has been developed and demonstrated at quarter scale. The process consists of brazing using non-contact induction heating that synchronously separates, seams, seals, and sterilizes (S4) the container. Brazing involves melting at temperatures higher than 500°C, and this level of heating assures sterilization of the exposed areas since all carbon bonds (namely, organic materials) are broken at this temperature. The mechanism consists of a double-wall container with inner and outer shells having Earth-clean interior surfaces. The process consists of two steps. In Step 1, the double-wall container halves are fabricated and brazed (equivalent to production on Earth); in Step 2, the S4 process is executed (equivalent to on-orbit execution around Mars). In a potential future mission, the double-wall container would be split into two halves and prepared on Earth. The potential on-orbit execution would consist of inserting the orbiting sample (OS) container into one of the halves, mating it to the other half, and brazing. The latest results of this effort will be described and discussed in this manuscript.
Towards automated sleep classification in infants using symbolic and subsymbolic approaches.
Kubat, M; Flotzinger, D; Pfurtscheller, G
1993-04-01
The paper addresses the problem of automatic sleep classification. A special effort is made to find a method of extracting reasonable descriptions of the individual sleep stages from sample measurements of EEG, EMG, EOG, etc., and from a classification of these measurements provided by an expert. The method should satisfy three requirements: classification accuracy, interpretability of the results, and the ability to select the relevant and discard the irrelevant variables. The solution suggested in this paper consists of a combination of the subsymbolic algorithm LVQ with the symbolic decision tree generator ID3. Results demonstrating the feasibility and utility of our approach are also presented.
Novel method to sample very high power CO2 lasers: II Continuing Studies
NASA Astrophysics Data System (ADS)
Eric, John; Seibert, Daniel B., II; Green, Lawrence I.
2005-04-01
For the past 28 years, the Laser Hardened Materials Evaluation Laboratory (LHMEL) at Wright-Patterson Air Force Base, OH, has worked with CO2 lasers capable of producing continuous energy up to 150 kW. These lasers are used in a number of advanced materials processing applications that require accurate spatial energy measurements of the laser. Conventional non-electronic methods are not satisfactory for determining the spatial energy profile. This paper describes continuing efforts in qualifying the new method in which a continuous, real-time electronic spatial energy profile can be obtained for very high power (VHP) CO2 lasers.
Preserving, Enhancing, and Continuing the Scientific Legacy of the Apollo Sample Suite
NASA Astrophysics Data System (ADS)
Zeigler, R. A.; Evans, C. A.; Lehnert, K.; Cai, Y.; Todd, N. S.
2016-12-01
From 1969 to 1972, Apollo astronauts collected 382 kg of rocks, soils, and core samples from six geologically diverse locations on the Moon. In the nearly 50 years since the samples were collected, over 3000 different studies have been conducted using the nearly 2200 different Apollo samples. Despite the maturity of the sample collection, many new studies of lunar samples are undertaken each year, with an average of >55 requests and >600 distinct subsamples allocated annually over the past five years. The Apollo samples are a finite resource, however. Although new studies are encouraged, it is important that new studies do not duplicate previous studies, and where possible, leverage previous results to inform and enhance the current studies. This helps to preserve the samples and scientific funding, both of which are precious resources. We have initiated several new efforts to rescue some of the early analyses from these samples, including unpublished analytical data. We are actively scanning NASA documentation in paper form that is related to the Apollo missions and sample processing, and we are collaborating with IEDA to establish a geochemical data base called MoonDB. To populate this database, we are actively working with about a dozen prominent lunar PIs to organize and transcribe years of both published and unpublished data, making it available to all researchers. This effort will also take advantage of new online analytical tools like PetDB. There have already been tangible results from the MoonDB data rescue effort. A pilot project involving the rescue of geochemical data of John Delano on Apollo pyroclastic glasses has already been referenced in multiple Apollo sample requests, and in fact, the compiled data was used as part of one of the new studies. Similarly, scanned sample handling reports have been utilized to find previously analyzed samples that were appropriate to fulfill new sample requests. 
We have also begun to image the Apollo samples using (1) micro-CT scanning to document the interior structure of samples, and (2) comprehensive high resolution photography of samples, enabling high resolution 3D reconstructions of the samples. Both efforts will provide comprehensive access to these samples and allow for more targeted requests, and thus better curation of the samples for decades to come.
Preserving, Enhancing, and Continuing the Scientific Legacy of the Apollo Sample Suite
NASA Technical Reports Server (NTRS)
Zeigler, Ryan; Evans, Cindy; Cai, Yue; Lehnert, Kerstin; Todd, Nancy; Blumenfeld, Erika
2016-01-01
From 1969 to 1972, Apollo astronauts collected 382 kg of rocks, soils, and core samples from six geologically diverse locations on the Moon. In the nearly 50 years since the samples were collected, over 3000 different studies have been conducted using the nearly 2200 different Apollo samples. Despite the maturity of the sample collection, many new studies of lunar samples are undertaken each year, with an average of more than 55 requests and more than 600 distinct subsamples allocated annually over the past five years. The Apollo samples are a finite resource, however. Although new studies are encouraged, it is important that new studies do not duplicate previous studies, and where possible, leverage previous results to inform and enhance the current studies. This helps to preserve the samples and scientific funding, both of which are precious resources. We have initiated several new efforts to rescue some of the early analyses from these samples, including unpublished analytical data. We are actively scanning NASA documentation in paper form that is related to the Apollo missions and sample processing, and we are collaborating with IEDA to establish a geochemical data base called MoonDB. To populate this database, we are actively working with about a dozen prominent lunar PIs to organize and transcribe years of both published and unpublished data, making it available to all researchers. This effort will also take advantage of new online analytical tools like PetDB. There have already been tangible results from the MoonDB data rescue effort. A pilot project involving the rescue of geochemical data of John Delano on Apollo pyroclastic glasses has already been referenced in multiple Apollo sample requests, and in fact, the compiled data was used as part of one of the new studies. Similarly, scanned sample handling reports have been utilized to find previously analyzed samples that were appropriate to fulfill new sample requests. 
We have also begun to image the Apollo samples using (1) micro-CT scanning to document the interior structure of samples, and (2) comprehensive high resolution photography of samples, enabling high resolution 3D reconstructions of the samples. Both efforts will provide comprehensive access to these samples and allow for more targeted requests, and thus better curation of the samples for decades to come.
Seasonal prevalence of malaria in West Sumba district, Indonesia
Syafruddin, Din; Krisin; Asih, Puji; Sekartuti; Dewi, Rita M; Coutrier, Farah; Rozy, Ismail E; Susanti, Augustina I; Elyazar, Iqbal RF; Sutamihardja, Awalludin; Rahmat, Agus; Kinzer, Michael; Rogers, William O
2009-01-01
Background: Accurate information about the burden of malaria infection at the district or provincial level is required both to plan and assess local malaria control efforts. Although many studies of malaria epidemiology, immunology, and drug resistance have been conducted at many sites in Indonesia, there is little published literature describing malaria prevalence at the district, provincial, or national level. Methods: Two-stage cluster sampling malaria prevalence surveys were conducted in the wet season and dry season across West Sumba, Nusa Tenggara Province, Indonesia. Results: Eight thousand eight hundred seventy samples were collected from 45 sub-villages in the surveys. The overall prevalence of malaria infection in the West Sumba District was 6.83% (95% CI, 4.40, 9.26) in the wet season and 4.95% (95% CI, 3.01, 6.90) in the dry. In the wet season Plasmodium falciparum accounted for 70% of infections; in the dry season P. falciparum and Plasmodium vivax were present in equal proportion. Malaria prevalence varied substantially across the district; prevalences in individual sub-villages ranged from 0–34%. The greatest malaria prevalence was in children and teenagers; the geometric mean parasitaemia in infected individuals decreased with age. Malaria infection was clearly associated with decreased haemoglobin concentration in children under 10 years of age, but it is not clear whether this association is causal. Conclusion: Malaria is hypoendemic to mesoendemic in West Sumba, Indonesia. The age distribution of parasitaemia suggests that transmission has been stable enough to induce some clinical immunity. These prevalence data will aid the design of future malaria control efforts and will serve as a baseline against which the results of current and future control efforts can be assessed. PMID:19134197
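The width of the reported confidence intervals illustrates why the two-stage cluster design matters: a naive binomial interval on 8,870 samples would be far narrower. The back-of-envelope sketch below uses the abstract's wet-season numbers to recover the implied design effect; treating the reported interval as a symmetric Wald-type interval is an assumption made only for this illustration.

```python
import math

n, p = 8870, 0.0683        # wet-season sample size and prevalence
lo, hi = 0.0440, 0.0926    # reported 95% CI for wet-season prevalence

se_srs = math.sqrt(p * (1 - p) / n)   # SE if this were simple random sampling
half_naive = 1.96 * se_srs            # naive half-width: about half a point
se_implied = (hi - lo) / (2 * 1.96)   # SE implied by the reported interval
deff = (se_implied / se_srs) ** 2     # design effect attributable to clustering
```

The implied design effect on the order of 20 reflects the strong village-level clustering the abstract reports (sub-village prevalences of 0-34%), and is exactly why cluster surveys need many clusters, not just many individuals.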
Determining the disease management process for epileptic patients: A qualitative study
Hosseini, Nazafarin; Sharif, Farkhondeh; Ahmadi, Fazlollah; Zare, Mohammad
2016-01-01
Background: Epilepsy exposes patients to many physical, social, and emotional challenges. Thus, it presents a complex picture and needs holistic care. Medical treatment and the psychosocial aspects of epilepsy remain central to managing and improving the patient's quality of life through team efforts. Some studies have shown the dimensions of self-management, but the disease management process of patients with epilepsy, especially in Iran, is not clear. This study aimed to determine the disease management process in patients with epilepsy in Iran. Materials and Methods: This qualitative, grounded theory study was conducted from January 2009 to February 2012 in Isfahan city (Iran). Thirty-two participants were recruited by goal-oriented (purposive), snowball, and theoretical sampling methods. After conducting a total of 43 in-depth interviews with the participants, the researchers reached data saturation. Data were analyzed using the Strauss and Corbin method. Results: With a focus on the disease management process, the researchers found three main themes and seven sub-themes forming a psychosocial process (PSP). The main themes were: perception of threat to self-identity, effort to preserve self-identity, and burnout. The psychosocial aspect of the disease generated one main variable, "the perception of identity loss," and one central variable, "searching for self-identity." Conclusions: Participants attributed threat to self-identity and burnout to the way their disease was managed, requiring efforts to preserve their identity. Recommendations consist of support programs and strategies to improve the public perception of epilepsy in Iran, help patients accept their condition and preserve self-identity, and, most importantly, enhance the medical management of epilepsy. PMID:26985223
NASA Astrophysics Data System (ADS)
Grenn, Michael W.
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process.
The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. The H_R may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on H_R, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in H_R and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimation of ΔE.
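The core bookkeeping of the REF, distributing R requirements among N quality levels and measuring the disorder of that distribution, can be illustrated with a minimal Shannon-entropy sketch (this is our illustration of the underlying idea only; the dissertation's full formulation uses quantum statistics to count the arrangements P):

```python
import math

def requirements_entropy(counts):
    """Shannon entropy (bits) of a distribution of requirements over quality levels.

    counts[i] is the number of requirements currently at quality level i.
    Entropy is highest when requirements are spread evenly across levels
    (maximum disorder and uncertainty) and falls to zero when every
    requirement reaches a single level -- the desired minimum-entropy state.
    """
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

# Early in the process: 100 requirements spread over 4 quality levels
print(requirements_entropy([25, 25, 25, 25]))  # 2.0 bits (maximum for N = 4)
# Desired end state: all requirements at the top quality level
print(requirements_entropy([0, 0, 0, 100]))    # 0.0 bits
```

Tracking this quantity after each batch of additions, deletions, and revisions is what lets a framework like the REF distinguish changes that move requirements toward the desired state from those that merely churn.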
NASA Astrophysics Data System (ADS)
Christe, Steven
Over the past decade, the NASA Marshall Space Flight Center (MSFC) has been improving the angular resolution of hard X-ray (HXR; 20–70 keV) optics to the point that we now routinely manufacture optics modules with an angular resolution of 20 arcsec Half Power Diameter (HPD), almost three times the performance of NuSTAR optics (Ramsey et al. 2013; Gubarev et al. 2013a; Atkins et al. 2013). New techniques are currently being developed to provide even higher angular resolution. High angular resolution HXR optics require detectors with a large number of fine pixels in order to adequately sample the telescope point spread function (PSF) over the entire field of view. Excessively over-sampling the PSF will increase readout noise and require more processing with no appreciable increase in image quality. An appropriate level of over-sampling is to have 3 pixels within the HPD. For the HERO mirrors, whose 26 arcsec HPD over a 6-m focal length converts to 750 μm, the optimum pixel size is around 250 μm. At a 10-m focal length these detectors can support a 16 arcsec HPD. Of course, the detectors must also have high efficiency in the HXR region, good energy resolution, low background, low power requirements, and low sensitivity to radiation damage (Ramsey 2001). The ability to handle high counting rates is also desirable for efficient calibration. A collaboration between Goddard Space Flight Center (GSFC), MSFC, and Rutherford Appleton Laboratory (RAL) in the UK is developing precisely such detectors under an ongoing, funded APRA program (FY2015 to FY2017). The detectors use the RAL-developed Application Specific Integrated Circuit (ASIC) dubbed HEXITEC, for High Energy X-Ray Imaging Technology. These HEXITEC ASICs can be bonded to 1- or 2-mm-thick Cadmium Telluride (CdTe) or Cadmium-Zinc-Telluride (CZT) to create a fine (250 μm pitch) HXR detector (Jones et al. 2009; Seller et al. 2011).
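The quoted pixel sizes follow from a simple plate-scale conversion; a small sketch (function name and structure are ours) reproduces the numbers in the abstract:

```python
ARCSEC_PER_RAD = 206265.0  # arcseconds in one radian

def pixel_size_um(hpd_arcsec, focal_length_m, pixels_per_hpd=3):
    """Pixel pitch (micrometres) that places `pixels_per_hpd` pixels within
    the half-power diameter (HPD) of an optic at the given focal length."""
    hpd_um = (hpd_arcsec / ARCSEC_PER_RAD) * focal_length_m * 1e6
    return hpd_um / pixels_per_hpd

# HERO case from the text: a 26 arcsec HPD at a 6-m focal length
print(round(pixel_size_um(26, 6.0) * 3))  # 756: the HPD spans ~750 μm
print(round(pixel_size_um(26, 6.0)))      # 252: optimum pixels of ~250 μm
```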
The objectives of this funded effort are to develop and test a HEXITEC-based detector system through the (1) design, manufacture, and test of front-end electronics instrument boards, (2) calibration of the detectors to assess their performance, and (3) vibration and environmental testing. By the end of this program, multiple detector assemblies will be built and characterized, and can be used as part of future instruments. We propose to augment the existing effort with the development of an anti-coincidence shield for these HEXITEC-based detector assemblies to maximize sensitivity. Designing the anti-coincidence shield is enabled by the addition of a new team member, Wayne Baumgartner, who has recently and fortuitously joined the existing effort. Dr. Baumgartner has valuable and relevant past experience with similar shield systems developed for NuSTAR and the InFOCμS x-ray telescope. We are asking for a modest amount of additional funding in this proposal year, as it coincides with a key time in the characterization and environmental testing of the detector assemblies. Characterization and environmental testing of the bare assemblies is already funded under the current effort. The addition of this active shield will allow for a more complete detector module vibration and environment test at the end of the existing development program so that this project results in a detector system with a demonstrated TRL of 6: "System/subsystem model or prototype demonstration in a relevant environment."
O'Connell, M. T.; Uzee O'Connell, A.M.; Williams, J.D.
2005-01-01
Accurate knowledge of an organism's distribution is necessary for conserving species with small or isolated populations. A perceived rarity may only reflect inadequate sampling effort and suggest the need for more research. We used a recently developed method to evaluate the distribution of a rare fish species, the blackmouth shiner Notropis melanostomus Bortone 1989 (Cyprinidae), which occurs in disjunct populations in Mississippi and Florida. Until 1995, N. melanostomus had been collected from only three localities in Mississippi, but in 1995, eight new localities were discovered. We analyzed museum records of fish collections from Mississippi, Florida, and Alabama to compare sampling effort before and after 1995. Results supported our predictions that 1) pre-1995 data would indicate inadequate sampling effort in Mississippi, 2) additional post-1995 sampling improved confidence in the currently known Mississippi distribution, and 3) there has not been enough sampling to accurately represent the actual distribution of N. melanostomus in Florida and across its entire known range. This last prediction was confirmed with the recent (2003) discovery of the first N. melanostomus in Alabama.
Assessment of fertility control efforts in a selected area of Karachi, Pakistan.
Shirmeen, Amra; Khan, Muhammad F H; Khan, Khizer H; Khan, Khurum H
2007-09-01
To investigate the impact of fertility control efforts on reducing fertility and to study the contributory role of fertility-inhibiting factors, viz. age at marriage, breast feeding and post-partum amenorrhea, abortion, and use of contraceptives, in a selected area of Karachi, Pakistan. The aim was to estimate the gap between knowledge of contraceptives and its practice, i.e. the KAP-GAP, as well as to determine the level of unmet need in the PIB colony in Karachi. A sample survey was conducted in PIB colony in Karachi from October 2005 to November 2005 by interviewing 340 married women of reproductive age. The data were tabulated and the John Bongaarts technique was used to analyse the success of fertility control efforts in the selected area. Of the total of 340 respondents, 38% were currently using contraceptive methods, with 26% using OCPs and 12% using condoms. A slight reduction in total fertility (TFR) was noticed. The population policy of Pakistan envisages achieving population stabilization in 2020 by reducing the annual rate of population growth from 1.9% to 1.3% and TFR to 2.1. This target requires strenuous efforts to make the small-family concept an accepted norm through a carefully designed communication and education campaign. Concentration on proximate determinants of fertility, particularly breast feeding and prolonging the birth interval, will not generate opposition from the community because these concepts are in accordance with Islamic injunctions and teachings.
Using Six Sigma to improve once daily gentamicin dosing and therapeutic drug monitoring performance.
Egan, Sean; Murphy, Philip G; Fennell, Jerome P; Kelly, Sinead; Hickey, Mary; McLean, Carolyn; Pate, Muriel; Kirke, Ciara; Whiriskey, Annette; Wall, Niall; McCullagh, Eddie; Murphy, Joan; Delaney, Tim
2012-12-01
Safe, effective therapy with the antimicrobial gentamicin requires good practice in dose selection and monitoring of serum levels. Suboptimal therapy occurs with breakdown in the process of drug dosing, serum blood sampling, laboratory processing and level interpretation. Unintentional underdosing may result. This improvement effort aimed to optimise this process in an academic teaching hospital using Six Sigma process improvement methodology. A multidisciplinary project team was formed. Process measures considered critical to quality were defined, and baseline practice was examined through process mapping and audit. Root cause analysis informed improvement measures. These included a new dosing and monitoring schedule, and standardised assay sampling and drug administration timing which maximised local capabilities. Three iterations of the improvement cycle were conducted over a 24-month period. The attainment of serum level sampling in the required time window improved by 85% (p≤0.0001). A 66% improvement in accuracy of dosing was observed (p≤0.0001). Unnecessary dose omission while awaiting level results and inadvertent disruption to therapy due to dosing and monitoring process breakdown were eliminated. The average daily dose administered increased from 3.39 to 4.78 mg/kg/day. Using Six Sigma methodology enhanced gentamicin usage process performance. Local process-related factors may adversely affect adherence to practice guidelines for gentamicin, a drug which is complex to use. It is vital to adapt dosing guidance and monitoring requirements so that they are capable of being implemented in the clinical environment as a matter of routine. Improvement may be achieved through a structured localised approach with multidisciplinary stakeholder involvement.
Mist net effort required to inventory a forest bat species assemblage.
Theodore J. Weller; Danny C. Lee
2007-01-01
Little quantitative information exists about the survey effort necessary to inventory temperate bat species assemblages. We used a bootstrap resampling algorithm to estimate the number of mist net surveys required to capture individuals from 9 species at both study area and site levels using data collected in a forested watershed in northwestern California, USA, during...
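The bootstrap logic described, resampling completed surveys to ask how many are needed before every species has been captured, can be sketched as follows (toy interface and data are ours; the authors' actual algorithm differs in detail):

```python
import random

def surveys_needed(survey_catches, target_species, n_boot=2000, quota=0.95,
                   rng=None):
    """Bootstrap estimate of the number of surveys required so that, in at
    least `quota` of resampled survey sets, every target species is captured.

    survey_catches : list of sets, the species caught in each field survey.
    Returns the smallest m such that m surveys drawn with replacement capture
    all target species in >= quota of n_boot bootstrap replicates.
    """
    rng = rng or random.Random()
    targets = set(target_species)
    for m in range(1, 20 * len(survey_catches) + 1):
        hits = 0
        for _ in range(n_boot):
            seen = set()
            for _ in range(m):
                seen |= rng.choice(survey_catches)
            if targets <= seen:
                hits += 1
        if hits / n_boot >= quota:
            return m
    return None

# If every survey catches all species, a single survey suffices.
print(surveys_needed([{"A", "B", "C"}], {"A", "B", "C"},
                     rng=random.Random(0)))  # 1
```

Rarely caught species dominate the answer: a species present in only half the surveys pushes the required effort up roughly as log(1 - quota) / log(0.5).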
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false Under what circumstances may the Secretary waive the maintenance of effort requirement? 403.183 Section 403.183 Education Regulations of the Offices of the... VOCATIONAL AND APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403...
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is an automated software-development estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product to be developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
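Parametric effort estimation of the kind COSTMODL automates can be illustrated with Boehm's basic COCOMO equations (an illustration only; COSTMODL's NASA-calibrated model differs in its constants and cost drivers):

```python
def basic_cocomo(ksloc, mode="organic"):
    """Basic-COCOMO estimate (Boehm 1981): effort in person-months and
    calendar schedule in months, from size in thousands of source lines."""
    effort_params = {"organic": (2.4, 1.05), "semidetached": (3.0, 1.12),
                     "embedded": (3.6, 1.20)}
    schedule_exp = {"organic": 0.38, "semidetached": 0.35, "embedded": 0.32}
    a, b = effort_params[mode]
    effort = a * ksloc ** b                       # person-months
    schedule = 2.5 * effort ** schedule_exp[mode]  # calendar months
    return effort, schedule

effort, schedule = basic_cocomo(32)  # a 32-KSLOC organic-mode product
print(round(effort), round(schedule))  # ≈ 91 person-months over ≈ 14 months
```

Average staffing falls out as effort divided by schedule, which is the kind of phase-by-phase staffing distribution a tool like COSTMODL reports.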
ERIC Educational Resources Information Center
Bijleveld, Erik; Custers, Ruud; Aarts, Henk
2012-01-01
When in pursuit of rewards, humans weigh the value of potential rewards against the amount of effort that is required to attain them. Although previous research has generally conceptualized this process as a deliberate calculation, recent work suggests that rudimentary mechanisms--operating without conscious intervention--play an important role as…
34 CFR 461.44 - How does a State request a waiver of the maintenance of effort requirement?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 34 Education 3 2014-07-01 2014-07-01 false How does a State request a waiver of the maintenance of effort requirement? 461.44 Section 461.44 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF VOCATIONAL AND ADULT EDUCATION, DEPARTMENT OF EDUCATION ADULT EDUCATION...
The ethical use of existing samples for genome research.
Bathe, Oliver F; McGuire, Amy L
2009-10-01
Modern biobanking efforts consist of prospective collections of tissues linked to clinical data for patients who have given informed consent for the research use of their specimens and data, including their DNA. In such efforts, patient autonomy and privacy are well respected because of the prospective nature of the informed consent process. However, one of the richest sources of tissue for research continues to be the millions of archived samples collected by pathology departments during normal clinical care or for research purposes without specific consent for future research or genetic analysis. Because specific consent was not obtained a priori, issues related to individual privacy and autonomy are much more complicated. A framework for accessing these existing samples and related clinical data for research is presented. Archival tissues may be accessed only when there is a reasonable likelihood of generating beneficial and scientifically valid information. To minimize risks, databases containing information related to the tissue and to clinical data should be coded, no personally identifying phenotypic information should be included, and access should be restricted to bona fide researchers for legitimate research purposes. These precautions, if implemented appropriately, should ensure that the research use of archival tissue and data is no more than minimal risk. A waiver of the requirement for informed consent would then be justified if reconsent is shown to be impracticable. A waiver of consent should not be granted, however, if there is a significant risk to privacy, if the proposed research use is inconsistent with the original consent (where there is one), or if the potential harm from a privacy breach is considerable.
López-Picazo Ferrer, J J; Tomás García, N; Cubillana Herrero, J D; Gómez Company, J A; de Dios Cánovas García, J
2014-01-01
To measure the appropriateness of hospital admissions, to classify the hospital's Clinical Services (CS) according to their level of inappropriateness, and to determine the usefulness of applying rapid assessment techniques (lot quality assurance sampling) in these types of measurements. A descriptive, retrospective study was conducted in a tertiary hospital to assess the clinical records of emergency admissions to the 12 CS with the highest volume of admissions, using the Appropriateness Evaluation Protocol (AEP). A four-level ("A" to "D") scale of increasingly inadequate admissions was constructed, setting both standard and threshold values in every stratum. Every CS was classified into one of the levels using lot quality assurance sampling (LQAS). A total of 156 cases (13 cases from every CS) were assessed. The assessment effort (devoted time) was also estimated. There were 22.4±6.3% inadequate admissions. In the CS classification, 9 (75%) achieved a good or acceptable appropriateness level, and only 1 (8%) was at an unacceptable level. The time devoted was estimated at 17 hours. The AEP is useful for assessing admission appropriateness and may be included in the «Emergencies» process management, although its variability prevents its use for external comparisons. If LQAS is combined with the appropriateness classification level and a global estimate (obtained by unifying the lot samples), monitoring is affordable without great effort. Extending these tools to other quality indicators that require direct observation or manual assessment of clinical records could improve monitoring efficiency. Copyright © 2013 SECA. Published by Elsevier España. All rights reserved.
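The LQAS step, judging each service from a small lot of 13 audited records against pre-set thresholds, reduces to a simple decision rule. A sketch with hypothetical cut-offs (the study's actual standard and threshold values are not given in the abstract):

```python
def lqas_classify(inappropriate_count, thresholds=(1, 3, 5)):
    """Classify a clinical service from a lot of 13 audited admissions.

    thresholds are hypothetical decision values for levels "A" to "D":
    at most thresholds[0] inappropriate admissions -> "A" (good),
    at most thresholds[1] -> "B", at most thresholds[2] -> "C",
    more -> "D" (unacceptable).
    """
    for level, cutoff in zip("ABC", thresholds):
        if inappropriate_count <= cutoff:
            return level
    return "D"

lot = [False, True, False, False, True, False, False,
       False, False, True, False, False, False]  # one audited lot of 13
print(lqas_classify(sum(lot)))  # 3 inappropriate admissions -> "B"
```

The appeal is exactly what the abstract reports: 12 services × 13 records = 156 chart reviews (about 17 hours) yields both a per-service classification and, by pooling the lots, a global inappropriateness estimate.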
König, Stephan; Wubet, Tesfaye; Dormann, Carsten F.; Hempel, Stefan; Renker, Carsten; Buscot, François
2010-01-01
Large-scale (temporal and/or spatial) molecular investigations of the diversity and distribution of arbuscular mycorrhizal fungi (AMF) require considerable sampling efforts and high-throughput analysis. To facilitate such efforts, we have developed a TaqMan real-time PCR assay to detect and identify AMF in environmental samples. First, we screened the diversity in clone libraries, generated by nested PCR, of the nuclear ribosomal DNA internal transcribed spacer (ITS) of AMF in environmental samples. We then generated probes and forward primers based on the detected sequences, enabling AMF sequence type-specific detection in TaqMan multiplex real-time PCR assays. In comparisons to conventional clone library screening and Sanger sequencing, the TaqMan assay approach provided similar accuracy but higher sensitivity with cost and time savings. The TaqMan assays were applied to analyze the AMF community composition within plots of a large-scale plant biodiversity manipulation experiment, the Jena Experiment, primarily designed to investigate the interactive effects of plant biodiversity on element cycling and trophic interactions. The results show that environmental variables hierarchically shape AMF communities and that the sequence type spectrum is strongly affected by previous land use and disturbance, which appears to favor disturbance-tolerant members of the genus Glomus. The AMF species richness of disturbance-associated communities can be largely explained by richness of plant species and plant functional groups, while plant productivity and soil parameters appear to have only weak effects on the AMF community. PMID:20418424
Marks, Michael; Fookes, Maria; Wagner, Josef; Butcher, Robert; Ghinai, Rosanna; Sokana, Oliver; Sarkodie, Yaw-Adu; Lukehart, Sheila A; Solomon, Anthony W; Mabey, David C W; Thomson, Nicholas
2018-01-01
Abstract Background Yaws-like chronic ulcers can be caused by Treponema pallidum subspecies pertenue, Haemophilus ducreyi, or other, still-undefined bacteria. To permit accurate evaluation of yaws elimination efforts, programmatic use of molecular diagnostics is required. The accuracy and sensitivity of current tools remain unclear because our understanding of T. pallidum diversity is limited by the low number of sequenced genomes. Methods We tested samples from patients with suspected yaws collected in the Solomon Islands and Ghana. All samples were from patients whose lesions had previously tested negative using the Centers for Disease Control and Prevention (CDC) diagnostic assay in widespread use. However, some of these patients had positive serological assays for yaws on blood. We used direct whole-genome sequencing to identify T. pallidum subsp pertenue strains missed by the current assay. Results From 45 Solomon Islands and 27 Ghanaian samples, 11 were positive for T. pallidum DNA using the species-wide quantitative polymerase chain reaction (PCR) assay, from which we obtained 6 previously undetected T. pallidum subsp pertenue whole-genome sequences. These show that Solomon Islands sequences represent distinct T. pallidum subsp pertenue clades. These isolates were invisible to the CDC diagnostic PCR assay, due to sequence variation in the primer binding site. Conclusions Our data double the number of published T. pallidum subsp pertenue genomes. We show that Solomon Islands strains are undetectable by the PCR used in many studies and by health ministries. This assay is therefore not adequate for the eradication program. Next-generation genome sequence data are essential for these efforts. PMID:29045605
Testing Evaluation of the Electrochemical Organic Content Analyzer
NASA Technical Reports Server (NTRS)
Davenport, R. J.
1979-01-01
The breadboard electrochemical organic content analyzer was evaluated for aerospace applications. An awareness of the disadvantages of expendables in some systems resulted in an effort to investigate ways of reducing the consumption of the analyzer's electrolyte from the rate of 5.17 kg/30 days. It was found that the electrochemical organic content analyzer can serve as an organic monitor in the water quality monitor, with a range of 0.1 to 100 mg/l total organic carbon for a large number of common organic solutes. In a flight version it is anticipated that the analyzer would occupy 0.0002 cu m, weigh 1.4 kg, and require 10 W or less of power. With the optimum method of injecting electrolyte into the sample (saturation of the sample with a salt) it would expend only 0.04 kg of electrolyte during 30 days of continuous operation.
Complex disease and phenotype mapping in the domestic dog
Hayward, Jessica J.; Castelhano, Marta G.; Oliveira, Kyle C.; Corey, Elizabeth; Balkman, Cheryl; Baxter, Tara L.; Casal, Margret L.; Center, Sharon A.; Fang, Meiying; Garrison, Susan J.; Kalla, Sara E.; Korniliev, Pavel; Kotlikoff, Michael I.; Moise, N. S.; Shannon, Laura M.; Simpson, Kenneth W.; Sutter, Nathan B.; Todhunter, Rory J.; Boyko, Adam R.
2016-01-01
The domestic dog is becoming an increasingly valuable model species in medical genetics, showing particular promise to advance our understanding of cancer and orthopaedic disease. Here we undertake the largest canine genome-wide association study to date, with a panel of over 4,200 dogs genotyped at 180,000 markers, to accelerate mapping efforts. For complex diseases, we identify loci significantly associated with hip dysplasia, elbow dysplasia, idiopathic epilepsy, lymphoma, mast cell tumour and granulomatous colitis; for morphological traits, we report three novel quantitative trait loci that influence body size and one that influences fur length and shedding. Using simulation studies, we show that modestly larger sample sizes and denser marker sets will be sufficient to identify most moderate- to large-effect complex disease loci. This proposed design will enable efficient mapping of canine complex diseases, most of which have human homologues, using far fewer samples than required in human studies. PMID:26795439
A spatial model of bird abundance as adjusted for detection probability
Gorresen, P.M.; Mcmillan, G.P.; Camp, R.J.; Pratt, T.K.
2009-01-01
Modeling the spatial distribution of animals can be complicated by spatial and temporal effects (i.e. spatial autocorrelation and trends in abundance over time) and other factors such as imperfect detection probabilities and observation-related nuisance variables. Recent advances in modeling have demonstrated various approaches that handle most of these factors but which require a degree of sampling effort (e.g. replication) not available to many field studies. We present a two-step approach that addresses these challenges to spatially model species abundance. Habitat, spatial and temporal variables were handled with a Bayesian approach which facilitated modeling hierarchically structured data. Predicted abundance was subsequently adjusted to account for imperfect detection and the area effectively sampled for each species. We provide examples of our modeling approach for two endemic Hawaiian nectarivorous honeycreepers: 'i'iwi Vestiaria coccinea and 'apapane Himatione sanguinea. © 2009 Ecography.
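The second step, adjusting a model-predicted count for imperfect detection and the area effectively sampled, amounts to a simple scaling. A sketch with made-up numbers (an illustrative formula; the paper's exact adjustment may differ):

```python
def adjusted_abundance(predicted_count, detection_prob, area_sampled_ha,
                       plot_area_ha):
    """Scale a model-predicted count to abundance on a plot, correcting for
    the probability of detecting a bird and the area effectively surveyed."""
    density = predicted_count / (detection_prob * area_sampled_ha)  # birds/ha
    return density * plot_area_ha

# 12 birds predicted, 60% detection probability, 2 ha effectively sampled
print(adjusted_abundance(12, 0.6, 2.0, 2.0))  # 20.0 birds on the 2-ha plot
```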
Nutritional Status Assessment (SMO -16E)
NASA Technical Reports Server (NTRS)
Smith, Scott M.; Heer, M. A.; Zwart, S. R.
2012-01-01
The Nutritional Status Assessment Supplemental Medical Objective was an experiment initiated to expand nominal pre- and postflight clinical nutrition testing, and to gain a better understanding of the time course of changes during flight. The primary activity of this effort was collecting blood and urine samples 5 times during flight for analysis after return to Earth. Samples were subjected to a battery of tests, including nutritional, physiological, general chemistry, and endocrinology indices. These data provide a comprehensive survey of how nutritional status and related systems are affected by 4-6 months of space flight. Analyzing the data will help us to define nutritional requirements for long-duration missions, and better understand human adaptation to microgravity. This expanded set of measurements will also aid in the identification of nutritional countermeasures to counteract, for example, the deleterious effects of microgravity on bone and muscle and the effects of space radiation.
Steyer, G.D.; Sasser, C.E.; Visser, J.M.; Swenson, E.M.; Nyman, J.A.; Raynie, R.C.
2003-01-01
Wetland restoration efforts conducted in Louisiana under the Coastal Wetlands Planning, Protection and Restoration Act require monitoring the effectiveness of individual projects as well as monitoring the cumulative effects of all projects in restoring, creating, enhancing, and protecting the coastal landscape. The effectiveness of the traditional paired-reference monitoring approach in Louisiana has been limited because of difficulty in finding comparable reference sites. A multiple reference approach is proposed that uses aspects of hydrogeomorphic functional assessments and probabilistic sampling. This approach includes a suite of sites that encompass the range of ecological condition for each stratum, with projects placed on a continuum of conditions found for that stratum. Trajectories in reference sites through time are then compared with project trajectories through time. Plant community zonation complicated selection of indicators, strata, and sample size. The approach proposed could serve as a model for evaluating wetland ecosystems.
NASA Astrophysics Data System (ADS)
Foreman-Mackey, Daniel; Hogg, David W.; Lang, Dustin; Goodman, Jonathan
2013-03-01
We introduce a stable, well-tested Python implementation of the affine-invariant ensemble sampler for Markov chain Monte Carlo (MCMC) proposed by Goodman & Weare (2010). The code is open source and has already been used in several published projects in the astrophysics literature. The algorithm behind emcee has several advantages over traditional MCMC sampling methods, and it has excellent performance as measured by the autocorrelation time (or function calls per independent sample). One major advantage of the algorithm is that it requires hand-tuning of only 1 or 2 parameters, compared to ~N^2 for a traditional algorithm in an N-dimensional parameter space. In this document, we describe the algorithm and the details of our implementation. Exploiting the parallelism of the ensemble method, emcee permits any user to take advantage of multiple CPU cores without extra effort. The code is available online at http://dan.iel.fm/emcee under the GNU General Public License v2.
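The stretch move at the heart of this sampler is compact enough to sketch in plain NumPy. The following is an illustrative serial implementation of the Goodman & Weare (2010) proposal, not the emcee code itself; the function name and defaults are ours:

```python
import numpy as np

def stretch_move_sampler(log_prob, p0, n_steps, a=2.0, seed=0):
    """Minimal affine-invariant ensemble sampler (Goodman & Weare stretch move).

    p0: (n_walkers, ndim) initial ensemble; returns chain of shape
    (n_steps, n_walkers, ndim)."""
    rng = np.random.default_rng(seed)
    walkers = np.array(p0, dtype=float)
    n_walkers, ndim = walkers.shape
    chain = np.empty((n_steps, n_walkers, ndim))
    logp = np.array([log_prob(w) for w in walkers])
    for step in range(n_steps):
        for k in range(n_walkers):
            # pick a complementary walker j != k
            j = rng.integers(n_walkers - 1)
            if j >= k:
                j += 1
            # draw stretch factor z from g(z) proportional to 1/sqrt(z) on [1/a, a]
            z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
            proposal = walkers[j] + z * (walkers[k] - walkers[j])
            logp_prop = log_prob(proposal)
            # acceptance includes the z**(ndim-1) volume factor of the move
            log_accept = (ndim - 1) * np.log(z) + logp_prop - logp[k]
            if np.log(rng.random()) < log_accept:
                walkers[k] = proposal
                logp[k] = logp_prop
        chain[step] = walkers
    return chain
```

The only tunables are the stretch scale `a` and the number of walkers, which is the 1-or-2-parameter hand-tuning the abstract refers to.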
Grist, Eric P M; Flegg, Jennifer A; Humphreys, Georgina; Mas, Ignacio Suay; Anderson, Tim J C; Ashley, Elizabeth A; Day, Nicholas P J; Dhorda, Mehul; Dondorp, Arjen M; Faiz, M Abul; Gething, Peter W; Hien, Tran T; Hlaing, Tin M; Imwong, Mallika; Kindermans, Jean-Marie; Maude, Richard J; Mayxay, Mayfong; McDew-White, Marina; Menard, Didier; Nair, Shalini; Nosten, Francois; Newton, Paul N; Price, Ric N; Pukrittayakamee, Sasithon; Takala-Harrison, Shannon; Smithuis, Frank; Nguyen, Nhien T; Tun, Kyaw M; White, Nicholas J; Witkowski, Benoit; Woodrow, Charles J; Fairhurst, Rick M; Sibley, Carol Hopkins; Guerin, Philippe J
2016-10-24
Artemisinin-resistant Plasmodium falciparum malaria parasites are now present across much of mainland Southeast Asia, where ongoing surveys are measuring and mapping their spatial distribution. These efforts require substantial resources. Here we propose a generic 'smart surveillance' methodology to identify optimal candidate sites for future sampling and thus map the distribution of artemisinin resistance most efficiently. The approach uses the 'uncertainty' map generated iteratively by a geostatistical model to determine optimal locations for subsequent sampling. The methodology is illustrated using recent data on the prevalence of the K13-propeller polymorphism (a genetic marker of artemisinin resistance) in the Greater Mekong Subregion. This methodology, which has broader application to geostatistical mapping in general, could improve the quality and efficiency of drug resistance mapping and thereby guide practical operations to eliminate malaria in affected areas.
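The "sample next where the model is most uncertain" rule can be illustrated with a toy one-dimensional Gaussian-process surrogate. This is a hypothetical sketch with made-up function names; the paper itself uses a full geostatistical model of K13-propeller prevalence:

```python
import numpy as np

def rbf_kernel(a, b, length=1.0):
    """Squared-exponential covariance between 1-D point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length**2)

def next_sample_site(observed_x, candidate_x, length=1.0, noise=1e-6):
    """Pick the candidate location with the largest GP posterior variance."""
    K = rbf_kernel(observed_x, observed_x, length) + noise * np.eye(len(observed_x))
    Ks = rbf_kernel(observed_x, candidate_x, length)
    # posterior variance at candidates: k(x,x) - k_s^T K^-1 k_s
    v = np.linalg.solve(K, Ks)
    var = 1.0 - np.sum(Ks * v, axis=0)
    return candidate_x[np.argmax(var)]
```

Locations far from all existing surveys have the highest predictive variance and are therefore proposed first, which is the intuition behind the iterative uncertainty map.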
Engineering Design of ITER Prototype Fast Plant System Controller
NASA Astrophysics Data System (ADS)
Goncalves, B.; Sousa, J.; Carvalho, B.; Rodrigues, A. P.; Correia, M.; Batista, A.; Vega, J.; Ruiz, M.; Lopez, J. M.; Rojo, R. Castro; Wallander, A.; Utzel, N.; Neto, A.; Alves, D.; Valcarcel, D.
2011-08-01
The ITER control, data access and communication (CODAC) design team identified the need for two types of plant systems. A slow control plant system is based on industrial automation technology with maximum sampling rates below 100 Hz, and a fast control plant system is based on embedded technology with higher sampling rates and more stringent real-time requirements than those of slow controllers. The latter is applicable to diagnostics and plant systems in closed-control loops whose cycle times are below 1 ms. Fast controllers will be dedicated industrial controllers with the ability to supervise other fast and/or slow controllers, interface to actuators and sensors and, if necessary, to high-performance networks. Two prototypes of a fast plant system controller specialized for data acquisition and constrained by ITER technological choices are being built using two different form factors. This prototyping activity contributes to the Plant Control Design Handbook effort of standardization, specifically regarding fast controller characteristics. Envisaging a general-purpose fast controller design, diagnostic use cases with specific requirements were analyzed and will be presented along with the interface with CODAC and sensors. The requirements and constraints that real-time plasma control imposes on the design were also taken into consideration. Functional specifications and a technology-neutral architecture, together with their implications for the engineering design, were considered. The detailed engineering design compliant with ITER standards was performed and will be discussed in detail. Emphasis will be given to the integration of the controller in the standard CODAC environment. Requirements for the EPICS IOC providing the interface to the outside world, the prototype decisions on form factor, real-time operating system, and high-performance networks will also be discussed, as well as the requirements for data streaming to CODAC for visualization and archiving.
Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.
Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N
2011-04-15
To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed.
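One building block behind such power calculations is the Poisson model for counts of sparse organisms: the chance of observing at least one organism in a sampled volume rises as 1 - exp(-cV). A minimal sketch of that relationship (our own simplification; the paper's model additionally handles type I/II error trade-offs and recovery errors):

```python
import math

def detection_probability(conc, volume):
    """P(counting >= 1 organism) under Poisson sampling, with `conc`
    organisms per unit volume and `volume` units sampled."""
    return 1.0 - math.exp(-conc * volume)

def min_volume(conc, power=0.95):
    """Smallest sample volume giving `power` probability of detecting
    at least one organism at concentration `conc`."""
    return -math.log(1.0 - power) / conc
```

The inverse relationship makes the core difficulty concrete: as the regulated concentration limit falls, the volume that must be sampled and processed grows in proportion.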
Energy Efficient Engine (E3) combustion system component technology performance report
NASA Technical Reports Server (NTRS)
Burrus, D. L.; Chahrour, C. A.; Foltz, H. L.; Sabla, P. E.; Seto, S. P.; Taylor, J. R.
1984-01-01
The Energy Efficient Engine (E3) combustor effort was conducted as part of the overall NASA/GE E3 Program. This effort included the selection of an advanced double-annular combustion system design. The primary intent of this effort was to evolve a design that meets the stringent emissions and life goals of the E3, as well as all of the usual performance requirements of combustion systems for modern turbofan engines. Numerous detailed design studies were conducted to define the features of the combustion system design. Development test hardware was fabricated, and an extensive testing effort was undertaken to evaluate the combustion system subcomponents in order to verify and refine the design. Technology derived from this effort was incorporated into the engine combustion hardware design. The advanced engine combustion system was then evaluated in component testing to verify the design intent. What evolved from this effort was an advanced combustion system capable of satisfying all of the combustion system design objectives and requirements of the E3.
ERIC Educational Resources Information Center
McDowall, Philippa S.; Schaughency, Elizabeth
2017-01-01
School efforts to engage parents are posited to influence whether and how they are involved in their children's schooling. The authors examined educators' engagement efforts in beginning reading, their subjective evaluations of engagement practices, and beliefs about parent involvement, in two stratified samples of New Zealand elementary school…
ERIC Educational Resources Information Center
Hofer, Claire; Eisenberg, Nancy; Reiser, Mark
2010-01-01
The relations among effortful control, ego resiliency, socialization, and social functioning were examined with a sample of 182 French adolescents (14-20 years old). Adolescents, their parents, and/or teachers completed questionnaires on these constructs. Effortful control and ego resiliency were correlated with adolescents' social functioning,…
Sample handling for mass spectrometric proteomic investigations of human sera.
West-Nielsen, Mikkel; Høgdall, Estrid V; Marchiori, Elena; Høgdall, Claus K; Schou, Christian; Heegaard, Niels H H
2005-08-15
Proteomic investigations of sera are potentially of value for diagnosis, prognosis, choice of therapy, and disease activity assessment by virtue of discovering new biomarkers and biomarker patterns. Much debate focuses on the biological relevance and the need for identification of such biomarkers while less effort has been invested in devising standard procedures for sample preparation and storage in relation to model building based on complex sets of mass spectrometric (MS) data. Thus, development of standardized methods for collection and storage of patient samples together with standards for transportation and handling of samples are needed. This requires knowledge about how sample processing affects MS-based proteome analyses and thereby how nonbiological biased classification errors are avoided. In this study, we characterize the effects of sample handling, including clotting conditions, storage temperature, storage time, and freeze/thaw cycles, on MS-based proteomics of human serum by using principal components analysis, support vector machine learning, and clustering methods based on genetic algorithms as class modeling and prediction methods. Using spiking to artificially create differentiable sample groups, this integrated approach yields data that, even when working with sample groups that differ more than may be expected in biological studies, clearly demonstrate the need for comparable sampling conditions for samples used for modeling and for the samples that are going into the test set group. Also, the study emphasizes the difference between class prediction and class comparison studies as well as the advantages and disadvantages of different modeling methods.
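Of the modeling methods listed, principal components analysis is the simplest to sketch. A minimal score projection via SVD, for illustration only (this is not the authors' pipeline, and the function name is ours):

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples (rows of X) onto their leading principal components."""
    Xc = X - X.mean(axis=0)                      # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # scores in PC space
```

Plotting the first two score columns for samples stored under different conditions is the kind of view in which handling-induced (nonbiological) group separation becomes visible.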
Marchetti, Igor; Shumake, Jason; Grahek, Ivan; Koster, Ernst H W
2018-08-01
Temperamental effortful control and attentional networks are increasingly viewed as important underlying processes in depression and anxiety. However, it is still unknown whether these factors facilitate depressive and anxiety symptoms in the general population and, more specifically, in remitted depressed individuals. We investigated to what extent effortful control and attentional networks (i.e., Attention Network Task) explain concurrent depressive and anxious symptoms in healthy individuals (n = 270) and remitted depressed individuals (n = 90). Both samples were highly representative of the US population. Increased effortful control predicted a substantial decrease in symptoms of both depression and anxiety in the whole sample, whereas decreased efficiency of executive attention predicted a modest increase in depressive symptoms. Remitted depressed individuals showed neither less effortful control nor less efficient attentional networks than healthy individuals. Moreover, clinical status did not moderate the relationship between temperamental factors and either depressive or anxiety symptoms. Limitations include the cross-sectional nature of the study. Our study shows that temperamental effortful control represents an important transdiagnostic process for depressive and anxiety symptoms in adults.
Magnuson, Matthew; Ernst, Hiba; Griggs, John; Fitz-James, Schatzi; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Smith, Terry; Hedrick, Elizabeth
2014-11-01
Catastrophic incidents, such as natural disasters, terrorist attacks, and industrial accidents, can occur suddenly and have high impact. However, they often occur at such a low frequency and in unpredictable locations that planning for the management of the consequences of a catastrophe can be difficult. For those catastrophes that result in the release of contaminants, the ability to analyze environmental samples is critical and contributes to the resilience of affected communities. Analyses of environmental samples are needed to make appropriate decisions about the course of action to restore the area affected by the contamination. Environmental samples range from soil, water, and air to vegetation, building materials, and debris. In addition, processes used to decontaminate any of these matrices may also generate wastewater and other materials that require analyses to determine the best course for proper disposal. This paper summarizes activities and programs the United States Environmental Protection Agency (USEPA) has implemented to ensure capability and capacity for the analysis of contaminated environmental samples following catastrophic incidents. USEPA's focus has been on building capability for a wide variety of contaminant classes and on ensuring national laboratory capacity for potential surges in the numbers of samples that could quickly exhaust the resources of local communities. USEPA's efforts have been designed to ensure a strong and resilient laboratory infrastructure in the United States to support communities as they respond to contamination incidents of any magnitude. The efforts include not only addressing technical issues related to the best-available methods for chemical, biological, and radiological contaminants, but also include addressing the challenges of coordination and administration of an efficient and effective response. Laboratory networks designed for responding to large scale contamination incidents can be sustained by applying their resources during incidents of lesser significance, for special projects, and for routine surveillance and monitoring as part of ongoing activities of the environmental laboratory community.
Disentangling sampling and ecological explanations underlying species-area relationships
Cam, E.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Alpizar-Jara, R.; Flather, C.H.
2002-01-01
We used a probabilistic approach to address the influence of sampling artifacts on the form of species-area relationships (SARs). We developed a model in which the increase in observed species richness is a function of sampling effort exclusively. We assumed that effort depends on area sampled, and we generated species-area curves under that model. These curves can look realistic. We then generated SARs from avian data, comparing SARs based on counts with those based on richness estimates. We used an approach to estimation of species richness that accounts for species detection probability and, hence, for variation in sampling effort. The slopes of SARs based on counts are steeper than those of curves based on estimates of richness, indicating that the former partly reflect failure to account for species detection probability. SARs based on estimates reflect ecological processes exclusively, not sampling processes. This approach permits investigation of ecologically relevant hypotheses. The slope of SARs is not influenced by the slope of the relationship between habitat diversity and area. In situations in which not all of the species are detected during sampling sessions, approaches to estimation of species richness integrating species detection probability should be used to investigate the rate of increase in species richness with area.
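The slope being compared here is usually obtained from the power-law SAR form S = c * A**z by log-log regression. A minimal sketch with synthetic data (the study's richness estimators, which account for detection probability, are not reproduced here):

```python
import numpy as np

def sar_slope(areas, richness):
    """Least-squares slope z of the power-law SAR S = c * A**z,
    fitted in log-log space."""
    z, log_c = np.polyfit(np.log(areas), np.log(richness), 1)
    return z

# synthetic species-area curve with a known exponent z = 0.25
areas = np.array([1.0, 10.0, 100.0, 1000.0])
richness = 5.0 * areas ** 0.25
```

Fitting this same regression to raw counts versus detection-corrected estimates is what reveals the inflated slope the abstract describes.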
NASA Astrophysics Data System (ADS)
Hanasoge, Shravan; Agarwal, Umang; Tandon, Kunj; Koelman, J. M. Vianney A.
2017-09-01
Determining the pressure differential required to achieve a desired flow rate in a porous medium requires solving Darcy's law, a Laplace-like equation, with a spatially varying tensor permeability. In various scenarios, the permeability coefficient is sampled at high spatial resolution, which makes solving Darcy's equation numerically prohibitively expensive. As a consequence, much effort has gone into creating upscaled or low-resolution effective models of the coefficient while ensuring that the estimated flow rate is well reproduced, bringing to the fore the classic tradeoff between computational cost and numerical accuracy. Here we perform a statistical study to characterize the relative success of upscaling methods on a large sample of permeability coefficients that are above the percolation threshold. We introduce a technique based on mode-elimination renormalization group theory (MG) to build coarse-scale permeability coefficients. Comparing the results with coefficients upscaled using other methods, we find that MG is consistently more accurate, particularly due to its ability to address the tensorial nature of the coefficients. As we have implemented it, MG places a low computational demand, and accurate flow-rate estimates are obtained when using MG-upscaled permeabilities that approach or exceed the percolation threshold.
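The MG scheme itself is beyond a short sketch, but the classical averages it is benchmarked against are easy to state: harmonic averaging along the flow direction (layers in series) and arithmetic averaging across it (layers in parallel). A crude 2x2 block upscaler along those lines, as our own illustration rather than the authors' renormalization method:

```python
import numpy as np

def upscale_2x2(k):
    """Crude 2x2 block upscaling of a scalar permeability field, for flow
    along axis 0: harmonic mean along the flow direction (series), then
    arithmetic mean across it (parallel). k must have even shape (ny, nx)."""
    ny, nx = k.shape
    blocks = k.reshape(ny // 2, 2, nx // 2, 2)
    # harmonic mean of the two cells stacked along the flow direction
    series = 2.0 / (1.0 / blocks[:, 0] + 1.0 / blocks[:, 1])
    # arithmetic mean of the two resulting columns side by side
    return series.mean(axis=-1)
```

Scalar averages like this ignore the tensorial character of the coefficient, which is precisely where the abstract reports MG gaining its accuracy.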
Pitfalls in lung cancer molecular pathology: how to limit them in routine practice?
Ilie, M; Hofman, P
2012-01-01
New treatment options in advanced non-small cell lung carcinoma (NSCLC) targeting activating epidermal growth factor receptor (EGFR) gene mutations and other genetic alterations demonstrated the clinical significance of the molecular features of specific subsets of tumors. Therefore, the development of personalized medicine has stimulated the routine integration into pathology departments of somatic mutation testing. However, clinical mutation testing must be optimized and standardized with regard to histological profile, type of samples, pre-analytical steps, methodology and result reporting. Routine molecular testing in NSCLC is currently moving beyond EGFR mutational analysis. Recent progress of targeted therapies will require molecular testing for a wide panel of mutations for a personalized molecular diagnosis. As a consequence, efficient testing of multiple molecular abnormalities is an urgent requirement in thoracic oncology. Moreover, increasingly limited tumor sample becomes a major challenge for molecular pathology. Continuous efforts should be made for safe, effective and specific molecular analyses. This must be based on close collaboration between the departments involved in the management of lung cancer. In this review we explored the practical issues and pitfalls surrounding the routine implementation of molecular testing in NSCLC in a pathology laboratory.
From SPOT 5 to Pleiades HR: evolution of the instrumental specifications
NASA Astrophysics Data System (ADS)
Rosak, A.; Latry, C.; Pascal, V.; Laubier, D.
2017-11-01
Image quality specifications should aim to fulfil the high-resolution mission requirements of remote sensing satellites at minimum cost. The most important trade-off to be taken into account is between Modulation Transfer Function, radiometric noise and sampling scheme. This compromise is the main driver during design optimisation and requirement definition in order to achieve good performance and to minimise the mission cost. For the SPOT 5 satellite, a new compromise was chosen. The supermode principle of imagery (sampling at 2.5 meter with a pixel size of 5 meter) improves the resolution by a factor of four compared with the SPOT 4 satellite (10 meter resolution). This paper presents the image quality specifications of the HRG-SPOT 5 instrument. We introduce all the efforts made on the instrument to achieve good image quality and low radiometric noise, then we compare the results with the SPOT 4 instrument's performance to highlight the improvements achieved. Then, the in-orbit performance will be described. Finally, we will present the new goals of image quality specifications for the new Pleiades-HR satellite for earth observation (0.7 meter resolution) and the instrument concept.
Effort-related functions of nucleus accumbens dopamine and associated forebrain circuits.
Salamone, J D; Correa, M; Farrar, A; Mingote, S M
2007-04-01
Over the last several years, it has become apparent that there are critical problems with the hypothesis that brain dopamine (DA) systems, particularly in the nucleus accumbens, directly mediate the rewarding or primary motivational characteristics of natural stimuli such as food. Hypotheses related to DA function are undergoing a substantial restructuring, such that the classic emphasis on hedonia and primary reward is giving way to diverse lines of research that focus on aspects of instrumental learning, reward prediction, incentive motivation, and behavioral activation. The present review discusses dopaminergic involvement in behavioral activation and, in particular, emphasizes the effort-related functions of nucleus accumbens DA and associated forebrain circuitry. The effects of accumbens DA depletions on food-seeking behavior are critically dependent upon the work requirements of the task. Lever pressing schedules that have minimal work requirements are largely unaffected by accumbens DA depletions, whereas reinforcement schedules that have high work (e.g., ratio) requirements are substantially impaired by accumbens DA depletions. Moreover, interference with accumbens DA transmission exerts a powerful influence over effort-related decision making. Rats with accumbens DA depletions reallocate their instrumental behavior away from food-reinforced tasks that have high response requirements, and instead, these rats select a less-effortful type of food-seeking behavior. Along with prefrontal cortex and the amygdala, nucleus accumbens is a component of the brain circuitry regulating effort-related functions. Studies of the brain systems regulating effort-based processes may have implications for understanding drug abuse, as well as energy-related disorders such as psychomotor slowing, fatigue, or anergia in depression.
Defibaugh-Chavez, Stephanie; Douris, Aphrodite; Vetter, Danah; Atkinson, Richard; Kissler, Bonnie; Khroustalev, Allison; Robertson, Kis; Sharma, Yudhbir; Becker, Karen; Dessai, Uday; Antoine, Nisha; Allen, Latasha; Holt, Kristin; Gieraltowski, Laura; Wise, Matthew; Schwensohn, Colin
2018-01-01
On June 28, 2013, the Food Safety and Inspection Service (FSIS) was notified by the Centers for Disease Control and Prevention (CDC) of an investigation of a multistate cluster of illnesses of Salmonella enterica serovar Heidelberg. Since case-patients in the cluster reported consumption of a variety of chicken products, FSIS used a simple likelihood-based approach using traceback information to focus intensified sampling efforts. This article describes the multiphased product sampling approach taken by FSIS when epidemiologic evidence implicated chicken products from multiple establishments operating under one corporation. The objectives of sampling were to (1) assess process control of chicken slaughter and further processing and (2) determine whether outbreak strains were present in products from these implicated establishments. As part of the sample collection process, data collected by FSIS personnel to characterize product included category (whole chicken and type of chicken parts), brand, organic or conventional product, injection with salt solutions or flavorings, and whether product was skinless or skin-on. From the period September 9, 2013, through October 31, 2014, 3164 samples were taken as part of this effort. Salmonella percent positive declined from 19.7% to 5.3% during this timeframe as a result of regulatory and company efforts. The results of intensified sampling for this outbreak investigation informed an FSIS regulatory response and corrective actions taken by the implicated establishments. The company noted that a multihurdle approach to reduce Salmonella in products was taken, including on-farm efforts such as environmental testing, depopulation of affected flocks, disinfection of affected houses, vaccination, and use of various interventions within the establishments over the course of several months.
Optimizing larval assessment to support sea lamprey control in the Great Lakes
Hansen, Michael J.; Adams, Jean V.; Cuddy, Douglas W.; Richards, Jessica M.; Fodale, Michael F.; Larson, Geraldine L.; Ollila, Dale J.; Slade, Jeffrey W.; Steeves, Todd B.; Young, Robert J.; Zerrenner, Adam
2003-01-01
Elements of the larval sea lamprey (Petromyzon marinus) assessment program that most strongly influence the chemical treatment program were analyzed, including selection of streams for larval surveys, allocation of sampling effort among stream reaches, allocation of sampling effort among habitat types, estimation of daily growth rates, and estimation of metamorphosis rates, to determine how uncertainty in each element influenced the stream selection program. First, the stream selection model based on current larval assessment sampling protocol significantly underestimated transforming sea lamprey abundance, transforming sea lampreys killed, and marginal costs per sea lamprey killed, compared to a protocol that included more years of data (especially for large streams). Second, larval density in streams varied significantly with Type-I habitat area, but not with total area or reach length. Third, the ratio of larval density between Type-I and Type-II habitat varied significantly among streams, and the optimal allocation of sampling effort varied with the proportion of habitat types and variability of larval density within each habitat. Fourth, mean length varied significantly among streams and years. Last, size at metamorphosis varied more among years than within or among regions, and metamorphosis varied significantly among streams within regions. Study results indicate that: (1) the stream selection model should be used to identify streams with potentially high residual populations of larval sea lampreys; (2) larval sampling in Type-II habitat should be initiated in all streams by increasing sampling in Type-II habitat to 50% of the sampling effort in Type-I habitat; and (3) methods should be investigated to reduce uncertainty in estimates of sea lamprey production, with emphasis on those that reduce the uncertainty associated with larval length at the end of the growing season and those used to predict metamorphosis.
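The abstract does not name the allocation rule it optimized, but the classical starting point for dividing sampling effort among habitat strata is Neyman allocation, where effort in stratum h is proportional to N_h * S_h (stratum size times within-stratum standard deviation). A hedged sketch of that rule, with names of our own choosing:

```python
def neyman_allocation(total_n, stratum_sizes, stratum_sds):
    """Neyman optimal allocation of `total_n` sampling units among strata:
    n_h is proportional to N_h * S_h."""
    weights = [N * s for N, s in zip(stratum_sizes, stratum_sds)]
    total = sum(weights)
    return [total_n * w / total for w in weights]
```

Under this rule, a habitat type with more variable larval density receives proportionally more effort, which is consistent with the study's finding that the optimal split depends on within-habitat variability.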
2009-08-01
...assess the performance of remedial efforts. These techniques are expensive and, by themselves, are effectively random samples guided by the training... technology should be further explored and developed for use in pre-amendment tracer tests and quantitative remedial assessments. ...and flow of injectate. Site assessment following groundwater remediation efforts typically involves discrete point sampling using wells or...
The frequency response of dynamic friction: Enhanced rate-and-state models
NASA Astrophysics Data System (ADS)
Cabboi, A.; Putelat, T.; Woodhouse, J.
2016-07-01
The prediction and control of friction-induced vibration requires a sufficiently accurate constitutive law for dynamic friction at the sliding interface: for linearised stability analysis, this requirement takes the form of a frictional frequency response function. Systematic measurements of this frictional frequency response function are presented for small samples of nylon and polycarbonate sliding against a glass disc. Previous efforts to explain such measurements from a theoretical model have failed, but an enhanced rate-and-state model is presented which is shown to match the measurements remarkably well. The tested parameter space covers a range of normal forces (10-50 N), of sliding speeds (1-10 mm/s) and frequencies (100-2000 Hz). The key new ingredient in the model is the inclusion of contact stiffness to take into account elastic deformations near the interface. A systematic methodology is presented to discriminate among possible variants of the model, and then to identify the model parameter values.
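For reference, the classical (unenhanced) Dieterich rate-and-state law that the enhanced model builds on can be written down directly. Parameter values below are illustrative defaults, not values fitted to the nylon/polycarbonate data:

```python
import math

def rate_state_mu(v, theta, mu0=0.6, a=0.01, b=0.015, v_ref=1e-3, d_c=1e-5):
    """Classical Dieterich rate-and-state friction coefficient
    (no contact-stiffness enhancement):
    mu = mu0 + a*ln(v/v_ref) + b*ln(v_ref*theta/d_c)."""
    return mu0 + a * math.log(v / v_ref) + b * math.log(v_ref * theta / d_c)

def steady_state_theta(v, d_c=1e-5):
    """Aging-law steady state of the state variable: theta_ss = d_c / v."""
    return d_c / v
```

At steady state the law reduces to mu_ss = mu0 + (a - b)*ln(v/v_ref), so with b > a (as chosen above) friction weakens with sliding speed; the paper's enhancement adds a contact stiffness term to capture the measured frequency response on top of this baseline.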
Paula-Moraes, S; Burkness, E C; Hunt, T E; Wright, R J; Hein, G L; Hutchison, W D
2011-12-01
Striacosta albicosta (Smith) (Lepidoptera: Noctuidae) is a native pest of dry beans (Phaseolus vulgaris L.) and corn (Zea mays L.). As a result of larval feeding damage on corn ears, S. albicosta has a narrow treatment window; thus, early detection of the pest in the field is essential, and egg mass sampling has become a popular monitoring tool. Three action thresholds for field and sweet corn currently are used by crop consultants, including 4% of plants infested with egg masses on sweet corn in the silking-tasseling stage, 8% of plants infested with egg masses on field corn with approximately 95% tasseled, and 20% of plants infested with egg masses on field corn during mid-milk-stage corn. The current monitoring recommendation is to sample 20 plants at each of five locations per field (100 plants total). In an effort to develop a more cost-effective sampling plan for S. albicosta egg masses, several alternative binomial sampling plans were developed using Wald's sequential probability ratio test, and validated using Resampling for Validation of Sampling Plans (RVSP) software. The benefit-cost ratio also was calculated and used to determine the final selection of sampling plans. Based on final sampling plans selected for each action threshold, the average sample number required to reach a treat or no-treat decision ranged from 38 to 41 plants per field. This represents a significant savings in sampling cost over the current recommendation of 100 plants.
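The core of a Wald sequential plan is a running log-likelihood ratio compared against two stopping bounds. A minimal binomial sketch follows; the thresholds p0, p1 and error rates here are illustrative, not the validated plans from the study:

```python
import math

def sprt_decision(n, d, p0, p1, alpha=0.1, beta=0.1):
    """Wald SPRT for a binomial proportion after inspecting n plants,
    of which d carry egg masses. H0: infestation p0 (no-treat);
    H1: infestation p1 (treat). Returns 'treat', 'no-treat', or 'continue'."""
    llr = d * math.log(p1 / p0) + (n - d) * math.log((1 - p1) / (1 - p0))
    if llr >= math.log((1 - beta) / alpha):
        return "treat"
    if llr <= math.log(beta / (1 - alpha)):
        return "no-treat"
    return "continue"
```

Because sampling stops as soon as either bound is crossed, the expected sample size is well below a fixed-size plan's, which is how the study's plans cut the requirement from 100 plants to roughly 38 to 41.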
The Effort Paradox: Effort Is Both Costly and Valued.
Inzlicht, Michael; Shenhav, Amitai; Olivola, Christopher Y
2018-04-01
According to prominent models in cognitive psychology, neuroscience, and economics, effort (be it physical or mental) is costly: when given a choice, humans and non-human animals alike tend to avoid effort. Here, we suggest that the opposite is also true and review extensive evidence that effort can also add value. Not only can the same outcomes be more rewarding if we apply more (not less) effort, sometimes we select options precisely because they require effort. Given the increasing recognition of effort's role in motivation, cognitive control, and value-based decision-making, considering this neglected side of effort will not only improve formal computational models, but also provide clues about how to promote sustained mental effort across time.
Space Tracking and Surveillance System (STSS) Cryogenic Technology Efforts and Needs
NASA Astrophysics Data System (ADS)
Kolb, I. L.; Curran, D. G. T.; Lee, C. S.
2004-06-01
The Missile Defense Agency's (MDA) STSS program, the former Space Based Infrared System (SBIRS) Low, has been actively supporting and working to advance space-borne cryocooler technology through efforts with the Air Force Research Laboratory (AFRL) and the Small Business Innovation Research (SBIR) program. The envisioned infrared satellite system requires high-efficiency, low-power, and low-weight cooling across a range of temperatures and cooling loads below 120 K for reliable 10-year operation to meet mission needs. This paper describes cryocooler efforts previously and currently supported by STSS and the possible future cryogenic requirements for later technology insertion.
NASA Astrophysics Data System (ADS)
Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed
2016-08-01
It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, and therefore conflict detection and resolution effort is crucial. This paper presents a conceptual model to further improve current efforts: an improved conceptual model to assist conflict detection and resolution that extends the ability of existing models and improves overall performance. The significance of the new model is that it enables automated detection of conflicts and their severity levels through rule-based reasoning.
Transforming controversy into consensus: the Steens Mountain initiative
Steven W. Anderson
1995-01-01
Even bitterly disputed management issues can be tempered or eliminated. Agency outreach efforts in conjunction with the media, working groups, affected interests, field trips, and "open house" social events can result in unified management efforts. In addition, distortions or misconceptions can be clarified. Recurrent efforts are required to build good...
The Mental Effort-Reward Imbalances Model and Its Implications for Behaviour Management
ERIC Educational Resources Information Center
Poulton, Alison; Whale, Samina; Robinson, Joanne
2016-01-01
Attention deficit hyperactivity disorder (ADHD) is frequently associated with oppositional defiant disorder (ODD). The Mental Effort Reward Imbalances Model (MERIM) explains this observational association as follows: in ADHD a disproportionate level of mental effort is required for sustaining concentration for achievement; in ODD the subjective…
Motivational Scaffolding, Politeness, and Writing Center Tutoring
ERIC Educational Resources Information Center
Mackiewicz, Jo; Thompson, Isabelle
2013-01-01
Writing center tutors know that improving writing skills requires sustained effort over a long period of time. They also know that motivation--the drive to actively invest in sustained effort toward a goal--is essential for writing improvement. Because motivation can direct attention toward particular tasks and increase both effort and…
Rasmussen, Teresa J.; Paxson, Chelsea R.
2017-08-25
Municipalities in Johnson County in northeastern Kansas are required to implement stormwater management programs to reduce pollutant discharges, protect water quality, and comply with applicable water-quality regulations in accordance with National Pollutant Discharge Elimination System permits for stormwater discharge. To this end, municipalities collect grab samples at streams entering and leaving their jurisdiction to determine levels of excessive nutrients, sediment, and fecal bacteria to characterize pollutants and understand the factors affecting them. In 2014, the U.S. Geological Survey and the Johnson County Stormwater Management Program, with input from the Kansas Department of Health and Environment, initiated a 5-year monitoring program to satisfy minimum sampling requirements for each municipality as described by new stormwater permits issued to Johnson County municipalities. The purpose of this report is to provide a preliminary assessment of the monitoring program. The monitoring program is described, a preliminary assessment of the monitoring program design is provided using water-quality data collected during the first 2 years of the program, and the ability of the current monitoring network and sampling plan to provide data sufficient to quantify improvements in water quality resulting from implemented and planned best management practices is evaluated. The information in this initial report may be used to evaluate changes in data collection methods while data collection is still ongoing that may lead to improved data utility. Discrete water-quality samples were collected at 27 sites and analyzed for nutrients, Escherichia coli (E. coli) bacteria, total suspended solids, and suspended-sediment concentration.
In addition, continuous water-quality data (water temperature, pH, dissolved oxygen, specific conductance, turbidity, and nitrate plus nitrite) were collected at one site to characterize variability and provide a basis for comparison to discrete data. Base flow samples indicated that point sources are likely affecting nutrient concentrations and E. coli bacteria densities at several sites. Concentrations of all analytes in storm runoff samples were characterized by substantial variability among sites and samples. About one-half of the sites, representing different watersheds, had storm runoff samples with nitrogen concentrations greater than 10 milligrams per liter. About one-third of the sites, representing different watersheds, had storm runoff samples with total phosphorus concentrations greater than 3 milligrams per liter. Six sites had samples with E. coli densities greater than 100,000 colonies per 100 milliliters of water. Total suspended solids concentrations of about 12,000 milligrams per liter or greater occurred in samples from three sites. Data collected for this monitoring program may be useful for some general assessment purposes but may also be limited in potential to fully inform stormwater management activities. Valuable attributes of the monitoring program design included incorporating many sites across the county for comparisons among watersheds and municipalities, using fixed-stage samplers to collect multiple samples during single events, collection of base flow samples in addition to storm samples to isolate possible point sources from stormwater sources, and use of continuous monitors to characterize variability. Limiting attributes of the monitoring program design included location of monitoring sites along municipal boundaries to satisfy permit requirements rather than using watershed-based criteria such as locations of tributaries, potential pollutant sources, and implemented management practices.
Additional limiting attributes include having a large number of widespread sampling locations, which presented logistical challenges for predicting localized rainfall and collecting and analyzing samples during short timeframes associated with storms, and collecting storm samples at fixed-stage elevations only during the rising limb of storms, which does not characterize conditions over the storm hydrograph. The small number of samples collected per site resulted in a sample size too small to be representative of site conditions, including seasonal and hydrologic variability, and insufficient for meaningful statistical analysis or site-specific modeling. Several measures could be taken to improve data utility, including redesigning the monitoring network according to watershed characteristics, incorporating a nested design in which data are collected at different scales (watershed, subwatershed, and best management practices), increasing sampling frequency, and combining different methods to allow for flexibility to focus on areas and conditions of particular interest. A monitoring design that would facilitate most of these improvements would be to focus efforts on a limited number of watersheds for several years, then cycle to the next set of watersheds for several years, eventually returning to previously monitored watersheds to document changes. Redesign of the water-quality monitoring program requires considerable effort and commitment from municipalities of Johnson County. However, the long-term benefit likely is a monitoring program that results in improved stream conditions, more effective management practices, and efficient expenditure of resources.
Highly Conducting Molecular Crystals.
NASA Astrophysics Data System (ADS)
Whitehead, Roger James
As the result of a wide-ranging effort towards the preparation of new electrically conducting molecular crystals, high-quality samples were prepared of the organic radical-ion salt (TMTSF)_2SbCl_2F_4 {bis-tetramethyltetraselenafulvalene dichlorotetrafluoroantimonate(V)}. A collaborative effort to investigate the electronic and structural properties of this material has yielded the depth of information required to give a satisfactory understanding of its rather complicated behaviour. The combination of X-ray structural studies with d.c. transport, reflectance, and magnetic measurements has served to underline the importance of crystalline perfection, electronic dimensionality, and conduction-electron correlation in determining the material's overall behaviour. This thesis describes the method of preparation and characterization of (TMTSF)_2SbCl_2F_4 and the experimental arrangements used to determine the temperature dependence of its ambient-pressure electrical conductivity, thermopower, and electron spin resonance spectra. The crystal structure and optical reflectance measurements at room temperature are also presented. The results of a study of the low-temperature diffraction pattern are described, along with the temperature dependence of the static magnetic susceptibility and of the conductivity under elevated hydrostatic pressures. These findings are rationalized by reference to other materials that show similar behaviour in their electronic and/or structural properties, and to the various theoretical models currently enjoying favour.
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to continue only the verification efforts.
Subsurface Sample Acquisition and Transfer Systems (SSATS)
NASA Astrophysics Data System (ADS)
Rafeek, S.; Gorevan, S. P.; Kong, K. Y.
2001-01-01
In the exploration of planets and small bodies, scientists will need the services of a deep drilling and material-handling system not only to obtain the samples necessary for analyses but also to precisely transfer and deposit those samples in in-situ instruments on board a landed craft or rover. The technology for such a deep sampling system as the SSATS is currently being developed by Honeybee Robotics through a PIDDP effort. The SSATS has its foundation in a one-meter prototype (SATM) drill that was developed under the New Millennium Program for ST4/Champollion. Additionally, the SSATS includes relevant coring technology from a coring drill (Athena Mini-Corer) developed for the Mars Sample Return mission. These highly developed technologies, along with the current PIDDP effort, are combined to produce a sampling system that can acquire and transfer samples from various depths. Additional information is contained in the original extended abstract.
NASA Technical Reports Server (NTRS)
Hinkel, Heather; Cryan, Scott; Zipay, John; Strube, Matthew
2015-01-01
This paper will describe the technology development efforts NASA has underway for Automated Rendezvous and Docking/Capture (AR&D/C) sensors and a docking mechanism, and the challenges involved. The paper will additionally address how these technologies will be extended to other missions requiring AR&D/C, whether robotic or manned. NASA needs AR&D/C sensors for both the robotic and crewed segments of the Asteroid Redirect Mission (ARM). NASA recently conducted a commonality assessment of the concept of operations for the robotic Asteroid Redirect Vehicle (ARV) and the crewed mission segment using the Orion crew vehicle. The commonality assessment also considered several future exploration and science missions requiring an AR&D/C capability. Missions considered were asteroid sample return, satellite servicing, and planetary entry, descent, and landing. This assessment determined that a common sensor suite, consisting of one or more visible-wavelength cameras and a three-dimensional LIDAR along with long-wavelength infrared cameras for robustness and situational awareness, could be used on each mission to eliminate the cost of multiple sensor developments and qualifications. By choosing sensor parameters at build time instead of at design time, and without having to requalify flight hardware, a specific mission can design overlapping bearing, range, relative attitude, and position measurement availability to suit its mission requirements with minimal nonrecurring engineering costs. The resulting common sensor specification provides the union of all performance requirements for each mission and represents an improvement over the current systems used for AR&D/C today. These sensor specifications are tightly coupled to the docking system capabilities and requirements for final docking conditions. The paper will describe NASA's efforts to develop a standard docking system for use across NASA human spaceflight missions to multiple destinations.
It will describe the current design status and the considerations and technologies involved in developing this docking mechanism.
International Space Station Medical Projects - Full Services to Mars
NASA Technical Reports Server (NTRS)
Pietrzyk, R. A.; Primeaux, L. L.; Wood, S. J.; Vessay, W. B.; Platts, S. H.
2018-01-01
The International Space Station Medical Projects (ISSMP) Element provides planning, integration, and implementation services for HRP research studies for both spaceflight and flight analog research. Through the implementation of these two efforts, ISSMP offers an innovative way of guiding research decisions to meet the unique challenges of understanding the human risks of space exploration. Flight services provided by ISSMP include leading informed consent briefings, developing and validating in-flight crew procedures, providing ISS crew and ground-controller training, real-time experiment monitoring, on-orbit experiment and hardware operations, and facilitating data transfer to investigators. For analog studies at the NASA Human Exploration Research Analog (HERA), the ISSMP team provides subject recruitment and screening, science requirements integration, data collection schedules, data sharing agreements, mission scenarios, and facilities to support investigators. The ISSMP also serves as the HRP interface to external analog providers, including the :envihab bed rest facility (Cologne, Germany), the NEK isolation chamber (Moscow, Russia), and the Antarctica research stations. Working in either spaceflight or analog environments requires a coordinated effort between NASA and the investigators. The interdisciplinary nature of both flight and analog research requires investigators to be aware of concurrent research studies and to take into account potential confounding factors that may impact their research objectives. Investigators must define clear research requirements, participate in Investigator Working Group meetings, obtain human use approvals, and provide study-specific training, sample and data collection, and procedures, all while adhering to schedule deadlines. These science requirements define the technical, functional, and performance operations needed to meet the research objectives.
The ISSMP maintains an expert team of professionals with the knowledge and experience to guide investigators' science through all aspects of mission planning, crew operations, and research integration. During this session, the ISSMP team will discuss best-practice approaches for successfully preparing and conducting studies in both the flight and analog environments. Critical tips and tricks will be shown that greatly improve your chances of successfully completing your research aboard the International Space Station and in spaceflight analogs.
Evaluating the efficiency of environmental monitoring programs
Levine, Carrie R.; Yanai, Ruth D.; Lampman, Gregory G.; Burns, Douglas A.; Driscoll, Charles T.; Lawrence, Gregory B.; Lynch, Jason; Schoch, Nina
2014-01-01
Statistical uncertainty analyses can be used to improve the efficiency of environmental monitoring, allowing sampling designs to maximize information gained relative to resources required for data collection and analysis. In this paper, we illustrate four methods of data analysis appropriate to four types of environmental monitoring designs. To analyze a long-term record from a single site, we applied a general linear model to weekly stream chemistry data at Biscuit Brook, NY, to simulate the effects of reducing sampling effort and to evaluate statistical confidence in the detection of change over time. To illustrate a detectable difference analysis, we analyzed a one-time survey of mercury concentrations in loon tissues in lakes in the Adirondack Park, NY, demonstrating the effects of sampling intensity on statistical power and the selection of a resampling interval. To illustrate a bootstrapping method, we analyzed the plot-level sampling intensity of forest inventory at the Hubbard Brook Experimental Forest, NH, to quantify the sampling regime needed to achieve a desired confidence interval. Finally, to analyze time-series data from multiple sites, we assessed the number of lakes and the number of samples per year needed to monitor change over time in Adirondack lake chemistry using a repeated-measures mixed-effects model. Evaluations of time series and synoptic long-term monitoring data can help determine whether sampling should be re-allocated in space or time to optimize the use of financial and human resources.
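The bootstrapping method described for the Hubbard Brook forest inventory, resampling measured plots to see how confidence-interval width shrinks as more plots are sampled, can be sketched as follows. The plot values, sample sizes, and distribution parameters below are synthetic stand-ins, not the study's data:

```python
import random
import statistics

def bootstrap_halfwidth(plots, n, reps=2000, rng=None):
    """Approximate the 95% CI half-width of the mean when only n of the
    inventory plots are measured, via bootstrap resampling with replacement."""
    rng = rng or random.Random()
    means = []
    for _ in range(reps):
        sample = [rng.choice(plots) for _ in range(n)]
        means.append(statistics.fmean(sample))
    means.sort()
    lo = means[int(0.025 * reps)]   # 2.5th percentile of bootstrap means
    hi = means[int(0.975 * reps)]   # 97.5th percentile
    return (hi - lo) / 2

# Hypothetical plot-level biomass values (Mg/ha) standing in for real
# inventory data: 400 plots drawn from a seeded normal distribution.
pop_rng = random.Random(42)
plots = [pop_rng.gauss(180, 40) for _ in range(400)]

# Half-width of the CI at three candidate sampling intensities.
widths = {n: bootstrap_halfwidth(plots, n, rng=random.Random(1))
          for n in (10, 25, 100)}
```

Plotting such half-widths against n makes the diminishing return of extra plots explicit, which is the basis for choosing a sampling regime that achieves a desired confidence interval at least cost.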
Thompson, William L.; Miller, Amy E.; Mortenson, Dorothy C.; Woodward, Andrea
2011-01-01
Monitoring natural resources in Alaskan national parks is challenging because of their remoteness, limited accessibility, and high sampling costs. We describe an iterative, three-phased process for developing sampling designs based on our efforts to establish a vegetation monitoring program in southwest Alaska. In the first phase, we defined a sampling frame based on land ownership and specific vegetated habitats within the park boundaries and used Path Distance analysis tools to create a GIS layer that delineated portions of each park that could be feasibly accessed for ground sampling. In the second phase, we used simulations based on landcover maps to identify size and configuration of the ground sampling units (single plots or grids of plots) and to refine areas to be potentially sampled. In the third phase, we used a second set of simulations to estimate sample size and sampling frequency required to have a reasonable chance of detecting a minimum trend in vegetation cover for a specified time period and level of statistical confidence. Results of the first set of simulations indicated that a spatially balanced random sample of single plots from the most common landcover types yielded the most efficient sampling scheme. Results of the second set of simulations were compared with field data and indicated that we should be able to detect at least a 25% change in vegetation attributes over 31 years by sampling 8 or more plots per year every five years in focal landcover types. This approach would be especially useful in situations where ground sampling is restricted by access.
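The second set of simulations, estimating the chance of detecting a 25% change over 31 years when 8 plots are sampled every five years, amounts to a Monte Carlo power calculation. A minimal sketch follows; the starting cover, noise level, and the simple two-standard-error detection rule are illustrative assumptions, not the authors' exact procedure:

```python
import random
import statistics

def fit_slope(xs, ys):
    """Ordinary least-squares slope and its standard error."""
    n = len(xs)
    xbar, ybar = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(xs, ys)]
    se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5
    return slope, se

def trend_power(total_change, n_plots=8, noise_sd=10.0, reps=500, seed=0):
    """Fraction of simulated monitoring records (sampling years 0, 5, ..., 30
    with n_plots plots per occasion) in which a declining cover trend is
    detected, i.e. the fitted slope lies more than ~2 standard errors
    below zero."""
    rng = random.Random(seed)
    start_cover = 60.0                              # assumed % cover at year 0
    slope_true = total_change * start_cover / 30.0  # e.g. -25% spread over 30 yr
    years = [y for y in range(0, 31, 5) for _ in range(n_plots)]
    hits = 0
    for _ in range(reps):
        ys = [start_cover + slope_true * y + rng.gauss(0, noise_sd)
              for y in years]
        slope, se = fit_slope(years, ys)
        if slope + 2 * se < 0:
            hits += 1
    return hits / reps
```

Running the same simulation with `total_change = 0.0` estimates the false-positive rate of the detection rule, so the two numbers together characterize the design in the same way the abstract's simulations do.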
Using habitat suitability models to target invasive plant species surveys.
Crall, Alycia W; Jarnevich, Catherine S; Panke, Brendon; Young, Nick; Renz, Mark; Morisette, Jeffrey
2013-01-01
Managers need new tools for detecting the movement and spread of nonnative, invasive species. Habitat suitability models are a popular tool for mapping the potential distribution of current invaders, but the ability of these models to prioritize monitoring efforts has not been tested in the field. We tested the utility of an iterative sampling design (i.e., models based on field observations used to guide subsequent field data collection to improve the model), hypothesizing that model performance would increase when new data were gathered from targeted sampling using criteria based on the initial model results. We also tested the ability of habitat suitability models to predict the spread of invasive species, hypothesizing that models would accurately predict occurrences in the field, and that the use of targeted sampling would detect more species with less sampling effort than a nontargeted approach. We tested these hypotheses on two species at the state scale (Centaurea stoebe and Pastinaca sativa) in Wisconsin (USA), and one genus at the regional scale (Tamarix) in the western United States. These initial data were merged with environmental data at 30-m2 resolution for Wisconsin and 1-km2 resolution for the western United States to produce our first iteration models. We stratified these initial models to target field sampling and compared our models and success at detecting our species of interest to other surveys being conducted during the same field season (i.e., nontargeted sampling). Although more data did not always improve our models based on correct classification rate (CCR), sensitivity, specificity, kappa, or area under the curve (AUC), our models generated from targeted sampling data always performed better than models generated from nontargeted data. 
For Wisconsin species, the model described actual locations in the field fairly well (kappa = 0.51, 0.19, P < 0.01), and targeted sampling did detect more species than nontargeted sampling with less sampling effort (chi2 = 47.42, P < 0.01). From these findings, we conclude that habitat suitability models can be highly useful tools for guiding invasive species monitoring, and we support the use of an iterative sampling design for guiding such efforts.
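The kappa statistics reported above measure how well model predictions agreed with field observations beyond what chance alone would produce. A minimal sketch of Cohen's kappa for a presence/absence confusion matrix (the counts in the test are hypothetical, not the study's):

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa for a 2x2 confusion matrix: predicted presence/absence
    (rows) against observed presence/absence (columns)."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                          # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)   # chance agreement on presence
    p_no = ((fn + tn) / n) * ((fp + tn) / n)    # chance agreement on absence
    pe = p_yes + p_no                           # total chance agreement
    return (po - pe) / (1 - pe)
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, which is why values such as 0.51 versus 0.19 support ranking one model's field performance above another's.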
Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola
2015-01-01
One of the most crucial aspects of day-to-day laboratory information management is the collection, storage, and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time, and technical effort. There is a clear need for a lightweight open-source system that can easily be managed on local servers and handled by individual researchers. Here we present software named SaDA for storing, retrieving, and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data, and the subsequent downstream analysis procedures. It is simple, generic software that can be extended and customized for various environmental and biomedical studies. PMID:26047146
Monitoring biological aerosols using UV fluorescence
NASA Astrophysics Data System (ADS)
Eversole, Jay D.; Roselle, Dominick; Seaver, Mark E.
1999-01-01
An apparatus has been designed and constructed to continuously monitor the number density, size, and fluorescent emission of ambient aerosol particles. Applying fluorescence to biological particles suspended in the atmosphere requires laser excitation in the UV spectral region. In this study, a Nd:YAG laser is quadrupled to provide a 266 nm wavelength to excite emission from single micrometer-sized particles in air. Fluorescent emission is used to continuously identify aerosol particles of biological origin. For calibration, biological samples of Bacillus subtilis spores and vegetative cells, and Escherichia coli, Bacillus thuringiensis, and Erwinia herbicola vegetative cells, were prepared as suspensions in water and nebulized to produce aerosols. Detection of single aerosol particles provides an elastic scattering response as well as fluorescent emission in two spectral bands simultaneously. Our efforts have focused on empirical characterization of the emission and scattering characteristics of various bacterial samples to determine the feasibility of optical discrimination between different cell types. Preliminary spectroscopic evidence suggests that different samples can be distinguished as separate bio-aerosol groups. In addition to controlled sample results, we will also discuss the most recent results on the effectiveness of detecting outdoor releases and on variations in environmental backgrounds.
Propulsion Technology Development for Sample Return Missions Under NASA's ISPT Program
NASA Technical Reports Server (NTRS)
Anderson, David J.; Pencil, Eric J.; Vento, Daniel; Dankanich, John W.; Munk, Michelle M.; Hahne, David
2011-01-01
The In-Space Propulsion Technology (ISPT) Program was tasked in 2009 to start development of propulsion technologies that would enable future sample return missions. Sample return missions could be quite varied, from collecting and bringing back samples of comets or asteroids, to soil, rocks, or atmosphere from planets or moons. This paper describes the ISPT Program's propulsion technology development activities relevant to future sample return missions. The sample return propulsion technology development areas for ISPT are: 1) Sample Return Propulsion (SRP), 2) Planetary Ascent Vehicles (PAV), 3) Entry Vehicle Technologies (EVT), and 4) Systems/mission analysis and tools that focus on sample return propulsion. The Sample Return Propulsion area is subdivided into: a) Electric propulsion for sample return and low-cost Discovery-class missions, b) Propulsion systems for Earth Return Vehicles (ERV), including transfer stages to the destination, and c) Low-TRL advanced propulsion technologies. The SRP effort will continue work on HIVHAC thruster development in FY2011 and then transition into developing a HIVHAC system under future Electric Propulsion for sample return (ERV and transfer stages) and low-cost missions. Previous work on lightweight propellant tanks will continue under advanced propulsion technologies for sample return, with direct applicability to a Mars Sample Return (MSR) mission and general applicability to all future planetary spacecraft. A major effort under the EVT area is multi-mission technologies for Earth Entry Vehicles (MMEEV), which will leverage and build upon previous work related to Earth Entry Vehicles (EEV). The major effort under the PAV area is the Mars Ascent Vehicle (MAV). The MAV is a new development area to ISPT, and builds upon and leverages the past MAV analysis and technology developments from the Mars Technology Program (MTP) and previous MSR studies.
Time-Lapse Electrical Geophysical Monitoring of Amendment-Based Biostimulation.
Johnson, Timothy C; Versteeg, Roelof J; Day-Lewis, Frederick D; Major, William; Lane, John W
2015-01-01
Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. 
Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
NASA Astrophysics Data System (ADS)
Filippa, Gianluca; Cremonese, Edoardo; Galvagno, Marta; Migliavacca, Mirco; Morra di Cella, Umberto; Petey, Martina; Siniscalco, Consolata
2015-12-01
The increasingly important effect of climate change and extremes on alpine phenology highlights the need to establish accurate monitoring methods to track inter-annual variation (IAV) and long-term trends in plant phenology. We evaluated four different indices of phenological development (two for plant productivity, i.e., green biomass and leaf area index; two for plant greenness, i.e., greenness from visual inspection and from digital images) from a 5-year monitoring of ecosystem phenology, here defined as the seasonal development of the grassland canopy, in a subalpine grassland site (NW Alps). Our aim was to establish an effective observation strategy that enables the detection of shifts in grassland phenology in response to climate trends and meteorological extremes. The seasonal development of the vegetation at this site appears strongly controlled by snowmelt mostly in its first stages and to a lesser extent in the overall development trajectory. All indices were able to detect an anomalous beginning of the growing season in 2011 due to an exceptionally early snowmelt, whereas only some of them revealed a later beginning of the growing season in 2013 due to a late snowmelt. A method is developed to derive the number of samples that maximise the trade-off between sampling effort and accuracy in IAV detection in the context of long-term phenology monitoring programmes. Results show that spring phenology requires a smaller number of samples than autumn phenology to track a given target of IAV. Additionally, productivity indices (leaf area index and green biomass) have a higher sampling requirement than greenness derived from visual estimation and from the analysis of digital images. Of the latter two, the analysis of digital images stands out as the more effective, rapid and objective method to detect IAV in vegetation development.
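The sampling-effort/accuracy trade-off the authors formalise can be illustrated with the standard large-sample calculation for the number of samples needed to estimate a mean phenological metric within a chosen margin of error. The variance and precision figures below are invented placeholders, not values from the study:

```python
import math

def n_required(sigma, margin, z=1.96):
    """Samples needed so the 95% CI half-width on a mean is <= margin.

    sigma:  between-sample standard deviation of the metric
    margin: target half-width of the confidence interval
    z:      normal quantile for the chosen confidence level
    """
    return math.ceil((z * sigma / margin) ** 2)

# Hypothetical example: between-plot SD of a green-up date of 6 days,
# and a target precision of +/- 2 days on the plot mean.
print(n_required(sigma=6.0, margin=2.0))  # 35
```

In this framing, the result that autumn phenology needs more samples than spring phenology corresponds to autumn metrics having a larger between-sample sigma for the same target margin.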
Precision of channel catfish catch estimates using hoop nets in larger Oklahoma reservoirs
Stewart, David R.; Long, James M.
2012-01-01
Hoop nets are rapidly becoming the preferred gear type used to sample channel catfish Ictalurus punctatus, and many managers have reported that hoop nets effectively sample channel catfish in small impoundments (<200 ha). However, the utility and precision of this approach in larger impoundments have not been tested. We sought to determine how the number of tandem hoop net series affected the catch of channel catfish and the time involved in using 16 tandem hoop net series in larger impoundments (>200 ha). Hoop net series were fished once and set for 3 d; we then used Monte Carlo bootstrapping techniques to estimate the number of net series required to achieve two levels of precision (relative standard errors [RSEs] of 15 and 25) at two levels of confidence (80% and 95%). Sixteen hoop net series were effective at obtaining an RSE of 25 with 80% and 95% confidence in all but one reservoir. Achieving an RSE of 15 was often less feasible and required 18-96 hoop net series, depending on the desired level of confidence. We estimated that an hour was needed, on average, to deploy and retrieve three hoop net series, which meant that all 16 hoop net series in a reservoir could be set within one day and retrieved within another. The estimated number of net series needed to achieve an RSE of 25 or 15 was positively associated with the coefficient of variation (CV) of the sample but not with reservoir surface area or relative abundance. Our results suggest that hoop nets are capable of providing reasonably precise estimates of channel catfish relative abundance and that the relationship with the CV of the sample reported herein can be used to determine the sampling effort for a desired level of precision.
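The bootstrap logic described above can be sketched as follows. The catch data are simulated placeholders, and RSE is taken as 100·SE/mean, a common convention; the paper's exact resampling scheme may differ in detail:

```python
import random
import statistics

def rse(counts):
    """Relative standard error (%) of the mean catch per net series."""
    mean = statistics.mean(counts)
    se = statistics.stdev(counts) / len(counts) ** 0.5
    return 100 * se / mean

def series_needed(catches, target_rse, n_boot=2000, conf=0.80, seed=1):
    """Smallest number of net series whose bootstrapped RSE meets the
    target in at least a fraction `conf` of resamples (Monte Carlo
    bootstrap with replacement from the observed catches)."""
    rng = random.Random(seed)
    for n in range(3, 200):
        hits = 0
        for _ in range(n_boot):
            resample = [rng.choice(catches) for _ in range(n)]
            if rse(resample) <= target_rse:
                hits += 1
        if hits / n_boot >= conf:
            return n
    return None  # target not reachable within 200 series

# Hypothetical catfish catches from 16 tandem hoop-net series.
catches = [12, 30, 7, 22, 15, 41, 9, 18, 26, 3, 33, 14, 20, 11, 28, 16]
print(series_needed(catches, target_rse=25))
```

Because the required effort scales with the CV of the catches, noisier reservoirs push the answer upward, which mirrors the paper's finding that the CV, not reservoir size, drove sample size requirements.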
Nahmias, Jeffry; Brakenridge, Scott; Jawa, Randeep S; Holena, Daniel N; Agapian, John Varujan; Bruns, Brandon; Chestovich, Paul J; Chung, Bruce; Nguyen, Jonathan; Schulman, Carl I; Staudenmayer, Kristan; Dixon, Rachel; Smith, Jason W; Bernard, Andrew C; Pascual, Jose L
2018-01-01
Oversight of human subject research has evolved considerably since its inception. However, previous studies identified a lack of consistency in institutional review board (IRB) determinations of the type of review required and whether informed consent is necessary, especially for prospective observational studies, which pose minimal risk of harm. We hypothesized that there is significant inter-institution variation in IRB requirements for the type of review and the necessity of informed consent, especially for prospective observational trials without blood/tissue utilization. We also sought to describe investigators' and IRB members' attitudes toward the type of review and need for consent. Eastern Association for the Surgery of Trauma (EAST) and IRB members were sent an electronic survey on IRB review and informed consent requirements. We performed descriptive analyses as well as Fisher's exact test to determine differences between EAST and IRB members' responses. The response rate for EAST members from 113 institutions was 13.5%, whereas a convenience sample of IRB members from 14 institutions had a response rate of 64.4%. Requirement for full IRB review for retrospective studies using patient identifiers was reported by no IRB members compared with 13.1% of EAST members (p=0.05). Regarding prospective observational trials without blood/tissue collection, 48.1% of EAST members reported their institutions required a full IRB review compared with 9.5% of IRB members (p=0.01). For prospective observational trials with blood/tissue collection, 80% of EAST members indicated a requirement to submit a full IRB review compared with only 13.6% of IRB members (p<0.001). Most EAST members (78.6%) stated that informed consent is not ethically necessary in prospective observational trials without blood/tissue collection, whereas most IRB members thought that informed consent was ethically necessary (63.6%, p<0.001). 
There is significant variation in perception and practice regarding the level of review required for prospective observational studies and whether informed consent is necessary. We recommend interdisciplinary efforts between researchers and IRBs to better standardize local IRB practice. Level of evidence IV. PMID:29862323
Hinz, Andreas; Zenger, Markus; Brähler, Elmar; Spitzer, Silvia; Scheuch, Klaus; Seibt, Reingard
2016-08-01
High degrees of premature retirement among teachers warrant investigating the occupational burden and the mental health status of this profession. A sample of 1074 German teachers participated in this study. Two samples of the general population (N = 824 and N = 792) were used as comparison groups. Work distress was assessed with the Effort-Reward Imbalance questionnaire, and mental health problems were measured with the General Health Questionnaire (GHQ-12). Teachers reported more effort-reward imbalance (M = 0.64) than the general population (M = 0.57), and they perceived more mental health problems (GHQ: M = 12.1) than the comparison group (M = 9.5). School type was not associated with work stress or mental health. Teachers with leading functions perceived high degrees of both effort and reward, resulting in a moderate effort-reward ratio and no heightened mental health problems. Teachers working full time reported more effort than teachers working part time, but the reward mean values of both groups were similar, resulting in a somewhat unfavourable effort-reward ratio for teachers working full time. Moreover, teachers working full time reported more mental health problems. The results support the appropriateness of the effort-reward conception as applied to the teaching profession. The high degree of effort-reward imbalance and the level of mental health problems warrant preventive measures. Copyright © 2014 John Wiley & Sons, Ltd.
Methods development for total organic carbon accountability
NASA Technical Reports Server (NTRS)
Benson, Brian L.; Kilgore, Melvin V., Jr.
1991-01-01
This report describes the efforts completed during the contract period beginning November 1, 1990 and ending April 30, 1991. Samples of product hygiene and potable water from WRT 3A were supplied by NASA/MSFC prior to contract award on July 24, 1990. Humidity condensate samples were supplied on August 3, 1990. During the course of this contract, chemical analyses were performed on these samples to qualitatively determine the specific components comprising the measured organic carbon concentration. In addition, these samples and known standard solutions were used to identify and develop methodology useful for future comprehensive characterization of similar samples. Standard analyses including pH, conductivity, and total organic carbon (TOC) were conducted. Colorimetric and enzyme-linked assays for total protein, bile acid, β-hydroxybutyric acid, methylene blue active substances (MBAS), urea nitrogen, ammonia, and glucose were also performed. Gas chromatographic procedures for non-volatile fatty acids and EPA priority pollutants were also performed. Liquid chromatography was used to screen for non-volatile, water-soluble compounds not amenable to GC techniques. Methods development efforts were initiated to separate and quantitate certain chemical classes not classically analyzed in water and wastewater samples, including carbohydrates, organic acids, and amino acids. Finally, efforts were initiated to identify useful concentration techniques to enhance detection limits and recovery of non-volatile, water-soluble compounds.
Microbial ecology measurement system
NASA Technical Reports Server (NTRS)
1972-01-01
The sensitivity and potential rapidity of the PIA test that was demonstrated during the feasibility study warranted continuing the effort to examine the possibility of adapting this test to an automated procedure that could be used during manned missions. The effort during this program has optimized the test conditions for two important respiratory pathogens, influenza virus and Mycoplasma pneumoniae, developed a laboratory model automated detection system, and investigated a group antigen concept for virus detection. Preliminary tests on the handling of oropharyngeal clinical samples for PIA testing were performed using the adenovirus system. The results obtained indicated that the PIA signal is reduced in positive samples and is increased in negative samples. Treatment with cysteine appeared to reduce nonspecific agglutination in negative samples but did not maintain the signal in positive samples.
AQME: A forensic mitochondrial DNA analysis tool for next-generation sequencing data.
Sturk-Andreaggi, Kimberly; Peck, Michelle A; Boysen, Cecilie; Dekker, Patrick; McMahon, Timothy P; Marshall, Charla K
2017-11-01
The feasibility of generating mitochondrial DNA (mtDNA) data has expanded considerably with the advent of next-generation sequencing (NGS), specifically in the generation of entire mtDNA genome (mitogenome) sequences. However, the analysis of these data has emerged as the greatest challenge to implementation in forensics. To address this need, a custom toolkit for use in the CLC Genomics Workbench (QIAGEN, Hilden, Germany) was developed through a collaborative effort between the Armed Forces Medical Examiner System - Armed Forces DNA Identification Laboratory (AFMES-AFDIL) and QIAGEN Bioinformatics. The AFDIL-QIAGEN mtDNA Expert, or AQME, generates an editable mtDNA profile that employs forensic conventions and includes the interpretation range required for mtDNA data reporting. AQME also integrates an mtDNA haplogroup estimate into the analysis workflow, which provides the analyst with phylogenetic nomenclature guidance and a profile quality check without the use of an external tool. Supplemental AQME outputs such as nucleotide-per-position metrics, configurable export files, and an audit trail are produced to assist the analyst during review. AQME is applied to standard CLC outputs and thus can be incorporated into any mtDNA bioinformatics pipeline within CLC regardless of sample type, library preparation or NGS platform. An evaluation of AQME was performed to demonstrate its functionality and reliability for the analysis of mitogenome NGS data. The study analyzed Illumina mitogenome data from 21 samples (including associated controls) of varying quality and sample preparations with the AQME toolkit. A total of 211 tool edits were automatically applied to 130 of the 698 total variants reported in an effort to adhere to forensic nomenclature. Although additional manual edits were required for three samples, supplemental tools such as mtDNA haplogroup estimation assisted in identifying and guiding these necessary modifications to the AQME-generated profile. 
Along with profile generation, AQME reported accurate haplogroups for 18 of the 19 samples analyzed. The single errant haplogroup assignment, although phylogenetically close, identified a bug that only affects partial mitogenome data. Future adjustments to AQME's haplogrouping tool will address this bug as well as enhance the overall scoring strategy to better refine and automate haplogroup assignments. As NGS enables broader use of the mtDNA locus in forensics, the availability of AQME and other forensic-focused mtDNA analysis tools will ease the transition and further support mitogenome analysis within routine casework. Toward this end, the AFMES-AFDIL has utilized the AQME toolbox in conjunction with the CLC Genomics Workbench to successfully validate and implement two NGS mitogenome methods. Copyright © 2017 Elsevier B.V. All rights reserved.
Crespo, Bibiana G; Wallhead, Philip J; Logares, Ramiro; Pedrós-Alió, Carlos
2016-01-01
High-throughput sequencing (HTS) techniques have suggested the existence of a wealth of species with very low relative abundance: the rare biosphere. We attempted to exhaustively map this rare biosphere in two water samples by performing an exceptionally deep pyrosequencing analysis (~500,000 final reads per sample). Species data were derived by a 97% identity criterion and various parametric distributions were fitted to the observed counts. Using the best-fitting Sichel distribution we estimate a total species richness of 1,568-1,669 (95% Credible Interval) and 5,027-5,196 for surface and deep water samples respectively, implying that 84-89% of the total richness in those two samples was sequenced, and we predict that a quadrupling of the present sequencing effort would suffice to observe 90% of the total richness in both samples. Comparing the HTS results with a culturing approach we found that most of the cultured taxa were not obtained by HTS, despite the high sequencing effort. Culturing therefore remains a useful tool for uncovering marine bacterial diversity, in addition to its other uses for studying the ecology of marine bacteria.
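A simple nonparametric counterpart to the parametric (Sichel) richness estimate used here is the Chao1 estimator, which extrapolates total richness from the counts of species observed exactly once and exactly twice. The OTU counts below are invented for illustration, not data from the study:

```python
def chao1(counts):
    """Chao1 lower-bound estimate of total species richness.

    counts: per-species (OTU) read counts; zeros are ignored.
    """
    counts = [c for c in counts if c > 0]
    s_obs = len(counts)
    f1 = sum(1 for c in counts if c == 1)  # singletons
    f2 = sum(1 for c in counts if c == 2)  # doubletons
    if f2 == 0:
        # Bias-corrected form used when no doubletons are observed.
        return s_obs + f1 * (f1 - 1) / 2
    return s_obs + f1 * f1 / (2 * f2)

# Hypothetical OTU table: many rare taxa, a few abundant ones.
otus = [120, 85, 40, 7, 3, 2, 2, 1, 1, 1, 1]
print(chao1(otus))  # 11 observed + 4^2/(2*2) = 15.0
```

Because Chao1 is driven entirely by the rarest classes, it behaves as a lower bound: a large singleton count signals that substantial richness remains unsequenced, which is the same question the parametric fits above address with full count distributions.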
NASA Efforts on Nanotechnology
NASA Technical Reports Server (NTRS)
Miranda, Felix A.
2003-01-01
An overview of the field of nanotechnology within the theme of "New efforts in Nanotechnology Research" will be presented. NASA's interest, requirements, and current efforts in this emerging field will be discussed. In particular, NASA's efforts to develop nanoelectronic devices, fuel cells, and other applications of interest using this novel technology through collaboration with academia will be addressed. Progress on current collaborations in this area with the University of Puerto Rico will be highlighted.
34 CFR 403.184 - How does a State request a waiver of the maintenance of effort requirement?
Code of Federal Regulations, 2010 CFR
2010-07-01
... APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403.184 How does... 34 Education 3 2010-07-01 2010-07-01 false How does a State request a waiver of the maintenance of effort requirement? 403.184 Section 403.184 Education Regulations of the Offices of the Department of...
34 CFR 403.184 - How does a State request a waiver of the maintenance of effort requirement?
Code of Federal Regulations, 2013 CFR
2013-07-01
... APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403.184 How does... 34 Education 3 2013-07-01 2013-07-01 false How does a State request a waiver of the maintenance of effort requirement? 403.184 Section 403.184 Education Regulations of the Offices of the Department of...
34 CFR 403.184 - How does a State request a waiver of the maintenance of effort requirement?
Code of Federal Regulations, 2014 CFR
2014-07-01
... APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403.184 How does... 34 Education 3 2014-07-01 2014-07-01 false How does a State request a waiver of the maintenance of effort requirement? 403.184 Section 403.184 Education Regulations of the Offices of the Department of...
34 CFR 403.184 - How does a State request a waiver of the maintenance of effort requirement?
Code of Federal Regulations, 2012 CFR
2012-07-01
... APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403.184 How does... 34 Education 3 2012-07-01 2012-07-01 false How does a State request a waiver of the maintenance of effort requirement? 403.184 Section 403.184 Education Regulations of the Offices of the Department of...
34 CFR 403.184 - How does a State request a waiver of the maintenance of effort requirement?
Code of Federal Regulations, 2011 CFR
2011-07-01
... APPLIED TECHNOLOGY EDUCATION PROGRAM What Financial Conditions Must Be Met by a State? § 403.184 How does... 34 Education 3 2011-07-01 2011-07-01 false How does a State request a waiver of the maintenance of effort requirement? 403.184 Section 403.184 Education Regulations of the Offices of the Department of...
Randall, M.T.; Sulak, K.J.
2012-01-01
Evidence of autumn spawning of Gulf sturgeon Acipenser oxyrinchus desotoi in the Suwannee River, Florida, was compiled from multiple investigations between 1986 and 2008. Gulf sturgeon are known from egg collections to spawn in the springtime months following immigration into rivers. Evidence of autumn spawning includes multiple captures of sturgeon from September through early November that were ripe (late-development ova; motile sperm) or exhibited just-spawned characteristics, telemetry of fish that made >175 river kilometer upstream excursions to the spawning grounds in September–October, and the capture of a 9.3 cm TL age-0 Gulf sturgeon on 29 November 2000 (which would have been spawned in late September 2000). Analysis of age-at-length data indicates that ca. 20% of the Suwannee River Gulf sturgeon population may be attributable to autumn spawning. However, given the very low sampling effort expended to date, eggs or early life stages have not yet been captured in autumn, which would provide conclusive proof of autumn spawning. More sampling, including at previously unknown sites frequented by acoustically tagged fish, would be required to find eggs.
Amendment to examination and investigation sample requirements--FDA. Direct final rule.
1998-09-25
The Food and Drug Administration (FDA) is amending its regulations regarding the collection of twice the quantity of food, drug, or cosmetic estimated to be sufficient for analysis. This action increases the dollar amount that FDA will consider to determine whether to routinely collect a reserve sample of a food, drug, or cosmetic product in addition to the quantity sufficient for analysis. Experience has demonstrated that the current dollar amount does not adequately cover the cost of most quantities sufficient for analysis plus reserve samples. This direct final rule is part of FDA's continuing effort to achieve the objectives of the President's "Reinventing Government" initiative, and is intended to reduce the burden of unnecessary regulations on food, drugs, and cosmetics without diminishing the protection of the public health. Elsewhere in this issue of the Federal Register, FDA is publishing a companion proposed rule under FDA's usual procedures for notice and comment to provide a procedural framework to finalize the rule in the event the agency receives any significant adverse comment and withdraws this direct final rule.
Melcher, Anthony A; Horsburgh, Jeffery S
2017-06-01
Water quality in urban streams and stormwater systems is highly dynamic, both spatially and temporally, and can change drastically during storm events. Infrequent grab samples commonly collected for estimating pollutant loadings are insufficient to characterize water quality in many urban water systems. In situ water quality measurements are being used as surrogates for continuous pollutant load estimates; however, relatively few studies have tested the validity of surrogate indicators in urban stormwater conveyances. In this paper, we describe an observatory aimed at demonstrating the infrastructure required for surrogate monitoring in urban water systems and for capturing the dynamic behavior of stormwater-driven pollutant loads. We describe the instrumentation of multiple, autonomous water quality and quantity monitoring sites within an urban observatory. We also describe smart and adaptive sampling procedures implemented to improve data collection for developing surrogate relationships and for capturing the temporal and spatial variability of pollutant loading events in urban watersheds. Results show that the observatory is able to capture short-duration storm events within multiple catchments and, through inter-site communication, sampling efforts can be synchronized across multiple monitoring sites.
Uncertainty in sample estimates and the implicit loss function for soil information.
NASA Astrophysics Data System (ADS)
Lark, Murray
2015-04-01
One significant challenge in the communication of uncertain information is how to enable the sponsors of sampling exercises to make a rational choice of sample size. One way to do this is to compute the value of additional information given the loss function for errors. The loss function expresses the costs that result from decisions made using erroneous information. In certain circumstances, such as remediation of contaminated land prior to development, loss functions can be computed and used to guide rational decision making on the amount of resource to spend on sampling to collect soil information. In many circumstances the loss function cannot be obtained prior to decision making. This may be the case when multiple decisions may be based on the soil information and the costs of errors are hard to predict. The implicit loss function is proposed as a tool to aid decision making in these circumstances. Conditional on a logistical model which expresses costs of soil sampling as a function of effort, and statistical information from which the error of estimates can be modelled as a function of effort, the implicit loss function is the loss function which makes a particular decision on effort rational. In this presentation the loss function is defined and computed for a number of arbitrary decisions on sampling effort for a hypothetical soil monitoring problem. This is based on a logistical model of sampling cost parameterized from a recent geochemical survey of soil in Donegal, Ireland and on statistical parameters estimated with the aid of a process model for change in soil organic carbon. It is shown how the implicit loss function might provide a basis for reflection on a particular choice of sample size by comparing it with the values attributed to soil properties and functions. Scope for further research to develop and apply the implicit loss function to help decision making by policy makers and regulators is then discussed.
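The implicit-loss idea can be illustrated with a toy calculation. Given a linear cost model cost(n) = c0 + c1·n and estimation variance σ²/n, a squared-error loss weight λ makes a chosen sample size n* rational when it minimises total expected cost c1·n + λσ²/n, i.e. when c1 = λσ²/n*², so λ = c1·n*²/σ². All numbers below are invented placeholders, not values from the Donegal survey:

```python
def implicit_loss_weight(n_star, c1, sigma2):
    """Loss weight (cost per unit squared estimation error) that makes
    sample size n_star the rational optimum of c1*n + lam*sigma2/n,
    given marginal sampling cost c1 and population variance sigma2.

    Setting d/dn [c1*n + lam*sigma2/n] = 0 at n = n_star gives
    lam = c1 * n_star**2 / sigma2.
    """
    return c1 * n_star ** 2 / sigma2

# Hypothetical survey: 50 EUR per extra sample, variance of 9 (units^2),
# and a sponsor who chose to pay for n = 30 samples.
lam = implicit_loss_weight(n_star=30, c1=50.0, sigma2=9.0)
print(lam)  # 5000.0
```

Read in reverse, the number says: a sponsor choosing 30 samples is implicitly valuing a unit reduction in squared error at 5000 EUR; comparing that figure with the actual value of the soil information is the reflective step the abstract proposes.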
Estimation of mortality for stage-structured zooplankton populations: What is to be done?
NASA Astrophysics Data System (ADS)
Ohman, Mark D.
2012-05-01
Estimation of zooplankton mortality rates in field populations is a challenging task that some contend is inherently intractable. This paper examines several of the objections that are commonly raised to efforts to estimate mortality. We find that there are circumstances in the field where it is possible to sequentially sample the same population and to resolve biologically caused mortality, albeit with error. Precision can be improved with sampling directed by knowledge of the physical structure of the water column, combined with adequate sample replication. Intercalibration of sampling methods can make it possible to sample across the life history in a quantitative manner. Rates of development can be constrained by laboratory-based estimates of stage durations from temperature- and food-dependent functions, mesocosm studies of molting rates, or approximation of development rates from growth rates, combined with the vertical distributions of organisms in relation to food and temperature gradients. Careful design of field studies guided by the assumptions of specific estimation models can lead to satisfactory mortality estimates, but model uncertainty also needs to be quantified. We highlight additional issues requiring attention to further advance the field, including the need for linked cooperative studies of the rates and causes of mortality of co-occurring holozooplankton and ichthyoplankton.
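In the simplest case, the sequential-sampling approach discussed above reduces to an exponential-decline estimate of instantaneous mortality between two censuses of the same closed cohort. This is a deliberate simplification of the stage-structured models the paper reviews, and the abundances below are illustrative:

```python
import math

def instantaneous_mortality(n0, n1, dt_days):
    """Instantaneous mortality rate (per day), assuming exponential
    decline N(t) = N0 * exp(-m*t) of a closed cohort between two
    sequential samples separated by dt_days."""
    return math.log(n0 / n1) / dt_days

# Hypothetical copepod cohort: 1000 -> 600 individuals m^-2 over 5 days.
m = instantaneous_mortality(1000, 600, 5)
print(round(m, 4))  # 0.1022 per day
```

The paper's central caution applies directly here: the estimate is only meaningful if the two samples really do capture the same population (hence the emphasis on water-column structure and replication) and if losses to advection and gear selectivity are not folded into m as spurious "mortality".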
Mental Effort in Binary Categorization Aided by Binary Cues
ERIC Educational Resources Information Center
Botzer, Assaf; Meyer, Joachim; Parmet, Yisrael
2013-01-01
Binary cueing systems, such as alarms and alerts, assist in many tasks, often by warning people about potential hazards. We investigate whether cues, besides possibly improving decision accuracy, also affect the effort users invest in tasks and whether the required effort in tasks affects the responses to cues. We developed a novel experimental tool…
Older Adults Expend More Listening Effort than Young Adults Recognizing Speech in Noise
ERIC Educational Resources Information Center
Gosselin, Penny Anderson; Gagne, Jean-Pierre
2011-01-01
Purpose: Listening in noisy situations is a challenging experience for many older adults. The authors hypothesized that older adults exert more listening effort compared with young adults. Listening effort involves the attention and cognitive resources required to understand speech. The purpose was (a) to quantify the amount of listening effort…
Estimating Software Effort Hours for Major Defense Acquisition Programs
ERIC Educational Resources Information Center
Wallshein, Corinne C.
2010-01-01
Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…
A review of evaluative studies of computer-based learning in nursing education.
Lewis, M J; Davies, R; Jenkins, D; Tait, M I
2001-01-01
Although there have been numerous attempts to evaluate the learning benefits of computer-based learning (CBL) packages in nursing education, the results obtained have been equivocal. A literature search conducted for this review found 25 reports of the evaluation of nursing CBL packages since 1966. Detailed analysis of the evaluation methods used in these reports revealed that most had significant design flaws, including the use of too small a sample group, the lack of a control group, etc. Because of this, the conclusions reached were not always valid. More effort is required in the design of future evaluation studies of nursing CBL packages. Copyright 2001 Harcourt Publishers Ltd.
Influence of coherent mesoscale structures on satellite-based Doppler lidar wind measurements
NASA Technical Reports Server (NTRS)
Emmitt, G. D.; Houston, S.
1985-01-01
Efforts to develop display routines for overlaying gridded and nongridded data sets are discussed. The primary objective is to have the capability to review global patterns of winds and lidar samples; to zoom in on particular wind features or global areas; and to display contours of wind components and derived fields (e.g., divergence, vorticity, deformation, etc.). Current considerations in support of a polar orbiting shuttle lidar mission are discussed. Ground truth for a shuttle lidar experiment may be limited to fortuitous alignment of lidar wind profiles and scheduled rawinsonde profiles. Any improvement on this would require special rawinsonde launches and/or optimization of the shuttle orbit with global wind measurement networks.
A Blueprint to Address Research Gaps in the Development of Biomarkers for Pediatric Tuberculosis
Nicol, Mark Patrick; Gnanashanmugam, Devasena; Browning, Renee; Click, Eleanor S.; Cuevas, Luis E.; Detjen, Anne; Graham, Steve M.; Levin, Michael; Makhene, Mamodikoe; Nahid, Payam; Perez-Velez, Carlos M.; Reither, Klaus; Song, Rinn; Spiegel, Hans M. L.; Worrell, Carol; Zar, Heather J.; Walzl, Gerhard
2015-01-01
Childhood tuberculosis contributes significantly to the global tuberculosis disease burden but remains challenging to diagnose due to inadequate methods of pathogen detection in paucibacillary pediatric samples and lack of a child-specific host biomarker to identify disease. Accurately diagnosing tuberculosis in children is required to improve case detection, surveillance, healthcare delivery, and effective advocacy. In May 2014, the National Institutes of Health convened a workshop including researchers in the field to delineate priorities to address this research gap. This blueprint describes the consensus from the workshop, identifies critical research steps to advance this field, and aims to catalyze efforts toward harmonization and collaboration in this area. PMID:26409279
Paul, G
2008-09-01
Extreme rates of premature death prior to the advent of modern medicine, very low rates of premature death in First World nations with low rates of prayer, and the least flawed of a large series of clinical trials indicate that remote prayer is not efficacious in treating illness. Mass contamination of sample cohorts renders such clinical studies inherently ineffectual. The required supernatural and paranormal mechanisms render them implausible. The possibility that the latter are not benign, and the potentially adverse psychological impact of certain protocols, renders these medical trials unethical. Resources should no longer be wasted on medical efforts to detect the supernatural and paranormal.
On a useful functional representation of control system structure
NASA Technical Reports Server (NTRS)
Malchow, Harvey L.
1988-01-01
An alternative structure for control systems is proposed. The structure is represented by a three-element block diagram and three functional definitions. It is argued that the three functional elements form a canonical set. The set includes the functions description, estimation and control. General overlay of the structure on parallel state and nested-state control systems is discussed. Breakdown of two real nested-state control systems into the proposed functional format is displayed. Application of the process to the mapping of complex control systems R and D efforts is explained with the Mars Rover Sample and Return mission as an example. A previous application of this basic functional structure to Space Station performance requirements organization is discussed.
Estes, John; Belward, Alan; Loveland, Thomas; Scepan, Joseph; Strahler, Alan H.; Townshend, John B.; Justice, Chris
1999-01-01
This paper focuses on the lessons learned in the conduct of the International Geosphere-Biosphere Programme's Data and Information System (IGBP-DIS) global 1-km Land-Cover Mapping Project (DISCover). There is still considerable fundamental research to be conducted dealing with the development and validation of thematic geospatial products derived from a combination of remotely sensed and ancillary data. Issues include database and data product development, classification legend definitions, processing and analysis techniques, and sampling strategies. A significant infrastructure is required to support an effort such as DISCover. The infrastructure put in place under the auspices of the IGBP-DIS serves as a model, and must be put in place to enable replication and development of projects such as DISCover.
Kagawa-Singer, Marjorie; Park Tanjasiri, Sora; Lee, Susan W; Foo, Mary Anne; Ngoc Nguyen, Tu-Uyen; Tran, Jacqueline H; Valdez, Annalyn
2006-01-01
No data exists on the breast and cervical cancer screening practices among Cambodian, Laotian, Thai, and Tongan women. In this article, we describe the efforts required to conduct a baseline survey among these non-English-speaking women using the participatory action research (PAR) approach. We tailored small population sampling techniques to each of the populations in partnership with Community Health Outreach workers. A total of 1825 surveys were successfully conducted in 8 communities. PAR and the culturally based techniques used to conduct the survey proved successful in maintaining scientific rigor, developing true community-researcher partnership, and achieving over 99% participation.
Going to Scale with High-Quality Early Education
ERIC Educational Resources Information Center
Christina, Rachel; Nicholson-Goodman, JoVictoria
2005-01-01
This report is an initial effort to describe the work of a number of states that are seeking to create statewide systems of high-quality pre-kindergarten services, as well as some of the progress they have made in doing so. Focusing on a sample of eight U.S. states, it examines the policy choices that states have made when…
ERIC Educational Resources Information Center
Schwinger, Malte; Steinmayr, Ricarda; Spinath, Birgit
2009-01-01
It was assumed that the effect of motivational regulation strategies on achievement is mediated by effort management and moderated by intelligence. A sample of 231 11th and 12th grade German high-school students provided self-reports on their use of motivational regulation strategies and effort management and completed an intelligence test.…
ERIC Educational Resources Information Center
Folmer, Amy S.; Cole, David A.; Sigal, Amanda B.; Benbow, Lovisa D.; Satterwhite, Lindsay F.; Swygert, Katherine E.; Ciesla, Jeffrey A.
2008-01-01
Building on Nicholls's earlier work, we examined developmental changes in children's understanding of effort and ability when faced with a negative outcome. In a sample of 166 children and adolescents (ages 5-15 years), younger children conflated the meaning of effort and ability, explaining that smart students work hard, whereas older children…
NASA Astrophysics Data System (ADS)
Rummel, J. D.; Conley, C. A.
2013-12-01
The 2013-2022 NRC Decadal Survey named its #1 Flagship priority as a large, capable Mars rover that would be the first of a three-mission, multi-decadal effort to return samples from Mars. More recently, NASA's Mars Program has stated that a Mars rover mission known as 'Mars 2020' would be flown to Mars (in 2020) to accomplish a subset of the goals specified by the NRC, and the recent report of the Mars 2020 Science Definition Team (SDT) has recommended that the mission accomplish broad and rigorous in situ science, including seeking biosignatures, acquiring a diverse set of samples intended to address a range of Mars science questions and storing them in a cache for potential return to Earth at a later time, and other engineering goals to constrain costs and support future human exploration. In some ways Mars 2020 will share planetary protection requirements with the Mars Science Laboratory mission that landed in 2012, which included landing site constraints based on the presence of a perennial heat source (the MMRTG) aboard the lander/rover. In one very significant way, however, Mars 2020 differs: it will carry a sample cache, and it may be the first mission in the chain that will return a sample from Mars to Earth. Thus Mars 2020 will face more stringent requirements aimed at keeping the mission from returning Earth contamination with the samples from Mars. Mars 2020 will be looking for biosignatures of ancient life on Mars, but will also need to be concerned with the potential to detect extant biosignatures or life itself within the sample that is eventually returned.
If returned samples are able to unlock wide-ranging questions about the geology, surface processes, and habitability of Mars that cannot be answered by study of meteorites or current mission data, then either the returned samples must be free enough of Earth organisms to be releasable from a quarantine facility or the planned work of sample scientists, including high- and low-T geochemistry, igneous and sedimentary petrology, mineral spectroscopy, and astrobiology, will have to be accomplished within a containment facility. The returned samples also need to be clean of Earth organisms to avoid the potential that Earth contamination will mask the potential for martian life to be detected, allowing only non-conclusive or false-negative results. The requirements placed on the Mars 2020 mission to address contamination control in a life-detection framework will be one of the many challenges faced in this potential first step in Mars sample return.
Forrest, Sarah M; Challis, John H; Winter, Samantha L
2014-06-01
Approximate entropy (ApEn) is frequently used to identify changes in the complexity of isometric force records with ageing and disease. Different signal acquisition and processing parameters have been used, making comparison or confirmation of results difficult. This study determined the effect of sampling and parameter choices by examining changes in ApEn values across a range of submaximal isometric contractions of the first dorsal interosseus. Reducing the sample rate by decimation changed both the value and pattern of ApEn values dramatically. The pattern of ApEn values across the range of effort levels was not sensitive to the filter cut-off frequency, or to the criterion used to extract the section of data for analysis. Complexity increased with increasing effort level using a fixed 'r' value (which accounts for measurement noise) but decreased with increasing effort level when 'r' was set to 0.1 of the standard deviation of force. It is recommended that isometric force records be sampled at frequencies >200 Hz, that the template length ('m') be set to 2, and that 'r' be set to the measurement system noise or to 0.1 SD, depending on the physiological process to be distinguished. It is demonstrated that changes in ApEn across effort levels are related to changes in force gradation strategy. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
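A minimal sketch of the ApEn statistic discussed above, using the recommended template length m = 2 and the r = 0.1·SD convention; the function and the comparison signals are illustrative, not the authors' code:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D signal.

    m: template length; r: tolerance, defaulting to 0.1 * SD of the signal
    (one of the two 'r' conventions compared in the abstract)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.1 * np.std(x)

    def phi(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # C_i^m(r): fraction of templates within tolerance r (self-matches kept).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# A regular signal (sine) should score lower than white noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
apen_sine = approximate_entropy(np.sin(t))
apen_noise = approximate_entropy(rng.standard_normal(400))
```

With r scaled to each signal's own SD, the regular sine wave yields a lower ApEn than white noise, matching ApEn's reading as an irregularity measure.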
Lanier, Wendy E.; Bailey, Larissa L.; Muths, Erin L.
2016-01-01
Conservation of imperiled species often requires knowledge of vital rates and population dynamics. However, these can be difficult to estimate for rare species and small populations. This problem is further exacerbated when individuals are not available for detection during some surveys due to limited access, delaying surveys and creating mismatches between the breeding behavior and survey timing. Here we use simulations to explore the impacts of this issue using four hypothetical boreal toad (Anaxyrus boreas boreas) populations, representing combinations of logistical access (accessible, inaccessible) and breeding behavior (synchronous, asynchronous). We examine the bias and precision of survival and breeding probability estimates generated by survey designs that differ in effort and timing for these populations. Our findings indicate that the logistical access of a site and mismatch between the breeding behavior and survey design can greatly limit the ability to yield accurate and precise estimates of survival and breeding probabilities. Simulations similar to what we have performed can help researchers determine an optimal survey design(s) for their system before initiating sampling efforts.
Adler, Abby D.; Strunk, Daniel R.; Fazio, Russell H.
2015-01-01
This study examined effortful cognitive skills and underlying maladaptive beliefs among patients treated with Cognitive Therapy for depression (CT). Depressed patients (n = 44) completed cognitive measures before and after 16 weeks of CT. Measures included: an assessment of CT skills (Ways of Responding Scale, WOR), an implicit test of maladaptive beliefs (Implicit Association Test, IAT), and a self-report questionnaire of maladaptive beliefs (Dysfunctional Attitude Scale, DAS). A matched sample of never-depressed participants (n = 44) also completed study measures. Prior to treatment, depressed patients endorsed significantly more undesirable cognitions on the WOR, IAT, and DAS compared to never-depressed participants. Patients displayed improvement on the WOR and DAS over the course of treatment, but showed no change on the IAT. Additionally, improvements on the WOR and DAS were each related to greater reductions in depressive symptoms. Results suggest that the degree of symptom reduction among patients participating in CT is related to changes in patients’ acquisition of coping skills requiring deliberate efforts and reflective thought, but not related to reduced endorsement of implicitly-assessed maladaptive beliefs. PMID:25526838
Wang, Jian-Gang; Sung, Eric; Yau, Wei-Yun
2011-07-01
Facial age classification is an approach to classify face images into one of several predefined age groups. One of the difficulties in applying learning techniques to the age classification problem is the large amount of labeled training data required. Acquiring such training data is very costly in terms of age progress, privacy, human time, and effort. Although unlabeled face images can be obtained easily, it would be expensive to manually label them on a large scale and to obtain the ground truth. The frugal selection of unlabeled data for labeling, to quickly reach high classification performance with minimal labeling effort, is a challenging problem. In this paper, we present an active learning approach based on an online incremental bilateral two-dimensional linear discriminant analysis (IB2DLDA) which initially learns from a small pool of labeled data and then iteratively selects the most informative samples from the unlabeled set to incrementally improve the classifier. Specifically, we propose a novel data selection criterion called the furthest nearest-neighbor (FNN) that generalizes margin-based uncertainty to the multiclass case and is easy to compute, so that the proposed active learning algorithm can handle a large number of classes and large data sizes efficiently. Empirical experiments on the FG-NET and Morph databases, together with a large unlabeled data set, for age categorization problems show that the proposed approach can achieve results comparable to, or even better than, those of a conventionally trained active classifier that requires much more labeling effort. Our IB2DLDA-FNN algorithm can achieve similar results much faster than random selection and with fewer samples for age categorization. It can also achieve results comparable with active SVM but is much faster than active SVM in terms of training because kernel methods are not needed.
The results on the face recognition database and palmprint/palm vein database showed that our approach can handle problems with large number of classes. Our contributions in this paper are twofold. First, we proposed the IB2DLDA-FNN, the FNN being our novel idea, as a generic on-line or active learning paradigm. Second, we showed that it can be another viable tool for active learning of facial age range classification.
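The furthest nearest-neighbor idea can be read, in its simplest geometric form, as "query the unlabeled point whose closest labeled point is furthest away". The sketch below is that simplified distance-based reading only, not the IB2DLDA-FNN implementation, which operates in the learned discriminant feature space:

```python
import numpy as np

def furthest_nearest_neighbor(labeled, unlabeled, k=1):
    """Pick the k unlabeled points whose nearest labeled neighbor is furthest.

    Such points are the least 'covered' by the current labeled pool, so
    labeling them is expected to be most informative (a simplified,
    Euclidean-distance reading of the FNN selection criterion)."""
    # Pairwise distances: rows = unlabeled points, cols = labeled points.
    d = np.linalg.norm(unlabeled[:, None, :] - labeled[None, :, :], axis=2)
    nearest = d.min(axis=1)               # distance to closest labeled sample
    return np.argsort(nearest)[::-1][:k]  # indices of the k furthest

rng = np.random.default_rng(1)
labeled = rng.normal(0.0, 1.0, size=(20, 2))
unlabeled = np.vstack([rng.normal(0.0, 1.0, size=(20, 2)),
                       np.array([[8.0, 8.0]])])  # one clear outlier
picked = furthest_nearest_neighbor(labeled, unlabeled, k=1)
```

The outlier far from the labeled pool is selected first; repeated rounds of select, label, and retrain give the active learning loop described in the abstract.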
A MYSQL-BASED DATA ARCHIVER: PRELIMINARY RESULTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthew Bickley; Christopher Slominski
2008-01-23
Following an evaluation of the archival requirements of the Jefferson Laboratory accelerator’s user community, a prototyping effort was executed to determine if an archiver based on MySQL had sufficient functionality to meet those requirements. This approach was chosen because an archiver based on a relational database enables the development effort to focus on data acquisition and management, letting the database take care of storage, indexing and data consistency. It was clear from the prototype effort that there were no performance impediments to successful implementation of a final system. With our performance concerns addressed, the lab undertook the design and development of an operational system. The system is in its operational testing phase now. This paper discusses the archiver system requirements, some of the design choices and their rationale, and presents the acquisition, storage and retrieval performance.
Yin, Guoyu; Hou, Lijun; Liu, Min; Liu, Zhanfei; Gardner, Wayne S
2014-08-19
Nitrogen (N) pollution in aquatic ecosystems has attracted much attention over the past decades, but the dynamics of this bioreactive element are difficult to measure in aquatic oxygen-transition environments. Nitrogen-transformation experiments often require measurement of (15)N-ammonium ((15)NH4(+)) ratios in small-volume (15)N-enriched samples. Published methods to determine N isotope ratios of dissolved ammonium require large samples and/or costly equipment and effort. We present a novel ("OX/MIMS") method to determine N isotope ratios for (15)NH4(+) in experimental waters previously enriched with (15)N compounds. Dissolved reduced (15)N (dominated by (15)NH4(+)) is oxidized with hypobromite iodine to nitrogen gas ((29)N2 and/or (30)N2) and analyzed by membrane inlet mass spectrometry (MIMS) to quantify (15)NH4(+) concentrations. The N isotope ratios, obtained by comparing the (15)NH4(+) to total ammonium (via autoanalyzer) concentrations, are compared to the ratios of prepared standards. The OX/MIMS method requires only small sample volumes of water (ca. 12 mL) or sediment slurries and is rapid, convenient, accurate, and precise (R(2) = 0.9994, p < 0.0001) over a range of salinities and (15)N/(14)N ratios. It can provide data needed to quantify rates of ammonium regeneration, potential ammonium uptake, and dissimilatory nitrate reduction to ammonium (DNRA). Isotope ratio results agreed closely (R = 0.998, P = 0.001) with those determined independently by isotope ratio mass spectrometry for DNRA measurements or by ammonium isotope retention time shift liquid chromatography for water-column N-cycling experiments. Application of OX/MIMS should simplify experimental approaches and improve understanding of N-cycling rates and fate in a variety of freshwater and marine environments.
[Revision of the drinking water regulations].
Hauswirth, S
2011-11-01
The revision of the Drinking Water Regulations will come into effect on 01.11.2011. Surveillance authorities and owners of drinking water supply systems had hoped that the new arrangements would bring simplifications and reductions. According to the official statement accompanying the revision, the legislature intended to create more clarity, to consider new scientific findings, to change regulations that had not proved effective, to close regulatory gaps, to deregulate, and to raise the high quality standards. A detailed examination of the regulation text, however, raises doubts. The new classification of water supply systems requires different modalities of registration, water analyses and official observation, which will complicate the work of the authorities. In particular, the implementation of the registration and examination requirements for the owners of commercial and publicly operated large hot-water systems in accordance with DVGW Worksheet W 551 requires more effort. Given the estimated 30,000 cases of legionellosis in Germany, however, the need to check such systems for Legionella is not called into question. Furthermore, the development of sampling plans and the monitoring of mobile water supply systems require more work from the health authorities. © Georg Thieme Verlag KG Stuttgart · New York.
Robotic sampling system for an unmanned Mars mission
NASA Technical Reports Server (NTRS)
Chun, Wendell
1989-01-01
A major robotics opportunity for NASA will be the Mars Rover/Sample Return Mission which could be launched as early as the 1990s. The exploratory portion of this mission will include two autonomous subsystems: the rover vehicle and a sample handling system. The sample handling system is the key to the process of collecting Martian soils. This system could include a core drill, a general-purpose manipulator, tools, containers, a return canister, certification hardware and a labeling system. Integrated into a functional package, the sample handling system is analogous to a complex robotic workcell. Discussed here are the different components of the system, their interfaces, foreseeable problem areas and many options based on the scientific goals of the mission. The various interfaces in the sample handling process (component to component and handling system to rover) will be a major engineering effort. Two critical evaluation criteria that will be imposed on the system are flexibility and reliability. It needs to be flexible enough to adapt to different scenarios and environments and acquire the most desirable specimens for return to Earth. Scientists may decide to change the distribution and ratio of core samples to rock samples in the canister. The long distance and duration of this planetary mission place a reliability burden on the hardware. The communication time delay between Earth and Mars minimizes operator interaction (teleoperation, supervisory modes) with the sample handler. An intelligent system will be required to plan the actions, make sample choices, interpret sensor inputs, and query unknown surroundings. A combination of autonomous functions and supervised movements will be integrated into the sample handling system.
SOCCER: Comet Coma Sample Return Mission
NASA Technical Reports Server (NTRS)
Albee, A. L.; Uesugi, K. T.; Tsou, Peter
1994-01-01
Comets, being considered the most primitive bodies in the solar system, command the highest priority among solar system objects for studying solar nebula evolution and the evolution of life through biogenic elements and compounds. Sample Of Comet Coma Earth Return (SOCCER), a joint effort between NASA and the Institute of Space and Astronautical Science (ISAS) in Japan, has two primary science objectives: (1) the imaging of the comet nucleus and (2) the return to Earth of samples of volatile species and intact dust. This effort makes use of the unique strengths and capabilities of both countries in realizing this important quest for the return of samples from a comet. This paper presents an overview of SOCCER's science payloads, engineering flight system, and its mission operations.
Airport Surface Traffic Control Concept Formulation Study : Volume 4. Estimation of Requirements
DOT National Transportation Integrated Search
1975-07-01
A detailed study of requirements was performed and is presented. This requirements effort provided an estimate of the performance requirements of a surveillance sensor that would be required in a TAGS (Tower Automated Ground Surveillance) system for ...
Summary Report for the Evaluation of Current QA Processes Within the FRMAC FAL and EPA MERL.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shanks, Sonoya T.; Redding, Ted; Jaussi, Lynn
The Federal Radiological Monitoring and Assessment Center (FRMAC) relies on accurate and defensible analytical laboratory data to support its mission. Therefore, FRMAC must ensure that the environmental analytical laboratories providing analytical services maintain an ongoing capability to provide accurate analytical results to DOE. It is undeniable that the more Quality Assurance (QA) and Quality Control (QC) measures required of a laboratory, the fewer resources are available for analysis of response samples. Because QA and QC measures generally comprise a major part of a laboratory’s operations, requirements should only be considered if they are deemed “value-added” for the FRMAC mission. This report provides observations of areas for improvement and potential interoperability opportunities in the areas of Batch Quality Control Requirements, Written Communications, Data Review Processes, and Data Reporting Processes, along with lessons learned as they apply to items in the early phase of a response that will be critical for developing a more efficient, integrated response for future interactions between FRMAC and EPA assets.
Marks, Michael; Fookes, Maria; Wagner, Josef; Butcher, Robert; Ghinai, Rosanna; Sokana, Oliver; Sarkodie, Yaw-Adu; Lukehart, Sheila A; Solomon, Anthony W; Mabey, David C W; Thomson, Nicholas
2018-03-05
Yaws-like chronic ulcers can be caused by Treponema pallidum subspecies pertenue, Haemophilus ducreyi, or other, still-undefined bacteria. To permit accurate evaluation of yaws elimination efforts, programmatic use of molecular diagnostics is required. The accuracy and sensitivity of current tools remain unclear because our understanding of T. pallidum diversity is limited by the low number of sequenced genomes. We tested samples from patients with suspected yaws collected in the Solomon Islands and Ghana. All samples were from patients whose lesions had previously tested negative using the Centers for Disease Control and Prevention (CDC) diagnostic assay in widespread use. However, some of these patients had positive serological assays for yaws on blood. We used direct whole-genome sequencing to identify T. pallidum subsp pertenue strains missed by the current assay. From 45 Solomon Islands and 27 Ghanaian samples, 11 were positive for T. pallidum DNA using the species-wide quantitative polymerase chain reaction (PCR) assay, from which we obtained 6 previously undetected T. pallidum subsp pertenue whole-genome sequences. These show that Solomon Islands sequences represent distinct T. pallidum subsp pertenue clades. These isolates were invisible to the CDC diagnostic PCR assay, due to sequence variation in the primer binding site. Our data double the number of published T. pallidum subsp pertenue genomes. We show that Solomon Islands strains are undetectable by the PCR used in many studies and by health ministries. This assay is therefore not adequate for the eradication program. Next-generation genome sequence data are essential for these efforts. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
2015-07-01
steps to identify and mitigate potential challenges; (2) extent the services’ efforts to validate gender-neutral occupational standards are... to address statutory and Joint Staff requirements for validating gender-neutral occupational standards. GAO identified five elements required for... SOCOM Have Studies Underway to Validate Gender-Neutral Occupational Standards 21 DOD Is Providing Oversight of Integration Efforts, but Has Not
Fully automatic characterization and data collection from crystals of biological macromolecules.
Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W
2015-08-01
Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.
Exploiting molecular dynamics in Nested Sampling simulations of small peptides
NASA Astrophysics Data System (ADS)
Burkoff, Nikolas S.; Baldock, Robert J. N.; Várnai, Csilla; Wild, David L.; Csányi, Gábor
2016-04-01
Nested Sampling (NS) is a parameter space sampling algorithm which can be used for sampling the equilibrium thermodynamics of atomistic systems. NS has previously been used to explore the potential energy surface of a coarse-grained protein model and has significantly outperformed parallel tempering when calculating heat capacity curves of Lennard-Jones clusters. The original NS algorithm uses Monte Carlo (MC) moves; however, a variant, Galilean NS, has recently been introduced which allows NS to be incorporated into a molecular dynamics framework, so NS can be used for systems which lack efficient prescribed MC moves. In this work we demonstrate the applicability of Galilean NS to atomistic systems. We present an implementation of Galilean NS using the Amber molecular dynamics package and demonstrate its viability by sampling alanine dipeptide, both in vacuo and in implicit solvent. Unlike previous studies of this system, we present the heat capacity curves of alanine dipeptide, whose calculation provides a stringent test for sampling algorithms. We also compare our results with those calculated using replica exchange molecular dynamics (REMD) and find good agreement. We show the computational effort required for accurate heat capacity estimation for small peptides. We also calculate the alanine dipeptide Ramachandran free energy surface for a range of temperatures and use it to compare the results using the latest Amber force field with previous theoretical and experimental results.
A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases
Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.
2013-01-01
How reliable are results on spatial distribution of biodiversity based on databases? Many studies have evidenced the uncertainty related to this kind of analysis due to sampling effort bias and the need for its quantification. Despite that a number of methods are available for that, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent discrimination capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with other methods showed no relationship due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses, however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore, it can be efficiently applied to most databases in order to enhance the reliability of biodiversity analyses. PMID:23326357
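FIDEGAM's internals are not given in this abstract; as a generic stand-in for the accumulation-curve family of methods it belongs to, one can compute a sample-based species accumulation curve and use its final slope as a crude completeness proxy (near-zero slope suggests a well-sampled area). Everything below is an assumed illustration, not the published method:

```python
import random

def accumulation_curve(samples, n_perm=200, seed=0):
    """Sample-based species accumulation curve: mean cumulative richness
    over random orderings of the sampling units."""
    rng = random.Random(seed)
    totals = [0.0] * len(samples)
    for _ in range(n_perm):
        order = samples[:]
        rng.shuffle(order)
        seen = set()
        for i, unit in enumerate(order):
            seen |= set(unit)           # species pool grows monotonically
            totals[i] += len(seen)
    return [t / n_perm for t in totals]

# Hypothetical sampling units (sets of species recorded in each):
units = [{"a"}, {"a", "b"}, {"b", "c"}, {"a", "c"}, {"c", "d"}]
curve = accumulation_curve(units)
completeness_proxy = curve[-1] - curve[-2]  # near-zero slope: well sampled
```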
Pinto-Leite, C M; Rocha, P L B
2012-12-01
Empirical studies using visual search methods to investigate spider communities were conducted with different sampling protocols, including a variety of plot sizes, sampling efforts, and diurnal periods for sampling. We sampled 11 plots ranging in size from 5 by 10 m to 5 by 60 m. In each plot, we recorded the total number of species detected every 10 min during 1 hr of sampling, both during the daytime and during the nighttime (0630 hours to 1100 hours, both a.m. and p.m.). We measured the influence of time effort on the measurement of species richness by comparing the curves produced by sample-based rarefaction and species richness estimation (first-order jackknife). We used a general linear model with repeated measures to assess whether the phase of the day during which sampling occurred and the differences in the plot lengths influenced the number of species observed and the number of species estimated. To measure the differences in species composition between the phases of the day, we used a multiresponse permutation procedure and a graphical representation based on nonmetric multidimensional scaling. After 50 min of sampling, we noted a decreased rate of species accumulation and a tendency of the estimated richness curves to reach an asymptote. We did not detect an effect of plot size on the number of species sampled. However, differences in observed species richness and species composition were found between phases of the day. Based on these results, we propose guidelines for visual search for tropical web spiders.
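The first-order jackknife estimator used above has a simple closed form: Ŝ = S_obs + Q1·(m − 1)/m, where Q1 is the number of species seen in exactly one of the m samples. A minimal sketch with hypothetical census data:

```python
from collections import Counter

def jackknife1(sample_species):
    """First-order jackknife richness estimate from per-sample species
    sets (incidence data): S_obs + Q1 * (m - 1) / m."""
    m = len(sample_species)
    counts = Counter(sp for s in sample_species for sp in set(s))
    s_obs = len(counts)                              # species observed
    q1 = sum(1 for c in counts.values() if c == 1)   # seen in one sample only
    return s_obs + q1 * (m - 1) / m

# Hypothetical 10-min census intervals from one plot:
intervals = [{"a", "b"}, {"a", "c"}, {"a", "b", "d"}]
print(jackknife1(intervals))  # 4 observed + 2 uniques * (2/3) ≈ 5.33
```

An estimated-richness curve that flattens, like the one reported after 50 min, means Q1 stops growing relative to S_obs, so the jackknife estimate approaches the observed count.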
Brillouin zone grid refinement for highly resolved ab initio THz optical properties of graphene
NASA Astrophysics Data System (ADS)
Warmbier, Robert; Quandt, Alexander
2018-07-01
Optical spectra of materials can in principle be calculated within numerical frameworks based on Density Functional Theory. The huge numerical effort involved in these methods severely constrains the accuracy achievable in practice. In the case of the THz spectrum of graphene, the primary limitation lies in the density of the reciprocal-space sampling. In this letter we develop a non-uniform sampling using grid refinement to achieve a high local sampling density with only moderate numerical effort. The resulting THz electron energy loss spectrum shows a plasmon signal below 50 meV with an ω(q) ∝ √q dispersion relation.
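The letter's refinement scheme is not specified in this abstract; a minimal sketch of the general idea, a uniform Brillouin-zone grid with extra subdivision only near Γ where the low-energy THz transitions concentrate, might look as follows (grid sizes and the refinement radius are illustrative assumptions):

```python
def refined_kgrid(n_coarse=8, refine=4, r=0.15):
    """Uniform grid over the 2-D zone [-0.5, 0.5)^2, with each coarse
    cell whose corner lies within |k| < r of Gamma subdivided
    refine x refine times. Parameters are illustrative."""
    step = 1.0 / n_coarse
    coarse = [(-0.5 + i * step, -0.5 + j * step)
              for i in range(n_coarse) for j in range(n_coarse)]
    f = step / refine
    fine = []
    for x, y in coarse:
        if x * x + y * y < r * r:            # cell is near Gamma: refine it
            fine += [(x + a * f, y + b * f)
                     for a in range(refine) for b in range(refine)
                     if (a, b) != (0, 0)]    # (0, 0) is the coarse point itself
    return coarse + fine

grid = refined_kgrid()   # 64 coarse points + 5 refined cells * 15 fine points
```

The payoff is the scaling: local density equivalent to an (n_coarse × refine)² grid near Γ at a small fraction of the k-points, which is the "moderate numerical effort" the letter targets.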
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Shielding requirements for the Space Station habitability modules
NASA Technical Reports Server (NTRS)
Avans, Sherman L.; Horn, Jennifer R.; Williamsen, Joel E.
1990-01-01
The design, analysis, development, and tests of the total meteoroid/debris protection system for the Space Station Freedom habitability modules, such as the habitation module, the laboratory module, and the node structures, are described. Design requirements are discussed along with development efforts, including a combination of hypervelocity testing and analyses. Computer hydrocode analysis of hypervelocity impact phenomena associated with Space Station habitability structures is covered and the use of optimization techniques, engineering models, and parametric analyses is assessed. Explosive rail gun development efforts and protective capability and damage tolerance of multilayer insulation due to meteoroid/debris impact are considered. It is concluded that anticipated changes in the debris environment definition and requirements will require rescoping the tests and analysis required to develop a protection system.
Quality Control for Ambient Sampling of PCDD/PCDF from Open Combustion Sources
Both long-duration (>6 h) and high-temperature (up to 139 °C) sampling efforts were conducted using ambient air sampling methods to determine if either high volume throughput or higher-than-ambient sampling temperatures resulted in loss of target polychlorinated dibenzodioxins/d...
Developing reforestation technology for southern pines: a historical perspective
James Barnett
2013-01-01
Early in the 20th century, the forests of the South were decimated by aggressive harvesting, resulting in millions of acres of forest land in need of reforestation. Foresighted individuals committed efforts to restore this harvested land to a productive condition. The effort required dedication, cooperation, and leadership. The efforts of this small cadre of...
34 CFR 299.5 - What maintenance of effort requirements apply to ESEA programs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... funds only if the SEA finds that either the combined fiscal effort per student or the aggregate... preceding fiscal year was not less than 90 percent of the combined fiscal effort per student or the... Technology Resources). (4) Part A of title IV (Safe and Drug-Free Schools and Communities) (other than...
Limiting Central Government Budget Deficits: International Experiences
2010-03-11
Economic Cooperation and Development (OECD) countries, limit their fiscal deficits. Financial markets support government efforts to reduce deficit...fiscal consolidation efforts and developing medium-term budgetary frameworks for fiscal planning. Fiscal consolidation efforts, however, generally...require policymakers to weigh the effects of various policy trade-offs, including the trade-off between adopting stringent, but enforceable, rules-based
NASA Astrophysics Data System (ADS)
Wibawa, Teja A.; Lehodey, Patrick; Senina, Inna
2017-02-01
Geo-referenced catch and fishing effort data of the bigeye tuna fisheries in the Indian Ocean over 1952-2014 were analyzed and standardized to facilitate population dynamics modeling studies. During this 62-year historical period of exploitation, many changes occurred both in the fishing techniques and the monitoring of activity. This study includes a series of processing steps used for standardization of spatial resolution, conversion and standardization of catch and effort units, raising of geo-referenced catch to the nominal catch level, screening and correction of outliers, and detection of major catchability changes over long time series of fishing data, i.e., the Japanese longline fleet operating in the tropical Indian Ocean. A total of 30 fisheries were finally determined from longline, purse seine and other-gears data sets, of which 10 longline and 4 purse seine fisheries represented 96% of the whole historical geo-referenced catch. Nevertheless, one-third of the total nominal catch is still not included due to a total lack of geo-referenced information and would need to be processed separately, according to the requirements of the study. The geo-referenced records of catch, fishing effort and associated length frequency samples of all fisheries are available at doi:10.1594/PANGAEA.864154.
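The "raising" step mentioned above scales geo-referenced records so that their stratum total matches the nominal (fleet-reported) catch while preserving the spatial pattern. A minimal sketch with hypothetical grid-cell values (cell names and tonnages are invented for illustration):

```python
def raise_catch(georef, nominal_total):
    """Scale geo-referenced catch records so they sum to the nominal
    total for the same stratum (e.g. fleet x year), preserving the
    spatial pattern. `georef` maps grid cell -> catch in tonnes."""
    georef_total = sum(georef.values())
    if georef_total == 0:
        raise ValueError("no geo-referenced catch to raise")
    factor = nominal_total / georef_total
    return {cell: c * factor for cell, c in georef.items()}

# Hypothetical 5x5-degree cells reporting 200 t against a 250 t nominal total:
cells = {"cell_A": 120.0, "cell_B": 60.0, "cell_C": 20.0}
raised = raise_catch(cells, nominal_total=250.0)
# sum(raised.values()) == 250.0; each cell scaled by the same factor 1.25
```

Strata with no geo-referenced records at all cannot be raised this way, which is why the remaining one-third of nominal catch must be handled separately.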
Folmer, Amy S; Cole, David A; Sigal, Amanda B; Benbow, Lovisa D; Satterwhite, Lindsay F; Swygert, Katherine E; Ciesla, Jeffrey A
2008-02-01
Building on Nicholls's earlier work, we examined developmental changes in children's understanding of effort and ability when faced with a negative outcome. In a sample of 166 children and adolescents (ages 5-15 years), younger children conflated the meaning of effort and ability, explaining that smart students work hard, whereas older children understood effort and ability to be reciprocally related constructs, explaining that smart students do not need to work as hard. Understanding the reciprocal relation between effort and ability was correlated with age. Age-related changes in the meaning and correlates of effort and ability were also examined. Developmental implications for attribution theory and achievement motivation are discussed.
Folmer, Amy S.; Cole, David A.; Sigal, Amanda B.; Benbow, Lovisa D.; Satterwhite, Lindsay F.; Swygert, Katherine E.; Ciesla, Jeffrey A.
2008-01-01
Building upon Nicholls' (1978) work, we examined developmental changes in children's understanding of effort and ability when faced with a negative outcome. In a sample of 166 children and adolescents (ages 5 to 15), younger children conflated the meaning of effort and ability, explaining that smart students work hard; whereas older children understood effort and ability to be reciprocally related constructs, explaining that smart students do not have to work as hard. Understanding the reciprocal relation between effort and ability was correlated with age. Age-related changes in the meaning and correlates of effort and ability were also examined. Developmental implications for attribution theory and achievement motivation are discussed. PMID:18067917
Izawa, Shuhei; Tsutsumi, Akizumi; Ogawa, Namiko
2016-10-01
Accumulating evidence shows that effort-reward imbalance (ERI) at work can cause various health problems. However, few studies have investigated the biological pathways linking ERI and health outcomes, and their findings have been inconsistent. In this study, we investigated the associations between ERI, the hypothalamic-pituitary-adrenocortical axis, and inflammation in a sample of police officers. One hundred forty-two male police officers that were engaged in a working system of 24-h shifts were followed up during the work shift as well as during the two subsequent work-free days. Throughout this period, the participants provided two saliva samples each day for the 3-day period, and we measured the concentrations of cortisol and C-reactive protein (CRP) in the saliva. The police officers also completed the Japanese short version of the Effort-Reward Imbalance Questionnaire. The results of linear mixed model analyses controlled for possible confounding variables indicated that higher effort scores (p = 0.031) as well as effort-reward ratio (p = 0.080) were associated with lower cortisol levels, and the effect of effort was strengthened in the younger police officers (p = 0.017). Furthermore, higher effort scores were associated with higher CRP levels in younger police officers (p = 0.037). Our results indicate that effort, a component of ERI, has physiological effects in younger police officers, which possibly contribute to the development of stress-related diseases.
Olives, Casey; Valadez, Joseph J; Pagano, Marcello
2014-03-01
To assess the bias incurred when curtailment of Lot Quality Assurance Sampling (LQAS) is ignored, to present unbiased estimators, to consider the impact of cluster sampling by simulation and to apply our method to published polio immunization data from Nigeria. We present estimators of coverage when using two kinds of curtailed LQAS strategies: semicurtailed and curtailed. We study the proposed estimators with independent and clustered data using three field-tested LQAS designs for assessing polio vaccination coverage, with samples of size 60 and decision rules of 9, 21 and 33, and compare them to biased maximum likelihood estimators. Lastly, we present estimates of polio vaccination coverage from previously published data in 20 local government authorities (LGAs) from five Nigerian states. Simulations illustrate substantial bias if one ignores the curtailed sampling design. Proposed estimators show no bias. Clustering does not affect the bias of these estimators. Across simulations, standard errors show signs of inflation as clustering increases. Neither sampling strategy nor LQAS design influences estimates of polio vaccination coverage in 20 Nigerian LGAs. When coverage is low, semicurtailed LQAS strategies considerably reduce the sample size required to make a decision. Curtailed LQAS designs further reduce the sample size when coverage is high. Results presented dispel the misconception that curtailed LQAS data are unsuitable for estimation. These findings augment the utility of LQAS as a tool for monitoring vaccination efforts by demonstrating that unbiased estimation using curtailed designs is not only possible but these designs also reduce the sample size. © 2014 John Wiley & Sons Ltd.
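A curtailed rule of the kind studied (n = 60, decision rule d = 9) stops as soon as the classification is certain: accept once d successes are seen, reject once more than n − d failures accumulate. The sketch below, an assumed illustration rather than the paper's estimator, simulates the sample-size savings; the naive proportion from a curtailed run is exactly the biased quantity the authors correct:

```python
import random

def curtailed_lqas(p, n=60, d=9, seed=None):
    """Curtailed LQAS: sample one subject at a time and stop as soon as
    the accept/reject decision is certain. Accept the lot if at least d
    of n are 'successes' (e.g. vaccinated).
    Returns (accept, successes, samples_used)."""
    rng = random.Random(seed)
    succ = fail = 0
    while succ < d and fail <= n - d:   # decision still undetermined
        if rng.random() < p:
            succ += 1
        else:
            fail += 1
    return succ >= d, succ, succ + fail

# With 90% coverage the accept decision arrives after roughly d / p draws:
runs = [curtailed_lqas(0.9, seed=s) for s in range(2000)]
mean_n = sum(r[2] for r in runs) / len(runs)
# mean_n is far below 60; note succ / samples_used is a biased coverage
# estimate under this stopping rule, motivating the unbiased estimators.
```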
Hosking, Jay G; Cocker, Paul J; Winstanley, Catharine A
2016-04-01
Personal success often necessitates expending greater effort for greater reward but, equally important, also requires judicious use of our limited cognitive resources (e.g., attention). Previous animal models have shown that the prelimbic (PL) and infralimbic (IL) regions of the prefrontal cortex (PFC) are not involved in (physical) effort-based choice, whereas human studies have demonstrated PFC contributions to (mental) effort. Here, we utilize the rat Cognitive Effort Task (rCET) to probe PFC's role in effort-based decision making. In the rCET, animals can choose either an easy trial, where the attentional demand is low but the reward (sugar) is small or a difficult trial on which both the attentional demand and reward are greater. Temporary inactivation of PL and IL decreased all animals' willingness to expend mental effort and increased animals' distractibility; PL inactivations more substantially affected performance (i.e., attention), whereas IL inactivations increased motor impulsivity. These data imply that the PFC contributes to attentional resources, and when these resources are diminished, animals shift their choice (via other brain regions) accordingly. Thus, one novel therapeutic approach to deficits in effort expenditure may be to focus on the resources that such decision making requires, rather than the decision-making process per se. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Anticipating cognitive effort: roles of perceived error-likelihood and time demands.
Dunn, Timothy L; Inzlicht, Michael; Risko, Evan F
2017-11-13
Why are some actions evaluated as effortful? In the present set of experiments we address this question by examining individuals' perception of effort when faced with a trade-off between two putative cognitive costs: how much time a task takes vs. how error-prone it is. Specifically, we were interested in whether individuals anticipate engaging in a small amount of hard work (i.e., low time requirement, but high error-likelihood) vs. a large amount of easy work (i.e., high time requirement, but low error-likelihood) as being more effortful. In between-subject designs, Experiments 1 through 3 demonstrated that individuals anticipate options that are high in perceived error-likelihood (yet less time consuming) as more effortful than options that are perceived to be more time consuming (yet low in error-likelihood). Further, when asked to evaluate which of the two tasks was (a) more effortful, (b) more error-prone, and (c) more time consuming, effort-based and error-based choices closely tracked one another, but this was not the case for time-based choices. Utilizing a within-subject design, Experiment 4 demonstrated an overall similar pattern of judgments to Experiments 1 through 3. However, both judgments of error-likelihood and time demand similarly predicted effort judgments. Results are discussed within the context of extant accounts of cognitive control, with considerations of how error-likelihood and time demands may independently and conjunctively factor into judgments of cognitive effort.
17 CFR 38.252 - Additional requirements for physical-delivery contracts.
Code of Federal Regulations, 2013 CFR
2013-04-01
... commodity and show a good-faith effort to resolve conditions that are interfering with convergence; and (b...-faith effort to resolve conditions that threaten the adequacy of supplies or the delivery process. ...
Illinois highway materials sustainability efforts of 2014.
DOT National Transportation Integrated Search
2015-08-01
This report presents the 2014 sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling reclaimed materials in highway construction. This report meets the requirements of Illinois : Public Act 097-0314 by documenting I...
Illinois highway materials sustainability efforts of 2013.
DOT National Transportation Integrated Search
2014-08-01
This report presents the sustainability efforts of the Illinois Department of Transportation (IDOT) in : recycling and reclaiming materials for use in highway construction. This report meets the requirements of : Illinois Public Act 097-0314 by docum...
Illinois highway materials sustainability efforts of 2015.
DOT National Transportation Integrated Search
2016-08-01
This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2015. This report meets the requirements of Illinois Publ...
Illinois highway materials sustainability efforts of 2016.
DOT National Transportation Integrated Search
2017-07-04
This report provides a summary of the sustainability efforts of the Illinois Department of Transportation (IDOT) in recycling : reclaimed materials in highway construction during calendar year 2016. This report meets the requirements of Illinois Publ...
Farnsworth, Matthew L.; Kendall, William L.; Doherty, Paul F.; Miller, Ryan S.; White, Gary C.; Nichols, James D.; Burnham, Kenneth P.; Franklin, Alan B.; Majumdar, S.; Brenner, F.J.; Huffman, J.E.; McLean, R.G.; Panah, A.I.; Pietrobon, P.J.; Keeler, S.P.; Shive, S.
2011-01-01
Introduction of Asian strain H5N1 highly pathogenic avian influenza via waterfowl migration is one potential route of entry into the United States. In conjunction with state, tribal, and laboratory partners, the United States Department of Agriculture collected and tested 124,603 wild bird samples in 2006 as part of a national surveillance effort. A sampling plan was devised to increase the probability of detecting Asian strain H5N1 at a national scale. Band recovery data were used to identify and prioritize sampling for wild migratory waterfowl, resulting in spatially targeted sampling recommendations focused on areas with high numbers of recoveries. We also compared the spatial and temporal distribution of the 2006 cloacal and fecal waterfowl sampling effort to the bird banding recovery data and found concordance between the two. Finally, we present improvements made to the 2007 fecal sampling component of the surveillance plan and suggest further improvements for future sampling.
Ocean OSSEs: recent developments and future challenges
NASA Astrophysics Data System (ADS)
Kourafalou, V. H.
2012-12-01
Atmospheric OSSEs have had a much longer history of applications than OSSEs (and OSEs) in oceanography. Long-standing challenges include the presence of coastlines and steep bathymetric changes, which require the superposition of a wide variety of space and time scales, leading to difficulties in ocean observation and prediction. For instance, remote sensing is critical for providing a quasi-synoptic oceanographic view, but its coverage is limited to the ocean surface. Conversely, in situ measurements can monitor the entire water column, but at a single location and usually for a specific, short time. Despite these challenges, substantial progress has been made in recent years, and international initiatives have provided successful OSSE/OSE examples and formed appropriate forums that helped define the future roadmap. These will be discussed, together with various challenges that require a community effort. Examples include: integrated (remote and in situ) observing system requirements for monitoring large-scale and climatic changes vs. short-term variability that is particularly important on regional and coastal spatial scales; satisfying the needs of both global and regional/coastal nature runs, from development to rigorous evaluation and under a clear definition of metrics; data assimilation in the presence of tides; and estimation of real-time river discharges for Earth system modeling. An overview of oceanographic efforts that complement the standard OSSE methodology will also be given. These include ocean array design methods, such as representer-based analysis and adaptive sampling.
Exciting new opportunities for both global and regional ocean OSSE/OSE studies have recently become possible with targeted periods of comprehensive data sets, such as the existing Gulf of Mexico observations from multiple sources in the aftermath of the DeepWater Horizon incident and the upcoming airborne AirSWOT, in preparation for the SWOT (Surface Water and Ocean Topography) mission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Christopher A.; Martinez, Alonzo; McNamara, Bruce K.
International Atomic Energy Agency (IAEA) safeguard verification measures in gaseous centrifuge enrichment plants (GCEPs) rely on environmental sampling, non-destructive assay (NDA), and destructive assay (DA) sampling and analysis to determine uranium enrichment. UF6 bias defect measurements are made by DA sampling and analysis to assure that enrichment is consistent with declarations. DA samples are collected from a limited number of cylinders for high-precision, offsite mass spectrometer analysis. Samples are typically drawn from a sampling tap into a UF6 sample bottle, then packaged, sealed, and shipped under IAEA chain of custody to an offsite analytical laboratory. Future DA safeguard measures may require improvements in efficiency and effectiveness as GCEP capacities increase and UF6 shipping regulations become increasingly restrictive. The Pacific Northwest National Laboratory (PNNL) DA sampler concept and Laser Ablation Absorption Ratio Spectrometry (LAARS) assay method are under development to potentially provide DA safeguard tools that increase inspection effectiveness and reduce sample shipping constraints. The PNNL DA sampler concept uses a handheld sampler to collect DA samples for either onsite LAARS assay or offsite laboratory analysis. The DA sampler design will use a small sampling planchet coated with an adsorptive film to collect controlled quantities of UF6 gas directly from a cylinder or process sampling tap. Development efforts are currently underway at PNNL to enhance LAARS assay performance to allow high-precision onsite bias defect measurements. In this paper, we report on the experimental investigation to develop adsorptive films for the PNNL DA sampler concept. These films are intended to efficiently capture UF6 and then stabilize the collected DA sample prior to onsite LAARS or offsite laboratory analysis. Several porous material composite films were investigated, including a film designed to maximize the chemical adsorption and binding of gaseous UF6 onto the sampling planchet.
Perception that "everything requires a lot of effort": transcultural SCL-25 item validation.
Moreau, Nicolas; Hassan, Ghayda; Rousseau, Cécile; Chenguiti, Khalid
2009-09-01
This brief report illustrates how the migration context can affect specific item validity of mental health measures. The SCL-25 was administered to 432 recently settled immigrants (220 Haitians and 212 Arabs). We performed descriptive analyses, as well as Infit and Outfit statistics analyses using WINSTEPS Rasch Measurement Software based on Item Response Theory. The participants' comments about the item You feel everything requires a lot of effort in the SCL-25 were also qualitatively analyzed. Results revealed that the item You feel everything requires a lot of effort is an outlier and does not adjust in an expected and valid fashion with its cluster items, as it is over-endorsed by Haitian and Arab healthy participants. Our study thus shows that, in transcultural mental health research, the cultural and migratory contexts may interact and significantly influence the meaning of some symptom items and consequently, the validity of symptom scales.
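For the dichotomous Rasch model referenced above, an item's Outfit statistic is the mean squared standardized residual across respondents (expected score p, variance p(1 − p)); values well above 1 flag items that misfit, as an over-endorsed item would. A minimal sketch with made-up probabilities, not the study's data:

```python
def outfit_msq(responses, expected, variance):
    """Item Outfit mean-square: average squared standardized residual
    over respondents (dichotomous Rasch: E = p, Var = p * (1 - p))."""
    z2 = [(x - e) ** 2 / v for x, e, v in zip(responses, expected, variance)]
    return sum(z2) / len(z2)

# Made-up item data: 3 respondents and their model-expected probabilities
p = [0.8, 0.3, 0.6]
msq = outfit_msq([1, 0, 1], p, [q * (1 - q) for q in p])  # ≈ 0.45: no misfit
```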
Pilliod, David S.; Arkle, Robert S.
2013-01-01
Resource managers and scientists need efficient, reliable methods for quantifying vegetation to conduct basic research, evaluate land management actions, and monitor trends in habitat conditions. We examined three methods for quantifying vegetation in 1-ha plots among different plant communities in the northern Great Basin: photography-based grid-point intercept (GPI), line-point intercept (LPI), and point-quarter (PQ). We also evaluated each method for within-plot subsampling adequacy and effort requirements relative to information gain. We found that, for most functional groups, percent cover measurements collected with the use of LPI, GPI, and PQ methods were strongly correlated. These correlations were even stronger when we used data from the upper canopy only (i.e., top “hit” of pin flags) in LPI to estimate cover. PQ was best at quantifying cover of sparse plants such as shrubs in early successional habitats. As cover of a given functional group decreased within plots, the variance of the cover estimate increased substantially, which required more subsamples per plot (i.e., transect lines, quadrats) to achieve reliable precision. For GPI, we found that six to nine quadrats per hectare were sufficient to characterize the vegetation in most of the plant communities sampled. All three methods reasonably characterized the vegetation in our plots, and each has advantages depending on characteristics of the vegetation, such as cover or heterogeneity, study goals, precision of measurements required, and efficiency needed.
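The subsampling-adequacy question above, how many transects or quadrats until the plot mean is precise enough, is often answered with a normal-approximation sample-size formula, n ≈ (z·s / (E·x̄))² for a target relative error E. This is a generic sketch under that assumption, not necessarily the authors' procedure, with hypothetical per-transect cover values:

```python
import math
import statistics

def required_subsamples(cover_values, rel_error=0.2, z=1.96):
    """Subsamples needed so the mean percent-cover estimate falls within
    +/- rel_error of the mean with ~95% confidence (normal approx.)."""
    mean = statistics.mean(cover_values)
    sd = statistics.stdev(cover_values)
    if mean == 0:
        raise ValueError("no cover recorded; formula undefined")
    return math.ceil((z * sd / (rel_error * mean)) ** 2)

# Hypothetical per-transect shrub cover (%) from LPI top hits in one plot:
transects = [12.0, 8.0, 15.0, 10.0, 9.0]
n_needed = required_subsamples(transects)
```

Because the formula scales with (s/x̄)², sparse functional groups with low mean cover but similar absolute spread demand many more subsamples, matching the pattern the authors report.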
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pochan, M.J.; Massey, M.J.
1979-02-01
This report discusses the results of actual raw product gas sampling efforts and includes: rationale for raw product gas sampling efforts; design and operation of the CMU gas sampling train; development and analysis of a sampling train data base; and conclusions and future application of results. The results of sampling activities at the CO2-Acceptor and Hygas pilot plants proved that: the CMU gas sampling train is a valid instrument for characterization of environmental parameters in coal gasification gas-phase process streams; depending on the particular process configuration, the CMU gas sampling train can reduce gasifier effluent characterization activity to a single location in the raw product gas line; and in contrast to the slower operation of the EPA SASS Train, CMU's gas sampling train can collect representative effluent data at a rapid rate (approx. 2 points per hour) consistent with the rate of change of process variables, and thus function as a tool for process engineering-oriented analysis of environmental characteristics.
Pathfinder autonomous rendezvous and docking project
NASA Technical Reports Server (NTRS)
Lamkin, Stephen (Editor); Mccandless, Wayne (Editor)
1990-01-01
Capabilities are being developed and demonstrated to support manned and unmanned vehicle operations in lunar and planetary orbits. In this initial phase, primary emphasis is placed on definition of the system requirements for candidate Pathfinder mission applications and correlation of these system-level requirements with specific requirements. The FY-89 activities detailed are best characterized as foundation building. The majority of the efforts were dedicated to assessing the current state of the art, identifying desired elaborations and expansions to this level of development and charting a course that will realize the desired objectives in the future. Efforts are detailed across all work packages in developing those requirements and tools needed to test, refine, and validate basic autonomous rendezvous and docking elements.
Conceptual design studies of control and instrumentation systems for ignition experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicholson, P.J.; Dewolf, J.B.; Heinemann, P.C.
1978-03-01
Studies at the Charles Stark Draper Laboratory in the past year were a continuation of prior studies of control and instrumentation systems for current and next generation Tokamaks. Specifically, the FY 77 effort has focused on the following two main efforts: (1) control requirements--(a) defining and evolving control requirements/concepts for a prototype experimental power reactor(s), and (b) defining control requirements for diverters and mirror machines, specifically the MX; and (2) defining requirements and scoping design for a functional control simulator. Later in the year, a small additional task was added: (3) providing analysis and design support to INESCO for its low-cost fusion power system, FPC/DMT.
Alternative Reinforcer Response Cost Impacts Cocaine Choice in Humans
Stoops, William W.; Lile, Joshua A.; Glaser, Paul E.A.; Hays, Lon R.; Rush, Craig R.
2011-01-01
Cocaine use disorders are an unrelenting public health concern. Behavioral treatments reduce cocaine use by providing non-drug alternative reinforcers. The purpose of this human laboratory experiment was to determine how response cost for non-drug alternative reinforcers influenced cocaine choice. Seven cocaine-using, non-treatment-seeking subjects completed a crossover, double-blind protocol in which they first sampled doses of intranasal cocaine (5, 10, 20 or 30 mg) and completed a battery of subject-rated and physiological measures. Subjects then made eight discrete choices between the sampled dose and an alternative reinforcer (US$0.25). The response cost to earn a cocaine dose was always a fixed ratio (FR) of 100 responses. The response cost for the alternative reinforcer varied across sessions (FR1, FR10, FR100, FR1000). Dose-related increases were observed for cocaine choice. Subjects made fewer drug choices when the FR requirements for the alternative reinforcers were lower than that for drug relative to when the FR requirements were equal to or higher than that for drug. Intranasal cocaine also produced prototypical stimulant-like subject-rated and physiological effects (e.g., increased ratings of Like Drug; elevated blood pressure). These data demonstrate that making alternative reinforcers easier to earn reduces cocaine self-administration, which has implications for treatment efforts. PMID:22015480
Boja, Emily S; Rodriguez, Henry
2012-04-01
Traditional shotgun proteomics, used to detect a mixture of hundreds to thousands of proteins through mass spectrometric analysis, has been the standard approach in research to profile the protein content of a biological sample, and can lead to the discovery of new (and, in principle, all) protein candidates with diagnostic, prognostic, and therapeutic value. In practice, this approach requires significant resources and time, and does not necessarily represent the goal of the researcher, who may prefer to study a subset of such discovered proteins (including their variants or posttranslational modifications) under different biological conditions. In this context, targeted proteomics is playing an increasingly important role in the accurate measurement of protein targets in biological samples, in the hope of elucidating the molecular mechanisms of cellular function via the understanding of intricate protein networks and pathways. One such targeted approach, selected reaction monitoring (or multiple reaction monitoring) mass spectrometry (MRM-MS), offers the capability of measuring multiple proteins with higher sensitivity and throughput than shotgun proteomics. Developing and validating MRM-MS-based assays, however, is an extensive and iterative process, requiring a coordinated and collaborative effort by the scientific community through the sharing of publicly accessible data and datasets, bioinformatic tools, standard operating procedures, and well-characterized reagents. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Torres, Leticia; Liu, Yue; Guitreau, Amy; Yang, Huiping; Tiersch, Terrence R
2017-12-01
Quality control (QC) is essential for reproducible and efficient functioning of germplasm repositories. However, many biomedical fish models present significant QC challenges due to small body sizes (<5 cm) and miniscule sperm volumes (<5 μL). Using minimal volumes of sperm, we used zebrafish to evaluate common QC endpoints as surrogates for fertilization success along sequential steps of cryopreservation. First, concentrations of calibration bead suspensions were evaluated with a Makler® counting chamber by using different sample volumes and mixing methods. For sperm analysis, samples were initially diluted at a 1:30 ratio with Hanks' balanced salt solution (HBSS). Motility was evaluated by using different ratios of sperm and activation medium, and membrane integrity was analyzed with flow cytometry at different concentrations. Concentration and sperm motility could be confidently estimated by using volumes as small as 1 μL, whereas membrane integrity required a minimum of 2 μL (at 1 × 10⁶ cells/mL). Thus, <5 μL of sperm suspension (after dilution to 30-150 μL with HBSS) was required to evaluate sperm quality by using three endpoints. Sperm quality assessment using a combination of complementary endpoints enhances QC efforts during cryopreservation, increasing reliability and reproducibility, and reducing waste of time and resources.
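The concentration arithmetic behind such small-volume counts is straightforward dilution back-calculation. A minimal sketch follows; the chamber grid volume and cell counts here are made-up illustration values, not the study's data, and only the 1:30 dilution comes from the abstract:

```python
def concentration_cells_per_ml(cells_counted, grid_volume_ul, dilution):
    """Back-calculate the undiluted sample concentration.

    cells_counted  : cells tallied in the counting-chamber grid
    grid_volume_ul : volume of diluted sample the grid represents, in uL
    dilution       : fold dilution applied before counting (30 for 1:30)
    """
    cells_per_ul_diluted = cells_counted / grid_volume_ul
    return cells_per_ul_diluted * dilution * 1000.0  # uL -> mL

# e.g. 100 cells counted in a hypothetical 0.1-uL grid volume,
# at the paper's 1:30 dilution:
conc = concentration_cells_per_ml(100, 0.1, 30)
```

The same calculation underlies any chamber-based count; only the grid volume constant changes between chamber designs.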
Zhang, Xing; Romm, Michelle; Zheng, Xueyun; ...
2016-12-29
Characterization of endogenous metabolites and xenobiotics is essential to deconvoluting the genetic and environmental causes of disease. However, surveillance of chemical exposure and disease-related changes in large cohorts requires an analytical platform that offers rapid measurement, high sensitivity, efficient separation, broad dynamic range, and application to an expansive chemical space. Here, we present a novel platform for small molecule analyses that addresses these requirements by combining solid-phase extraction with ion mobility spectrometry and mass spectrometry (SPE-IMS-MS). This platform is capable of performing both targeted and global measurements of endogenous metabolites and xenobiotics in human biofluids with high reproducibility (CV ≤ 3%), sensitivity (LODs in the pM range in biofluids) and throughput (10-s sample-to-sample duty cycle). We report application of this platform to the analysis of human urine from patients with and without type 1 diabetes, where we observed statistically significant variations in the concentration of disaccharides and previously unreported chemical isomers. This SPE-IMS-MS platform overcomes many of the current challenges of large-scale metabolomic and exposomic analyses and offers a viable option for population and patient cohort screening in an effort to gain insights into disease processes and human environmental chemical exposure.
Zhang, Xing; Romm, Michelle; Zheng, Xueyun; Zink, Erika M; Kim, Young-Mo; Burnum-Johnson, Kristin E; Orton, Daniel J; Apffel, Alex; Ibrahim, Yehia M; Monroe, Matthew E; Moore, Ronald J; Smith, Jordan N; Ma, Jian; Renslow, Ryan S; Thomas, Dennis G; Blackwell, Anne E; Swinford, Glenn; Sausen, John; Kurulugama, Ruwan T; Eno, Nathan; Darland, Ed; Stafford, George; Fjeldsted, John; Metz, Thomas O; Teeguarden, Justin G; Smith, Richard D; Baker, Erin S
2016-12-01
Characterization of endogenous metabolites and xenobiotics is essential to deconvoluting the genetic and environmental causes of disease. However, surveillance of chemical exposure and disease-related changes in large cohorts requires an analytical platform that offers rapid measurement, high sensitivity, efficient separation, broad dynamic range, and application to an expansive chemical space. Here, we present a novel platform for small molecule analyses that addresses these requirements by combining solid-phase extraction with ion mobility spectrometry and mass spectrometry (SPE-IMS-MS). This platform is capable of performing both targeted and global measurements of endogenous metabolites and xenobiotics in human biofluids with high reproducibility (CV ≤ 3%), sensitivity (LODs in the pM range in biofluids) and throughput (10-s sample-to-sample duty cycle). We report application of this platform to the analysis of human urine from patients with and without type 1 diabetes, where we observed statistically significant variations in the concentration of disaccharides and previously unreported chemical isomers. This SPE-IMS-MS platform overcomes many of the current challenges of large-scale metabolomic and exposomic analyses and offers a viable option for population and patient cohort screening in an effort to gain insights into disease processes and human environmental chemical exposure.
Zhang, Xing; Romm, Michelle; Zheng, Xueyun; Zink, Erika M.; Kim, Young-Mo; Burnum-Johnson, Kristin E.; Orton, Daniel J.; Apffel, Alex; Ibrahim, Yehia M.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Jordan N.; Ma, Jian; Renslow, Ryan S.; Thomas, Dennis G.; Blackwell, Anne E.; Swinford, Glenn; Sausen, John; Kurulugama, Ruwan T.; Eno, Nathan; Darland, Ed; Stafford, George; Fjeldsted, John; Metz, Thomas O.; Teeguarden, Justin G.; Smith, Richard D.; Baker, Erin S.
2017-01-01
Characterization of endogenous metabolites and xenobiotics is essential to deconvoluting the genetic and environmental causes of disease. However, surveillance of chemical exposure and disease-related changes in large cohorts requires an analytical platform that offers rapid measurement, high sensitivity, efficient separation, broad dynamic range, and application to an expansive chemical space. Here, we present a novel platform for small molecule analyses that addresses these requirements by combining solid-phase extraction with ion mobility spectrometry and mass spectrometry (SPE-IMS-MS). This platform is capable of performing both targeted and global measurements of endogenous metabolites and xenobiotics in human biofluids with high reproducibility (CV ≤ 3%), sensitivity (LODs in the pM range in biofluids) and throughput (10-s sample-to-sample duty cycle). We report application of this platform to the analysis of human urine from patients with and without type 1 diabetes, where we observed statistically significant variations in the concentration of disaccharides and previously unreported chemical isomers. This SPE-IMS-MS platform overcomes many of the current challenges of large-scale metabolomic and exposomic analyses and offers a viable option for population and patient cohort screening in an effort to gain insights into disease processes and human environmental chemical exposure. PMID:29276770
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toops, Todd J.; Bilheux, Hassina Z.; Voisin, Sophie
2013-08-19
This research describes the development and implementation of high-fidelity neutron imaging and the associated analysis of the images. This advanced capability allows the non-destructive, non-invasive imaging of particulate filters (PFs) and of how the deposition of particulate and catalytic washcoat occurs within the filter. The majority of the efforts described here were performed at the High Flux Isotope Reactor (HFIR) CG-1D neutron imaging beamline at Oak Ridge National Laboratory; the current spatial resolution is approximately 50 μm. The sample holder is equipped with a high-precision rotation stage that allows 3D imaging (i.e., computed tomography) of the sample when combined with computerized reconstruction tools. What enables neutron-based imaging is the ability of some elements to absorb or scatter neutrons while other elements allow neutrons to pass through them with negligible interaction. Of particular interest in this study is the scattering of neutrons by hydrogen-containing molecules, such as hydrocarbons (HCs) and/or water, which are adsorbed to the surface of soot, ash and catalytic washcoat. Even so, the interactions with this adsorbed water/HC are low, and computational techniques were required to enhance the contrast, primarily a modified simultaneous iterative reconstruction technique (SIRT). Lastly, this effort describes the following systems: particulate randomly distributed in a PF, ash deposition in PFs, a catalyzed washcoat layer in a PF, and three particulate loadings in a SiC PF.
Veterinary public health capacity in the United States: opportunities for improvement.
Jarman, Dwayne W; Liang, Jennifer L; Luce, Richard R; Wright, Jennifer G; Stennies, Gail M; Bisgard, Kristine M
2011-01-01
In 2006, the Association of American Veterinary Medical Colleges reported that the shortage (≥ 1,500) of public health veterinarians is expected to increase tenfold by 2020. In 2008, the Centers for Disease Control and Prevention (CDC) Preventive Medicine Fellows conducted a pilot project among CDC veterinarians to identify national veterinary public health workforce concerns and potential policy strategies. Fellows surveyed a convenience sample (19/91) of public health veterinarians at CDC to identify veterinary workforce recruitment and retention problems faced by federal agencies; responses were categorized into themes. A focus group (20/91) of staff veterinarians subsequently prioritized the categorized themes from least to most important. Participants identified activities to address the three recruitment concerns with the highest combined weight. Participants identified the following three highest prioritized problems faced by federal agencies when recruiting veterinarians to public health: (1) lack of awareness of veterinarians' contributions to public health practice, (2) competitive salaries, and (3) employment and training opportunities. Similarly, key concerns identified regarding retention of public health practice veterinarians included: (1) lack of recognition of veterinary qualifications, (2) competitive salaries, and (3) seamless integration of veterinary and human public health. Findings identified multiple barriers that can affect recruitment and retention of veterinarians engaged in public health practice. Next steps should include replicating project efforts among a national sample of public health veterinarians. A committed and determined long-term effort might be required to sustain initiatives and policy proposals to increase U.S. veterinary public health capacity.
Auto-Gopher-II: an autonomous wireline rotary-hammer ultrasonic drill
NASA Astrophysics Data System (ADS)
Badescu, Mircea; Lee, Hyeong Jae; Sherrit, Stewart; Bao, Xiaoqi; Bar-Cohen, Yoseph; Jackson, Shannon; Chesin, Jacob; Zacny, Kris; Paulsen, Gale L.; Mellerowicz, Bolek; Kim, Daniel
2017-04-01
Developing technologies that would enable future NASA exploration missions to penetrate deeper into the subsurface of planetary bodies for sample collection is of great importance. Performing these tasks while using minimal mass/volume systems and with low energy consumption is another set of requirements imposed on such technologies. A deep drill, called Auto-Gopher II, is currently being developed as a joint effort between JPL's NDEAA laboratory and Honeybee Robotics Corp. The Auto-Gopher II is a wireline rotary-hammer drill that combines formation breaking by hammering using an ultrasonic actuator and cuttings removal by rotating a fluted auger bit. The hammering mechanism is based on the Ultrasonic/Sonic Drill/Corer (USDC) mechanism that has been developed as an adaptable tool for many drilling and coring applications. The USDC uses an intermediate free-flying mass to transform high frequency vibrations of a piezoelectric transducer horn tip into sonic hammering of the drill bit. The USDC concept was used in a previous task to develop an Ultrasonic/Sonic Ice Gopher and then integrated into a rotary hammer device to develop the Auto-Gopher-I. The lessons learned from these developments are being integrated into the development of the Auto-Gopher-II, an autonomous deep wireline drill with integrated cuttings and sample management and drive electronics. Subsystems of the wireline drill are being developed in parallel at JPL and Honeybee Robotics Ltd. This paper presents the development efforts of the piezoelectric actuator, cuttings removal and retention flutes and drive electronics.
Log sampling methods and software for stand and landscape analyses.
Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough
2008-01-01
We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...
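For the line-intersect method mentioned above, downed-log volume is commonly estimated with Van Wagner's (1968) estimator. A minimal sketch follows; the diameters and transect length are illustrative values, and readers should confirm the exact estimator form against the authors' methods and software:

```python
import math

def log_volume_m3_per_ha(diameters_cm, transect_length_m):
    """Van Wagner line-intersect estimator of downed-log volume.

    V (m^3/ha) = pi^2 / (8 * L) * sum(d_i^2), with intersection
    diameters d_i in cm and total transect length L in m.
    """
    return math.pi ** 2 / (8 * transect_length_m) * sum(
        d ** 2 for d in diameters_cm
    )

# Three logs crossed on a hypothetical 100-m transect:
vol = log_volume_m3_per_ha([12.0, 25.0, 8.0], 100.0)
```

Because the estimator depends only on diameters at the crossing points, field effort scales with transect length rather than with measuring every log, which is the efficiency argument the abstract makes.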
45 CFR 158.244 - Unclaimed rebates.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Unclaimed rebates. An issuer must make a good faith effort to locate and deliver to an enrollee any rebate required under this Part. If, after making a good faith effort, an issuer is unable to locate a former...
45 CFR 158.244 - Unclaimed rebates.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Unclaimed rebates. An issuer must make a good faith effort to locate and deliver to an enrollee any rebate required under this Part. If, after making a good faith effort, an issuer is unable to locate a former...
45 CFR 158.244 - Unclaimed rebates.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Unclaimed rebates. An issuer must make a good faith effort to locate and deliver to an enrollee any rebate required under this Part. If, after making a good faith effort, an issuer is unable to locate a former...
45 CFR 158.244 - Unclaimed rebates.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Unclaimed rebates. An issuer must make a good faith effort to locate and deliver to an enrollee any rebate required under this Part. If, after making a good faith effort, an issuer is unable to locate a former...
Marko-Varga, György; Végvári, Ákos; Welinder, Charlotte; Lindberg, Henrik; Rezeli, Melinda; Edula, Goutham; Svensson, Katrin J; Belting, Mattias; Laurell, Thomas; Fehniger, Thomas E
2012-11-02
Biobanks are a major resource to access and measure biological constituents that can be used to monitor the status of health and disease, both in unique individual samples and within populations. Most "omic" activities rely on access to these collections of stored samples to provide the basis for establishing the ranges and frequencies of expression. Furthermore, information about the relative abundance and form of protein constituents found in stored samples provides an important historical index for comparative studies of inherited, epidemic, and developing disease. Standardization of sample quality, form, and analysis is an important unmet need and requirement for gaining the full benefit from collected samples. Coupled to this standard is the provision of annotation describing clinical status and metadata of measurements of clinical phenotype that characterize the sample. Today we have not yet achieved consensus on how to collect, manage, and build biobank archives in order to reach goals where these efforts are translated into value for the patient. Several initiatives (OBBR, ISBER, BBMRI) that disseminate best-practice examples for biobanking are expected to play an important role in preserving the integrity of biosamples stored for periods that reach one or several decades. These developments will be of great value and importance to programs such as the Chromosome-Centric Human Proteome Project (C-HPP), which will associate protein expression in healthy and disease states with genetic foci along each of the human chromosomes.
Retention of African American Women in a Lifestyle Physical Activity Program
Buchholz, Susan W.; Wilbur, JoEllen; Schoeny, Michael E.; Fogg, Louis; Ingram, Diana M.; Miller, Arlene; Braun, Lynne
2015-01-01
Using a cohort of African American women enrolled in a physical activity program, this paper examines how well individual characteristics, neighborhood characteristics, and intervention participation predict study retention and the staff level of effort needed for retention. Secondary data analysis was conducted from a randomized clinical trial. Participants were women aged 40–65 years without major signs/symptoms of cardiovascular disease. Assessments were conducted at community sites in and bordering African American communities. Study retention was 90%. Of those retained, 24% required a moderate/high level of staff effort for retention. Retention was predicted by being older, having lower perceived neighborhood walkability, living in neighborhoods with greater disadvantage and crime, and having greater program participation. More staff effort was predicted by participants being younger, having more economic hardships, poorer health, or lower intervention participation. We may be able to identify people at baseline likely to require more staff effort to retain. PMID:26475680
Accommodating complexity and human behaviors in decision analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backus, George A.; Siirola, John Daniel; Schoenwald, David Alan
2007-11-01
This is the final report for a LDRD effort to address human behavior in decision support systems. One sister LDRD effort reports the extension of this work to include actual human choices and additional simulation analyses. Another provides the background for this effort and the programmatic directions for future work. This specific effort considered the feasibility of five aspects of model development required for analysis viability. To avoid the use of classified information, healthcare decisions and the system embedding them became the illustrative example for assessment.
NASA Astrophysics Data System (ADS)
Streuber, Gregg Mitchell
Environmental and economic factors motivate the pursuit of more fuel-efficient aircraft designs. Aerodynamic shape optimization is a powerful tool in this effort, but is hampered by the presence of multimodality in many design spaces. Gradient-based multistart optimization uses a sampling algorithm and multiple parallel optimizations to reliably apply fast gradient-based optimization to moderately multimodal problems. Ensuring that the sampled geometries remain physically realizable requires manually developing specialized linear constraints for each class of problem. Utilizing free-form deformation geometry control allows these linear constraints to be written in a geometry-independent fashion, greatly easing the process of applying the algorithm to new problems. This algorithm was used to assess the presence of multimodality when optimizing a wing in subsonic and transonic flows, under inviscid and viscous conditions, and a blended wing-body under transonic, viscous conditions. Multimodality was present in every wing case, while the blended wing-body was found to be generally unimodal.
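The multistart idea described above, sampling many start points, running a gradient-based optimizer from each, and counting the distinct optima that emerge, can be sketched on a toy double-well objective. Plain fixed-step gradient descent stands in here for the adjoint-based optimizers used in aerodynamic shape work, and the objective and start points are illustrative only:

```python
def gradient_descent(grad, x0, lr=0.01, steps=3000):
    """Fixed-step gradient descent from a single start point."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def multistart(grad, starts, tol=1e-3):
    """Run one local optimization per start; cluster converged optima.

    The number of distinct clusters is a crude multimodality gauge:
    one cluster suggests a unimodal design space, several suggest
    multiple basins of attraction.
    """
    optima = []
    for x0 in starts:
        x = gradient_descent(grad, x0)
        if not any(abs(x - o) < tol for o in optima):
            optima.append(x)
    return optima

# Double-well f(x) = (x^2 - 1)^2 has local minima at x = -1 and x = +1,
# so a multistart sweep should report exactly two basins:
grad = lambda x: 4.0 * x * (x ** 2 - 1.0)
minima = multistart(grad, [-2.0, -0.7, 0.3, 0.8, 1.9])
```

In the wing cases above, the analogous diagnostic is whether independent optimizations from sampled geometries converge to one shape or several; the blended wing-body's single cluster is what the abstract calls "generally unimodal".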