Options in Extraterrestrial Sample Handling and Study
NASA Technical Reports Server (NTRS)
Papanastassiou, Dimitri A.
2000-01-01
This presentation mentions important service functions such as sample preservation, hazard assessment, and handling. It also discusses how preliminary examination of samples is necessary for sample hazard assessment and for sample allocations. Clean facilities and clean sample handling are required. Conflicts and cross-contamination issues will arise and need to be resolved. Extensive experience is available in handling extraterrestrial samples and in studies of pathogenicity; it must be sought and applied as necessary. Advisory and oversight structures must also be in place.
An extension of command shaping methods for controlling residual vibration using frequency sampling
NASA Technical Reports Server (NTRS)
Singer, Neil C.; Seering, Warren P.
1992-01-01
The authors present an extension to the impulse shaping technique for commanding machines to move with reduced residual vibration. The extension, called frequency sampling, is a method for generating constraints that are used to obtain shaping sequences which minimize residual vibration in systems such as robots whose resonant frequencies change during motion. The authors present a review of impulse shaping methods, a development of the proposed extension, and a comparison of results of tests conducted on a simple model of the space shuttle robot arm. Frequency sampling provides a method for minimizing the impulse sequence duration required to give the desired insensitivity.
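To make the baseline concrete, here is a minimal sketch (ours, not from the paper) of the classic two-impulse zero-vibration shaper that impulse-shaping methods start from; the frequency-sampling extension generalizes the constraints to a band of frequencies. Mode parameters and time step are illustrative.

```python
import numpy as np

def zv_shaper(freq_hz, zeta, dt):
    """Two-impulse zero-vibration (ZV) shaper for one resonant mode.

    Amplitudes and spacing follow the classic impulse-shaping constraints:
    residual vibration of a second-order mode (freq_hz, zeta) excited by
    the shaped command cancels at the time of the final impulse.
    """
    wd = 2 * np.pi * freq_hz * np.sqrt(1 - zeta**2)   # damped frequency
    K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta**2))
    amps = np.array([1.0, K]) / (1.0 + K)             # impulse amplitudes
    t2 = np.pi / wd                                   # half damped period
    seq = np.zeros(int(round(t2 / dt)) + 1)
    seq[0], seq[-1] = amps
    return seq

# Shape a step command: convolve the impulse sequence with the raw input.
dt = 0.001
shaper = zv_shaper(freq_hz=0.5, zeta=0.05, dt=dt)     # illustrative mode
command = np.ones(5000)                               # unit step
shaped = np.convolve(command, shaper)[:command.size]
```

Convolving any reference command with this short impulse sequence delays it by half a damped period while cancelling residual vibration at the design frequency.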
Comparing three sampling techniques for estimating fine woody down dead biomass
Robert E. Keane; Kathy Gray
2013-01-01
Designing woody fuel sampling methods that quickly, accurately and efficiently assess biomass at relevant spatial scales requires extensive knowledge of each sampling method's strengths, weaknesses and tradeoffs. In this study, we compared various modifications of three common sampling methods (planar intercept, fixed-area microplot and photoload) for estimating...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-09
... requirements or power calculations that justify the proposed sample size, the expected response rate, methods... Notice of Request for a Revision to and Extension of Approval of an Information Collection; Qualitative... associated with qualitative customer and stakeholder feedback on service delivery by the Animal and Plant...
Curriculum Development for Professional Leaders in Extension Education.
ERIC Educational Resources Information Center
Findlay, Edward Weldon
The study is based on the premise that if one is able to identify the areas of behavior in which professionals require competence, one can link this behavior to a related structure of concepts which may serve as logical teaching and learning objectives in the development of training programs. A sample of 211 extension agents (in agriculture, home…
USDA-ARS?s Scientific Manuscript database
High-salt samples present a challenge to mass spectrometry (MS) analysis, particularly when electrospray ionization (ESI) is used, requiring extensive sample preparation steps such as desalting, extraction, and purification. In this study, infrared matrix-assisted laser desorption electrospray ioniz...
Baele, Guy; Lemey, Philippe; Vansteelandt, Stijn
2013-03-06
Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model's marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. We here assess the original 'model-switch' path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model's marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational effort to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation. PMID:23497171
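For orientation, the generic stepping-stone estimator of a single marginal likelihood can be sketched as follows; this is the building block the authors adapt for direct Bayes factor estimation, not their model-switch algorithm itself. It assumes log-likelihood samples have already been drawn by MCMC from each power posterior.

```python
import numpy as np
from scipy.special import logsumexp

def stepping_stone_log_ml(loglik_by_beta, betas):
    """Log marginal likelihood via stepping-stone sampling.

    loglik_by_beta[k] holds log-likelihoods of MCMC samples drawn from the
    power posterior proportional to p(D|theta)^beta_k * p(theta); betas runs
    from 0 to 1. Each stone estimates the ratio of normalizing constants
    between adjacent beta values, computed stably in log space.
    """
    log_ml = 0.0
    for k in range(len(betas) - 1):
        ll = np.asarray(loglik_by_beta[k])
        dbeta = betas[k + 1] - betas[k]
        log_ml += logsumexp(dbeta * ll) - np.log(ll.size)
    return log_ml

# A Bayes factor then follows from two such runs (one per model):
# log_bf = stepping_stone_log_ml(ll_m1, betas) - stepping_stone_log_ml(ll_m0, betas)
```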
Network Model-Assisted Inference from Respondent-Driven Sampling Data
Gile, Krista J.; Handcock, Mark S.
2015-01-01
Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
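As context (not the authors' model-assisted estimator), the standard design-based RDS estimator they improve upon weights each respondent inversely to reported network degree; a minimal sketch:

```python
import numpy as np

def volz_heckathorn_mean(y, degrees):
    """Design-based RDS estimator of a population mean.

    Respondents are weighted inversely to their reported network degree,
    reflecting the (approximate) probability of being reached by
    link-tracing. y: trait values; degrees: reported degrees (> 0).
    """
    y = np.asarray(y, dtype=float)
    w = 1.0 / np.asarray(degrees, dtype=float)
    return np.sum(w * y) / np.sum(w)

# Example: estimated prevalence of a binary trait in a hypothetical RDS sample
prev = volz_heckathorn_mean(y=[1, 0, 0, 1, 0], degrees=[12, 3, 7, 20, 5])
```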
Hanmer, Lyn; Nicola, Edward; Bradshaw, Debbie
2017-01-01
The quality of morbidity data in multiple routine inpatient records in a sample of South African hospitals is being assessed in terms of data accuracy and completeness. Extensive modification of available data collection tools was required to enable collection of the data needed for the study.
Sampling methods for microbiological analysis of red meat and poultry carcasses.
Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos
2004-06-01
Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes, as needed for effective hazard analysis critical control point (HACCP) implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between the destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.
Palmer, W G; Scholz, R C; Moorman, W J
1983-03-01
Sampling of complex mixtures of airborne contaminants for chronic animal toxicity tests often involves numerous sampling devices, requires extensive sampling time, and yields forms of collected materials unsuitable for administration to animals. A method is described which uses a high-volume wet venturi scrubber for collection of respirable fractions of emissions from iron foundry casting operations. The construction and operation of the sampler are presented, along with collection efficiency data and its application to the preparation of large quantities of samples to be administered to animals by intratracheal instillation.
JPL Contamination Control Engineering
NASA Technical Reports Server (NTRS)
Blakkolb, Brian
2013-01-01
JPL has extensive expertise fielding contamination-sensitive missions, in house and with our NASA/industry/academic partners: development and implementation of performance-driven cleanliness requirements for a wide range of missions and payloads - UV-Vis-IR: GALEX, Dawn, Juno, WFPC-II, AIRS, TES, et al. - propulsion, thermal control, and robotic sample acquisition systems. Contamination control engineering spans the mission life cycle: - system and payload requirements derivation, analysis, and contamination control implementation plans - hardware design, risk trades, requirements V&V - assembly, integration, and test planning and implementation - launch site operations and launch vehicle/payload integration - flight operations. Personnel on staff have expertise with space materials development and flight experiments. JPL has the capabilities and expertise to successfully address contamination issues presented by space and habitable environments, extensive experience fielding and managing contamination-sensitive missions, and an excellent working relationship with the aerospace contamination control engineering community.
Cometary Dust: The Diversity of Primitive Matter
NASA Technical Reports Server (NTRS)
Wooden, D. H.; Ishii, H. A.; Zolensky, M. E.
2017-01-01
The connections between comet dust and primitive chondrites from asteroids have strengthened considerably over the past decade. Understanding the importance of the connections between Stardust samples and chondrites requires geochemistry lingo as well as a perspective on other cometary dust samples besides Stardust. We present the principal findings of an extensive review prepared by us for the June 2016 "Cometary Science After Rosetta" meeting at The Royal Society, London.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-23
... collection: Extension of the time frame required to complete approved and ongoing methodological research on... methodological research on the National Crime Victimization Survey. (2) Title of the Form/Collection: National.... This generic clearance will cover methodological research that will use existing or new sampled...
From East Gondwana to Central America: Historical biogeography of the Alstroemeriaceae
USDA-ARS?s Scientific Manuscript database
Southern South America and Australia/New Zealand share some 15 plant families more or less restricted to them. Understanding these Austral floristic links requires extensive sampling in both regions. For the Alstroemeriaceae, with 189 species in three South American genera, two in an Australian/Tasm...
Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples
2012-01-01
Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
Toward a Principled Sampling Theory for Quasi-Orders
Ünlü, Ali; Schrepp, Martin
2016-01-01
Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
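A naive sketch (ours, not the paper's algorithm) shows why bias correction is needed: sampling a random relation and then forcing reflexivity and transitivity does yield quasi-orders, but not uniformly at random.

```python
import itertools
import random

def random_quasi_order(n, p=0.5):
    """Naive random quasi-order on n items.

    Sample each off-diagonal pair independently, impose reflexivity, then
    take the transitive closure (Warshall's algorithm). NOTE: the closure
    step over-represents larger relations, so the result is NOT uniform
    over quasi-orders; this is exactly the sampling bias that a principled
    procedure must correct.
    """
    rel = [[i == j or random.random() < p for j in range(n)] for i in range(n)]
    # Warshall transitive closure: k must be the outermost (slowest) index.
    for k, i, j in itertools.product(range(n), repeat=3):
        rel[i][j] = rel[i][j] or (rel[i][k] and rel[k][j])
    return rel
```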
Studies on spectral analysis of randomly sampled signals: Application to laser velocimetry data
NASA Technical Reports Server (NTRS)
Sree, David
1992-01-01
Spectral analysis is very useful in determining the frequency characteristics of many turbulent flows, for example, vortex flows, tail buffeting, and other pulsating flows. It is also used for obtaining turbulence spectra from which the time and length scales associated with the turbulence structure can be estimated. These estimates, in turn, can be helpful for validation of theoretical/numerical flow turbulence models. Laser velocimetry (LV) is being extensively used in the experimental investigation of different types of flows because of its inherent advantages: nonintrusive probing, high frequency response, no calibration requirements, etc. Typically, the output of an individual-realization laser velocimeter is a set of randomly sampled velocity data. Spectral analysis of such data requires special techniques to obtain reliable estimates of the correlation and power spectral density functions that describe the flow characteristics. FORTRAN codes for obtaining the autocorrelation and power spectral density estimates using the correlation-based slotting technique were developed. Extensive studies have been conducted on simulated first-order-spectrum and sine signals to improve the spectral estimates. A first-order spectrum was chosen because it represents the characteristics of a typical one-dimensional turbulence spectrum. Digital prefiltering techniques were applied to improve the spectral estimates from randomly sampled data. Studies show that reliable spectral estimates can be obtained at frequencies up to about five times the mean sampling rate.
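The correlation-based slotting technique itself is compact; here is a minimal NumPy sketch (the original codes were FORTRAN), with slot width and count as free parameters:

```python
import numpy as np

def slotted_autocorrelation(t, u, dt_slot, n_slots):
    """Slotted autocorrelation of randomly sampled velocity data.

    All sample pairs are formed; each product u_i*u_j is accumulated in the
    slot nearest the pair's lag t_j - t_i, and each slot is averaged. The
    PSD can then be estimated from the slot means. Self-products at lag 0
    are included for simplicity. t, u: 1-D NumPy arrays.
    """
    u = u - u.mean()                          # work with fluctuations
    acc = np.zeros(n_slots)
    cnt = np.zeros(n_slots, dtype=int)
    for i in range(len(t)):
        lag = t[i:] - t[i]                    # nonnegative lags only
        k = np.rint(lag / dt_slot).astype(int)
        ok = k < n_slots
        np.add.at(acc, k[ok], u[i] * u[i:][ok])
        np.add.at(cnt, k[ok], 1)
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), 0.0)
```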
Sample Size Estimation in Cluster Randomized Educational Trials: An Empirical Bayes Approach
ERIC Educational Resources Information Center
Rotondi, Michael A.; Donner, Allan
2009-01-01
The educational field has now accumulated an extensive literature reporting on values of the intraclass correlation coefficient, a parameter essential to determining the required size of a planned cluster randomized trial. We propose here a simple simulation-based approach including all relevant information that can facilitate this task. An…
Simulation of Wind Profile Perturbations for Launch Vehicle Design
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
2004-01-01
Ideally, a statistically representative sample of measured high-resolution wind profiles with wavelengths as small as tens of meters is required in design studies to establish aerodynamic load indicator dispersions and vehicle control system capability. At most potential launch sites, high-resolution wind profiles may not exist. Representative samples of Rawinsonde wind profiles to altitudes of 30 km are more likely to be available from the extensive network of measurement sites established for routine sampling in support of weather observing and forecasting activity. Such a sample, large enough to be statistically representative of relatively large wavelength perturbations, would be inadequate for launch vehicle design assessments because the Rawinsonde system accurately measures wind perturbations with wavelengths no smaller than 2000 m (1000 m altitude increment). The Kennedy Space Center (KSC) Jimsphere wind profiles (150/month and seasonal 2- and 3.5-hr pairs) are the only adequate samples of high-resolution profiles (approx. 150 to 300 m effective resolution, but over-sampled at 25 m intervals) that have been used extensively for launch vehicle design assessments. Therefore, a simulation process has been developed for enhancement of measured low-resolution Rawinsonde profiles that would be applicable in preliminary launch vehicle design studies at launch sites other than KSC.
Sample size determination for mediation analysis of longitudinal data.
Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying
2018-03-27
Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulations under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., within-subject correlation). A larger value of ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice have also been published for convenient use. Extensive simulation studies showed that the distribution of the product method and the bootstrapping method have superior performance to Sobel's method; the product method is recommended for use in practice because it requires less computation time than bootstrapping. An R package has been developed for sample size determination by the product method in longitudinal mediation study design.
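For illustration, Sobel's statistic and a power simulation can be sketched as below; the paper's simulations use a multilevel longitudinal mediation model, so treat this only as a simplified single-level analogue with hypothetical effect sizes.

```python
import numpy as np

def sobel_z(a, se_a, b, se_b):
    """Sobel's z statistic for the mediated effect a*b."""
    return (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

def _slope_se(x, y):
    """OLS slope and its standard error for simple regression of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    se = np.sqrt(resid @ resid / (len(x) - 2) / ((x - x.mean()) @ (x - x.mean())))
    return slope, se

def power_sobel(n, a=0.3, b=0.3, reps=2000, seed=1):
    """Monte Carlo power of Sobel's test for a simple X -> M -> Y model."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.standard_normal(n)
        m = a * x + rng.standard_normal(n)
        y = b * m + rng.standard_normal(n)
        ah, se_a = _slope_se(x, m)
        bh, se_b = _slope_se(m, y)
        hits += abs(sobel_z(ah, se_a, bh, se_b)) > 1.96
    return hits / reps

# Smallest n reaching 80% power can then be found by stepping n upward.
print(power_sobel(n=150))
```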
Restoring a smooth function from its noisy integrals
NASA Astrophysics Data System (ADS)
Goulko, Olga; Prokof'ev, Nikolay; Svistunov, Boris
2018-05-01
Numerical (and experimental) data analysis often requires the restoration of a smooth function from a set of sampled integrals over finite bins. We present the bin hierarchy method that efficiently computes the maximally smooth function from the sampled integrals using essentially all the information contained in the data. We perform extensive tests with different classes of functions and levels of data quality, including Monte Carlo data suffering from a severe sign problem and physical data for the Green's function of the Fröhlich polaron.
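As a simplified illustration of the problem setting (not the bin hierarchy method itself), a smooth function can be recovered from noisy bin integrals by least squares against basis functions whose bin integrals are known exactly:

```python
import numpy as np

def restore_from_bin_integrals(edges, integrals, degree=6, lam=1e-6):
    """Fit a polynomial f(x) = sum_k c_k x^k from noisy bin integrals.

    Row j of the design matrix holds the exact integral of x^k over bin j,
    so the least-squares fit matches the data in the integral sense; lam
    adds a small ridge penalty to keep the fit smooth.
    """
    edges = np.asarray(edges, dtype=float)
    a, b = edges[:-1], edges[1:]
    k = np.arange(degree + 1)
    # integral of x^k over [a_j, b_j] = (b^(k+1) - a^(k+1)) / (k + 1)
    A = (b[:, None] ** (k + 1) - a[:, None] ** (k + 1)) / (k + 1)
    c = np.linalg.solve(A.T @ A + lam * np.eye(degree + 1), A.T @ integrals)
    return np.polynomial.Polynomial(c)

# Usage: recover exp(x) on [0, 1] from its integrals over 10 equal bins.
edges = np.linspace(0.0, 1.0, 11)
data = np.diff(np.exp(edges)) + np.random.normal(0, 1e-4, 10)  # noisy integrals
f = restore_from_bin_integrals(edges, data)
```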
Metadynamic metainference: Enhanced sampling of the metainference ensemble using metadynamics
Bonomi, Massimiliano; Camilloni, Carlo; Vendruscolo, Michele
2016-01-01
Accurate and precise structural ensembles of proteins and macromolecular complexes can be obtained with metainference, a recently proposed Bayesian inference method that integrates experimental information with prior knowledge and deals with all sources of errors in the data as well as with sample heterogeneity. The study of complex macromolecular systems, however, requires an extensive conformational sampling, which represents a separate challenge. To address such challenge and to exhaustively and efficiently generate structural ensembles we combine metainference with metadynamics and illustrate its application to the calculation of the free energy landscape of the alanine dipeptide. PMID:27561930
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-17
...The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a pre-clearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and continuing collections of information in accordance with the Paperwork Reduction Act of 1995 [44 U.S.C. 3506(c)(2)(A)]. This program helps to assure that requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration (MSHA) is soliciting comments concerning the extension of the information collection for Radiation Sampling and Exposure Records, 30 CFR 57.5037 and 57.5040.
Joyner Melito, Helen S; Jones, Kari E; Rasco, Barbara A
2016-06-01
Pasta presents a challenge to microwave processing due to its unique cooking requirements. The objective of this study was to determine the effects of microwave processing on pasta physicochemical and mechanical properties. Fettuccine pasta was parboiled for selected times, then pasteurized using a Microwave Assisted Pasteurization System and stored under refrigeration for 1 wk. Samples were analyzed using microscopy, mechanical testing, and chemical analyses after storage. While no significant differences were observed for free amylose among fresh samples, samples parboiled for ≤6 min had significantly higher free amylose, suggesting reduced starch retrogradation. Increased heat treatment increased degree of protein polymerization, observed in microstructures as increased gluten strand thickness and network density. Firmness and extensibility increased with increased parboil time; however, extension data indicated an overall weakening of microwave-treated pasta regardless of total cooking time. Overall, microwave pasteurization was shown to be a viable cooking method for pasta.
Optimal space communications techniques. [all digital phase locked loop for FM demodulation
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1973-01-01
The design, development, and analysis of a digital phase-locked loop (DPLL) for FM demodulation and threshold extension are reported. One of the features of the developed DPLL is its synchronous, real-time operation. The sampling frequency is constant, and all the required arithmetic and logic operations are performed within one sampling period, generating an output sequence which is converted to analog form and filtered. An equation relating the sampling frequency to the carrier frequency must be satisfied to guarantee proper DPLL operation. The synchronous operation enables time-shared operation of one DPLL to demodulate several FM signals simultaneously. In order to obtain information about the DPLL performance at low input signal-to-noise ratios, a model of an input noise spike was introduced, and the DPLL equation was solved using a digital computer. The spike model was successful in finding a second-order DPLL which yielded a 5 dB threshold extension beyond that of a first-order DPLL.
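A generic first-order digital PLL FM demodulator, sketched below for a complex baseband input, illustrates the loop structure; the report's synchronous DPLL additionally ties the sampling frequency to the carrier and completes all arithmetic within one sampling period, which this sketch does not model.

```python
import numpy as np

def pll_fm_demod(z, fs, k_loop=0.2):
    """First-order digital PLL demodulating a complex baseband FM signal z.

    The phase detector compares the input phase with the NCO phase; the
    error drives the NCO and, scaled, is the demodulated message
    (instantaneous frequency deviation in Hz).
    """
    phi = 0.0
    out = np.empty(len(z))
    for n, zn in enumerate(z):
        err = np.angle(zn * np.exp(-1j * phi))    # phase detector
        phi += k_loop * err                       # first-order loop / NCO update
        out[n] = k_loop * err * fs / (2 * np.pi)  # frequency estimate, Hz
    return out

# Usage: demodulate a 5 Hz tone frequency-modulated with 200 Hz deviation.
fs = 48_000
t = np.arange(fs) / fs
msg = np.sin(2 * np.pi * 5 * t)
phase = 2 * np.pi * 200 * np.cumsum(msg) / fs
demod = pll_fm_demod(np.exp(1j * phase), fs)
```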
Advances in spectroscopic methods for quantifying soil carbon
Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean
2012-01-01
The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, be applied to large landscape areas using remote sensing imagery, and be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for determination of soil carbon.
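Localized calibrations of this kind are commonly built with partial least squares regression between spectra and dry-combustion reference values; a hedged scikit-learn sketch with stand-in data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows are soil spectra, y is reference total C (%)
# from dry combustion on the same samples (the localized calibration set).
X = np.random.rand(120, 700)      # stand-in for NIR/MIR absorbance spectra
y = np.random.rand(120) * 4       # stand-in for measured soil C

pls = PLSRegression(n_components=10)
scores = cross_val_score(pls, X, y, cv=10, scoring="r2")  # calibration quality
pls.fit(X, y)                     # final calibration model
predicted_c = pls.predict(X[:5])  # predict C for new spectra
```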
Study of advanced atmospheric entry systems for Mars
NASA Technical Reports Server (NTRS)
1978-01-01
Entry system designs are described for various advanced Mars missions, including sample return, hard lander, and Mars airplane. The Mars exploration systems for sample return and the hard lander require deceleration from direct-approach entry velocities of about 6 km/s to terminal velocities consistent with surface landing requirements. The Mars airplane entry system is decelerated from orbit at 4.6 km/s to deployment near the surface. Mass performance characteristics are estimated for the major elements of the required entry systems using Viking technology or logical extensions of that technology, in order to provide a common basis of comparison for the three mission mode approaches. The entry systems, although not optimized, are based on Viking designs and reflect current hardware performance capability and realistic mass relationships.
Vision and the representation of the surroundings in spatial memory
Tatler, Benjamin W.; Land, Michael F.
2011-01-01
One of the paradoxes of vision is that the world as it appears to us and the image on the retina at any moment are not much like each other. The visual world seems to be extensive and continuous across time. However, the manner in which we sample the visual environment is neither extensive nor continuous. How does the brain reconcile these differences? Here, we consider existing evidence from both static and dynamic viewing paradigms together with the logical requirements of any representational scheme that would be able to support active behaviour. While static scene viewing paradigms favour extensive, but perhaps abstracted, memory representations, dynamic settings suggest sparser and task-selective representation. We suggest that in dynamic settings where movement within extended environments is required to complete a task, the combination of visual input, egocentric and allocentric representations work together to allow efficient behaviour. The egocentric model serves as a coding scheme in which actions can be planned, but also offers a potential means of providing the perceptual stability that we experience. PMID:21242146
Burnout among Extension Agents in the Ohio Cooperative Extension Service.
ERIC Educational Resources Information Center
Igodan, O. Chris; Newcomb, L. H.
A study examined the extent and causes of burnout among extension agents in Ohio. From the 241 extension agents working in the 88 counties of Ohio, researchers selected a random sample of 101 agents, including 34 agriculture agents and 33 home economics agents. Sample agents were asked to complete a survey…
Scalability, Complexity and Reliability in Quantum Information Processing
2007-03-01
hidden subgroup framework to abelian groups which are not finitely generated. An extension of the basic algorithm breaks the Buchmann-Williams...finding short lattice vectors. In [2], we showed that the generalization of the standard method --- random coset state preparation followed by Fourier...sampling --- required exponential time for sufficiently non-abelian groups, including the symmetric group, at least when the Fourier transforms are
Japanese national forest inventory and its spatial extension by remote sensing
Yasumasa Hirata; Mitsuo Matsumoto; Toshiro Iehara
2009-01-01
Japan has two independent forest inventory systems. One forest inventory is required by the forest planning system based on the Forest Law, in which forest registers and forest planning maps are prepared. The other system is a forest resource monitoring survey, in which systematic sampling is done at 4-km grid intervals. Here, we present these national forest inventory...
NASA Technical Reports Server (NTRS)
Blackburn, L. B.; Ellingsworth, J. R. (Inventor)
1985-01-01
An improved mechanical extensometer is described for use with a constant-load creep test machine. The dead weight of the extensometer is counterbalanced by two pairs of weights connected through a pulley system to rod extensions leading into the furnace where the test sample is undergoing elevated-temperature (above 500 °F) tensile testing. Novel gripper surfaces, a conical tip and a flat surface, are provided in each sample-engaging platen to reduce the grip pressure normally required for attachment of the extensometer to the specimen and to reduce the initial specimen bending normally associated with foil-gage metal testing.
Oscillational instabilities in single-mode acoustic levitators
NASA Technical Reports Server (NTRS)
Rudnick, Joseph; Barmatz, M.
1990-01-01
An extension of standard results for the acoustic force on an object in a single-mode resonant chamber yields predictions for the onset of oscillational instabilities when objects are levitated or positioned in these chambers. The results are consistent with experimental investigations. The present approach accounts for the effect of time delays on the response of a cavity to the motion of an object inside it. Quantitative features of the instabilities are investigated. The experimental conditions required for sample stability, saturation of sample oscillations, hysteretic effects, and the loss of the ability to levitate are discussed.
37 CFR 1.730 - Applicant for extension of patent term; signature requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
37 CFR Part 1, Patents, Trademarks, and Copyrights. § 1.730 Applicant for extension of patent term; signature requirements. (a) Any application for extension of a patent...
Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long
2015-01-01
The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and (131)I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed to evaluate the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required for reducing measurement uncertainty in the emergency soil sampling protocol proposed here.
Rauscher, Sarah; Neale, Chris; Pomès, Régis
2009-10-13
Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
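All of the RE-based schemes compared here rest on the same Metropolis swap criterion between neighboring temperature replicas; a minimal sketch (energies and temperatures illustrative, kB in kJ/(mol·K) to match typical MD units):

```python
import numpy as np

def attempt_swap(E_i, T_i, E_j, T_j, rng, kB=0.0083145):
    """Metropolis acceptance for exchanging replicas at T_i and T_j.

    Accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)]),
    which preserves the Boltzmann ensemble at every temperature.
    """
    beta_i, beta_j = 1.0 / (kB * T_i), 1.0 / (kB * T_j)
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng.random() < np.exp(delta)

rng = np.random.default_rng(0)
swap = attempt_swap(E_i=-5021.3, T_i=300.0, E_j=-4987.6, T_j=310.0, rng=rng)
```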
Pinares-Patiño, César; Gere, José; Williams, Karen; Gratton, Roberto; Juliarena, Paula; Molano, German; MacLean, Sarah; Sandoval, Edgar; Taylor, Grant; Koolaard, John
2012-01-01
Simple Summary Extended sample collection for the SF6 tracer technique is desirable for extensive grazing systems. Breath samples from eight cows were collected while lucerne silage was fed to achieve fixed intakes among the cows. Samples were collected over a 10-day period, using either apparatuses used in New Zealand (NZL) or Argentina (ARG), and either daily, over two consecutive 5-day periods or over a 10-day period (in duplicate). The NZL system had a greater sampling success and more consistent CH4 emission estimates than the ARG system, with no differences in mean emissions among sample collection periods. This study showed that extended sample collection is feasible, but definitive evaluation under grazing situations is required before a decision on recommendation can be made. Abstract The daily sample collection protocol of the sulphur hexafluoride (SF6) tracer technique for the estimation of methane (CH4) emissions from ruminants may not be practical under extensive grazing systems. Here, under controlled conditions, we evaluated extended periods of sampling as an alternative to daily sample collections. Eight rumen-fistulated cows were housed and fed lucerne silage to achieve common daily feed intakes of 6.4 kg dry matter per cow. Following SF6 permeation tube dosing, eight sampling lines were fitted to the breath collection harness, so that a common gas mix was available to each line. Half of the lines collected samples into PVC yokes using a modified capillary system as commonly used in New Zealand (NZL), and half collected samples into stainless steel cylinders using a ball-bearing flow restrictor as used in Argentina (ARG), all within a 10-day time frame, either daily, across two consecutive 5-day periods or across one 10-day period (in duplicate). The NZL system had greater sampling success (97.3 vs. 79.5%) and yielded more consistent CH4 emission estimates than the ARG system. Emission estimates from NZL daily, NZL 5-day and NZL 10-day samplings were 114, 110 and 111 g d−1, respectively. Extended sample collection protocol may be feasible, but definitive evaluation of this alternative as well as sample collection systems is required under grazing situations before a decision on recommendation can be made. PMID:26486921
Cocco, Arturo; Serra, Giuseppe; Lentini, Andrea; Deliperi, Salvatore; Delrio, Gavino
2015-09-01
The within- and between-plant distribution of the tomato leafminer, Tuta absoluta (Meyrick), was investigated in order to define action thresholds based on leaf infestation and to propose enumerative and binomial sequential sampling plans for pest management applications in protected crops. The pest spatial distribution was aggregated between plants, and median leaves were the most suitable sample to evaluate the pest density. Action thresholds of 36 and 48%, 43 and 56% and 60 and 73% infested leaves, corresponding to economic thresholds of 1 and 3% damaged fruits, were defined for tomato cultivars with big, medium and small fruits respectively. Green's method was a more suitable enumerative sampling plan as it required a lower sampling effort. Binomial sampling plans needed lower average sample sizes than enumerative plans to make a treatment decision, with probabilities of error of <0.10. The enumerative sampling plan required 87 or 343 leaves to estimate the population density in extensive or intensive ecological studies respectively. Binomial plans would be more practical and efficient for control purposes, needing average sample sizes of 17, 20 and 14 leaves to take a pest management decision in order to avoid fruit damage higher than 1% in cultivars with big, medium and small fruits respectively.
Laser Diffraction Techniques Replace Sieving for Lunar Soil Particle Size Distribution Data
NASA Technical Reports Server (NTRS)
Cooper, Bonnie L.; Gonzalez, C. P.; McKay, D. S.; Fruland, R. L.
2012-01-01
Sieving was used extensively until 1999 to determine the particle size distribution of lunar samples. This method is time-consuming and requires more than a gram of material in order to obtain a result in which one may have confidence. This is demonstrated by the difference in geometric mean and median for samples measured by [1], in which a 14-gram sample produced a geometric mean of approx. 52 micrometers, whereas two other samples of 1.5 grams gave means of approx. 63 and approx. 69 micrometers. Sample allocations for sieving are typically much smaller than a gram, and many of the sample allocations received by our lab are 0.5 to 0.25 grams in mass. Basu [2] has described how the finest fraction of the soil is easily lost in the sieving process, and this effect is compounded when sample sizes are small.
NASA Technical Reports Server (NTRS)
Arevalo, Ricardo, Jr.; Coyle, Barry; Poulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Brinckerhoff, William
2015-01-01
Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g-1); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).
Application of up-sampling and resolution scaling to Fresnel reconstruction of digital holograms.
Williams, Logan A; Nehmetallah, Georges; Aylo, Rola; Banerjee, Partha P
2015-02-20
Fresnel transform implementation methods using numerical preprocessing techniques are investigated in this paper. First, it is shown that up-sampling dramatically reduces the minimum reconstruction distance requirements and allows maximal signal recovery by eliminating aliasing artifacts which typically occur at distances much less than the Rayleigh range of the object. Second, zero-padding is employed to arbitrarily scale numerical resolution for the purpose of resolution matching multiple holograms, where each hologram is recorded using dissimilar geometric or illumination parameters. Such preprocessing yields numerical resolution scaling at any distance. Both techniques are extensively illustrated using experimental results.
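A single-FFT Fresnel reconstruction with zero-padding can be sketched as follows; this is a generic illustration of how padding scales the numerical resolution (output pitch λz/(N·dx)), not the authors' exact implementation:

```python
import numpy as np

def fresnel_reconstruct(holo, wavelength, z, dx, pad=2):
    """Single-FFT Fresnel reconstruction of a square recorded hologram.

    Zero-padding the hologram by `pad` increases N and thereby refines the
    reconstruction pixel pitch dx' = wavelength*z/(N*dx), which is how
    reconstructions from dissimilar geometries can be resolution-matched.
    Constant output phase factors are omitted (intensity is unaffected).
    """
    N = holo.shape[0] * pad
    field = np.zeros((N, N), dtype=complex)
    n0 = (N - holo.shape[0]) // 2
    field[n0:n0 + holo.shape[0], n0:n0 + holo.shape[1]] = holo
    x = (np.arange(N) - N / 2) * dx
    X, Y = np.meshgrid(x, x)
    chirp = np.exp(1j * np.pi * (X**2 + Y**2) / (wavelength * z))
    img = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field * chirp)))
    dx_out = wavelength * z / (N * dx)   # reconstruction pixel pitch
    return img, dx_out
```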
Epidemiology of Chronic Wasting Disease: PrPres Detection, Shedding, and Environmental Contamination
2008-08-01
encephalopathies (TSEs) in that it occurs in free-ranging as well as captive wild ruminants and environmental contamination appears to play a...sensitivity and high specificity and second, dogma suggests that current assays for the detection of PrPres utilize protease digestion. Proving a highly...to achieve specificity as samples require protease digestion, protein precipitation, or extensive processing in order to distinguish PrPres from the
Jensen, Pamela C.; Purcell, Maureen K.; Morado, J. Frank; Eckert, Ginny L.
2012-01-01
The Alaskan red king crab (Paralithodes camtschaticus) fishery was once one of the most economically important single-species fisheries in the world, but is currently depressed. This fishery would benefit from improved stock assessment capabilities. Larval crab distribution is patchy temporally and spatially, requiring extensive sampling efforts to locate and track larval dispersal. Large-scale plankton surveys are generally cost prohibitive because of the effort required for collection and the time and taxonomic expertise required to sort samples to identify plankton individually via light microscopy. Here, we report the development of primers and a dual-labeled probe for use in a DNA-based real-time polymerase chain reaction assay targeting the red king crab mitochondrial gene cytochrome oxidase I for the detection of red king crab larval DNA in plankton samples. The assay allows identification of plankton samples containing crab larval DNA and provides an estimate of DNA copy number present in a sample without sorting the plankton sample visually. The assay was tested on DNA extracted from whole red king crab larvae and plankton samples seeded with whole larvae, and it detected DNA copies equivalent to 1/10,000th of a larva and 1 crab larva/5 mL sieved plankton, respectively. The real-time polymerase chain reaction assay can be used to screen plankton samples for larvae in a fraction of the time required for traditional microscopic methods, which offers advantages for stock assessment methodologies for red king crab as well as a rapid and reliable method to assess abundance of red king crab larvae as needed to improve the understanding of life history and population processes, including larval population dynamics.
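Copy-number estimates of this kind conventionally come from a standard curve relating Ct to log10 copies; a sketch with illustrative (not published) curve parameters:

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Estimate target DNA copies from a qPCR Ct via a standard curve.

    Assumes Ct = slope * log10(copies) + intercept, fitted beforehand on a
    dilution series of known COI copy numbers; a slope of -3.32 corresponds
    to 100% amplification efficiency. Values here are illustrative, not
    from the published assay.
    """
    return 10 ** ((ct - intercept) / slope)

# A plankton sample crossing threshold at Ct = 29.5:
estimated_copies = copies_from_ct(29.5)
```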
Kellenberger, Colleen A; Sales-Lee, Jade; Pan, Yuchen; Gassaway, Madalee M; Herr, Amy E; Hammond, Ming C
2015-01-01
Cyclic di-GMP (c-di-GMP) is a second messenger that is important in regulating bacterial physiology and behavior, including motility and virulence. Many questions remain about the role and regulation of this signaling molecule, but current methods of detection are limited by either modest sensitivity or requirements for extensive sample purification. We have taken advantage of a natural, high affinity receptor of c-di-GMP, the Vc2 riboswitch aptamer, to develop a sensitive and rapid electrophoretic mobility shift assay (EMSA) for c-di-GMP quantitation that required minimal engineering of the RNA.
A digital repository with an extensible data model for biobanking and genomic analysis management.
Izzo, Massimiliano; Mortola, Francesco; Arnulfo, Gabriele; Fato, Marco M; Varesio, Luigi
2014-01-01
Molecular biology laboratories require extensive metadata to improve data collection and analysis. The heterogeneity of the collected metadata grows as research is evolving into international multi-disciplinary collaborations and increasing data sharing among institutions. Single standardization is not feasible and it becomes crucial to develop digital repositories with flexible and extensible data models, as in the case of modern integrated biobanks management. We developed a novel data model in JSON format to describe heterogeneous data in a generic biomedical science scenario. The model is built on two hierarchical entities: processes and events, roughly corresponding to research studies and analysis steps within a single study. A number of sequential events can be grouped in a process building up a hierarchical structure to track patient and sample history. Each event can produce new data. Data is described by a set of user-defined metadata, and may have one or more associated files. We integrated the model in a web based digital repository with a data grid storage to manage large data sets located in geographically distinct areas. We built a graphical interface that allows authorized users to define new data types dynamically, according to their requirements. Operators compose queries on metadata fields using a flexible search interface and run them on the database and on the grid. We applied the digital repository to the integrated management of samples, patients and medical history in the BIT-Gaslini biobank. The platform currently manages 1800 samples of over 900 patients. Microarray data from 150 analyses are stored on the grid storage and replicated on two physical resources for preservation. The system is equipped with data integration capabilities with other biobanks for worldwide information sharing. Our data model enables users to continuously define flexible, ad hoc, and loosely structured metadata, for information sharing in specific research projects and purposes. This approach can considerably improve interdisciplinary research collaboration and makes it possible to track patients' clinical records, sample management information, and genomic data. The web interface allows the operators to easily manage, query, and annotate the files, without dealing with the technicalities of the data grid. PMID:25077808
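To make the two-entity hierarchy concrete, a hypothetical instance of the process/event model is shown below as a Python dict mirroring the JSON structure; all field names are illustrative, not the repository's actual schema.

```python
# A hypothetical instance of the process/event JSON data model, written as
# a Python dict. A process groups sequential events (analysis steps); each
# event carries user-defined metadata and optional file references.
process = {
    "type": "process",
    "label": "neuroblastoma-study-042",
    "patient": "NB-0917",
    "events": [
        {
            "type": "event",
            "label": "sample-collection",
            "metadata": {"tissue": "bone marrow", "volume_ml": 2.5},
            "files": [],
        },
        {
            "type": "event",
            "label": "microarray-analysis",
            "metadata": {"platform": "hypothetical-array-v2"},
            "files": ["grid://biobank/NB-0917/expr_042.cel"],
        },
    ],
}
```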
Khatkar, Mehar S; Nicholas, Frank W; Collins, Andrew R; Zenger, Kyall R; Cavanagh, Julie A L; Barris, Wes; Schnabel, Robert D; Taylor, Jeremy F; Raadsma, Herman W
2008-04-24
The extent of linkage disequilibrium (LD) within a population determines the number of markers that will be required for successful association mapping and marker-assisted selection. Most studies on LD in cattle reported to date are based on microsatellite markers or small numbers of single nucleotide polymorphisms (SNPs) covering one or only a few chromosomes. This is the first comprehensive study of the extent of LD in cattle, analyzing data on 1,546 Holstein-Friesian bulls genotyped for 15,036 SNP markers covering all regions of all autosomes. Furthermore, most studies in cattle have used relatively small sample sizes and, consequently, may have had biased estimates of measures commonly used to describe LD. We examine minimum sample sizes required to estimate LD without bias and loss in accuracy. Finally, relatively little information is available on comparative LD structure in other mammalian species such as human and mouse, and we compare LD structure in cattle with public-domain data from both human and mouse. We computed three LD estimates, D', Dvol and r2, for 1,566,890 syntenic SNP pairs and a sample of 365,400 non-syntenic pairs. Mean D' is 0.189 among syntenic SNPs, and 0.105 among non-syntenic SNPs; mean r2 is 0.024 among syntenic SNPs and 0.0032 among non-syntenic SNPs. All three measures of LD for syntenic pairs decline with distance; the decline is much steeper for r2 than for D' and Dvol. The values of D' and Dvol are quite similar. Significant LD in cattle extends to 40 kb (when estimated as r2) and 8.2 Mb (when estimated as D'). The mean values for LD at large physical distances are close to those for non-syntenic SNPs. The minor allele frequency threshold affects the distribution and extent of LD. For unbiased and accurate estimates of LD across marker intervals spanning < 1 kb to > 50 Mb, minimum sample sizes of 400 (for D') and 75 (for r2) are required. The bias due to small sample sizes increases with the inter-marker interval. LD in cattle is much less extensive than in a mouse population created from crossing inbred lines, and more extensive than in humans. For association mapping in Holstein-Friesian cattle, for a given design, at least one SNP is required for each 40 kb, giving a total requirement of at least 75,000 SNPs for a low power whole-genome scan (median r2 > 0.19) and up to 300,000 markers at 10 kb intervals for a high power genome scan (median r2 > 0.62). For estimation of LD by D' and Dvol with sufficient precision, a sample size of at least 400 is required, whereas for r2 a minimum sample of 75 is adequate.
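For readers less familiar with the LD statistics used above, the following minimal Python sketch implements the textbook definitions of D, D' and r2 from haplotype and allele frequencies (the Dvol variant used in the paper is not reproduced); the example frequencies are invented.

```python
def ld_measures(p_ab, p_a, p_b):
    """D, D' and r^2 for two biallelic loci.

    p_ab: frequency of the A-B haplotype; p_a, p_b: allele frequencies
    of A and B. A minimal sketch of the standard definitions only.
    """
    d = p_ab - p_a * p_b                       # raw disequilibrium
    if d >= 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    d_prime = abs(d) / d_max if d_max > 0 else 0.0
    r2 = d ** 2 / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d, d_prime, r2

# invented example: moderate LD between two SNPs
print(ld_measures(p_ab=0.45, p_a=0.6, p_b=0.5))   # D=0.15, D'=0.75, r2=0.375
```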
Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-12
.... Section 105 of AREERA amended the Smith-Lever Act to require that a specified amount of agricultural... Hatch Act and Smith-Lever Act to require that a specified amount of agricultural research and extension... Smith- Lever Act funds on multistate extension activities and 25 percent on integrated research and...
Representation of complex probabilities and complex Gibbs sampling
NASA Astrophysics Data System (ADS)
Salcedo, Lorenzo Luis
2018-03-01
Complex weights appear in physics and are beyond a straightforward importance sampling treatment, as required in Monte Carlo calculations; this is the well-known sign problem. The complex Langevin approach amounts to effectively constructing a positive distribution on the complexified manifold reproducing the expectation values of the observables through their analytical extension. Here we discuss the direct construction of such positive distributions, paying attention to their localization on the complexified manifold. Explicit localized representations are obtained for complex probabilities defined on Abelian and non-Abelian groups. The viability and performance of a complex version of the heat bath method, based on such representations, is analyzed.
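The severity of the sign problem mentioned above is visible in a toy reweighting experiment. The sketch below is not the positive-representation construction of the paper; it merely samples the complex weight w(x) = exp(-x^2/2 + ibx) against its modulus and shows the vanishing average phase that motivates such constructions.

```python
import numpy as np

# Toy illustration (not from the paper): reweight the complex weight
# w(x) = exp(-x^2/2) * exp(i*b*x) against its modulus |w|. The exact
# answer is <x> = i*b, while the average phase <exp(i*b*x)> decays as
# exp(-b^2/2) -- the sign problem in miniature.
rng = np.random.default_rng(0)
b = 1.5
x = rng.standard_normal(1_000_000)       # samples drawn from |w|
phase = np.exp(1j * b * x)
estimate = np.mean(x * phase) / np.mean(phase)
print(estimate)                 # close to 1.5j
print(abs(np.mean(phase)))      # ~exp(-b^2/2): shrinks rapidly with b
```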
Using machine learning to accelerate sampling-based inversion
NASA Astrophysics Data System (ADS)
Valentine, A. P.; Sambridge, M.
2017-12-01
In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods such as the Neighbourhood Algorithm, and which bridges the gap between prior- and posterior-sampling frameworks.
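A minimal sketch of the idea, assuming a toy one-parameter forward model and using scikit-learn's Gaussian Process regressor as the surrogate inside a Metropolis sampler; this is schematic and not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def forward(m):
    # stand-in for an expensive forward operator (e.g. a synthetic seismogram)
    return np.sin(3 * m) + m ** 2

rng = np.random.default_rng(1)
d_obs, sigma = forward(0.7), 0.1           # observed datum and noise level

# train the surrogate on a small design of exact forward evaluations
M = np.linspace(-2, 2, 25).reshape(-1, 1)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(M, forward(M.ravel()))

def log_post(m):
    # cheap approximate log-likelihood using the surrogate prediction
    pred = gp.predict(np.array([[m]]))[0]
    return -0.5 * ((pred - d_obs) / sigma) ** 2

m, lp, chain = 0.0, log_post(0.0), []
for _ in range(5000):
    m_new = m + 0.2 * rng.standard_normal()    # random-walk proposal
    lp_new = log_post(m_new)
    if np.log(rng.random()) < lp_new - lp:     # Metropolis accept/reject
        m, lp = m_new, lp_new
    chain.append(m)
print("posterior mean of m:", np.mean(chain[1000:]))
```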
NASA Astrophysics Data System (ADS)
Artrith, Nongnuch; Urban, Alexander; Ceder, Gerbrand
2018-06-01
The atomistic modeling of amorphous materials requires structure sizes and sampling statistics that are challenging to achieve with first-principles methods. Here, we propose a methodology to speed up the sampling of amorphous and disordered materials using a combination of a genetic algorithm and a specialized machine-learning potential based on artificial neural networks (ANNs). We show for the example of the amorphous LiSi alloy that around 1000 first-principles calculations are sufficient for the ANN-potential assisted sampling of low-energy atomic configurations in the entire amorphous LixSi phase space. The obtained phase diagram is validated by comparison with the results from an extensive sampling of LixSi configurations using molecular dynamics simulations and a general ANN potential trained on ~45,000 first-principles calculations. This demonstrates the utility of the approach for the first-principles modeling of amorphous materials.
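The sampling loop can be caricatured as a genetic algorithm steered by a cheap surrogate energy. In the sketch below the quadratic surrogate_energy is a stand-in assumption for the ANN potential, and the update is a bare-bones selection/crossover/mutation cycle rather than the authors' algorithm.

```python
import numpy as np

# Illustrative only: a genetic algorithm steered by a cheap surrogate
# energy, echoing the ANN-assisted sampling idea above. The quadratic
# surrogate_energy is a placeholder, not a neural-network potential.
rng = np.random.default_rng(2)

def surrogate_energy(x):        # placeholder for the ANN potential
    return np.sum((x - 0.3) ** 2, axis=-1)

pop = rng.uniform(-1, 1, size=(40, 6))        # 40 candidate "structures"
for generation in range(100):
    e = surrogate_energy(pop)
    parents = pop[np.argsort(e)[:20]]         # keep the lowest-energy half
    # crossover: mix coordinates of random parent pairs
    pairs = rng.integers(0, 20, size=(40, 2))
    mask = rng.random((40, 6)) < 0.5
    pop = np.where(mask, parents[pairs[:, 0]], parents[pairs[:, 1]])
    pop += 0.05 * rng.standard_normal(pop.shape)   # mutation
print(surrogate_energy(pop).min())   # approaches 0 near the minimum at 0.3
```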
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatrchyan, S.; Khachatryan, V.; Sirunyan, A. M.
A search for supersymmetry or other new physics resulting in similar final states is presented using a data sample of 4.73 inverse femtobarns of pp collisions collected at √s = 7 TeV with the CMS detector at the LHC. Fully hadronic final states are selected based on the variable MT2, an extension of the transverse mass in events with two invisible particles. Two complementary studies are performed. The first targets the region of parameter space with medium to high squark and gluino masses, in which the signal can be separated from the standard model backgrounds by a tight requirement on MT2. The second is optimized to be sensitive to events with a light gluino and heavy squarks. In this case, the MT2 requirement is relaxed, but a higher jet multiplicity and at least one b-tagged jet are required. No significant excess of events over the standard model expectations is observed. Exclusion limits are derived for the parameter space of the constrained minimal supersymmetric extension of the standard model, as well as on a variety of simplified model spectra.
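As a rough illustration of the MT2 variable (massless approximation, brute-force grid scan rather than the dedicated minimizers used in real analyses; all momenta below are invented):

```python
import numpy as np

def mt_sq(v, q):
    # transverse mass squared for massless visible and invisible particles
    return 2.0 * (np.linalg.norm(v) * np.linalg.norm(q) - np.dot(v, q))

def mt2(v1, v2, met, n=150, pmax=400.0):
    # MT2 = min over splittings q1 + q2 = met of max(mT(v1,q1), mT(v2,q2))
    best = np.inf
    for qx in np.linspace(-pmax, pmax, n):
        for qy in np.linspace(-pmax, pmax, n):
            q1 = np.array([qx, qy])
            q2 = met - q1
            best = min(best, max(mt_sq(v1, q1), mt_sq(v2, q2)))
    return np.sqrt(max(best, 0.0))

v1 = np.array([100.0, 40.0])     # visible system 1 pT (GeV), invented
v2 = np.array([-80.0, 30.0])     # visible system 2 pT (GeV), invented
met = np.array([-20.0, -70.0])   # missing transverse momentum (GeV)
print("MT2 =", mt2(v1, v2, met), "GeV")
```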
NASA Technical Reports Server (NTRS)
Gentry, D.; Amador, E.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Jacobsen, M.; Kirby, J.; McCaig, H.;
2018-01-01
In situ exploration of planetary environments allows biochemical analysis of sub-centimeter-scale samples; however, landing sites are selected a priori based on measurable meter- to kilometer-scale geological features. Optimizing life detection mission science return requires both understanding the expected biomarker distributions across sample sites at different scales and efficiently using first-stage in situ geochemical instruments to justify later-stage biological or chemical analysis. Icelandic volcanic regions have an extensive history as Mars analogue sites due to desiccation, low nutrient availability, and temperature extremes, in addition to the advantages of geological youth and isolation from anthropogenic contamination. Many Icelandic analogue sites are also rugged and remote enough to create the same type of instrumentation and sampling constraints typically faced by robotic exploration.
Horowitz, Arthur J.
2013-01-01
Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and some of the underlying assumptions that form the bases for these programs seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions relative to current programs, such as calendar-based sampling and stationarity are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.
Nakagawa, Seiji
2011-04-01
Mechanical properties (seismic velocities and attenuation) of geological materials are often frequency dependent, which necessitates measurements of the properties at frequencies relevant to a problem at hand. Conventional acoustic resonant bar tests allow measuring seismic properties of rocks and sediments at sonic frequencies (several kilohertz) that are close to the frequencies employed for geophysical exploration of oil and gas resources. However, the tests require a long, slender sample, which is often difficult to obtain from the deep subsurface or from weak and fractured geological formations. In this paper, an alternative measurement technique to conventional resonant bar tests is presented. This technique uses only a small, jacketed rock or sediment core sample mediating a pair of long, metal extension bars with attached seismic source and receiver, the same geometry as the split Hopkinson pressure bar test for large-strain, dynamic impact experiments. Because of the length and mass added to the sample, the resonance frequency of the entire system can be lowered significantly, compared to the sample alone. The experiment can be conducted under elevated confining pressures up to tens of MPa and temperatures above 100 °C, and concurrently with x-ray CT imaging. The described split Hopkinson resonant bar test is applied in two steps. First, extension and torsion-mode resonance frequencies and attenuation of the entire system are measured. Next, numerical inversions for the complex Young's and shear moduli of the sample are performed. One particularly important step is the correction of the inverted Young's moduli for the effect of sample-rod interfaces. Examples of the application are given for homogeneous, isotropic polymer samples, and a natural rock sample. © 2011 American Institute of Physics
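The frequency-lowering effect of the extension bars can be seen from the textbook fundamental longitudinal resonance of a uniform free-free bar (a simplification offered for intuition only, not the paper's coupled bar-sample model):

```latex
% fundamental longitudinal resonance of a uniform free-free bar
f_1 = \frac{c}{2L}, \qquad c = \sqrt{E/\rho}
```

Lengthening the vibrating column (larger L) lowers f_1 directly, which is why mediating the short sample with long metal bars brings the system resonance down toward exploration-seismology frequencies; inverting a measured resonance for the Young's modulus E is the essence of resonant bar testing.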
Ten-minute analysis of drugs and metabolites in saliva by surface-enhanced Raman spectroscopy
NASA Astrophysics Data System (ADS)
Shende, Chetan; Inscore, Frank; Maksymiuk, Paul; Farquharson, Stuart
2005-11-01
Rapid analysis of drugs in emergency room overdose patients is critical to selecting appropriate medical care. Saliva analysis has long been considered an attractive alternative to blood plasma analysis for this application. However, current clinical laboratory analysis methods involve extensive sample extraction followed by gas chromatography and mass spectrometry, and typically require as much as one hour to perform. In an effort to overcome this limitation we have been investigating metal-doped sol-gels to both separate drugs and their metabolites from saliva and generate surface-enhanced Raman spectra. We have incorporated the sol-gel in a disposable lab-on-a-chip format, and generally no more than a drop of sample is required. The detailed molecular vibrational information allows chemical identification, while the increase in Raman scattering by six orders of magnitude or more allows detection of microg/mL concentrations. Measurements of cocaine, its metabolite benzoylecgonine, and several barbiturates are presented.
Mathematical Models of Continuous Flow Electrophoresis
NASA Technical Reports Server (NTRS)
Saville, D. A.; Snyder, R. S.
1985-01-01
Development of high resolution continuous flow electrophoresis devices ultimately requires comprehensive understanding of the ways various phenomena and processes facilitate or hinder separation. A comprehensive model of the actual three dimensional flow, temperature and electric fields was developed to provide guidance in the design of electrophoresis chambers for specific tasks and means of interpreting test data on a given chamber. Part of the process of model development includes experimental and theoretical studies of hydrodynamic stability. This is necessary to understand the origin of mixing flows observed with wide gap gravitational effects. To ensure that the model accurately reflects the flow field and particle motion requires extensive experimental work. Another part of the investigation is concerned with the behavior of concentrated sample suspensions with regard to sample stream stability and particle-particle interactions which might affect separation in an electric field, especially at high field strengths. Mathematical models will be developed and tested to establish the roles of the various interactions.
McGregor, Tracy L.; Van Driest, Sara L.; Brothers, Kyle B.; Bowton, Erica A.; Muglia, Louis J.; Roden, Dan M.
2013-01-01
The Vanderbilt DNA repository, BioVU, links DNA from leftover clinical blood samples to de-identified electronic medical records. After initiating adult sample collection, pediatric extension required consideration of ethical concerns specific to pediatrics and implementation of specialized DNA extraction methods. In the first year of pediatric sample collection, over 11,000 samples were included from individuals younger than 18 years. We compared the pediatric BioVU cohort to the overall Vanderbilt University Medical Center pediatric population and found similar demographic characteristics; however, the BioVU cohort has higher rates of select diseases, medication exposures, and laboratory testing, demonstrating enriched representation of severe or chronic disease. This unbalanced sample accumulation may accelerate research of some cohorts, but also may limit study of relatively benign conditions and the accrual of unaffected and unbiased control samples. BioVU represents a feasible model for pediatric DNA biobanking but involves both ethical and practical considerations specific to the pediatric population. PMID:23281421
7 CFR 3419.3 - Determination of non-Federal sources of funds.
Code of Federal Regulations, 2010 CFR
2010-01-01
... RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE MATCHING FUNDS REQUIREMENT FOR AGRICULTURAL RESEARCH AND EXTENSION FORMULA FUNDS AT 1890 LAND-GRANT INSTITUTIONS, INCLUDING TUSKEGEE... agricultural research, extension, and qualified educational activity to meet the matching requirements of...
Aris, John P; Alvers, Ashley L; Ferraiuolo, Roy A; Fishwick, Laura K; Hanvivatpong, Amanda; Hu, Doreen; Kirlew, Christine; Leonard, Michael T; Losin, Kyle J; Marraffini, Michelle; Seo, Arnold Y; Swanberg, Veronica; Westcott, Jennifer L; Wood, Michael S; Leeuwenburgh, Christiaan; Dunn, William A
2013-10-01
We have previously shown that autophagy is required for chronological longevity in the budding yeast Saccharomyces cerevisiae. Here we examine the requirements for autophagy during extension of chronological life span (CLS) by calorie restriction (CR). We find that autophagy is upregulated by two CR interventions that extend CLS: water wash CR and low glucose CR. Autophagy is required for full extension of CLS during water wash CR under all growth conditions tested. In contrast, autophagy was not uniformly required for full extension of CLS during low glucose CR, depending on the atg allele and strain genetic background. Leucine status influenced CLS during CR. Eliminating the leucine requirement in yeast strains or adding supplemental leucine to growth media extended CLS during CR. In addition, we observed that both water wash and low glucose CR promote mitochondrial respiration proficiency during aging of autophagy-deficient yeast. In general, the extension of CLS by water wash or low glucose CR was inversely related to respiration deficiency in autophagy-deficient cells. Also, autophagy is required for full extension of CLS under non-CR conditions in buffered media, suggesting that extension of CLS during CR is not solely due to reduced medium acidity. Thus, our findings show that autophagy is: (1) induced by CR, (2) required for full extension of CLS by CR in most cases (depending on atg allele, strain, and leucine availability) and, (3) promotes mitochondrial respiration proficiency during aging under CR conditions. Copyright © 2013 Elsevier Inc. All rights reserved.
Aris, John P.; Alvers, Ashley L.; Ferraiuolo, Roy A.; Fishwick, Laura K.; Hanvivatpong, Amanda; Hu, Doreen; Kirlew, Christine; Leonard, Michael T.; Losin, Kyle J.; Marraffini, Michelle; Seo, Arnold Y.; Swanberg, Veronica; Westcott, Jennifer L.; Wood, Michael S.; Leeuwenburgh, Christiaan; Dunn, William A.
2013-01-01
We have previously shown that autophagy is required for chronological longevity in the budding yeast Saccharomyces cerevisiae. Here we examine the requirements for autophagy during extension of chronological life span (CLS) by calorie restriction (CR). We find that autophagy is upregulated by two CR interventions that extend CLS: water wash CR and low glucose CR. Autophagy is required for full extension of CLS during water wash CR under all growth conditions tested. In contrast, autophagy was not uniformly required for full extension of CLS during low glucose CR, depending on the atg allele and strain genetic background. Leucine status influenced CLS during CR. Eliminating the leucine requirement in yeast strains or adding supplemental leucine to growth media extended CLS during CR. In addition, we observed that both water wash and low glucose CR promote mitochondrial respiration proficiency during aging of autophagy-deficient yeast. In general, the extension of CLS by water wash or low glucose CR was inversely related to respiration deficiency in autophagy-deficient cells. Also, autophagy is required for full extension of CLS under non-CR conditions in buffered media, suggesting that extension of CLS during CR is not solely due to reduced medium acidity. Thus, our findings show that autophagy is: (1) induced by CR, (2) required for full extension of CLS by CR in most cases (depending on atg allele, strain, and leucine availability) and, (3) promotes mitochondrial respiration proficiency during aging under CR conditions. PMID:23337777
An Internationally Coordinated Science Management Plan for Samples Returned from Mars
NASA Astrophysics Data System (ADS)
Haltigin, T.; Smith, C. L.
2015-12-01
Mars Sample Return (MSR) remains a high priority of the planetary exploration community. Such an effort will undoubtedly be too large for any individual agency to conduct itself, and thus will require extensive global cooperation. To help prepare for an eventual MSR campaign, the International Mars Exploration Working Group (IMEWG) chartered the international Mars Architecture for the Return of Samples (iMARS) Phase II working group in 2014, consisting of representatives from 17 countries and agencies. The overarching task of the team was to provide recommendations for progressing towards campaign implementation, including a proposed science management plan. Building upon the iMARS Phase I (2008) outcomes, the Phase II team proposed the development of an International MSR Science Institute as part of the campaign governance, centering its deliberations around four themes: Organization: including an organizational structure for the Institute that outlines roles and responsibilities of key members and describes sample return facility requirements; Management: presenting issues surrounding scientific leadership, defining guidelines and assumptions for Institute membership, and proposing a possible funding model; Operations & Data: outlining a science implementation plan that details the preliminary sample examination flow, sample allocation process, and data policies; and Curation: introducing a sample curation plan that comprises sample tracking and routing procedures, sample sterilization considerations, and long-term archiving recommendations. This work presents a summary of the group's activities, findings, and recommendations, highlighting the role of international coordination in managing the returned samples.
A multi-stage drop-the-losers design for multi-arm clinical trials.
Wason, James; Stallard, Nigel; Bowden, Jack; Jennison, Christopher
2017-02-01
Multi-arm multi-stage trials can improve the efficiency of the drug development process when multiple new treatments are available for testing. A group-sequential approach can be used in order to design multi-arm multi-stage trials, using an extension to Dunnett's multiple-testing procedure. The actual sample size used in such a trial is a random variable that has high variability. This can cause problems when applying for funding as the cost will also be generally highly variable. This motivates a type of design that provides the efficiency advantages of a group-sequential multi-arm multi-stage design, but has a fixed sample size. One such design is the two-stage drop-the-losers design, in which a number of experimental treatments, and a control treatment, are assessed at a prescheduled interim analysis. The best-performing experimental treatment and the control treatment then continue to a second stage. In this paper, we discuss extending this design to have more than two stages, which is shown to considerably reduce the sample size required. We also compare the resulting sample size requirements to the sample size distribution of analogous group-sequential multi-arm multi-stage designs. The sample size required for a multi-stage drop-the-losers design is usually higher than, but close to, the median sample size of a group-sequential multi-arm multi-stage trial. In many practical scenarios, the disadvantage of a slight loss in average efficiency would be overcome by the huge advantage of a fixed sample size. We assess the impact of delay between recruitment and assessment as well as unknown variance on the drop-the-losers designs.
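A Monte Carlo sketch of a two-stage drop-the-losers trial with normal endpoints (parameters invented, and the final z-test deliberately naive: it ignores the selection-bias adjustment a real design would include):

```python
import numpy as np

rng = np.random.default_rng(3)

def power(k=3, n1=50, n2=100, effect=0.4, crit=2.0, sims=10000):
    wins = 0
    for _ in range(sims):
        # stage 1: k experimental arms, arm 0 carries the true effect
        means = np.r_[effect, np.zeros(k - 1)][:, None]
        stage1 = rng.standard_normal((k, n1)) + means
        best = int(np.argmax(stage1.mean(axis=1)))     # drop the losers
        # stage 2: only the selected arm and the control continue
        stage2 = rng.standard_normal(n2) + (effect if best == 0 else 0.0)
        ctrl = rng.standard_normal(n1 + n2)
        x = np.r_[stage1[best], stage2]                # pooled selected arm
        z = (x.mean() - ctrl.mean()) / np.sqrt(1 / x.size + 1 / ctrl.size)
        wins += z > crit    # naive z-test; real designs adjust for selection
    return wins / sims

print("P(success):", power())
```

Note that the total sample size here, (k + 1) * n1 + 2 * n2, is fixed by design, which is the funding advantage the abstract highlights over a group-sequential trial whose sample size is random.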
Evidence for an extensive intracluster medium from radio observations of distant Abell clusters
NASA Technical Reports Server (NTRS)
Hanisch, R. J.; Ulmer, M. P.
1985-01-01
Observations have been made of 18 distance class 5 and 6 Abell clusters of galaxies using the VLA in its 'C' configuration at a frequency of 1460 MHz. Half of the clusters in the sample are confirmed or probable sources of X-ray emission. All the detected radio sources with flux densities above 10 mJy are reported, and information is provided concerning the angular extent of the sources, as well as the most likely optical identification. The existence of an extensive intracluster medium is inferred by identifying extended/distorted radio sources with galaxies whose apparent magnitudes are consistent with their being cluster members and that are at projected distances of 3-4 Abell radii (6-8 Mpc) from the nearest cluster center. By requiring that the radio sources are confined by the ambient medium, the ambient density is calculated and the total cluster mass is estimated. As a sample calculation, a wide-angle-tail radio source some 5 Mpc from the center of Abell 348 is used to estimate these quantities.
ERIC Educational Resources Information Center
Wilson, Gary; Newcomb, L. H.
A study was conducted to determine the relationship of certain motivational appeals to the extent of participation of extension clientele, as perceived by these clientele. A stratified random sample of thirty counties from the ten extension supervisory areas of Ohio was used for the study. This sample provided for 395 adult agricultural clientele…
Specialization of tendon mechanical properties results from interfascicular differences
Thorpe, Chavaunne T.; Udeze, Chineye P.; Birch, Helen L.; Clegg, Peter D.; Screen, Hazel R. C.
2012-01-01
Tendons transfer force from muscle to bone. Specific tendons, including the equine superficial digital flexor tendon (SDFT), also store and return energy. For efficient function, energy-storing tendons need to be more extensible than positional tendons such as the common digital extensor tendon (CDET), and when tested in vitro have a lower modulus and failure stress, but a higher failure strain. It is not known how differences in matrix organization contribute to distinct mechanical properties in functionally different tendons. We investigated the properties of whole tendons, tendon fascicles and the fascicular interface in the high-strain energy-storing SDFT and low-strain positional CDET. Fascicles failed at lower stresses and strains than tendons. The SDFT was more extensible than the CDET, but SDFT fascicles failed at lower strains than CDET fascicles, resulting in large differences between tendon and fascicle failure strain in the SDFT. At physiological loads, the stiffness at the fascicular interface was lower in the SDFT samples, enabling a greater fascicle sliding that could account for differences in tendon and fascicle failure strain. Sliding between fascicles prior to fascicle extension in the SDFT may allow the large extensions required in energy-storing tendons while protecting fascicles from damage. PMID:22764132
Specialization of tendon mechanical properties results from interfascicular differences.
Thorpe, Chavaunne T; Udeze, Chineye P; Birch, Helen L; Clegg, Peter D; Screen, Hazel R C
2012-11-07
Tendons transfer force from muscle to bone. Specific tendons, including the equine superficial digital flexor tendon (SDFT), also store and return energy. For efficient function, energy-storing tendons need to be more extensible than positional tendons such as the common digital extensor tendon (CDET), and when tested in vitro have a lower modulus and failure stress, but a higher failure strain. It is not known how differences in matrix organization contribute to distinct mechanical properties in functionally different tendons. We investigated the properties of whole tendons, tendon fascicles and the fascicular interface in the high-strain energy-storing SDFT and low-strain positional CDET. Fascicles failed at lower stresses and strains than tendons. The SDFT was more extensible than the CDET, but SDFT fascicles failed at lower strains than CDET fascicles, resulting in large differences between tendon and fascicle failure strain in the SDFT. At physiological loads, the stiffness at the fascicular interface was lower in the SDFT samples, enabling a greater fascicle sliding that could account for differences in tendon and fascicle failure strain. Sliding between fascicles prior to fascicle extension in the SDFT may allow the large extensions required in energy-storing tendons while protecting fascicles from damage.
EXTENSION EDUCATION SYMPOSIUM: reinventing extension as a resource--what does the future hold?
Mirando, M A; Bewley, J M; Blue, J; Amaral-Phillips, D M; Corriher, V A; Whittet, K M; Arthur, N; Patterson, D J
2012-10-01
The mission of the Cooperative Extension Service, as a component of the land-grant university system, is to disseminate new knowledge and to foster its application and use. Opportunities and challenges facing animal agriculture in the United States have changed dramatically over the past few decades and require the use of new approaches and emerging technologies that are available to extension professionals. Increased federal competitive grant funding for extension, the creation of eXtension, the development of smartphone and related electronic technologies, and the rapidly increasing popularity of social media created new opportunities for extension educators to disseminate knowledge to a variety of audiences and engage these audiences in electronic discussions. Competitive grant funding opportunities for extension efforts to advance animal agriculture became available from the USDA National Institute of Food and Agriculture (NIFA) and have increased dramatically in recent years. The majority of NIFA funding opportunities require extension efforts to be integrated with research, and NIFA encourages the use of eXtension and other cutting-edge approaches to extend research to traditional clientele and nontraditional audiences. A case study is presented to illustrate how research and extension were integrated to improve the adoption of AI by beef producers. Those in agriculture are increasingly resorting to the use of social media venues such as Facebook, YouTube, LinkedIn, and Twitter to access information required to support their enterprises. Use of these various approaches by extension educators requires appreciation of the technology and an understanding of how the target audiences access information available on social media. Technology to deliver information is changing rapidly, and Cooperative Extension Service professionals will need to continuously evaluate digital technology and social media tools to appropriately integrate them into learning and educational opportunities.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-30
...] Construction Fall Protection Systems Criteria and Practices and Training Requirements; Extension of the Office of Management and Budget's (OMB) Approval of Information Collection (Paperwork) Requirements AGENCY... requirements contained in the construction standards on Fall Protection Systems Criteria and Practices (29 CFR...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, S.J.; Hensley, C.A.; Armenta, C.E.
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for α-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of ~2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously. 24 refs., 2 figs., 2 tabs.
Open science resources for the discovery and analysis of Tara Oceans data
Pesant, Stéphane; Not, Fabrice; Picheral, Marc; Kandels-Lewis, Stefanie; Le Bescot, Noan; Gorsky, Gabriel; Iudicone, Daniele; Karsenti, Eric; Speich, Sabrina; Troublé, Romain; Dimier, Céline; Searson, Sarah; Acinas, Silvia G.; Bork, Peer; Boss, Emmanuel; Bowler, Chris; Vargas, Colomban De; Follows, Michael; Gorsky, Gabriel; Grimsley, Nigel; Hingamp, Pascal; Iudicone, Daniele; Jaillon, Olivier; Kandels-Lewis, Stefanie; Karp-Boss, Lee; Karsenti, Eric; Krzic, Uros; Not, Fabrice; Ogata, Hiroyuki; Pesant, Stéphane; Raes, Jeroen; Reynaud, Emmanuel G.; Sardet, Christian; Sieracki, Mike; Speich, Sabrina; Stemmann, Lars; Sullivan, Matthew B.; Sunagawa, Shinichi; Velayoudon, Didier; Weissenbach, Jean; Wincker, Patrick
2015-01-01
The Tara Oceans expedition (2009–2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of online discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port-to-port), stations and sampling events. PMID:26029378
Open science resources for the discovery and analysis of Tara Oceans data
NASA Astrophysics Data System (ADS)
2015-05-01
The Tara Oceans expedition (2009-2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of online discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port-to-port), stations and sampling events.
Open science resources for the discovery and analysis of Tara Oceans data.
Pesant, Stéphane; Not, Fabrice; Picheral, Marc; Kandels-Lewis, Stefanie; Le Bescot, Noan; Gorsky, Gabriel; Iudicone, Daniele; Karsenti, Eric; Speich, Sabrina; Troublé, Romain; Dimier, Céline; Searson, Sarah
2015-01-01
The Tara Oceans expedition (2009-2013) sampled contrasting ecosystems of the world oceans, collecting environmental data and plankton, from viruses to metazoans, for later analysis using modern sequencing and state-of-the-art imaging technologies. It surveyed 210 ecosystems in 20 biogeographic provinces, collecting over 35,000 samples of seawater and plankton. The interpretation of such an extensive collection of samples in their ecological context requires means to explore, assess and access raw and validated data sets. To address this challenge, the Tara Oceans Consortium offers open science resources, including the use of open access archives for nucleotides (ENA) and for environmental, biogeochemical, taxonomic and morphological data (PANGAEA), and the development of online discovery tools and collaborative annotation tools for sequences and images. Here, we present an overview of Tara Oceans Data, and we provide detailed registries (data sets) of all campaigns (from port-to-port), stations and sampling events.
Mars Sample Return Using Commercial Capabilities: Propulsive Entry, Descent and Landing
NASA Technical Reports Server (NTRS)
Lemke, Lawrence G.; Gonzales, Andrew A.; Huynh, Loc C.
2014-01-01
Mars Sample Return (MSR) is the highest priority science mission for the next decade as recommended by the recent Decadal Survey of Planetary Science. The objective of the study was to determine whether emerging commercial capabilities can be integrated into such a mission. The premise of the study is that commercial capabilities can be more efficient than previously described systems, and that by using fewer systems and fewer or less extensive launches, overall mission cost can be reduced. This presentation describes an EDL technique using planned upgrades to the Dragon capsule to perform a Supersonic Retropulsion Entry: the Red Dragon concept. Landed payload capability meets mission requirements for an MSR architecture that reduces complexity.
Chang, Yu-Wei; Tsong, Yi; Zhao, Zhigen
2017-01-01
Assessing equivalence or similarity has drawn much attention recently as many drug products have lost or will lose their patents in the next few years, especially certain best-selling biologics. To claim equivalence between the test treatment and the reference treatment when assay sensitivity is well established from historical data, one has to demonstrate both superiority of the test treatment over placebo and equivalence between the test treatment and the reference treatment. Thus, there is urgency for practitioners to derive a practical way to calculate sample size for a three-arm equivalence trial. The primary endpoints of a clinical trial may not always be continuous; they may be discrete. In this paper, the authors derive the power function and discuss sample size requirements for a three-arm equivalence trial with Poisson and negative binomial clinical endpoints. In addition, the authors examine the effect of the dispersion parameter on the power and the sample size by varying its coefficient from small to large. In extensive numerical studies, the authors demonstrate that the required sample size heavily depends on the dispersion parameter. Therefore, misusing a Poisson model for negative binomial data may easily lose up to 20% power, depending on the value of the dispersion parameter.
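The dispersion effect is easy to reproduce by simulation. The sketch below (illustrative rates and sample sizes, and a simple Wald test rather than the authors' power function) shows empirical power eroding as negative binomial dispersion grows:

```python
import numpy as np

# Simulation sketch of the dispersion effect discussed above: the data
# are negative binomial with mean mu and shape k, so var = mu + mu^2/k.
# All parameters are illustrative, not taken from the paper.
rng = np.random.default_rng(4)

def reject_rate(rate0=1.0, rate1=0.7, n=150, disp=1.0, sims=5000):
    """Two-sample rate comparison with a Wald test on the means."""
    hits = 0
    for _ in range(sims):
        y0 = rng.negative_binomial(disp, disp / (disp + rate0), n)
        y1 = rng.negative_binomial(disp, disp / (disp + rate1), n)
        se = np.sqrt(y0.var(ddof=1) / n + y1.var(ddof=1) / n)
        hits += abs(y0.mean() - y1.mean()) / se > 1.96
    return hits / sims

for k in (0.5, 1.0, 5.0, 50.0):
    # small k = heavy overdispersion; power falls as k shrinks
    print(k, reject_rate(disp=k))
```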
Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology
NASA Technical Reports Server (NTRS)
Blackmore, Lars James C.; Acikmese, Behcet; Mandic, Milan
2012-01-01
A software tool is used to demonstrate the feasibility of Touch and Go (TAG) sampling for Asteroid Sample Return missions. TAG is a concept whereby a spacecraft is in contact with the surface of a small body, such as a comet or asteroid, for a few seconds or less before ascending to a safe location away from the small body. Previous work at JPL developed the G-TAG simulation tool, which provides a software environment for fast, multi-body simulations of the TAG event. G-TAG is described in Multibody Simulation Software Testbed for Small-Body Exploration and Sampling, (NPO-47196) NASA Tech Briefs, Vol. 35, No. 11 (November 2011), p.54. This current innovation adapts this tool to a mission that intends to return a sample from the surface of an asteroid. In order to demonstrate the feasibility of the TAG concept, the new software tool was used to generate extensive simulations that demonstrate the designed spacecraft meets key requirements. These requirements state that contact force and duration must be sufficient to ensure that enough material from the surface is collected in the brushwheel sampler (BWS), and that the spacecraft must survive the contact and must be able to recover and ascend to a safe position, and maintain velocity and orientation after the contact.
Yoo, Tae Yeon; Adhikari, Aashish; Xia, Zhen; Huynh, Tien; Freed, Karl F.; Zhou, Ruhong; Sosnick, Tobin R.
2012-01-01
Progress in understanding protein folding relies heavily upon an interplay between experiment and theory. In particular, readily interpretable experimental data are required that can be meaningfully compared to simulations. According to standard mutational φ analysis, the transition state for Protein L contains only a single hairpin. However, we demonstrate here using ψ analysis with engineered metal ion binding sites that the transition state is extensive, containing the entire four-stranded β sheet. Underreporting of the structural content of the transition state by φ analysis also occurs for acyl phosphatase, ubiquitin and BdpA. The carboxy terminal hairpin in the transition state of Protein L is found to be non-native, a significant result that agrees with our PDB-based backbone sampling and all-atom simulations. The non-native character partially explains the failure of accepted experimental and native-centric computational approaches to adequately describe the transition state. Hence, caution is required even when an apparent agreement exists between experiment and theory, thus highlighting the importance of having alternative methods for characterizing transition states. PMID:22522126
High-throughput hyperpolarized 13C metabolic investigations using a multi-channel acquisition system
NASA Astrophysics Data System (ADS)
Lee, Jaehyuk; Ramirez, Marc S.; Walker, Christopher M.; Chen, Yunyun; Yi, Stacey; Sandulache, Vlad C.; Lai, Stephen Y.; Bankson, James A.
2015-11-01
Magnetic resonance imaging and spectroscopy of hyperpolarized (HP) compounds such as [1-13C]-pyruvate have shown tremendous potential for offering new insight into disease and response to therapy. New applications of this technology in clinical research and care will require extensive validation in cells and animal models, a process that may be limited by the high cost and modest throughput associated with dynamic nuclear polarization. Relatively wide spectral separation between [1-13C]-pyruvate and its chemical endpoints in vivo are conducive to simultaneous multi-sample measurements, even in the presence of a suboptimal global shim. Multi-channel acquisitions could conserve costs and accelerate experiments by allowing acquisition from multiple independent samples following a single dissolution. Unfortunately, many existing preclinical MRI systems are equipped with only a single channel for broadband acquisitions. In this work, we examine the feasibility of this concept using a broadband multi-channel digital receiver extension and detector arrays that allow concurrent measurement of dynamic spectroscopic data from ex vivo enzyme phantoms, in vitro anaplastic thyroid carcinoma cells, and in vivo in tumor-bearing mice. Throughput and the cost of consumables were improved by up to a factor of four. These preliminary results demonstrate the potential for efficient multi-sample studies employing hyperpolarized agents.
Bordelon, B M; Hobday, K A; Hunter, J G
1992-01-01
An unsolved problem of laparoscopic cholecystectomy is the optimal method of removing the gallbladder with thick walls and a large stone burden. Proposed solutions include fascial dilatation, stone crushing, and ultrasonic, high-speed rotary, or laser lithotripsy. Our observation was that extension of the fascial incision to remove the impacted gallbladder was time efficient and did not increase postoperative pain. We reviewed the narcotic requirements of 107 consecutive patients undergoing laparoscopic cholecystectomy. Fifty-two patients required extension of the umbilical incision, and 55 patients did not have their fascial incision enlarged. Parenteral meperidine use was 39.5 +/- 63.6 mg in the patients requiring fascial incision extension and 66.3 +/- 79.2 mg in those not requiring fascial incision extension (mean +/- standard deviation). Oral narcotic requirements were 1.1 +/- 1.5 doses vs 1.3 +/- 1.7 doses in patients with and without incision extension, respectively. The wide range of narcotic use in both groups makes these apparent differences not statistically significant. We conclude that protracted attempts at stone crushing or expensive stone fragmentation devices are unnecessary for the extraction of a difficult gallbladder during laparoscopic cholecystectomy.
NASA Astrophysics Data System (ADS)
Ling, Yuye; Hendon, Christine P.
2016-02-01
Functional extensions to optical coherence tomography (OCT) provide useful imaging contrasts that are complementary to conventional OCT. Our goal is to characterize tissue types within the myocardium due to remodeling and therapy. High-speed imaging is necessary to extract mechanical properties and dynamics of fiber orientation changes in a beating heart. Functional extensions of OCT such as polarization sensitive OCT and optical coherence elastography (OCE) require high phase stability of the system, which is a drawback of current mechanically tuned swept source OCT systems. Here we present a high-speed functional imaging platform, which includes an ultrahigh-phase-stable swept source equipped with a KTN deflector from NTT-AT. The swept source does not require mechanical movements during the wavelength sweeping; it is electrically tuned. The inter-sweep phase variance of the system was measured to be less than 300 ps at a path length difference of ~2 mm. The axial resolution of the system is 20 µm and the -10 dB fall-off depth is about 3.2 mm. The sample arm has an 8 mm × 8 mm field of view with a lateral resolution of approximately 18 µm. The sample arm uses a two-axis MEMS mirror, which is programmable and capable of scanning arbitrary patterns at a sampling rate of 50 kHz. Preliminary imaging results showed differences in polarization properties and image penetration in ablated and normal myocardium. In the future, we will conduct dynamic stretching experiments with strips of human myocardial tissue to characterize mechanical properties using OCE. With high speed imaging of 200 kHz and an all-fiber design, we will work towards catheter-based functional imaging.
Applied 3D printing for microscopy in health science research
NASA Astrophysics Data System (ADS)
Brideau, Craig; Zareinia, Kourosh; Stys, Peter
2015-03-01
The rapid prototyping capability offered by 3D printing is considered advantageous for commercial applications. However, the ability to quickly produce precision custom devices is highly beneficial in the research laboratory setting as well. Biological laboratories require the manipulation and analysis of delicate living samples, and the ability to create custom holders, support equipment, and adapters allows the extension of existing laboratory machines. Applications include camera adapters and stage sample holders for microscopes, surgical guides for tissue preparation, and small precision tools customized to unique specifications. Where high precision is needed, especially the reproduction of fine features, a printer with high resolution is needed. However, cheaper, lower-resolution commercial printers have been shown to be more than adequate for less demanding projects. For direct manipulation of delicate samples, biocompatible raw materials are often required, complicating the printing process. This paper will examine some examples of 3D-printed objects for laboratory use, and provide an overview of the requirements for 3D printing for this application. Materials, printing resolution, production, and ease of use will all be reviewed with an eye to producing better printers and techniques for laboratory applications. Specific case studies will highlight applications for 3D-printed devices in live animal imaging for both microscopy and Magnetic Resonance Imaging.
Pikkemaat, M G; Rapallini, M L B A; Karp, M T; Elferink, J W A
2010-08-01
Tetracyclines are extensively used in veterinary medicine. For the detection of tetracycline residues in animal products, a broad array of methods is available. Luminescent bacterial biosensors represent an attractive, inexpensive, simple and fast method for screening large numbers of samples. A previously developed cell-biosensor method was subjected to an evaluation study using over 300 routine poultry samples, and the results were compared with a microbial inhibition test. The cell-biosensor assay yielded many more suspect samples, 10.2% versus 2% with the inhibition test, all of which could be confirmed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Only one sample contained a concentration above the maximum residue limit (MRL) of 100 microg kg(-1), while residue levels in most of the suspect samples were very low (<10 microg kg(-1)). The method appeared to be specific and robust. Using an experimental set-up comprising the analysis of a series of three sample dilutions allowed an appropriate cut-off for confirmatory analysis, limiting the number of samples requiring further analysis to a minimum.
Wright, Mark H.; Tung, Chih-Wei; Zhao, Keyan; Reynolds, Andy; McCouch, Susan R.; Bustamante, Carlos D.
2010-01-01
Motivation: The development of new high-throughput genotyping products requires a significant investment in testing and training samples to evaluate and optimize the product before it can be used reliably on new samples. One reason for this is current methods for automated calling of genotypes are based on clustering approaches which require a large number of samples to be analyzed simultaneously, or an extensive training dataset to seed clusters. In systems where inbred samples are of primary interest, current clustering approaches perform poorly due to the inability to clearly identify a heterozygote cluster. Results: As part of the development of two custom single nucleotide polymorphism genotyping products for Oryza sativa (domestic rice), we have developed a new genotype calling algorithm called ‘ALCHEMY’ based on statistical modeling of the raw intensity data rather than modelless clustering. A novel feature of the model is the ability to estimate and incorporate inbreeding information on a per sample basis allowing accurate genotyping of both inbred and heterozygous samples even when analyzed simultaneously. Since clustering is not used explicitly, ALCHEMY performs well on small sample sizes with accuracy exceeding 99% with as few as 18 samples. Availability: ALCHEMY is available for both commercial and academic use free of charge and distributed under the GNU General Public License at http://alchemy.sourceforge.net/ Contact: mhw6@cornell.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20926420
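The statistical idea behind the per-sample inbreeding estimate can be paraphrased as a prior that shrinks the heterozygote class. The sketch below is our reading of that idea, not the released ALCHEMY code; the likelihoods and frequencies are invented.

```python
import numpy as np

# Sketch (our paraphrase, not ALCHEMY itself): genotype priors depart
# from Hardy-Weinberg according to a per-sample inbreeding coefficient F,
# so inbred samples get almost no prior mass on the heterozygote.
def genotype_posteriors(log_lik, p, f):
    """log_lik: per-genotype intensity log-likelihoods (AA, AB, BB);
    p: B allele frequency; f: inbreeding coefficient in [0, 1]."""
    q = 1.0 - p
    prior = np.array([
        q * q + f * p * q,        # AA
        2 * p * q * (1 - f),      # AB: suppressed in inbred samples
        p * p + f * p * q,        # BB
    ])
    post = prior * np.exp(log_lik - np.max(log_lik))
    return post / post.sum()

# Ambiguous intensities: an inbred prior (f=0.99) pulls the call away
# from the heterozygote that a cluster-based caller would favor.
ll = np.log(np.array([0.30, 0.45, 0.25]))
print(genotype_posteriors(ll, p=0.4, f=0.0))    # AB favored
print(genotype_posteriors(ll, p=0.4, f=0.99))   # AA favored
```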
Wotring, Virginia E
2016-01-01
Medications degrade over time, and degradation is hastened by extreme storage conditions. Current procedures ensure that medications aboard the International Space Station (ISS) are restocked before their expiration dates, but resupply may not be possible on future long-duration exploration missions. For this reason, medications stored on the ISS were returned to Earth for analysis. This was an opportunistic, observational pilot-scale investigation to test the hypothesis that ISS-aging does not cause unusual degradation. Nine medications were analyzed for active pharmaceutical ingredient (API) content and degradant amounts; results were compared to 2012 United States Pharmacopeia (USP) requirements. The medications were two sleep aids, two antihistamines/decongestants, three pain relievers, an antidiarrheal, and an alertness medication. Because the samples were obtained opportunistically from unused medical supplies, each medication was available at only 1 time point and no control samples (samples aged for a similar period on Earth) were available. One medication met USP requirements 5 months after its expiration date. Four of the nine medications tested (44%) met USP requirements 8 months post expiration. Another three medications (33%) met USP guidelines 2-3 months before expiration. One compound, a dietary supplement used as a sleep aid, failed to meet USP requirements at 11 months post expiration. No unusual degradation products were identified. Limited, evidence-based extension of medication shelf-lives may be possible and would be useful in preparation for lengthy exploration missions. Only analysis of flight-aged samples compared to appropriately matched ground controls will permit determination of the spaceflight environment on medication stability.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-09
...] Additional Requirements for Special Dipping and Coating Operations (Dip Tanks); Extension of the Office of Management and Budget's Approval of the Information Collection (Paperwork) Requirement AGENCY: Occupational... requirement specified in its Standard on Dipping and Coating Operations (Dip Tanks) (29 CFR 1910.126(g)(4...
Detection of halogenated flame retardants in polyurethane foam by particle induced X-ray emission
NASA Astrophysics Data System (ADS)
Maley, Adam M.; Falk, Kyle A.; Hoover, Luke; Earlywine, Elly B.; Seymour, Michael D.; DeYoung, Paul A.; Blum, Arlene; Stapleton, Heather M.; Peaslee, Graham F.
2015-09-01
A novel application of particle-induced X-ray emission (PIXE) has been developed to detect the presence of chlorinated and brominated flame retardant chemicals in polyurethane foams. Traditional Gas Chromatography-Mass Spectrometry (GC-MS) methods for the detection and identification of halogenated flame retardants in foams require extensive sample preparation and data acquisition time. The elemental analysis of the halogens in polyurethane foam performed by PIXE offers the opportunity to identify the presence of halogenated flame retardants in a fraction of the time and sample preparation cost. Through comparative GC-MS and PIXE analysis of 215 foam samples, excellent agreement between the two methods was obtained. These results suggest that PIXE could be an ideal rapid screening method for the presence of chlorinated and brominated flame retardants in polyurethane foams.
NASA Technical Reports Server (NTRS)
Bao, Xiaoqi; Badescu, Mircea; Bar-Cohen, Yoseph
2015-01-01
The potential to return Martian samples to Earth for extensive analysis is of great interest to the planetary science community. It is important to ensure that the mission would securely contain any microbes that may possibly exist on Mars, so that they would not be able to cause any adverse effects on Earth's environment. A brazing sealing and sterilizing technique has been proposed to break the Mars-to-Earth contamination chain. Thermal analysis of the brazing process was conducted for several conceptual designs that apply the technique. Control of the increase of the temperature of the Martian samples is a challenge. The temperature profiles of the Martian samples being sealed in the container were predicted by finite element thermal models. The results show that the sealing and sterilization process can be controlled such that the samples' temperature is maintained below the potentially required level, and that the brazing technique is a feasible approach to break the contamination chain.
Sample preparation techniques for the determination of trace residues and contaminants in foods.
Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M
2007-06-15
The determination of trace residues and contaminants in complex matrices, such as food, often requires extensive sample extraction and preparation prior to instrumental analysis. Sample preparation is often the bottleneck in analysis, and there is a need to minimise the number of steps to reduce both time and sources of error. There is also a move towards more environmentally friendly techniques, which use less solvent and smaller sample sizes. Smaller sample size becomes important when dealing with real life problems, such as consumer complaints and alleged chemical contamination. Optimal sample preparation can reduce analysis time and sources of error, enhance sensitivity, and enable unequivocal identification, confirmation and quantification. This review considers all aspects of sample preparation, covering general extraction techniques, such as Soxhlet and pressurised liquid extraction, microextraction techniques such as liquid phase microextraction (LPME) and more selective techniques, such as solid phase extraction (SPE), solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The applicability of each technique in food analysis, particularly for the determination of trace organic contaminants in foods, is discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
...-9252-4] Extension of Comment Period on Change to the Reporting Date for Certain Data Elements Required... Change to the Reporting Date for Certain Data Elements Required Under the Mandatory Reporting of... the Reporting Date for Certain Data Elements Required Under the Mandatory Reporting of Greenhouse...
Reliable use of determinants to solve nonlinear structural eigenvalue problems efficiently
NASA Technical Reports Server (NTRS)
Williams, F. W.; Kennedy, D.
1988-01-01
The analytical derivation, numerical implementation, and performance of a multiple-determinant parabolic interpolation method (MDPIM) for use in solving transcendental eigenvalue (critical buckling or undamped free vibration) problems in structural mechanics are presented. The overall bounding, eigenvalue-separation, qualified parabolic interpolation, accuracy-confirmation, and convergence-recovery stages of the MDPIM are described in detail, and the numbers of iterations required to solve sample plane-frame problems using the MDPIM are compared with those for a conventional bisection method and for the Newtonian method of Simpson (1984) in extensive tables. The MDPIM is shown to use 31 percent less computation time than bisection when accuracy of 0.0001 is required, but 62 percent less when accuracy of 10^-8 is required; the time savings over the Newtonian method are about 10 percent.
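The parabolic interpolation at the core of the MDPIM can be illustrated with a short sketch. The code below is not the published algorithm (which adds the bounding, eigenvalue-separation, accuracy-confirmation, and convergence-recovery stages described above); it simply refines a zero of a determinant function by repeatedly fitting a parabola through three trial abscissae (Muller's method), and it uses a small linear pencil so the example stays self-contained, whereas in the transcendental problems the paper targets the stiffness matrix itself depends on the eigenparameter.

```python
import numpy as np

def muller_step(x0, x1, x2, f0, f1, f2):
    # One parabolic-interpolation (Muller) step through three points.
    h1, h2 = x1 - x0, x2 - x1
    d1, d2 = (f1 - f0) / h1, (f2 - f1) / h2
    a = (d2 - d1) / (h2 + h1)
    b = a * h2 + d2
    disc = np.sqrt(max(b * b - 4.0 * f2 * a, 0.0))  # clipped to stay real
    denom = b + disc if b >= 0 else b - disc
    return x2 - 2.0 * f2 / denom

def eigenvalue_by_determinant(det, lo, hi, tol=1e-10, max_iter=100):
    # Track an eigenvalue inside (lo, hi) as a zero of the determinant.
    x0, x1, x2 = lo, 0.5 * (lo + hi), hi
    f0, f1, f2 = det(x0), det(x1), det(x2)
    for _ in range(max_iter):
        x3 = muller_step(x0, x1, x2, f0, f1, f2)
        f3 = det(x3)
        if f3 == 0.0 or abs(x3 - x2) < tol * max(1.0, abs(x3)):
            return x3
        x0, f0, x1, f1, x2, f2 = x1, f1, x2, f2, x3, f3
    return x2

# Toy linear pencil: eigenvalues of (K, M) are zeros of det(K - lam*M).
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
M = np.eye(2)
det = lambda lam: np.linalg.det(K - lam * M)
print(eigenvalue_by_determinant(det, 0.2, 1.8))  # converges to 1.0
```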
Varadarajan, Divya; Haldar, Justin P
2017-11-01
The data measured in diffusion MRI can be modeled as the Fourier transform of the Ensemble Average Propagator (EAP), a probability distribution that summarizes the molecular diffusion behavior of the spins within each voxel. This Fourier relationship is potentially advantageous because of the extensive theory that has been developed to characterize the sampling requirements, accuracy, and stability of linear Fourier reconstruction methods. However, existing diffusion MRI data sampling and signal estimation methods have largely been developed and tuned without the benefit of such theory, instead relying on approximations, intuition, and extensive empirical evaluation. This paper aims to address this discrepancy by introducing a novel theoretical signal processing framework for diffusion MRI. The new framework can be used to characterize arbitrary linear diffusion estimation methods with arbitrary q-space sampling, and can be used to theoretically evaluate and compare the accuracy, resolution, and noise-resilience of different data acquisition and parameter estimation techniques. The framework is based on the EAP, and makes very limited modeling assumptions. As a result, the approach can even provide new insight into the behavior of model-based linear diffusion estimation methods in contexts where the modeling assumptions are inaccurate. The practical usefulness of the proposed framework is illustrated using both simulated and real diffusion MRI data in applications such as choosing between different parameter estimation methods and choosing between different q-space sampling schemes. Copyright © 2017 Elsevier Inc. All rights reserved.
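The Fourier relationship the framework rests on can be written out in a few lines: the measured signal E(q) is the Fourier transform of the EAP, so a linear reconstruction amounts to a suitably scaled inverse DFT of the q-space samples. The one-dimensional toy below assumes free Gaussian diffusion on a Cartesian q-space grid (all parameter values are illustrative); it demonstrates the signal model only, not the paper's estimation framework.

```python
import numpy as np

# 1-D illustration: for free Gaussian diffusion the q-space signal is
# E(q) = exp(-4*pi^2*q^2*D*t), whose inverse Fourier transform is the
# Gaussian EAP P(r) = (4*pi*D*t)**-0.5 * exp(-r**2 / (4*D*t)).
D, t = 2.0e-3, 0.05            # assumed diffusivity (mm^2/s) and time (s)
n, dq = 64, 5.0                # number of q samples and spacing (1/mm)
q = (np.arange(n) - n // 2) * dq
E = np.exp(-4.0 * np.pi**2 * q**2 * D * t)

# Linear Fourier reconstruction: a scaled inverse DFT of the samples.
P = np.fft.fftshift(np.fft.ifft(np.fft.ifftshift(E))).real * n * dq
r = np.fft.fftshift(np.fft.fftfreq(n, d=dq))       # displacement axis (mm)
print(P.max(), (4 * np.pi * D * t) ** -0.5)        # peak matches analytic value
```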
Is 50 Hz high enough ECG sampling frequency for accurate HRV analysis?
Mahdiani, Shadi; Jeyhani, Vala; Peltokangas, Mikko; Vehkaoja, Antti
2015-01-01
With the worldwide growth of mobile wireless technologies, healthcare services can be provided anytime and anywhere. Usage of wearable wireless physiological monitoring systems has increased extensively during the last decade. These mobile devices can continuously measure, for example, heart activity and wirelessly transfer the data to the patient's mobile phone. One significant restriction for these devices is energy consumption, which pushes designs toward low sampling rates. This article investigates the lowest adequate sampling frequency of the ECG signal for achieving sufficiently accurate time-domain heart rate variability (HRV) parameters. For this purpose, ECG signals originally measured at a high 5 kHz sampling rate were down-sampled to simulate measurement at lower sampling rates. Down-sampling loses information and decreases temporal accuracy, which was then partially restored by interpolating the signals back to their original sampling rates. The HRV parameters obtained from the ECG signals at lower sampling rates were compared. The results show that even when the sampling rate of the ECG signal is as low as 50 Hz, the HRV parameters remain accurate to within a reasonable error.
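The mechanism behind the accuracy loss is timing quantization: at sampling frequency fs, an R-peak position is only known to within roughly 1/fs, which inflates beat-to-beat variability measures. The sketch below (hypothetical RR series, quantization only, no interpolation step) illustrates the effect on the time-domain RMSSD parameter; the interpolation used in the study recovers much of this loss.

```python
import numpy as np

rng = np.random.default_rng(0)
rr_true = 0.8 + 0.04 * rng.standard_normal(300)   # hypothetical RR series (s)
t_true = np.cumsum(rr_true)                       # true R-peak times

def rmssd_ms(t_peaks):
    # Root mean square of successive RR-interval differences, in ms.
    rr = np.diff(t_peaks)
    return 1e3 * np.sqrt(np.mean(np.diff(rr) ** 2))

for fs in (1000, 250, 50):
    t_q = np.round(t_true * fs) / fs   # R-peaks snapped to the sample grid
    print(f"fs = {fs:4d} Hz: RMSSD = {rmssd_ms(t_q):.1f} ms "
          f"(true {rmssd_ms(t_true):.1f} ms)")
```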
Code of Federal Regulations, 2013 CFR
2013-07-01
... purchase orders at issue and would require extensive training to learn new technology or processes that... require extensive training to learn new technology or processes that would not be required of a new...) Managerial and supervisory employees. This part does not apply to employees who are managerial or supervisory...
Code of Federal Regulations, 2014 CFR
2014-07-01
... purchase orders at issue and would require extensive training to learn new technology or processes that... require extensive training to learn new technology or processes that would not be required of a new...) Managerial and supervisory employees. This part does not apply to employees who are managerial or supervisory...
Code of Federal Regulations, 2012 CFR
2012-07-01
... purchase orders at issue and would require extensive training to learn new technology or processes that... require extensive training to learn new technology or processes that would not be required of a new...) Managerial and supervisory employees. This part does not apply to employees who are managerial or supervisory...
McMullin, David; Mizaikoff, Boris; Krska, Rudolf
2015-01-01
Infrared spectroscopy is a rapid, nondestructive analytical technique that can be applied to the authentication and characterization of food samples in high throughput. In particular, near infrared spectroscopy is commonly utilized in the food quality control industry to monitor the physical attributes of numerous cereal grains for protein, carbohydrate, and lipid content. IR-based methods require little sample preparation, labor, or technical competence if multivariate data mining techniques are implemented; however, they do require extensive calibration. Economically important crops are infected by fungi that can severely reduce crop yields and quality and, in addition, produce mycotoxins. Owing to the health risks associated with mycotoxins in the food chain, regulatory limits have been set by both national and international institutions for specific mycotoxins and mycotoxin classes. This article discusses the progress and potential of IR-based methods as an alternative to existing chemical methods for the determination of fungal contamination in crops, as well as emerging spectroscopic methods.
Nano-plasmonic exosome diagnostics
Im, Hyungsoon; Shao, Huilin; Weissleder, Ralph; Castro, Cesar M.; Lee, Hakho
2015-01-01
Exosomes have emerged as a promising biomarker. These vesicles abound in biofluids and harbor molecular constituents from their parent cells, thereby offering a minimally-invasive avenue for molecular analyses. Despite such clinical potential, routine exosomal analysis, particularly the protein assay, remains challenging, due to requirements for large sample volumes and extensive processing. We have been developing miniaturized systems to facilitate clinical exosome studies. These systems can be categorized into two components: microfluidics for sample preparation and analytical tools for protein analyses. In this report, we review a new assay platform, nano-plasmonic exosome (nPLEX), in which sensing is based on surface plasmon resonance to achieve label-free exosome detection. Looking forward, we also discuss some potential challenges and improvements in exosome studies. PMID:25936957
NASA Technical Reports Server (NTRS)
Griffin, Timothy P.; Naylor, Guy R.; Hritz, Richard J.; Barrett, Carolyn A.
1997-01-01
The main engines of the Space Shuttle use hydrogen and oxygen as the fuel and oxidant. The explosive and fire hazards associated with these two components pose a serious danger to personnel and equipment. Therefore, prior to use, the main engines undergo extensive leak tests. Instead of using hazardous gases, these tests utilize helium as the tracer element. This results in a need to monitor helium at the ppm level continuously for hours. The major challenge in developing such a low-level gas monitor is the sample delivery system. This paper discusses a system developed to meet the requirements while also being mobile. Also presented are the calibration technique and the stability and accuracy results for the system.
NASA Astrophysics Data System (ADS)
Nazarzadeh Zare, Mohsen; Dorrani, Kamal; Gholamali Lavasani, Masoud
2012-11-01
Background and purpose: This study examines the views of farmers and extension agents participating in extension education courses in Dezful, Iran, with regard to problems with these courses. It relies upon a descriptive methodology, using a survey as its instrument. Sample: The statistical population consisted of 5060 farmers and 50 extension agents; all extension agents were studied owing to their small population and a sample of 466 farmers was selected based on the stratified ratio sampling method. For the data analysis, statistical procedures including the t-test and factor analysis were used. Results: The results of factor analysis on the views of farmers indicated that these courses have problems such as inadequate use of instructional materials by extension agents, insufficient employment of knowledgeable and experienced extension agents, bad and inconvenient timing of courses for farmers, lack of logical connection between one curriculum and prior ones, negligence in considering the opinions of farmers in arranging the courses, and lack of information about the time of courses. The findings of factor analysis on the views of extension agents indicated that these courses suffer from problems such as use of consistent methods of instruction for teaching curricula, and lack of continuity between courses and their levels and content. Conclusions: Recommendations include: listening to the views of farmers when planning extension courses; providing audiovisual aids, pamphlets and CDs; arranging courses based on convenient timing for farmers; using incentives to encourage participation; and employing extension agents with knowledge of the latest agricultural issues.
The Text Encoding Initiative: Flexible and Extensible Document Encoding.
ERIC Educational Resources Information Center
Barnard, David T.; Ide, Nancy M.
1997-01-01
The Text Encoding Initiative (TEI), an international collaboration aimed at producing a common encoding scheme for complex texts, examines the requirement for generality versus the requirement to handle specialized text types. Discusses how documents and users tax the limits of fixed schemes requiring flexible extensible encoding to support…
Occurrence of antibiotics in water from 13 fish hatcheries, 2001-2003
Dietze, J.E.; Scribner, E.A.; Meyer, M.T.; Kolpin, D.W.
2005-01-01
A 2-year study of extensive and intensive fish hatcheries was conducted to assess the general temporal occurrence of antibiotics in aquaculture. Antibiotics were detected in 15% of the water samples collected during the 2001-2002 collection period and in 31% of the samples during the 2003 collection period. Antibiotics were detected more frequently in samples from the intensive hatcheries (17 and 39%) than in samples from the extensive hatcheries (14 and 4%) during the 2001-2002 and 2003 collection periods, respectively. The maximum ormetoprim, oxytetracycline, and sulphadimethoxine concentrations were higher in samples from the intensive hatcheries (12, 10, and 36 µg L⁻¹, respectively) than in samples from the extensive hatcheries (<0.05, 0.31, and 1.2 µg L⁻¹, respectively). Sulphadimethoxine persisted for a longer period of time (up to 48 days) than ormetoprim (up to 28 days) and oxytetracycline (less than 20 days).
Use of adaptive walls in 2D tests
NASA Technical Reports Server (NTRS)
Archambaud, J. P.; Chevallier, J. P.
1984-01-01
A new method for computing wall effects gives precise answers to some questions arising in adaptive wall concept applications: the length of the adapted regions, the fairings with the upstream and downstream regions, the effects of residual misadjustments, and the reference conditions. The accelerated convergence of the iterative process and the efficient technology developed for the CERT T2 wind tunnel yield the required test conditions in a single run. Sample results from CAST 7 tests demonstrate the efficiency of the whole process in obtaining significant results, with consideration of the extension to the three-dimensional case.
Metis: A Pure Metropolis Markov Chain Monte Carlo Bayesian Inference Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Cameron Russell; Mckigney, Edward Allen
The use of Bayesian inference in data analysis has become the standard for large scientific experiments [1, 2]. The Monte Carlo Codes Group (XCP-3) at Los Alamos has developed a simple set of algorithms, currently implemented in C++ and Python, to easily perform flat-prior Markov Chain Monte Carlo Bayesian inference with pure Metropolis sampling. These implementations are designed to be user friendly and extensible for customization based on specific application requirements. This document describes the algorithmic choices made and presents two use cases.
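As a rough illustration of what such a library does (this is not the XCP-3 code or its API, only a generic sketch of flat-prior random-walk Metropolis in Python):

```python
import numpy as np

def metropolis(log_likelihood, x0, n_samples, step=0.5, seed=0):
    # Pure random-walk Metropolis with a flat (improper) prior: the
    # acceptance ratio reduces to the likelihood ratio L(x')/L(x).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    logl = log_likelihood(x)
    chain = np.empty((n_samples, x.size))
    for i in range(n_samples):
        prop = x + step * rng.standard_normal(x.size)
        logl_prop = log_likelihood(prop)
        if np.log(rng.random()) < logl_prop - logl:
            x, logl = prop, logl_prop
        chain[i] = x
    return chain

# Example: infer the mean of Gaussian data with known unit variance.
data = np.random.default_rng(1).normal(3.0, 1.0, size=100)
loglike = lambda mu: -0.5 * np.sum((data - mu) ** 2)
chain = metropolis(loglike, x0=[0.0], n_samples=5000)
print(chain[1000:].mean(), chain[1000:].std())  # posterior mean ~3, sd ~0.1
```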
A System for Cost and Reimbursement Control in Hospitals
Fetter, Robert B.; Thompson, John D.; Mills, Ronald E.
1976-01-01
This paper approaches the design of a regional or statewide hospital rate-setting system as the underpinning of a larger system which permits a regulatory agency to satisfy the requirements of various public laws now on the books or in process. It aims to generate valid interinstitutional monitoring on the three parameters of cost, utilization, and quality review. Such an approach requires the extension of the usual departmental cost and budgeting system to include consideration of the mix of patients treated and the utilization of various resources, including patient days, in the treatment of these patients. A sampling framework for the application of process-based quality studies and the generation of selected performance measurements is also included. PMID:941461
Aero-Mechanical Design Methodology for Subsonic Civil Transport High-Lift Systems
NASA Technical Reports Server (NTRS)
vanDam, C. P.; Shaw, S. G.; VanderKam, J. C.; Brodeur, R. R.; Rudolph, P. K. C.; Kinney, D.
2000-01-01
In today's highly competitive and economically driven commercial aviation market, the trend is to make aircraft systems simpler and to shorten their design cycle, which reduces recurring, non-recurring, and operating costs. One such system is the high-lift system. A methodology has been developed which merges aerodynamic data with kinematic analysis of the trailing-edge flap mechanism, with minimum mechanism definition required. This methodology provides quick and accurate aerodynamic performance prediction for a given flap deployment mechanism early in the high-lift system preliminary design stage. Sample analysis results for four different deployment mechanisms are presented, as well as descriptions of the aerodynamic and mechanism data required for evaluation. Extensions to interactive design capabilities are also discussed.
Wilson, Christina R; Mulligan, Christopher C; Strueh, Kurt D; Stevenson, Gregory W; Hooser, Stephen B
2014-05-01
Desorption electrospray ionization mass spectrometry (DESI-MS) is an emerging analytical technique that permits the rapid and direct analysis of biological or environmental samples under ambient conditions. Highlighting the versatility of this technique, DESI-MS has been used for the rapid detection of illicit drugs, chemical warfare agents, agricultural chemicals, and pharmaceuticals from a variety of sample matrices. In diagnostic veterinary toxicology, analyzing samples using traditional analytical instrumentation typically includes extensive sample extraction procedures, which can be time consuming and labor intensive. Therefore, efforts to expedite sample analyses are a constant goal for diagnostic toxicology laboratories. In the current report, DESI-MS was used to directly analyze stomach contents from a dog exposed to the organophosphate insecticide terbufos. The total DESI-MS analysis time required to confirm the presence of terbufos and diagnose organophosphate poisoning in this case was approximately 5 min. This highlights the potential of this analytical technique in the field of veterinary toxicology for the rapid diagnosis and detection of toxicants in biological samples. © 2014 The Author(s).
3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples
NASA Technical Reports Server (NTRS)
Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.
2015-01-01
In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically utilized sub-mm bead immersion techniques extensively, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and large processing time for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
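For context, once the scanner supplies a bulk volume, porosity follows from the grain and bulk densities via porosity = 1 - rho_bulk/rho_grain. A minimal calculation with hypothetical numbers:

```python
mass_g     = 15.2    # hypothetical sample mass (g)
volume_cm3 = 5.1     # bulk volume from the 3D shape model (cm^3)
rho_grain  = 3.35    # hypothetical grain density from pycnometry (g/cm^3)

rho_bulk = mass_g / volume_cm3
porosity = 1.0 - rho_bulk / rho_grain
print(f"bulk density {rho_bulk:.2f} g/cm^3, porosity {100 * porosity:.1f}%")
```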
A collaborative exercise on DNA methylation based body fluid typing.
Jung, Sang-Eun; Cho, Sohee; Antunes, Joana; Gomes, Iva; Uchimoto, Mari L; Oh, Yu Na; Di Giacomo, Lisa; Schneider, Peter M; Park, Min Sun; van der Meer, Dieudonne; Williams, Graham; McCord, Bruce; Ahn, Hee-Jung; Choi, Dong Ho; Lee, Yang Han; Lee, Soong Deok; Lee, Hwan Young
2016-10-01
A collaborative exercise on DNA methylation based body fluid identification was conducted by seven laboratories. For this project, a multiplex methylation SNaPshot reaction composed of seven CpG markers was used for the identification of four body fluids, including blood, saliva, semen, and vaginal fluid. A total of 30 specimens were prepared and distributed to participating laboratories after thorough testing. The required experiments included four increasingly complex tasks: (1) CE of a purified single-base extension reaction product, (2) multiplex PCR and multiplex single-base extension reaction of bisulfite-modified DNA, (3) bisulfite conversion of genomic DNA, and (4) extraction of genomic DNA from body fluid samples. In tasks 2, 3 and 4, one or more mixtures were analyzed, and specimens containing both known and unknown body fluid sources were used. Six of the laboratories generated consistent body fluid typing results for specimens of bisulfite-converted DNA and genomic DNA. One laboratory failed to set up appropriate conditions for capillary analysis of reference single-base extension products. In general, variation in the values obtained for DNA methylation analysis between laboratories increased with the complexity of the required experiments. However, all laboratories concurred on the interpretation of the DNA methylation profiles produced. Although the establishment of interpretational guidelines on DNA methylation based body fluid identification has yet to be performed, this study supports the addition of DNA methylation profiling to forensic body fluid typing. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Optimized exploration resource evaluation using the MDT tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zainun, K.; Trice, M.L.
1995-10-01
This paper discusses exploration cost reduction and improved resource delineation benefits that were realized by use of the MDT (Modular Formation Dynamic Tester) tool to evaluate exploration prospects in the Malay Basin of the South China Sea. Frequently, open hole logs do not clearly define fluid content due to low salinity of the connate water and the effect of shale laminae or bioturbation in the silty, shaley sandstones. Therefore, extensive pressure measurements and fluid sampling are required to define fluid type and contacts. This paper briefly describes the features of the MDT tool which were utilized to reduce rig time usage while providing more representative fluid samples and illustrates usage of these features with field examples. The tool has been used on several exploration wells and a comparison of MDT pressures and samples to results obtained with earlier vintage tools and production tests is also discussed.
Imaging samples larger than the field of view: the SLS experience
NASA Astrophysics Data System (ADS)
Vogiatzis Oikonomidis, Ioannis; Lovric, Goran; Cremona, Tiziana P.; Arcadu, Filippo; Patera, Alessandra; Schittny, Johannes C.; Stampanoni, Marco
2017-06-01
Volumetric datasets with micrometer spatial and sub-second temporal resolutions are nowadays routinely acquired using synchrotron X-ray tomographic microscopy (SRXTM). Although SRXTM technology allows the examination of multiple samples with short scan times, many specimens are larger than the field-of-view (FOV) provided by the detector. The extension of the FOV in the direction perpendicular to the rotation axis remains non-trivial. We present a method that can efficiently increase the FOV merging volumetric datasets obtained by region-of-interest tomographies in different 3D positions of the sample with a minimal amount of artefacts and with the ability to handle large amounts of data. The method has been successfully applied for the three-dimensional imaging of a small number of mouse lung acini of intact animals, where pixel sizes down to the micrometer range and short exposure times are required.
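The merging step can be pictured as a cross-fade of overlapping reconstructed blocks. The toy below blends two volumes that share a known number of slices along one axis; the actual pipeline additionally registers the blocks and handles local-tomography artefacts, so treat this only as a sketch of the stitching idea.

```python
import numpy as np

def blend_volumes(v1, v2, overlap):
    # Stitch two reconstructed volumes sharing `overlap` slices along
    # axis 0, cross-fading linearly inside the overlap to hide seams.
    w = np.linspace(1.0, 0.0, overlap)[:, None, None]   # weight for v1
    merged = w * v1[-overlap:] + (1.0 - w) * v2[:overlap]
    return np.concatenate([v1[:-overlap], merged, v2[overlap:]], axis=0)

a = np.random.rand(100, 64, 64)
b = np.random.rand(100, 64, 64)
b[:20] = a[-20:]                        # pretend 20 slices were scanned twice
print(blend_volumes(a, b, 20).shape)    # -> (180, 64, 64)
```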
Kottmann, Renzo; Gray, Tanya; Murphy, Sean; Kagan, Leonid; Kravitz, Saul; Lombardot, Thierry; Field, Dawn; Glöckner, Frank Oliver
2008-06-01
The Genomic Contextual Data Markup Language (GCDML) is a core project of the Genomic Standards Consortium (GSC) that implements the "Minimum Information about a Genome Sequence" (MIGS) specification and its extension, the "Minimum Information about a Metagenome Sequence" (MIMS). GCDML is an XML Schema for generating MIGS/MIMS compliant reports for data entry, exchange, and storage. When mature, this sample-centric, strongly-typed schema will provide a diverse set of descriptors for describing the exact origin and processing of a biological sample, from sampling to sequencing, and subsequent analysis. Here we describe the need for such a project, outline design principles required to support the project, and make an open call for participation in defining the future content of GCDML. GCDML is freely available, and can be downloaded, along with documentation, from the GSC Web site (http://gensc.org).
Shackleton, David; Pagram, Jenny; Ives, Lesley; Vanhinsbergh, Des
2018-06-02
The RapidHIT™ 200 System is a fully automated sample-to-DNA-profile system designed to produce high quality DNA profiles within 2 h. The use of the RapidHIT™ 200 System within the United Kingdom Criminal Justice System (UKCJS) has required extensive development and validation of methods, with a focus on the AmpFℓSTR® NGMSElect™ Express PCR kit, to comply with specific regulations for loading to the UK National DNA Database (NDNAD). These studies have been carried out using single-source reference samples to simulate live reference samples taken from arrestees and victims for elimination. The studies have shown that the system is capable of generating high quality profiles and has achieved the accreditations necessary to load to the NDNAD; a first for the UK. Copyright © 2018 Elsevier B.V. All rights reserved.
76 FR 48181 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-08
... employee during the prior week.'' This requirement is implemented by 29 CFR 3.3 and 3.4 and the standard... DEPARTMENT OF LABOR Wage and Hour Division Proposed Extension of the Approval of Information Collection Requirements AGENCY: Wage and Hour Division, Department of Labor. ACTION: Notice. SUMMARY: The...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Requests for extension of authority to operate without required monitors, indicating instruments, and EBS Attention Signal devices. 1.549 Section 1.549... required monitors, indicating instruments, and EBS Attention Signal devices. See § 73.3549. ...
Wildlife-friendly farming benefits rare birds, bees and plants.
Pywell, Richard F; Heard, Matthew S; Bradbury, Richard B; Hinsley, Shelley; Nowakowski, Marek; Walker, Kevin J; Bullock, James M
2012-10-23
Agricultural intensification is a leading cause of global biodiversity loss, especially for threatened and near-threatened species. One widely implemented response is 'wildlife-friendly farming', involving the close integration of conservation and extensive farming practices within agricultural landscapes. However, the putative benefits from this controversial policy are currently either unknown or thought unlikely to extend to rare and declining species. Here, we show that new, evidence-based approaches to habitat creation on intensively managed farmland in England can achieve large increases in plant, bee and bird species. In particular, we found that habitat enhancement methods designed to provide the requirements of sensitive target biota consistently increased the richness and abundance of both rare and common species, with 10-fold to greater than 100-fold more rare species per sample area than generalized conventional conservation measures. Furthermore, targeting landscapes of high species richness amplified beneficial effects on the least mobile taxa: plants and bees. Our results provide the first unequivocal support for a national wildlife-friendly farming policy and suggest that this approach should be implemented much more extensively to address global biodiversity loss. However, to be effective, these conservation measures must be evidence-based, and developed using sound knowledge of the ecological requirements of key species.
Martin, James; Taljaard, Monica; Girling, Alan; Hemming, Karla
2016-01-01
Background Stepped-wedge cluster randomised trials (SW-CRT) are increasingly being used in health policy and services research, but unless they are conducted and reported to the highest methodological standards, they are unlikely to be useful to decision-makers. Sample size calculations for these designs require allowance for clustering, time effects and repeated measures. Methods We carried out a methodological review of SW-CRTs up to October 2014. We assessed adherence to reporting each of the 9 sample size calculation items recommended in the 2012 extension of the CONSORT statement to cluster trials. Results We identified 32 completed trials and 28 independent protocols published between 1987 and 2014. Of these, 45 (75%) reported a sample size calculation, with a median of 5.0 (IQR 2.5–6.0) of the 9 CONSORT items reported. Of those that reported a sample size calculation, the majority, 33 (73%), allowed for clustering, but just 15 (33%) allowed for time effects. There was a small increase in the proportions reporting a sample size calculation (from 64% before to 84% after publication of the CONSORT extension, p=0.07). The type of design (cohort or cross-sectional) was not reported clearly in the majority of studies, but cohort designs seemed to be most prevalent. Sample size calculations in cohort designs were particularly poor with only 3 out of 24 (13%) of these studies allowing for repeated measures. Discussion The quality of reporting of sample size items in stepped-wedge trials is suboptimal. There is an urgent need for dissemination of the appropriate guidelines for reporting and methodological development to match the proliferation of the use of this design in practice. Time effects and repeated measures should be considered in all SW-CRT power calculations, and there should be clarity in reporting trials as cohort or cross-sectional designs. PMID:26846897
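For reference, the simplest of the allowances discussed, inflation for clustering, multiplies an individually randomised sample size by the design effect 1 + (m - 1) x ICC for m subjects per cluster-period. The sketch below shows that calculation only; a full stepped-wedge calculation (for example, the Hussey and Hughes variance formula) additionally accounts for the time effects and repeated measures the review found to be under-reported.

```python
import math
from scipy.stats import norm

def n_per_arm(delta, sd, alpha=0.05, power=0.8):
    # Two-arm individually randomised sample size for a mean difference.
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (z * sd / delta) ** 2

def design_effect(m, icc):
    # Variance inflation for clustering, m subjects per cluster-period.
    return 1 + (m - 1) * icc

n = n_per_arm(delta=0.3, sd=1.0)           # hypothetical effect size
print(math.ceil(n), math.ceil(n * design_effect(m=20, icc=0.05)))
# -> 175 unclustered vs 341 once clustering is allowed for
```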
Environmental scanning electron microscopy in cell biology.
McGregor, J E; Staniewicz, L T L; Guthrie Neé Kirk, S E; Donald, A M
2013-01-01
Environmental scanning electron microscopy (ESEM) (1) is an imaging technique which allows hydrated, insulating samples to be imaged under an electron beam. The resolution afforded by this technique is higher than conventional optical microscopy but lower than conventional scanning electron microscopy (CSEM). The major advantage of the technique is the minimal sample preparation needed, making ESEM quick to use and the images less susceptible to the artifacts that the extensive sample preparation usually required for CSEM may introduce. Careful manipulation of both the humidity in the microscope chamber and the beam energy is nevertheless essential to prevent dehydration and beam damage artifacts. In some circumstances it is possible to image live cells in the ESEM (2). In the following sections we introduce the fundamental principles of ESEM imaging before presenting imaging protocols for plant epidermis, mammalian cells, and bacteria. In the first two cases samples are imaged using the secondary electron (topographic) signal, whereas a transmission technique is employed to image bacteria.
Information-Theoretic Assessment of Sample Imaging Systems
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; Alter-Gartenberg, Rachel; Park, Stephen K.; Rahman, Zia-ur
1999-01-01
By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.
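Under Gaussian assumptions, the information rate of such a channel takes the familiar form H = (1/2) ∫ log2(1 + SNR(f)) df over the sampling passband. The sketch below evaluates this for an assumed scene power spectrum and system MTF; it is a simplification of the authors' formulation, which among other things folds aliased signal components into the noise.

```python
import numpy as np

f = np.linspace(0.0, 0.5, 512)               # spatial frequency, cyc/sample
scene_psd = 1.0 / (1.0 + (f / 0.1) ** 2)     # assumed scene power spectrum
mtf = np.sinc(f) ** 2                        # assumed optics+detector MTF
noise_psd = 1.0e-3                           # assumed white sensor noise
snr = scene_psd * mtf**2 / noise_psd
H = np.sum(0.5 * np.log2(1.0 + snr)) * (f[1] - f[0])
print(f"information rate ~ {H:.2f} bits/sample (1-D)")
```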
NASA Astrophysics Data System (ADS)
Ekelöf, Måns; McMurtrie, Erin K.; Nazari, Milad; Johanningsmeier, Suzanne D.; Muddiman, David C.
2017-02-01
High-salt samples present a challenge to mass spectrometry (MS) analysis, particularly when electrospray ionization (ESI) is used, requiring extensive sample preparation steps such as desalting, extraction, and purification. In this study, infrared matrix-assisted laser desorption electrospray ionization (IR-MALDESI) coupled to a Q Exactive Plus mass spectrometer was used to directly analyze 50-μm thick slices of cucumber fermented and stored in 1 M sodium chloride brine. From the several hundred unique substances observed, three triterpenoid lipids produced by cucumbers, β-sitosterol, stigmasterol, and lupeol, were putatively identified based on exact mass and selected for structural analysis. The spatial distributions of the lipids were imaged, and the putative assignments were confirmed by tandem mass spectrometry performed directly on the same cucumber, demonstrating the capacity of the technique to deliver confident identifications from highly complex samples in molar concentrations of salt without the need for sample preparation.
Integrated science and engineering for the OSIRIS-REx asteroid sample return mission
NASA Astrophysics Data System (ADS)
Lauretta, D.
2014-07-01
Introduction: The Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) asteroid sample return mission will survey near-Earth asteroid (101955) Bennu to understand its physical, mineralogical, and chemical properties, assess its resource potential, refine the impact hazard, and return a sample of this body to the Earth [1]. This mission is scheduled for launch in 2016 and will rendezvous with the asteroid in 2018. Sample return to the Earth follows in 2023. The OSIRIS-REx mission has the challenge of visiting asteroid Bennu, characterizing it at global and local scales, then selecting the best site on the asteroid surface to acquire a sample for return to the Earth. Minimizing the risk of exploring an unknown world requires a tight integration of science and engineering to inform flight system and mission design. Defining the Asteroid Environment: We have performed an extensive astronomical campaign in support of OSIRIS-REx. Lightcurve and phase function observations were obtained with UA Observatories telescopes located in southeastern Arizona during the 2005--2006 and 2011--2012 apparitions [2]. We observed Bennu using the 12.6-cm radar at the Arecibo Observatory in 1999, 2005, and 2011 and the 3.5-cm radar at the Goldstone tracking station in 1999 and 2005 [3]. We conducted near-infrared measurements using the NASA Infrared Telescope Facility at the Mauna Kea Observatory in Hawaii in September 2005 [4]. Additional spectral observations were obtained in July 2011 and May 2012 with the Magellan 6.5-m telescope [5]. We used the Spitzer space telescope to observe Bennu in May 2007 [6]. The extensive knowledge gained as a result of our telescopic characterization of Bennu was critical in the selection of this object as the OSIRIS-REx mission target. In addition, we use these data, combined with models of the asteroid, to constrain over 100 different asteroid parameters covering orbital, bulk, rotational, radar, photometric, spectroscopic, thermal, regolith, and asteroid environmental properties. We have captured this information in a mission configuration-controlled document called the Design Reference Asteroid. This information is used across the project to establish the environmental requirements for the flight system and for overall mission design. Maintaining a Pristine Sample: OSIRIS-REx is driven by the top-level science objective to return >60 g of pristine, carbonaceous regolith from asteroid Bennu. We define a "pristine sample" to mean that no foreign material introduced into the sample hampers our scientific analysis. Basically, we know that some contamination will take place --- we just have to document it so that we can subtract it from our analysis of the returned sample. Engineering contamination requirements specify cleanliness in terms of particle counts and thin-film residues --- scientists define it in terms of bulk elemental and organic abundances. After initial discussions with our Contamination Engineers, we agreed on known, albeit challenging, particle and thin-film contamination levels for the Touch-and-Go Sample Acquisition Mechanism (TAGSAM) and the Sample Return Capsule. These levels are achieved using established cleaning procedures while minimizing interferences for sample analysis. Selecting a Sample Site: The Sample Site Selection decision is based on four key data products: Deliverability, Safety, Sampleability, and Science Value Maps.
Deliverability quantifies the probability that the Flight Dynamics team can deliver the spacecraft to the desired location on the asteroid surface. Safety maps assess candidate sites against the capabilities of the spacecraft. Sampleability requires an assessment of the asteroid surface properties vs. TAGSAM capabilities. Scientific value maximizes the probability that the collected sample contains organics and volatiles and can be placed in a geological context definitive enough to determine sample history. Science and engineering teams work collaboratively to produce these key decision-making maps.
NASA Astrophysics Data System (ADS)
Marseille, Gert-Jan; Stoffelen, Ad; Barkmeijer, Jan
2008-03-01
Lacking an established methodology to test the potential impact of prospective extensions to the global observing system (GOS) in real atmospheric cases, we developed such a method, called the Sensitivity Observing System Experiment (SOSE). For example, since the GOS is non-uniform, it is of interest to investigate the benefit of complementary observing systems filling its gaps. In a SOSE, adjoint sensitivity structures are used to define a pseudo-true atmospheric state for the simulation of the prospective observing system. Next, the synthetic observations are used together with real observations from the existing GOS in a state-of-the-art Numerical Weather Prediction (NWP) model to assess the potential added value of the new observing system. Unlike full observing system simulation experiments (OSSE), a SOSE can be applied to real extreme events that were badly forecast operationally and only requires the simulation of the new instrument. As such, SOSE is an effective tool, for example, to define observation requirements for extensions to the GOS. These observation requirements may serve as input for the design of an operational network of prospective observing systems. In a companion paper we use SOSE to simulate potential future spaceborne Doppler Wind Lidar (DWL) scenarios and assess their capability to sample meteorologically sensitive areas not well captured by the current GOS, in particular over the Northern Hemisphere oceans.
Precise detection of de novo single nucleotide variants in human genomes.
Gómez-Romero, Laura; Palacios-Flores, Kim; Reyes, José; García, Delfino; Boege, Margareta; Dávila, Guillermo; Flores, Margarita; Schatz, Michael C; Palacios, Rafael
2018-05-22
The precise determination of de novo genetic variants has enormous implications across different fields of biology and medicine, particularly personalized medicine. Currently, de novo variations are identified by mapping sample reads from a parent-offspring trio to a reference genome, allowing for a certain degree of differences. While widely used, this approach often introduces false-positive (FP) results due to misaligned reads and mischaracterized sequencing errors. In a previous study, we developed an alternative approach to accurately identify single nucleotide variants (SNVs) using only perfect matches. However, this approach could be applied only to haploid regions of the genome and was computationally intensive. In this study, we present a unique approach, coverage-based single nucleotide variant identification (COBASI), which allows the exploration of the entire genome using second-generation short sequence reads without extensive computing requirements. COBASI identifies SNVs using changes in coverage of exactly matching unique substrings, and is particularly suited for pinpointing de novo SNVs. Unlike other approaches that require population frequencies across hundreds of samples to filter out any methodological biases, COBASI can be applied to detect de novo SNVs within isolated families. We demonstrate this capability through extensive simulation studies and by studying a parent-offspring trio we sequenced using short reads. Experimental validation of all 58 candidate de novo SNVs and a selection of non-de novo SNVs found in the trio confirmed zero FP calls. COBASI is available as open source at https://github.com/Laura-Gomez/COBASI for any researcher to use. Copyright © 2018 the Author(s). Published by PNAS.
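The coverage signal COBASI exploits can be reproduced in miniature: a de novo substitution destroys every exact-matching read k-mer that overlaps it, so the coverage of reference k-mers dips to near zero around the variant site. The toy below (hypothetical 16-base reference, error-free reads, k = 5) illustrates that idea only and is not the published tool.

```python
from collections import Counter

def kmer_coverage(reference, reads, k=5):
    # Coverage of exact-matching reference k-mers: a substitution in the
    # sample zeroes out every reference k-mer spanning the variant site.
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return [counts[reference[i:i + k]] for i in range(len(reference) - k + 1)]

ref = "ACGTACGGTTCAGGCA"
var = ref[:8] + "A" + ref[9:]                        # one substitution (T->A)
reads = [var[i:i + 8] for i in range(len(var) - 7)]  # error-free 8-base reads
print(kmer_coverage(ref, reads, k=5))   # zeros flag k-mers spanning the SNV
```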
Parks, Donovan H; Beiko, Robert G
2013-01-01
High-throughput sequencing techniques have made large-scale spatial and temporal surveys of microbial communities routine. Gaining insight into microbial diversity requires methods for effectively analyzing and visualizing these extensive data sets. Phylogenetic β-diversity measures address this challenge by allowing the relationship between large numbers of environmental samples to be explored using standard multivariate analysis techniques. Despite the success and widespread use of phylogenetic β-diversity measures, an extensive comparative analysis of these measures has not been performed. Here, we compare 39 measures of phylogenetic β diversity in order to establish the relative similarity of these measures along with key properties and performance characteristics. While many measures are highly correlated, those commonly used within microbial ecology were found to be distinct from those popular within classical ecology, and from the recently recommended Gower and Canberra measures. Many of the measures are surprisingly robust to different rootings of the gene tree, the choice of similarity threshold used to define operational taxonomic units, and the presence of outlying basal lineages. Measures differ considerably in their sensitivity to rare organisms, and the effectiveness of measures can vary substantially under alternative models of differentiation. Consequently, the depth of sequencing required to reveal underlying patterns of relationships between environmental samples depends on the selected measure. Our results demonstrate that using complementary measures of phylogenetic β diversity can further our understanding of how communities are phylogenetically differentiated. Open-source software implementing the phylogenetic β-diversity measures evaluated in this manuscript is available at http://kiwi.cs.dal.ca/Software/ExpressBetaDiversity.
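As a concrete example of one such measure, the sketch below computes a toy unweighted UniFrac-style distance, the fraction of total branch length unique to either community, on a hand-coded tree; it is illustrative only and is not the ExpressBetaDiversity implementation.

```python
# Toy unweighted UniFrac: the tree is encoded as (branch length, set of
# leaves below that branch) pairs for a four-taxon example.
tree = [
    (0.3, {"A"}), (0.2, {"B"}), (0.5, {"C"}), (0.4, {"D"}),
    (0.1, {"A", "B"}), (0.2, {"C", "D"}),
]

def unifrac(sample1, sample2, tree):
    unique = shared = 0.0
    for length, leaves in tree:
        in1, in2 = bool(leaves & sample1), bool(leaves & sample2)
        if in1 and in2:
            shared += length
        elif in1 or in2:
            unique += length
    return unique / (unique + shared)

print(unifrac({"A", "B"}, {"B", "C"}, tree))   # -> 1.0 / 1.3 ~ 0.77
```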
Groundbreaking Mars Sample Return for Science and Human Exploration
NASA Technical Reports Server (NTRS)
Cohen, Barbara; Draper, David; Eppler, Dean; Treiman, Allan
2012-01-01
Partnerships between science and human exploration have recent heritage for the Moon (Lunar Precursor Robotics Program, LPRP) and near-Earth objects (Exploration Precursor Robotics Program, xPRP). Both programs spent appreciable time and effort determining measurements needed or desired before human missions to these destinations. These measurements may be crucial to human health or spacecraft design, or may be desired to better optimize systems designs such as spacesuits or operations. Both LPRP and xPRP recommended measurements from orbit, by landed missions and by sample return. LPRP conducted the Lunar Reconnaissance Orbiter (LRO) and Lunar Crater Observation and Sensing Satellite (LCROSS) missions, providing high-resolution visible imagery, surface and subsurface temperatures, global topography, mapping of possible water ice deposits, and data on the biological effects of radiation [1]. LPRP also initiated a landed mission to provide dust and regolith properties, local lighting conditions, assessment of resources, and demonstration of precision landing [2]. This mission was canceled in 2006 due to funding shortfalls. For the Moon, adequate samples of rocks and regolith were returned by the Apollo and Luna programs to conduct needed investigations. Many near-Earth asteroids (NEAs) have been observed from the Earth and several have been more extensively characterized by close-flying missions and landings (NEAR, Hayabusa, Rosetta). The current Joint Robotic Precursor Activity program is considering activities such as partnering with the New Frontiers mission OSIRIS-REx to visit a NEA and return a sample to the Earth. However, a strong consensus of the NEO User Team within xPRP was that a dedicated mission to the asteroid targeted by humans is required [3], ideally including regolith sample return for more extensive characterization and testing on the Earth.
Gado, Ahmed; Ebeid, Basel; Abdelmohsen, Aida; Axon, Anthony
2011-08-01
Masses discovered by clinical examination, imaging or endoscopic studies that are suspicious for malignancy typically require biopsy confirmation before treatment is initiated. Biopsy specimens may fail to yield a definitive diagnosis if the lesion is extensively ulcerated or otherwise necrotic and viable tumor tissue is not obtained on sampling. The diagnostic yield is improved when multiple biopsy samples (BSs) are taken. A colonoscopy quality-assurance program (CQAP) was instituted in 2003 in our institution. The aim of this study was to determine the effect of instituting a CQAP on the yield of histological sampling in patients with suspected colorectal cancer (CRC) during colonoscopy. Initial assessment of colonoscopy practice was performed in 2003. A total of five patients with suspected CRC during colonoscopy were documented in 2003. BSs confirmed CRC in three (60%) patients and were nondiagnostic in two (40%). A quality-improvement process was instituted which required a minimum of six BSs of adequate size from any suspected CRC during colonoscopy. A total of 37 patients for the period 2004-2010 were prospectively assessed. The diagnosis of CRC was confirmed with histological examination of BSs obtained during colonoscopy in 63% of patients in 2004, 60% in 2005, 50% in 2006, 67% in 2007, 100% in 2008, 67% in 2009 and 100% in 2010. The yield of histological sampling increased significantly (p < 0.02) from 61% in 2004-2007 to 92% in 2008-2010. The implementation of a quality assurance and improvement program increased the yield of histological sampling in patients with suspected CRC during colonoscopy.
Surface drilling technologies for Mars
NASA Technical Reports Server (NTRS)
Blacic, J. D.; Rowley, J. C.; Cort, G. E.
1986-01-01
Rock drilling and coring conceptual designs for the surface activities associated with a manned Mars mission are proposed. Straightforward extensions of equipment and procedures used on Earth are envisioned for the sample coring and shallow high-explosive shot holes needed for tunneling and seismic surveying. A novel rocket exhaust jet piercing method is proposed for very rapid drilling of shot holes required for explosive excavation of emergency radiation shelters. Summaries of estimated equipment masses and power requirements are provided, and the indicated rotary coring rigs are scaled from terrestrial equipment and use compressed CO2 from the Martian atmosphere for core bit cooling and cuttings removal. A mass of 120 kg and power of 3 kW(e) are estimated for a 10 m depth capability. A 100 m depth capacity core rig requires about 1150 kg and 32 kW(e). The rocket exhaust jet equipment devised for shallow (3 m) explosive emplacement shot holes requires no surface power beyond an electrical ignition system, and might have a 15 kg mass.
NASA Astrophysics Data System (ADS)
Straková, Petra; Laiho, Raija
2016-04-01
In this presentation, we assess the merits of using Fourier transform infrared (FTIR) spectra to estimate the organic matter composition in different plant biomass and peat soil samples. Infrared spectroscopy has a great potential in large-scale peatland studies that require low cost and high throughput techniques, as it gives a unique "chemical overview" of a sample, with all the chemical compounds present contributing to the spectrum produced. Our extensive sample sets include soil samples ranging from boreal to tropical peatlands, including sites under different environmental and/or land-use changes; above- and below-ground biomass of different peatland plant species; plant root mixtures. We mainly use FTIR to estimate (1) chemical composition of the samples (e.g., total C and N, C:N ratio, holocellulose, lignin and ash content), (2) proportion of each plant species in root mixtures, and (3) respiration of surface peat. The satisfactory results of our predictive models suggest that this experimental approach can, for example, be used as a screening tool in the evaluation of organic matter composition in peatlands during monitoring of their degradation and/or restoration success.
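A typical calibration pipeline for this kind of work regresses the property of interest on the spectra with partial least squares. The sketch below uses synthetic stand-in spectra and scikit-learn; the data and the choice of eight components are hypothetical, not the authors' calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((120, 400))                     # stand-in for 120 FTIR spectra
y = 3 * X[:, 50] + X[:, 200] + 0.05 * rng.standard_normal(120)  # e.g. C:N

pls = PLSRegression(n_components=8)            # component count chosen ad hoc
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.2f}")
```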
Is Mars Sample Return Required Prior to Sending Humans to Mars?
NASA Technical Reports Server (NTRS)
Carr, Michael; Abell, Paul; Allwood, Abigail; Baker, John; Barnes, Jeff; Bass, Deborah; Beaty, David; Boston, Penny; Brinkerhoff, Will; Budney, Charles;
2012-01-01
Prior to potentially sending humans to the surface of Mars, it is fundamentally important to return samples from Mars. Analysis in Earth's extensive scientific laboratories would significantly reduce the risk of human Mars exploration and would also support the science and engineering decisions relating to the Mars human flight architecture. The importance of measurements of any returned Mars samples ranges from critical to desirable, and in all cases these samples would enhance our understanding of the Martian environment before potentially sending humans to that alien locale. For example, Mars sample return (MSR) could yield information that would enable human exploration related to 1) enabling forward and back planetary protection, 2) characterizing properties of Martian materials relevant for in situ resource utilization (ISRU), 3) assessing any toxicity of Martian materials with respect to human health and performance, and 4) identifying information related to engineering surface hazards such as the corrosive effect of the Martian environment. In addition, MSR would be an engineering 'proof of concept' for a potential round-trip human mission to the planet, and a potential model for international Mars exploration.
Barriers and Effective Educational Strategies to Develop Extension Agents' Professional Competencies
ERIC Educational Resources Information Center
Lakai, Dona; Jayaratne, K. S. U.; Moore, Gary E.; Kistler, Mark J.
2012-01-01
The study reported here determined the barriers and effective educational strategies to develop Extension agents' professional competencies. This was a descriptive survey research conducted with a random sample of Extension agents. Increased workload and lack of time and funding were identified as the most constraining barriers of Extension agents…
Shelton, Larry R.
1994-01-01
The U.S. Geological Survey's National Water-Quality Assessment program includes extensive data-collection efforts to assess the quality of the Nation's streams. These studies require analyses of stream samples for major ions, nutrients, sediments, and organic contaminants. For the information to be comparable among studies in different parts of the Nation, consistent procedures specifically designed to produce uncontaminated samples for trace analysis in the laboratory are critical. This field guide describes the standard procedures for collecting and processing samples for major ions, nutrients, organic contaminants, sediment, and field analyses of conductivity, pH, alkalinity, and dissolved oxygen. Samples are collected and processed using modified and newly designed equipment made of Teflon to avoid contamination, including nonmetallic samplers (D-77 and DH-81) and a Teflon sample splitter. Field solid-phase extraction procedures developed to process samples for organic constituent analyses produce an extracted sample with stabilized compounds for more accurate results. Improvements to standard operational procedures include the use of processing chambers and capsule filtering systems. A modified collecting and processing procedure for organic carbon is designed to avoid contamination from equipment cleaned with methanol. Quality assurance is maintained by strict collecting and processing procedures, replicate sampling, equipment blank samples, and a rigid cleaning procedure using detergent, hydrochloric acid, and methanol.
On the importance of incorporating sampling weights in ...
Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey design requirements for occupancy models focus on the number of sample units and the pattern of revisits to a sample unit within a season. We focus on the sampling design or how the sample units are selected in geographic space (e.g., stratified, simple random, unequal probability, etc). In a probability design, each sample unit has a sample weight which quantifies the number of sample units it represents in the finite (oftentimes areal) sampling frame. We demonstrate the importance of including sampling weights in occupancy model estimation when the design is not a simple random sample or equal probability design. We assume a finite areal sampling frame as proposed for a national bat monitoring program. We compare several unequal and equal probability designs and varying sampling intensity within a simulation study. We found the traditional single season occupancy model produced biased estimates of occupancy and lower confidence interval coverage rates compared to occupancy models that accounted for the sampling design. We also discuss how our findings inform the analyses proposed for the nascent North American Bat Monitoring Program and other collaborative synthesis efforts that propose h
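One way to incorporate the weights described here is a weighted pseudo-likelihood: each sample unit's log-likelihood contribution to the single-season occupancy model is multiplied by its design weight. A minimal sketch with simulated detection histories and hypothetical weights (not the analysis code of any monitoring program):

```python
import numpy as np
from scipy.optimize import minimize

# y[i, j]: detection (0/1) on visit j to unit i; w[i]: design weight, i.e.
# the number of frame units the sampled unit represents.
rng = np.random.default_rng(0)
n, J, psi_true, p_true = 200, 4, 0.4, 0.5
z = rng.random(n) < psi_true                    # latent occupancy states
y = (rng.random((n, J)) < p_true) & z[:, None]  # detections given occupancy
w = rng.integers(1, 10, size=n).astype(float)   # assumed sampling weights

def nll(theta):
    psi, p = 1 / (1 + np.exp(-theta))           # parameters on logit scale
    det = y.sum(axis=1)
    l_detected = psi * p**det * (1 - p) ** (J - det)
    l_never = psi * (1 - p) ** J + (1 - psi)    # occupied-but-missed or empty
    ll = np.where(det > 0, np.log(l_detected), np.log(l_never))
    return -(w * ll).sum()                      # weighted pseudo-likelihood

fit = minimize(nll, x0=[0.0, 0.0])
print(1 / (1 + np.exp(-fit.x)))                 # ~ [psi_true, p_true]
```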
Interfacial RhOₓ/CeO₂ sites as locations for low temperature N₂O dissociation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cunningham, J.; Hickey, J.N.; Soria, J.
Temperatures required for extensive N₂O dissociation to N₂, or to N₂ plus O₂, over 0.5% RhOₓ/CeO₂ materials, and over polycrystalline Rh₂O₃ or CeO₂, are compared for preoxidised and for prereduced samples on the basis of conversions achieved in pulsed-reactant, continuous-flow and recirculatory microcatalytic reactors. Influences of sample prereduction or preoxidation upon those measurements and upon results from parallel ESR and FTIR studies of N₂O interactions with such materials are presented and compared. Over partially reduced 0.5% RhOₓ/CeO₂ materials, complete dissociation of N₂O pulses to N₂ plus O₂ is obtained at temperatures 50-100° lower than those required for extensive dissociation over prereduced Rh₂O₃. Furthermore, N₂ was the sole product from the latter. Higher ongoing N₂O conversions to N₂ plus O₂ at 623 K over 0.5% Rh/CeO₂ in pulsed-reactant than in continuous-flow mode point to regeneration of active sites under helium flushing between pulses. The TPD profile for dioxygen release from Rhodia-containing samples at temperatures of 350-550 K is presented. ESR measurements reveal complementary effects of outgassings at temperatures Tᵥ ≥ 573 K upon the availability at RhOₓ/CeO₂ surfaces of electron-excess sites reactive towards N₂O. Differences from observations over Rh₂O₃ and CeO₂ can be understood by attributing the low-temperature activity of RhOₓ/CeO₂ to electron-excess sites at microinterfaces between the dispersed Rhodia component and the Ceria support.
10 CFR 600.171 - Closeout procedures.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Nonprofit Organizations After-The-Award Requirements § 600.171 Closeout procedures. (a) Recipients shall..., and other reports as required by the terms and conditions of the award. DOE may approve extensions when requested by the recipient. (b) Unless DOE authorizes an extension, a recipient shall liquidate...
10 CFR 600.171 - Closeout procedures.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Nonprofit Organizations After-The-Award Requirements § 600.171 Closeout procedures. (a) Recipients shall..., and other reports as required by the terms and conditions of the award. DOE may approve extensions when requested by the recipient. (b) Unless DOE authorizes an extension, a recipient shall liquidate...
10 CFR 600.171 - Closeout procedures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Nonprofit Organizations After-The-Award Requirements § 600.171 Closeout procedures. (a) Recipients shall..., and other reports as required by the terms and conditions of the award. DOE may approve extensions when requested by the recipient. (b) Unless DOE authorizes an extension, a recipient shall liquidate...
10 CFR 600.171 - Closeout procedures.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Nonprofit Organizations After-The-Award Requirements § 600.171 Closeout procedures. (a) Recipients shall..., and other reports as required by the terms and conditions of the award. DOE may approve extensions when requested by the recipient. (b) Unless DOE authorizes an extension, a recipient shall liquidate...
10 CFR 600.171 - Closeout procedures.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Nonprofit Organizations After-The-Award Requirements § 600.171 Closeout procedures. (a) Recipients shall..., and other reports as required by the terms and conditions of the award. DOE may approve extensions when requested by the recipient. (b) Unless DOE authorizes an extension, a recipient shall liquidate...
45 CFR 150.215 - Extension for good cause.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS CMS Enforcement Processes for Determining Whether States Are Failing To Substantially Enforce HIPAA Requirements § 150.215 Extension for good cause. CMS may...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... Collection for Contractor Information Gathering, Extension Without Revisions AGENCY: Employment and Training... contractor information gathering and reporting requirements (expiration date November 30, 2012). DATES... Job Corps program is such that many activities required of contractors must be coordinated with other...
Multiresponse imaging system design for improved resolution
NASA Technical Reports Server (NTRS)
Alter-Gartenberg, Rachel; Fales, Carl L.; Huck, Friedrich O.; Rahman, Zia-Ur; Reichenbach, Stephen E.
1991-01-01
Multiresponse imaging is a process that acquires A images, each with a different optical response, and reassembles them into a single image with an improved resolution that can approach 1/√A times the photodetector-array sampling lattice. Our goals are to optimize the performance of this process in terms of the resolution and fidelity of the restored image and to assess the amount of information required to do so. The theoretical approach is based on the extension of both image restoration and rate-distortion theories from their traditional realm of signal processing to image processing which includes image gathering and display.
Progress in the application of DNA microarrays.
Lobenhofer, E K; Bushel, P R; Afshari, C A; Hamadeh, H K
2001-01-01
Microarray technology has been applied to a variety of different fields to address fundamental research questions. The use of microarrays, or DNA chips, to study the gene expression profiles of biologic samples began in 1995. Since that time, the fundamental concepts behind the chip, the technology required for making and using these chips, and the multitude of statistical tools for analyzing the data have been extensively reviewed. For this reason, the focus of this review will not be on the technology itself but on the application of microarrays as a research tool and the future challenges of the field. PMID:11673116
Variance in binary stellar population synthesis
NASA Astrophysics Data System (ADS)
Breivik, Katelyn; Larson, Shane L.
2016-03-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations in less than a week, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
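A toy sketch of the idea (all distributions and thresholds invented; the actual simulations model full binary evolution) shows how thousands of cheap realizations expose between-simulation variance that a single expensive run cannot:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_population(n_binaries):
    """One toy Galactic realization: GW frequencies (Hz) of ultra-compact
    binaries drawn from an invented log-uniform distribution."""
    return 10 ** rng.uniform(-4.0, -1.5, n_binaries)

# Thousands of cheap realizations instead of one expensive simulation.
n_realizations, n_binaries = 5000, 20_000
threshold = 3e-3   # illustrative "resolvable" frequency cut (Hz)

counts = np.array([(simulate_population(n_binaries) > threshold).sum()
                   for _ in range(n_realizations)])
print(f"resolvable systems: {counts.mean():.0f} ± {counts.std():.0f}")
```

The spread of `counts` across realizations is exactly the model-internal variance that a single high-fidelity population cannot reveal.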
Studying Variance in the Galactic Ultra-compact Binary Population
NASA Astrophysics Data System (ADS)
Larson, Shane L.; Breivik, Katelyn
2017-01-01
In the years preceding LISA, Milky Way compact binary population simulations can be used to inform the science capabilities of the mission. Galactic population simulation efforts generally focus on high fidelity models that require extensive computational power to produce a single simulated population for each model. Each simulated population represents an incomplete sample of the functions governing compact binary evolution, thus introducing variance from one simulation to another. We present a rapid Monte Carlo population simulation technique that can simulate thousands of populations on week-long timescales, thus allowing a full exploration of the variance associated with a binary stellar evolution model.
Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa.
Siegel, Chloe S; Stevenson, Florence O; Zimmer, Elizabeth A
2017-02-01
An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)-based extraction methods from silica-dried samples. DNA was extracted using FTA cards according to the manufacturer's protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation.
Urinalysis in children and adolescents.
Utsch, Boris; Klaus, Günter
2014-09-12
Urinalysis is the most commonly performed biochemical test in infancy and early childhood. The urine sample should be correctly obtained, age-specific aspects should be considered, and age-dependent reference values should be used. This review is based on a selective literature search in electronic databases, textbooks, and guidelines from Germany and abroad on the acquisition of urine samples and the performance of urinalysis in infancy and early childhood. The timing and mode of acquisition of the urine sample affect the assessment of hematuria, proteinuria, leukocyturia, nitrituria, and the uropathogenic bacterial colony count in the urine culture. Dipstick tests can be used for targeted screening for these features. The test results should be interpreted together with the findings of urine microscopy, the medical history, and the physical examination. Proteinuria should be quantified and differentiated; both of these things can be done either from collected urine or (especially in infants and young children) from a spontaneously voided urine sample, by determination of the protein/creatinine quotient. Orthostatic proteinuria in an adolescent requires no further evaluation or treatment. Hematuria should be characterized as either glomerular or non-glomerular erythrocyturia. Asymptomatic, isolated microhematuria in childhood is not uncommon and often transient; in the absence of a family history, it usually does not require an extensive work-up. Proteinuria combined with hematuria should arouse the suspicion of glomerulonephritis. Urinalysis in infancy and early childhood is a simple and informative diagnostic test as long as the urine sample has been obtained properly and the results are interpreted appropriately for this age group.
Brosteanu, Oana; Schwarz, Gabriele; Houben, Peggy; Paulus, Ursula; Strenge-Hesse, Anke; Zettelmeyer, Ulrike; Schneider, Anja; Hasenclever, Dirk
2017-12-01
Background: According to Good Clinical Practice, clinical trials must protect rights and safety of patients and make sure that the trial results are valid and interpretable. Monitoring on-site has an important role in achieving these objectives; it controls trial conduct at trial sites and informs the sponsor on systematic problems. In the past, extensive on-site monitoring with a particular focus on formal source data verification often lost sight of systematic problems in study procedures that endanger Good Clinical Practice objectives. ADAMON is a prospective, stratified, cluster-randomised, controlled study comparing extensive on-site monitoring with risk-adapted monitoring according to a previously published approach. Methods: In all, 213 sites from 11 academic trials were cluster-randomised between extensive on-site monitoring (104) and risk-adapted monitoring (109). Independent post-trial audits using structured manuals were performed to determine the frequency of major Good Clinical Practice findings at the patient level. The primary outcome measure is the proportion of audited patients with at least one major audit finding. Analysis relies on logistic regression incorporating trial and monitoring arm as fixed effects and site as random effect. The hypothesis was that risk-adapted monitoring is non-inferior to extensive on-site monitoring with a non-inferiority margin of 0.60 (logit scale). Results: Average number of monitoring visits and time spent on-site was 2.1 and 2.7 times higher in extensive on-site monitoring than in risk-adapted monitoring, respectively. A total of 156 (extensive on-site monitoring: 76; risk-adapted monitoring: 80) sites were audited. In 996 of 1618 audited patients, a total of 2456 major audit findings were documented. Depending on the trial, findings were identified in 18%-99% of the audited patients, with no marked monitoring effect in any of the trials. The estimated monitoring effect is -0.04 on the logit scale with two-sided 95% confidence interval (-0.40; 0.33), demonstrating that risk-adapted monitoring is non-inferior to extensive on-site monitoring. At most, extensive on-site monitoring could reduce the frequency of major Good Clinical Practice findings by 8.2% compared with risk-adapted monitoring. Conclusion: Compared with risk-adapted monitoring, the potential benefit of extensive on-site monitoring is small relative to overall finding rates, although risk-adapted monitoring requires less than 50% of extensive on-site monitoring resources. Clusters of findings within trials suggest that complicated, overly specific or not properly justified protocol requirements contributed to the overall frequency of findings. Risk-adapted monitoring in only a sample of patients appears sufficient to identify systematic problems in the conduct of clinical trials. Risk-adapted monitoring has a part to play in quality control. However, no monitoring strategy can remedy defects in quality of design. Monitoring should be embedded in a comprehensive quality management approach covering the entire trial lifecycle.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-30
... Television Signals Pursuant to the Satellite Home Viewer Extension and Reauthorization Act of 2004 AGENCY... Satellite Home Viewer Extension Act of 2004. The information collection requirements were approved on June... Measurement Standards for Digital Television Signals pursuant to the Satellite Home Viewer Extension and...
Agriflection: A Learning Model for Agricultural Extension in South Africa
ERIC Educational Resources Information Center
Worth, S. H.
2006-01-01
Prosperity--continuous and sustainable wealth creation--is an elusive goal in South African smallholder agriculture. This paper suggests that agricultural extension can facilitate realising this objective if an appropriate approach to extension can be developed. To develop such an approach requires that the definition of extension and the…
Dotson, Wesley H; Richman, David M; Abby, Layla; Thompson, Samuel; Plotner, Anthony
2013-08-01
Employment opportunities for people with developmental disabilities (DD) have improved in the last several decades. There is increasing focus on helping people with DD sample more diverse employment options, including running their own businesses. The present study (1) evaluated the effects of a well-established behavioral teaching procedure on the acquisition of a sample of three broad classes of skills related to self-employment (worker, supervisor, and clerical work) in young adults with DD within an analog recycling business, and (2) investigated the extension of that treatment to the natural environment while working in isolation or in peer pairs. Results suggest that the teaching procedure was effective in teaching three broad classes of skills related to many self-employment possibilities, the skills generalized to the natural environment, and peer pairs supported each other to complete tasks with a high degree of accuracy required to run a recycling business. This study represents an initial demonstration that adults with DD can learn skills required to run their own business. Copyright © 2013 Elsevier Ltd. All rights reserved.
78 FR 16299 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-14
... Federal minimum wage, overtime pay, recordkeeping, and youth employment standards of most general... DEPARTMENT OF LABOR Wage and Hour Division RIN 1235-0018 Proposed Extension of the Approval of Information Collection Requirements AGENCY: Wage and Hour Division, Department of Labor. ACTION: Notice...
76 FR 28242 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-16
... DEPARTMENT OF LABOR Wage and Hour Division Proposed Extension of the Approval of Information Collection Requirements AGENCY: Wage and Hour Division, Department of Labor. ACTION: Notice. SUMMARY: The... be properly assessed. Currently, the Wage and Hour Division is soliciting comments concerning its...
NASA Technical Reports Server (NTRS)
Decker, Ryan K.; Walker, John R.; Barbre, Robert E., Jr.; Leach, Richard D.
2015-01-01
Atmospheric wind data are required by space launch vehicles in order to assess flight vehicle loads and performance on day-of-launch. Space launch ranges at NASA's Kennedy Space Center, co-located with the United States Air Force's (USAF) Eastern Range (ER) at Cape Canaveral Air Force Station, and USAF's Western Range (WR) at Vandenberg Air Force Base have extensive networks of in-situ and remote sensing instrumentation to measure atmospheric winds. Each instrument's technique to measure winds has advantages and disadvantages with regard to use within vehicle trajectory analyses. Balloons measure wind at all altitudes necessary for vehicle assessments, but two primary disadvantages exist when applying balloon output. First, balloons require approximately one hour to reach required altitudes. Second, balloons are steered by atmospheric winds downrange of the launch site that could significantly differ from those winds along the vehicle ascent trajectory. These issues are mitigated by use of vertically pointing Doppler Radar Wind Profilers (DRWPs). However, multiple DRWP instruments are required to provide wind data over the altitude ranges necessary for vehicle trajectory assessments. The various DRWP systems have different operating configurations, resulting in different temporal and spatial sampling intervals. Therefore, software was developed to combine data from both DRWP-generated profiles into a single profile for use in vehicle trajectory analyses. This paper will present details of the splicing software algorithms and will provide sample output.
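A highly simplified sketch of such profile splicing (the function, altitude grid, and blend band are assumptions for illustration, not the NASA software) interpolates both profiles onto a common grid and blends them across an overlap region:

```python
import numpy as np

def splice_profiles(alt_lo, wind_lo, alt_hi, wind_hi, blend=(2000.0, 4000.0)):
    """Combine low- and high-altitude wind profiles into one profile.
    Both are interpolated onto a common grid and linearly blended across
    the overlap band `blend` (m) so the spliced profile transitions
    smoothly between instruments. Altitudes in meters, winds in m/s."""
    grid = np.arange(alt_lo.min(), alt_hi.max(), 50.0)   # 50 m output spacing
    w_lo = np.interp(grid, alt_lo, wind_lo, left=np.nan, right=np.nan)
    w_hi = np.interp(grid, alt_hi, wind_hi, left=np.nan, right=np.nan)
    # Blend weight: 0 below the band (all low profile), 1 above (all high).
    a = np.clip((grid - blend[0]) / (blend[1] - blend[0]), 0.0, 1.0)
    spliced = np.where(np.isnan(w_hi), w_lo,
                       np.where(np.isnan(w_lo), w_hi, (1 - a) * w_lo + a * w_hi))
    return grid, spliced

# Example: boundary-layer profiler (0-4 km) plus a deeper DRWP (2-18 km).
alt_lo, alt_hi = np.linspace(0, 4000, 40), np.linspace(2000, 18000, 120)
grid, u = splice_profiles(alt_lo, 10 + 0.002 * alt_lo, alt_hi, 12 + 0.0015 * alt_hi)
```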
Mikolajczuk, Agnieszka; Przyk, Elzbieta Perez; Geypens, Benny; Berglund, Michael; Taylor, Philip
2010-03-01
Compound specific isotopic analysis (CSIA) can provide information about the origin of analysed compounds - in this case, polycyclic aromatic hydrocarbons (PAHs). In the study, PAHs were extracted from three dust samples: winter and summer filter dust and tunnel dust. The measurement was performed using the method validated in our laboratory using pure, solid compounds and the EPA 610 reference mixture. CSIA required an appropriate clean-up method to avoid the unresolved complex mixture usually found in the gas chromatography of PAHs. Extensive sample clean-up for this particular matrix was found to be necessary to obtain good gas chromatography-combustion-isotope ratio mass spectrometry results. The sample purification method included two steps, in which the sample is cleaned up and the aliphatic and aromatic hydrocarbons are separated. The concentration of PAHs in the measured samples was low, so a large-volume injection technique (100 µl) was applied. The δ¹³C (relative to VPDB) was measured with a final uncertainty smaller than 1‰. Comparison of the δ¹³C signatures of PAHs extracted from different dust samples was feasible with this method and, in doing so, significant differences were observed.
Ageing tests and recovery procedures of silica aerogel
NASA Astrophysics Data System (ADS)
Perego, D. L.
2008-09-01
Silica aerogel has been extensively used in RICH detectors for the identification of charged particles over the momentum range between 1 and 10 GeV/c. Tiles of hygroscopic aerogel with large transverse dimensions (20×20 cm²) and refractive index n=1.03 have recently been produced for use in the LHCb experiment, allowing pion-kaon identification up to 10 GeV/c. The tiles have excellent optical properties (clarity factor better than 0.006 μm⁴/cm and homogeneity σ(n−1)/(n−1) ≈ 1% within the tile). Extensive R&D tests on aerogel samples have been performed. Samples have been exposed to intense irradiation (proton, neutron and gamma), to humid air, to standard black varnish (used to paint the inner surface of RICH detectors), and to C₄F₁₀ and CO₂ gases. The optical properties of the aerogel have been monitored during these tests and, when required, recovery procedures have been investigated and applied. In particular, regeneration of the tiles has been realized through exposure to a dry atmosphere (gaseous N₂) or through baking for several hours at 500 °C. The measurements demonstrate that the optical properties have been successfully restored to their values at the production stage, and in no case has permanent degradation been observed.
ERIC Educational Resources Information Center
Zare, Mohsen Nazarzadeh; Dorrani, Kamal; Lavasani, Masoud Gholamali
2012-01-01
Background and purpose: This study examines the views of farmers and extension agents participating in extension education courses in Dezful, Iran, with regard to problems with these courses. It relies upon a descriptive methodology, using a survey as its instrument. Sample: The statistical population consisted of 5060 farmers and 50 extension…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-06
... Counseling Program: New Certification Requirements; Extension of Public Comment Period AGENCY: Office of the... inviting public comment on proposed changes to the Housing Counseling Program regulations for the purpose... housing counseling statute. This document announces that HUD is extending the public comment period, for...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
...] Telecommunications; Extension of the Office of Management and Budget's (OMB) Approval of Information Collection... the Standard on Telecommunications (29 CFR 1910.268). The purpose of this requirement is to ensure... of the information collection requirement contained in the Standard on Telecommunications (29 CFR...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... Standard on Personal Protective Equipment (PPE) for Shipyard Employment; Extension of the Office of... requirements specified in the Standard on Personal Protective Equipment (PPE) for Shipyard Employment (29 CFR... information collection requirements contained in the Standard on Personal Protective Equipment (PPE) for...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
...- 1648. Mail, hand delivery, express mail, messenger, or courier service: When using this method, you... DEPARTMENT OF LABOR Occupational Safety and Health Administration [Docket No. OSHA-2010-0008] Construction Fall Protection Systems Criteria and Practices, and Training Requirements; Extension of the Office...
Inertial subsystem functional and design requirements for the orbiter (Phase B extension baseline)
NASA Technical Reports Server (NTRS)
Flanders, J. H.; Green, J. P., Jr.
1972-01-01
The design requirements use the Phase B extension baseline system definition. This means that a GNC computer is specified for all command control functions instead of a central computer communicating with the ISS through a databus. Forced air cooling is used instead of cold plate cooling.
78 FR 12825 - Petition for Extension of Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-25
... the frequency of the required visual track inspections. FRA issued the initial waiver that granted.... SEPTA requests an extension of approval to reduce the frequency of required, visual track inspections... with continuous welded rail. SEPTA proposes to conduct one visual track inspection per week, instead of...
Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M
2015-03-01
It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from "different but related" samples as compared with that of the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples rather assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
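The three steps can be sketched with simulated data (a hedged illustration; variable names, the covariate shift, and coefficients are invented): fit a model on a development sample, evaluate calibration in a validation sample with a different case mix, and apply an intercept-only update:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
beta = np.array([-0.5, 1.0, 0.5, -0.5])

def simulate(n, shift=0.0):
    """Binary outcome from a logistic model; `shift` moves the covariate
    distribution to mimic a different case mix."""
    X = sm.add_constant(rng.normal(loc=shift, size=(n, 3)))
    y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta))))
    return X, y

# Develop the model.
X_dev, y_dev = simulate(1000)
model = sm.Logit(y_dev, X_dev).fit(disp=0)

# Validation sample with shifted case mix; assess calibration slope.
X_val, y_val = simulate(500, shift=0.5)
lp = X_val @ model.params                       # linear predictor
slope = sm.Logit(y_val, sm.add_constant(lp)).fit(disp=0).params[1]
print(f"calibration slope: {slope:.2f}")        # ~1 means well calibrated

# Model updating: intercept-only recalibration with lp as offset.
itl = sm.GLM(y_val, np.ones((len(lp), 1)),
             family=sm.families.Binomial(), offset=lp).fit()
print(f"calibration intercept: {itl.params[0]:.2f}")   # ~0: no update needed
```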
"JOE's" Niche in the Extension Scholarship Movement
ERIC Educational Resources Information Center
Franz, Nancy K.; Stovall, Celvia E.
2012-01-01
Extension's sustainability is tied to relationships with academia. Now more than ever, Extension faculty and staff need to integrate their work into the aims of their university to gain credibility, relevance, and support. This requires Extension workers to more deeply and widely document and share the scholarship of their work with academics…
Kain, Jay; Martorello, Laura; Swanson, Edward; Sego, Sandra
2011-01-01
The purpose of the randomized clinical study was to scientifically assess which intervention increases passive range of motion most effectively: the indirect tri-planar myofascial release (MFR) technique or the application of hot packs for gleno-humeral joint flexion, extension, and abduction. A total of 31 participants from a convenience sample were randomly assigned to examine whether or not MFR was as effective in increasing range of motion as hot packs. The sample consisted of students at American International College. Students were randomly assigned to two groups: hot pack application (N=13) or MFR technique (N=18). The independent variable was the intervention, either the tri-planar MFR technique or the hot pack application. Group one received the indirect tri-planar MFR technique once for 3 min. Group two received one hot pack application for 20 min. The dependent variables, passive gleno-humeral shoulder range of motion in shoulder flexion, shoulder extension, and shoulder abduction, were measured pre- and post-intervention for both groups. Data were analyzed using a two-way factorial design with mixed-factors ANOVA. Prior to conducting the study, inter-rater reliability was established using three testers for goniometric measures. A 2 (type of intervention: hot packs or MFR) by 2 (pre-test or post-test) mixed-factors ANOVA was calculated. Significant increases in range of motion were found for flexion, extension and abduction when comparing pre-test scores to post-test scores. The results of the ANOVA showed no differences in passive range of motion for flexion, extension and abduction between hot packs and MFR. For each of the dependent variables measured, MFR was shown to be as effective as hot packs in increasing range of motion, supporting the hypothesis. Since there was no significant difference between the types of intervention, both the hot pack application and the MFR technique were found to be equally effective in increasing passive range of motion of the gleno-humeral joint in flexion, extension, and abduction. The indirect tri-planar intervention could be considered more efficient in terms of time spent with a patient and the number of patients seen in a 20-min period. No equipment is required to carry out the MFR intervention, whereas a hot pack treatment requires the hot pack, towels, and a Hydrocollator unit. With the indirect tri-planar intervention, a therapist could treat four to five patients in the time it would take for one standard 20-min hot pack treatment, less the hands-on time of the therapist. Copyright © 2009 Elsevier Ltd. All rights reserved.
Expedited quantification of mutant ribosomal RNA by binary deoxyribozyme (BiDz) sensors.
Gerasimova, Yulia V; Yakovchuk, Petro; Dedkova, Larisa M; Hecht, Sidney M; Kolpashchikov, Dmitry M
2015-10-01
Mutations in ribosomal RNA (rRNA) have traditionally been detected by the primer extension assay, which is a tedious and multistage procedure. Here, we describe a simple and straightforward fluorescence assay based on binary deoxyribozyme (BiDz) sensors. The assay uses two short DNA oligonucleotides that hybridize specifically to adjacent fragments of rRNA, one of which contains a mutation site. This hybridization results in the formation of a deoxyribozyme catalytic core that produces the fluorescent signal and amplifies it due to multiple rounds of catalytic action. This assay enables us to expedite semi-quantification of mutant rRNA content in cell cultures starting from whole cells, which provides information useful for optimization of culture preparation prior to ribosome isolation. The method requires less than a microliter of a standard Escherichia coli cell culture and decreases analysis time from several days (for primer extension assay) to 1.5 h with hands-on time of ∼10 min. It is sensitive to single-nucleotide mutations. The new assay simplifies the preliminary analysis of RNA samples and cells in molecular biology and cloning experiments and is promising in other applications where fast detection/quantification of specific RNA is required. © 2015 Gerasimova et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
Ethical aspects of aging research.
Seppet, Enn; Pääsuke, Mati; Conte, Maria; Capri, Miriam; Franceschi, Claudio
2011-12-01
During the last 50-60 years, due to the development of medical care and hygienically safe living conditions, the average life span of European citizens has substantially increased, with a rapid growth of the population older than 65 years. This trend places an ever-growing medical and economic burden on society, as many of the older subjects suffer from age-related diseases and frailty. Coping with these problems requires not only appropriate medical treatment and social support but also extensive research in many fields of aging, from biology to sociology, with the involvement of older people as research subjects. This work anticipates the development and application of ethical standards suited to the dynamic advances in aging research. The aim of this review is to update the knowledge of ethical requirements concerning the recruitment of older research subjects, the obtaining of informed consent, the collection of biological samples, and the use of stem cells in preclinical and clinical settings. It is concluded that the application of an adequate ethical platform markedly facilitates the recruitment of older persons for participation in research. Currently, the basic ethical concepts are subject to extensive discussion, with the participation of all interested parties, in order to guarantee successful research on problems of human aging, protect older people from undesired interference, and secure their benefit by supporting innovations in research, therapy, and care.
McHugh, S M; Tyrrell, E; Johnson, B; Healy, O; Perry, I J; Normand, C
2015-12-01
This article aims to estimate the workforce and resource implications of the proposed age extension of the national breast screening programme, under the economic constraints of reduced health budgets and staffing levels in the Irish health system. Using a mixed-method design, a purposive sample of 20 participants was interviewed and the data were analysed thematically (June-September 2012). Quantitative data (programme-level activity data, screening activity, staffing levels and screening plans) were used to model potential workload and resource requirements. The analysis indicates that over 90% operational efficiency was achieved throughout the first six months of 2012. Accounting for maternity leave (10%) and sick leave (3.5%), 16.1 additional radiographers (whole-time equivalent) would be required for the workload created by the age extension of the screening programme, at 90% operational efficiency. The results suggest that service expansion is possible with relatively minimal additional radiography resources if the efficiency of the skill mix and the use of equipment are improved. Investing in the appropriate skill mix should not be limited to clinical groups but should also include administrative staff to manage and support the service. Workload modelling may contribute to improved health workforce planning and service efficiency. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Wildlife-friendly farming benefits rare birds, bees and plants
Pywell, Richard F.; Heard, Matthew S.; Bradbury, Richard B.; Hinsley, Shelley; Nowakowski, Marek; Walker, Kevin J.; Bullock, James M.
2012-01-01
Agricultural intensification is a leading cause of global biodiversity loss, especially for threatened and near-threatened species. One widely implemented response is ‘wildlife-friendly farming’, involving the close integration of conservation and extensive farming practices within agricultural landscapes. However, the putative benefits from this controversial policy are currently either unknown or thought unlikely to extend to rare and declining species. Here, we show that new, evidence-based approaches to habitat creation on intensively managed farmland in England can achieve large increases in plant, bee and bird species. In particular, we found that habitat enhancement methods designed to provide the requirements of sensitive target biota consistently increased the richness and abundance of both rare and common species, with 10-fold to greater than 100-fold more rare species per sample area than generalized conventional conservation measures. Furthermore, targeting landscapes of high species richness amplified beneficial effects on the least mobile taxa: plants and bees. Our results provide the first unequivocal support for a national wildlife-friendly farming policy and suggest that this approach should be implemented much more extensively to address global biodiversity loss. However, to be effective, these conservation measures must be evidence-based, and developed using sound knowledge of the ecological requirements of key species. PMID:22675140
Compressing Aviation Data in XML Format
NASA Technical Reports Server (NTRS)
Patel, Hemil; Lau, Derek; Kulkarni, Deepak
2003-01-01
Design, operations and maintenance activities in aviation involve analysis of a variety of aviation data. This data is typically in disparate formats, making it difficult to use with different software packages. Use of a self-describing and extensible standard called XML provides a solution to this interoperability problem. XML provides a standardized language for describing the contents of an information stream, performing the same kind of definitional role for Web content as a database schema performs for relational databases. XML data can be easily customized for display using the Extensible Stylesheet Language (XSL). While the self-describing nature of XML makes it easy to reuse, it also increases the size of data significantly. Therefore, transferring a dataset in XML form can decrease throughput and increase data transfer time significantly. It also increases storage requirements significantly. A natural solution to the problem is to compress the data using a suitable algorithm and transfer it in the compressed form. We found that XML-specific compressors such as Xmill and XMLPPM generally outperform traditional compressors. However, optimal use of Xmill requires discovery of the optimal options to use while running Xmill. This, in turn, depends on the nature of the data used. Manual discovery of optimal settings can require an engineer to experiment for weeks. We have devised an XML compression advisory tool that can analyze sample data files and recommend which compression tool would work best for the data and what optimal settings should be used with it.
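The size penalty of XML, and the gains available from compression, can be illustrated with general-purpose compressors from the Python standard library (a rough sketch; the record structure is invented, and XML-aware tools such as Xmill typically compress further by grouping values per element):

```python
import bz2, lzma, zlib

# Toy stand-in for an aviation dataset serialized as XML; real records
# would be far larger and more varied.
record = ("<flight><id>{i}</id><alt>31000</alt>"
          "<speed>447</speed><heading>212</heading></flight>")
xml = "<flights>" + "".join(record.format(i=i) for i in range(5000)) + "</flights>"
data = xml.encode()

# General-purpose compressors at their strongest settings.
for name, packed in [("zlib", zlib.compress(data, 9)),
                     ("bz2", bz2.compress(data, 9)),
                     ("lzma", lzma.compress(data, preset=9))]:
    print(f"{name}: {len(data) / len(packed):.1f}x smaller")
```

The repetitive tag structure is exactly what makes XML both bulky and highly compressible, which is why the choice of compressor and its settings matters so much for throughput.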
NASA Technical Reports Server (NTRS)
Nebenfuhr, A.; Lomax, T. L.
1998-01-01
We have developed an improved method for determination of gene expression levels with RT-PCR. The procedure is rapid and does not require extensive optimization or densitometric analysis. Since the detection of individual transcripts is PCR-based, small amounts of tissue samples are sufficient for the analysis of expression patterns in large gene families. Using this method, we were able to rapidly screen nine members of the Aux/IAA family of auxin-responsive genes and identify those genes which vary in message abundance in a tissue- and light-specific manner. While not offering the accuracy of conventional semi-quantitative or competitive RT-PCR, our method allows quick screening of large numbers of genes in a wide range of RNA samples with just a thermal cycler and standard gel analysis equipment.
7 CFR 3419.7 - Redistribution of funds.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE MATCHING FUNDS REQUIREMENT FOR AGRICULTURAL RESEARCH AND EXTENSION FORMULA FUNDS AT 1890 LAND-GRANT INSTITUTIONS, INCLUDING TUSKEGEE UNIVERSITY, AND AT... formula funds. Unmatched research and extension funds will be reapportioned in accordance with the...
Howard, Matt; Bakker-Dyos, J; Gallagher, L; O'Hara, J P; Woods, D; Mellor, A
2018-02-01
The British Services Dhaulagiri Medical Research Expedition (BSDMRE) took place from 27 March to 31 May 2016. The expedition involved 129 personnel, with voluntary participation in nine different study protocols. Studies were conducted in three research camps established at 3600, 4600 and 5140 m and involved taking and storing blood samples, cardiac echocardiography and investigations involving a balance plate. Research in this remote environment requires careful planning in order to provide a robust and resilient power supply. In this paper we report the rationale for the choices we made in terms of power supply, the equipment used and potential military applicability. This is a descriptive account from the expedition members involved in planning and conducting the medical research. Power calculations were used to estimate requirements prior to the expedition. The primary sources used to generate power were internal combustion engines (via petrol-fuelled electric generators) and solar panels. Having been generated, power was stored using lithium-ion batteries. Special consideration was given to the storage of samples taken in the field, for which electric freezers and dry shippers were used. All equipment functioned well during the expedition, with the challenges of altitude, temperature and transport all overcome through extensive prior planning. Power was successfully generated, stored and delivered during the BSDMRE, allowing extensive medical research to be undertaken. The challenges faced and overcome are directly applicable to delivering military medical care in austere environments, and lessons learnt can help with the planning and delivery of future operations, training exercises or expeditions. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
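A back-of-the-envelope power budget of the kind described can be sketched as follows (every load, duty cycle, and rating here is an invented placeholder, not an expedition figure):

```python
# Rough power-budget sketch: daily demand vs. solar yield and generator top-up.
loads = {                      # device: (watts, hours of use per day)
    "sample freezer":   (60, 24),
    "echocardiograph":  (100, 6),
    "balance plate":    (20, 4),
    "laptops/charging": (150, 8),
}

daily_wh = sum(w * h for w, h in loads.values())
print(f"daily demand: {daily_wh / 1000:.1f} kWh")

solar_w, sun_hours, derate = 400, 5, 0.7      # panel rating, usable sun, losses
solar_wh = solar_w * sun_hours * derate
generator_wh = max(0, daily_wh - solar_wh)    # shortfall met by petrol generator
battery_wh = daily_wh * 0.5 / 0.8             # half-day reserve at 80% usable depth

print(f"solar: {solar_wh / 1000:.1f} kWh, generator: {generator_wh / 1000:.1f} kWh,"
      f" battery reserve: {battery_wh / 1000:.1f} kWh")
```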
Ceppi, Marcello; Gallo, Fabio; Bonassi, Stefano
2011-01-01
The most common study design in population studies based on the micronucleus (MN) assay is the cross-sectional study, which is largely performed to evaluate the DNA-damaging effects of exposure to genotoxic agents in the workplace, in the environment, as well as from diet or lifestyle factors. Sample size is still a critical issue in the design of MN studies, since most recent studies considering gene-environment interaction often require a sample size of several hundred subjects, which is in many cases difficult to achieve. Confounding is another major threat to the validity of causal inference. The most popular confounders considered in population studies using MN are age, gender and smoking habit. Extensive attention is given to the assessment of effect modification, given the increasing inclusion of biomarkers of genetic susceptibility in the study design. Selected issues concerning the statistical treatment of data are addressed in this mini-review, starting from data description, which is a critical step of statistical analysis, since it allows possible errors in the dataset to be detected and the validity of assumptions required for more complex analyses to be checked. Basic issues in the statistical analysis of biomarkers are extensively evaluated, including methods to explore the dose-response relationship between two continuous variables and inferential analysis. A critical approach to the use of parametric and non-parametric methods is presented, before addressing the issue of the most suitable multivariate models to fit MN data. In the last decade, the quality of statistical analysis of MN data has certainly evolved, although even nowadays only a small number of studies apply the Poisson model, which is the most suitable method for the analysis of MN data.
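For the closing point, a minimal example of fitting a Poisson model to MN counts with the usual confounders (simulated data; all coefficients and covariates are illustrative only) using statsmodels:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300

# Invented cross-sectional dataset: MN counts per 1000 binucleated cells.
age = rng.uniform(20, 65, n)
smoker = rng.binomial(1, 0.3, n)
exposed = rng.binomial(1, 0.5, n)
rate = np.exp(0.5 + 0.015 * age + 0.2 * smoker + 0.3 * exposed)
mn = rng.poisson(rate)

# Poisson regression of MN frequency on exposure, adjusting for age and smoking.
X = sm.add_constant(np.column_stack([age, smoker, exposed]))
fit = sm.GLM(mn, X, family=sm.families.Poisson()).fit()
print(np.exp(fit.params[3]))   # rate ratio for exposure (~1.35 in this setup)
```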
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inserra, C.; Smartt, S. J.; Jerkstrand, A.
We report extensive observational data for five of the lowest redshift Super-Luminous Type Ic Supernovae (SL-SNe Ic) discovered to date, namely, PTF10hgi, SN2011ke, PTF11rks, SN2011kf, and SN2012il. Photometric imaging of the transients at +50 to +230 days after peak combined with host galaxy subtraction reveals a luminous tail phase for four of these SL-SNe. A high-resolution, optical, and near-infrared spectrum from xshooter provides detection of a broad He I λ10830 emission line in the spectrum (+50 days) of SN2012il, revealing that at least some SL-SNe Ic are not completely helium-free. At first sight, the tail luminosity decline rates that we measure are consistent with the radioactive decay of ⁵⁶Co, and would require 1-4 M☉ of ⁵⁶Ni to produce the luminosity. These ⁵⁶Ni masses cannot be made consistent with the short diffusion times at peak, and indeed are insufficient to power the peak luminosity. We instead favor energy deposition by newborn magnetars as the power source for these objects. A semi-analytical diffusion model with energy input from the spin-down of a magnetar reproduces the extensive light curve data well. The model predictions of the required ejecta velocities and temperatures are in reasonable agreement with those determined from our observations. We derive magnetar energies of 0.4 ≲ E (10⁵¹ erg) ≲ 6.9 and ejecta masses of 2.3 ≲ M_ej (M☉) ≲ 8.6. The sample of five SL-SNe Ic presented here, combined with SN 2010gx, the best-sampled SL-SN Ic so far, points toward an explosion driven by a magnetar as a viable explanation for all SL-SNe Ic.
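For orientation, the magnetar energy input referred to here is usually written as a dipole spin-down luminosity fed into a diffusion solution; a standard form (not reproduced from this paper, with E_p the initial rotational energy, τ_p the spin-down timescale, P_ms the spin period in milliseconds, and B the dipole field, as conventionally defined) is:

```latex
% Magnetar spin-down power deposited in the ejecta (dipole braking):
L_{\mathrm{mag}}(t) = \frac{E_p}{\tau_p}\,\frac{1}{\left(1 + t/\tau_p\right)^{2}},
\qquad
E_p \simeq 2\times10^{52}\, P_{\mathrm{ms}}^{-2}\ \mathrm{erg},
\qquad
\tau_p \propto B^{-2} P_{\mathrm{ms}}^{2}
```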
Automated clinical annotation of tissue bank specimens.
Gilbertson, John R; Gupta, Rajnish; Nie, Yimin; Patel, Ashokkumar A; Becich, Michael J
2004-01-01
Modern, molecular bio-medicine is driving a growing demand for extensively annotated tissue bank specimens. With careful clinical, pathologic and outcomes annotation, samples can be better matched to the research question at hand and experimental results better understood and verified. However, the difficulty and expense of detailed specimen annotation is well beyond the capability of most banks and has made access to well-documented tissue a major limitation in medical research. In this context, we have implemented automated annotation of banked tissue by integrating data from three clinical systems--the cancer registry, the pathology LIS and the tissue bank inventory system--through a classical data warehouse environment. The project required modification of clinical systems, development of methods to identify patients between and map data elements across systems, and the creation of de-identified data in data marts for use by researchers. The result has been much more extensive and accurate initial tissue annotation with less effort in the tissue bank, as well as dynamic ongoing annotation as the cancer registry follows patients over time.
The distribution of galaxies within the 'Great Wall'
NASA Technical Reports Server (NTRS)
Ramella, Massimo; Geller, Margaret J.; Huchra, John P.
1992-01-01
The galaxy distribution within the 'Great Wall', the most striking feature in the first three 'slices' of the CfA redshift survey extension, is examined. The Great Wall is extracted from the sample and is analyzed by counting galaxies in cells. The 'local' two-point correlation function within the Great Wall is computed, and the local correlation length is estimated to be 15/h Mpc, about 3 times larger than the correlation length for the entire sample. The redshift distribution of galaxies in the pencil-beam survey by Broadhurst et al. (1990) shows peaks separated by large 'voids', at least to a redshift of about 0.3. The peaks might represent the intersections of their roughly 5/h Mpc pencil beams with structures similar to the Great Wall. Under this hypothesis, sampling of the Great Wall shows that l ≈ 12/h Mpc is the minimum projected beam size required to detect all the 'walls' at redshifts between the peak of the selection function and the effective depth of the survey.
Ferromagnetic resonance studies of lunar core stratigraphy
NASA Technical Reports Server (NTRS)
Housley, R. M.; Cirlin, E. H.; Goldberg, I. B.; Crowe, H.
1976-01-01
We first review the evidence which links the characteristic ferromagnetic resonance observed in lunar fines samples with agglutinatic glass produced primarily by micrometeorite impacts and present new results on Apollo 15, 16, and 17 breccias which support this link by showing that only regolith breccias contribute significantly to the characteristic FMR intensity. We then provide a calibration of the amount of Fe metal in the form of uniformly magnetized spheres required to give our observed FMR intensities and discuss the theoretical magnetic behavior to be expected of Fe spheres as a function of size. Finally, we present FMR results on samples from every 5 mm interval in the core segments 60003, 60009, and 70009. These results lead us to suggest: (1) that secondary mixing may generally be extensive during regolith deposition so that buried regolith surfaces are hard to recognize or define; and (2) that local grinding of rocks and pebbles during deposition may lead to short scale fluctuations in grain size, composition, and apparent exposure age of samples.
OSIRIS-REx Touch-And-Go (TAG) Navigation Performance
NASA Technical Reports Server (NTRS)
Berry, Kevin; Antreasian, Peter; Moreau, Michael C.; May, Alex; Sutter, Brian
2015-01-01
The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in 2016 to rendezvous with the near-Earth asteroid (101955) Bennu in late 2018. Following an extensive campaign of proximity operations activities to characterize the properties of Bennu and select a suitable sample site, OSIRIS-REx will fly a Touch-And-Go (TAG) trajectory to the asteroid's surface to obtain a regolith sample. The paper summarizes the mission design of the TAG sequence, the propulsive maneuvers required to achieve the trajectory, and the sequence of events leading up to the TAG event. The paper will summarize the Monte-Carlo simulation of the TAG sequence and present analysis results that demonstrate the ability to conduct the TAG within 25 meters of the selected sample site and within ±2 cm/s of the targeted contact velocity. The paper will describe some of the challenges associated with conducting precision navigation operations and ultimately contacting a very small asteroid.
Ultrasound: a subexploited tool for sample preparation in metabolomics.
Luque de Castro, M D; Delgado-Povedano, M M
2014-01-02
Metabolomics, one of the most recently emerged "omics", has taken advantage of ultrasound (US) to improve sample preparation (SP) steps. The combination of metabolomics and US-assisted SP has developed unevenly, depending on the area (plant or animal) and on the SP step. Thus, plant metabolomics and US-assisted leaching have received the greatest attention (encompassing subdisciplines such as metallomics, xenometabolomics and, mainly, lipidomics), but liquid-liquid extraction and (bio)chemical reactions in metabolomics have also taken advantage of US energy. Clinical and animal samples have likewise benefited from US-assisted SP in metabolomics studies, but to a lesser extent. The main effects of US have been shortening of the time required for the given step and/or an increase in its efficiency or availability for automation; nevertheless, attention paid to potential degradation caused by US has been scant or nil. Achievements and weak points of US-assisted SP in metabolomics are discussed, and possible solutions to the present shortcomings are proposed. Copyright © 2013 Elsevier B.V. All rights reserved.
OSIRIS-REx Touch and Go (TAG) Navigation Performance
NASA Technical Reports Server (NTRS)
Berry, Kevin; Antreasian, Peter; Moreau, Michael C.; May, Alex; Sutter, Brian
2015-01-01
The Origins Spectral Interpretation Resource Identification Security Regolith Explorer (OSIRIS-REx) mission is a NASA New Frontiers mission launching in 2016 to rendezvous with the near-Earth asteroid (101955) Bennu in late 2018. Following an extensive campaign of proximity operations activities to characterize the properties of Bennu and select a suitable sample site, OSIRIS-REx will fly a Touch-And-Go (TAG) trajectory to the asteroid's surface to obtain a regolith sample. The paper summarizes the mission design of the TAG sequence, the propulsive maneuvers required to achieve the trajectory, and the sequence of events leading up to the TAG event. The paper also summarizes the Monte-Carlo simulation of the TAG sequence and presents analysis results that demonstrate the ability to conduct the TAG within 25 meters of the selected sample site and 2 cm/s of the targeted contact velocity. The paper describes some of the challenges associated with conducting precision navigation operations and ultimately contacting a very small asteroid.
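The dispersion statistics quoted in both abstracts come from Monte-Carlo analysis; a toy version (all error magnitudes invented, and the real simulation models guidance, navigation, and maneuver-execution errors explicitly rather than drawing outcomes directly) looks like:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Draw touchdown dispersions directly from assumed error distributions.
pos_err = rng.normal(0.0, 8.0, (n, 2))        # landing-plane error, m (1-sigma)
vel_err = rng.normal(0.0, 0.7, n)             # contact-speed error, cm/s

miss = np.linalg.norm(pos_err, axis=1)
ok = (miss < 25.0) & (np.abs(vel_err) < 2.0)  # the stated requirements
print(f"within 25 m and 2 cm/s: {ok.mean():.1%} of {n} trials")
```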
NASA Astrophysics Data System (ADS)
Ferkinhoff, Carl; Hershey, Deborah; Scrabeck, Alex; Higdon, Sarah; Higdon, James L.; Tidwell, Hannah; Lamarche, Cody; Vishwas, Amit; Nikola, Thomas; Stacey, Gordon J.; Brisbin, Drew
2018-06-01
Galaxies have evolved significantly from the early Universe until today. Star formation rates, stellar and molecular gas masses, sizes and metal enrichment of galaxies have all changed significantly from early epochs until the present. Probing the physical conditions of galaxies at high redshift is vital to understanding this evolution. ZINGRS, the ZEUS 1 and 2 INvestigated Galaxy Reference Sample, provides a unique and powerful window for this work. The sample consists of more than 30 galaxies at z ~ 1 - 4.5 for which the far-IR fine-structure lines (e.g. [CII] 158 micron, [NII] 122 micron, [OIII] 88 micron) have been observed with the ZEUS-1 and 2 instruments. These lines are ideal for studying high-z systems since they require low energies for excitation, are typically optically thin, and are not susceptible to extinction from dust. ZINGRS is the largest collection of far-IR fine-structure line detections at high-z. Here we describe the sample, including extensive multifrequency supporting observations such as CO and radio continuum, and summarize what we have learned so far.
Structural study of the membrane protein MscL using cell-free expression and solid-state NMR
NASA Astrophysics Data System (ADS)
Abdine, Alaa; Verhoeven, Michiel A.; Park, Kyu-Ho; Ghazi, Alexandre; Guittet, Eric; Berrier, Catherine; Van Heijenoort, Carine; Warschawski, Dror E.
2010-05-01
High-resolution structures of membrane proteins have so far been obtained mostly by X-ray crystallography, on samples where the protein is surrounded by detergent. Recent developments of solid-state NMR have opened the way to a new approach for the study of integral membrane proteins inside a membrane. At the same time, the extension of cell-free expression to the production of membrane proteins allows for the production of proteins tailor-made for NMR. We present here an in situ solid-state NMR study of a membrane protein selectively labeled through the use of cell-free expression. The sample consists of MscL (mechano-sensitive channel of large conductance), a 75 kDa pentameric α-helical ion channel from Escherichia coli, reconstituted in a hydrated lipid bilayer. Compared to a uniformly labeled protein sample, the spectral crowding is greatly reduced in the cell-free expressed protein sample. This approach may be a decisive step required for spectral assignment and structure determination of membrane proteins by solid-state NMR.
Cross-sensor iris recognition through kernel learning.
Pillai, Jaishanker K; Puertas, Maria; Chellappa, Rama
2014-01-01
Due to the increasing popularity of iris biometrics, new sensors are being developed for acquiring iris images and existing ones are being continuously upgraded. Re-enrolling users every time a new sensor is deployed is expensive and time-consuming, especially in applications with a large number of enrolled users. However, recent studies show that cross-sensor matching, where the test samples are verified using data enrolled with a different sensor, often leads to reduced performance. In this paper, we propose a machine learning technique to mitigate the cross-sensor performance degradation by adapting the iris samples from one sensor to another. We first present a novel optimization framework for learning transformations on iris biometrics. We then utilize this framework for sensor adaptation, by reducing the distance between samples of the same class, and increasing it between samples of different classes, irrespective of the sensors acquiring them. Extensive evaluations on iris data from multiple sensors demonstrate that the proposed method leads to improvement in cross-sensor recognition accuracy. Furthermore, since the proposed technique requires minimal changes to the iris recognition pipeline, it can easily be incorporated into existing iris recognition systems.
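The class-pulling/pushing objective described above can be sketched as a simple contrastive loss on a linear transformation (a stand-in for the paper's kernel formulation; names, dimensions, and the margin are illustrative):

```python
import numpy as np

def contrastive_loss(W, X, y, margin=1.0):
    """Toy sensor-adaptation objective: after the linear map W, squared
    distances between same-class samples are penalized directly, while
    different-class pairs are pushed beyond `margin` (hinge term)."""
    Z = X @ W.T
    loss = 0.0
    for i in range(len(Z)):
        for j in range(i + 1, len(Z)):
            d = np.sum((Z[i] - Z[j]) ** 2)
            loss += d if y[i] == y[j] else max(0.0, margin - d)
    return loss

rng = np.random.default_rng(5)
X = rng.normal(size=(20, 8))       # pooled feature vectors from two sensors
y = rng.integers(0, 5, 20)         # subject identities
print(contrastive_loss(np.eye(8), X, y))   # baseline: identity = no adaptation
```

Minimizing such an objective over W (or over a kernelized transformation, as in the paper) yields a map under which genuine pairs from different sensors match more reliably.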
Umar, Sulaiman; Man, Norsida; Nawi, Nolila Mohd; Latif, Ismail Abd; Samah, Bahaman Abu
2017-06-01
The study described the perceived importance of, and proficiency in, core agricultural extension competencies among extension workers in Peninsular Malaysia, and evaluated the resultant deficits in those competencies. Borich's Needs Assessment Model was used to achieve the objectives of the study. A sample of 298 respondents was randomly selected and interviewed using a pre-tested structured questionnaire. Thirty-three core competency items were assessed. Instrument validity and reliability were ensured. The cross-sectional data obtained were analysed using SPSS for descriptive statistics, including the mean weighted discrepancy score (MWDS). Results of the study showed that on a scale of 5, the most important core extension competency items according to respondents' perception were: "Making good use of information and communication technologies/access and use of web-based resources" (M=4.86, SD=0.23); "Conducting needs assessments" (M=4.84, SD=0.16); "Organizing extension campaigns" (M=4.82, SD=0.47); and "Managing groups and teamwork" (M=4.81, SD=0.76). In terms of proficiency, the highest competency identified by the respondents was "Conducting farm and home visits" (M=3.62, SD=0.82), followed by "Conducting meetings effectively" (M=3.19, SD=0.72); "Conducting focus group discussions" (M=3.16, SD=0.32); and "Conducting community forums" (M=3.13, SD=0.64). The discrepancies implying competency deficits were widest in "Acquiring and allocating resources" (MWDS=12.67); use of information and communication technologies (ICTs) and web-based resources in agricultural extension (MWDS=12.59); and report writing and sharing the results and impacts (MWDS=11.92). It is recommended that any intervention aimed at developing the capacity of extension workers in Peninsular Malaysia should prioritize these core competency items in accordance with the deficits established in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
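The MWDS figures above come from Borich's model, in which each item's importance-minus-proficiency gap is weighted by that item's mean importance and averaged over respondents. A minimal sketch of this commonly used formula (function name and example ratings are illustrative):

```python
import numpy as np

def borich_mwds(importance, proficiency):
    """Mean weighted discrepancy score (Borich-style needs assessment).

    importance, proficiency : (n_respondents, n_items) rating matrices.
    For each item: WDS = (importance - proficiency) * mean item importance,
    averaged over respondents to give the MWDS per item.
    """
    importance = np.asarray(importance, dtype=float)
    proficiency = np.asarray(proficiency, dtype=float)
    discrepancy = importance - proficiency        # per respondent and item
    weight = importance.mean(axis=0)              # item-level mean importance
    return (discrepancy * weight).mean(axis=0)    # MWDS per item

# Example: 3 respondents rating 2 items on a 1-5 scale
imp = [[5, 4], [5, 3], [4, 4]]
prof = [[3, 4], [2, 3], [3, 4]]
print(borich_mwds(imp, prof))   # larger values indicate larger training needs
```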
Exploring the Use of Information Communication Technologies by Selected Caribbean Extension Officers
ERIC Educational Resources Information Center
Strong, Robert; Ganpat, Wayne; Harder, Amy; Irby, Travis L.; Lindner, James R.
2014-01-01
Purpose: The purpose of this study was to describe selected Caribbean extension officers' technology preferences and examine factors that may affect their technology preferences. Design/methodology/approach: The sample consisted of extension officers (N = 119) participating in professional development training sessions in Grenada, Belize and Saint…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-16
...In accordance with the Paperwork Reduction Act of 1995, this notice announces the Animal and Plant Health Inspection Service's intention to request an extension of approval of an information collection associated with the requirements for requests to amend import regulations for plants, plant parts, and plant products.
78 FR 64984 - Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-30
... DEPARTMENT OF LABOR Wage and Hour Division RIN 1235-0018 Extension of the Approval of Information Collection Requirements AGENCY: Wage and Hour Division, Department of Labor. ACTION: Notice. SUMMARY: The Paperwork Reduction Act of 1995 (PRA), 44 U.S.C. 3501 et seq., and its attendant regulations, 5 CFR part...
26 CFR 1.6161-1 - Extension of time for paying tax or deficiency.
Code of Federal Regulations, 2010 CFR
2010-04-01
... tax is required to be paid to the Director of International Operations, such application must be filed... 26 Internal Revenue 13 2010-04-01 2010-04-01 false Extension of time for paying tax or deficiency... (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Extensions of Time for Payment § 1.6161-1 Extension of time...
Wang, Ophelia; Zachmann, Luke J; Sesnie, Steven E; Olsson, Aaryn D; Dickson, Brett G
2014-01-01
Prioritizing areas for management of non-native invasive plants is critical, as invasive plants can negatively impact plant community structure. Extensive and multi-jurisdictional inventories are essential to prioritize actions aimed at mitigating the impact of invasions and changes in disturbance regimes. However, previous work devoted little effort to devising sampling methods sufficient to assess the scope of multi-jurisdictional invasion over extensive areas. Here we describe a large-scale sampling design that used species occurrence data, habitat suitability models, and iterative and targeted sampling efforts to sample five species and satisfy two key management objectives: 1) detecting non-native invasive plants across previously unsampled gradients, and 2) characterizing the distribution of non-native invasive plants at landscape to regional scales. Habitat suitability models of five species were based on occurrence records and predictor variables derived from topography, precipitation, and remotely sensed data. We stratified and established field sampling locations according to predicted habitat suitability and phenological, substrate, and logistical constraints. Across previously unvisited areas, we detected at least one of our focal species on 77% of plots. In turn, we used detections from 2011 to improve habitat suitability models and sampling efforts in 2012, as well as additional spatial constraints to increase detections. These modifications resulted in a 96% detection rate at plots. The range of habitat suitability values that identified highly and less suitable habitats and their environmental conditions corresponded to field detections with mixed levels of agreement. Our study demonstrated that an iterative and targeted sampling framework can address sampling bias, reduce time costs, and increase detections. Other studies can extend the sampling framework to develop methods in other ecosystems to provide detection data. The sampling methods implemented here provide a meaningful tool when understanding the potential distribution and habitat of species over multi-jurisdictional and extensive areas is needed for achieving management objectives.
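The stratification step described above can be sketched simply: bin the modeled habitat suitability into strata and draw candidate plots from each. This is a schematic under assumed bin edges and plot counts, not the study's actual design, which also applied phenological, substrate, and logistical constraints:

```python
import numpy as np

rng = np.random.default_rng(42)

def stratified_sample(suitability, n_per_stratum, bins=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Select field plots stratified by predicted habitat suitability.

    suitability   : (n_cells,) model predictions in [0, 1]
    n_per_stratum : number of plots to draw from each suitability stratum
    Returns indices of the selected grid cells.
    """
    strata = np.digitize(suitability, bins[1:-1])     # stratum id per cell
    chosen = []
    for s in np.unique(strata):
        cells = np.flatnonzero(strata == s)
        k = min(n_per_stratum, cells.size)
        chosen.extend(rng.choice(cells, size=k, replace=False))
    return np.array(chosen)

cells = rng.random(10_000)           # stand-in for habitat-model output
plots = stratified_sample(cells, n_per_stratum=25)
```

In an iterative design such as the one described, the suitability surface would be refit after each field season and the sampling repeated against the updated strata.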
7 CFR 3419.6 - Use of matching funds.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE MATCHING FUNDS REQUIREMENT FOR AGRICULTURAL RESEARCH AND EXTENSION FORMULA FUNDS AT 1890 LAND-GRANT INSTITUTIONS, INCLUDING TUSKEGEE UNIVERSITY, AND AT...) of the National Agricultural Research, Extension, and Teaching Policy Act of 1977, section 7 of the...
Code of Federal Regulations, 2013 CFR
2013-01-01
... OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING EXTENSION PARTNERSHIP; ENVIRONMENTAL PROJECTS... information, NIST manufacturing extension efforts, EPA regulation and guidance, and state requirements. The... addition, consultants providing services to those businesses, the NIST Manufacturing Extension Centers, and...
Code of Federal Regulations, 2012 CFR
2012-01-01
... OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING EXTENSION PARTNERSHIP; ENVIRONMENTAL PROJECTS... information, NIST manufacturing extension efforts, EPA regulation and guidance, and state requirements. The... addition, consultants providing services to those businesses, the NIST Manufacturing Extension Centers, and...
Code of Federal Regulations, 2014 CFR
2014-01-01
... OF COMMERCE NIST EXTRAMURAL PROGRAMS MANUFACTURING EXTENSION PARTNERSHIP; ENVIRONMENTAL PROJECTS... information, NIST manufacturing extension efforts, EPA regulation and guidance, and state requirements. The... addition, consultants providing services to those businesses, the NIST Manufacturing Extension Centers, and...
Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology
NASA Technical Reports Server (NTRS)
Mandic, Milan; Acikmese, Behcet; Blackmore, Lars
2011-01-01
The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then to ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and is specifically focused to address comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models are developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the models of the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques were more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, locations, etc. of the spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal effort by indicating problems and/or benefits of different approaches and designs.
Space Station Freedom environmental database system (FEDS) for MSFC testing
NASA Technical Reports Server (NTRS)
Story, Gail S.; Williams, Wendy; Chiu, Charles
1991-01-01
The Water Recovery Test (WRT) at Marshall Space Flight Center (MSFC) is the first demonstration of integrated water recovery systems for potable and hygiene water reuse as envisioned for Space Station Freedom (SSF). In order to satisfy the safety and health requirements placed on the SSF program and facilitate test data assessment, an extensive laboratory analysis database was established to provide a central archive and data retrieval function. The database is required to store analysis results for physical, chemical, and microbial parameters measured from water, air and surface samples collected at various locations throughout the test facility. The Oracle Relational Database Management System (RDBMS) was utilized to implement a secured on-line information system with the ECLSS WRT program as the foundation for this system. The database is supported on a VAX/VMS 8810 series mainframe and is accessible from the Marshall Information Network System (MINS). This paper summarizes the database requirements, system design, interfaces, and future enhancements.
Dong, Yuwen; Deshpande, Sunil; Rivera, Daniel E; Downs, Danielle S; Savage, Jennifer S
2014-06-01
Control engineering offers a systematic and efficient method to optimize the effectiveness of individually tailored treatment and prevention policies known as adaptive or "just-in-time" behavioral interventions. The nature of these interventions requires assigning dosages at categorical levels, which has been addressed in prior work using Mixed Logical Dynamical (MLD)-based hybrid model predictive control (HMPC) schemes. However, certain requirements of adaptive behavioral interventions that involve sequential decision making have not been comprehensively explored in the literature. This paper presents an extension of the traditional MLD framework for HMPC by representing the requirements of sequential decision policies as mixed-integer linear constraints. This is accomplished with user-specified dosage sequence tables, manipulation of one input at a time, and a switching time strategy for assigning dosages at time intervals less frequent than the measurement sampling interval. A model developed for a gestational weight gain (GWG) intervention is used to illustrate the generation of these sequential decision policies and their effectiveness for implementing adaptive behavioral interventions involving multiple components.
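The sequential-decision requirements listed above (user-specified dosage sequence tables, changing one input at a time, and switching less often than the measurement sampling interval) are encoded in the paper as mixed-integer linear constraints inside the HMPC problem. The sketch below enumerates the same feasible set explicitly in plain Python; the function name, component names, and transition table are illustrative assumptions:

```python
def admissible_moves(current, step, K, allowed_next):
    """Enumerate dosage assignments consistent with sequential-decision rules.

    current      : dict of component -> current categorical dosage level
    step         : current sampling instant (integer)
    K            : dosages may only switch every K sampling intervals
    allowed_next : dict of (component, level) -> permitted next levels,
                   a stand-in for a user-specified dosage sequence table
    """
    if step % K != 0:                      # between switching times: hold
        return [dict(current)]
    moves = [dict(current)]                # holding is always admissible
    for comp, level in current.items():    # change at most one input at a time
        for nxt in allowed_next.get((comp, level), []):
            if nxt != level:
                new = dict(current)
                new[comp] = nxt
                moves.append(new)
    return moves

table = {("exercise", 0): [0, 1], ("exercise", 1): [0, 1, 2], ("exercise", 2): [1, 2]}
print(admissible_moves({"exercise": 1}, step=6, K=3, allowed_next=table))
```

In the MLD/HMPC setting, the optimizer would pick among such admissible assignments by minimizing a predicted-outcome cost, rather than enumerating them directly.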
A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.
Rutledge, Robert G
2011-03-02
Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
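As a rough illustration of the LRE idea, the per-cycle amplification efficiency can be regressed against fluorescence, and the intercept (maximal efficiency) used to back-calculate initial target fluorescence from central-region cycles. This is a simplified sketch under those assumptions, not the exact published LRE equations implemented in the LRE Analyzer:

```python
import numpy as np

def lre_f0(fluorescence, cycles=None):
    """Simplified LRE-style estimate of initial target fluorescence F0.

    Per-cycle efficiency E_C = F_C / F_(C-1) - 1 declines roughly linearly
    with fluorescence; regressing E_C on F_C gives the maximal efficiency
    Emax as the intercept. F0 is then back-calculated from central-region
    cycles as F_C / (1 + Emax)**C. (Illustrative approximation only.)
    """
    F = np.asarray(fluorescence, dtype=float)
    C = np.arange(1, len(F) + 1) if cycles is None else np.asarray(cycles)
    E = F[1:] / F[:-1] - 1.0
    slope, emax = np.polyfit(F[1:], E, 1)            # linear fit of E vs F
    mid = (F > 0.2 * F.max()) & (F < 0.8 * F.max())  # central region of profile
    f0 = F[mid] / (1.0 + emax) ** C[mid]
    return emax, f0.mean()

# synthetic sigmoidal amplification profile for a quick check
cycles = np.arange(1, 41)
F = 100.0 / (1.0 + np.exp(-(cycles - 22) * 0.45))
print(lre_f0(F))
```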
Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa1
Siegel, Chloe S.; Stevenson, Florence O.; Zimmer, Elizabeth A.
2017-01-01
Premise of the study: An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)–based extraction methods from silica-dried samples. Methods: DNA was extracted using FTA cards according to the manufacturer’s protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. Results: The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. Discussion: The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation. PMID:28224056
Gandjour, Afschin; Müller, Dirk
2014-10-01
One of the major ethical concerns regarding cost-effectiveness analysis in health care has been the inclusion of life-extension costs ("it is cheaper to let people die"). For this reason, many analysts have opted to rule out life-extension costs from the analysis. However, surprisingly little has been written in the health economics literature regarding this ethical concern and the resulting practice. The purpose of this work was to present a framework and potential solution for ethical objections against life-extension costs. This work found three levels of ethical concern: (i) with respect to all life-extension costs (disease-related and -unrelated); (ii) with respect to disease-unrelated costs only; and (iii) regarding disease-unrelated costs plus disease-related costs not influenced by the intervention. Excluding all life-extension costs for ethical reasons would require, for reasons of consistency, a simultaneous exclusion of savings from reducing morbidity. At the other extreme, excluding only disease-unrelated life-extension costs for ethical reasons would require, again for reasons of consistency, the exclusion of health gains due to treatment of unrelated diseases. Therefore, addressing ethical concerns regarding the inclusion of life-extension costs necessitates fundamental changes in the calculation of cost effectiveness.
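A toy calculation shows why the inclusion decision matters in practice. All numbers below are hypothetical, chosen only to illustrate how adding disease-unrelated life-extension costs shifts a cost-per-life-year ratio:

```python
# Hypothetical numbers only: an intervention adds 2 life-years at a direct
# (disease-related) incremental cost of 30,000, while each added life-year
# incurs 10,000 of disease-unrelated ("life-extension") health care costs.
life_years_gained = 2.0
direct_cost = 30_000.0
unrelated_cost_per_year = 10_000.0

icer_excluding = direct_cost / life_years_gained
icer_including = (direct_cost
                  + unrelated_cost_per_year * life_years_gained) / life_years_gained
print(icer_excluding, icer_including)   # 15000.0 vs 25000.0 per life-year
```

Whether the second figure or the first is the "right" one is precisely the ethical and methodological question the abstract addresses.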
Recent research about mild cognitive impairment in China
CHENG, Yan; XIAO, Shifu
2014-01-01
Summary: The rapid aging of the Chinese population has spurred interest in research about the cause and prevention of dementia and its precursor, mild cognitive impairment (MCI). This review summarizes the last decade of research in China about MCI. Extensive research about the epidemiology, neuropsychological characteristics, diagnosis, genetic etiology, neuroimaging and electrophysiological changes, and treatment of MCI has provided some new insights but few breakthroughs. Further advances in the prevention and treatment of MCI will require a greater emphasis on multi-disciplinary prospective studies with large, representative samples that use standardized methods to assess and monitor changes in cognitive functioning over time. PMID:25114476
Keskin, Uğur; Karasahin, Kazim Emre; Ulubay, Mustafa; Fidan, Ulaş; Gungor, Sadettin; Ergun, Ali
2015-11-01
Intrauterine fetal transfusion requires extensive experience, excellent eye-hand coordination, good equipment and an experienced team to achieve success. While the needle is in the umbilical vein, an assistant withdraws and/or transfuses blood. The needle point should be kept still to prevent lacerations and dislodging. We propose a simple set for intrauterine fetal blood transfusion, constructed from materials readily available in every clinic, to minimize needle tip movement during syringe attachments and withdrawals in the course of the transfusion. This makes it possible to withdraw a fetal blood sample and to transfuse blood with minimal intervention.
Low Gravity Freefall Facilities
NASA Technical Reports Server (NTRS)
1981-01-01
Composite of Marshall Space Flight Center's Low-Gravity Free Fall Facilities. These facilities include a 100-meter drop tower and a 100-meter drop tube. The drop tower simulates in-flight microgravity conditions for up to 4.2 seconds for containerless processing experiments, immiscible fluids and materials research, pre-flight hardware design test and flight experiment simulation. The drop tube simulates in-flight microgravity conditions for up to 4.6 seconds and is used extensively for ground-based microgravity convection research in which extremely small samples are studied. The facility can provide deep undercooling for containerless processing experiments that require materials to remain in a liquid phase when cooled below the normal solidification temperature.
Establishing and Maintaining an Extensive Library of Patient-Derived Xenograft Models.
Mattar, Marissa; McCarthy, Craig R; Kulick, Amanda R; Qeriqi, Besnik; Guzman, Sean; de Stanchina, Elisa
2018-01-01
Patient-derived xenograft (PDX) models have recently emerged as a highly desirable platform in oncology and are expected to substantially broaden the way in vivo studies are designed and executed and to reshape drug discovery programs. However, acquisition of patient-derived samples, and propagation, annotation and distribution of PDXs are complex processes that require a high degree of coordination among clinic, surgery and laboratory personnel, and are fraught with challenges that are administrative, procedural and technical. Here, we examine in detail the major aspects of this complex process and relate our experience in establishing a PDX Core Laboratory within a large academic institution.
NASA Astrophysics Data System (ADS)
Yacovitch, Tara; Shorter, Joanne; Nelson, David; Herndon, Scott; Agnese, Mike; McManus, Barry; Zahniser, Mark
2017-04-01
In order to understand how and why methane (CH4) concentrations change over time, it is necessary to understand their sources and sinks. Stable isotope measurements of 13CH4:12CH4 and CH3D:12CH4 ratios constrain the inventory of these sinks and sources. Current measurements often depend on Isotope Ratio Mass Spectrometry (IRMS), which requires extensive sample preparation including cryogenic separation of methane from air and subsequent conversion to either CO2 or H2. Here, we detail improvements to a direct-absorption laser spectrometer that enable fast and precise measurements of methane isotope ratios (δ13C and δ2H) of ambient air samples, without such sample preparation. The measurement system consists of a laser-based direct absorption spectrometer configured with a sample manifold for measurement of discrete samples (as opposed to flow-through measurements). Samples are trapped in the instrument using a rapid sample switching technique that compares each flask sample against a monitor tank sample. This approach reduces instrument drift and results in excellent precision. Precisions of 0.054‰ for δ13C and 1.4‰ for δ2H have been achieved (Allan-Werle deviations). These results are obtained in 20 minutes using 4 replicate comparisons to a monitor tank.
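The δ values quoted above follow standard delta notation: the sample isotope ratio relative to an international reference, expressed in permil. A minimal sketch (the reference value is the commonly quoted VPDB 13C/12C ratio; the example input is illustrative):

```python
def delta_permil(r_sample, r_standard):
    """Convert an isotope ratio to delta notation in permil.

    e.g. delta13C = (R_sample / R_VPDB - 1) * 1000, with R = 13C/12C.
    """
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.0112372          # commonly quoted 13C/12C of the VPDB standard
print(delta_permil(0.0107, R_VPDB))   # about -47.8 permil, typical of biogenic CH4
```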
NASA Astrophysics Data System (ADS)
Roether, Wolfgang; Vogt, Martin; Vogel, Sandra; Sültenfuß, Jürgen
2013-06-01
We present a new method to obtain samples for the measurement of helium isotopes and neon in water, to replace the classical sampling procedure using clamped-off Cu tubing containers that we have been using so far. The new method eliminates the gas extraction step prior to admission to the mass spectrometer that the classical method requires. Water is drawn into evacuated glass ampoules with subsequent flame sealing. Approximately 50% headspace is left, from which admission into the mass spectrometer occurs without further treatment. Extensive testing has shown that, with due care and with small corrections applied, the samples represent the gas concentrations in the water within ±0.07% (95% confidence level; ±0.05% with special handling). Fast evacuation is achieved by pumping on a small charge of water placed in the ampoule. The new method was successfully tested at sea in comparison with Cu-tubing sampling. We found that the ampoule samples were superior in data precision and that a lower percentage of samples were lost prior to measurement. Further measurements revealed agreement between the two methods in helium, 3He and neon within ±0.1%. The new method facilitates handling of large sample sets and minimizes the delay between sampling and measurement. The method is also applicable to gases other than helium and neon.
Barriers to Participatory Extension in Egypt: Agricultural Workers' Perspectives
ERIC Educational Resources Information Center
McDonough, Chris; Nuberg, Ian K.; Pitchford, Wayne S.
2015-01-01
Purpose: This paper examines extension practises of agricultural workers within the Egyptian government and the perceived barriers they face in implementing participatory approaches, identifying improvements required in research and extension processes to meet the real needs of Egyptian farming communities. Design/Methodology/Approach: Key…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-21
... Formaldehyde Emissions Standards for Composite Wood Products; Extension of Comment Period AGENCY: Environmental... composite wood products. After receiving requests for an extension, EPA extended the comment period from... Environmental protection, Formaldehyde, Reporting and recordkeeping requirements, Toxic substances, Wood. Dated...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-06
... Collection for Work Application/ Job Order Recordkeeping, Extension Without Revisions AGENCY: Employment and... extension without changes of the data retention required by 20 CFR 652.8(d)(5) of the Wagner-Peyser Act, which requires each state to retain applications and job orders for a minimum of one year. The current...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Collection for Work Application/ Job Order Recordkeeping (OMB 1205-0001), Extension Without Revisions AGENCY... collection of data concerning the extension without changes of the data retention required by 20 CFR 652.8(d)(5) of the Wagner-Peyser Act, which requires each state to retain applications and job orders for a...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-25
...] Reports of Injuries to Employees Operating Mechanical Power Presses; Extension of the Office of Management... requirement contained in the Standard on Reports of Injuries to Employees Operating Mechanical Power Presses... a worker is injured while operating a mechanical power press, 29 CFR 1910.217(g) requires an...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-28
...] Overhead and Gantry Cranes; Extension of the Office of Management and Budget's (OMB) Approval of... requirements specified in the Standard on Overhead and Gantry Cranes (29 CFR 1910.179). DATES: Comments must be... requirements for: Marking the rated load of cranes; preparing certification records to verify the inspection of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-08
...] Standard on Vinyl Chloride; Extension of the Office of Management and Budget's (OMB) Approval of... requirements specified in the Standard on Vinyl Chloride (29 CFR 1910.1017). DATES: Comments must be submitted... of the collection of information requirements contained in the Vinyl Chloride (VC) Standard. (A...
ERIC Educational Resources Information Center
Blum, Abraham; Azencot, Moshe
A study was conducted to determine the contacts between agricultural extension and family farmers in Israel. Structured interviews were conducted with a representative sample of 171 smallholder farmers. Advisers of the official Extension Service and the publications of this service and farmers' monthlies were considered to have contributed to…
ERIC Educational Resources Information Center
Ladebo, Olugbenga Jelil
2004-01-01
This study examined the public stereotypes of HIV-positive persons and the relationship with knowledge about the disease. 164 extension personnel and a convenience sample of 250 undergraduate students from an Agricultural Development Programme and an Agricultural University respectively, were interviewed for the study. Both institutions were…
da Cunha Santos, G; Saieg, M A; Troncone, G; Zeppa, P
2018-04-01
Minimally invasive procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) must yield not only good quality and quantity of material for morphological assessment, but also an adequate sample for analysis of molecular markers to guide patients to appropriate targeted therapies. In this context, cytopathologists worldwide should be familiar with minimum requirements for referring cytological samples for testing. The present manuscript is a review with a comprehensive description of the content of the workshop entitled Cytological preparations for molecular analysis: pre-analytical issues for EBUS TBNA, presented at the 40th European Congress of Cytopathology in Liverpool, UK. The present review emphasises the advantages and limitations of different types of cytology substrates used for molecular analysis such as archival smears, liquid-based preparations, archival cytospin preparations and FTA (Flinders Technology Associates) cards, as well as their technical requirements/features. These various types of cytological specimens can be successfully used for an extensive array of molecular studies, but the quality and quantity of extracted nucleic acids rely directly on adequate pre-analytical assessment of those samples. In this setting, cytopathologists must not only be familiar with the different types of specimens and associated technical procedures, but also correctly handle the material provided by minimally invasive procedures, ensuring that there is a sufficient amount of material for a precise diagnosis and correct management of the patient through personalised care. © 2018 John Wiley & Sons Ltd.
Singh, Pramila; DeMarini, David M; Dick, Colin A J; Tabor, Dennis G; Ryan, Jeff V; Linak, William P; Kobayashi, Takahiro; Gilmour, M Ian
2004-06-01
Two samples of diesel exhaust particles (DEPs) predominate in health effects research: an automobile-derived DEP (A-DEP) sample and the National Institute of Standards and Technology standard reference material (SRM 2975) generated from a forklift engine. A-DEPs have been tested extensively for their effects on pulmonary inflammation and exacerbation of allergic asthmalike responses. In contrast, SRM 2975 has been tested thoroughly for its genotoxicity. In the present study, we combined physical and chemical analyses of both DEP samples with pulmonary toxicity testing in CD-1 mice to compare the two materials and to make associations between their physicochemical properties and their biologic effects. A-DEPs had more than 10 times the amount of extractable organic material and less than one-sixth the amount of elemental carbon compared with SRM 2975. Aspiration of 100 μg of either DEP sample in saline produced mild acute lung injury; however, A-DEPs induced macrophage influx and activation, whereas SRM 2975 enhanced polymorphonuclear cell inflammation. A-DEPs stimulated an increase in interleukin-6 (IL-6), tumor necrosis factor alpha, macrophage inhibitory protein-2, and the TH2 cytokine IL-5, whereas SRM 2975 only induced significant levels of IL-6. Fractionated organic extracts of the same quantity of DEPs (100 μg) did not have a discernable effect on lung responses and will require further study. The disparate results obtained highlight the need for chemical, physical, and source characterization of particle samples under investigation. Multidisciplinary toxicity testing of diesel emissions derived from a variety of generation and collection conditions is required to meaningfully assess the health hazards associated with exposures to DEPs. Key words: automobile, diesel exhaust particles, forklift, mice, pulmonary toxicity, SRM 2975.
ERIC Educational Resources Information Center
VanLengen, Craig Alan
2010-01-01
The Securities and Exchange Commission (SEC) has recently announced a proposal that will require all public companies to report their financial data in Extensible Business Reporting Language (XBRL). XBRL is an extension of Extensible Markup Language (XML). Moving to a standard reporting format makes it easier for organizations to report the…
Takahara, Michiyo; Sakaue, Haruka; Onishi, Yukiko; Yamagishi, Marifu; Kida, Yuichiro; Sakaguchi, Masao
2013-01-11
Nascent chain release from membrane-bound ribosomes at the termination codon was investigated using a rabbit cell-free translation system supplemented with rough microsomal membrane vesicles. Chain release was extremely slow when mRNA ended with only the termination codon. Tail extension after the termination codon enhanced the release of the nascent chain. Release reached plateau levels with a tail extension of 10 bases. This requirement was observed with all termination codons: TAA, TGA and TAG. Rapid release was also achieved by puromycin even in the absence of the extension. Efficient translation termination cannot be achieved in the presence of only a termination codon on the mRNA. Tail extension might be required for correct positioning of the termination codon in the ribosome and/or efficient recognition by release factors. Copyright © 2012. Published by Elsevier Inc.
The scale dependence of optical diversity in a prairie ecosystem
NASA Astrophysics Data System (ADS)
Gamon, J. A.; Wang, R.; Stilwell, A.; Zygielbaum, A. I.; Cavender-Bares, J.; Townsend, P. A.
2015-12-01
Biodiversity loss, one of the most crucial challenges of our time, endangers ecosystem services that maintain human wellbeing. Traditional methods of measuring biodiversity require extensive and costly field sampling by biologists with extensive experience in species identification. Remote sensing can be used for such assessment based upon patterns of optical variation. This provides an efficient and cost-effective means to determine ecosystem diversity at different scales and over large areas. Sampling scale has been described as a "fundamental conceptual problem" in ecology, and is an important practical consideration in both remote sensing and traditional biodiversity studies. On the one hand, with decreasing spatial and spectral resolution, the differences among different optical types may become weak or even disappear. Alternatively, high spatial and/or spectral resolution may introduce redundant or contradictory information. For example, at high resolution, the variation within optical types (e.g., between leaves on a single plant canopy) may add complexity unrelated to species richness. We studied the scale-dependence of optical diversity in a prairie ecosystem at Cedar Creek Ecosystem Science Reserve, Minnesota, USA using a variety of spectrometers from several platforms on the ground and in the air. Using the coefficient of variation (CV) of spectra as an indicator of optical diversity, we found that high richness plots generally have a higher coefficient of variation. High resolution imaging spectrometer data (1 mm pixels) showed the highest sensitivity to richness level. With decreasing spatial resolution, the difference in CV between richness levels decreased, but remained significant. These findings can be used to guide airborne studies of biodiversity and develop more effective large-scale biodiversity sampling methods.
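The CV metric used above has a simple form: per-band variability across pixels, averaged over bands. A minimal sketch of one common formulation (the study's exact band weighting or masking may differ):

```python
import numpy as np

def spectral_cv(reflectance):
    """Optical diversity as the coefficient of variation of spectra.

    reflectance : (n_pixels, n_bands) array of spectra for one plot.
    Computes, per band, std/mean across pixels, then averages over bands.
    Higher values indicate greater spectral (and, by proxy, species) diversity.
    """
    R = np.asarray(reflectance, dtype=float)
    cv_per_band = R.std(axis=0) / R.mean(axis=0)
    return cv_per_band.mean()
```

Aggregating pixels to coarser resolutions before computing this statistic reproduces the scale-dependence question the study investigates.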
NASA Astrophysics Data System (ADS)
Darvill, Christopher M.; Bentley, Michael J.; Stokes, Chris R.; Hein, Andrew S.; Rodés, Ángel
2015-11-01
The timing and extent of former glacial advances can demonstrate leads and lags during periods of climatic change and their forcing, but this requires robust glacial chronologies. In parts of southernmost Patagonia, dating pre-global Last Glacial Maximum (gLGM) ice limits has proven difficult due to post-deposition processes affecting the build-up of cosmogenic nuclides in moraine boulders. Here we provide ages for the Río Cullen and San Sebastián glacial limits of the former Bahía Inútil-San Sebastián (BI-SSb) ice lobe on Tierra del Fuego (53-54°S), previously hypothesised to represent advances during Marine Isotope Stages (MIS) 12 and 10, respectively. Our approach uses cosmogenic 10Be and 26Al exposure dating, but targets glacial outwash associated with these limits and uses depth-profiles and surface cobble samples, thereby accounting for surface deflation and inheritance. The data reveal that the limits formed more recently than previously thought, giving ages of 45.6 ka (+139.9/-14.3) for the Río Cullen, and 30.1 ka (+45.6/-23.1) for the San Sebastián limits. These dates indicate extensive glaciation in southern Patagonia during MIS 3, prior to the well-constrained, but much less extensive MIS 2 (gLGM) limit. This suggests the pattern of ice advances in the region was different to northern Patagonia, with the terrestrial limits relating to the last glacial cycle, rather than progressively less extensive glaciations over hundreds of thousands of years. However, the dates are consistent with MIS 3 glaciation elsewhere in the southern mid-latitudes, and the combination of cooler summers and warmer winters with increased precipitation, may have caused extensive glaciation prior to the gLGM.
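The depth-profile approach mentioned above exploits the exponential decline of cosmogenic production with depth to separate post-depositional nuclide accumulation from inheritance. The sketch below fits a simple spallation-only profile with uniform inheritance to illustrative concentrations; real depth-profile models also treat muon production, erosion, and the surface deflation the authors correct for:

```python
import numpy as np
from scipy.optimize import curve_fit

def profile(z, n_surf, n_inh, attenuation=160.0, density=2.0):
    """Spallation-only 10Be depth profile with uniform inheritance.

    N(z) = N_inh + N_surf * exp(-density * z / attenuation),
    with z in cm, density in g/cm^3, attenuation length in g/cm^2.
    """
    return n_inh + n_surf * np.exp(-density * z / attenuation)

depths = np.array([30., 60., 100., 150., 200.])          # cm below surface
conc = np.array([9.1e4, 6.4e4, 4.2e4, 2.8e4, 2.2e4])     # atoms/g, illustrative
popt, pcov = curve_fit(profile, depths, conc, p0=(1e5, 1e4))
n_surf, n_inh = popt   # post-depositional component and inherited 10Be
print(n_surf, n_inh)
```

The surface component then yields the exposure age once a production rate is assumed, while the inherited component is discarded rather than biasing the age as it would for a single boulder sample.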
Weiser, Douglas C; Pyati, Ujwal J; Kimelman, David
2007-06-15
Convergent extension of the mesoderm is the major driving force of vertebrate gastrulation. During this process, mesodermal cells move toward the future dorsal side of the embryo, then radically change behavior as they initiate extension of the body axis. How cells make this transition in behavior is unknown. We have identified the scaffolding protein and tumor suppressor Gravin as a key regulator of this process in zebrafish embryos. We show that Gravin is required for the conversion of mesodermal cells from a highly migratory behavior to the medio-laterally intercalative behavior required for body axis extension. In the absence of Gravin, paraxial mesodermal cells fail to shut down the protrusive activity mediated by the Rho/ROCK/Myosin II pathway, resulting in embryos with severe extension defects. We propose that Gravin functions as an essential scaffold for regulatory proteins that suppress the migratory behavior of the mesoderm during gastrulation, and suggest that this function also explains how Gravin inhibits invasive behaviors in metastatic cells.
Van Laere, Sven; Nyssen, Marc; Verbeke, Frank
2017-01-01
Clinical coding is a requirement to provide valuable data for billing, epidemiology and health care resource allocation. In sub-Saharan Africa, we observe a growing awareness of the need for coding of clinical data, not only in health insurances, but also in governments and hospitals. Presently, coding systems in sub-Saharan Africa are often used for billing purposes. In this paper we consider the use of a nomenclature that also has a clinical impact. Coding systems are often assumed to be complex and too extensive to be used in daily practice. Here, we present a method for constructing a new nomenclature based on existing coding systems by considering a minimal subset for the sub-Saharan region. Completeness will be evaluated nationally against the requirements of national registries. The nomenclature requires an extension character to handle codes that must be used in multiple registries. Hospitals will benefit most by using this extension character.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Everett, W.R.; Rechnitz, G.A.
1999-01-01
A mini review of enzyme-based electrochemical biosensors for inhibition analysis of organophosphorus and carbamate pesticides is presented. Discussion includes the most recent literature to present advances in detection limits, selectivity and real sample analysis. Recent reviews on the monitoring of pesticides and their residues suggest that the classical analytical techniques of gas and liquid chromatography are the most widely used methods of detection. These techniques, although very accurate in their determinations, can be quite time-consuming and expensive and usually require extensive sample clean-up and pre-concentration. For these and many other reasons, the classical techniques are very difficult to adapt for field use. Numerous researchers, in the past decade, have developed and made improvements on biosensors for use in pesticide analysis. This mini review will focus on recent advances made in enzyme-based electrochemical biosensors for the determination of organophosphorus and carbamate pesticides.
An aptamer-based paper microfluidic device for the colorimetric determination of cocaine.
Wang, Ling; Musile, Giacomo; McCord, Bruce R
2018-02-01
A method utilizing paper microfluidics coupled with gold nanoparticles and two anticocaine aptamers has been developed to detect seized cocaine samples. The ready-to-use format involves a paper strip on which salt-induced aggregation of gold nanoparticles produces a visible color change indicating the presence of the drug. This format is specific for the detection of cocaine. The visual LOD for the method was 2.5 μg and the camera-based LOD was 2.36 μg. The operation of the device is easy and rapid, and does not require extensive training or instrumentation. All of the materials utilized in the device are safe and environmentally friendly. This device should prove a useful tool for the screening of forensic samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Time of travel of solutes in selected reaches of the Sandusky River Basin, Ohio, 1972 and 1973
Westfall, Arthur O.
1976-01-01
A time of travel study of a 106-mile (171-kilometer) reach of the Sandusky River and a 39-mile (63-kilometer) reach of Tymochtee Creek was made to determine the time required for water released from Killdeer Reservoir on Tymochtee Creek to reach selected downstream points. In general, two dye sample runs were made through each subreach to define the time-discharge relation for approximating travel times at selected discharges within the measured range, and time-discharge graphs are presented for 38 subreaches. Graphs of dye dispersion and variation in relation to time are given for three selected sampling sites. For estimating travel time and velocities between points in the study reach, tables for selected flow durations are given. Duration curves of daily discharge for four index stations are presented to indicate the low-flow characteristics and for use in shaping downward extensions of the time-discharge curves.
Scanning electron microscope fine tuning using four-bar piezoelectric actuated mechanism
NASA Astrophysics Data System (ADS)
Hatamleh, Khaled S.; Khasawneh, Qais A.; Al-Ghasem, Adnan; Jaradat, Mohammad A.; Sawaqed, Laith; Al-Shabi, Mohammad
2018-01-01
Scanning electron microscopes are used extensively for accurate micro/nano image exploration. Several strategies have been proposed to fine tune those microscopes in the past few years. This work presents a new fine tuning strategy for a scanning electron microscope sample table using four-bar piezoelectric actuated mechanisms. The paper presents an algorithm to find all possible inverse kinematics solutions of the proposed mechanism. In addition, another algorithm is presented to search for the optimal inverse kinematic solution. Both algorithms are used simultaneously by means of a simulation study to fine tune a scanning electron microscope sample table through a pre-specified circular or linear path of motion. Results of the study show that the proposed algorithms were able to minimize the power required to drive the piezoelectric actuated mechanism by a ratio of 97.5% for all simulated paths of motion when compared to a general non-optimized solution.
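Selecting among multiple inverse-kinematics branches is the crux of the optimization described above. The sketch below shows one simple greedy selection rule, minimizing actuator travel as a proxy for drive power; it is illustrative only, not the authors' algorithm, and all names are assumptions:

```python
import numpy as np

def pick_optimal_ik(candidates_per_pose, start):
    """Greedy selection among multiple inverse-kinematics solutions.

    candidates_per_pose : list of arrays, each (k_i, n_actuators), giving the
                          feasible actuator extensions for one path waypoint
    start               : (n_actuators,) current actuator state
    At each waypoint, picks the branch minimizing actuator travel from the
    previous state (a proxy for piezoelectric drive power). A dynamic-
    programming pass over the whole path would give the global optimum.
    """
    state, path = np.asarray(start, dtype=float), []
    for cands in candidates_per_pose:
        cands = np.asarray(cands, dtype=float)
        best = cands[np.argmin(np.linalg.norm(cands - state, axis=1))]
        path.append(best)
        state = best
    return np.array(path)
```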
Dark-field hyperspectral X-ray imaging
Egan, Christopher K.; Jacques, Simon D. M.; Connolley, Thomas; Wilson, Matthew D.; Veale, Matthew C.; Seller, Paul; Cernik, Robert J.
2014-01-01
In recent times, there has been a drive to develop non-destructive X-ray imaging techniques that provide chemical or physical insight. To date, these methods have generally been limited; either requiring raster scanning of pencil beams, using narrow bandwidth radiation and/or limited to small samples. We have developed a novel full-field radiographic imaging technique that enables the entire physico-chemical state of an object to be imaged in a single snapshot. The method is sensitive to emitted and scattered radiation, using a spectral imaging detector and polychromatic hard X-radiation, making it particularly useful for studying large dense samples for materials science and engineering applications. The method and its extension to three-dimensional imaging is validated with a series of test objects and demonstrated to directly image the crystallographic preferred orientation and formed precipitates across an aluminium alloy friction stir weld section. PMID:24808753
Dehydration-driven stress transfer triggers intermediate-depth earthquakes
NASA Astrophysics Data System (ADS)
Ferrand, Thomas P.; Hilairet, Nadège; Incel, Sarah; Deldicque, Damien; Labrousse, Loïc; Gasc, Julien; Renner, Joerg; Wang, Yanbin; Green, Harry W., II; Schubnel, Alexandre
2017-05-01
Intermediate-depth earthquakes (30-300 km) have been extensively documented within subducting oceanic slabs, but their mechanics remains enigmatic. Here we decipher the mechanism of these earthquakes by performing deformation experiments on dehydrating serpentinized peridotites (synthetic antigorite-olivine aggregates, minerals representative of subduction zones lithologies) at upper mantle conditions. At a pressure of 1.1 gigapascals, dehydration of deforming samples containing only 5 vol% of antigorite suffices to trigger acoustic emissions, a laboratory-scale analogue of earthquakes. At 3.5 gigapascals, acoustic emissions are recorded from samples with up to 50 vol% of antigorite. Experimentally produced faults, observed post-mortem, are sealed by fluid-bearing micro-pseudotachylytes. Microstructural observations demonstrate that antigorite dehydration triggered dynamic shear failure of the olivine load-bearing network. These laboratory analogues of intermediate-depth earthquakes demonstrate that little dehydration is required to trigger embrittlement. We propose an alternative model to dehydration-embrittlement in which dehydration-driven stress transfer, rather than fluid overpressure, causes embrittlement.
Stochastic Optical Reconstruction Microscopy (STORM).
Xu, Jianquan; Ma, Hongqiang; Liu, Yang
2017-07-05
Super-resolution (SR) fluorescence microscopy, a class of optical microscopy techniques at a spatial resolution below the diffraction limit, has revolutionized the way we study biology, as recognized by the Nobel Prize in Chemistry in 2014. Stochastic optical reconstruction microscopy (STORM), a widely used SR technique, is based on the principle of single molecule localization. STORM routinely achieves a spatial resolution of 20 to 30 nm, a ten-fold improvement compared to conventional optical microscopy. Among all SR techniques, STORM offers a high spatial resolution with simple optical instrumentation and standard organic fluorescent dyes, but it is also prone to image artifacts and degraded image resolution due to improper sample preparation or imaging conditions. It requires careful optimization of all three aspects (sample preparation, image acquisition, and image reconstruction) to ensure a high-quality STORM image, which will be extensively discussed in this unit. © 2017 by John Wiley & Sons, Inc.
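At the heart of single-molecule localization is estimating each emitter's position to sub-pixel precision from its diffraction-limited spot. The sketch below uses a plain intensity centroid for clarity; it is an illustrative stand-in, since STORM reconstruction software typically fits a 2-D Gaussian (e.g. by maximum likelihood), which performs better at low signal-to-noise:

```python
import numpy as np

def localize_centroid(roi):
    """Sub-pixel localization of a single-molecule spot by intensity centroid.

    roi : small 2-D array containing one diffraction-limited spot.
    Returns (row, col) of the estimated emitter position in pixel units.
    """
    img = np.asarray(roi, dtype=float)
    img = img - img.min()                 # crude background subtraction
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

spot = np.array([[1, 2, 1],
                 [2, 9, 3],
                 [1, 3, 1]], dtype=float)
print(localize_centroid(spot))            # near the bright central pixel
```

Repeating this over thousands of frames of stochastically blinking fluorophores, then plotting all localizations, yields the super-resolved image.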
Just the right age: well-clustered exposure ages from a global glacial 10Be compilation
NASA Astrophysics Data System (ADS)
Heyman, Jakob; Margold, Martin
2017-04-01
Cosmogenic exposure dating has been used extensively for defining glacial chronologies, both in ice sheet and alpine settings, and the global set of published ages today reaches well beyond 10,000 samples. Over the last few years, a number of important developments have improved the measurements (with well-defined AMS standards) and exposure age calculations (with updated data and methods for calculating production rates), in the best case enabling high precision dating of past glacial events. A remaining problem, however, is the fact that a large portion of all dated samples have been affected by prior and/or incomplete exposure, yielding erroneous exposure ages under the standard assumptions. One way to address this issue is to only use exposure ages that can be confidently considered as unaffected by prior/incomplete exposure, such as groups of samples with statistically identical ages. Here we use objective statistical criteria to identify groups of well-clustered exposure ages from the global glacial "expage" 10Be compilation. Out of ~1700 groups with at least 3 individual samples ~30% are well-clustered, increasing to ~45% if allowing outlier rejection of a maximum of 1/3 of the samples (still requiring a minimum of 3 well-clustered ages). The dataset of well-clustered ages is heavily dominated by ages <30 ka, showing that well-defined cosmogenic chronologies primarily exist for the last glaciation. We observe a large-scale global synchronicity in the timing of the last deglaciation from ~20 to 10 ka. There is also a general correlation between the timing of deglaciation and latitude (or size of the individual ice mass), with earlier deglaciation in lower latitudes and later deglaciation towards the poles. Grouping the data into regions and comparing with available paleoclimate data we can start to untangle regional differences in the last deglaciation and the climate events controlling the ice mass loss. The extensive dataset and the statistical analysis enable an unprecedented global view on the last deglaciation.
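A common way to operationalize "statistically identical ages" is a chi-square test of the scatter about the error-weighted mean. The sketch below implements that generic criterion; the exact cutoff used for the expage compilation may differ, and the example ages are invented:

```python
import numpy as np
from scipy import stats

def well_clustered(ages, sigmas, alpha=0.05):
    """Test whether a group of exposure ages is statistically identical.

    ages, sigmas : exposure ages and their 1-sigma uncertainties (e.g. in ka).
    Computes the error-weighted mean and a chi-square statistic; the group
    passes if its scatter is consistent with measurement uncertainty alone.
    """
    ages, sigmas = np.asarray(ages, float), np.asarray(sigmas, float)
    w = 1.0 / sigmas**2
    mean = (w * ages).sum() / w.sum()
    chi2 = ((ages - mean) ** 2 * w).sum()
    dof = len(ages) - 1
    p = stats.chi2.sf(chi2, dof)
    return p > alpha, mean, chi2 / dof   # pass/fail, weighted mean, reduced chi2

ok, mean_ka, red_chi2 = well_clustered([18.2, 19.1, 18.7], [0.6, 0.7, 0.6])
print(ok, round(mean_ka, 1), round(red_chi2, 2))
```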
Chatterjee, Nilanjan; Chen, Yi-Hau; Maas, Paige; Carroll, Raymond J.
2016-01-01
Information from various public and private data sources of extremely large sample size is now increasingly available for research purposes. Statistical methods are needed for utilizing information from such big data sources while analyzing data from individual studies that may collect more detailed information required for addressing specific hypotheses of interest. In this article, we consider the problem of building regression models based on individual-level data from an “internal” study while utilizing summary-level information, such as information on parameters for reduced models, from an “external” big data source. We identify a set of very general constraints that link internal and external models. These constraints are used to develop a framework for semiparametric maximum likelihood inference that allows the distribution of covariates to be estimated using either the internal sample or an external reference sample. We develop extensions for handling complex stratified sampling designs, such as case-control sampling, for the internal study. Asymptotic theory and variance estimators are developed for each case. We use simulation studies and a real data application to assess the performance of the proposed methods in contrast to the generalized regression (GR) calibration methodology that is popular in the sample survey literature. PMID:27570323
Enumerating sparse organisms in ships' ballast water: why counting to 10 is not so easy.
Miller, A Whitman; Frazier, Melanie; Smith, George E; Perry, Elgin S; Ruiz, Gregory M; Tamburri, Mario N
2011-04-15
To reduce ballast water-borne aquatic invasions worldwide, the International Maritime Organization and United States Coast Guard have each proposed discharge standards specifying maximum concentrations of living biota that may be released in ships' ballast water (BW), but these regulations still lack guidance for standardized type approval and compliance testing of treatment systems. Verifying whether BW meets a discharge standard poses significant challenges. Properly treated BW will contain extremely sparse numbers of live organisms, and robust estimates of rare events require extensive sampling efforts. A balance of analytical rigor and practicality is essential to determine the volume of BW that can be reasonably sampled and processed, yet yield accurate live counts. We applied statistical modeling to a range of sample volumes, plankton concentrations, and regulatory scenarios (i.e., levels of type I and type II errors), and calculated the statistical power of each combination to detect noncompliant discharge concentrations. The model expressly addresses the roles of sampling error, BW volume, and burden of proof on the detection of noncompliant discharges in order to establish a rigorous lower limit of sampling volume. The potential effects of recovery errors (i.e., incomplete recovery and detection of live biota) in relation to sample volume are also discussed.
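The power calculation described in this abstract can be made concrete under the usual assumption that organisms are Poisson-distributed in well-mixed discharge, so a sample of volume V drawn from water at concentration c contains Poisson(cV) organisms. The exact-test decision rule and the numbers below are illustrative, not the authors' published model.

```python
from scipy import stats

def power_to_detect(conc, volume, limit=10.0, alpha=0.05):
    """Probability of declaring noncompliance when the true concentration
    is `conc` organisms/m^3, sampling `volume` m^3 of ballast water,
    against a discharge standard of `limit` organisms/m^3.

    Counts are modeled as Poisson(conc * volume). The critical count is
    set so that water exactly at the standard is flagged with
    probability at most alpha (a one-sided exact test).
    """
    # reject compliance only if the observed count exceeds k_crit
    k_crit = stats.poisson.ppf(1.0 - alpha, limit * volume)
    # power: chance the observed count exceeds k_crit at the true conc
    return 1.0 - stats.poisson.cdf(k_crit, conc * volume)

# power to flag water at 3x the standard, for several sample volumes
for v in (0.1, 1.0, 3.0, 7.0):
    print(f"{v:4.1f} m^3 sampled -> power {power_to_detect(30.0, v):.2f}")
```

The example makes the paper's point directly: at sparse concentrations, small sample volumes have almost no power, and the required volume is set by the burden of proof (alpha) as much as by the standard itself.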
NASA Technical Reports Server (NTRS)
Mittlefehldt, D. W.
2012-01-01
The capability of scientific instrumentation flown on planetary orbiters and landers has made great advances since the signature Viking mission of the seventies. At some point, however, the science return from orbital remote sensing, and even in situ measurements, becomes incremental rather than revolutionary. This is primarily caused by the low spatial resolution of such measurements, even for landed instrumentation, the incomplete mineralogical record derived from such measurements, the inability to do the detailed textural, mineralogical and compositional characterization needed to demonstrate equilibrium or reaction paths, and the lack of chronological characterization. For the foreseeable future, flight instruments will suffer from this limitation. In order to make the next revolutionary breakthrough in understanding the early geological and climatological history of Mars, samples must be available for interrogation using the full panoply of laboratory-housed analytical instrumentation. Laboratory studies of samples allow for determination of the parageneses of rocks through microscopic identification of mineral assemblages, evaluation of equilibrium through electron microbeam analyses of mineral compositions and structures, and determination of formation temperatures through secondary ion or thermal ionization mass spectrometry (SIMS or TIMS) analyses of stable isotope compositions. Such details are poorly constrained by orbital data (e.g. phyllosilicate formation at Mawrth Vallis), and incompletely described by in situ measurements (e.g. genesis of Burns formation sediments at Meridiani Planum). Laboratory studies can determine formation, metamorphism and/or alteration ages of samples through SIMS or TIMS analyses of radiogenic isotope systems, a capability well beyond flight instrumentation. Ideally, sample return should be from a location first scouted by landers, such that fairly mature hypotheses have been formulated that can be tested. However, samples from clastic sediments derived from an extensive region of Mars can provide important, detailed understanding of early martian geological and climatological history. Interrogating clastic "sediments" from the Earth, Moon and asteroids has allowed discovery of new crustal units, identification of now-vanished crust, and determination of the geological history of extensive, remote regions. Returned samples of martian fluvial and/or aeolian sediments, for example from Gale crater, could be "read like a book" in terrestrial laboratories to provide truly revolutionary new insights into early martian geological and climatological evolution.
Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de
2017-11-05
Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected on line using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy.
Shu, Tongxin; Xia, Min; Chen, Jiahong; de Silva, Clarence
2017-01-01
Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected on line using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy. PMID:29113087
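A minimal sketch of the idea behind data-driven adaptive sampling: raise the sampling rate when successive readings change quickly and lower it when they are stable. The rule, thresholds, and interval bounds below are illustrative assumptions, not the published DDASA algorithm.

```python
import numpy as np

def adaptive_intervals(readings, t_min=5.0, t_max=60.0, rise=0.1, fall=0.02):
    """Return the sampling interval (minutes) chosen after each reading.

    readings: sequence of sensor values (e.g., dissolved oxygen in mg/L).
    The interval halves toward t_min when the relative change between
    consecutive readings exceeds `rise`, doubles toward t_max when it
    stays below `fall`, and is otherwise left unchanged.
    """
    interval, intervals = t_max, []
    for prev, curr in zip(readings, readings[1:]):
        change = abs(curr - prev) / max(abs(prev), 1e-9)
        if change > rise:
            interval = max(t_min, interval / 2)   # sample faster: event underway
        elif change < fall:
            interval = min(t_max, interval * 2)   # sample slower: save energy
        intervals.append(interval)
    return intervals

# a stable, then rapidly deteriorating, dissolved-oxygen trace
do = np.concatenate([np.full(6, 8.0), np.linspace(8.0, 4.0, 6)])
print(adaptive_intervals(do.tolist()))
```

The energy saving in the paper comes from exactly this trade: long intervals dominate during quiet periods, while the accuracy during events is preserved by the fast-sampling branch.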
14 CFR 117.19 - Flight duty period extensions.
Code of Federal Regulations, 2013 CFR
2013-01-01
Title 14, Aeronautics and Space, vol. 3 (2013-01-01): AIR CARRIERS AND OPERATORS FOR COMPENSATION OR HIRE: CERTIFICATION AND OPERATIONS; FLIGHT AND DUTY LIMITATIONS AND REST REQUIREMENTS: FLIGHTCREW MEMBERS (EFF. 1-4-14). § 117.19 Flight duty period extensions. (a)...
Agricultural Extension Services and the Issue of Equity in Agricultural Development.
ERIC Educational Resources Information Center
Monu, Erasmus D.
1981-01-01
Reviews experiments in Kenya and Nigeria attempting to modify the progressive-farmer strategy. Success requires that extension services recognize small farmers' ability to make their own rational decisions and involve farmers in planning and implementing extension programs. Available from: Rural Sociological Society, 325 Morgan Hall, University of…
Community Health: FCS Extension Educators Deliver Diabetes Education in PA
ERIC Educational Resources Information Center
Cox, Jill N.; Corbin, Marilyn
2011-01-01
For decades, family and consumer sciences (FCS) Extension educators have provided health related education to consumers through Cooperative Extension programming at land grant universities. However, offering diabetes education can be extra challenging due to the complicated nature of the disease and the multi-faceted treatment required. Faced with…
Defining and Developing Curricula in the Context of Cooperative Extension
ERIC Educational Resources Information Center
Smith, Martin H.; Worker, Steven M.; Meehan, Cheryl L.; Schmitt-McQuitty, Lynn; Ambrose, Andrea; Brian, Kelley; Schoenfelder, Emily
2017-01-01
Effective curricula are considered to be the cornerstone of successful programming in Extension. However, there is no universal operationalized definition of the term "curriculum" as it applies to Extension. Additionally, the development of curricula requires a systematic process that takes into account numerous factors. We provide an…
77 FR 74775 - Extension of Dates for Certain Requirements and Amendment of Form 19b-4
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
...] RIN 3235-AK87 Extension of Dates for Certain Requirements and Amendment of Form 19b-4 AGENCY... security-based swap submissions with the Commission in an electronic format to dedicated email addresses to December 10, 2013, and amend the General Instructions to Form 19b-4 to clarify the process for submitting...
ERIC Educational Resources Information Center
Elsey, Barry; Sirichoti, Kittipong
2002-01-01
A sample of 120 Thai fruit growers reported that agricultural extension workers were influential in their adoption of integrated pest management, which balances cultural tradition and progressive practice. Extension workers used discussion and reflection on practical experience, a participatory and collaborative approach to the adoption of…
ERIC Educational Resources Information Center
Villard, Judith A.; Earnest, Garee W.
2006-01-01
This descriptive-correlational study used a census of Ohio State University Extension county directors and a random sample of county staff throughout the State of Ohio. Data were collected utilizing Bar-On's Emotional Intelligence Quotient instrument (county directors) and Warner's job satisfaction instrument (county staff). The study examined the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wild, M.; Rouhani, S.
1995-02-01
A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs, long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
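As a concrete illustration of the geostatistical machinery the abstract refers to, here is a minimal ordinary-kriging sketch under an assumed exponential variogram. The variogram parameters and sample data are invented for illustration; in practice both would be fitted to site measurements.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=50.0, nugget=0.05):
    """Exponential variogram model gamma(h) with practical range `rng`."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy, z, target, **vg):
    """Estimate the value at `target` from samples (xy, z).

    Solves the ordinary kriging system built from the variogram, which
    weights nearby samples more heavily than distant ones and returns
    the kriging variance as a built-in uncertainty estimate.
    """
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, **vg)
    np.fill_diagonal(A[:n, :n], 0.0)   # gamma(0) = 0: nugget is a jump at h=0
    A[n, n] = 0.0                      # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - target, axis=1), **vg)
    w = np.linalg.solve(A, b)
    estimate = w[:n] @ z
    variance = w @ b                   # kriging variance at the target
    return estimate, variance

# five soil-contaminant samples (x, y in m; z in mg/kg), all illustrative
xy = np.array([[0, 0], [40, 10], [10, 35], [60, 60], [25, 70]], float)
z = np.array([12.0, 30.0, 18.0, 55.0, 26.0])
print(ordinary_kriging(xy, z, np.array([30.0, 30.0])))
```

The kriging variance is what makes the method a planning tool: locations where it is large are exactly where an additional sample buys the most information.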
A systems engineering management approach to resource management applications
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda Shaller
1989-01-01
The author presents a program management response to the following question: How can the traditional practice of systems engineering management, including requirements specification, be adapted, enhanced, or modified to build future planning and scheduling systems for effective operations? The systems engineering management process, as traditionally practiced, is examined. Extensible resource management systems are discussed. It is concluded that extensible systems are a partial solution to problems presented by requirements that are incomplete, partially immeasurable, and often dynamic. There are positive indications that resource management systems have been characterized and modeled sufficiently to allow their implementation as extensible systems.
A New Electromagnetic Instrument for Thickness Gauging of Conductive Materials
NASA Technical Reports Server (NTRS)
Fulton, J. P.; Wincheski, B.; Nath, S.; Reilly, J.; Namkung, M.
1994-01-01
Eddy current techniques are widely used to measure the thickness of electrically conducting materials. The approach, however, requires an extensive set of calibration standards and can be quite time consuming to set up and perform. Recently, an electromagnetic sensor was developed which eliminates the need for impedance measurements. The ability to monitor the magnitude of a voltage output independent of the phase enables the use of extremely simple instrumentation. Using this new sensor a portable hand-held instrument was developed. The device makes single point measurements of the thickness of nonferromagnetic conductive materials. The technique utilized by this instrument requires calibration with two samples of known thicknesses that are representative of the upper and lower thickness values to be measured. The accuracy of the instrument depends upon the calibration range, with a larger range giving a larger error. The measured thicknesses are typically within 2-3% of the calibration range (the difference between the thin and thick sample) of their actual values. In this paper the design, operational and performance characteristics of the instrument along with a detailed description of the thickness gauging algorithm used in the device are presented.
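The two-point calibration the instrument uses can be sketched as a linear interpolation between the voltages measured on the thin and thick reference samples. The linear-response assumption and the numbers below are illustrative, not taken from the instrument.

```python
def make_thickness_gauge(v_thin, t_thin, v_thick, t_thick):
    """Return a function mapping sensor voltage to thickness.

    Calibrated with two samples of known thickness (t_thin < t_thick),
    assuming the probe voltage varies linearly with thickness over the
    calibrated range.
    """
    slope = (t_thick - t_thin) / (v_thick - v_thin)

    def thickness(voltage):
        t = t_thin + slope * (voltage - v_thin)
        # flag extrapolation outside the calibration range, where the
        # 2-3%-of-range error estimate no longer applies
        in_range = min(t_thin, t_thick) <= t <= max(t_thin, t_thick)
        return t, in_range

    return thickness

# illustrative calibration: voltages in V, thicknesses in mm
gauge = make_thickness_gauge(v_thin=1.82, t_thin=1.0,
                             v_thick=0.95, t_thick=3.0)
print(gauge(1.40))   # interpolated thickness and an in-range flag
```

The abstract's observation that a wider calibration range gives a larger absolute error follows directly: the quoted 2-3% accuracy scales with (t_thick - t_thin).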
Spatial-temporal discriminant analysis for ERP-based brain-computer interface.
Zhang, Yu; Zhou, Guoxu; Zhao, Qibin; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej
2013-03-01
Linear discriminant analysis (LDA) has been widely adopted to classify event-related potentials (ERPs) in brain-computer interfaces (BCIs). Good classification performance of an ERP-based BCI usually requires sufficient data recordings for effective training of the LDA classifier, and hence a long system calibration time, which may reduce the system's practicability and cause user resistance to the BCI system. In this study, we introduce a spatial-temporal discriminant analysis (STDA) for ERP classification. As a multiway extension of LDA, the STDA method tries to maximize the discriminant information between target and nontarget classes by finding two projection matrices, from the spatial and temporal dimensions collaboratively, which effectively reduces the feature dimensionality in the discriminant analysis and hence significantly decreases the number of required training samples. The proposed STDA method was validated with dataset II of BCI Competition III and a dataset recorded in our own experiments, and compared to state-of-the-art algorithms for ERP classification. Online experiments were additionally implemented for validation. The superior classification performance when using few training samples shows that STDA is effective in reducing the system calibration time and improving the classification accuracy, thereby enhancing the practicability of ERP-based BCIs.
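The alternating two-mode optimization at the heart of this approach can be sketched in a 2D-LDA style: fix the temporal projection, solve a generalized eigenproblem for the spatial one, and alternate. This is a schematic in the spirit of STDA, not the authors' exact algorithm; the dimensions, iteration count, and ridge regularizer are illustrative.

```python
import numpy as np
from scipy.linalg import eigh

def stda_like(X, y, d_spat=2, d_temp=2, n_iter=5, ridge=1e-6):
    """Alternately learn spatial (C x d_spat) and temporal (T x d_temp)
    projections separating two classes of trials X with shape (N, C, T)."""
    N, C, T = X.shape
    W_t = np.eye(T)[:, :d_temp]                 # start from raw time samples

    def lda_axes(M, d):
        # M: (N, p, q) projected trials; scatter is built over dimension p
        mu = M.mean(axis=0)
        Sb = np.zeros((M.shape[1], M.shape[1]))
        Sw = np.zeros_like(Sb)
        for c in np.unique(y):
            Mc = M[y == c]
            mc = Mc.mean(axis=0)
            Sb += len(Mc) * (mc - mu) @ (mc - mu).T   # between-class scatter
            D = Mc - mc
            Sw += np.einsum('npq,nrq->pr', D, D)      # within-class scatter
        Sw += ridge * np.trace(Sw) / Sw.shape[0] * np.eye(Sw.shape[0])
        vals, vecs = eigh(Sb, Sw)                     # generalized eigenproblem
        return vecs[:, ::-1][:, :d]                   # top-d discriminant axes

    for _ in range(n_iter):
        W_s = lda_axes(X @ W_t, d_spat)                      # spatial step
        W_t = lda_axes(np.swapaxes(X, 1, 2) @ W_s, d_temp)   # temporal step
    # per-trial features: W_s' X W_t, flattened
    feats = np.einsum('cs,nct,tu->nsu', W_s, X, W_t).reshape(N, -1)
    return W_s, W_t, feats

# synthetic demo: 60 trials, 8 channels, 50 time samples
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8, 50))
y = np.repeat([0, 1], 30)
X[y == 1, :2, 10:20] += 0.5        # class-dependent spatio-temporal bump
W_s, W_t, feats = stda_like(X, y)
print(W_s.shape, W_t.shape, feats.shape)
```

The sample-efficiency claim falls out of the dimensions: the bilinear model fits C*d_spat + T*d_temp parameters instead of the C*T needed by LDA on vectorized trials.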
A New Test Method for Determining the Strength and Fracture Toughness of Cement Mortar and Concrete
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Jy-An John; Liu, Ken C; Naus, Dan J
2010-01-01
The Spiral Notch Torsion Fracture Toughness Test (SNTT) was developed recently to determine the intrinsic fracture toughness (KIC) of structural materials. The SNTT system operates by applying pure torsion to uniform cylindrical specimens with a notch line that spirals around the specimen at a 45° pitch. KIC values are obtained with the aid of a three-dimensional finite-element computer code, TOR3D-KIC. The SNTT method is uniquely suitable for testing a wide variety of materials used extensively in pressure vessel and piping structural components and weldments. Application of the method to metallic, ceramic, and graphite materials has been demonstrated. One important characteristic of SNTT is that neither a fatigue precrack nor a deep notch is required for the evaluation of brittle materials, which significantly reduces the sample size requirement. In this paper we report results for a Portland cement-based mortar to demonstrate applicability of the SNTT method to cementitious materials. The estimated KIC of the tested mortar samples, with compressive strength of 34.45 MPa, was found to be 0.19 MPa m^(1/2).
Some practical universal noiseless coding techniques, part 2
NASA Technical Reports Server (NTRS)
Rice, R. F.; Lee, J. J.
1983-01-01
This report is an extension of earlier work (Part 1) which provided practical adaptive techniques for the efficient noiseless coding of a broad class of data sources characterized by only partially known and varying statistics (JPL Publication 79-22). The results here, while still claiming such general applicability, focus primarily on the noiseless coding of image data. A fairly complete and self-contained treatment is provided. Particular emphasis is given to the requirements of the forthcoming Voyager II encounters of Uranus and Neptune. Performance evaluations are supported both graphically and pictorially. Expanded definitions of the algorithms in Part 1 yield a computationally improved set of options for applications requiring efficient performance at entropies above 4 bits/sample. These expanded definitions include as an important subset, a somewhat less efficient but extremely simple "FAST" compressor which will be used at the Voyager Uranus encounter. Additionally, options are provided which enhance performance when atypical data spikes may be present.
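The coding options surveyed in this line of work build on the split-sample primitive now known as Golomb-Rice coding. Below is a minimal sketch of that primitive with parameter k (the number of low-order bits sent verbatim); it illustrates the mechanism only and is not the paper's specific option set.

```python
def rice_encode(values, k):
    """Encode non-negative integers with Rice parameter k.

    Each value n is split into a quotient q = n >> k, written in unary
    (q ones and a terminating zero), and a remainder written in k bits.
    """
    out = []
    for n in values:
        q, r = n >> k, n & ((1 << k) - 1)
        code = '1' * q + '0'
        if k:
            code += format(r, f'0{k}b')
        out.append(code)
    return ''.join(out)

def rice_decode(bits, k):
    """Inverse of rice_encode."""
    values, i = [], 0
    while i < len(bits):
        q = 0
        while bits[i] == '1':            # unary part: count the ones
            q, i = q + 1, i + 1
        i += 1                           # skip the terminating '0'
        r = int(bits[i:i + k], 2) if k else 0
        i += k
        values.append((q << k) | r)
    return values

data = [3, 0, 9, 2, 5]
encoded = rice_encode(data, k=2)
assert rice_decode(encoded, k=2) == data
print(encoded)                           # concatenated codewords
```

Choosing k per block to track the local entropy is what turns this fixed-parameter primitive into an adaptive coder of the kind the report describes.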
Measurement of surface microtopography
NASA Technical Reports Server (NTRS)
Wall, S. D.; Farr, T. G.; Muller, J.-P.; Lewis, P.; Leberl, F. W.
1991-01-01
Acquisition of ground truth data for use in microwave interaction modeling requires measurement of surface roughness sampled at intervals comparable to a fraction of the microwave wavelength and extensive enough to adequately represent the statistics of a surface unit. Sub-centimetric measurement accuracy is thus required over large areas, and existing techniques are usually inadequate. A technique is discussed for acquiring the necessary photogrammetric data using twin film cameras mounted on a helicopter. In an attempt to eliminate tedious data reduction, an automated technique was applied to the helicopter photographs, and results were compared to those produced by conventional stereogrammetry. Derived root-mean-square (RMS) roughness for the same stereo-pair was 7.5 cm for the automated technique versus 6.5 cm for the manual method. The principal source of error is probably due to vegetation in the scene, which affects the automated technique but is ignored by a human operator.
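The quoted RMS roughness can be computed from a profile of surface heights as the root-mean-square deviation about a detrended mean. A minimal sketch with synthetic data follows; the linear detrending choice is an assumption.

```python
import numpy as np

def rms_roughness(heights):
    """RMS roughness of a 1-D surface profile: the root-mean-square of
    height deviations after removing the best-fit line (surface tilt)."""
    x = np.arange(len(heights))
    trend = np.polyval(np.polyfit(x, heights, 1), x)  # remove tilt
    return np.sqrt(np.mean((heights - trend) ** 2))

# synthetic profile sampled every 2 cm over 10 m, heights in cm (illustrative)
rng = np.random.default_rng(0)
profile = 0.002 * np.arange(500) + rng.normal(0.0, 6.5, 500)
print(f"RMS roughness: {rms_roughness(profile):.1f} cm")
```

The 7.5 cm versus 6.5 cm discrepancy reported in the abstract would appear here as a bias in the height samples themselves (e.g., vegetation returns), not in this statistic.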
The forensic aspects of sexual violence.
Newton, Mary
2013-02-01
Complainants of sexual assault may disclose to different agencies, the police and health professionals being the most likely. It is possible for certain evidence types to be collected before a clinical forensic assessment takes place that do not require a Forensic Medical Practitioner. If the time frame after the incident and the nature of the assault warrant a forensic medical examination of either a complainant or a suspect, this should only be conducted by doctors and nurses who have received relevant, up-to-date specialist theoretical and practical training. Clear evidence shows that few other criminal offences require as extensive an examination and collection of forensic evidence as a sexual assault. The forensic evidence in a case may identify an assailant, eliminate a nominated suspect(s), and assist in the prosecution of a case. The elements of forensic medical examination reviewed in this chapter are those that currently vary most across jurisdictions around the world. Key focus points of this chapter are considerations for early evidence collection, utilising dedicated medical examination facilities for sample collection, contamination issues associated with evidence collection, and certain practical aspects of forensic sampling methods that have evolved in light of results identified by forensic scientists processing evidential samples in sexual assault cases. Some of the problems encountered by the forensic science provider will also be discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Fast de novo discovery of low-energy protein loop conformations.
Wong, Samuel W K; Liu, Jun S; Kou, S C
2017-08-01
In the prediction of protein structure from amino acid sequence, loops are challenging regions for computational methods. Since loops are often located on the protein surface, they can have significant roles in determining protein functions and binding properties. Loop prediction without the aid of a structural template requires extensive conformational sampling and energy minimization, which are computationally difficult. In this article we present a new de novo loop sampling method, the Parallely filtered Energy Targeted All-atom Loop Sampler (PETALS) to rapidly locate low energy conformations. PETALS explores both backbone and side-chain positions of the loop region simultaneously according to the energy function selected by the user, and constructs a nonredundant ensemble of low energy loop conformations using filtering criteria. The method is illustrated with the DFIRE potential and DiSGro energy function for loops, and shown to be highly effective at discovering conformations with near-native (or better) energy. Using the same energy function as the DiSGro algorithm, PETALS samples conformations with both lower RMSDs and lower energies. PETALS is also useful for assessing the accuracy of different energy functions. PETALS runs rapidly, requiring an average time cost of 10 minutes for a length 12 loop on a single 3.2 GHz processor core, comparable to the fastest existing de novo methods for generating an ensemble of conformations. Proteins 2017; 85:1402-1412. © 2017 Wiley Periodicals, Inc.
Nuclear data uncertainty propagation by the XSUSA method in the HELIOS2 lattice code
NASA Astrophysics Data System (ADS)
Wemple, Charles; Zwermann, Winfried
2017-09-01
Uncertainty quantification has been extensively applied to nuclear criticality analyses for many years and has recently begun to be applied to depletion calculations. However, regulatory bodies worldwide are trending toward requiring such analyses for reactor fuel cycle calculations, which also requires uncertainty propagation for isotopics and nuclear reaction rates. XSUSA is a proven methodology for cross section uncertainty propagation based on random sampling of the nuclear data according to covariance data in multi-group representation; HELIOS2 is a lattice code widely used for commercial and research reactor fuel cycle calculations. This work describes a technique to automatically propagate the nuclear data uncertainties via the XSUSA approach through fuel lattice calculations in HELIOS2. Application of the XSUSA methodology in HELIOS2 presented some unusual challenges because of the highly-processed multi-group cross section data used in commercial lattice codes. Currently, uncertainties based on the SCALE 6.1 covariance data file are being used, but the implementation can be adapted to other covariance data in multi-group structure. Pin-cell and assembly depletion calculations, based on models described in the UAM-LWR Phase I and II benchmarks, are performed and uncertainties in multiplication factor, reaction rates, isotope concentrations, and delayed-neutron data are calculated. With this extension, it will be possible for HELIOS2 users to propagate nuclear data uncertainties directly from the microscopic cross sections to subsequent core simulations.
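The random-sampling idea XSUSA implements can be sketched as drawing correlated perturbation factors for multi-group cross sections from their covariance matrix and re-running the lattice calculation for each sample. The toy two-group model and covariance below are invented for illustration; real analyses perturb the full multi-group library and run the actual lattice code per sample.

```python
import numpy as np

rng = np.random.default_rng(42)

# nominal two-group cross sections (illustrative numbers, not evaluated data)
sigma0 = np.array([0.010, 0.120])
# relative covariance matrix of the two groups (illustrative)
rel_cov = np.array([[0.0025, 0.0010],
                    [0.0010, 0.0040]])

def k_model(sigma):
    """Toy scalar response standing in for a full lattice calculation."""
    return 1.0 / (1.0 + 25.0 * sigma[0] + 2.0 * sigma[1])

# correlated samples: sigma = sigma0 * (1 + L z), with L L^T = rel_cov
L = np.linalg.cholesky(rel_cov)
n = 1000
samples = sigma0 * (1.0 + rng.standard_normal((n, 2)) @ L.T)

k = np.array([k_model(s) for s in samples])
print(f"k = {k.mean():.5f} +/- {k.std(ddof=1):.5f} "
      f"({100 * k.std(ddof=1) / k.mean():.2f}% rel. uncertainty)")
```

Because the perturbation is applied to the inputs rather than derived analytically, the same sample set simultaneously yields uncertainties for multiplication factor, reaction rates, and isotopics, which is the point of the XSUSA-in-HELIOS2 coupling described here.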
NASA Astrophysics Data System (ADS)
Zeeshan, M. A.; Esqué-de Los Ojos, D.; Castro-Hartmann, P.; Guerrero, M.; Nogués, J.; Suriñach, S.; Baró, M. D.; Nelson, B. J.; Pané, S.; Pellicer, E.; Sort, J.
2016-01-01
The effects of constrained sample dimensions on the mechanical behavior of crystalline materials have been extensively investigated. However, there is no clear understanding of these effects in nano-sized amorphous samples. Herein, nanoindentation together with finite element simulations are used to compare the properties of crystalline and glassy CoNi(Re)P electrodeposited nanowires (φ ~ 100 nm) with films (3 μm thick) of analogous composition and structure. The results reveal that amorphous nanowires exhibit a larger hardness, lower Young's modulus and higher plasticity index than glassy films. Conversely, the very large hardness and higher Young's modulus of crystalline nanowires are accompanied by a decrease in plasticity with respect to the homologous crystalline films. Remarkably, proper interpretation of the mechanical properties of the nanowires requires taking the curved geometry of the indented surface and sink-in effects into account. These findings are of high relevance for optimizing the performance of new, mechanically-robust, nanoscale materials for increasingly complex miniaturized devices.
NASA Astrophysics Data System (ADS)
Mabit, Lionel; Meusburger, Katrin; Iurian, Andra-Rada; Owens, Philip N.; Toloza, Arsenio; Alewell, Christine
2014-05-01
Soil and sediment related research for terrestrial agri-environmental assessments requires accurate depth-incremental sampling of soil and exposed sediment profiles. Existing coring equipment does not allow collecting soil/sediment increments at millimetre resolution. Therefore, the authors have designed an economic, portable, hand-operated surface soil/sediment sampler - the Fine Increment Soil Collector (FISC) - which allows extensive control of the soil/sediment sampling process and easy recovery of the collected material by using a simple screw-thread extraction system. In comparison with existing sampling tools, the FISC has the following advantages and benefits: (i) it permits sampling of soil/sediment samples at the top of the profile; (ii) it is easy to adjust so as to collect soil/sediment at mm resolution; (iii) it is simple to operate by a single person; (iv) incremental sampling can be performed in the field or at the laboratory; (v) it permits precise evaluation of bulk density at millimetre vertical resolution; and (vi) sample size can be tailored to analytical requirements. To illustrate the usefulness of the FISC in sampling soil and sediments for 7Be - a well-known cosmogenic soil tracer and fingerprinting tool - measurements, the sampler was tested in a forested soil located 45 km southeast of Vienna in Austria. The fine resolution of the increments (i.e. 2.5 mm) directly affects the measurement of the 7Be total inventory and, above all, the shape of the 7Be exponential profile, which is needed to assess soil movement rates. The FISC can improve the determination of the depth distributions of other fallout radionuclides (FRNs) - such as 137Cs, 210Pbex and 239+240Pu - which are frequently used for soil erosion and sediment transport studies and/or sediment fingerprinting. Such a device also offers great potential to investigate FRN depth distributions associated with fallout events such as those associated with nuclear emergencies. Furthermore, prior to remediation activities - such as topsoil removal - in contaminated soils and sediments (e.g. by heavy metals, pesticides or nuclear power plant accident releases), basic environmental assessment often requires determination of the extent and depth penetration of the different contaminants, a precision that the FISC can provide.
Hazen, Nancy L; Allen, Sydnye D; Christopher, Caroline Heaton; Umemura, Tomotaka; Jacobvitz, Deborah B
2015-08-01
We examined whether a maximum threshold of time spent in nonmaternal care exists, beyond which infants have an increased risk of forming a disorganized infant-mother attachment. The hours per week infants spent in nonmaternal care at 7-8 months were examined as a continuous measure and as a dichotomous threshold (over 40, 50 and 60 hr/week) to predict infant disorganization at 12-15 months. Two different samples (Austin and NICHD) were used to replicate findings and control for critical covariates: mothers' unresolved status and frightening behavior (assessed in the Austin sample, N = 125), quality of nonmaternal caregiving (assessed in the NICHD sample, N = 1,135), and family income and infant temperament (assessed in both samples). Only very extensive hours of nonmaternal care (over 60 hr/week) and mothers' frightening behavior independently predicted attachment disorganization. A polynomial logistic regression performed on the larger NICHD sample indicated that the risk of disorganized attachment exponentially increased after exceeding 60 hr/week. In addition, very extensive hours of nonmaternal care only predicted attachment disorganization after age 6 months (not prior). Findings suggest that during a sensitive period of attachment formation, infants who spend more than 60 hr/week in nonmaternal care may be at an increased risk of forming a disorganized attachment.
A glacier runoff extension to the Precipitation Runoff Modeling System
A. E. Van Beusekom; R. J. Viger
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-18
...: 84.326Z.] Final Waiver and Extension of the Project Period for the Technical Assistance Coordination... project period. SUMMARY: The Secretary waives the requirements in the Education Department General Administrative Regulations that generally prohibit project periods exceeding five years and extensions of project...
ERIC Educational Resources Information Center
Chowdhury, Ataharul Huq; Odame, Helen Hambly; Leeuwis, Cees
2014-01-01
Purpose: The rapidly evolving nature of agricultural innovation processes in low-income countries requires agricultural extension agencies to transform the classical roles that previously supported linear information dissemination and adoption of innovation. In Bangladesh, strengthening agricultural innovation calls for facilitation of interactive…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-25
...: Extension of Time Limit for Preliminary Results of Antidumping Duty New Shipper Review AGENCY: Import... results of this review is July 24, 2011. Extension of Time Limits for Preliminary Results of Review... time to complete the preliminary results. Specifically, the Department requires additional time to...
NASA Technical Reports Server (NTRS)
1979-01-01
Contractor information requirements necessary to support the power extension package project of the space shuttle program are specified for the following categories of data: project management; configuration management; systems engineering and test; manufacturing; reliability, quality assurance and safety; logistics; training; and operations.
Conducting a Statewide Dual-Purpose Program for Pesticide Applicators and County Extension Agents
ERIC Educational Resources Information Center
Fishel, Fred; Liu, Guodong David
2014-01-01
The University of Florida Cooperative Extension conducted a statewide program with a dual role during 2013 and 2014 to enhance efficiency. The program provided in-service training to county Extension agents and provided continuing education to meet requirements needed by licensed pesticide applicators. Using Polycom distance technology, the event…
75 FR 443 - Public Land Order No. 7738; Extension of Public Land Order No. 6760, Nevada
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... extension is necessary to continue protection of the Federal investment in the U.S. Forest Service Austin... made requires this extension to continue protection of the Federal investment in the Austin... leasing laws, to protect the Federal investment in the Austin Administrative Site, is hereby extended for...
Out-of-Sample Extensions for Non-Parametric Kernel Methods.
Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang
2017-02-01
Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.
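One concrete way to see the out-of-sample problem, and a simpler stand-in for the hyper-RKHS regression proposed here, is to regress each column of the learned kernel matrix on the inputs with kernel ridge regression under a base kernel; the fitted functions then extend the matrix to new points. The base kernel, regularization, and toy data below are all illustrative assumptions.

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """Base RBF kernel k0(x, y) used for the extension regression."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def extend_kernel(X_train, K_learned, lam=1e-3, gamma=0.5):
    """Return a function giving approximate learned-kernel similarities
    between new points and the training points.

    Each column K[:, j] of the learned (nonparametric) kernel matrix is
    regressed on the training inputs with kernel ridge regression under
    the base kernel, so the fitted functions extend K off-sample.
    """
    n = len(X_train)
    K0 = rbf(X_train, X_train, gamma)
    # one ridge solve shared by all columns: alpha = (K0 + lam I)^-1 K
    alpha = np.linalg.solve(K0 + lam * np.eye(n), K_learned)

    def k_new(X_new):
        return rbf(X_new, X_train, gamma) @ alpha   # rows: new points
    return k_new

# sanity check: if the "learned" kernel is itself the base kernel, the
# extension should roughly reproduce base-kernel similarities
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2))
k_new = extend_kernel(X, rbf(X, X))
X_test = rng.normal(size=(3, 2))
print(np.abs(k_new(X_test) - rbf(X_test, X)).max())  # modest residual
```

This mirrors the paper's observation that out-of-sample performance can approach in-sample performance: the extension is only as good as the regression from inputs to learned similarities.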
Wildlife habitats of the north coast of California: new techniques for extensive forest inventory.
Janet L. Ohmann
1992-01-01
A study was undertaken to develop methods for extensive inventory and analysis of wildlife habitats. The objective was to provide information about amounts and conditions of wildlife habitats from extensive, sample based inventories so that wildlife can be better considered in forest planning and policy decisions at the regional scale. The new analytical approach...
ERIC Educational Resources Information Center
Maienthal, E. J.; Becker, D. A.
This report presents the results of an extensive literature survey undertaken to establish optimum sampling, sample handling and long-term storage techniques for a wide variety of environmental samples to retain sample integrity. The components of interest are trace elements, organics, pesticides, radionuclides and microbiologicals. A bibliography…
NASA Astrophysics Data System (ADS)
Dobosy, R.; Dumas, E. J.; Sayres, D. S.; Healy, C. E.; Munster, J. B.; Baker, B.; Anderson, J. G.
2014-12-01
Detailed process-oriented study of the mechanisms of conversion of fossil carbon to atmospheric gas in the Arctic is progressing, but it is necessarily limited to a few point locations and requires detailed subsurface measurements inaccessible to remote sensing. Airborne measurements of concentration, transport and flux of these carbon gases at sufficiently low altitude to reflect surface variations can tie such local measurements to remotely observable features of the landscape. Carbon dioxide and water vapor have been observable for over 20 years from low-altitude small aircraft in the Arctic and elsewhere. Methane has been more difficult, requiring large powerful aircraft or limited flask samples. Recent developments in spectroscopy, however, have reduced the power and weight required to measure methane at rates suitable for eddy-covariance flux estimates. The Flux Observations of Carbon from an Airborne Laboratory (FOCAL) takes advantage of Integrated Cavity-Output Spectroscopy (ICOS) to measure CH4, CO2, and water vapor in a new airborne system. The system, moreover, measures these gases' stable isotopologues every two seconds or faster, helping to separate thermogenic from biogenic emissions. Paired with the Best Airborne Turbulence (BAT) probe developed for small aircraft by NOAA's Air Resources Laboratory and a light twin-engine aircraft adapted by Aurora Flight Sciences Inc., FOCAL measures at 6 m spacing, covering 100 km in less than 30 minutes. It flies between 10 m and 50 m above ground, interspersed with profiles to the top of the boundary layer and beyond. This presentation gives an overview of the magnitude and variation in fluxes and concentrations of CH4, CO2, and H2O with space, time, and time of day in a spatially extensive survey: more than 7500 km total in 15 flights over roughly a 100 km square during August 2013. An extensive data set such as this, at low altitude with high-rate sampling, addresses features that repeat on a 1 km scale or smaller, such as thermokarst lakes, as well as landscape changes on the 100 km scale.
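The eddy-covariance estimate underlying such airborne flux measurements is the covariance of vertical wind with scalar concentration over a flight leg. A minimal sketch with synthetic data follows; the despiking, detrending, and coordinate-rotation steps real systems need are omitted.

```python
import numpy as np

def eddy_covariance_flux(w, c):
    """Eddy-covariance flux: the mean of w'c', the product of fluctuations
    of vertical wind w (m/s) and scalar concentration c about their leg
    means. Returns the flux in concentration-units * m/s."""
    w_prime = w - w.mean()      # wind fluctuation about the leg mean
    c_prime = c - c.mean()      # concentration fluctuation
    return np.mean(w_prime * c_prime)

# synthetic 30 Hz leg: updrafts slightly enriched in CH4 (illustrative)
rng = np.random.default_rng(7)
n = 30 * 600                    # a 10-minute leg at 30 Hz
w = rng.normal(0.0, 0.5, n)
ch4 = 1.9 + 0.02 * w + rng.normal(0.0, 0.05, n)   # correlated component
print(f"flux = {eddy_covariance_flux(w, ch4):.4f} (scalar units * m/s)")
```

The 2 s isotopologue rate mentioned above matters here: flux accuracy depends on sampling the scalar fast enough to resolve the turbulent eddies that carry it.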
APPLICATION OF SEMIPERMEABLE MEMBRANE DEVICES TO INDOOR AIR SAMPLING
Semipermeable membrane devices (SPMDs) are a relatively new passive sampling technique for nonpolar organic compounds that have been extensively used for surface water sampling. A small body of literature indicates that SPMDs are also useful for air sampling. Because SPMDs ha...
Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola
2015-01-01
One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open-source system which can easily be managed on local servers and handled by individual researchers. Here we present a software package named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies. PMID:26047146
Singh, Kumar Saurabh; Thual, Dominique; Spurio, Roberto; Cannata, Nicola
2015-06-03
One of the most crucial characteristics of day-to-day laboratory information management is the collection, storage and retrieval of information about research subjects and environmental or biomedical samples. An efficient link between sample data and experimental results is essential for the successful outcome of a collaborative project. Currently available software solutions are largely limited to large-scale, expensive commercial Laboratory Information Management Systems (LIMS). Acquiring such a LIMS can indeed bring laboratory information management to a higher level, but most of the time this requires a substantial investment of money, time and technical effort. There is a clear need for a lightweight open-source system which can easily be managed on local servers and handled by individual researchers. Here we present a software package named SaDA for storing, retrieving and analyzing data originating from microorganism monitoring experiments. SaDA is fully integrated in the management of environmental samples, oligonucleotide sequences, microarray data and the subsequent downstream analysis procedures. It is simple and generic software, and can be extended and customized for various environmental and biomedical studies.
2016-03-02
We look to develop a structure for the tiling of frequency spaces in both Euclidean and non-Euclidean domains. In particular, we establish Nyquist tiles and sampling groups in Euclidean geometry, and discuss the extension of these concepts to hyperbolic and spherical geometry.
Hong, Mineui; Bang, Heejin; Van Vrancken, Michael; Kim, Seungtae; Lee, Jeeyun; Park, Se Hoon; Park, Joon Oh; Park, Young Suk; Lim, Ho Yeong; Kang, Won Ki; Sun, Jong-Mu; Lee, Se Hoon; Ahn, Myung-Ju; Park, Keunchil; Kim, Duk Hwan; Lee, Seunggwan; Park, Woongyang; Kim, Kyoung-Mee
2017-01-01
To generate accurate next-generation sequencing (NGS) data, the amount and quality of DNA extracted is critical. We analyzed 1564 tissue samples from patients with metastatic or recurrent solid tumors submitted for NGS according to their sample size, acquisition method, organ, and fixation to propose appropriate tissue requirements. Of the 1564 tissue samples, 481 (30.8%) consisted of fresh-frozen (FF) tissue, and 1083 (69.2%) consisted of formalin-fixed paraffin-embedded (FFPE) tissue. We obtained successful NGS results in 95.9% of cases. Of the 481 FF biopsies, 262 tissue samples were from lung, and the mean fragment size was 2.4 mm. Compared to lung, GI tract tumor fragments showed a significantly lower DNA extraction failure rate (2.1% versus 6.1%, p = 0.04). For FFPE biopsy samples, the size of the biopsy tissue was similar regardless of tumor type, with a mean of 0.8 × 0.3 cm, and the mean DNA yield per unstained slide was 114 ng. We obtained the highest amount of DNA from the colorectum (2353 ng) and the lowest from the hepatobiliary tract (760.3 ng), likely due to a relatively smaller biopsy size, extensive hemorrhage and necrosis, and lower tumor volume. On one unstained slide from FFPE operation specimens, the mean size of the specimen was 2.0 × 1.0 cm, and the mean DNA yield per unstained slide was 1800 ng. In conclusion, we present our experience with tissue requirements for an appropriate NGS workflow: > 1 mm² for FF biopsies, > 5 unstained slides for FFPE biopsies, and > 1 unstained slide for FFPE operation specimens, for successful test results in 95.9% of cases. PMID:28477007
Shirazi, Mohammadali; Reddy Geedipally, Srinivas; Lord, Dominique
2017-01-01
Severity distribution functions (SDFs) are used in highway safety to estimate the severity of crashes and conduct different types of safety evaluations and analyses. Developing a new SDF is a difficult task and demands significant time and resources. To simplify the process, the Highway Safety Manual (HSM) has started to document SDF models for different types of facilities. As such, SDF models have recently been introduced for freeway and ramps in HSM addendum. However, since these functions or models are fitted and validated using data from a few selected number of states, they are required to be calibrated to the local conditions when applied to a new jurisdiction. The HSM provides a methodology to calibrate the models through a scalar calibration factor. However, the proposed methodology to calibrate SDFs was never validated through research. Furthermore, there are no concrete guidelines to select a reliable sample size. Using extensive simulation, this paper documents an analysis that examined the bias between the 'true' and 'estimated' calibration factors. It was indicated that as the value of the true calibration factor deviates further away from '1', more bias is observed between the 'true' and 'estimated' calibration factors. In addition, simulation studies were performed to determine the calibration sample size for various conditions. It was found that, as the average of the coefficient of variation (CV) of the 'KAB' and 'C' crashes increases, the analyst needs to collect a larger sample size to calibrate SDF models. Taking this observation into account, sample-size guidelines are proposed based on the average CV of crash severities that are used for the calibration process. Copyright © 2016 Elsevier Ltd. All rights reserved.
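The HSM-style scalar calibration examined here amounts to a ratio of observed to model-predicted severe crashes on local data. A minimal sketch follows; the counts are invented, and the real procedure applies the factor inside the severity distribution function rather than as a standalone ratio.

```python
import numpy as np

def sdf_calibration_factor(observed_kab, predicted_kab):
    """Scalar calibration factor for a severity distribution function:
    the ratio of locally observed severe (KAB) crash counts to the
    counts predicted by the uncalibrated SDF at the same sites."""
    return np.sum(observed_kab) / np.sum(predicted_kab)

# site-level severe-crash counts: observed vs. SDF-predicted (illustrative)
obs = np.array([4, 0, 2, 7, 1, 3, 5])
pred = np.array([3.1, 0.8, 2.4, 5.0, 1.5, 2.2, 4.6])
C = sdf_calibration_factor(obs, pred)
print(f"calibration factor C = {C:.2f}")   # C > 1: model under-predicts locally

# coefficient of variation of the observed counts, the quantity the
# paper's sample-size guidance keys on
cv = obs.std(ddof=1) / obs.mean()
print(f"CV of KAB counts = {cv:.2f}")
```

The paper's finding can be read off this setup: the further the true C sits from 1, and the larger the CV of the counts, the more sites are needed before the estimated ratio stabilizes.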
Extension-twist coupling of composite circular tubes with application to tilt rotor blade design
NASA Technical Reports Server (NTRS)
Nixon, Mark W.
1987-01-01
This investigation was conducted to determine if the twist deformation required for the design of full-scale extension-twist-coupled tilt-rotor blades can be achieved within material design limit loads, and to demonstrate the accuracy of a coupled-beam analysis in predicting twist deformations. Two extension-twist-coupled tilt-rotor blade designs were developed based on theoretically optimum aerodynamic twist distributions. The designs indicated a twist rate requirement of between 0.216 and 0.333 deg/in. Agreement between axial tests and analytical predictions was within 10 percent at design limit loads. Agreement between the torsion tests and predictions was within 11 percent.
A fast sequence assembly method based on compressed data structures.
Liang, Peifeng; Zhang, Yancong; Lin, Kui; Hu, Jinglu
2014-01-01
Assembling a large genome from next-generation sequencing reads requires large computer memory and a long execution time. To reduce these requirements, a memory- and time-efficient assembler, the FMJ-Assembler, is presented; it applies an FM-index in JR-Assembler, where FM stands for the FMR-index derived from the FM-index and the BWT, and J for jumping extension. The FMJ-Assembler uses the expanded FM-index and BWT to compress read data to save memory, and its jumping extension method makes it faster in CPU time. An extensive comparison of the FMJ-Assembler with current assemblers shows that the FMJ-Assembler achieves a better or comparable overall assembly quality and requires lower memory use and less CPU time. All these advantages indicate that the FMJ-Assembler will be an efficient assembly method for next-generation sequencing technology.
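The FM-index machinery named here answers substring-count queries by backward search over the Burrows-Wheeler transform. A minimal sketch follows; the rotation-sort BWT construction and linear-scan rank query are fine for illustration but far too slow for genome-scale data.

```python
def bwt(text):
    """Burrows-Wheeler transform via (naive) sorted cyclic rotations."""
    text += '$'                                   # unique terminator
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return ''.join(rot[-1] for rot in rotations)

def backward_search(bwt_str, pattern):
    """Count occurrences of `pattern` with FM-index backward search.

    Maintains a half-open range [lo, hi) of suffix ranks matching the
    current suffix of the pattern, extended one character at a time
    from right to left using the C[] table and rank queries on the BWT.
    """
    counts = {}
    for ch in bwt_str:
        counts[ch] = counts.get(ch, 0) + 1
    C, total = {}, 0
    for ch in sorted(counts):                 # C[ch]: # chars < ch
        C[ch] = total
        total += counts[ch]

    def rank(ch, i):                          # occurrences of ch in bwt[:i]
        return bwt_str.count(ch, 0, i)        # O(n); real FM-indexes use
                                              # sampled rank structures
    lo, hi = 0, len(bwt_str)
    for ch in reversed(pattern):
        if ch not in C:
            return 0
        lo = C[ch] + rank(ch, lo)
        hi = C[ch] + rank(ch, hi)
        if lo >= hi:
            return 0
    return hi - lo

genome = 'GATTACATTAGATTA'
assert backward_search(bwt(genome), 'ATTA') == 3
print(backward_search(bwt(genome), 'ATTA'))
```

Assemblers exploit exactly this query: counting exact-match extensions of a read's end without decompressing the read set, which is where the memory saving comes from.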
Microbial Groundwater Sampling Protocol for Fecal-Rich Environments
Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William
2014-01-01
Inherently, confined animal farming operations (CAFOs) and other intense fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges in addition to economic constraints when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows for collecting a large number of samples quickly, relatively inexpensively, and under field conditions with limited access to capacity for sterilizing equipment. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging can successfully control for equipment cross-contamination, but also controls for significant contamination of the well-head, within the well casing and within the immediate aquifer vicinity of the well-screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples when exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization, but require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186
NASA Technical Reports Server (NTRS)
Barnes, J. C.; Bowley, C. J.
1974-01-01
Because of the effect of sea ice on the heat balance of the Arctic and because of the expanding economic interest in arctic oil and other minerals, extensive monitoring and further study of sea ice is required. The application of ERTS data for mapping ice is evaluated for several arctic areas, including the Bering Sea, the eastern Beaufort Sea, parts of the Canadian Archipelago, and the Greenland Sea. Interpretive techniques are discussed, and the scales and types of ice features that can be detected are described. For the Bering Sea, a sample of ERTS imagery is compared with visual ice reports and aerial photography from the NASA CV-990 aircraft.
NASA Astrophysics Data System (ADS)
Fuqua, Peter D.; Presser, Nathan; Barrie, James D.; Meshishnek, Michael J.; Coleman, Dianne J.
2002-06-01
Certain spaceborne telescope designs require that dielectric-coated lenses be exposed to the energetic electrons and protons associated with the space environment. Test coupons that were exposed to a simulated space environment showed extensive pitting as a result of dielectric breakdown. A typical pit was 50-100 μm at the surface and extended to the substrate material, in which a 10-μm-diameter melt region was found. Pitting was not observed on similar samples that had also been overcoated with a transparent conductive thin film. Measurement of the bidirectional reflectance distribution transfer function showed that pitting caused a fivefold to tenfold increase in the scattering of visible light.
Recent Advances in Microbial Single Cell Genomics Technology and Applications
NASA Astrophysics Data System (ADS)
Stepanauskas, R.
2016-02-01
Single cell genomics is increasingly utilized as a powerful tool to decipher the metabolic potential, evolutionary histories and in situ interactions of environmental microorganisms. This transformative technology recovers extensive information from cultivation-unbiased samples of individual, unicellular organisms. Thus, it does not require data binning into arbitrary phylogenetic or functional groups and is therefore highly compatible with agent-based modeling approaches. I will present several technological advances in this field, which significantly improve genomic data recovery from individual cells and provide direct linkages between a cell's genomic and phenotypic properties. I will also demonstrate how these new technical capabilities help in understanding the metabolic potential and viral infections of the "microbial dark matter" inhabiting aquatic and subsurface environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, M.K.
1999-05-10
Using ORNL information on the characterization of the tank waste sludges, SRTC performed extensive bench-scale vitrification studies using simulants. Several glass systems were tested to ensure the optimum glass composition (based on the glass liquidus temperature, viscosity and durability) is determined. This optimum composition will balance waste loading, melt temperature, waste form performance and disposal requirements. By optimizing the glass composition, a cost savings can be realized during vitrification of the waste. The preferred glass formulation was selected from the bench-scale studies and recommended to ORNL for further testing with samples of actual OR waste tank sludges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; He, Yunteng; Kong, Wei, E-mail: wei.kong@oregonstate.edu
We report electron diffraction of ferrocene doped in superfluid helium droplets. By taking advantage of the velocity slip in our pulsed droplet beam using a pulsed electron gun, and by doping with a high concentration of ferrocene delivered via a pulsed valve, we can obtain high quality diffraction images from singly doped droplets. Under the optimal doping conditions, 80% of the droplets sampled in the electron beam are doped with just one ferrocene molecule. Extension of this size selection method to dopant clusters has also been demonstrated. However, incomplete separation of dopant clusters might require deconvolution and modeling of the doping process. This method can be used for studies of nucleation processes in superfluid helium droplets.
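Dopant pickup by helium droplets is commonly modeled with Poisson statistics, which is the kind of doping model the deconvolution alluded to above would build on. The sketch below, under that Poisson assumption and with purely illustrative numbers, scans the mean pickup number to see where the singly doped fraction of doped droplets falls to the reported 80%:

```python
import numpy as np
from scipy.stats import poisson

def doped_fractions(mean_pickup):
    """Fraction of droplets carrying exactly k dopants, conditional on
    carrying at least one, under Poisson pickup statistics."""
    k = np.arange(1, 10)
    p = poisson.pmf(k, mean_pickup)
    return k, p / (1.0 - poisson.pmf(0, mean_pickup))

# Illustrative scan: mean pickup number at which ~80% of the doped
# droplets hold a single molecule (none of these values are from the paper).
for lam in np.linspace(0.05, 1.0, 20):
    k, frac = doped_fractions(lam)
    if frac[0] <= 0.80:
        print(f"singly doped fraction drops below 80% near lambda ~ {lam:.2f}")
        break
```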
Design considerations for case series models with exposure onset measurement error.
Mohammed, Sandra M; Dalrymple, Lorien S; Sentürk, Damla; Nguyen, Danh V
2013-02-28
The case series model allows for estimation of the relative incidence of events, such as cardiovascular events, within a pre-specified time window after an exposure, such as an infection. The method requires only cases (individuals with events) and controls for all fixed/time-invariant confounders. The measurement error case series model extends the original case series model to handle imperfect data, where the timing of an infection (exposure) is not known precisely. In this work, we propose a method for power/sample size determination for the measurement error case series model. Extensive simulation studies are used to assess the accuracy of the proposed sample size formulas. We also examine the magnitude of the relative loss of power due to exposure onset measurement error, compared with the ideal situation where the time of exposure is measured precisely. To facilitate the design of case series studies, we provide publicly available web-based tools for determining power/sample size for both the measurement error case series model as well as the standard case series model. Copyright © 2012 John Wiley & Sons, Ltd.
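The paper's power formulas are not reproduced in the abstract. As a hedged illustration of the general idea, the sketch below estimates power by Monte Carlo for the standard (error-free) case series model, using the fact that, conditional on a subject's event total, the events fall in the exposure risk window with a binomial probability; pooling subjects into a single binomial test is a simplification, and all parameter values are hypothetical:

```python
import numpy as np
from scipy.stats import binomtest  # scipy >= 1.7

rng = np.random.default_rng(1)

def sccs_power(n_subjects, mean_events, rho, rel_incidence,
               alpha=0.05, n_sim=2000):
    """Monte Carlo power for the standard self-controlled case series.

    rho: fraction of each subject's observation time inside the risk window.
    Conditional on the event total n, events in the risk window are
    Binomial(n, p) with p = rho*RI / (rho*RI + 1 - rho).
    """
    p1 = rho * rel_incidence / (rho * rel_incidence + 1 - rho)
    p0 = rho  # probability under the null hypothesis RI = 1
    hits = 0
    for _ in range(n_sim):
        n_events = rng.poisson(mean_events, n_subjects).sum()
        in_window = rng.binomial(n_events, p1)
        if binomtest(in_window, n_events, p0).pvalue < alpha:
            hits += 1
    return hits / n_sim

print(sccs_power(n_subjects=200, mean_events=1.5, rho=0.1, rel_incidence=2.0))
```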
Development of a passive sampler for gaseous mercury
NASA Astrophysics Data System (ADS)
Gustin, M. S.; Lyman, S. N.; Kilner, P.; Prestbo, E.
2011-10-01
Here we describe work toward development of the components of a cost-effective passive sampling system for gaseous Hg that could be broadly deployed by nontechnical staff. The passive sampling system included an external shield to reduce turbulence and exposure to precipitation and dust, a diffusive housing that directly protects the collection surface during deployment and handling, and a collection surface. A protocol for cleaning and deploying the sampler and an analytical method were developed. Our final design consisted of a polycarbonate external shield enclosing a custom diffusive housing made from expanded PTFE tubing. Two collection surfaces were investigated: gold sputter-coated quartz plates and silver wires. Research showed the former would require extensive quality control for use, while the latter suffered interferences from other atmospheric constituents. Although the gold surface exhibited the best performance over space and time, gradual passivation would limit reuse. For both surfaces, the lack of contamination during shipping, deployment, and storage indicated that the handling protocols developed worked well with nontechnical staff. We suggest that the basis for this passive sampling system is sound, but further exploration and development of a reliable collection surface is needed.
Gas chromatographic/mass spectrometric and microbiological analyses on irradiated chicken
NASA Astrophysics Data System (ADS)
Parlato, A.; Calderaro, E.; Bartolotta, A.; D'Oca, M. C.; Giuffrida, S. A.; Brai, M.; Tranchina, L.; Agozzino, P.; Avellone, G.; Ferrugia, M.; Di Noto, A. M.; Caracappa, S.
2007-08-01
Ionizing radiation is widely used as a treatment technique for food preservation. Its effects include reduction of microbial contamination, disinfestation, sprout inhibition, and extension of the shelf life of food. However, the commercialization of irradiated food requires the availability of reliable methods to identify irradiated foodstuffs. In this paper, we present results on the application of such a method to irradiated chicken, based on the detection, in muscle and skin samples, of the ion peaks at 98 Da and 112 Da, in a ratio of approximately 4:1, typical of radiation-induced 2-dodecylcyclobutanones (2-DCB). A further aim of the work was to study the time stability of the measured parameters in samples irradiated at 3 and 5 kGy, and to verify the efficacy of the treatment from a microbiological point of view. Our results show that, one month after irradiation at 3 kGy, the method is suitable for skin but not for muscle samples, while the measured parameters are detectable in both sample types irradiated at 5 kGy. The microbial population was substantially reduced even at 3 kGy.
Cervical cytology biobanking in Europe.
Arbyn, Marc; Van Veen, Evert-Ben; Andersson, Kristin; Bogers, Johannes; Boulet, Gaëlle; Bergeron, Christine; von Knebel-Doeberitz, Magnus; Dillner, Joakim
2010-01-01
A cervical cytology biobank (CCB) is an extension of current cytopathology laboratory practice, consisting of the systematic storage of Pap smears or liquid-based cytology samples from women participating in cervical cancer screening, with the explicit purpose of facilitating future scientific research and quality audits of preventive services. A CCB should use an internationally agreed uniform cytology terminology, be integrated in a national or regional screening registry, and be linked to other registries (histology, cancer, vaccination). Legal and ethical principles concerning personal integrity and data safety must be respected strictly. Biobank-based studies require approval of ethical review boards. A CCB is an almost inexhaustible resource for fundamental and applied biological research. In particular, it can contribute to answering questions on the natural history of HPV infection and HPV-induced lesions and cancers, screening effectiveness, exploration of new biomarkers, and surveillance of the short- and long-term effects of the introduction of HPV vaccination. To understand the limitations of CCBs, more studies are needed on the quality of samples in relation to sample type, storage procedures, and duration of storage.
Crawshaw, Timothy R; Chanter, Jeremy I; McGoldrick, Adrian; Line, Kirsty
2014-02-07
Cases of Mycobacterium bovis infection in South American camelids have been increasing in Great Britain. Current antemortem immunological tests have some limitations, and cases frequently show extensive pathology at post-mortem examination. The feasibility of detecting Mycobacterium bovis DNA in clinical samples was investigated. A sensitive extraction methodology was developed and used on nasal swabs and faeces taken post-mortem to assess the potential for a PCR test to detect Mycobacterium bovis in clinical samples. The gross pathology of the studied South American camelids was scored, and a significantly greater proportion of those with more severe pathology were positive in both the nasal swab and faecal PCR tests. A combination of the nasal swab and faecal PCR tests detected 63.9% of all the South American camelids with pathology that were tested. The results suggest that antemortem diagnosis of Mycobacterium bovis in South American camelids may be possible using a PCR test on clinical samples; however, more work is required to determine sensitivity and specificity, and the practicalities of applying the test in the field.
The Viking X ray fluorescence experiment - Analytical methods and early results
NASA Technical Reports Server (NTRS)
Clark, B. C., III; Castro, A. J.; Rowe, C. D.; Baird, A. K.; Rose, H. J., Jr.; Toulmin, P., III; Christian, R. P.; Kelliher, W. C.; Keil, K.; Huss, G. R.
1977-01-01
Ten samples of the Martian regolith have been analyzed by the Viking lander X ray fluorescence spectrometers. Because of high-stability electronics, inclusion of calibration targets, and special data encoding within the instruments, the quality of the analyses performed on Mars is closely equivalent to that attainable with the same instruments operated in the laboratory. Determination of absolute elemental concentrations requires gain drift adjustments, subtraction of background components, and use of a mathematical response model with adjustable parameters set by prelaunch measurements on selected rock standards. Bulk fines at both Viking landing sites are quite similar in composition, implying that a chemically and mineralogically homogeneous regolith covers much of the surface of the planet. Important differences between samples include a higher sulfur content in what appear to be duricrust fragments than in fines, and a lower iron content in fines taken from beneath large rocks than in those taken from unprotected surface material. Further extensive reduction of these data will allow more precise and more accurate analytical numbers to be determined and thus a more comprehensive understanding of elemental trends between samples.
Nagarajan, Mahesh B.; Huber, Markus B.; Schlossbauer, Thomas; Leinsinger, Gerda; Krol, Andrzej; Wismüller, Axel
2014-01-01
Objective While dimension reduction has been previously explored in computer aided diagnosis (CADx) as an alternative to feature selection, previous implementations of its integration into CADx do not ensure strict separation between training and test data required for the machine learning task. This compromises the integrity of the independent test set, which serves as the basis for evaluating classifier performance. Methods and Materials We propose, implement and evaluate an improved CADx methodology where strict separation is maintained. This is achieved by subjecting the training data alone to dimension reduction; the test data is subsequently processed with out-of-sample extension methods. Our approach is demonstrated in the research context of classifying small diagnostically challenging lesions annotated on dynamic breast magnetic resonance imaging (MRI) studies. The lesions were dynamically characterized through topological feature vectors derived from Minkowski functionals. These feature vectors were then subject to dimension reduction with different linear and non-linear algorithms applied in conjunction with out-of-sample extension techniques. This was followed by classification through supervised learning with support vector regression. Area under the receiver-operating characteristic curve (AUC) was evaluated as the metric of classifier performance. Results Of the feature vectors investigated, the best performance was observed with Minkowski functional ’perimeter’ while comparable performance was observed with ’area’. Of the dimension reduction algorithms tested with ’perimeter’, the best performance was observed with Sammon’s mapping (0.84 ± 0.10) while comparable performance was achieved with exploratory observation machine (0.82 ± 0.09) and principal component analysis (0.80 ± 0.10). Conclusions The results reported in this study with the proposed CADx methodology present a significant improvement over previous results reported with such small lesions on dynamic breast MRI. In particular, non-linear algorithms for dimension reduction exhibited better classification performance than linear approaches, when integrated into our CADx methodology. We also note that while dimension reduction techniques may not necessarily provide an improvement in classification performance over feature selection, they do allow for a higher degree of feature compaction. PMID:24355697
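A minimal sketch of the central design point, strict train/test separation with an out-of-sample extension, using PCA and support vector regression as stand-ins (synthetic data; the study's actual topological feature vectors and non-linear embeddings are not reproduced here):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))               # stand-in for lesion feature vectors
y = (X[:, :3].sum(axis=1) > 0).astype(int)   # synthetic benign/malignant labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

# Dimension reduction is fit on the training data alone ...
pca = PCA(n_components=10).fit(X_tr)
Z_tr = pca.transform(X_tr)
# ... and the held-out test data is mapped by out-of-sample extension,
# never influencing the learned embedding.
Z_te = pca.transform(X_te)

model = SVR(kernel="rbf").fit(Z_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict(Z_te)))
```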
DAKOTA Design Analysis Kit for Optimization and Terascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.
2010-02-24
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
Preliminary flight results from the second U.S. Microgravity Payload (USMP-2)
NASA Technical Reports Server (NTRS)
Curreri, Peter; Reiss, Donald
1994-01-01
The second U.S. Microgravity Payload (USMP-2) was flown on the Space Shuttle in March 1994. It carried four major microgravity experiments plus a sophisticated accelerometer system to record the microgravity environment during USMP-2 operations. The USMP program is designed to accommodate experiments requiring extensive resources short of a full Spacelab mission, and the experiments are remotely operated and monitored. Results are reviewed from the four experiments: the Advanced Automated Directional Solidification Facility (AADSF), the Isothermal Dendrite Growth Experiment (IDGE), the Matériel pour l'Étude des Phénomènes Intéressant la Solidification sur Terre et en Orbite (MEPHISTO), and the Critical Fluid Light Scattering Experiment (Zeno). AADSF grew what is expected to be the largest steady-state-grown sample of HgCdTe to date during 240 hours of operation. IDGE provided 60 growth cycles over a wide range of supercooling conditions, studying the dendritic solidification of succinonitrile. MEPHISTO achieved 55 melt-solidify cycles and grew over 1 m of Bi/Sn alloy. Zeno located the critical point temperature of liquid Xe to 0.00001 K. IDGE and Zeno also provided the most extensive demonstrations to date of telescience.
DNA Micromanipulation Using Novel High-Force, In-Plane Magnetic Tweezer
NASA Astrophysics Data System (ADS)
McAndrew, Christopher; Mehl, Patrick; Sarkar, Abhijit
2010-03-01
We report the development of a magnetic force transducer that can apply piconewton forces on single DNA molecules in the focal plane, allowing continuous, high-precision tethered-bead tracking. The DNA constructs, proteins, and buffer are introduced into a 200 μL closed cell created using two glass slides separated by rigid spacers interspersed within a thin viscoelastic perimeter wall. This closed-cell configuration isolates our sample and produces low-noise force-extension measurements. Specially drawn micropipettes are used for capturing the polystyrene bead, pulling on the magnetic sphere, introducing proteins of interest, and maintaining flow. Various high-precision micromanipulators allow us to move pipettes and stage as required. The polystyrene bead is first grabbed and held using suction; then the magnetic particle at the other end of the DNA is pulled by a force created by either two small (1 mm x 2 mm x 4 mm) bar magnets or a micro magnet-tipped pipette. Changes in the end-to-end length of the DNA are observable in real time. We will present force-extension data obtained using the magnetic tweezer.
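The abstract does not name a polymer model, but single-molecule force-extension data of this kind are routinely compared against the worm-like chain. A sketch using the Marko-Siggia interpolation formula with textbook lambda-DNA parameters (assumed for illustration, not values from this experiment):

```python
import numpy as np

def wlc_force(x, L=16.5e-6, P=50e-9, T=298.0):
    """Marko-Siggia worm-like chain interpolation: force (N) at end-to-end
    extension x (m). L: contour length, P: persistence length; defaults
    are typical textbook values for lambda-DNA, not fitted parameters."""
    kBT = 1.380649e-23 * T
    r = x / L
    return (kBT / P) * (0.25 / (1.0 - r) ** 2 - 0.25 + r)

for frac in (0.5, 0.8, 0.95):
    print(f"x/L = {frac}: F = {wlc_force(frac * 16.5e-6) * 1e12:.2f} pN")
```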
Rutterford, Clare; Taljaard, Monica; Dixon, Stephanie; Copas, Andrew; Eldridge, Sandra
2015-06-01
To assess the quality of reporting and the accuracy of a priori estimates used in sample size calculations for cluster randomized trials (CRTs), we reviewed 300 CRTs published between 2000 and 2008. The prevalence of reporting sample size elements from the 2004 CONSORT recommendations was evaluated, and a priori estimates were compared with those observed in the trial. Of the 300 trials, 166 (55%) reported a sample size calculation. Only 36 of 166 (22%) reported all recommended descriptive elements. Elements specific to CRTs were the worst reported: a measure of within-cluster correlation was specified in only 58 of 166 (35%). Only 18 of 166 articles (11%) reported both a priori and observed within-cluster correlation values. Except in two cases, observed within-cluster correlation values were either close to or less than a priori values. Even with the CONSORT extension for cluster randomization, the reporting of sample size elements specific to these trials remains below that necessary for transparent reporting. Journal editors and peer reviewers should implement stricter requirements for authors to follow CONSORT recommendations. Authors should report observed and a priori within-cluster correlation values to enable comparisons between these over a wider range of trials. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
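For context, the within-cluster correlation enters the sample size through the standard design effect 1 + (m - 1)·ICC. A hedged sketch of the usual calculation for comparing two means, with illustrative inputs only:

```python
import math
from scipy.stats import norm

def crt_sample_size(delta, sd, m, icc, alpha=0.05, power=0.8):
    """Per-arm sample size for a two-arm cluster randomized trial comparing
    means: the individually randomized size inflated by the design effect
    1 + (m - 1)*ICC, where m is the (common) cluster size."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_ind = 2 * (z * sd / delta) ** 2        # per arm, individual randomization
    n = n_ind * (1 + (m - 1) * icc)          # inflate by the design effect
    return math.ceil(n), math.ceil(n / m)    # subjects and clusters per arm

subjects, clusters = crt_sample_size(delta=0.5, sd=1.0, m=20, icc=0.05)
print(f"{subjects} subjects in {clusters} clusters per arm")
```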
Next-Generation Sequencing of Aquatic Oligochaetes: Comparison of Experimental Communities
Vivien, Régis; Lejzerowicz, Franck; Pawlowski, Jan
2016-01-01
Aquatic oligochaetes are a common group of freshwater benthic invertebrates known to be very sensitive to environmental changes and currently used as bioindicators in some countries. However, more extensive application of oligochaetes for assessing the ecological quality of sediments in watercourses and lakes would require overcoming the difficulties related to morphology-based identification of oligochaete species. This study tested Next-Generation Sequencing (NGS) of a standard cytochrome c oxidase I (COI) barcode as a tool for the rapid assessment of oligochaete diversity in environmental samples, based on mixed specimen samples. To establish the composition of each sample, we Sanger-sequenced every specimen present in these samples. Our study showed that a large majority of OTUs (Operational Taxonomic Units) could be detected by NGS analyses. We also observed congruence between the NGS and specimen abundance data for several but not all OTUs. Because the differences in sequence abundance data were consistent across samples, we exploited these variations to empirically design correction factors. We showed that such factors increased the congruence between the values of oligochaete-based indices inferred from the NGS and the Sanger-sequenced specimen data. The validation of these correction factors by further experimental studies will be needed for the adaptation and use of NGS technology in biomonitoring studies based on oligochaete communities. PMID:26866802
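The abstract does not specify the functional form of the correction factors. One minimal reading, sketched below with hypothetical counts, is a per-OTU ratio of specimen-based to read-based relative abundance averaged over control samples, exploiting the reported consistency across samples:

```python
import numpy as np

# Rows: mixed specimen samples, columns: OTUs. Hypothetical counts.
reads = np.array([[900., 50., 50.],
                  [800., 120., 80.],
                  [850., 90., 60.]])
specimens = np.array([[30., 10., 10.],
                      [28., 14., 8.],
                      [29., 12., 9.]])

read_prop = reads / reads.sum(axis=1, keepdims=True)
spec_prop = specimens / specimens.sum(axis=1, keepdims=True)

# Per-OTU correction factor: mean specimen/read abundance ratio, assumed
# roughly constant across samples as the authors observed.
factors = (spec_prop / read_prop).mean(axis=0)

new_reads = np.array([700., 200., 100.])       # a new NGS sample
corrected = new_reads / new_reads.sum() * factors
corrected /= corrected.sum()                   # renormalize to proportions
print(corrected)
```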
Measurement of plutonium isotope ratios in nuclear fuel samples by HPLC-MC-ICP-MS
NASA Astrophysics Data System (ADS)
Günther-Leopold, I.; Waldis, J. Kobler; Wernli, B.; Kopajtic, Z.
2005-04-01
Radioactive isotopes are traditionally quantified by means of radioactivity counting techniques (α, β, γ). However, these methods often require extensive matrix separation and sample purification before the identification of specific isotopes and their relative abundance is possible, as is necessary in the frame of post-irradiation examinations on nuclear fuel samples. The technique of multicollector inductively coupled plasma mass spectrometry (MC-ICP-MS) is attracting much attention because it permits the precise measurement of the isotope compositions for a wide range of elements combined with excellent limits of detection due to high ionization efficiencies. The present paper describes one of the first applications of an online high-performance liquid chromatographic separation system coupled to a MC-ICP-MS in order to overcome isobaric interferences for the determination of the plutonium isotope composition and concentrations in irradiated nuclear fuels. The described chromatographic separation is sufficient to prevent any isobaric interference between 238Pu present at trace concentrations and 238U present as the main component of the fuel samples. The external reproducibility of the uncorrected plutonium isotope ratios was determined to be between 0.04 and 0.2% (2 s), resulting in a precision in the ‰ range for the isotopic vectors of the irradiated fuel samples.
40 CFR 76.12 - Phase I NOX compliance extension.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) ACID RAIN NITROGEN OXIDES EMISSION REDUCTION PROGRAM § 76.12 Phase I NOX compliance extension. (a... outage. (iii) Fuel and energy balance summaries and power and other consumption requirements (including...
40 CFR 76.12 - Phase I NOX compliance extension.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) ACID RAIN NITROGEN OXIDES EMISSION REDUCTION PROGRAM § 76.12 Phase I NOX compliance extension. (a... outage. (iii) Fuel and energy balance summaries and power and other consumption requirements (including...
40 CFR 76.12 - Phase I NOX compliance extension.
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) ACID RAIN NITROGEN OXIDES EMISSION REDUCTION PROGRAM § 76.12 Phase I NOX compliance extension. (a... outage. (iii) Fuel and energy balance summaries and power and other consumption requirements (including...
40 CFR 76.12 - Phase I NOX compliance extension.
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) ACID RAIN NITROGEN OXIDES EMISSION REDUCTION PROGRAM § 76.12 Phase I NOX compliance extension. (a... outage. (iii) Fuel and energy balance summaries and power and other consumption requirements (including...
40 CFR 76.12 - Phase I NOX compliance extension.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) ACID RAIN NITROGEN OXIDES EMISSION REDUCTION PROGRAM § 76.12 Phase I NOX compliance extension. (a... outage. (iii) Fuel and energy balance summaries and power and other consumption requirements (including...
75 FR 4585 - Proposed Extension of the Approval of Information Collection Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-28
... students. Type of Review: Extension. Agency: Office of Workers' Compensation Programs. Title: Certification... Burden Hours: 50. Total Burden Cost (capital/startup): $0. Total Burden Cost (operating/maintenance): $0...
Friedman, Jannice; Willis, John H
2013-07-01
Species with extensive ranges experience highly variable environments with respect to temperature, light, and soil moisture. Synchronizing the transition from vegetative to floral growth with favorable conditions for reproduction is important, and the optimal timing of this transition might differ between semelparous annual plants and iteroparous perennial plants. We studied variation in the critical photoperiod necessary for floral induction and the requirement for a period of cold-chilling (vernalization) in 46 populations of annuals and perennials in the Mimulus guttatus species complex. We then examined critical photoperiod and vernalization QTLs in growth chambers using F(2) progeny from annual and perennial parents that differed in their requirements for flowering. We identify extensive variation in critical photoperiod, with most annual populations requiring substantially shorter day lengths to initiate flowering than perennial populations. We discover a novel type of vernalization requirement in perennial populations that is contingent on plants experiencing short days first. QTL analyses identify two large-effect QTLs that influence critical photoperiod. In two separate vernalization experiments, we discover that each set of crosses contains different large-effect QTLs for vernalization. Mimulus guttatus harbors extensive variation in critical photoperiod and vernalization requirements that may be a consequence of local adaptation. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.
Joyce, G. C.; Rack, Peter M. H.; Ross, H. F.
1974-01-01
1. The mechanical resistance of the human forearm has been measured during imposed sinusoidal flexion-extension movements of the elbow joint. 2. The force required to move the limb can be divided into components required to move the mass, and components required to overcome the resistance offered by elastic and frictional properties of the muscles and other soft tissues. 3. When during a vigorous flexing effort the limb was subjected to a small amplitude sinusoidal movement each extension was followed by a considerable reflex contraction of the flexor muscles. At low frequencies of movement this reflex provided an added resistance to extension, but at 8-12 Hz the delay in the reflex pathway was such that the reflex response to extension occurred after the extension phase of the movement was over and during the subsequent flexion movement. The reflex activity then assisted the movement whereas at other frequencies it impeded it. 4. The reflex response to movement increased as the subject exerted a greater flexing force. 5. Small movements generated a relatively larger reflex response than big ones. 6. Even with large amplitudes of movement when the reflex activity was relatively small, the limb resisted extension with a high level of stiffness; this was comparable with the short range stiffness of muscles in experimental animals. 7. The fact that at some frequencies the reflex response assisted the movement implies that with appropriate loading the limb could undergo a self-sustaining oscillation at those frequencies. PMID:4420490
Multi-Reader ROC studies with Split-Plot Designs: A Comparison of Statistical Methods
Obuchowski, Nancy A.; Gallas, Brandon D.; Hillis, Stephen L.
2012-01-01
Rationale and Objectives Multi-reader imaging trials often use a factorial design, where study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of the design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper we compare three methods of analysis for the split-plot design. Materials and Methods Three statistical methods are presented: Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean ANOVA approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power and confidence interval coverage of the three test statistics. Results The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% CIs fall close to the nominal coverage for small and large sample sizes. Conclusions The split-plot MRMC study design can be statistically efficient compared with the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rate, similar power, and nominal CI coverage, are available for this study design. PMID:23122570
NASA Technical Reports Server (NTRS)
1979-01-01
User power, duration, and orbit requirements, which were the prime factors influencing power extension package (PEP) design, are discussed. A representative configuration of the PEP concept is presented and the major elements of the system are described as well as the PEP-to-Orbiter and remote manipulator interface provisions.
11 CFR 9038.4 - Extensions of time.
Code of Federal Regulations, 2010 CFR
2010-01-01
... has a right or is required to take action within a period of time prescribed by 11 CFR part 9038 or by... which to exercise such right or take such action. The candidate shall demonstrate in the application for extension that good cause exists for his or her request. (c) An application for extension of time shall be...
78 FR 22281 - Public Land Order No. 7810; Extension of Public Land Order No. 6963; Oregon
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-15
... period. The extension is necessary to continue protection of the natural values of the Florence Sand... Information Relay Service (FIRS) at 1-800-877-8339 to contact either of the above individuals. The FIRS is... which the withdrawal was first made requires this extension to continue the protection of the Florence...
77 FR 66479 - Public Land Order No. 7805; Extension of Public Land Order No. 6952; WA
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-05
...-year period. The extension is necessary to continue protection of the tree improvement and forest... telecommunications device for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800- 877-8339... withdrawal was first made requires this extension to continue protection of the Peony, Pole Pick, and Frank...
Bedrossian, Manuel; Lindensmith, Chris
2017-01-01
Detection of extant microbial life on Earth and elsewhere in the Solar System requires the ability to identify and enumerate micrometer-scale, essentially featureless cells. On Earth, bacteria are usually enumerated by culture plating or epifluorescence microscopy. Culture plates require long incubation times and can only count culturable strains, and epifluorescence microscopy requires extensive staining and concentration of the sample and instrumentation that is not readily miniaturized for space. Digital holographic microscopy (DHM) represents an alternative technique with no moving parts and higher throughput than traditional microscopy, making it potentially useful in space for detection of extant microorganisms provided that sufficient numbers of cells can be collected. Because sample collection is expected to be the limiting factor for space missions, especially to outer planets, it is important to quantify the limits of detection of any proposed technique for extant life detection. Here we use both laboratory and field samples to measure the limits of detection of an off-axis digital holographic microscope (DHM). A statistical model is used to estimate any instrument's probability of detection at various bacterial concentrations based on the optical performance characteristics of the instrument, as well as estimate the confidence interval of detection. This statistical model agrees well with the limit of detection of 10^3 cells/mL that was found experimentally with laboratory samples. In environmental samples, active cells were immediately evident at concentrations of 10^4 cells/mL. Published estimates of cell densities for Enceladus plumes yield up to 10^4 cells/mL, which are well within the off-axis DHM's limits of detection at confidence intervals greater than or equal to 95%, assuming sufficient sample volumes can be collected. The quantitative phase imaging provided by DHM allowed minerals to be distinguished from cells. Off-axis DHM's ability for rapid low-level bacterial detection and counting shows its viability as a technique for detection of extant microbial life provided that the cells can be captured intact and delivered to the sample chamber in a sufficient volume of liquid for imaging. Key Words: In situ life detection—Extant microorganisms—Holographic microscopy—Ocean Worlds—Enceladus—Imaging. Astrobiology 17, 913–925. PMID:28708412
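The paper's full statistical model is not given in the abstract, but the Poisson sampling logic behind such detection limits can be sketched: the chance of at least one cell landing in the interrogated volume. The 3 μL volume below is an assumption for illustration, not a figure from the paper:

```python
from scipy.stats import poisson

def p_detect(conc_per_ml, volume_ml, n_cells=1):
    """Probability of finding at least n_cells in the imaged volume when
    cells are Poisson-distributed at the given concentration."""
    lam = conc_per_ml * volume_ml      # expected number of cells in view
    return 1.0 - poisson.cdf(n_cells - 1, lam)

# Illustrative: an assumed 3 microliter interrogated volume near the
# experimentally found 1e3 cells/mL detection limit.
for conc in (1e2, 1e3, 1e4):
    print(f"{conc:8.0f} cells/mL -> P(detect) = {p_detect(conc, 3e-3):.3f}")
```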
Two techniques enable sampling of filtered and unfiltered molten metals
NASA Technical Reports Server (NTRS)
Burris, L., Jr.; Pierce, R. D.; Tobias, K. R.; Winsch, I. O.
1967-01-01
Filtered samples of molten metals are obtained by filtering through a plug of porous material fitted in the end of a sample tube, and unfiltered samples are obtained by using a capillary-tube extension rod with a perforated bucket. With these methods there are no sampling errors or loss of liquid.
Hettich, Robert L.; Pan, Chongle; Chourey, Karuna; Giannone, Richard J.
2013-01-01
The availability of extensive genome information for many different microbes, including unculturable species in mixed communities from environmental samples, has enabled systems-biology interrogation by providing a means to access genomic, transcriptomic, and proteomic information. To this end, metaproteomics exploits the power of high performance mass spectrometry for extensive characterization of the complete suite of proteins expressed by a microbial community in an environmental sample. PMID:23469896
EVALUATION OF SOLID SORBENTS FOR WATER SAMPLING
The report describes a systematic evaluation of the applicability of macroreticular resins for general and compound-specific sampling of organics. The first portion is an extensive review of current pertinent literature concerned with the use of macroreticular resins for sampling...
Bieler, Noah S; Tschopp, Jan P; Hünenberger, Philippe H
2015-06-09
An extension of the λ-local-elevation umbrella-sampling (λ-LEUS) scheme [Bieler et al., J. Chem. Theory Comput. 2014, 10, 3006] is proposed to handle the multistate (MS) situation, i.e. the calculation of the relative free energies of multiple physical states based on a single simulation. The key element of the MS-λ-LEUS approach is to use a single coupling variable Λ controlling successive pairwise mutations between the states of interest in a cyclic fashion. The Λ variable is propagated dynamically as an extended-system variable, using a coordinate transformation with plateaus and a memory-based biasing potential as in λ-LEUS. Compared to other available MS schemes (one-step perturbation, enveloping distribution sampling and conventional λ-dynamics) the proposed method presents a number of important advantages, namely: (i) the physical states are visited explicitly and over finite time periods; (ii) the extent of unphysical space required to ensure transitions is kept minimal and, in particular, one-dimensional; (iii) the setup protocol solely requires the topologies of the physical states; and (iv) the method only requires limited modifications in a simulation code capable of handling two-state mutations. As an initial application, the absolute binding free energies of five alkali cations to three crown ethers in three different solvents are calculated. The results are found to reproduce qualitatively the main experimental trends and, in particular, the experimental selectivity of 18C6 for K(+) in water and methanol, which is interpreted in terms of opposing trends along the cation series between the solvation free energy of the cation and the direct electrostatic interactions within the complex.
Kerr, William; Rowe, Philip; Pierce, Stephen Gareth
2017-06-01
Robotically guided knee arthroplasty systems generally require an individualized, preoperative 3D model of the knee joint. This is typically measured using Computed Tomography (CT), which provides the required accuracy for preoperative surgical intervention planning. Ultrasound imaging presents an attractive alternative to CT, allowing for reductions in cost and the elimination of doses of ionizing radiation, whilst maintaining the accuracy of the 3D model reconstruction of the joint. Traditional phased array ultrasound imaging methods, however, are susceptible to poor resolution and signal-to-noise ratios (SNR). Alleviating these weaknesses by offering superior focusing power, synthetic aperture methods have been investigated extensively within ultrasonic non-destructive testing. Despite this, they have yet to be fully exploited in medical imaging. In this paper, the ability of a robotically deployed ultrasound imaging system based on synthetic aperture methods to accurately reconstruct bony surfaces is investigated. Employing the Total Focussing Method (TFM) and the Synthetic Aperture Focussing Technique (SAFT), two samples representative of the bones of the knee joint were imaged: a human-shaped, composite distal femur and a bovine distal femur. Data were captured using a 5 MHz, 128 element 1D phased array, which was manipulated around the samples using a robotic positioning system. Three dimensional surface reconstructions were then produced and compared with reference models measured using a precision laser scanner. Mean errors of 0.82 mm and 0.88 mm were obtained for the composite and bovine samples, respectively, thus demonstrating the feasibility of the approach to deliver the sub-millimetre accuracy required for the application. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
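A compact sketch of the Total Focusing Method itself, delay-and-sum imaging of full matrix capture (FMC) data, on a synthetic point scatterer. The array geometry, sampling rate, and sound speed below are placeholders, not the study's setup:

```python
import numpy as np

c = 1540.0          # assumed speed of sound, m/s
fs = 50e6           # sampling rate, Hz
n_el = 16
pitch = 0.6e-3
elems = (np.arange(n_el) - (n_el - 1) / 2) * pitch    # element x-positions, z = 0

# fmc[tx, rx, t]: synthetic full matrix capture with one point scatterer.
n_t = 2000
fmc = np.zeros((n_el, n_el, n_t))
scat = np.array([1.0e-3, 8.0e-3])                     # scatterer at (x, z)
for tx in range(n_el):
    for rx in range(n_el):
        d = (np.hypot(*(scat - [elems[tx], 0.0])) +
             np.hypot(*(scat - [elems[rx], 0.0])))
        fmc[tx, rx, int(round(d / c * fs))] = 1.0

# TFM: for every image pixel, sum each tx-rx trace at the round-trip
# time of flight through that pixel.
xs = np.linspace(-5e-3, 5e-3, 81)
zs = np.linspace(2e-3, 14e-3, 97)
img = np.zeros((zs.size, xs.size))
for i, z in enumerate(zs):
    for j, x in enumerate(xs):
        d_el = np.hypot(elems - x, z)                 # pixel-to-element distances
        idx = np.rint((d_el[:, None] + d_el[None, :]) / c * fs).astype(int)
        idx = np.clip(idx, 0, n_t - 1)
        img[i, j] = np.abs(fmc[np.arange(n_el)[:, None],
                               np.arange(n_el)[None, :], idx].sum())

peak = np.unravel_index(img.argmax(), img.shape)
print("peak at x=%.2f mm, z=%.2f mm" % (xs[peak[1]] * 1e3, zs[peak[0]] * 1e3))
```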
On the analysis of time-of-flight spin-echo modulated dark-field imaging data
NASA Astrophysics Data System (ADS)
Sales, Morten; Plomp, Jeroen; Bouwman, Wim G.; Tremsin, Anton S.; Habicht, Klaus; Strobl, Markus
2017-06-01
Spin-Echo Modulated Small Angle Neutron Scattering with spatial resolution, i.e. quantitative Spin-Echo Dark-Field Imaging, is an emerging technique coupling neutron imaging with spatially resolved quantitative small angle scattering information. However, the currently achievable modulation periods, of the order of millimeters, are superimposed on the images of the samples. So far, this has required independent reduction and analysis of the image and scattering information encoded in the measured data, involving extensive curve-fitting routines. Besides requiring a priori decisions that potentially limit the extractable information content, this also hinders a straightforward judgment of data quality. In contrast, we propose a significantly simplified routine applied directly to the measured data. It not only allows an immediate first assessment of data quality and defers decisions on potentially information-limiting reduction steps to a later and better informed stage, but also, as our results suggest, generally yields better analyses. In addition, the method makes it possible to drop the spatial-resolution detector requirement for non-spatially resolved Spin-Echo Modulated Small Angle Neutron Scattering.
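One fit-free way to realize such a simplified routine, assuming the modulation period is known, is to read the modulation amplitude and offset off the Fourier component at that period and form the dark-field signal as a visibility ratio between sample and open beam. The sketch below is an illustration of that idea, not the authors' exact procedure:

```python
import numpy as np

def visibility(row, period_px):
    """Modulation visibility (amplitude/offset) of a 1-D intensity trace,
    read off from the Fourier component at the known spatial period
    instead of per-pixel curve fitting."""
    n = row.size
    k = np.exp(-2j * np.pi * np.arange(n) / period_px)
    amp = 2.0 * np.abs(np.dot(row, k)) / n
    return amp / row.mean()

# Synthetic open-beam and sample traces (illustrative numbers only).
x = np.arange(512)
period = 64.0
open_beam = 1000 * (1 + 0.6 * np.cos(2 * np.pi * x / period))
sample = 600 * (1 + 0.3 * np.cos(2 * np.pi * x / period + 0.1))

# The dark-field signal is the visibility ratio sample / open beam.
print("dark-field =", visibility(sample, period) / visibility(open_beam, period))
```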
5 CFR 362.203 - Appointment and extensions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Section 362.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PRESIDENTIAL MANAGEMENT FELLOWS PROGRAM Program Administration § 362.203 Appointment and extensions. (a... requirements (general leadership, managerial, or specialized experience, academic credentials, professional...
THE SCREENING AND RANKING ALGORITHM FOR CHANGE-POINTS DETECTION IN MULTIPLE SAMPLES
Song, Chi; Min, Xiaoyi; Zhang, Heping
2016-01-01
The chromosome copy number variation (CNV) is the deviation of genomic regions from their normal copy number states, which may be associated with many human diseases. Current genetic studies usually collect hundreds to thousands of samples to study the association between CNV and diseases. CNVs can be called by detecting the change-points in mean for sequences of array-based intensity measurements. Although multiple samples are of interest, the majority of the available CNV calling methods are single-sample based. Only a few multiple-sample methods have been proposed, using scan statistics that are computationally intensive and designed toward either common or rare change-point detection. In this paper, we propose a novel multiple-sample method by adaptively combining the scan statistic of the screening and ranking algorithm (SaRa), which is computationally efficient and able to detect both common and rare change-points. We prove that asymptotically this method can find the true change-points with almost certainty and show in theory that multiple-sample methods are superior to single-sample methods when shared change-points are of interest. Additionally, we report extensive simulation studies to examine the performance of our proposed method. Finally, using our proposed method as well as two competing approaches, we attempt to detect CNVs in the data from the Primary Open-Angle Glaucoma Genes and Environment study, and conclude that our method is faster and requires less information while our ability to detect the CNVs is comparable or better. PMID:28090239
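For orientation, the SaRa local diagnostic for a single sample is simply the difference of means over two flanking windows, screened for local maxima and then ranked. A sketch of that single-sample core (the paper's contribution, the adaptive combination across samples, is not reproduced here):

```python
import numpy as np

def sara_scores(y, h):
    """SaRa local diagnostic: |mean of the h points left of i minus mean of
    the h points right of i| for each candidate change-point i."""
    cs = np.concatenate([[0.0], np.cumsum(y)])
    i = np.arange(h, y.size - h + 1)
    left = (cs[i] - cs[i - h]) / h
    right = (cs[i + h] - cs[i]) / h
    return i, np.abs(left - right)

def screen_and_rank(y, h, threshold):
    i, d = sara_scores(y, h)
    # Screening: keep local maxima of the diagnostic above the threshold.
    keep = d > threshold
    keep[1:-1] &= (d[1:-1] >= d[:-2]) & (d[1:-1] >= d[2:])
    cand = i[keep]
    return cand[np.argsort(-d[keep])]        # ranking by diagnostic value

rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 300), rng.normal(1.2, 1, 100),
                    rng.normal(0, 1, 300)])
print(screen_and_rank(y, h=30, threshold=0.8)[:5])   # should flag ~300 and ~400
```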
MarcoPolo-R: Mission and Spacecraft Design
NASA Astrophysics Data System (ADS)
Peacocke, L.; Kemble, S.; Chapuy, M.; Scheer, H.
2013-09-01
The MarcoPolo-R mission is a candidate for the European Space Agency's medium-class Cosmic Vision programme, with the aim to obtain a 100 g sample of asteroid surface material and return it safely to the Earth. Astrium is one of two industrial contractors currently studying the mission to Phase A level, and the team has been working on the mission and spacecraft design since January 2012. Asteroids are some of the most primitive bodies in our solar system and are key to understanding the formation of the Earth, Sun and other planetary bodies. A returned sample would allow extensive analyses in the large laboratory-sized instruments here on Earth that are not possible with in-situ instruments. This analysis would also increase our understanding of the composition and structure of asteroids, and aid in plans for asteroid deflection techniques. In addition, the mission would be a valuable precursor for missions such as Mars Sample Return, demonstrating a high speed Earth re-entry and hard landing of an entry capsule. Following extensive mission analysis of both the baseline asteroid target 1996 FG3 and alternatives, a particularly favourable trajectory was found to the asteroid 2008 EV5 resulting in a mission duration of 4.5 to 6 years. In October 2012, the MarcoPolo-R baseline target was changed to 2008 EV5 due to its extremely primitive nature, which may pre-date the Sun. This change has a number of advantages: reduced ΔV requirements, an orbit with a more benign thermal environment, reduced communications distances, and a reduced complexity propulsion system - all of which simplify the spacecraft design significantly. The single spacecraft would launch between 2022 and 2024 on a Soyuz-Fregat launch vehicle from Kourou. Solar electric propulsion is necessary for the outward and return transfers due to the ΔV requirements, to minimise propellant mass. Once rendezvous with the asteroid is achieved, an observation campaign will begin to characterise the asteroid properties and map the surface in detail. Five potential sampling sites will be selected and closely observed in a local characterisation phase, leading to a single preferred sampling site being identified. The baseline instruments are a Narrow Angle Camera, a Mid-Infrared Spectrometer, a Visible Near-Infrared Spectrometer, a Radio Science Experiment, and a Close-up Camera. For the sampling phase, the spacecraft will perform a touch-and-go manoeuvre. A boom with a sampling mechanism at the end will be deployed, and the spacecraft will descend using visual navigation to touch the asteroid for some seconds. The rotary brush sampling mechanism will be activated on touchdown to obtain a good quality sample comprising regolith dust and pebbles. Low touchdown velocities and collision avoidance are critical at this point to prevent damage to the spacecraft and solar arrays. The spacecraft will then move away, returning to a safe orbit, and the sample will be transferred to an Earth Re-entry Capsule. After a final post-sampling characterisation campaign, the spacecraft will perform the return transfer to Earth. The Earth Re-entry Capsule will be released to directly enter the Earth's atmosphere, and is designed to survive a hard landing with no parachute deceleration. Once recovered, the asteroid sample would be extracted in a sample curation facility in preparation for the full analysis campaign. This presentation will describe Astrium's MarcoPolo-R mission and spacecraft design, with a focus on the innovative aspects of the design.
A hybrid approach to device integration on a genetic analysis platform
NASA Astrophysics Data System (ADS)
Brennan, Des; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Justice, John; Aherne, Margaret; Macek, Milan; Galvin, Paul
2012-10-01
Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms where discrete components are combined in a single platform are a suitable approach to integration, where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNiP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization.
Reinforcer control by comparison-stimulus color and location in a delayed matching-to-sample task.
Alsop, Brent; Jones, B Max
2008-05-01
Six pigeons were trained in a delayed matching-to-sample task involving bright- and dim-yellow samples on a central key, a five-peck response requirement to either sample, a constant 1.5-s delay, and the presentation of comparison stimuli composed of red on the left key and green on the right key or vice versa. Green-key responses were occasionally reinforced following the dimmer-yellow sample, and red-key responses were occasionally reinforced following the brighter-yellow sample. Reinforcer delivery was controlled such that the distribution of reinforcers across both comparison-stimulus color and comparison-stimulus location could be varied systematically and independently across conditions. Matching accuracy was high throughout. The ratio of left to right side-key responses increased as the ratio of left to right reinforcers increased, the ratio of red to green responses increased as the ratio of red to green reinforcers increased, and there was no interaction between these variables. However, side-key biases were more sensitive to the distribution of reinforcers across key location than were comparison-color biases to the distribution of reinforcers across key color. An extension of Davison and Tustin's (1978) model of DMTS performance fit the data well, but the results were also consistent with an alternative theory of conditional discrimination performance (Jones, 2003) that calls for a conceptually distinct quantitative model.
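As commonly written in the DMTS literature, the Davison and Tustin (1978) decomposition separates sample discriminability from response bias; a sketch of the usual form (the paper's extension adds analogous terms for comparison location, which are omitted here):

```latex
\log\frac{B_{11}}{B_{12}} = a\,\log\frac{R_1}{R_2} + \log d + \log c,
\qquad
\log\frac{B_{21}}{B_{22}} = a\,\log\frac{R_1}{R_2} - \log d + \log c
```

where B_ij counts responses to comparison j after sample i, R_1/R_2 is the obtained reinforcer ratio, log d measures sample discriminability, and log c captures inherent bias; half the difference of the two left-hand sides isolates log d.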
NASA Technical Reports Server (NTRS)
1979-01-01
The functional, performance, design, and test requirements for the Orbiter power extension package and its associated ground support equipment are defined. Both government and nongovernment standards and specifications are cited for the following subsystems: electrical power, structural/mechanical, avionics, and thermal control. Quality control assurance provisions and preparation for delivery are also discussed.
You're on Camera---in Color; A Television Handbook for Extension Workers.
ERIC Educational Resources Information Center
Tonkin, Joe
Color television has brought about new concepts of programming and new production requirements. This handbook is designed to aid those Extension workers who are concerned with or will appear on Extension television programs. The book discusses how to make the most of color, what to wear and how to apply makeup for color TV, how colors appear on…
76 FR 62831 - Public Land Order No. 7784; Extension of Public Land Order No. 6886; Wyoming
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-11
.... 6886 for an additional 20-year period. This extension is necessary to continue the protection of the... device for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 to... withdrawal was first made requires this extension in order to continue the protection of the unique...
76 FR 61737 - Public Land Order No. 7782; Extension of Public Land Order No. 6880; Oregon
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-05
... an additional 20-year period. The extension is necessary to continue protection of the scientific and... Information Relay Service (FIRS) at 1-800- 877-8339 to reach the BLM contact during normal business hours. The... which the withdrawal was first made requires this extension in order to continue the protection of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loo, B.W. Jr.
High resolution x-ray microscopy has been made possible in recent years primarily by two new technologies: microfabricated diffractive lenses for soft x-rays with about 30-50 nm resolution, and high brightness synchrotron x-ray sources. X-ray microscopy occupies a special niche in the array of biological microscopic imaging methods. It extends the capabilities of existing techniques mainly in two areas: a previously unachievable combination of sub-visible resolution and multi-micrometer sample size, and new contrast mechanisms. Because of the soft x-ray wavelengths used in biological imaging (about 1-4 nm), XM is intermediate in resolution between visible light and electron microscopies. Similarly, the penetration depth of soft x-rays in biological materials is such that the ideal sample thickness for XM falls in the range of 0.25-10 μm, between that of VLM and EM. XM is therefore valuable for imaging of intermediate level ultrastructure, requiring sub-visible resolutions, in intact cells and subcellular organelles, without artifacts produced by thin sectioning. Many of the contrast producing and sample preparation techniques developed for VLM and EM also work well with XM. These include, for example, molecule specific staining by antibodies with heavy metal or fluorescent labels attached, and sectioning of both frozen and plastic embedded tissue. However, there is also a contrast mechanism unique to XM that exists naturally because a number of elemental absorption edges lie in the wavelength range used. In particular, between the oxygen and carbon absorption edges (2.3 and 4.4 nm wavelength), organic molecules absorb photons much more strongly than does water, permitting element-specific imaging of cellular structure in aqueous media, with no artificially introduced contrast agents. For three-dimensional imaging applications requiring the capabilities of XM, an obvious extension of the technique would therefore be computerized x-ray microtomography (XMT).
Monitoring crack extension in fracture toughness tests by ultrasonics
NASA Technical Reports Server (NTRS)
Klima, S. J.; Fisher, D. M.; Buzzard, R. J.
1975-01-01
An ultrasonic method was used to observe the onset of crack extension and to monitor continued crack growth in fracture toughness specimens during three-point bend tests. A 20 MHz transducer was used with commercially available equipment to detect average crack extension of less than 0.09 mm. The material tested was a 300-grade maraging steel in the annealed condition. A crack extension resistance curve was developed to demonstrate the usefulness of the ultrasonic method for minimizing the number of tests required to generate such curves.
7 CFR 3419.4 - Limited waiver authority.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE MATCHING FUNDS REQUIREMENT FOR AGRICULTURAL RESEARCH AND EXTENSION FORMULA FUNDS AT 1890 LAND-GRANT INSTITUTIONS, INCLUDING TUSKEGEE UNIVERSITY, AND AT...
7 CFR 1717.603 - RUS approval of extensions and additions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... the acquisition or start of construction. (b) Power supply borrowers. Prior written approval by RUS is required for a power supply borrower to extend or add to its electric system if the extension or addition...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-11
... Hydrostatic Testing Provision of the Portable Fire Extinguishers Standard; Extension of the Office of... the information collection requirements contained in the Hydrostatic Testing provision of the Portable...
NASA Astrophysics Data System (ADS)
Jing, Changfeng; Liang, Song; Ruan, Yong; Huang, Jie
2008-10-01
During the urbanization process, when facing complex requirements of city development, ever-growing urban data, rapid development of planning business, and increasing planning complexity, a scalable, extensible urban planning management information system is urgently needed. PM2006 is such a system. In response to the status and problems in urban planning, the scalability and extensibility of PM2006 are introduced, including business-oriented workflow extensibility, the scalability of its DLL-based architecture, flexibility with respect to GIS platform and database, and scalability of data updating and maintenance. It is verified that the PM2006 system has good extensibility and scalability, meets the requirements of all levels of administrative divisions, and can adapt to ever-growing changes in urban planning business. At the end of this paper, the application of PM2006 in the Urban Planning Bureau of Suzhou city is described.
On impact testing of subsize Charpy V-notch type specimens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhail, A.S.; Nanstad, R.K.
1994-12-31
The potential for using subsize specimens to determine the actual properties of reactor pressure vessel steels is receiving increasing attention for improved vessel condition monitoring that could be beneficial for light-water reactor plant-life extension. This potential depends on the possibility of cutting samples of small volume from the internal surface of the pressure vessel for determination of the actual properties of the operating pressure vessel. Plant-life extension will require supplemental surveillance data that cannot be provided by the existing surveillance programs. Testing of subsize specimens manufactured from broken halves of previously tested surveillance Charpy V-notch (CVN) specimens offers an attractive means of extending existing surveillance programs. Using subsize CVN-type specimens requires the establishment of a specimen geometry that is adequate to obtain a ductile-to-brittle transition curve similar to that obtained from full-size specimens. This requires the development of a correlation of transition temperature and upper-shelf toughness between subsize and full-size specimens. The present study was conducted under the Heavy-Section Steel Irradiation Program. Different published approaches to the use of subsize specimens were analyzed, and five different geometries of subsize specimens were selected for testing and evaluation. The specimens were made from several types of pressure vessel steels with a wide range of yield strengths, transition temperatures, and upper-shelf energies (USEs). Effects of specimen dimensions, including depth, angle, and radius of notch, have been studied. The correlation of transition temperature determined from different types of subsize specimens and the full-size specimen is presented. A new procedure for transforming data from subsize specimens was developed and is presented.
Hendriks, Jan; Stojanovic, Ivan; Schasfoort, Richard B M; Saris, Daniël B F; Karperien, Marcel
2018-06-05
There is a large unmet need for reliable biomarker measurement systems for clinical application. Such systems should meet challenging requirements for large-scale use, including a large dynamic detection range, multiplexing capacity, and both high specificity and sensitivity. More importantly, these requirements need to apply to complex biological samples, which require extensive quality control. In this paper, we present the development of an enhancement detection cascade for surface plasmon resonance imaging (SPRi). The cascade applies an antibody sandwich assay, followed by neutravidin and a gold nanoparticle enhancement, for quantitative biomarker measurements in small volumes of complex fluids. We present a feasibility study both in simple buffers and in spiked equine synovial fluid with four cytokines, IL-1β, IL-6, IFN-γ, and TNF-α. Our enhancement cascade leads to an antibody-dependent improvement in sensitivity of up to 40,000-fold, resulting in a limit of detection as low as 50 fg/mL and a dynamic detection range of more than 7 logs. Additionally, measurements at these low concentrations are highly reliable, with intra- and interassay CVs between 2% and 20%. We subsequently showed this assay is suitable for multiplex measurements with good specificity and limited cross-reactivity. Moreover, we demonstrated robust detection of IL-6 and IL-1β in spiked undiluted equine synovial fluid with small variation compared to buffer controls. In addition, the availability of real-time measurements provides extensive quality control opportunities, essential for clinical applications. We therefore consider this method suitable for broad application in SPRi for multiplex biomarker detection in both research and clinical settings.
Wang, Yuker; Carlton, Victoria EH; Karlin-Neumann, George; Sapolsky, Ronald; Zhang, Li; Moorhead, Martin; Wang, Zhigang C; Richardson, Andrea L; Warren, Robert; Walther, Axel; Bondy, Melissa; Sahin, Aysegul; Krahe, Ralf; Tuna, Musaffe; Thompson, Patricia A; Spellman, Paul T; Gray, Joe W; Mills, Gordon B; Faham, Malek
2009-01-01
Background A major challenge facing DNA copy number (CN) studies of tumors is that most banked samples with extensive clinical follow-up information are Formalin-Fixed Paraffin Embedded (FFPE). DNA from FFPE samples generally underperforms or suffers high failure rates compared to fresh frozen samples because of DNA degradation and cross-linking during FFPE fixation and processing. As FFPE protocols may vary widely between labs and samples may be stored for decades at room temperature, an ideal FFPE CN technology should work on diverse sample sets. Molecular Inversion Probe (MIP) technology has been applied successfully to obtain high quality CN and genotype data from cell line and frozen tumor DNA. Since the MIP probes require only a small (~40 bp) target binding site, we reasoned they may be well suited to assess degraded FFPE DNA. We assessed CN with a MIP panel of 50,000 markers in 93 FFPE tumor samples from 7 diverse collections. For 38 FFPE samples from three collections we were also able to assess CN in matched fresh frozen tumor tissue. Results Using an input of 37 ng genomic DNA, we generated high quality CN data with MIP technology in 88% of FFPE samples from seven diverse collections. When matched fresh frozen tissue was available, the performance of FFPE DNA was comparable to that of DNA obtained from matched frozen tumor (genotype concordance averaged 99.9%), with only a modest loss in performance in FFPE. Conclusion MIP technology can be used to generate high quality CN and genotype data in FFPE as well as fresh frozen samples. PMID:19228381
Ferrone, Cristina R; Ting, David T; Shahid, Mohammed; Konstantinidis, Ioannis T; Sabbatino, Francesco; Goyal, Lipika; Rice-Stitt, Travis; Mubeen, Ayesha; Arora, Kshitij; Bardeesey, Nabeel; Miura, John; Gamblin, T Clark; Zhu, Andrew X; Borger, Darrell; Lillemoe, Keith D; Rivera, Miguel N; Deshpande, Vikram
2016-01-01
Intrahepatic cholangiocarcinoma (ICC) often is a diagnosis determined by exclusion. Distinguishing ICC from other metastatic adenocarcinomas based on histopathologic or immunohistochemical analysis often is difficult and requires an extensive workup. This study aimed to determine whether albumin, whose expression is restricted to the liver, has potential as a biomarker for ICC using a novel and highly sensitive RNA in situ hybridization (ISH) platform. Modified branched DNA probes were developed for albumin RNA ISH. The study evaluated 467 patient samples of primary and metastatic lesions. Of the 467 samples evaluated, 83 were ICCs, 42 were hepatocellular carcinomas (HCCs), and 332 were nonhepatic carcinomas including tumors arising from the perihilar region and bile duct, pancreas, stomach, esophagus, colon, breast, ovary, endometrium, kidney, and urinary bladder. Albumin RNA ISH was highly sensitive for cancers of liver origin, staining positive in 82 (99 %) of 83 ICCs and in 42 HCCs (100 %). Perihilar and distal bile duct carcinomas as well as carcinomas arising at other sites tested negative for albumin. Notably, 6 (22 %) of 27 intrahepatic tumors previously diagnosed as carcinomas of undetermined origin tested positive for albumin. Albumin RNA ISH is a sensitive and highly specific diagnostic tool for distinguishing ICC from metastatic adenocarcinoma to the liver or carcinoma of unknown origin. Albumin RNA ISH could replace the extensive diagnostic workup, leading to timely confirmation of the ICC diagnosis. Additionally, the assay could serve as a guide to distinguish ICC from perihilar adenocarcinoma.
Hunt, Kathleen E; Rolland, Rosalind M; Kraus, Scott D
2015-10-01
The North Atlantic right whale, Eubalaena glacialis (NARW), a critically endangered species that has been under intensive study for nearly four decades, provides an excellent case study for applying modern methods of conservation physiology to large whales. By combining long-term sighting histories of known individuals with physiological data from newer techniques (e.g., body condition estimated from photographs; endocrine status derived from fecal samples), physiological state and levels of stress can be estimated despite the lack of any method for nonlethal capture of large whales. Since traditional techniques for validating blood assays cannot be used in large whales, assays of fecal hormones have been validated using information on age, sex, and reproductive state derived from an extensive NARW photo-identification catalog. Using this approach, fecal glucocorticoids have been found to vary dramatically with reproductive state. It is therefore essential that glucocorticoid data be interpreted in conjunction with reproductive data. A case study correlating glucocorticoids with chronic noise is presented as an example. Keys to a successful research program for this uncatchable species have included: consistent population monitoring over decades, data-sharing across institutions, an extensive photo-identification catalog that documents individual histories, and consistent efforts at noninvasive collection of samples over years. Future research will require flexibility to adjust to changing distributions of populations. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
Modified dough preparation for Alveograph analysis with limited flour sample size
USDA-ARS?s Scientific Manuscript database
Dough rheological characteristics, such as resistance-to-extension and extensibility, obtained by alveograph testing are important traits for determination of wheat and flour quality. A challenging issue that faces wheat breeding programs and some wheat-research projects is the relatively large flou...
Boja, Emily S; Rodriguez, Henry
2012-04-01
Traditional shotgun proteomics, used to detect a mixture of hundreds to thousands of proteins through mass spectrometric analysis, has been the standard approach in research for profiling protein content in a biological sample, which could lead to the discovery of new (and all) protein candidates with diagnostic, prognostic, and therapeutic value. In practice, this approach requires significant resources and time, and does not necessarily represent the goal of the researcher, who would rather study a subset of such discovered proteins (including their variations or posttranslational modifications) under different biological conditions. In this context, targeted proteomics is playing an increasingly important role in the accurate measurement of protein targets in biological samples, in the hope of elucidating the molecular mechanisms of cellular function via the understanding of intricate protein networks and pathways. One such targeted approach, selected reaction monitoring (or multiple reaction monitoring) mass spectrometry (MRM-MS), offers the capability of measuring multiple proteins with higher sensitivity and throughput than shotgun proteomics. Developing and validating MRM-MS-based assays, however, is an extensive and iterative process, requiring a coordinated and collaborative effort by the scientific community through the sharing of publicly accessible data and datasets, bioinformatic tools, standard operating procedures, and well characterized reagents. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chua, Hoe-Chee; Lee, Hoi-Sim; Sng, Mui-Tiang
2006-01-13
Analysing nitrogen mustards and their degradation products in decontamination emulsions posed a significant challenge due to the different phases present in such matrices. Extensive sample preparation may be required to isolate target analytes. Furthermore, numerous reaction products are formed in the decontamination emulsion. A fast and effective qualitative screening procedure was developed for these compounds, using liquid chromatography-mass spectrometry (LC-MS). This eliminated the need for the additional sample handling and derivatisation that are required for gas chromatographic-mass spectrometric (GC-MS) analysis. A liquid chromatograph with a mixed-mode column and isocratic elution gave good chromatography. The feasibility of applying this technique for detecting these compounds in spiked water and decontamination emulsion was demonstrated. Detailed characterisation of the degradation products in these two matrices was carried out. The results demonstrated that N-methyldiethanolamine (MDEA), N-ethyldiethanolamine (EDEA) and triethanolamine (TEA) are not the major degradation products of their respective nitrogen mustards. Degradation profiles of nitrogen mustards in water were also established. In verification analysis, it is important not only to develop methods for the identification of the actual chemical agents; the methods must also encompass the degradation products of those agents so as to exclude false negatives. This study demonstrated the increasingly pivotal role that LC-MS plays in verification analysis.
Whitney, G. A.; Mansour, J. M.; Dennis, J. E.
2015-01-01
The mechanical loading environment encountered by articular cartilage in situ makes frictional-shear testing an invaluable technique for assessing engineered cartilage. Despite the important information that is gained from this testing, it remains under-utilized, especially for determining damage behavior. Currently, extensive visual inspection is required to assess damage; this is cumbersome and subjective. Tools to simplify, automate, and remove subjectivity from the analysis may increase the accessibility and usefulness of frictional-shear testing as an evaluation method. The objective of this study was to determine if the friction signal could be used to detect damage that occurred during the testing. This study proceeded in two phases: first, a simplified model of biphasic lubrication that does not require knowledge of interstitial fluid pressure was developed. In the second phase, frictional-shear tests were performed on 74 cartilage samples, and the simplified model was used to extract characteristic features from the friction signals. Using support vector machine classifiers, the extracted features were able to detect damage with a median accuracy of approximately 90%. The accuracy remained high even in samples with minimal damage. In conclusion, the friction signal acquired during frictional-shear testing can be used to detect resultant damage to a high level of accuracy. PMID:25691395
Nakazawa, Hiroyuki; Iwasaki, Yusuke; Ito, Rie
2014-01-01
Our modern society has created a large number of chemicals that are used for the production of everyday commodities including toys, food packaging, cosmetic products, and building materials. We enjoy a comfortable and convenient lifestyle with access to these items. In addition, in specialized areas such as experimental science and various medical fields, laboratory equipment and devices that are manufactured using a wide range of chemical substances are also extensively employed. The association between human exposure to trace hazardous chemicals and an increased incidence of endocrine disease has been recognized. The evaluation of human exposure to such endocrine disrupting chemicals is therefore imperative, and the determination of exposure levels requires the analysis of human biological materials, such as blood and urine. To obtain as much information as possible from limited sample sizes, highly sensitive and reliable analytical methods are required for exposure assessments. The present review focuses on effective analytical methods for the quantification of bisphenol A (BPA), alkylphenols (APs), phthalate esters (PEs), and perfluorinated chemicals (PFCs), which are chemicals used in the production of everyday commodities. Using data obtained from liquid chromatography/mass spectrometry (LC/MS) and LC/MS/MS analyses, assessments of the risks to humans are also presented based on the estimated levels of exposure to PFCs.
Design of a composite wing extension for a general aviation aircraft
NASA Technical Reports Server (NTRS)
Adney, P. S.; Horn, W. J.
1984-01-01
A composite wing extension was designed for a typical general aviation aircraft to improve lift curve slope, dihedral effect, and lift-to-drag ratio. Advanced composite materials were used in the design to evaluate their use as primary structural components in general aviation aircraft. Extensive wind tunnel tests were used to evaluate six extension shapes. The shape chosen as the best was 28 inches long with a total area of 17 square feet. Subsequent flight tests showed the wing extension's predicted aerodynamic improvements to be correct. The structural design of the wing extension consisted of a hybrid laminate carbon core with outer layers of Kevlar, laid up over a foam interior which acted as an internal support. The laminate skin of the wing extension was designed from strength requirements, and the foam core was included to prevent buckling. A lap joint was recommended to attach the wing extension to the main wing structure.
7 CFR 3419.5 - Certification of matching funds.
Code of Federal Regulations, 2010 CFR
2010-01-01
....5 Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE MATCHING FUNDS REQUIREMENT FOR AGRICULTURAL RESEARCH AND EXTENSION FORMULA FUNDS AT 1890 LAND-GRANT INSTITUTIONS, INCLUDING TUSKEGEE UNIVERSITY, AND AT...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-27
... education system. Focused on national issues, its purpose is to represent the Secretary of Agriculture and... research, extension, and education. Before awards can be made, certain information is required from...
A Novel Videography Method for Generating Crack-Extension Resistance Curves in Small Bone Samples
Katsamenis, Orestis L.; Jenkins, Thomas; Quinci, Federico; Michopoulou, Sofia; Sinclair, Ian; Thurner, Philipp J.
2013-01-01
Assessment of bone quality is an emerging solution for quantifying the effects of bone pathology or treatment. Perhaps one of the most important parameters characterising bone quality is the toughness behaviour of bone. Particularly, fracture toughness, is becoming a popular means for evaluating bone quality. The method is moving from a single value approach that models bone as a linear-elastic material (using the stress intensity factor, K) towards full crack extension resistance curves (R-curves) using a non-linear model (the strain energy release rate in J-R curves). However, for explanted human bone or small animal bones, there are difficulties in measuring crack-extension resistance curves due to size constraints at the millimetre and sub-millimetre scale. This research proposes a novel “whitening front tracking” method that uses videography to generate full fracture resistance curves in small bone samples where crack propagation cannot typically be observed. Here we present this method on sharp edge notched samples (<1 mm×1 mm×Length) prepared from four human femora tested in three-point bending. Each sample was loaded in a mechanical tester with the crack propagation recorded using videography and analysed using an algorithm to track the whitening (damage) zone. Using the “whitening front tracking” method, full R-curves and J-R curves could be generated for these samples. The curves for this antiplane longitudinal orientation were similar to those found in the literature, being between the published longitudinal and transverse orientations. The proposed technique shows the ability to generate full “crack” extension resistance curves by tracking the whitening front propagation to overcome the small size limitations and the single value approach. PMID:23405186
Code of Federal Regulations, 2011 CFR
2011-10-01
... the first 10 years of my lease for BLM to grant the initial extension of the primary term of my lease... Lease Terms and Extensions § 3207.11 What work am I required to perform during the first 10 years of my... section, your lease will expire at the end of the 10-year primary term. (e) If you complied with paragraph...
Friendly Extensible Transfer Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.
2016-04-15
Often data transfer software is designed to meet specific requirements or apply to specific environments, and adding functionality frequently requires source code integration. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without need of source code) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).
Calabria, Andrea; Spinozzi, Giulio; Benedicenti, Fabrizio; Tenderini, Erika; Montini, Eugenio
2015-01-01
Many biological laboratories that deal with genomic samples are facing the problem of sample tracking, both for pure laboratory management and for efficiency. Our laboratory exploits PCR techniques and Next Generation Sequencing (NGS) methods to perform high-throughput integration site monitoring in different clinical trials and scientific projects. Because of the huge number of samples that we process every year, which result in hundreds of millions of sequencing reads, we need to standardize data management and tracking systems, building up a scalable and flexible structure with web-based interfaces, usually called a Laboratory Information Management System (LIMS). We started by collecting end-users' requirements, composed of desired functionalities of the system and Graphical User Interfaces (GUIs), and then evaluated available tools that could address our requirements, spanning from pure LIMS to Content Management Systems (CMS) up to enterprise information systems. Our analysis identified ADempiere ERP, an open-source Enterprise Resource Planning system written in Java J2EE, as the best software; it also natively implements some highly desirable technological features, such as the high usability and modularity that grant use-case flexibility and software scalability for custom solutions. We extended and customized ADempiere ERP to fulfil LIMS requirements and developed adLIMS. It has been validated by our end-users, verifying functionalities and GUIs through test cases for PCR samples and pre-sequencing data, and it is currently in use in our laboratories. adLIMS implements authorization and authentication policies, allowing multiple-user management and definition of roles that enable specific permissions, operations and data views for each user. For example, adLIMS allows creating sample sheets from stored data using available export operations. This simplicity and process standardization may avoid manual errors and information backtracking, features that are not granted when records are kept in files or spreadsheets. adLIMS aims to combine sample tracking and data reporting features with higher accessibility and usability of GUIs, thus saving time on repetitive laboratory tasks and reducing errors with respect to manual data collection methods. Moreover, adLIMS implements automated data entry, exploiting sample data multiplexing and parallel/transactional processing. adLIMS is natively extensible to cope with laboratory automation through platform-dependent API interfaces, and could be extended to genomic facilities thanks to its ERP functionalities.
Epidural extension failure in obese women is comparable to that of non-obese women.
Eley, V A; Chin, A; Tham, I; Poh, J; Aujla, P; Glasgow, E; Brown, H; Steele, K; Webb, L; van Zundert, A
2018-07-01
Management of labor epidurals in obese women is difficult and extension to surgical anesthesia is not always successful. Our previous retrospective pilot study found epidural extension was more likely to fail in obese women. This study used a prospective cohort to compare the failure rate of epidural extension in obese and non-obese women and to identify risk factors for extension failure. One hundred obese participants (Group O, body mass index ≥ 40 kg/m²) were prospectively identified and allocated two sequential controls (Group C, body mass index ≤ 30 kg/m²). All subjects utilized epidural labor analgesia and subsequently required anesthesia for cesarean section. The primary outcome measure was failure of the labor epidural to be used as the primary anesthetic technique. Risk factors for extension failure were identified using Chi-squared and logistic regression. The odds ratio (OR) of extension failure was 1.69 in Group O (20% vs. 13%; 95% CI: 0.88-3.21, P = 0.11). Risk factors for failure in obese women included ineffective labor analgesia requiring anesthesiologist intervention (OR 3.94, 95% CI: 1.16-13.45, P = 0.028) and BMI > 50 kg/m² (OR 3.42, 95% CI: 1.07-10.96, P = 0.038). The failure rate of epidural extension did not differ significantly between the groups. Further research is needed to determine the influence of body mass index > 50 kg/m² on epidural extension for cesarean section. © 2018 The Authors. Acta Anaesthesiologica Scandinavica published by John Wiley & Sons Ltd on behalf of Acta Anaesthesiologica Scandinavica Foundation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Z. J.; Wells, D.; Green, J.
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the nuclide library and the related formulas behind it, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
Flexible approach to vibrational sum-frequency generation using shaped near-infrared light
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhury, Azhad U.; Liu, Fangjie; Watson, Brianna R.
We describe a new approach that expands the utility of vibrational sum-frequency generation (vSFG) spectroscopy using shaped near-infrared (NIR) laser pulses. Here, we demonstrate that arbitrary pulse shapes can be specified to match experimental requirements without the need for changes to the optical alignment. In this way, narrowband NIR pulses as long as 5.75 ps are readily generated, with a spectral resolution of about 2.5 cm⁻¹, an improvement of approximately a factor of 3 compared to a typical vSFG system. Moreover, the utility of having complete control over the NIR pulse characteristics is demonstrated through nonresonant background suppression from a metallic substrate by generating an etalon waveform in the pulse shaper. The flexibility afforded by switching between arbitrary NIR waveforms at the sample position with the same instrument geometry expands the type of samples that can be studied without extensive modifications to existing apparatuses or large investments in specialty optics.
Optimal structure and parameter learning of Ising models
Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant; ...
2018-03-16
Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community has shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. The efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. This study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.
Effect of Next-Generation Exome Sequencing Depth for Discovery of Diagnostic Variants.
Kim, Kyung; Seong, Moon-Woo; Chung, Won-Hyong; Park, Sung Sup; Leem, Sangseob; Park, Won; Kim, Jihyun; Lee, KiYoung; Park, Rae Woong; Kim, Namshin
2015-06-01
Sequencing depth, which is directly related to the cost and time required for the generation, processing, and maintenance of next-generation sequencing data, is an important factor in the practical utilization of such data in clinical fields. Unfortunately, identifying an exome sequencing depth adequate for clinical use is a challenge that has not been addressed extensively. Here, we investigate the effect of exome sequencing depth on the discovery of sequence variants for clinical use. Toward this, we sequenced ten germ-line blood samples from breast cancer patients on the Illumina platform GAII(x) at a high depth of ~200×. We observed that most function-related diverse variants in the human exonic regions could be detected at a sequencing depth of 120×. Furthermore, investigation using a diagnostic gene set showed that the number of clinical variants identified using exome sequencing reached a plateau at an average sequencing depth of about 120×. Moreover, the phenomena were consistent across the breast cancer samples.
Dehydration-driven stress transfer triggers intermediate-depth earthquakes
Ferrand, Thomas P.; Hilairet, Nadège; Incel, Sarah; Deldicque, Damien; Labrousse, Loïc; Gasc, Julien; Renner, Joerg; Wang, Yanbin; Green II, Harry W.; Schubnel, Alexandre
2017-01-01
Intermediate-depth earthquakes (30–300 km) have been extensively documented within subducting oceanic slabs, but their mechanics remains enigmatic. Here we decipher the mechanism of these earthquakes by performing deformation experiments on dehydrating serpentinized peridotites (synthetic antigorite-olivine aggregates, minerals representative of subduction zones lithologies) at upper mantle conditions. At a pressure of 1.1 gigapascals, dehydration of deforming samples containing only 5 vol% of antigorite suffices to trigger acoustic emissions, a laboratory-scale analogue of earthquakes. At 3.5 gigapascals, acoustic emissions are recorded from samples with up to 50 vol% of antigorite. Experimentally produced faults, observed post-mortem, are sealed by fluid-bearing micro-pseudotachylytes. Microstructural observations demonstrate that antigorite dehydration triggered dynamic shear failure of the olivine load-bearing network. These laboratory analogues of intermediate-depth earthquakes demonstrate that little dehydration is required to trigger embrittlement. We propose an alternative model to dehydration-embrittlement in which dehydration-driven stress transfer, rather than fluid overpressure, causes embrittlement. PMID:28504263
A short treatise concerning a musical approach for the interpretation of gene expression data
Staege, Martin S.
2015-01-01
Recent technical developments allow genome-wide and near-complete analysis of gene expression in a given sample, e.g., by use of high-density DNA microarrays or next-generation sequencing. The generated data structure is usually multi-dimensional and requires extensive processing, not only for analysis but also for presentation of the results. Today, such data are usually presented graphically, e.g., in the form of heat maps. In the present paper, we propose an alternative form of analysis and presentation based on the transformation of gene expression data into sounds that are characterized by their frequency (pitch) and tone duration. Using DNA microarray data from a panel of neuroblastoma and Ewing sarcoma cell lines as well as from Hodgkin’s lymphoma cell lines and normal B cells, we demonstrate that this Gene Expression Music Algorithm (GEMusicA) can be used for discrimination between samples with different biology and for the characterization of differentially expressed genes. PMID:26472273
Wang, Qiuyan; Wu, Huili; Wang, Anming; Du, Pengfei; Pei, Xiaolin; Li, Haifeng; Yin, Xiaopu; Huang, Lifeng; Xiong, Xiaolong
2010-01-01
DNA family shuffling is a powerful method for enzyme engineering, which utilizes recombination of naturally occurring functional diversity to accelerate laboratory-directed evolution. However, the use of this technique has been hindered by the scarcity of family genes with the required level of sequence identity in the genome database. We describe here a strategy for collecting metagenomic homologous genes for DNA shuffling from environmental samples by truncated metagenomic gene-specific PCR (TMGS-PCR). Using identified metagenomic gene-specific primers, twenty-three 921-bp truncated lipase gene fragments, which shared 64–99% identity with each other and formed a distinct subfamily of lipases, were retrieved from 60 metagenomic samples. These lipase genes were shuffled, and selected active clones were characterized. The chimeric clones show extensive functional and genetic diversity, as demonstrated by functional characterization and sequence analysis. Our results indicate that homologous sequences of genes captured by TMGS-PCR can be used as suitable genetic material for DNA family shuffling with broad applications in enzyme engineering. PMID:20962349
An evolution based biosensor receptor DNA sequence generation algorithm.
Kim, Eungyeong; Lee, Malrey; Gatton, Thomas M; Lee, Jaewan; Zang, Yupeng
2010-01-01
A biosensor is composed of a bioreceptor, an associated recognition molecule, and a signal transducer that can selectively detect target substances for analysis. DNA based biosensors utilize receptor molecules that allow hybridization with the target analyte. However, most DNA biosensor research uses oligonucleotides as the target analytes and does not address the potential problems of real samples. The identification of recognition molecules suitable for real target analyte samples is an important step towards further development of DNA biosensors. This study examines the characteristics of DNA used as bioreceptors and proposes a hybrid evolution-based DNA sequence generating algorithm, based on DNA computing, to identify suitable DNA bioreceptor recognition molecules for stable hybridization with real target substances. The Traveling Salesman Problem (TSP) approach is applied in the proposed algorithm to evaluate the safety and fitness of the generated DNA sequences. This approach improves efficiency and stability for enhanced and variable-length DNA sequence generation and allows extension to generation of variable-length DNA sequences with diverse receptor recognition requirements.
Two modes of orogenic collapse of the Pamir plateau recorded by titanite
NASA Astrophysics Data System (ADS)
Stearns, M. A.; Hacker, B. R.; Ratschbacher, L.; Rutte, D.; Kylander-Clark, A. R.
2013-12-01
Processes that operate in the mid- to lower crust during and following continent-continent collision are important for understanding how orogenic plateaux transition from thickening to collapse. In the central and southern Pamir, mid- to lower crustal rocks crop out in two belts of extensional domes. The central Pamir domes were exhumed by symmetrical N-S extension. In contrast, the southern Pamir domes were exhumed by asymmetrical top-to-the-south (NNW-SSE) extension via a rolling-hinge detachment. To investigate the high-temperature exhumation history, titanites were dated using LASS (laser ablation split-stream ICP-MS): a multi-collector ICP-MS collected U-Pb isotopic ratios while a single-collector ICP-MS measured trace-element abundances. The data indicate that the central Pamir domes began exhumation synchronously at ~17 Ma. Titanites from the southern Pamir record two periods of protracted (re)crystallization: older metamorphic dates ranging from ~35-18 Ma and younger igneous and metamorphic dates from ~15-7 Ma. Samples with single populations of titanite dates are present throughout both groups. Samples with more complex date populations typically have distinct trace-element (e.g., Sr, Y, Zr, and Nb) groups that can be used to distinguish different date populations (e.g., older dates may have higher Zr and younger dates lower Zr). The distinct early exhumation histories of the north and south Pamir require either a diachronous single process or two semi-independent processes. The N to S sequence of exhumation, ranges of dates, and overall extension directions may be related to two important plate-tectonic events inferred from seismic data: 1) breakoff of the northward subducting Indian slab around ~20 Ma, and 2) southward subduction and northwestward rollback of the Asian lithosphere between ~15-10 Ma based on geodetic convergence rates and Benioff zone length. We interpret these two lithospheric-detachment events to have driven the exhumation in the Pamir by changing the gravitational potential energy and boundary forces of the plateau.
Extended 60 μm Emission from Nearby Mira Variables
NASA Astrophysics Data System (ADS)
Bauer, W. H.; Stencel, R. E.
1993-01-01
Circumstellar dust envelopes around some optically visible late-type stars are so extensive that they are detectable as extended at an arc-minute scale by the IRAS survey observations (Stencel, Pesce and Bauer 1988, Astron. J 95, 141; Hawkins 1990, Astron. Ap. 229, L8). The width of the IRAS scan profiles at 10% of peak intensity is an indicator of source extension. Wyatt and Cahn (1983, Ap. J. 275, 225) presented a sample of 124 Mira variables in the solar neighborhood. Of this sample, 11 Miras which show silicate emission are bright enough at 60 microns for a significant determination of the width of a scan at 10% of peak flux. Individual scans and maps were examined in order to determine whether any observed extension was associated with the central star. Five stars showed significant extension apparently due to mass loss from the central star: R Leo, o Cet, U Ori, R Cas and R Hor. IRAS LRS spectra, point source fluxes and observed extensions of these sources are compared to the predictions of model dust shells which assume steady mass loss. This work was supported in part by NASA grant NAG 5-1213 to Wellesley College.
Screenometer: a device for sampling vegetative screening in forested areas
Victor A. Rudis
1985-01-01
A device for estimating the degree to which vegetation and other obstructions screen forested areas has been adapted to an extensive sampling design for forest surveys. Procedures are recommended to assure that uniform measurements can be made. Examination of sources of sampling variation (observers, points within sampled locations, series of observations within points...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-31
... Activities: Extension of a Currently Approved Collection; Comments Requested National Crime Victimization... collection. (2) Title of the Form/Collection: National Crime Victimization Survey (NCVS). (3) Agency form... NCVS sampled households located throughout the United States. The National Crime Victimization Survey...
46 CFR 11.707 - Examination requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 1 2010-10-01 2010-10-01 false Examination requirements. 11.707 Section 11.707 Shipping... OFFICER ENDORSEMENTS Professional Requirements for Pilots § 11.707 Examination requirements. (a) An... required to pass the examination described in subpart I of this part. (b) An applicant for an extension of...
Curation and Analysis of Samples from Comet Wild-2 Returned by NASA's Stardust Mission
NASA Technical Reports Server (NTRS)
Nakamura-Messenger, Keiko; Walker, Robert M.
2015-01-01
The NASA Stardust mission returned the first direct samples of a cometary coma from comet 81P/Wild-2 in 2006. Intact capture of samples encountered at 6 km/s was enabled by the use of aerogel, an ultralow-density silica polymer. Approximately 1000 particles were captured, with micron and submicron materials distributed along mm-scale tracks. This sample collection method and the fine scale of the samples posed new challenges to the curation and cosmochemistry communities. Sample curation involved extensive, detailed photo-documentation and delicate micro-surgery to remove particles without loss from the aerogel tracks. This work had to be performed in a highly clean facility to minimize the potential for contamination. JSC Curation provided samples ranging from entire tracks to micrometer-sized particles to external investigators. From the analysis perspective, distinguishing cometary materials from aerogel and identifying potential alteration from the capture process were essential. Here, transmission electron microscopy (TEM) proved to be the key technique that made this possible. Based on TEM work by ourselves and others, a variety of surprising findings were reported, such as the observation of high-temperature phases resembling those found in meteorites, rare intact presolar grains, and scarce organic grains and submicrometer silicates. An important lesson from this experience is that curation and analysis teams must work closely together to understand the requirements and challenges of each task. The Stardust Mission also laid an important foundation for future sample returns, including OSIRIS-REx, Hayabusa II, and future cometary nucleus sample return missions.
Narragansett Bay (NB) has been extensively sampled over the last 50 years by various government agencies, academic institutions, and private groups. To date, most spatial research conducted within the estuary has employed deterministic sampling designs. Several studies have used ...
Reddington, C. L.; Carslaw, K. S.; Stier, P.; ...
2017-09-01
The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by combining new methodologies for quantifying model uncertainty, to create an extensive global dataset of aerosol in situ microphysical and chemical measurements, and to develop new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zanzoni, Serena; D'Onofrio, Mariapina; Molinari, Henriette
2012-10-26
Highlights: (1) Bile acid binding proteins from different constructs retain structural integrity. (2) NMR 15N-T1 relaxation data of BABPs show differences if the LVPR extension is present. (3) Deviations from a 15N-T1/molecular-weight calibration curve indicate aggregation. -- Abstract: The use of a recombinant protein to investigate the function of the native molecule requires that the former be obtained with the same amino acid sequence as the template. However, in many cases a few additional residues are artificially introduced for cloning or purification purposes, possibly resulting in altered physico-chemical properties that may escape routine characterization. For example, increased aggregation propensity without visible protein precipitation is hardly detected by most analytical techniques, but its investigation may be of great importance for optimizing the yield of recombinant protein production in biotechnological and structural biology applications. In this work we show that bile acid binding proteins incorporating the common C-terminal LeuValProArg extension display different hydrodynamic properties from those of the corresponding molecules without such additional amino acids. The proteins were produced enriched in nitrogen-15 for analysis via heteronuclear NMR spectroscopy. Residue-specific spin relaxation rates were measured and related to rotational tumbling time and molecular size. While the native-like recombinant proteins show spin-relaxation rates in agreement with those expected for monomeric globular proteins of their mass, our data indicate the presence of larger adducts in samples of proteins with very short amino acid extensions. This approach is proposed as a further screening method for the quality assessment of biotechnological protein products.
Analysis of routine pilot-controller communication
NASA Technical Reports Server (NTRS)
Morrow, Daniel G.; Lee, Alfred; Rodvold, Michelle
1990-01-01
Although pilot-controller communication is central to aviation safety, this area of aviation human factors has not been extensively researched. Most research has focused on what kinds of communication problems occur. A more complete picture of communication problems requires understanding how communication usually works in routine operations. A sample of routine pilot-controller communication in the TRACON environment is described. After describing several dimensions of routine communication, three kinds of communication problems are treated: inaccuracies such as incorrect readbacks, procedural deviations such as missing callsigns and readbacks, and nonroutine transactions where pilot and controller must deal with misunderstandings or other communication problems. Preliminary results suggest these problems are not frequent events in daily operations. However, analysis of the problems that do occur suggest some factors that may cause them.
5 CFR 2634.903 - General requirements, filing dates, and extensions.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... The agency may request that the individual update such a report if more than six months has expired... may grant such individual a filing extension to last no longer than 90 days after the last day of: (i...
5 CFR 2634.903 - General requirements, filing dates, and extensions.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... The agency may request that the individual update such a report if more than six months has expired... may grant such individual a filing extension to last no longer than 90 days after the last day of: (i...
5 CFR 2634.903 - General requirements, filing dates, and extensions.
Code of Federal Regulations, 2013 CFR
2013-01-01
.... The agency may request that the individual update such a report if more than six months has expired... may grant such individual a filing extension to last no longer than 90 days after the last day of: (i...
5 CFR 470.315 - Project modification and extension.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Project modification and extension. 470.315 Section 470.315 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PERSONNEL MANAGEMENT RESEARCH PROGRAMS AND DEMONSTRATIONS PROJECTS Regulatory Requirements Pertaining to...
78 FR 30333 - Proposed Extension of Information Collection Requests Submitted for Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-22
...: Employee Benefits Security Administration, Department of Labor. Title: Notice Requirements of the Health... DEPARTMENT OF LABOR Employee Benefits Security Administration Proposed Extension of Information Collection Requests Submitted for Public Comment AGENCY: Employee Benefits Security Administration...
The Resilience Scale for Adults: Construct Validity and Measurement in a Belgian Sample
ERIC Educational Resources Information Center
Hjemdal, Odin; Friborg, Oddgeir; Braun, Stephanie; Kempenaers, Chantal; Linkowski, Paul; Fossion, Pierre
2011-01-01
The Resilience Scale for Adults (RSA) was developed and has been extensively validated in Norwegian samples. The purpose of this study was to explore the construct validity of the Resilience Scale for Adults in a French-speaking Belgian sample and test measurement invariance between the Belgian and a Norwegian sample. A Belgian student sample (N =…
An extension of incidental teaching procedures to reading instruction for autistic children.
McGee, G G; Krantz, P J; McClannahan, L E
1986-01-01
In an extension of incidental teaching procedures to reading instruction, two autistic children acquired functional sight-word reading skills in the context of a play activity. Children gained access to preferred toys by selecting the label of the toy in tasks requiring increasingly complex visual discriminations. In addition to demonstrating rapid acquisition of 5-choice discriminations, they showed comprehension on probes requiring reading skills to locate toys stored in labeled boxes. Also examined was postteaching transfer across stimulus materials and response modalities. Implications are that extensions of incidental teaching to new response classes may produce the same benefits documented in communication training, in terms of producing generalization concurrent with skill acquisition in the course of child-preferred activities. PMID:3733586
Quantitative interpretations of Visible-NIR reflectance spectra of blood.
Serebrennikova, Yulia M; Smith, Jennifer M; Huffman, Debra E; Leparc, German F; García-Rubio, Luis H
2008-10-27
This paper illustrates the implementation of a new theoretical model for rapid quantitative analysis of the Vis-NIR diffuse reflectance spectra of blood cultures. This new model is based on photon diffusion theory and Mie scattering theory, formulated to account for multiple scattering populations and absorptive components. The study stresses the significance of a thorough solution of the scattering and absorption problem in order to accurately resolve the optically relevant parameters of blood culture components. With the advantages of being calibration-free and computationally fast, the new model has two basic requirements. First, wavelength-dependent refractive indices of the basic chemical constituents of blood culture components are needed. Second, multi-wavelength measurements are required, or at least measurements at a number of characteristic wavelengths equal to the degrees of freedom (i.e., the number of optically relevant parameters) of the blood culture system. The blood culture analysis model was tested with a large number of diffuse reflectance spectra of blood culture samples characterized by an extensive range of the relevant parameters.
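The degrees-of-freedom requirement in the second point can be made concrete with a toy inversion: with at least as many measured wavelengths as unknown parameters, a least-squares fit is well posed. The sketch below uses a made-up two-component absorbance model, not the authors' photon-diffusion/Mie formulation.

```python
# Toy inversion illustrating the degrees-of-freedom requirement: the fit has
# three unknown parameters, so at least three measured wavelengths are needed;
# here nine are used. The model is a hypothetical stand-in, NOT the paper's
# photon-diffusion/Mie scattering formulation.
import numpy as np
from scipy.optimize import least_squares

wavelengths = np.linspace(500.0, 900.0, 9)        # nm; 9 measurements, 3 unknowns

def model(params, wl):
    c_a, c_b, offset = params                     # three unknown parameters
    return c_a * (1000.0 / wl) + c_b * np.exp(-wl / 400.0) + offset

true_params = np.array([2.0, 5.0, 0.1])
rng = np.random.default_rng(0)
observed = model(true_params, wavelengths) + rng.normal(0, 1e-3, wavelengths.size)

fit = least_squares(lambda p: model(p, wavelengths) - observed,
                    x0=np.array([1.0, 1.0, 0.0]))
print(fit.x)                                      # recovers ~[2.0, 5.0, 0.1]
```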
Rawat, Naveen; Gudyaka, Russel; Kumar, Mohit; Joshi, Bharat; Santhanam, Kalathur S V
2008-04-01
This paper describes the thermal oxidative behavior of atomized iron or atomized cobalt in the presence of multiwalled carbon nanotubes (MWCNT). Thermogravimetric analysis shows that thermal oxidation of atomized iron starts at about 500 °C; this oxidation is absent when the atomized iron is sintered with multiwalled carbon nanotubes. The thermal oxidation of iron in the sintered samples requires the collapse of the multiwalled carbon nanotubes. A similar behavior is observed with atomized cobalt, whose oxidation likewise requires the collapse of the nanotubes. This thermal oxidative shift is interpreted as the atomized iron or cobalt atoms experiencing extensive overlap and a confinement effect with the multiwalled carbon nanotubes, causing a spin transfer. This confinement effect is suggested to transform the outermost electronic distribution of iron from 3d⁶4s² to an effective configuration of 3d⁸4s⁰, and that of cobalt from 3d⁷4s² to 3d⁹4s⁰, producing a spintronic effect.
Munro, Peter R.T.; Ignatyev, Konstantin; Speller, Robert D.; Olivo, Alessandro
2013-01-01
X-ray phase contrast imaging is a very promising technique which may lead to significant advancements in medical imaging. One of the impediments to the clinical implementation of the technique is the general requirement to have an x-ray source of high coherence. The radiation physics group at UCL is currently developing an x-ray phase contrast imaging technique which works with laboratory x-ray sources. Validation of the system requires extensive modelling of relatively large samples of tissue. To aid this, we have undertaken a study of when geometrical optics may be employed to model the system in order to avoid the need to perform a computationally expensive wave optics calculation. In this paper, we derive the relationship between the geometrical and wave optics model for our system imaging an infinite cylinder. From this model we are able to draw conclusions regarding the general applicability of the geometrical optics approximation. PMID:20389424
Multiplex cDNA quantification method that facilitates the standardization of gene expression data
Gotoh, Osamu; Murakami, Yasufumi; Suyama, Akira
2011-01-01
Microarray-based gene expression measurement is one of the major methods for transcriptome analysis. However, current microarray data are substantially affected by microarray platforms and RNA references because the microarray method provides merely the relative amounts of gene expression levels. Therefore, valid comparisons of microarray data require standardized platforms, internal and/or external controls, and complicated normalizations. These requirements impose limitations on the extensive comparison of gene expression data. Here, we report an effective approach to removing these unfavorable limitations by measuring the absolute amounts of gene expression levels on common DNA microarrays. We have developed a multiplex cDNA quantification method called GEP-DEAN (Gene expression profiling by DCN-encoding-based analysis). The method was validated by using chemically synthesized DNA strands of known quantities and cDNA samples prepared from mouse liver, demonstrating that the absolute amounts of cDNA strands were successfully measured with a sensitivity of 18 zmol in a highly multiplexed manner in 7 h. PMID:21415008
An argument for mechanism-based statistical inference in cancer
Ochs, Michael; Price, Nathan D.; Tomasetti, Cristian; Younes, Laurent
2015-01-01
Cancer is perhaps the prototypical systems disease, and as such has been the focus of extensive study in quantitative systems biology. However, translating these programs into personalized clinical care remains elusive and incomplete. In this perspective, we argue that realizing this agenda—in particular, predicting disease phenotypes, progression and treatment response for individuals—requires going well beyond standard computational and bioinformatics tools and algorithms. It entails designing global mathematical models over network-scale configurations of genomic states and molecular concentrations, and learning the model parameters from limited available samples of high-dimensional and integrative omics data. As such, any plausible design should accommodate: biological mechanism, necessary for both feasible learning and interpretable decision making; stochasticity, to deal with uncertainty and observed variation at many scales; and a capacity for statistical inference at the patient level. This program, which requires a close, sustained collaboration between mathematicians and biologists, is illustrated in several contexts, including learning biomarkers, metabolism, cell signaling, network inference and tumorigenesis. PMID:25381197
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-19
... Activities: Country of Origin Marking Requirements for Containers or Holders AGENCY: U.S. Customs and Border... of Origin Marking Requirements for Containers or Holders. This is a proposed extension of an... Requirements for Containers or Holders.
Next Generation Microbiology Requirements
NASA Technical Reports Server (NTRS)
Ott, C. M.; Oubre, C. M.; Elliott, T. F.; Castro, V. A.; Pierson, D. L.
2012-01-01
As humans continue to explore deep into space, microorganisms will travel with them. The primary means to mitigate the risk of infectious disease is a combination of prudent spacecraft design and rigorous operational controls. The effectiveness of these methods is evaluated by microbiological monitoring of spacecraft, food, water, and the crew that is performed preflight, in-flight, and post-flight. Current NASA requirements associated with microbiological monitoring are based on culture-based methodology where microorganisms are grown on a semi-solid growth medium and enumerated. Subsequent identification of the organisms requires specialized labor and large equipment, which historically has been performed on Earth. Requirements that rely strictly on culture-based units limit the use of non-culture based monitoring technology. Specifically, the culture-based "measurement criteria" are Colony Forming Units (CFU, representing the growth of one microorganism at a single location on the agar medium) per a given volume, area, or sample size. As the CFU unit by definition is culture-based, these requirements limit alternative technologies for spaceflight applications. As spaceflight missions such as those to Mars extend further into space, culture-based technology will become difficult to implement due to the (a) limited shelf life of the culture media, (b) mass/volume necessary to carry these consumables, and (c) problems associated with the production of biohazardous material in the habitable volume of the spacecraft. In addition, an extensive amount of new knowledge has been obtained during the Space Shuttle, NASA-Mir, and International Space Station Programs, which gave direction for new or modified microbial control requirements for vehicle design and mission operations. The goal of this task is to develop and recommend a new set of requirements for vehicle design and mission operations, including microbiological monitoring, based upon "lessons learned" and new technology. During 2011, this study focused on evaluating potable water requirements by assembling a forum of internal and external experts from NASA, other federal agencies, and academia. Key findings from this forum included: (1) Preventive design and operational strategies should be stringent and the primary focus of NASA's mitigation efforts, as they are cost effective and can be attained with conventional technology. (2) Microbial monitoring hardware should be simple and must be able to measure the viability of microorganisms in a sample. Multiple monitoring technologies can be utilized as long as the microorganisms being identified can also be confirmed as viable. (3) Evidence showing alterations in the crew immune function and microbial virulence complicates risk assessments and creates the need for very conservative requirements. (4) One key source of infectious agents will always be the crew, and appropriate preventative measures should be taken preflight. (5) Water systems should be thoroughly disinfected (sterilized if possible) preflight and retain a residual biocide throughout the mission. Future forums will cover requirements for other types of samples, specifically spaceflight food and environmental samples, such as vehicle air and vehicle and cargo surfaces. An interim report on the potable water forum has been delivered to the Human Research Program with a final report on the recommendations for all sample types being delivered in September 2013.
Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A
2009-12-16
Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
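To make the retention-index idea concrete, here is a minimal Python sketch of RI-based metabolite matching. The marker times and library values are invented, and this shows only the general concept, not TargetSearch's actual iterative correction algorithm.

```python
# Minimal sketch of retention-index (RI) matching as used conceptually in
# GC-MS pre-processing: observed retention times are converted to indices via
# marker compounds, then peaks are matched to a library within an RI window.
import numpy as np

# Retention times (s) and assigned indices of marker compounds (assumed values)
marker_rt = np.array([252.0, 311.0, 369.0, 427.0, 480.0])
marker_ri = np.array([1200.0, 1300.0, 1400.0, 1500.0, 1600.0])

def to_ri(rt):
    """Piecewise-linear interpolation of retention time onto the RI scale."""
    return np.interp(rt, marker_rt, marker_ri)

library = {"alanine": 1362.0, "serine": 1528.0}   # hypothetical library RIs

def match(peak_rt, window=5.0):
    ri = to_ri(peak_rt)
    hits = {name: lib_ri for name, lib_ri in library.items()
            if abs(lib_ri - ri) <= window}
    return ri, hits

print(match(347.0))   # RI ~1362 -> matches "alanine"
```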
2011-01-01
military requirements and design options for extending the B61 bomb’s service life. The B61 is used to support the U.S. strategic deterrent and the...extension programs, and interviewed officials responsible for B61 operations, life extension program planning, management, and oversight. This is...NNSA have made progress in studying and updating the military’s performance requirements for the B61 bomb and have ruled out some design options, but
Neutron imaging data processing using the Mantid framework
NASA Astrophysics Data System (ADS)
Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried
2016-09-01
Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.
Development of a Multiplex Single Base Extension Assay for Mitochondrial DNA Haplogroup Typing
Nelson, Tahnee M.; Just, Rebecca S.; Loreille, Odile; Schanfield, Moses S.; Podini, Daniele
2007-01-01
Aim: To provide a screening tool to reduce time and sample consumption when attempting mtDNA haplogroup typing. Methods: A single base primer extension assay was developed to enable typing, in a single reaction, of twelve mtDNA haplogroup specific polymorphisms. For validation purposes a total of 147 samples were tested, including 73 samples successfully haplogroup typed using mtDNA control region (CR) sequence data, 21 samples inconclusively haplogroup typed by CR data, 20 samples previously haplogroup typed using restriction fragment length polymorphism (RFLP) analysis, and 31 samples of known ancestral origin without previous haplogroup typing. Additionally, two highly degraded human bones embalmed and buried in the early 1950s were analyzed using the single nucleotide polymorphism (SNP) multiplex. Results: When the SNP multiplex was used to type the 96 previously CR-sequenced specimens, an increase in haplogroup or macrohaplogroup assignment relative to conventional CR sequence analysis was observed. The single base extension assay was also successfully used to assign a haplogroup to decades-old, embalmed skeletal remains dating to World War II. Conclusion: The SNP multiplex was successfully used to obtain haplogroup status of highly degraded human bones and demonstrated the ability to eliminate possible contributors. The SNP multiplex provides a low-cost, high-throughput method for typing of mtDNA haplogroups A, B, C, D, E, F, G, H, L1/L2, L3, M, and N that could be useful for screening purposes in human identification efforts and anthropological studies. PMID:17696300
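As a rough illustration of how a small SNP panel translates into a haplogroup call, the sketch below uses a flat lookup of haplogroup-defining alleles. The markers and alleles are placeholders, not the study's validated twelve-marker panel, and a real caller must respect the mtDNA phylogeny.

```python
# Hedged illustration of haplogroup assignment from a small SNP panel.
# The markers and derived alleles below are placeholders, NOT the actual
# twelve-marker panel validated in the study; a flat lookup only
# approximates calls that in practice follow the mtDNA phylogeny.
HAPLOGROUP_DEFINING = {          # hypothetical marker -> (derived allele, haplogroup)
    "m1": ("A", "A"),
    "m2": ("C", "B"),
    "m3": ("T", "C"),
    "m4": ("G", "D"),
}

def assign_haplogroup(calls):
    """calls: dict marker -> observed base from the single-base-extension assay."""
    hits = [hg for marker, (derived, hg) in HAPLOGROUP_DEFINING.items()
            if calls.get(marker) == derived]
    if len(hits) == 1:
        return hits[0]
    return "inconclusive"        # zero or conflicting derived calls

print(assign_haplogroup({"m1": "G", "m2": "C", "m3": "C", "m4": "A"}))  # -> "B"
```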
The Impact of the Media in Influencing Extension's Perceptions of Methamphetamine
ERIC Educational Resources Information Center
Beaudreault, Amy R.
2013-01-01
The study reported here explored media dependency and moral panic involving methamphetamine perceptions among a national sample of Extension Directors through survey methodology. With a 70.0% response rate, the questionnaire concentrated on demographics; methamphetamine knowledge, information sources, and dependency; and perceptions of the media.…
What Influences Agents to Pursue a Career in Extension?
ERIC Educational Resources Information Center
Arnold, Shannon; Place, Nick
2010-01-01
The qualitative study reported here explored why agricultural agents pursue an Extension career. A purposive sample was used to select twelve Florida agricultural agents. Interviews investigated positive and negative influences that affected agents' employment decisions. Grounded theory was used as the primary data analysis method (Strauss &…
ERIC Educational Resources Information Center
Lyday, Susan Y.; And Others
The North Carolina Agricultural Extension Service conducted a survey to determine if factors such as personal characteristics, organizational factors, career experiences, attributional processes, or sociodemographic factors related to the attitudes of Extension professionals toward women in management. A sample of 266 persons serving in…
Need for Methamphetamine Programming in Extension Education
ERIC Educational Resources Information Center
Beaudreault, Amy R.; Miller, Larry E.
2011-01-01
The study reported here sought to identify prevention education needs involving methamphetamine through survey methodology. The study focused on a random sample of U.S. states and the Extension Directors within each state, resulting in a 70% response rate (n = 134). Findings revealed that 11% reported they had received methamphetamine user…
Factors Influencing Perceptions of Service Quality in Cooperative Extension Workers
ERIC Educational Resources Information Center
Anaza, Nwamaka A.; Rutherford, Brian N.; Widdows, Richard
2012-01-01
The authors examined the direct and indirect impact of empowerment on service quality as perceived by Extension staff. Using a sample of 283 respondents, the results revealed that along with empowerment, constructs such as job satisfaction and organizational identification positively affected service quality. Undoubtedly, each of these variables…
Determination of methyl bromide in air samples by headspace gas chromatography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodrow, J.E.; McChesney, M.M.; Seiber, J.N.
1988-03-01
Methyl bromide is extensively used in agriculture (4 × 10⁶ kg in California alone in 1985) as a fumigant to control nematodes, weeds, and fungi in soil and insect pests in harvested grains and nuts. Given its low boiling point (3.8 °C) and high vapor pressure (approximately 1400 Torr at 20 °C), methyl bromide will readily diffuse if not rigorously contained. Methods for determining methyl bromide and other halocarbons in air vary widely. A common practice is to trap the material from air on an adsorbent, such as polymeric resins, followed by thermal desorption either directly into the analytical instrumentation or after intermediary cryofocusing. While in some cases analytical detection limits were reasonable (parts per million range), many of the published methods were labor intensive and required special handling techniques that precluded high sample throughput. The authors describe a method for the sampling and analysis of airborne methyl bromide that was designed to handle large numbers of samples by automating some critical steps of the analysis. The result was a method that allowed around-the-clock operation with a minimum of operator attention. Furthermore, the method was not specific to methyl bromide and could be used to determine other halocarbons in air.
DOE Office of Scientific and Technical Information (OSTI.GOV)
BENDER, SUSAN FAE ANN; RODACY, PHILIP J.; BARNETT, JAMES L.
The ultimate goal of many environmental measurements is to determine the risk posed to humans or ecosystems by various contaminants. Conventional environmental monitoring typically requires extensive sampling grids covering several media, including air, water, soil and vegetation. A far more efficient, innovative and inexpensive tactic has been found using honeybees as sampling mechanisms. Members from a single bee colony forage over large areas (≈2 × 10⁶ m²), making tens of thousands of trips per day, and return to a fixed location where sampling can be conveniently conducted. The bees are in direct contact with the air, water, soil and vegetation, where they encounter and collect any contaminants that are present in gaseous, liquid and particulate form. The monitoring of honeybees when they return to the hive provides a rapid method to assess chemical distributions and impacts (1). The primary goal of this technology is to evaluate the efficiency of the transport mechanism (honeybees) to the hive using preconcentrators to collect samples. Once the extent and nature of the contaminant exposure has been characterized, resources can be distributed and environmental monitoring designs efficiently directed to the most appropriate locations. Methyl salicylate, a chemical agent surrogate, was used as the target compound in this study.
Frequency position modulation using multi-spectral projections
NASA Astrophysics Data System (ADS)
Goodman, Joel; Bertoncini, Crystal; Moore, Michael; Nousain, Bryan; Cowart, Gregory
2012-10-01
In this paper we present an approach to harness multi-spectral projections (MSPs) to carefully shape and locate tones in the spectrum, enabling a new and robust modulation in which a signal's discrete frequency support is used to represent symbols. This method, called Frequency Position Modulation (FPM), is an innovative extension to MT-FSK and OFDM and can be non-uniformly spread over many GHz of instantaneous bandwidth (IBW), resulting in a communications system that is difficult to intercept and jam. The FPM symbols are recovered using adaptive projections that in part employ an analog polynomial nonlinearity paired with an analog-to-digital converter (ADC) sampling at a rate that is only a fraction of the IBW of the signal. MSPs also facilitate using commercial off-the-shelf (COTS) ADCs with uniform sampling, standing in sharp contrast to random linear projections by random sampling, which require a full Nyquist-rate sample-and-hold. Our novel communication system concept provides an order of magnitude improvement in processing gain over conventional LPI/LPD communications (e.g., FH- or DS-CDMA) and facilitates the ability to operate in interference-laden environments where conventional compressed sensing receivers would fail. We quantitatively analyze the bit error rate (BER) and processing gain (PG) for a maximum-likelihood-based FPM demodulator and demonstrate its performance in interference-laden conditions.
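A stripped-down illustration of the frequency-position idea follows; it ignores the sub-Nyquist multi-spectral projections that are the paper's actual contribution. Each symbol is mapped to a tone location, and demodulation reduces to locating the spectral support; the bin assignments are arbitrary.

```python
# Conceptual toy of frequency-position modulation: each symbol selects which
# discrete frequency bin carries a tone, and the demodulator recovers the
# symbol from the spectral support. This sketch samples at full Nyquist rate;
# the paper's sub-Nyquist MSP receiver with an analog nonlinearity is not
# reproduced here.
import numpy as np

fs, n = 8192, 8192                                 # sample rate (Hz), FFT length
symbol_to_freq = {0: 500.0, 1: 1200.0, 2: 2100.0, 3: 3000.0}   # assumed mapping

def modulate(symbol):
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * symbol_to_freq[symbol] * t)

def demodulate(x):
    spectrum = np.abs(np.fft.rfft(x))
    peak_hz = np.argmax(spectrum) * fs / n         # strongest tone location
    return min(symbol_to_freq, key=lambda s: abs(symbol_to_freq[s] - peak_hz))

print(demodulate(modulate(2)))                     # -> 2
```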
Lack of Extraterritorial Jurisdiction over Civilians: A New Look at an Old Problem
1995-04-01
conclusion, this thesis recommends a partial solution based on a limited extension of court-martial jurisdiction over civilians deployed on military...operations. This limited extension of court-martial jurisdiction will enable commanders to command the civilian component of their deployed force and...court can extend many of the time limits if the government can prove extraordinary circumstances that require the extension in the interests of
Carnaz, Letícia; Moriguchi, Cristiane S; de Oliveira, Ana Beatriz; Santiago, Paulo R P; Caurin, Glauco A P; Hansson, Gert-Åke; Coury, Helenice J C Gil
2013-11-01
This study compared neck range of movement recording using three different methods: goniometers (EGM), inclinometers (INC), and a three-dimensional video analysis system (IMG), in simultaneous and synchronized data collection. Twelve females performed neck flexion-extension, lateral flexion, rotation and circumduction. The differences between EGM, INC, and IMG were calculated sample by sample. For the flexion-extension movement, IMG underestimated the amplitude by 13%; moreover, EGM showed a crosstalk of about 20% for the lateral flexion and rotation axes. In lateral flexion movement, all systems showed similar amplitude and the inter-system differences were moderate (4-7%). For rotation movement, EGM showed a high crosstalk (13%) for the flexion-extension axis. During the circumduction movement, IMG underestimated the amplitude of flexion-extension movements by about 11%, and the inter-system differences were high (about 17%), except for INC-IMG regarding lateral flexion (7%) and EGM-INC regarding flexion-extension (10%). For application in the workplace, INC presents good results compared to IMG and EGM, though INC cannot record rotation. EGM should be improved in order to reduce its crosstalk errors and allow recording of the full neck range of movement. Due to non-optimal positioning of the cameras for recording flexion-extension, IMG underestimated the amplitude of these movements. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.
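The sample-by-sample comparison used in the study can be emulated in a few lines; the sketch below uses synthetic angle traces with an assumed 13% amplitude deficit to show how an inter-system difference and an amplitude-underestimation figure are computed.

```python
# Minimal sketch (synthetic signals, not the study's data) of a
# sample-by-sample comparison between two synchronized angle recordings.
import numpy as np

t = np.linspace(0, 10, 1000)
egm = 60 * np.sin(2 * np.pi * 0.2 * t)            # reference trace (degrees)
img = 0.87 * egm + np.random.default_rng(1).normal(0, 1.0, t.size)  # ~13% low

diff = img - egm                                   # sample-by-sample difference
amp = lambda x: x.max() - x.min()                  # peak-to-peak amplitude
underestimation = 100 * (1 - amp(img) / amp(egm))
print(f"RMS difference: {np.sqrt(np.mean(diff**2)):.1f} deg, "
      f"amplitude underestimation: {underestimation:.0f}%")
```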
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-10
... DEPARTMENT OF LABOR Employee Benefits Security Administration Proposed Extension of Information... Review Procedures for Non-Grandfathered Plans AGENCY: Employee Benefits Security Administration... collection requirements and provide the requested data in the desired format. The Employee Benefits Security...
Developing disease resistant stone fruits
USDA-ARS?s Scientific Manuscript database
Stone fruit (Prunus spp.) (peach, nectarine, plum, apricot, cherry) and almonds are susceptible to a number of pathogens. These pathogens can cause extensive losses in the field, during transport and storage, and in the market. Breeding for disease resistance requires an extensive knowledge of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... DEPARTMENT OF LABOR Employee Benefits Security Administration Proposed Extension of Information Collection Request Submitted for Public Comment; COBRA Notification Requirements--American Recovery and Reinvestment Act of 2009 as Amended AGENCY: Employee Benefits Security Administration, Department of Labor...
76 FR 43996 - Commission Information Collection Activities (FERC-510); Comment Request; Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-22
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. IC11-510-000] Commission Information Collection Activities (FERC-510); Comment Request; Extension AGENCY: Federal Energy Regulatory...) 273-0873. SUPPLEMENTARY INFORMATION: The information collected under the requirements of FERC-510...
78 FR 71668 - Proposed Extension of Information Collection Requests Submitted for Public Comment
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... DEPARTMENT OF LABOR Employee Benefits Security Administration Proposed Extension of Information Collection Requests Submitted for Public Comment AGENCY: Employee Benefits Security Administration... collection requirements and provide the requested data in the desired format. The Employee Benefits Security...
48 CFR 37.111 - Extension of services.
Code of Federal Regulations, 2010 CFR
2010-10-01
... CATEGORIES OF CONTRACTING SERVICE CONTRACTING Service Contracts-General 37.111 Extension of services. Award of contracts for recurring and continuing service requirements are often delayed due to circumstances beyond the control of contracting offices. Examples of circumstances causing such delays are bid protests...
48 CFR 37.111 - Extension of services.
Code of Federal Regulations, 2011 CFR
2011-10-01
... CATEGORIES OF CONTRACTING SERVICE CONTRACTING Service Contracts-General 37.111 Extension of services. Award of contracts for recurring and continuing service requirements are often delayed due to circumstances beyond the control of contracting offices. Examples of circumstances causing such delays are bid protests...
78 FR 70584 - Extension of Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-26
... establishes arrangements to protect the rights of affected transit employees. Federal law requires such... DEPARTMENT OF LABOR Office of Labor-Management Standards Extension of Information Collection; Comment Request ACTION: Notice. SUMMARY: The Department of Labor, as part of its continuing effort to...
Improvement of a wind-tunnel sampling system for odour and VOCs.
Wang, X; Jiang, J; Kaye, R
2001-01-01
Wind-tunnel systems are widely used for collecting odour emission samples from surface area sources. Consequently, a portable wind-tunnel system was developed at the University of New South Wales that was easy to handle and suitable for sampling from liquid surfaces. Development work was undertaken to ensure even air-flows above the emitting surface and to optimise air velocities to simulate real situations. However, recovery efficiencies for emissions have not previously been studied for wind-tunnel systems. A series of experiments was carried out for determining and improving the recovery rate of the wind-tunnel sampling system by using carbon monoxide as a tracer gas. It was observed by mass balance that carbon monoxide recovery rates were initially only 37% to 48% from a simulated surface area emission source. It was therefore apparent that further development work was required to improve recovery efficiencies. By analysing the aerodynamic character of air movement and CO transportation inside the wind-tunnel, it was determined that the apparent poor recoveries resulted from uneven mixing at the sample collection point. A number of modifications were made to the mixing chamber of the wind-tunnel system. A special sampling chamber extension and a sampling manifold with optimally distributed sampling orifices were developed for the wind-tunnel sampling system. The simulation experiments were repeated with the new sampling system. Over a series of experiments, the recovery efficiency of sampling was improved to 83-100% with an average of 90%, where the CO tracer gas was introduced at a single point, and 92-102% with an average of 97%, where the CO tracer gas was introduced along a line transverse to the sweep air. The stability and accuracy of the new system were determined statistically and are reported.
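The recovery efficiencies quoted above come from a tracer mass balance; a back-of-envelope version is sketched below, with illustrative numbers (not the study's) and a rough CO density at 20 °C.

```python
# Back-of-envelope tracer mass balance for recovery efficiency:
# recovery = tracer mass measured at the sampling manifold / mass released.
# All numbers are illustrative, not taken from the study.
def recovery_efficiency(c_out_ppm, q_sweep_m3_min, t_min, mass_released_g,
                        gas_density_g_m3=1165.0):   # CO at ~20 C, rough value
    """Fraction of released tracer captured at the sampling point."""
    volume_fraction = c_out_ppm * 1e-6
    mass_recovered = volume_fraction * q_sweep_m3_min * t_min * gas_density_g_m3
    return mass_recovered / mass_released_g

# 40 ppm CO at the outlet, 0.5 m3/min sweep air, 30 min run, 0.75 g released
print(f"{recovery_efficiency(40.0, 0.5, 30.0, 0.75):.0%}")   # ~93%
```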
Possibilities for serial femtosecond crystallography sample delivery at future light sources
Chavas, L. M. G.; Gumprecht, L.; Chapman, H. N.
2015-01-01
Serial femtosecond crystallography (SFX) uses X-ray pulses from free-electron laser (FEL) sources that can outrun radiation damage and thereby overcome long-standing limits in the structure determination of macromolecular crystals. Intense X-ray FEL pulses of sufficiently short duration allow the collection of damage-free data at room temperature and give the opportunity to study irreversible time-resolved events. SFX may open the way to determine the structure of biological molecules that fail to crystallize readily into large well-diffracting crystals. Taking advantage of FELs with high pulse repetition rates could lead to short measurement times of just minutes. Automated delivery of sample suspensions for SFX experiments could potentially give rise to a much higher rate of obtaining complete measurements than at today's third generation synchrotron radiation facilities, as no crystal alignment or complex robotic motions are required. This capability will also open up extensive time-resolved structural studies. New challenges arise from the resulting high rate of data collection, and in providing reliable sample delivery. Various developments for fully automated high-throughput SFX experiments are being considered for evaluation, including new implementations for a reliable yet flexible sample environment setup. Here, we review the different methods developed so far that best achieve sample delivery for X-ray FEL experiments and present some considerations towards the goal of high-throughput structure determination with X-ray FELs. PMID:26798808
Doyle, Jacqueline M.; Katzner, Todd E.; Roemer, Gary; Cain, James W.; Millsap, Brian; McIntyre, Carol; Sonsthagen, Sarah A.; Fernandez, Nadia B.; Wheeler, Maria; Bulut, Zafer; Bloom, Peter; DeWoody, J. Andrew
2016-01-01
Molecular markers can reveal interesting aspects of organismal ecology and evolution, especially when surveyed in rare or elusive species. Herein, we provide a preliminary assessment of golden eagle (Aquila chrysaetos) population structure in North America using novel single nucleotide polymorphisms (SNPs). These SNPs included one molecular sexing marker, two mitochondrial markers, 85 putatively neutral markers that were derived from noncoding regions within large intergenic intervals, and 74 putatively nonneutral markers found in or very near protein-coding genes. We genotyped 523 eagle samples at these 162 SNPs and quantified genotyping error rates and variability at each marker. Our samples corresponded to 344 individual golden eagles as assessed by unique multilocus genotypes. Observed heterozygosity of known adults was significantly higher than of chicks, as was the number of heterozygous loci, indicating that mean zygosity measured across all 159 autosomal markers was an indicator of fitness as it is associated with eagle survival to adulthood. Finally, we used chick samples of known provenance to test for population differentiation across portions of North America and found pronounced structure among geographic sampling sites. These data indicate that cryptic genetic population structure is likely widespread in the golden eagle gene pool, and that extensive field sampling and genotyping will be required to more clearly delineate management units within North America and elsewhere.
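The zygosity summary used in the study is simple to compute; below is a hedged sketch of per-individual observed heterozygosity over biallelic SNP calls, with made-up genotypes.

```python
# Sketch of per-individual observed heterozygosity: the fraction of called
# autosomal SNP loci with two different alleles. Genotypes are made-up
# two-character calls; missing data are coded "NN" and excluded.
def observed_heterozygosity(genotype):
    """genotype: list of 2-character strings, e.g. ['AG', 'CC', 'NN', ...]"""
    called = [g for g in genotype if g != "NN"]
    het = sum(1 for g in called if g[0] != g[1])
    return het / len(called) if called else float("nan")

adult = ["AG", "CT", "CC", "AT", "GG", "AC"]
chick = ["AA", "CT", "CC", "TT", "GG", "AC"]
print(observed_heterozygosity(adult), observed_heterozygosity(chick))  # 0.67 vs 0.33
```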
Elemental analysis of size-fractionated particulate matter sampled in Göteborg, Sweden
NASA Astrophysics Data System (ADS)
Wagner, Annemarie; Boman, Johan; Gatari, Michael J.
2008-12-01
The aim of the study was to investigate the mass distribution of trace elements in aerosol samples collected in the urban area of Göteborg, Sweden, with special focus on the impact of different air masses and anthropogenic activities. Three measurement campaigns were conducted during December 2006 and January 2007. A PIXE cascade impactor was used to collect particulate matter in 9 size fractions ranging from 16 to 0.06 µm aerodynamic diameter. Polished quartz carriers were chosen as collection substrates for the subsequent direct analysis by TXRF. To investigate the sources of the analyzed air masses, backward trajectories were calculated. Our results showed that diurnal sampling was sufficient to investigate the mass distribution for Br, Ca, Cl, Cu, Fe, K, Sr and Zn, whereas a 5-day sampling period resulted in additional information on mass distribution for Cr and S. Unimodal mass distributions were found in the study area for the elements Ca, Cl, Fe and Zn, whereas the distributions for Br, Cu, Cr, K, Ni and S were bimodal, indicating high temperature processes as source of the submicron particle components. The measurement period including the New Year firework activities showed both an extensive increase in concentrations as well as a shift to the submicron range for K and Sr, elements that are typically found in fireworks. Further research is required to validate the quantification of trace elements directly collected on sample carriers.
Outcome-Dependent Sampling with Interval-Censored Failure Time Data
Zhou, Qingning; Cai, Jianwen; Zhou, Haibo
2017-01-01
Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable. We propose an outcome-dependent sampling (ODS) design with interval-censored failure time data, where we enrich the observed sample by selectively including certain more informative failure subjects. We develop a novel sieve semiparametric maximum empirical likelihood approach for fitting the proportional hazards model to data from the proposed interval-censoring ODS design. This approach employs the empirical likelihood and sieve methods to deal with the infinite-dimensional nuisance parameters, which greatly reduces the dimensionality of the estimation problem and eases the computation difficulty. The consistency and asymptotic normality of the resulting regression parameter estimator are established. The results from our extensive simulation study show that the proposed design and method work well for practical situations and are more efficient than the alternative designs and competing approaches. An example from the Atherosclerosis Risk in Communities (ARIC) study is provided for illustration. PMID:28771664
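The core of the ODS design, enriching a simple random sample with extra failure subjects, can be caricatured in a few lines. The sketch below shows the enrichment step only, not the paper's sieve semiparametric estimator, and all sizes are arbitrary.

```python
# Toy illustration of outcome-dependent sampling: rather than a simple random
# sample from a large cohort, the observed sample is enriched with rare
# failure subjects, which are the most informative about the exposure-failure
# relationship. A deliberately simple caricature of the design.
import numpy as np

rng = np.random.default_rng(42)
n_cohort = 10_000
failed = rng.random(n_cohort) < 0.03               # rare failures (~3%)

def ods_sample(failed, n_random=300, n_extra_failures=150):
    idx = np.arange(failed.size)
    srs = rng.choice(idx, n_random, replace=False)             # random component
    pool = np.setdiff1d(idx[failed], srs)                      # remaining failures
    extra = rng.choice(pool, min(n_extra_failures, pool.size), replace=False)
    return np.concatenate([srs, extra])

sample = ods_sample(failed)
print(f"failure fraction: cohort {failed.mean():.1%}, "
      f"ODS sample {failed[sample].mean():.1%}")               # ~3% vs ~35%
```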
Wiklander, Oscar P. B.; Bostancioglu, R. Beklem; Welsh, Joshua A.; Zickler, Antje M.; Murke, Florian; Corso, Giulia; Felldin, Ulrika; Hagey, Daniel W.; Evertsson, Björn; Liang, Xiu-Ming; Gustafsson, Manuela O.; Mohammad, Dara K.; Wiek, Constanze; Hanenberg, Helmut; Bremer, Michel; Gupta, Dhanu; Björnstedt, Mikael; Giebel, Bernd; Nordin, Joel Z.; Jones, Jennifer C.; EL Andaloussi, Samir; Görgens, André
2018-01-01
Extracellular vesicles (EVs) can be harvested from cell culture supernatants and from all body fluids. EVs can be conceptually classified based on their size and biogenesis as exosomes and microvesicles. Nowadays, it is however commonly accepted in the field that there is a much higher degree of heterogeneity within these two subgroups than previously thought. For instance, the surface marker profile of EVs is likely dependent on the cell source, the cell's activation status, and multiple other parameters. Within recent years, several new methods and assays to study EV heterogeneity in terms of surface markers have been described, most of them based on flow cytometry. Unfortunately, such methods generally require dedicated instrumentation, are time-consuming and demand extensive operator expertise for sample preparation, acquisition, and data analysis. In this study, we have systematically evaluated and explored the use of a multiplex bead-based flow cytometric assay which is compatible with most standard flow cytometers and facilitates a robust semi-quantitative detection of 37 different potential EV surface markers in one sample simultaneously. First, assay variability, sample stability over time, and dynamic range were assessed together with the limitations of this assay in terms of EV input quantity required for detection of differently abundant surface markers. Next, the potential effects of EV origin, sample preparation, and quality of the EV sample on the assay were evaluated. The findings indicate that this multiplex bead-based assay is generally suitable to detect, quantify, and compare EV surface signatures in various sample types, including unprocessed cell culture supernatants, cell culture-derived EVs isolated by different methods, and biological fluids. Furthermore, the use and limitations of this assay to assess heterogeneities in EV surface signatures were explored by combining different sets of detection antibodies in EV samples derived from different cell lines and subsets of rare cells. Taken together, this validated multiplex bead-based flow cytometric assay allows robust, sensitive, and reproducible detection of EV surface marker expression in various sample types in a semi-quantitative way and will be highly valuable for many researchers in the EV field in different experimental contexts.
Implications of sampling design and sample size for national carbon accounting systems
Michael Köhl; Andrew Lister; Charles T. Scott; Thomas Baldauf; Daniel Plugge
2011-01-01
Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests the information is generally obtained by sample based surveys. Most operational sampling approaches utilize a combination of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-08
... strength requirements specified in the Standard, and a requirement that defective wheels and wheel... or a Registered Professional Engineer must certify that the strength requirements specified in (d)(3...
Impact of drug shortages on patients receiving parenteral nutrition after laparotomy.
Bible, Jaimee R; Evans, David C; Payne, Brett; Mostafavifar, Lisa
2014-11-01
Drug shortages, including parenteral nutrition (PN) product shortages, continue to increase and have a significant impact on healthcare. The extent to which product shortages affect bowel recovery and outcomes in patients receiving PN is unknown. The objective of this study is to examine the impact of extensive PN product shortages on patients receiving PN after laparotomy for bowel obstruction. A retrospective review was conducted for patients who underwent a laparotomy for small bowel obstruction and received PN postoperatively. Periods of limited and extensive PN product shortages at our institution were defined. PN therapy duration and composition, daily laboratory values, electrolyte supplementation, length of stay, and cost of hospitalization were recorded. Analyses using χ², Wilcoxon rank sum, log-rank, and t tests as appropriate were performed using SAS/STAT 9.2. Patients had longer hospital length of stays (20.0 vs 15.2 days; P = .04), trends toward longer PN therapy courses (8.8 vs 6.6 days; P = .13), and a 51% higher hospital cost during the extensive PN drug shortage period. Mean serum electrolyte concentrations were similar while the need for supplemental magnesium replacements increased during the extensive shortage period (75% vs 35%; P = .01). Supplemented patients also required higher doses of magnesium (2.7 vs 1.0 g; P < .01) and more laboratory draws during the extensive shortage period (59% vs 21% required ≥ 2 draws daily; P = .04). Fewer lipid calories were delivered during the extensive shortage period (2.4 vs 4.8 kcal/kg/d; P < .01). PN drug shortages have a negative impact on patient outcomes and require aggressive management strategies. © 2014 American Society for Parenteral and Enteral Nutrition.
Isotope Ratio Mass Spectrometry and Shale Gas - What Is Possible with Current Technology?
NASA Astrophysics Data System (ADS)
Barrie, C. D.; Kasson, A.
2014-12-01
With ever increasing exploration and exploitation of 'unconventional' hydrocarbon resources, the drive to understand the origins, history and importance of these resources and their effects on the surrounding environment (i.e. ground waters) has never been more important. High-throughput, high-precision isotopic measurements are therefore a key tool in this industry, both to understand the gas generated and to monitor the development and stability of wells through time. With the advent of cavity ringdown spectroscopy (CRDS) instrumentation, there has been a push in some applications - environmental and atmospheric - to gather more and more data directly at the location of collection or at dedicated field stations. Furthermore, CRDS has resulted in users seeking greater autonomy of instrumentation and so-called black box technology. Traditionally, IRMS technology has not met any of these demands, having very specific and extensive footprint, power and environmental requirements. This has meant that the 'Oil & Gas' sector, which for natural gas measurements requires GC-IRMS technology - not possible via CRDS - loses time, money and manpower as samples get sent to central facility or contract labs with potentially long lead times. However, recent developments in technology mean that IRMS systems exist which are benchtop, have much lower power requirements and standard power connections, and, as long as they are housed in temperature-controlled field stations, can be deployed anywhere. Furthermore, with advances in electronics and software, IRMS systems are approaching the black box level of newer instrumentation while maintaining the flexibility and abilities of isotope ratio mass spectrometry. This presentation will outline changes in IRMS technology applicable to the Oil & Gas industry, discuss the feasibility of true 'field' deployability and present results from a range of Oil & Gas samples.
Baghaie, Ahmadreza; Pahlavan Tafti, Ahmad; Owen, Heather A; D'Souza, Roshan M; Yu, Zeyun
2017-01-01
The Scanning Electron Microscope (SEM), as one of the major research and industrial instruments for imaging micro-scale samples and surfaces, has attracted extensive attention since its emergence. However, the acquired micrographs still remain two-dimensional (2D). In the current work a novel and highly accurate approach is proposed to recover the hidden third dimension by use of multi-view image acquisition of the microscopic samples combined with pre/post-processing steps including sparse feature-based stereo rectification, nonlocal-based optical flow estimation for dense matching, and finally depth estimation. Employing the proposed approach, three-dimensional (3D) reconstructions of highly complex microscopic samples were achieved to facilitate the interpretation of the topology and geometry of surface/shape attributes of the samples. As a byproduct of the proposed approach, high-definition 3D printed models of the samples can be generated as a tangible means of physical understanding. Extensive comparisons with the state-of-the-art reveal the strength and superiority of the proposed method in uncovering the details of highly complex microscopic samples.
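The matching-and-depth portion of such a pipeline can be approximated with off-the-shelf tools. The sketch below substitutes OpenCV's Farneback flow for the paper's nonlocal optical-flow estimator, uses synthetic stand-ins for two rectified views, and assumes a hypothetical small-angle stereo geometry for the disparity-to-depth step.

```python
# Hedged sketch of multi-view depth recovery: dense correspondences between
# two rectified views give a disparity field; depth is taken proportional to
# disparity under an assumed (hypothetical) 5-degree tilt between views.
import cv2
import numpy as np

# Synthetic stand-ins for two rectified SEM views: the "right" view is the
# "left" view shifted horizontally by 3 px, so disparity should be ~3 px.
rng = np.random.default_rng(0)
left = (rng.random((256, 256)) * 255).astype(np.uint8)
left = cv2.GaussianBlur(left, (7, 7), 2.0)         # give the texture structure
right = np.roll(left, 3, axis=1)

# Farneback dense flow stands in for the paper's nonlocal optical-flow step.
flow = cv2.calcOpticalFlowFarneback(left, right, None,
                                    0.5, 4, 21, 3, 7, 1.5, 0)
disparity = flow[..., 0]                           # horizontal component
depth = disparity / np.tan(np.deg2rad(5.0))        # assumed stereo geometry
print(float(np.median(disparity)))                 # ~3 px
```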
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-18
... Act; Analysis and Sampling Procedures; Extension of Comment Period AGENCY: Environmental Protection..., 2010, EPA proposed changes to analysis and sampling test procedures in wastewater regulations. These...
SensorKit: A Flexible and Extensible System for In-Situ Data Acquisition
NASA Astrophysics Data System (ADS)
Silva, F.; Deschon, A.; Chang, J.; Westrich, S.; Cho, Y. H.; Gullapalli, S.; Benzel, T.; Graham, E. A.
2009-12-01
Over the years, sensor network technology has evolved tremendously and has great potential in environmental sensing applications. However, because sensor networks are usually designed and built by computer scientists and engineers with little input from the scientific community, the resulting technology is often complex and out of reach for most field scientists. A few sensor and data-logger vendors have released data acquisition systems that can be used with their products. Unfortunately, these are generally vendor-specific, requiring scientists with heterogeneous sensors to use multiple systems to acquire data from all their sensors. A few more generic systems are compatible with multiple brands. However, these often offer only limited functionality, little flexibility, and no extensibility. We built SensorKit to overcome these limitations and to accelerate the adoption of sensor networks by field scientists. Using a simplicity-through-sophistication approach, we provide scientists with a powerful tool for field data collection. SensorKit is hardware agnostic, and was built using commercial off-the-shelf components. By employing a Linux-based ultra low-power generic embedded processing platform with a variety of dataloggers (including Berkeley motes, National Instruments' Compact RIOs, as well as legacy and newer PakBus-based Campbell data loggers), we support requirements from a large number of scientists. The user interfaces are designed to be intuitive so that most scientists can deploy, configure, and operate the system without extensive training. Working in close collaboration with field scientists allowed us to better understand scientific requirements and ensure system relevancy. The requirements for data acquisition, data storage, and data communication vary significantly for each deployment. Data acquisition needs to include capabilities for different analog, digital, and other complex sensors (e.g. cameras and robotic sensors). Moreover, the sensors may be geographically dispersed, requiring the use of a local sensor network for moving data at the site. Data storage has to accommodate varying sampling rates from several times a second to once every hour (or longer), and handle situations where data is accumulated for several days or even weeks at a time. Additionally, different deployments require the use of varying communication technologies (e.g. satellite, cellular, long range radios, wi-fi, etc.), and while some scientists need live access to their data, others are able to tolerate delays of hours, if not days. Finally, power and environmental conditions can have great influence on the type of data acquisition and communication technology that can be used at a certain site. During the past few years, we have used a spiral build, deploy, and revise approach in order to verify our design and incorporate what we have learned at each deployment. In this poster, we present our system architecture, how SensorKit has been used by scientists in a number of places around the world, and how it has evolved over time, adapting to a wide range of deployment requirements in order to accommodate different scientific applications.
Portable Electronic Nose Based on Electrochemical Sensors for Food Quality Assessment
Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek
2017-01-01
The steady increase in global consumption puts a strain on agriculture and might lead to a decrease in food quality. Currently used techniques of food analysis are often labour-intensive and time-consuming and require extensive sample preparation. For that reason, there is a demand for novel methods that could be used for rapid food quality assessment. A technique based on the use of an array of chemical sensors for holistic analysis of the sample’s headspace is called electronic olfaction. In this article, a prototype of a portable, modular electronic nose intended for food analysis is described. Using the SVM method, it was possible to classify samples of poultry meat based on shelf-life with 100% accuracy, and also samples of rapeseed oil based on the degree of thermal degradation with 100% accuracy. The prototype was also used to detect adulterations of extra virgin olive oil with rapeseed oil with 82% overall accuracy. Due to the modular design, the prototype offers the advantages of solutions targeted for analysis of specific food products, at the same time retaining the flexibility of application. Furthermore, its portability allows the device to be used at different stages of the production and distribution process. PMID:29186754
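As a sketch of the classification step reported above (100% accuracy on shelf-life classes), here is a minimal SVM pipeline on synthetic 8-channel sensor vectors; the feature dimension and class separation are assumptions, not the prototype's data.

```python
# Minimal sketch of SVM classification of electronic-nose readings: each
# sample is a feature vector of sensor-array responses, and the classifier
# separates fresh from spoiled samples. Data are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
fresh = rng.normal(0.0, 1.0, (60, 8))              # assumed 8-sensor array
spoiled = rng.normal(1.5, 1.0, (60, 8))            # shifted sensor responses
X = np.vstack([fresh, spoiled])
y = np.array([0] * 60 + [1] * 60)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.0%}")
```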
Radurization of commercial freshwater fish species
NASA Astrophysics Data System (ADS)
Chuaqui-Offermanns, N.; McDougall, T. E.; Sprung, W.; Sullivan, V.
The effect of radurization on the shelf life of fresh whitefish obtained through ordinary commercial channels has been determined. Whitefish fillets irradiated at 1.2 kGy and stored at 3°C have a shelf life three times longer than the unirradiated fish. When the fish was irradiated at 0.82 kGy, a twofold shelf-life extension was obtained. The shelf life was estimated by sensory, chemical and microbiological evaluations. Sensory evaluation involved organoleptic assessment of raw and cooked samples. Since freshwater fish do not contain trimethylamine oxide (TMAO), alternative tests for freshness were required. It was found that determinations of hypoxanthine and total volatile acid number (VAN) are excellent tests for the freshness and quality of freshwater fish; thus, these analyses were adopted. The degree of radiation-induced lipid oxidation was measured by the thiobarbituric acid (TBA) test. It was found that at doses of 0.82 and 1.2 kGy the TBA number remained within acceptable limits in all samples. Microbiological analyses consisted of assessment of the total microbial load in the sample, as well as Pseudomonas and total psychrotrophic counts. The estimated shelf lives as determined by the three separate evaluations were in very good agreement.
2014-01-01
Background Cases of Mycobacterium bovis infection in South American camelids have been increasing in Great Britain. Current antemortem immunological tests have some limitations. Cases at post-mortem examination frequently show extensive pathology. The feasibility of detecting Mycobacterium bovis DNA in clinical samples was investigated. Findings A sensitive extraction methodology was developed and used on nasal swabs and faeces taken post-mortem to assess the potential for a PCR test to detect Mycobacterium bovis in clinical samples. The gross pathology of the studied South American camelids was scored, and a significantly greater proportion of South American camelids with more severe pathology were positive in both the nasal swab and faecal PCR tests. A combination of the nasal swab and faecal PCR tests detected 63.9% of all the South American camelids with pathology that were tested. Conclusions The results suggest that antemortem diagnosis of Mycobacterium bovis in South American camelids may be possible using a PCR test on clinical samples; however, more work is required to determine sensitivity and specificity, and the practicalities of applying the test in the field. PMID:24507471
Simple Penalties on Maximum-Likelihood Estimates of Genetic Parameters to Reduce Sampling Variation
Meyer, Karin
2016-01-01
Multivariate estimates of genetic parameters are subject to substantial sampling variation, especially for smaller data sets and more than a few traits. A simple modification of standard, maximum-likelihood procedures for multivariate analyses to estimate genetic covariances is described, which can improve estimates by substantially reducing their sampling variances. This is achieved by maximizing the likelihood subject to a penalty. Borrowing from Bayesian principles, we propose a mild, default penalty—derived assuming a Beta distribution of scale-free functions of the covariance components to be estimated—rather than laboriously attempting to determine the stringency of penalization from the data. An extensive simulation study is presented, demonstrating that such penalties can yield very worthwhile reductions in loss, i.e., the difference from population values, for a wide range of scenarios and without distorting estimates of phenotypic covariances. Moreover, mild default penalties tend not to increase loss in difficult cases and, on average, achieve reductions in loss of similar magnitude to computationally demanding schemes to optimize the degree of penalization. Pertinent details required for the adaptation of standard algorithms to locate the maximum of the likelihood function are outlined. PMID:27317681
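A toy illustration of the penalized-likelihood idea, not Meyer's REML implementation: a correlation estimated from few observations is shrunk by a mild Beta penalty on a scale-free function of the parameter, as the abstract describes.

```python
# Penalized maximum likelihood in miniature: estimate a correlation from a
# small sample, adding a mild Beta(a, a) log-penalty on (r + 1) / 2.
# This is a simplified sketch, not the multivariate REML setting of the paper.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import beta, multivariate_normal

rng = np.random.default_rng(1)
true_r = 0.3
X = rng.multivariate_normal([0, 0], [[1, true_r], [true_r, 1]], size=15)

def neg_pen_loglik(r, a=1.5):
    cov = np.array([[1.0, r], [r, 1.0]])
    loglik = multivariate_normal(mean=[0, 0], cov=cov).logpdf(X).sum()
    penalty = beta(a, a).logpdf((r + 1.0) / 2.0)  # mild, symmetric shrinkage
    return -(loglik + penalty)

fit = minimize_scalar(neg_pen_loglik, bounds=(-0.99, 0.99), method="bounded")
print(f"penalized estimate of r: {fit.x:.3f}")
```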
A comparative appraisal of two equivalence tests for multiple standardized effects.
Shieh, Gwowen
2016-04-01
Equivalence testing is recommended as a better alternative to the traditional difference-based methods for demonstrating the comparability of two or more treatment effects. Although equivalence tests of two groups are widely discussed, the natural extensions for assessing equivalence among several groups have not been well examined. This article provides a detailed and schematic comparison of the ANOVA F and the studentized range tests for evaluating the comparability of several standardized effects. Power and sample size appraisals of the two grossly distinct approaches are conducted in terms of a constraint on the range of the standardized means when the standard deviation of the standardized means is fixed. Although neither method is uniformly more powerful, the studentized range test has a clear advantage in the sample sizes required to achieve a given power when the underlying effect configurations are close to the a priori minimum difference for determining equivalence. For actual application of equivalence tests and advance planning of equivalence studies, both SAS and R computer codes are available as supplementary files to implement the calculations of critical values, p-values, power levels, and sample sizes. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
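The authors' SAS and R codes are not reproduced here, but one standard construction of an ANOVA-F-based equivalence test (in the spirit of Wellek's test) can be sketched as follows; the equivalence margin eps and the balanced-group assumption are illustrative.

```python
# Equivalence of k standardized means via the ANOVA F statistic: declare
# equivalence when the observed F falls below the alpha-quantile of a
# noncentral F whose noncentrality is set by the margin. Generic textbook
# construction, not the authors' supplementary code.
import numpy as np
from scipy.stats import f_oneway, ncf

def anova_equivalence(groups, eps, alpha=0.05):
    k = len(groups)
    n = len(groups[0])                  # assumes balanced groups
    F_obs = f_oneway(*groups).statistic
    crit = ncf.ppf(alpha, dfn=k - 1, dfd=k * (n - 1), nc=n * eps**2)
    return F_obs, crit, F_obs < crit    # True -> claim equivalence

rng = np.random.default_rng(2)
groups = [rng.normal(0.0, 1.0, size=40) for _ in range(3)]
F_obs, crit, equivalent = anova_equivalence(groups, eps=0.5)
print(f"F = {F_obs:.3f}, critical value = {crit:.3f}, equivalent: {equivalent}")
```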
NIH CIDR Program Studies: For whole exome sequencing projects, we pretest all samples using a high-density SNP array (>200,000 markers). For custom targeted sequencing, we pretest all samples using a 96 SNP GoldenGate assay. This extensive pretesting allows us to unambiguously tie
Repeated Random Sampling in Year 5
ERIC Educational Resources Information Center
Watson, Jane M.; English, Lyn D.
2016-01-01
As an extension to an activity introducing Year 5 students to the practice of statistics, the software "TinkerPlots" made it possible to collect repeated random samples from a finite population to informally explore students' capacity to begin reasoning with a distribution of sample statistics. This article provides background for the…
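The essence of the activity is easy to reproduce outside TinkerPlots; a minimal simulation of repeated random sampling from a finite population might look like this.

```python
# Draw repeated random samples from a small finite population and examine
# the spread of the sample means, mirroring the classroom activity.
import numpy as np

rng = np.random.default_rng(3)
population = rng.integers(120, 180, size=200)   # e.g., heights of 200 pupils

sample_means = [rng.choice(population, size=10, replace=False).mean()
                for _ in range(500)]
print(f"population mean: {population.mean():.1f}")
print(f"sample means: {np.mean(sample_means):.1f} "
      f"+/- {np.std(sample_means):.1f} (500 samples of 10)")
```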
Examination of Pre-Productions Samples of UOP IONSIV(R) IE-910 and IE-911
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, D.D.
2001-07-10
This report includes results of the extensive examination of newly-prepared laboratory-scale and pre-production samples of caustic-washed CST [crystalline silicotitanate] as compared to similar performance data for commercially-available and baseline samples. Conclusions from this work include the following.
An Active Tutorial on Distance Sampling
ERIC Educational Resources Information Center
Richardson, Alice
2007-01-01
The technique of distance sampling is widely used to monitor biological populations. This paper documents an in-class activity to introduce students to the concepts and the mechanics of distance sampling in a simple situation that is relevant to their own experiences. Preparation details are described. Variations and extensions to the activity are…
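The tutorial's own materials are not reproduced here, but the standard textbook line-transect estimator with a half-normal detection function, which such an activity typically builds toward, can be sketched as follows.

```python
# Minimal line-transect distance-sampling estimate with a half-normal
# detection function (the standard textbook estimator).
import numpy as np

rng = np.random.default_rng(4)
L = 1000.0                                         # total transect length (m)
distances = np.abs(rng.normal(0, 12.0, size=60))   # perpendicular distances (m)

sigma2 = np.mean(distances**2)             # MLE of sigma^2 for half-normal g(x)
mu = np.sqrt(np.pi * sigma2 / 2.0)         # effective strip half-width (m)
density = len(distances) / (2.0 * L * mu)  # animals per square metre
print(f"estimated density: {density * 1e4:.2f} per hectare")
```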
All-season flash flood forecasting system for real-time operations
USDA-ARS?s Scientific Manuscript database
Flash floods can cause extensive damage to both life and property, especially because they are difficult to predict. Flash flood prediction requires high-resolution meteorologic observations and predictions, as well as calibrated hydrologic models in addition to extensive data handling. We have de...
78 FR 13657 - Commission Information Collection Activities (FERC-577); Comment Request; Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-28
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. IC13-12-000] Commission Information Collection Activities (FERC-577); Comment Request; Extension AGENCY: Federal Energy Regulatory...: In compliance with the requirements of the Paperwork Reduction Act of 1995, the Federal Energy...
48 CFR 17.605 - Award, renewal, and extension.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Award, renewal, and extension. (a) Effective work performance under management and operating contracts... requirements and the unusual (sometimes unique) nature of the work performed under management and operating contracts, the Government is often limited in its ability to effect competition or to replace a contractor...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Process. 905.37 Section 905.37 Energy DEPARTMENT OF ENERGY ENERGY PLANNING AND MANAGEMENT PROGRAM Power Marketing Initiative § 905.37 Process. Modified contractual language shall be required to place resource extensions under contract. Resource extensions and allocations...
A Manhattan Project in Educational Technology, Part II.
ERIC Educational Resources Information Center
Roberts, Wesley K.
The initial four phases of the Training Extension Course (TEC), a project to remedy deficiencies in training programs for armed forces recruits, employed systematic instructional development and extensive audiovisual resources. The project required subcontracting for lesson production and modifications in personnel and budgeting. Posttest evidence…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... standards. The collection also requires airport operators to comply with a security directive by maintaining... airport operators maintain records of criminal history records checks and security threat assessments in... DEPARTMENT OF HOMELAND SECURITY Transportation Security Administration Extension of Agency...
2018-01-30
This document announces the extension of statewide temporary moratoria on the enrollment of new Medicare Part B non-emergency ground ambulance providers and suppliers and Medicare home health agencies, subunits, and branch locations in Florida, Illinois, Michigan, Texas, Pennsylvania, and New Jersey, as applicable, to prevent and combat fraud, waste, and abuse. This extension also applies to the enrollment of new non-emergency ground ambulance suppliers and home health agencies, subunits, and branch locations in Medicaid and the Children's Health Insurance Program in those states. For purposes of these moratoria, providers that were participating as network providers in one or more Medicaid managed care organizations prior to January 1, 2018 will not be considered "newly enrolling" when they are required to enroll with the State Medicaid agency pursuant to a new statutory requirement, and thus will not be subject to the moratoria.
How air pollution alters brain development: the role of neuroinflammation.
Brockmeyer, Sam; D'Angiulli, Amedeo
2016-01-01
The present review synthesizes lines of emerging evidence showing how several samples of child populations living in large cities around the world suffer, to some degree, neural, behavioral, and cognitive changes associated with air pollution exposure. The breakdown of natural barriers warding against the entry of toxic particles, including the nasal, gut, and lung epithelial barriers, as well as widespread breakdown of the blood-brain barrier, facilitates the passage of airborne pollutants into the bodies of young urban residents. Extensive neuroinflammation contributes to cell loss within the central nervous system and is likely a crucial mechanism by which cognitive deficits arise. Although subtle, the neurocognitive effects of air pollution are substantial, apparent across all populations, and potentially clinically relevant as early evidence of evolving neurodegenerative changes. The diffuse nature of the neuroinflammation risk suggests an integrated neuroscientific approach incorporating current clinical, cognitive, neurophysiological, radiological, and epidemiologic research. Neuropediatric air pollution research requires extensive multidisciplinary collaborations to accomplish the goal of protecting exposed children through multidimensional interventions having both broad impact and reach. While intervening by improving environmental quality at a global scale is imperative, we also need to devise efficient strategies for monitoring the neurocognitive effects on local pediatric populations.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
... authorities prohibit employment discrimination but also require affirmative action to ensure that equal... physical or mental disability and requires affirmative action to ensure that persons are treated without... be subject to the Affirmative Action Program (AAP) requirements of 41 CFR 60-741.40, the associated...
Differentially Private Frequent Sequence Mining via Sampling-based Candidate Pruning
Xu, Shengzhi; Cheng, Xiang; Li, Zhengyi; Xiong, Li
2016-01-01
In this paper, we study the problem of mining frequent sequences under the rigorous differential privacy model. We explore the possibility of designing a differentially private frequent sequence mining (FSM) algorithm that can achieve both high data utility and a high degree of privacy. We found that, in differentially private FSM, the amount of required noise is proportional to the number of candidate sequences. If we could effectively reduce the number of unpromising candidate sequences, the utility-privacy tradeoff could be significantly improved. To this end, by leveraging a sampling-based candidate pruning technique, we propose a novel differentially private FSM algorithm, which is referred to as PFS2. The core of our algorithm is to utilize sample databases to further prune the candidate sequences generated based on the downward closure property. In particular, we use the noisy local support of candidate sequences in the sample databases to estimate which sequences are potentially frequent. To improve the accuracy of such private estimations, a sequence shrinking method is proposed to enforce the length constraint on the sample databases. Moreover, to decrease the probability of misestimating frequent sequences as infrequent, a threshold relaxation method is proposed to relax the user-specified threshold for the sample databases. Through formal privacy analysis, we show that our PFS2 algorithm is ε-differentially private. Extensive experiments on real datasets illustrate that our PFS2 algorithm can privately find frequent sequences with high accuracy. PMID:26973430
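A minimal sketch of the candidate-pruning idea, assuming Laplace noise on per-sample-database support counts and a relaxed threshold; the budget allocation and sequence shrinking in the actual PFS2 algorithm are more involved than shown here.

```python
# Noisy support counts on a sample database: candidates whose Laplace-noised
# support clears a relaxed threshold survive to the next round. Parameter
# names are illustrative only.
import numpy as np

rng = np.random.default_rng(5)

def is_subsequence(cand, seq):
    it = iter(seq)
    return all(item in it for item in cand)

def noisy_prune(candidates, sample_db, threshold, epsilon, relax=0.9):
    kept = []
    for cand in candidates:
        support = sum(1 for seq in sample_db if is_subsequence(cand, seq))
        noisy = support + rng.laplace(scale=1.0 / epsilon)
        if noisy >= relax * threshold:   # relaxed threshold lowers the chance
            kept.append(cand)            # of dropping a truly frequent sequence
    return kept

db = [list("abcab"), list("abc"), list("acb"), list("bca")]
print(noisy_prune([list("ab"), list("cc")], db, threshold=2, epsilon=1.0))
```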
Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P.; Kumar, G. Manoj
2012-01-01
Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a non-linear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), due to its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples as well as in related areas of forensic and biological sample analysis. PMID:22292496
Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P; Kumar Gundawar, Manoj
2012-03-20
Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system, and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a nonlinear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that the application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), because of its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis.
van der Wal, Fimme J.; Achterberg, René P.; van Solt-Smits, Conny; Bergervoet, Jan H. W.; de Weerdt, Marjanne; Wisselink, Henk J.
2017-01-01
We investigated the feasibility of an assay based on target-specific primer extension, combined with a suspension array, for the multiplexed detection and typing of a veterinary pathogen in animal samples, using Streptococcus suis as a model pathogen. A procedure was established for simultaneous detection of 6 S. suis targets in pig tonsil samples (i.e., 4 genes associated with serotype 1, 2, 7, or 9, the generic S. suis glutamate dehydrogenase gene [gdh], and the gene encoding the extracellular protein factor [epf]). The procedure was set up as a combination of protocols: DNA isolation from porcine tonsils, a multiplex PCR, a multiplex target-specific primer extension, and finally a suspension array as the readout. The resulting assay was compared with a panel of conventional PCR assays. The proposed multiplex assay can correctly identify the serotype of isolates and is capable of simultaneous detection of multiple targets in porcine tonsillar samples. The assay is not as sensitive as the current conventional PCR assays, but with the correct sampling strategy, the assay can be useful for screening pig herds to establish which S. suis serotypes are circulating in a pig population. PMID:28980519
ERIC Educational Resources Information Center
Futris, Ted G.; Nielsen, Robert B.; Barton, Allen W.
2011-01-01
The study reported here explored level of interest and preferred delivery method of Extension programming related to financial management and relationship skills education. These two subjects comprise areas of Extension that often receive less recognition but appear as pertinent issues in the lives of many individuals. Using a diverse sample of…
Shuttle-tethered satellite system definition study extension
NASA Technical Reports Server (NTRS)
1980-01-01
A system requirements definition and configuration study (Phase B) of the Tethered Satellite System (TSS) was conducted during the period 14 November 1977 to 27 February 1979. Subsequently a study extension was conducted during the period 13 June 1979 to 30 June 1980, for the purpose of refining the requirements identified during the main phase of the study, and studying in some detail the implications of accommodating various types of scientific experiments on the initial verification flight mission. An executive overview is given of the Tethered Satellite System definition developed during the study. The results of specific study tasks undertaken in the extension phase of the study are reported. Feasibility of the Tethered Satellite System has been established with reasonable confidence and the groundwork laid for proceeding with hardware design for the verification mission.
Designing and application of SAN extension interface based on CWDM
NASA Astrophysics Data System (ADS)
Qin, Leihua; Yu, Shengsheng; Zhou, Jingli
2005-11-01
As Fibre Channel (FC) becomes the protocol of choice within corporate data centers, enterprises are increasingly deploying SANs in their data centers. In order to mitigate the risk of losing data and improve data availability, more and more enterprises are adopting storage extension technologies to replicate their business-critical data to a secondary site. Transmitting this information over distance requires a carrier-grade environment with zero data loss, scalable throughput, low jitter, high security, and the ability to travel long distances. To address these business requirements, there are three basic architectures for storage extension: Storage over Internet Protocol, Storage over Synchronous Optical Network/Synchronous Digital Hierarchy (SONET/SDH), and Storage over Dense Wavelength Division Multiplexing (DWDM). Each approach varies in functionality, complexity, cost, scalability, security, availability, predictable behavior (bandwidth, jitter, latency), and multiple-carrier limitations. Compared with these connectivity technologies, Coarse Wavelength Division Multiplexing (CWDM) is a simplified, low-cost, and high-performance connectivity solution for enterprises deploying storage extension. In this paper, we design a storage extension connectivity over CWDM and test its electrical characteristics and the random read and write performance of a disk array through the CWDM connectivity; the test results show that the performance of the connectivity over CWDM is acceptable. Furthermore, we propose three kinds of network architecture for SAN extension based on the CWDM interface. Finally, the credit-based flow control mechanism of FC and the relationship between credits and extension distance are analyzed.
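The paper's credits-versus-distance relationship can be approximated with a back-of-the-envelope model: the sender needs enough buffer-to-buffer credits to cover the frames in flight over one round trip. The figures below (5 µs/km fiber delay, 2112-byte frames) are nominal assumptions, not values from the paper.

```python
# Simplified model of FC credit-based flow control: credits needed to keep a
# link of a given length busy equal the frames in flight during one round trip.
import math

def credits_needed(distance_km, rate_gbps, frame_bytes=2112):
    fiber_delay_s_per_km = 5e-6             # ~5 microseconds per km in fiber
    rtt = 2 * distance_km * fiber_delay_s_per_km
    bits_in_flight = rtt * rate_gbps * 1e9
    return math.ceil(bits_in_flight / (frame_bytes * 8))

for d in (10, 50, 100):                     # typical metro/regional distances
    print(f"{d:>4} km @ 2 Gb/s -> {credits_needed(d, 2.0)} BB credits")
```

Under these assumptions the rule of thumb of roughly one credit per kilometre at 2 Gb/s falls out directly, which is why extension distance is credit-limited on long CWDM spans.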
Botta, Gabriela; Turn, Christina S; Quintyne, Nicholas J; Kirchman, Paul A
2011-10-01
We have previously shown that copper supplementation extends the replicative life span of Saccharomyces cerevisiae when grown under conditions forcing cells to respire. We now show that copper's effect on life span is through Fet3p, a copper-containing enzyme responsible for high-affinity transport of iron into yeast cells. Life span extensions can also be obtained by supplementing the growth medium with 1 mM ferric chloride. Extension by high iron levels is still dependent on the presence of Fet3p. Life span extension by iron or copper requires growth on media containing glycerol as the sole carbon source, which forces yeast to respire. Yeast grown on glucose-containing media supplemented with iron show no extension of life span. The iron associated with cells grown in media supplemented with copper or iron is 1.4-1.8 times that of cells grown without copper or iron supplementation. As with copper supplementation, iron supplementation partially rescues the life span of superoxide dismutase mutants. Cells grown with copper supplementation display decreased production of superoxide as measured by dihydroethidium staining. Copyright © 2011 Elsevier Inc. All rights reserved.
Information Foraging and Change Detection for Automated Science Exploration
NASA Technical Reports Server (NTRS)
Furlong, P. Michael; Dille, Michael
2016-01-01
This paper presents a new algorithm for autonomous on-line exploration in unknown environments. The objective is to free remote scientists from possibly infeasible, extensive preliminary site investigation prior to sending robotic agents. We simulate a common exploration task in which an autonomous robot samples the environment at various locations and compare performance against simpler control strategies. An extension is proposed and evaluated that further permits operation in the presence of environmental variability, in which the robot encounters a change in the distribution underlying sampling targets. Experimental results indicate a strong improvement in performance across varied parameter choices for the scenario.
77 FR 59294 - Rules of Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-27
... enforcement agencies of ``documentary material, results of inspections of tangible things, written reports or...: (1) Articulating staff's authority to inspect, copy, or sample documentary material--including..., copy, or sample documentary material, including electronic media. The proposal elicited extensive...
Level of endogenous formaldehyde in maple syrup as determined by spectrofluorimetry.
Lagacé, Luc; Guay, Stéphane; Martin, Nathalie
2003-01-01
The level of endogenous formaldehyde in maple syrup was established from a large number (n = 300) of authentic maple syrup samples collected during 2000 and 2001 in the province of Quebec, Canada. The average level of formaldehyde in these authentic samples was 0.18 mg/kg in 2000 and 0.28 mg/kg in 2001, lower than previously published values. These averages can be attributed to the improved spectrofluorimetric method used for the determination. However, the formaldehyde values obtained show a relatively large distribution, with maxima observed at 1.04 and 1.54 mg/kg. These values are still below the maximum tolerance level of 2.0 mg/kg for paraformaldehyde pesticide residue. Extensive heat treatment of maple syrup samples greatly enhanced the formaldehyde concentration of the samples, suggesting that extensive heat degradation of the sap constituents during evaporation could be responsible for the highest formaldehyde values in maple syrup.
77 FR 43582 - Agency Information Collection Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-25
...-year extension of its Environment, Safety and Health reporting requirements, OMB Control Number 1910..., and to Felecia Briggs, U.S. Department of Energy, Office of Health, Safety and Security, HS-83/C-412... the following: (1) OMB No.: 1910-0300; (2) Information Collection Request Title: Environment, Safety...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-31
...] Extension of Agency Information Collection Activity Under OMB Review: Flight Training for Aliens and Other... aliens and other designated individuals seeking flight instruction (``candidates'') from Federal Aviation.... Information Collection Requirement Title: Flight Training for Aliens and Other Designated Individuals...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-17
... CONSUMER PRODUCT SAFETY COMMISSION Proposed Extension of Approval of Information Collection; Comment Request--Children's Sleepwear AGENCY: Consumer Product Safety Commission. ACTION: Notice; correction. SUMMARY: As required by the Paperwork Reduction Act of 1995 (44 U.S.C. Chapter 35), the Consumer...
78 FR 14592 - Proposed Extension of Existing Information Collection; Emergency Mine Evacuation
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-06
... DEPARTMENT OF LABOR Mine Safety and Health Administration [OMB Control No. 1219-0141] Proposed Extension of Existing Information Collection; Emergency Mine Evacuation AGENCY: Mine Safety and Health... requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration is...
Creating and Implementing Diverse Development Strategies to Support Extension Centers and Programs
ERIC Educational Resources Information Center
Page, Christopher S.; Kern, Michael A.
2018-01-01
Declining government funding for higher education requires colleges and universities to seek alternative revenue streams, including through philanthropic fund-raising. Extension-based subject matter centers and other programs can benefit from the thoughtful supplementation of traditional revenue sources with individual, corporate, and private…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-10
...] Portable Fire Extinguishers (Annual Maintenance Certification Record); Extension of the Office of Management and Budget's (OMB) Approval of the Information Collection (Paperwork) Requirements AGENCY... solicits public comments concerning its proposal to extend the Office of Management and Budget's (OMB...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-24
...] Portable Fire Extinguishers (Annual Maintenance Certification Record); Extension of the Office of Management and Budget's (OMB) Approval of the Information Collection (Paperwork) Requirements AGENCY... solicits comments concerning its proposal to extend the Office of Management and Budget's (OMB) approval of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-04
... describes: Standard operating procedures for using hazardous chemicals; hazard-control techniques; equipment...] Occupational Exposure to Hazardous Chemicals in Laboratories Standard; Extension of the Office of Management... collection requirements specified in the Standard on Occupational Exposure to Hazardous Chemicals in...
Improving Generation Y Volunteerism in Extension Programs
ERIC Educational Resources Information Center
Andrews, Kevin B.; Lockett, Landry L.
2013-01-01
Members of Generation Y have many positive attributes that make them attractive to Extension volunteer administrators as a potential source of labor. However, they think differently, have unique needs, require new management styles, and have less tolerance for unpleasant working conditions than previous generations. Additionally, they are engaged…
76 FR 3175 - Proposed Extension of Existing Information Collection; Respirator Program Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
... extension of the information collection for Respiratory Protection Program Records under 30 CFR 56.5005 and... that such equipment offers adequate protection for workers. A written respiratory protection program... require metal and nonmetal mine operators to institute a respiratory protection program governing...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-14
...] Respiratory Protection Standard; Extension of the Office of Management and Budget's (OMB) Approval of... proposal to extend OMB approval of the information collection requirements specified by the Respiratory... Respiratory Protection Standard (29 CFR 1910.134)...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-21
... accordance with requirements for a 1-year extension, the Philadelphia Area's 4th highest daily 8-hour...) The maximum 4th highest daily 8-hour monitored value at any monitoring site in the Philadelphia area...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-23
... DEPARTMENT OF LABOR Employee Benefits Security Administration Proposed Extension of Information... Beneficiaries Who Are Parties in Interest With Respect to the Plan AGENCY: Employee Benefits Security... collection requirements and provide the requested data in the desired format. The Employee Benefits Security...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-19
... Information Collection; Comment Request: Recordkeeping Requirements Under the Safety Standard for Infant...) requests comments on a proposed 3-year extension of approval of information collection for the... and Budget (OMB) previously approved the collection of information under control number 3041-0141. OMB...
A Decision Tree for Nonmetric Sex Assessment from the Skull.
Langley, Natalie R; Dudzik, Beatrix; Cloutier, Alesia
2018-01-01
This study uses five well-documented cranial nonmetric traits (glabella, mastoid process, mental eminence, supraorbital margin, and nuchal crest) and one additional trait (zygomatic extension) to develop a validated decision tree for sex assessment. The decision tree was built and cross-validated on a sample of 293 U.S. White individuals from the William M. Bass Donated Skeletal Collection. Ordinal scores from the six traits were analyzed using the partition modeling option in JMP Pro 12. A holdout sample of 50 skulls was used to test the model. The most accurate decision tree includes three variables: glabella, zygomatic extension, and mastoid process. This decision tree yielded 93.5% accuracy on the training sample, 94% on the cross-validated sample, and 96% on a holdout validation sample. Linear weighted kappa statistics indicate acceptable agreement among observers for these variables. Mental eminence should be avoided, and definitions and figures should be referenced carefully to score nonmetric traits. © 2017 American Academy of Forensic Sciences.
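A hypothetical re-creation of the approach on synthetic ordinal scores (the study itself used JMP Pro's partition modeling on the Bass Collection sample, not scikit-learn):

```python
# Decision tree for sex assessment from ordinal cranial trait scores (1-5).
# Data are synthetic stand-ins for glabella, zygomatic extension, and
# mastoid process; accuracies will not match the published figures.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(6)
n = 293
sex = rng.integers(0, 2, size=n)              # 0 = female, 1 = male
shift = sex[:, None] * 1.2                    # males tend to score higher
scores = np.clip(np.round(rng.normal(2.0, 0.8, size=(n, 3)) + shift), 1, 5)

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
acc = cross_val_score(tree, scores, sex, cv=10).mean()
print(f"cross-validated accuracy on synthetic scores: {acc:.2f}")
```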
NASA Technical Reports Server (NTRS)
Welzenbach, L. C.; McCoy, T. J.; Glavin, D. P.; Dworkin, J. P.; Abell, P. A.
2012-01-01
While much of the scientific community's current attention is drawn to sample return missions, it is the existing meteorite and cosmic dust collections that both provide the paradigms to be tested by these missions and the context for interpreting the results. Recent sample returns from the Stardust and Hayabusa missions provided us with new materials and insights into our Solar System's history and processes. As an example, Stardust sampled CAIs among the population of cometary grains, requiring extensive and unexpected radial mixing in the early solar nebula. This finding would not have been possible, however, without extensive studies of meteoritic CAIs that established their high-temperature, inner Solar System formation. Samples returned by Stardust also revealed the first evidence of a cometary amino acid, a discovery that would not have been possible with current in situ flight instrument technology. The Hayabusa mission provided the final evidence linking ordinary chondrites and S asteroids, a hypothesis that developed from centuries of collection and laboratory and ground-based telescopic studies. In addition to these scientific findings, studies of existing meteorite collections have defined and refined the analytical techniques essential to studying returned samples. As an example, the fortuitous fall of the Allende CV3 and Murchison CM2 chondrites within months before the return of Apollo samples allowed testing of new state-of-the-art analytical facilities. The results of those studies not only prepared us to better study lunar materials, but unanticipated discoveries changed many of our concepts about the earliest history and processes of the solar nebula. This synergy between existing collections and future space exploration is certainly not limited to sample return missions. Laboratory studies confirmed the existence of meteorites from Mars and raised the provocative possibility of preservation of ancient microbial life. These laboratory studies in turn led to a new wave of Mars exploration that ultimately could lead to sample return focused on evidence for past or present life. This partnership between collections and missions will be increasingly important in the coming decades as we discover new questions to be addressed and identify targets for both robotic and human exploration. Nowhere is this more true than in the ultimate search for the abiotic and biotic processes that produced life. Existing collections also provide the essential materials for developing and testing new analytical schemes to detect the rare markers of life and distinguish them from abiotic processes. Large collections of meteorites and the new types being identified within these collections, which come to us at a fraction of the cost of a sample return mission, will continue to shape the objectives of future missions and provide new ways of interpreting returned samples.
Use of a probabilistic neural network to reduce costs of selecting construction rock
Singer, Donald A.; Bliss, James D.
2003-01-01
Rocks used as construction aggregate in temperate climates deteriorate to differing degrees because of repeated freezing and thawing. The magnitude of the deterioration depends on the rock's properties. Aggregate, including crushed carbonate rock, is required to have minimum geotechnical qualities before it can be used in asphalt and concrete. In order to reduce the chances of premature and expensive repairs, extensive freeze-thaw tests are conducted on potential construction rocks. These tests typically involve 300 freeze-thaw cycles and can take four to five months to complete. Less time-consuming tests that (1) predict durability as well as the extended freeze-thaw test or that (2) reduce the number of rocks subject to the extended test could save considerable amounts of money. Here we use a probabilistic neural network to try to predict durability as determined by the freeze-thaw test, using four rock properties measured on 843 limestone samples from the Kansas Department of Transportation. Modified freeze-thaw tests and less time-consuming specific gravity (dry), specific gravity (saturated), and modified absorption tests were conducted on each sample. Durability factors of 95 or more as determined from the extensive freeze-thaw tests are viewed as acceptable—rocks with values below 95 are rejected. If only the modified freeze-thaw test is used to predict which rocks are acceptable, about 45% are misclassified. When 421 randomly selected samples and all four standardized and scaled variables were used to train a probabilistic neural network, the rate of misclassification of 422 independent validation samples dropped to 28%. The network was trained so that each class (group) and each variable had its own coefficient (sigma). In an attempt to reduce errors further, an additional class was added to the training data to predict durability values greater than 84 and less than 98, resulting in only 11% of the samples misclassified. About 43% of the test data were classed by the neural net into the middle group—these rocks should be subject to full freeze-thaw tests. Thus, use of the probabilistic neural network would mean that the extended test would only need to be applied to 43% of the samples, and 11% of the rocks classed as acceptable would fail early.
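The PNN used in the study is, in essence, a Parzen-window classifier; a minimal sketch on synthetic standardized rock properties, with per-class smoothing parameters, is given below (the study tuned a sigma per class and per variable, which is simplified here to one sigma per class).

```python
# Minimal probabilistic neural network: each class gets a Gaussian kernel
# density over its training samples, and a test rock is assigned to the
# class with the highest estimated density.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigmas):
    classes = np.unique(y_train)
    scores = np.empty((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        scores[:, j] = np.exp(-d2 / (2 * sigmas[j] ** 2)).mean(axis=1)
    return classes[np.argmax(scores, axis=1)]

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 4))                  # 4 standardized rock properties
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # 1 = durable, 0 = reject
pred = pnn_predict(X[:80], y[:80], X[80:], sigmas=[0.5, 0.5])
print(f"holdout accuracy: {(pred == y[80:]).mean():.2f}")
```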
A physics-based algorithm for the estimation of bearing spall width using vibrations
NASA Astrophysics Data System (ADS)
Kogan, G.; Klein, R.; Bortman, J.
2018-05-01
Evaluation of the damage severity in a mechanical system is required for the assessment of its remaining useful life. In rotating machines, bearings are crucial components; hence, the estimation of the size of spalls in bearings is important for prognostics of the remaining useful life. Recently, this topic has been extensively studied, and many of the methods used for the estimation of spall size are based on the analysis of vibrations. A new tool is proposed in the current study for the estimation of the spall width on the outer ring raceway of a rolling element bearing. The understanding and analysis of the dynamics of the rolling element-spall interaction enabled the development of a generic and autonomous algorithm. The algorithm is generic in the sense that it does not require any human intervention to make adjustments for each case. All of the algorithm's parameters are defined by analytical expressions describing the dynamics of the system. The required conditions, such as sampling rate and spall width and depth, defining the feasible region of such algorithms are analyzed in the paper. The algorithm's performance was demonstrated with experimental data for different spall widths.
A public resource facilitating clinical use of genomes
Ball, Madeleine P.; Thakuria, Joseph V.; Zaranek, Alexander Wait; Clegg, Tom; Rosenbaum, Abraham M.; Wu, Xiaodi; Angrist, Misha; Bhak, Jong; Bobe, Jason; Callow, Matthew J.; Cano, Carlos; Chou, Michael F.; Chung, Wendy K.; Douglas, Shawn M.; Estep, Preston W.; Gore, Athurva; Hulick, Peter; Labarga, Alberto; Lee, Je-Hyuk; Lunshof, Jeantine E.; Kim, Byung Chul; Kim, Jong-Il; Li, Zhe; Murray, Michael F.; Nilsen, Geoffrey B.; Peters, Brock A.; Raman, Anugraha M.; Rienhoff, Hugh Y.; Robasky, Kimberly; Wheeler, Matthew T.; Vandewege, Ward; Vorhaus, Daniel B.; Yang, Joyce L.; Yang, Luhan; Aach, John; Ashley, Euan A.; Drmanac, Radoje; Kim, Seong-Jin; Li, Jin Billy; Peshkin, Leonid; Seidman, Christine E.; Seo, Jeong-Sun; Zhang, Kun; Rehm, Heidi L.; Church, George M.
2012-01-01
Rapid advances in DNA sequencing promise to enable new diagnostics and individualized therapies. Achieving personalized medicine, however, will require extensive research on highly reidentifiable, integrated datasets of genomic and health information. To assist with this, participants in the Personal Genome Project choose to forgo privacy via our institutional review board-approved “open consent” process. The contribution of public data and samples facilitates both scientific discovery and standardization of methods. We present our findings after enrollment of more than 1,800 participants, including whole-genome sequencing of 10 pilot participant genomes (the PGP-10). We introduce the Genome-Environment-Trait Evidence (GET-Evidence) system. This tool automatically processes genomes and prioritizes both published and novel variants for interpretation. In the process of reviewing the presumed healthy PGP-10 genomes, we find numerous literature references implying serious disease. Although it is sometimes impossible to rule out a late-onset effect, stringent evidence requirements can address the high rate of incidental findings. To that end, we develop a peer production system for recording and organizing variant evaluations according to standard evidence guidelines, creating a public forum for reaching consensus on interpretation of clinically relevant variants. Genome analysis becomes a two-step process: using a prioritized list to record variant evaluations, then automatically sorting reviewed variants using these annotations. Genome data, health and trait information, participant samples, and variant interpretations are all shared in the public domain—we invite others to review our results using our participant samples and contribute to our interpretations. We offer our public resource and methods to further personalized medical research. PMID:22797899
Zielinski, R.A.; Lindsey, D.A.; Rosholt, J.N.
1980-01-01
The distribution and mobility of uranium in a diagenetically altered, 8 Ma old tuff in the Keg Mountain area, Utah, are modelled in this study. The modelling represents an improvement over similar earlier studies in that it: (1) considers a large number of samples (76) collected with good geologic control and exhibiting a wide range of alteration; (2) includes radiometric data for Th, K and RaeU (radium equivalent uranium) as well as U; (3) considers mineralogic and trace-element data for the same samples; and (4) analyzes the mineral and chemical covariation by multivariate statistical methods. The variation of U in the tuff is controlled mainly by its primary abundance in glass and by the relative abundance of non-uraniferous detritus and uraniferous accessory minerals. Alteration of glass to zeolite, even though extensive, caused no large or systematic change in the bulk concentration of U in the tuff. Some redistribution of U during diagenesis is indicated by association of U with minor alteration products such as opal and hydrous Fe-Mn oxide minerals. Isotopic studies indicate that the zeolitized tuff has been open to migration of U decay products during the last 0.8 Ma. The tuff of Keg Mountain has not lost a statistically detectable fraction of its original U, even though it has a high (~9 ppm) trace U content and has been extensively altered to zeolite. Similar studies in a variety of geological environments are required in order to identify the particular combination of conditions most favorable for liberation and migration of U from tuffs. © 1980.
Seed dispersal at alpine treeline: long distance dispersal maintains alpine treelines
NASA Astrophysics Data System (ADS)
Johnson, J. S.; Gaddis, K. D.; Cairns, D. M.; Krutovsky, K.
2016-12-01
Alpine treelines are expected to advance to higher elevations in conjunction with global warming. Nevertheless, the importance of reproductive method and seed dispersal distances at the alpine treeline ecotone remains unresolved. We address two research questions at mountain hemlock treelines on the Kenai Peninsula, Alaska: (1) What is the primary mode of reproduction, and (2) are recruits derived from local treeline populations or are they arriving from more distant seed sources? We addressed our research questions by exhaustively sampling mountain hemlock individuals along a single mountain slope and then genotyped DNA single nucleotide polymorphisms using a genotyping by sequencing approach (ddRAD Seq). First we assessed mode of reproduction by determining the proportion of sampled individuals with identical multilocus genotypes that are the product of clonal reproduction. Second, we used a categorical allocation based parentage analysis to identify parent-offspring pairs, so that the proportion of treeline reproduction events could be quantified spatially and dispersal distance measured. We identified sexual reproduction as the primary mode of reproduction at our study site. Seedling establishment was characterized by extensive cryptic seed dispersal and gene flow into the ecotone. The average dispersal distance was 73 meters with long distance dispersal identified as dispersal occurring at distances greater than 450 meters. We show that production of seeds within the alpine treeline ecotone is not a necessary requirement for treelines to advance to higher elevations in response to climate change. The extensive cryptic seed dispersal and gene flow into the alpine treeline ecotone is likely sufficient to propel the ecotone higher under more favorable climate.
Skagen, Susan K.; Granfors, Diane A.; Melcher, Cynthia P.
2008-01-01
Conservation challenges enhance the need for quantitative information on dispersed bird populations in extensive landscapes, for techniques to monitor populations and assess environmental effects, and for conservation strategies at appropriate temporal and spatial scales. By estimating population sizes of shorebirds in the U.S. portion of the prairie pothole landscape in central North America, where most migrating shorebirds exhibit a highly dispersed spatial pattern, we determined that the region may play a vital role in the conservation of shorebirds. During northward and southward migration, 7.3 million shorebirds (95% CI: 4.3–10.3 million) and 3.9 million shorebirds (95% CI: 1.7–6.0 million) stopped to rest and refuel in the study area; inclusion of locally breeding species increases the estimates by 0.1 million and 0.07 million shorebirds, respectively. Seven species of calidridine sandpipers, including Semipalmated Sandpipers (Calidris pusilla), White-rumped Sandpipers (C. fuscicollis), and Stilt Sandpipers (C. himantopus), constituted 50% of northbound migrants in our study area. We present an approach to population estimation and monitoring, based on stratified random selection of townships as sample units, that is well suited to 11 migratory shorebird species. For extensive and dynamic wetland systems, we strongly caution against a monitoring program based solely on repeated counts of known stopover sites with historically high numbers of shorebirds. We recommend refinements in methodology to address sample-size requirements and potential sources of bias so that our approach may form the basis of a rigorous migration monitoring program in this and other prairie wetland regions.
77 FR 44262 - Federal Property Suitable as Facilities To Assist the Homeless
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-27
....; office; major repairs required; extensive mold & asbestos located beneath the bldg.; remediation required...: Agricultural surroundings; remedial action has been taken for asbestos removal. Unsuitable Properties Building...
Serotyping of Streptococcus pneumoniae Based on Capsular Genes Polymorphisms
Raymond, Frédéric; Boucher, Nancy; Allary, Robin; Robitaille, Lynda; Lefebvre, Brigitte; Tremblay, Cécile
2013-01-01
Streptococcus pneumoniae serotype epidemiology is essential since serotype replacement is a concern when introducing new polysaccharide-conjugate vaccines. A novel PCR-based automated microarray assay was developed to assist in the tracking of the serotypes. Autolysin, pneumolysin and eight genes located in the capsular operon were amplified using multiplex PCR. This step was followed by a tagged fluorescent primer extension step targeting serotype-specific polymorphisms. The tagged primers were then hybridized to a microarray. Results were exported to an expert system to identify capsular serotypes. The assay was validated on 166 cultured S. pneumoniae samples from 63 different serotypes as determined by the Quellung method. We show that typing only 12 polymorphisms located in the capsular operon allows the identification at the serotype level of 22 serotypes and the assignation of 24 other serotypes to a subgroup of serotypes. Overall, 126 samples (75.9%) were correctly serotyped, 14 were assigned to a member of the same serogroup, 8 rare serotypes were erroneously serotyped, and 18 gave negative serotyping results. Most of the discrepancies involved rare serotypes or serotypes that are difficult to discriminate using a DNA-based approach, for example 6A and 6B. The assay was also tested on clinical specimens including 43 cerebrospinal fluid samples from patients with meningitis and 59 nasopharyngeal aspirates from bacterial pneumonia patients. Overall, 89% of specimens positive for pneumolysin were serotyped, demonstrating that this method does not require culture to serotype clinical specimens. The assay showed no cross-reactivity for 24 relevant bacterial species found in these types of samples. The limit of detection for serotyping and S. pneumoniae detection was 100 genome equivalent per reaction. This automated assay is amenable to clinical testing and does not require any culturing of the samples. The assay will be useful for the evaluation of serotype prevalence changes after new conjugate vaccines introduction. PMID:24086706
Stöckel, Stephan; Meisel, Susann; Elschner, Mandy; Melzer, Falk; Rösch, Petra; Popp, Jürgen
2015-01-01
Burkholderia mallei (the etiologic agent of glanders in equines and rarely humans) and Burkholderia pseudomallei, causing melioidosis in humans and animals, are designated category B biothreat agents. The intrinsically high resistance of both agents to many antibiotics, their potential use as bioweapons, and their low infectious dose necessitate rapid and accurate detection methods. Current methods to identify these organisms may require up to 1 week, as they rely on phenotypic characteristics and an extensive set of biochemical reactions. In this study, Raman microspectroscopy, a cultivation-independent typing technique for single bacterial cells with the potential for being a rapid point-of-care analysis system, is evaluated to identify and differentiate B. mallei and B. pseudomallei within hours. Here, not only broth-cultured microbes but also bacteria isolated from pelleted animal feedstuff were taken into account. A database of Raman spectra allowed the calculation of classification functions, which were trained to differentiate Raman spectra not only of both pathogens but also of five further Burkholderia spp. and four species of the closely related genus Pseudomonas. The developed two-stage classification system comprising two support vector machine (SVM) classifiers was then challenged with a test set of 11 samples to simulate a real-world scenario in which "unknown samples" are to be identified. In the end, all test set samples were identified correctly, even when the contained bacterial strains were not incorporated in the database beforehand or were isolated from animal feedstuff. Specifically, the five test samples bearing B. mallei and B. pseudomallei were correctly identified at the species level with accuracies between 93.9 and 98.7%. The sample analysis itself requires no biomass enrichment step prior to the analysis and can be performed under biosafety level 1 (BSL 1) conditions after inactivating the bacteria with formaldehyde.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
... approved information collection, the List Sampling Frame Surveys. Revision to burden hours will be needed due to changes in the size of the target population, sampling design, and/or questionnaire length... Agriculture, (202) 720-4333. SUPPLEMENTARY INFORMATION: Title: List Sampling Frame Surveys. OMB Control Number...
PCBs were used extensively in school building materials (caulk and lighting fixture ballasts) during the approximate period of 1950-1978. Most of the schools built nationwide during this period have not had indoor air sampling conducted for PCBs. Passive air sampling holds promi...
Feature Sampling in Detection: Implications for the Measurement of Perceptual Independence
ERIC Educational Resources Information Center
Macho, Siegfried
2007-01-01
The article presents the feature sampling signal detection (FS-SDT) model, an extension of the multivariate signal detection (SDT) model. The FS-SDT model assumes that, because of attentional shifts, different subsets of features are sampled for different presentations of the same multidimensional stimulus. Contrary to the SDT model, the FS-SDT…
NASA Astrophysics Data System (ADS)
Brune, S.
2016-12-01
The Gulf of California formed by oblique divergence across the Pacific-North America plate boundary. This presentation combines numerical forward modeling and plate tectonic reconstructions in order to address two important aspects of rift dynamics: (1) Plate motions during continental rifting are decisively controlled by the non-linear decay of rift strength. This conclusion is based on a recent plate-kinematic analysis of post-Pangea rift systems (Central Atlantic, South Atlantic, Iberia/Newfoundland, Australia/Antarctica, North Atlantic, South China Sea). In all cases, continental rifting starts with a slow phase followed by an abrupt acceleration within a few My introducing a fast rift phase. Numerical forward modeling with force boundary conditions shows that the two-phase velocity behavior and the rapid speed-up during rifting are intrinsic features of continental rupture that can be robustly inferred for different crust and mantle rheologies. (2) Rift strength depends on the obliquity of the rift system: the force required to maintain a given rift velocity can be computed from simple analytical and more realistic numerical models alike, and both modeling approaches demonstrate that less force is required to perpetuate oblique extension. The reason is that plastic yielding requires a smaller plate boundary force when extension is oblique to the rift trend. Comparing strike-slip and pure extension end-member scenarios, it can be shown that about 50% less force is required to deform the lithosphere under strike-slip. This result implies that rift systems involving significant obliquity are mechanically preferred. These two aspects shed new light on the underlying geodynamic causes of Gulf of California rift history. Continental extension is thought to have started in Late Eocene/Oligocene times as part of the southern Basin and Range Province and evolved in a protracted history at low extension rates (≤15 mm/yr). However, with a direction change in Baja California microplate motion 13-6 My ago, plate divergence drastically increased its obliquity, which reduced the rift's mechanical resistance to extension. This effective loss of rift strength sparked an acceleration of the Gulf of California rift and ultimately enabled today's divergence velocities of more than 45 mm/yr.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-16
... population, sampling design, and/or questionnaire length. Some of the vegetable production surveys will incorporate sampling of the total population of producers, while the processing surveys will involve a total...
Sadones, Nele; Archer, John R H; Ingels, Ann-Sofie M E; Dargan, Paul I; Wood, David M; Wood, Michelle; Neels, Hugo; Lambert, Willy E; Stove, Christophe P
2015-04-01
Gamma-hydroxybutyric acid (GHB) is a well-known illicit club and date-rape drug. Dried blood spot (DBS) sampling is a promising alternative to classical venous sampling in cases of (suspected) GHB intoxication since it allows rapid sampling, which is of particular interest for the extensively metabolized GHB. However, there are limited data on whether, and how, capillary DBS concentrations correlate with venous concentrations. We conducted a comparative study in 50 patients with suspected GHB intoxication to determine and correlate GHB concentrations in venous DBS (vDBS) and capillary DBS (cDBS). This is the first study to evaluate, in a large cohort, the correlation between capillary and venous concentrations of an illicit drug in real-life samples. Of the 50 paired samples, 7 were excluded: in 3 cases the vDBS concentration was below the LLOQ of 2 µg/mL, and 4 samples were excluded after visual inspection of the DBS. Bland-Altman analysis revealed a mean percentage difference of -2.8% between cDBS and vDBS concentrations, with zero included in the 95% confidence interval of the mean difference in GHB concentration. A paired-sample t-test confirmed this observation (p = 0.17). The requirement for incurred-sample reproducibility was also fulfilled: for more than two-thirds of the samples, the concentrations obtained in cDBS and in vDBS were within 20% of their mean. Since equivalent concentrations were observed in cDBS and vDBS, blood obtained by fingerprick can be considered a valid alternative to venous blood for GHB determination. Copyright © 2015 John Wiley & Sons, Ltd.
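The agreement statistics reported above (Bland-Altman on percentage differences, a paired t-test, and the 20% incurred-sample criterion) follow standard definitions; a minimal sketch with placeholder data might look as follows. The concentration values are invented for illustration and are not the study's data.

```python
# Sketch of the agreement analysis described above for capillary vs venous
# DBS GHB concentrations. The paired values are placeholders; the study
# itself analyzed 43 valid pairs.
import numpy as np
from scipy import stats

cdbs = np.array([45.0, 80.0, 120.0, 60.0, 95.0])  # capillary DBS, ug/mL
vdbs = np.array([47.0, 78.0, 125.0, 62.0, 93.0])  # venous DBS, ug/mL

# Bland-Altman on percentage differences relative to the pair mean.
mean_pair = (cdbs + vdbs) / 2
pct_diff = 100 * (cdbs - vdbs) / mean_pair
bias = pct_diff.mean()
loa = bias + np.array([-1.96, 1.96]) * pct_diff.std(ddof=1)
ci_half = stats.t.ppf(0.975, len(pct_diff) - 1) * stats.sem(pct_diff)
print(f"bias {bias:.1f}% (95% CI {bias - ci_half:.1f} to {bias + ci_half:.1f}), LoA {loa}")

# Paired t-test on the raw concentrations.
t_stat, p_val = stats.ttest_rel(cdbs, vdbs)
print(f"paired t-test p = {p_val:.2f}")

# Incurred-sample reproducibility: each pair within 20% of its mean.
within = np.abs(cdbs - vdbs) <= 0.2 * mean_pair
print(f"{within.mean():.0%} of pairs within 20% of their mean")
```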
NASA Technical Reports Server (NTRS)
1979-01-01
A plan for the production of two PEP flight systems is defined. The task's milestones are described. Provisions for the development and assembly of new ground support equipment required for both testing and launch operations are included.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-25
...] Asbestos in Construction Standard; Extension of the Office of Management and Budget's (OMB) Approval of... proposal to extend OMB's approval of the information collection requirements contained in the Asbestos in... occupational exposure to asbestos, including lung cancer, mesothelioma, asbestosis (an emphysema-like condition...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-30
...] Standard on Hazardous Waste Operations and Emergency Response (HAZWOPER); Extension of the Office of Management and Budget's (OMB) Approval of Information Collection (Paperwork) Requirements AGENCY... solicits public comments concerning its proposal to extend the Office of Management and Budget's (OMB...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-26
...] Mechanical Power Presses Standard; Extension of the Office of Management and Budget's (OMB) Approval of the... proposal to extend OMB approval of the information collection requirements specified in the Mechanical...). The collections of information contained in the Mechanical Power...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
... CONSUMER PRODUCT SAFETY COMMISSION [Docket No. CPSC-2009-0102] Collection of Information; Proposed Extension of Approval; Comment Request--Follow-Up Activities for Product-Related Injuries AGENCY: Consumer Product Safety Commission. ACTION: Notice. SUMMARY: As required by the Paperwork Reduction Act of 1995...