21 CFR 111.80 - What representative samples must you collect?
Code of Federal Regulations, 2010 CFR
2010-04-01
... Process Control System § 111.80 What representative samples must you collect? The representative samples... unique lot within each unique shipment); (b) Representative samples of in-process materials for each manufactured batch at points, steps, or stages, in the manufacturing process as specified in the master...
Theory of sampling: four critical success factors before analysis.
Wagner, Claas; Esbensen, Kim H
2015-01-01
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shine, E. P.; Poirier, M. R.
2013-10-29
Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and statisticians used carefully thought out designs that systematically and economically provided plans for data collection from the DWPF process. Key shared features of the sampling designs used at DWPF and the Gy sampling methodology were the specification of a standard for sample representativeness, an investigation that produced data from the process to study the sampling function, and a decision framework used to assess whether the specification was met based on the data. Without going into detail with regard to the seven errors identified by Pierre Gy, as excellent summaries are readily available such as Pitard [1989] and Smith [2001], SRS engineers understood, for example, that samplers can be biased (Gy's extraction error), and developed plans to mitigate those biases. Experiments that compared installed samplers with more representative samples obtained directly from the tank may not have resulted in systematically partitioning sampling errors into the now well-known error categories of Gy, but did provide overall information on the suitability of sampling systems. Most of the designs in this report are related to the DWPF vessels, not the large SRS Tank Farm tanks. Samples from the DWPF Slurry Mix Evaporator (SME), which contains the feed to the DWPF melter, are characterized using standardized analytical methods with known uncertainty. The analytical error is combined with the established error from sampling and processing in DWPF to determine the melter feed composition.
This composition is used with the known uncertainty of the models in the Product Composition Control System (PCCS) to ensure that the wasteform that is produced is comfortably within the acceptable processing and product performance region. Having the advantage of many years of processing that meets the waste glass product acceptance criteria, the DWPF process has provided a considerable amount of data about itself in addition to the data from many special studies. Demonstrating representative sampling directly from the large Tank Farm tanks is a difficult, if not unsolvable, enterprise due to limited accessibility. However, the consistency and the adequacy of sampling and mixing at SRS could at least be studied under the controlled process conditions based on samples discussed by Ray and others [2012a] in Waste Form Qualification Report (WQR) Volume 2 and the transfers from Tanks 40H and 51H to the Sludge Receipt and Adjustment Tank (SRAT) within DWPF. It is important to realize that the need for sample representativeness becomes more stringent as the material gets closer to the melter, and the tanks within DWPF have been studied extensively to meet those needs.
NASA Technical Reports Server (NTRS)
Hudson, Nicolas; Lin, Ying; Barengoltz, Jack
2010-01-01
A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario where multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, as well as making the analysis tractable by breaking the process down into small analyzable steps.
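The per-step bookkeeping described above (a release probability for each component plus a component-to-component transport matrix acting on expected VEM counts) can be written as a short matrix recursion. The sketch below is a minimal, generic illustration of that recursion on synthetic numbers; the component list, release probabilities, transport values, and step count are invented placeholders, not values from the mission analysis.

```python
import numpy as np

# Minimal sketch of the Markov-chain bookkeeping described in the abstract.
# Components, release probabilities, and transport probabilities are
# illustrative placeholders, not values from the actual mission analysis.

components = ["drill_bit", "sample_tube", "robot_arm", "sample"]

# Expected number of viable Earth microorganisms (VEMs) on each component
# at the start of the sampling process.
vem = np.array([10.0, 1.0, 50.0, 0.0])

# p_release[i]: probability that a VEM on component i is released in one step.
p_release = np.array([1e-3, 1e-4, 1e-2, 0.0])

# transport[i, j]: probability that a VEM released from component i lands on
# component j (mass not landing anywhere is lost to the environment).
transport = np.array([
    [0.0, 0.2, 0.1, 0.05],
    [0.1, 0.0, 0.1, 0.10],
    [0.0, 0.1, 0.0, 0.01],
    [0.0, 0.0, 0.0, 0.00],
])

n_steps = 20
for _ in range(n_steps):
    released = vem * p_release      # expected VEMs leaving each component
    gained = released @ transport   # expected VEMs arriving at each component
    vem = vem - released + gained   # one discrete time step of the recursion

print(dict(zip(components, np.round(vem, 4))))
print("Expected VEMs reaching the sample:", round(float(vem[components.index("sample")]), 6))
```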
Deciphering Martian climatic history using returned samples
NASA Technical Reports Server (NTRS)
Paige, D. A.; Krieger, D. B.; Brigham, C. A.
1988-01-01
By necessity, a Mars sample return mission must sample the upper few meters of the Martian surface. This material was subjected to a wide variety of physical processes. Presently, the most important processes are believed to be wind-driven erosion and deposition, and water ice accumulation at higher latitudes. A sample return mission represents an opportunity to better understand and quantify these important geological processes. By obtaining sample cores at key locations, it may be possible to interpret much of recent Martian climatic history.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, V.E.
Naturally occurring radioactivity was measured in the atmospheric emissions and process materials of a thermal phosphate (elemental phosphorus) plant. Representative exhaust stack samples were collected from each process in the plant. The phosphate ore contained 12 to 20 parts per million uranium. Processes, emission points, and emission controls are described. Radioactivity concentrations and emission rates from the sources sampled are given.
Data processing 1: Advancements in machine analysis of multispectral data
NASA Technical Reports Server (NTRS)
Swain, P. H.
1972-01-01
Multispectral data processing procedures are outlined, beginning with the data display process used to accomplish data editing and proceeding through clustering, a feature selection criterion for error probability estimation, sample clustering, and sample classification. Formulating a three-stage sampling model for evaluating crop acreage estimates makes effective use of large quantities of remote sensing data and improves the determination of the cost-benefit relationship associated with remote sensing technology.
Native microflora in fresh-cut processing plants and their potentials of biofilm formation
USDA-ARS?s Scientific Manuscript database
Representative food contact and non-food contact surfaces in two mid-sized fresh cut processing facilities were sampled for microbiological analyses post routine daily sanitization. Mesophilic and psychrotrophic bacteria on the sampled surfaces were isolated by plating on non-selective bacterial med...
Inhibition Of Molecular And Biological Processes Using Modified Oligonucleotides
Kozyavkin, Sergei A.; Malykh, Andrei G.; Polouchine, Nikolai N.; Slesarev, Alexei I.
2003-04-15
A method of inhibiting at least one molecular process in a sample, comprising administering to the sample an oligonucleotide or polynucleotide containing at least one monomeric unit having formula (I): wherein A is an organic moiety, n is at least 1, and each X is independently selected from the group consisting of --NRCOCONu, --NHCOCR₂CR₂CONu, --NHCOCR=CRCONu, and --NHCOSSCONu, wherein each R independently represents H or a substituted or unsubstituted alkyl group, and Nu represents a nucleophile, or a salt of the compound.
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-process phosphoric acid process line. You can use existing plant procedures that are used for accounting... the process line. Conduct the representative bulk sampling using the applicable standard method in the...
ERIC Educational Resources Information Center
Staats, Arthur W.
Psychological researchers should deal with the concrete stimulus-response principles of learning on which behavior is based, and study behaviors that are representative of real life behaviors. The present research strategy has come from two faulty ideas: first, a concern with underlying, inferred mental processes, rather than with actual tasks or…
Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.
Blutke, Andreas; Wanke, Rüdiger
2018-03-06
In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
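The volume-weighted systematic random sampling and point-counting steps mentioned above follow standard stereological practice. The sketch below is a minimal, generic illustration of those two ideas, a Cavalieri-style volume estimate from point counts on equidistant sections and a random-start, fixed-interval selection of sampling locations; the section spacing, grid area, and counts are made-up example values, not figures from the published guidelines.

```python
import random

# Minimal sketch of point-counting volume estimation and systematic uniform
# random sampling. All numbers are illustrative, not values from the
# published porcine biobank protocol.

def cavalieri_volume(point_counts, section_spacing_mm, area_per_point_mm2):
    """Estimate volume from point counts on equidistant sections:
    V ~= t * a(p) * sum(P_i), with t the section spacing, a(p) the area
    associated with one grid point, and P_i the points hitting tissue."""
    return section_spacing_mm * area_per_point_mm2 * sum(point_counts)

def systematic_random_sample(items, n_samples):
    """Systematic uniform random sampling: a random start followed by a
    fixed sampling interval through the ordered list of locations."""
    interval = len(items) / n_samples
    start = random.uniform(0, interval)
    return [items[int(start + k * interval)] for k in range(n_samples)]

# Example: 8 sections, 5 mm apart, 4 mm^2 associated with each grid point.
counts = [12, 18, 25, 30, 28, 22, 15, 9]
print("Estimated organ volume (mm^3):", cavalieri_volume(counts, 5.0, 4.0))
print("Sampled slab indices:", systematic_random_sample(list(range(40)), 5))
```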
Sampling and analysis plan for sludge located on the floor and in the pits of the 105-K basins
DOE Office of Scientific and Technical Information (OSTI.GOV)
BAKER, R.B.
1998-11-20
This Sampling and Analysis Plan (SAP) provides direction for the sampling of the sludge found on the floor and in the remote pits of the 105-K Basins to provide: (1) basic data for the sludges that have not been characterized to date and (2) representative sludge material for process tests to be made by the SNF Project/K Basins sludge treatment process subproject. The sampling equipment developed will remove representative samples of the radioactive sludge from underwater at the K Basins, depositing them in shielded containers for transport to the Hanford Site laboratories. Included in the present document is the basic background logic for selection of the samples to meet the requirements established in the Data Quality Objectives (DQO), HNF-2033, for this sampling activity. The present document also includes the laboratory analyses, methods, procedures, and reporting that will be required to meet the DQO.
Sampling Designs in Qualitative Research: Making the Sampling Process More Public
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.; Leech, Nancy L.
2007-01-01
The purpose of this paper is to provide a typology of sampling designs for qualitative researchers. We introduce the following sampling strategies: (a) parallel sampling designs, which represent a body of sampling strategies that facilitate credible comparisons of two or more different subgroups that are extracted from the same levels of study;…
Cryogenic homogenization and sampling of heterogeneous multi-phase feedstock
Doyle, Glenn Michael; Ideker, Virgene Linda; Siegwarth, James David
2002-01-01
An apparatus and process for producing a homogeneous analytical sample from a heterogeneous feedstock by: providing the mixed feedstock, reducing the temperature of the feedstock to a temperature below a critical temperature, reducing the size of the feedstock components, blending the reduced-size feedstock to form a homogeneous mixture; and obtaining a representative sample of the homogeneous mixture. The size reduction and blending steps are performed at temperatures below the critical temperature in order to retain organic compounds in the form of solvents, oils, or liquids that may be adsorbed onto or absorbed into the solid components of the mixture, while also improving the efficiency of the size reduction. Preferably, the critical temperature is less than 77 K (−196 °C). Further, with the process of this invention the representative sample may be maintained below the critical temperature until being analyzed.
Vision Research Literature May Not Represent the Full Intellectual Range of Autism Spectrum Disorder
Brown, Alyse C.; Chouinard, Philippe A.; Crewther, Sheila G.
2017-01-01
Sensory, in particular visual, processing is recognized as often perturbed in individuals with Autism Spectrum Disorder (ASD). However, in terms of the literature pertaining to visual processing, individuals in the normal intelligence range (IQ = 90–110) and above are more frequently represented in study samples than individuals who score below normal, in the borderline intellectual disability (ID) (IQ = 71–85) to ID (IQ < 70) ranges. This raises concerns as to whether or not current research is generalizable to a disorder that is often co-morbid with ID. Thus, the aim of this review is to better understand to what extent the current ASD visual processing literature is representative of the entire ASD population as either diagnosed or recognized under DSM-5. Our recalculation of ASD prevalence figures, using the criteria of DSM-5, indicates approximately 40% of the ASD population are likely to be ID, although searching of the visual processing literature in ASD up to July 2016 showed that only 20% of papers included the ASD with-ID population. In the published literature, the mean IQ sampled was found to be 104, with about 80% of studies sampling from the 96–115 IQ range, highlighting the marked under-representation of the ID and borderline ID sections of the ASD population. We conclude that current understanding of visual processing and perception in ASD is not based on the mean IQ profile of the DSM-5 defined ASD population, which now appears to lie within the borderline ID to ID range. Given the importance of the role of vision for social and cognitive processing in ASD, we recommend accurately representing ASD via greater inclusion of individuals with IQ below 80 in future ASD research. PMID:28261072
NASA Astrophysics Data System (ADS)
Ran, Youhua; Li, Xin; Jin, Rui; Kang, Jian; Cosh, Michael H.
2017-01-01
Monitoring and estimating grid-mean soil moisture is very important for assessing many hydrological, biological, and biogeochemical processes and for validating remotely sensed surface soil moisture products. Temporal stability analysis (TSA) is a valuable tool for identifying a small number of representative sampling points to estimate the grid-mean soil moisture content. This analysis was evaluated and improved using high-quality surface soil moisture data that were acquired by a wireless sensor network in a high-intensity irrigated agricultural landscape in an arid region of northwestern China. The performance of the TSA was limited in areas where the representative error was dominated by random events, such as irrigation events. This shortcoming can be effectively mitigated by using a stratified TSA (STSA) method, proposed in this paper. In addition, the following methods were proposed for rapidly and efficiently identifying representative sampling points when using TSA. (1) Instantaneous measurements can be used to identify representative sampling points to some extent; however, the error resulting from this method is significant when validating remotely sensed soil moisture products. Thus, additional representative sampling points should be considered to reduce this error. (2) The calibration period can be determined from the time span of the full range of the grid-mean soil moisture content during the monitoring period. (3) The representative error is sensitive to the number of calibration sampling points, especially when only a few representative sampling points are used. Multiple sampling points are recommended to reduce data loss and improve the likelihood of representativeness at two scales.
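The core of a temporal stability analysis can be reduced to a few lines: rank candidate points by the mean and temporal spread of their relative difference from the grid-mean value. The sketch below is a generic illustration of that calculation on synthetic data; the array sizes, random values, and combined score are assumptions for illustration and do not reproduce the paper's wireless-sensor-network data or its stratified (STSA) refinement.

```python
import numpy as np

# Minimal sketch of temporal stability analysis (TSA): rank candidate
# sampling points by the mean relative difference (MRD) from the grid mean
# and by its temporal standard deviation (SDRD). Data are synthetic.

rng = np.random.default_rng(0)
n_times, n_points = 120, 30
soil_moisture = rng.uniform(0.10, 0.35, size=(n_times, n_points))  # [time, point]

grid_mean = soil_moisture.mean(axis=1, keepdims=True)   # spatial mean at each time
rel_diff = (soil_moisture - grid_mean) / grid_mean      # relative difference
mrd = rel_diff.mean(axis=0)                             # mean relative difference per point
sdrd = rel_diff.std(axis=0, ddof=1)                     # temporal stability of that difference

# A representative point has an MRD close to zero and a small SDRD; one
# common shorthand combines the two into a single score.
score = np.sqrt(mrd**2 + sdrd**2)
best = int(np.argmin(score))
print(f"Most representative point: {best}  (MRD={mrd[best]:+.3f}, SDRD={sdrd[best]:.3f})")
```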
21 CFR 111.315 - What are the requirements for laboratory control processes?
Code of Federal Regulations, 2010 CFR
2010-04-01
... specifications; (b) Use of sampling plans for obtaining representative samples, in accordance with subpart E of... for distribution rather than for return to the supplier); and (5) Packaged and labeled dietary...
Is Knowledge Random? Introducing Sampling and Bias through Outdoor Inquiry
ERIC Educational Resources Information Center
Stier, Sam
2010-01-01
Sampling, very generally, is the process of learning about something by selecting and assessing representative parts of that population or object. In the inquiry activity described here, students learned about sampling techniques as they estimated the number of trees greater than 12 cm dbh (diameter at breast height) in a wooded, discrete area…
NASA Astrophysics Data System (ADS)
Qi, Shengqi; Hou, Deyi; Luo, Jian
2017-09-01
This study presents a numerical model based on field data to simulate groundwater flow in both the aquifer and the well-bore for the low-flow sampling method and the well-volume sampling method. The numerical model was calibrated to match well with field drawdown, and the calculated flow regime in the well was used to predict the variation of dissolved oxygen (DO) concentration during the purging period. The model was then used to analyze sampling representativeness and sampling time. Site characteristics, such as aquifer hydraulic conductivity, and sampling choices, such as purging rate and screen length, were found to be significant determinants of sampling representativeness and required sampling time. Results demonstrated that: (1) DO was the most useful water quality indicator in ensuring groundwater sampling representativeness in comparison with turbidity, pH, specific conductance, oxidation reduction potential (ORP) and temperature; (2) it is not necessary to maintain a drawdown of less than 0.1 m when conducting low-flow purging. However, a high purging rate in a low-permeability aquifer may result in a dramatic decrease in sampling representativeness after an initial peak; (3) the presence of a short screen length may result in greater drawdown and a longer sampling time for low-flow purging. Overall, the present study suggests that this new numerical model is suitable for describing groundwater flow during the sampling process, and can be used to optimize sampling strategies under various hydrogeological conditions.
Requirements management: A CSR's perspective
NASA Technical Reports Server (NTRS)
Thompson, Joanie
1991-01-01
The following subject areas are covered: customer service overview of network service request processing; Customer Service Representative (CSR) responsibility matrix; extract from a sample Memorandum of Understanding; Network Service Request Form and its instructions; sample notification of receipt; and requirements management in the NASA Science Internet.
Vernazza, Christopher R; Carr, Katherine; Wildman, John; Gray, Joanne; Holmes, Richard D; Exley, Catherine; Smith, Robert A; Donaldson, Cam
2018-06-22
Resources in any healthcare system are scarce relative to need, and therefore choices need to be made which often involve difficult decisions about the best allocation of these resources. One pragmatic and robust tool to aid resource allocation is Programme Budgeting and Marginal Analysis (PBMA), but there is mixed evidence on its uptake and effectiveness. Furthermore, there is also no evidence on the incorporation of the preferences of a large and representative sample of the general public into such a process. The study therefore aims to undertake, evaluate and refine a PBMA process within the exemplar of NHS dentistry in England whilst also using an established methodology (Willingness to Pay (WTP)) to systematically gather views from a representative sample of the public. Stakeholders including service buyers (commissioners), dentists, dental public health representatives and patient representatives will be recruited to participate in a PBMA process involving defining current spend, agreeing criteria to judge services/interventions, defining areas for investment and disinvestment, rating these areas against the criteria and making final recommendations. The process will be refined based on participatory action research principles and evaluated through semi-structured interviews, focus groups and observation of the process by the research team. In parallel, a representative sample of English adults will be recruited to complete a series of four surveys including WTP valuations of programmes being considered by the PBMA panel. In addition, a methodological experiment comparing two ways of eliciting WTP will be undertaken. The project will allow the PBMA process, and particularly the use of WTP within it, to be investigated and developed. There will be challenges around engagement with the task by the panel undertaking it and with the outputs by stakeholders, but careful relationship building will help to mitigate this. The large volume of data will be managed through careful segmenting of the analysis and the use of the well-established Framework approach to qualitative data analysis. WTP has various potential biases, but the elicitation will be carefully designed to minimise these and some methodological investigation will take place.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pochan, M.J.; Massey, M.J.
1979-02-01
This report discusses the results of actual raw product gas sampling efforts and includes: rationale for raw product gas sampling efforts; design and operation of the CMU gas sampling train; development and analysis of a sampling train data base; and conclusions and future application of results. The results of sampling activities at the CO₂-Acceptor and Hygas pilot plants proved that: the CMU gas sampling train is a valid instrument for characterization of environmental parameters in coal gasification gas-phase process streams; depending on the particular process configuration, the CMU gas sampling train can reduce gasifier effluent characterization activity to a single location in the raw product gas line; and in contrast to the slower operation of the EPA SASS Train, CMU's gas sampling train can collect representative effluent data at a rapid rate (approx. 2 points per hour) consistent with the rate of change of process variables, and thus function as a tool for process engineering-oriented analysis of environmental characteristics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jordan, J.; Talbott, J.
1984-01-01
Task 1. Methods development for the speciation of the polysulfides. Work on this task was completed in December 1983 and reported accordingly in DOE/PC/40783-T13. Task 2. Methods development for the speciation of dithionite and polythionates. Work on Task 2 was completed in June 1984 and has been reported accordingly in DOE/PC/40783-T15. Task 3. Total accounting of the sulfur balance in representative samples of synfuel process streams. A systematic and critical comparison of results, obtained in the analysis of sulfur moieties in representative samples of coal conversion process streams, revealed the following general trends. (a) In specimens of high pH (9-10) and low redox potential (-0.3 to -0.4 volt versus NHE), sulfidic and polysulfidic sulfur moieties predominate. (b) In process streams of lower pH and more positive redox potential, higher oxidation states of sulfur (notably sulfate) account for most of the total sulfur present. (c) Oxidative wastewater treatment procedures by the PETC stripping process convert lower oxidation states of sulfur into thiosulfate and sulfate. In this context, remarkable similarities were observed between liquefaction and gasification process streams. However, the thiocyanate present in samples from the Grand Forks gasifier was impervious to the PETC stripping process. (d) Total sulfur contaminant levels in coal conversion process stream wastewater samples are determined primarily by the abundance of sulfur in the coal used as starting material rather than by the nature of the conversion process (liquefaction or gasification). 13 references.
Ultrasonic imaging of textured alumina
NASA Technical Reports Server (NTRS)
Stang, David B.; Salem, Jonathan A.; Generazio, Edward R.
1989-01-01
Ultrasonic images representing the bulk attenuation and velocity of a set of alumina samples were obtained by a pulse-echo contact scanning technique. The samples were taken from larger bodies that were chemically similar but were processed by extrusion or isostatic processing. The crack growth resistance and fracture toughness of the larger bodies were found to vary with processing method and test orientation. The results presented here demonstrate that differences in texture that contribute to variations in structural performance can be revealed by analytic ultrasonic techniques.
Solberg, Siri Løvsjø; Terragni, Laura; Granheim, Sabrina Ionata
2016-08-01
To identify the use of ultra-processed foods - vectors of salt, sugar and fats - in the Norwegian diet through an assessment of food sales. Sales data from a representative sample of food retailers in Norway, collected in September 2005 (n = 150) and September 2013 (n = 170), were analysed. Data consisted of barcode scans of individual food item purchases, reporting type of food, price, geographical region and retail concept. Foods were categorized as minimally processed, culinary ingredients, processed products and ultra-processed. Indicators were share of purchases and share of expenditure on food categories. Six geographical regions in Norway. The barcode data included 296 121 observations in 2005 and 501 938 observations in 2013. Ultra-processed products represented 58·8 % of purchases and 48·8 % of expenditure in 2013. Minimally processed foods accounted for 17·2 % of purchases and 33·0 % of expenditure. Every third purchase was a sweet ultra-processed product. Food sales changed marginally in favour of minimally processed foods and in disfavour of processed products between 2005 and 2013 (χ²(3) = 203 195, P < 0·001; Cramer's V = 0·017, P < 0·001). Ultra-processed products accounted for the majority of food sales in Norway, indicating a high consumption of such products. This could be contributing to rising rates of overweight, obesity and non-communicable diseases in the country, as findings from other countries indicate. Policy measures should aim at decreasing consumption of ultra-processed products and facilitating access (including economic) to minimally processed foods.
ERIC Educational Resources Information Center
Leonardi, Fabio; Spazzafumo, Liana; Marcellini, Fiorella
2005-01-01
Based on the constructionist point of view applied to Subjective Well-Being (SWB), five hypotheses were advanced about the predictive power of the top-down effects and bottom-up processes over a five-year period. The sample consisted of 297 respondents who represent the Italian sample of a European longitudinal survey; the first phase was…
Sparking young minds with Moon rocks and meteorites
NASA Technical Reports Server (NTRS)
Taylor, G. Jeffrey; Lindstrom, Marilyn M.
1993-01-01
What could be more exciting than seeing pieces of other worlds? The Apollo program left a legacy of astounding accomplishments and precious samples. Part of the thrill of those lunar missions is brought to schools by the lunar sample educational disks, which contain artifacts of six piloted trips to the Moon. Johnson Space Center (JSC) is preparing 100 new educational disks containing pieces of meteorites collected in Antarctica. These represent chunks of several different asteroids that were collected in one of the most remote, forbidding environments on Earth. These pieces of the Moon and asteroids represent the products of basic planetary processes (solar nebular processes, initial differentiation, volcanism, and impact), and, in turn, these processes are controlled by basic physical and chemical processes (energy, energy transfer, melting, buoyancy, etc.). Thus, the lunar and meteorite sample disks have enormous educational potential. New educational materials are being developed to accompany the disks. Present materials are not as effective as they could be, especially in relating samples to processes and to other types of data such as spectral studies and photogeology. Furthermore, the materials are out of date. New background materials will be produced for teachers, slide sets with extensive captions will be assembled, and numerous hands-on classroom activities will be devised for use while the disks are at a school and before and after they arrive. The classroom activities will be developed by teams of experienced teachers working with lunar and meteorite experts.
NASA Technical Reports Server (NTRS)
Brand, R. R.; Barker, J. L.
1983-01-01
A multistage sampling procedure using image processing, geographical information systems, and analytical photogrammetry is presented which can be used to guide the collection of representative, high-resolution spectra and discrete reflectance targets for future satellite sensors. The procedure is general and can be adapted to characterize areas as small as minor watersheds and as large as multistate regions. Beginning with a user-determined study area, successive reductions in size and spectral variation are performed using image analysis techniques on data from the Multispectral Scanner, orbital and simulated Thematic Mapper, low altitude photography synchronized with the simulator, and associated digital data. An integrated image-based geographical information system supports processing requirements.
Modified Pressure System for Imaging Egg Cracks
USDA-ARS?s Scientific Manuscript database
One aspect of grading table eggs is shell checks or cracks. Currently, USDA voluntary regulations require that humans grade a representative sample of all eggs processed. However, as processing plants and packing facilities continue to increase their volume and throughput, human graders are having ...
A Synopsis of Technical Issues for Monitoring Sediment in Highway and Urban Runoff
Bent, Gardner C.; Gray, John R.; Smith, Kirk P.; Glysson, G. Douglas
2000-01-01
Accurate and representative sediment data are critical for assessing the potential effects of highway and urban runoff on receiving waters. The U.S. Environmental Protection Agency identified sediment as the most widespread pollutant in the Nation's rivers and streams, affecting aquatic habitat, drinking water treatment processes, and recreational uses of rivers, lakes, and estuaries. Representative sediment data are also necessary for quantifying and interpreting concentrations, loads, and effects of trace elements and organic constituents associated with highway and urban runoff. Many technical issues associated with the collecting, processing, and analyzing of samples must be addressed to produce valid (useful for intended purposes), current, complete, and technically defensible data for local, regional, and national information needs. All aspects of sediment data-collection programs need to be evaluated, and adequate quality-control data must be collected and documented so that the comparability and representativeness of data obtained for highway- and urban-runoff studies may be assessed. Collection of representative samples for the measurement of sediment in highway and urban runoff involves a number of interrelated issues. Temporal and spatial variability in runoff results from a combination of factors, including volume and intensity of precipitation, rate of snowmelt, and features of the drainage basin such as area, slope, infiltration capacity, channel roughness, and storage characteristics. In small drainage basins such as those found in many highway and urban settings, automatic samplers are often the most suitable method for collecting samples of runoff for a variety of reasons. Indirect sediment-measurement methods are also useful as supplementary and/or surrogate means for monitoring sediment in runoff. All of these methods have limitations in addition to benefits, which must be identified and quantified to produce representative data. Methods for processing raw sediment samples (including homogenization and subsampling) for subsequent analysis for total suspended solids or suspended-sediment concentration often increase variance and may introduce bias. Processing artifacts can be substantial if the methods used are not appropriate for the concentrations and particle-size distributions present in the samples collected. Analytical methods for determining sediment concentrations include the suspended-sediment concentration and the total suspended solids methods. Although the terms suspended-sediment concentration and total suspended solids are often used interchangeably to describe the total concentration of suspended solid-phase material, the analytical methods differ and can produce substantially different results. The total suspended solids method, which commonly is used to produce highway- and urban-runoff sediment data, may not be valid for studies of runoff water quality. Studies of fluvial and highway-runoff sediment data indicate that analyses of samples by the total suspended solids method tend to underrepresent the true sediment concentration, and that relations between total suspended solids and suspended-sediment concentration are not transferable from site to site even when grain-size distribution information is available. Total suspended solids data used to calculate suspended-sediment loads in highway and urban runoff may be fundamentally unreliable.
Consequently, use of total suspended solids data may have adverse consequences for the assessment, design, and maintenance of sediment-removal best management practices. Therefore, it may be necessary to analyze water samples using the suspended-sediment concentration method. Data quality, comparability, and utility are important considerations in collection, processing, and analysis of sediment samples and interpretation of sediment data for highway- and urban-runoff studies. Results from sediment studies must be comparable and readily transf
Recommendations for representative ballast water sampling
NASA Astrophysics Data System (ADS)
Gollasch, Stephan; David, Matej
2017-05-01
Until now, the purpose of ballast water sampling studies was predominantly limited to general scientific interest to determine the variety of species arriving in ballast water in a recipient port. Knowing the variety of species arriving in ballast water also contributes to the assessment of relative species introduction vector importance. Further, some sampling campaigns addressed awareness raising or the determination of organism numbers per water volume to evaluate the species introduction risk by analysing the propagule pressure of species. A new aspect of ballast water sampling, which this contribution addresses, is compliance monitoring and enforcement of ballast water management standards as set by, e.g., the IMO Ballast Water Management Convention. To achieve this, sampling methods which result in representative ballast water samples are essential. We recommend such methods based on practical tests conducted on two commercial vessels, also considering results from our previous studies. The results show that different sampling approaches influence the results regarding viable organism concentrations in ballast water samples. It was observed that the sampling duration (i.e., the length of the sampling process), the timing (i.e., at which point during the discharge the sample is taken), the number of samples and the sampled water quantity are the main factors influencing the concentrations of viable organisms in a ballast water sample. Based on our findings we provide recommendations for representative ballast water sampling.
ERIC Educational Resources Information Center
Mitchell, Eugene E., Ed.
The simulation of a sampled-data system on a full parallel hybrid computer is described. The sampled-data system simulated illustrates the proportional-integral-derivative (PID) discrete control of a continuous second-order process representing a stirred tank. The stirred tank is simulated using continuous analog components, while PID…
REAL TIME MONITORING OF PCDD/PCDF FOR TRANSIENT CHARACTERIZATION AND PROCESS CONTROL
Current sampling methods for PCDD/F emission compliance make use of a sample taken during steady state conditions which is assumed to be representative of facility performance. This is often less than satisfactory. The rapid variation of PCDDs, PCDF, and other co-pollutants due ...
Robust model selection and the statistical classification of languages
NASA Astrophysics Data System (ADS)
García, J. E.; González-López, V. A.; Viola, M. L. L.
2012-10-01
In this paper we address the problem of model selection for the set of finite memory stochastic processes with finite alphabet, when the data is contaminated. We consider m independent samples, with more than half of them being realizations of the same stochastic process with law Q, which is the one we want to retrieve. We devise a model selection procedure such that, for a sample size large enough, the selected process is the one with law Q. Our model selection strategy is based on estimating relative entropies to select a subset of samples that are realizations of the same law. Although the procedure is valid for any family of finite order Markov models, we will focus on the family of variable length Markov chain models, which includes the fixed order Markov chain model family. We define the asymptotic breakdown point (ABDP) for a model selection procedure, and we show the ABDP for our procedure. This means that if the proportion of contaminated samples is smaller than the ABDP, then, as the sample size grows, our procedure selects a model for the process with law Q. We also use our procedure in a setting where we have one sample conformed by the concatenation of sub-samples of two or more stochastic processes, with most of the sub-samples having law Q. We conducted a simulation study. In the application section we address the question of the statistical classification of languages according to their rhythmic features using speech samples. This is an important open problem in phonology. A persistent difficulty in this problem is that the speech samples correspond to several sentences produced by diverse speakers, corresponding to a mixture of distributions. The usual procedure to deal with this problem has been to choose a subset of the original sample which seems to best represent each language. The selection is made by listening to the samples. In our application we use the full dataset without any preselection of samples. We apply our robust methodology, estimating a model which represents the main law for each language. Our findings agree with the linguistic conjecture related to the rhythm of the languages included in our dataset.
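To make the selection idea concrete, here is a minimal, generic sketch of "keep the mutually close majority": each sample is summarized by a fitted Markov transition matrix (fixed order 1 here, not the variable-length models of the paper), pairwise symmetrized relative entropies are computed, and samples close to more than half of the others are retained. The alphabet, threshold, and synthetic sequences are illustrative assumptions, not the paper's estimators or its ABDP analysis.

```python
import numpy as np

# Generic sketch: select the majority subset of samples whose estimated
# (order-1) Markov transition matrices are mutually close in symmetrized
# relative entropy. Alphabet, threshold, and data are illustrative only.

ALPHABET = "ab"

def transition_probs(seq, alpha=1.0):
    """Empirical transition matrix with additive smoothing."""
    idx = {s: i for i, s in enumerate(ALPHABET)}
    counts = np.full((len(ALPHABET), len(ALPHABET)), alpha)
    for x, y in zip(seq, seq[1:]):
        counts[idx[x], idx[y]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def sym_kl(p, q):
    """Symmetrized Kullback-Leibler divergence between transition matrices."""
    kl = lambda a, b: float(np.sum(a * np.log(a / b)))
    return kl(p, q) + kl(q, p)

def select_majority(samples, threshold=0.05):
    models = [transition_probs(s) for s in samples]
    m = len(models)
    d = np.array([[sym_kl(models[i], models[j]) for j in range(m)] for i in range(m)])
    # Keep samples that are close to more than half of all samples.
    return [i for i in range(m) if np.sum(d[i] < threshold) > m / 2]

rng = np.random.default_rng(1)
clean = ["".join(rng.choice(list(ALPHABET), size=2000, p=[0.8, 0.2])) for _ in range(5)]
contaminated = ["".join(rng.choice(list(ALPHABET), size=2000, p=[0.3, 0.7])) for _ in range(2)]
print("Indices of retained (uncontaminated) samples:", select_majority(clean + contaminated))
```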
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
Subsampling for dataset optimisation
NASA Astrophysics Data System (ADS)
Ließ, Mareike
2017-04-01
Soil-landscapes have formed by the interaction of soil-forming factors and pedogenic processes. In modelling these landscapes in their pedodiversity and the underlying processes, a representative unbiased dataset is required. This concerns model input as well as output data. However, very often big datasets are available which are highly heterogeneous and were gathered for various purposes, but not to model a particular process or data space. As a first step, the overall data space and/or landscape section to be modelled needs to be identified including considerations regarding scale and resolution. Then the available dataset needs to be optimised via subsampling to well represent this n-dimensional data space. A couple of well-known sampling designs may be adapted to suit this purpose. The overall approach follows three main strategies: (1) the data space may be condensed and de-correlated by a factor analysis to facilitate the subsampling process. (2) Different methods of pattern recognition serve to structure the n-dimensional data space to be modelled into units which then form the basis for the optimisation of an existing dataset through a sensible selection of samples. Along the way, data units for which there is currently insufficient soil data available may be identified. And (3) random samples from the n-dimensional data space may be replaced by similar samples from the available dataset. While being a presupposition to develop data-driven statistical models, this approach may also help to develop universal process models and identify limitations in existing models.
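As a generic illustration of the three strategies listed above (condense and de-correlate the data space, structure it into units, then pick existing samples to represent each unit), the sketch below uses PCA as a stand-in for the factor analysis and a plain k-means loop as the pattern-recognition step; the synthetic data, number of retained components, and number of units are arbitrary assumptions rather than part of the described workflow.

```python
import numpy as np

# Generic subsampling sketch: standardize and de-correlate the covariate
# space, partition it into units, and keep one existing sample per unit.
# Data, component count, and unit count are arbitrary illustration values.

rng = np.random.default_rng(42)
big_dataset = rng.normal(size=(5000, 8))            # heterogeneous legacy samples

# 1) Condense and de-correlate the data space (PCA via SVD).
X = (big_dataset - big_dataset.mean(0)) / big_dataset.std(0)
_, _, vt = np.linalg.svd(X, full_matrices=False)
scores = X @ vt[:3].T                               # keep the first 3 components

# 2) Structure the data space into k units (plain Lloyd k-means).
k, n_iter = 20, 50
centroids = scores[rng.choice(len(scores), k, replace=False)]
for _ in range(n_iter):
    labels = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(-1), axis=1)
    for j in range(k):
        if np.any(labels == j):
            centroids[j] = scores[labels == j].mean(0)

# 3) Optimise the subsample: the existing sample closest to each unit centre.
subsample_idx = sorted({int(np.argmin(((scores - c) ** 2).sum(1))) for c in centroids})
print("Indices of the optimised subsample:", subsample_idx)
```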
NASA Astrophysics Data System (ADS)
Estrany, Joan; Martinez-Carreras, Nuria
2013-04-01
Tracers have been acknowledged as a useful tool to identify sediment sources, based upon a variety of techniques and chemical and physical sediment properties. Sediment fingerprinting supports the notion that changes in sedimentation rates are not just related to increased/reduced erosion and transport in the same areas, but also to the establishment of different pathways increasing sediment connectivity. The Na Borges is a Mediterranean lowland agricultural river basin (319 km²) where traditional soil and water conservation practices have been applied over millennia to provide effective protection of cultivated land. During the twentieth century, industrialisation and pressure from tourism activities have increased urbanised surfaces, which have impacts on the processes that control streamflow. Within this context, source material sampling in Na Borges focused on obtaining representative samples from potential sediment sources (comprising topsoil; i.e., 0-2 cm) susceptible to mobilisation by water and subsequent routing to the river channel network, while those representing channel bank sources were collected from actively eroding channel margins and ditches. Samples of road dust and of solids from sewage treatment plants were also collected. During two hydrological years (2004-2006), representative suspended sediment samples for use in source fingerprinting studies were collected at four flow gauging stations and at eight secondary sampling points using time-integrating samplers. Likewise, representative bed-channel sediment samples were obtained using the resuspension approach at eight sampling points in the main stem of the Na Borges River. These deposits represent the fine sediment temporarily stored in the bed-channel and were also used for tracing source contributions. A total of 102 individual time-integrated sediment samples, 40 bulk samples and 48 bed-sediment samples were collected. Upon return to the laboratory, source material samples were oven-dried at 40 °C, disaggregated using a pestle and mortar, and dry sieved to
Age, Marital Processes, and Depressed Affect
ERIC Educational Resources Information Center
Bookwala, Jamila; Jacobs, Jamie
2004-01-01
Purpose: We examined age-cohort differences in the interrelationships among marital processes and depressed affect. Design and Methods: We used data from individuals in first marriages that participated in the National Survey of Families and Households (NSFH). The NSFH interviewed one adult per household of a nationally representative sample.…
Code of Federal Regulations, 2010 CFR
2010-04-01
... STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Blood Grouping Reagent § 660.21 Processing. (a... representative samples of each group of products manufactured in the same fashion. (2) Only that material that... specifications to verify that each sublot is identical to other sublots of the lot. (4) Each lot of Blood...
Bottino, Marco C; Coelho, Paulo G; Henriques, Vinicius A R; Higa, Olga Z; Bressiani, Ana H A; Bressiani, José C
2009-03-01
This article presents details of processing, characterization and in vitro as well as in vivo evaluations of powder metallurgy processed Ti-13Nb-13Zr samples with different levels of porosity. Sintered samples were characterized for density, crystalline phases (XRD), and microstructure (SEM and EDX). Samples sintered at 1000 °C showed the highest porosity level (approximately 30%), featuring open and interconnected pores ranging from 50 to 100 μm in diameter but incomplete densification. In contrast, samples sintered at 1300 and 1500 °C demonstrated high densification with a 10% porosity level distributed in a homogeneous microstructure. The different sintering conditions used in this study demonstrated a coherent trend: an increase in temperature led to higher sample densification, even though densification represents a drawback for bone ingrowth. Cytotoxicity tests did not reveal any toxic effects of the starting and processed materials on surviving cell percentage. After an 8-week healing period in rabbit tibias, the implants were retrieved, processed for nondecalcified histological evaluation, and then assessed by backscattered electron images (BSEI-SEM) and EDX. Bone growth into the microstructure was observed only in samples sintered at 1000 °C. Overall, a close relation between newly formed bone and all processed samples was observed. (c) 2008 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Zulueta, R. C.; Metzger, S.; Ayres, E.; Luo, H.; Meier, C. L.; Barnett, D.; Sanclements, M.; Elmendorf, S.
2013-12-01
The National Ecological Observatory Network (NEON) is a continental-scale research platform currently in development to assess the causes of ecological change and biological responses to change across a projected 30-year timeframe. A suite of standardized sensor-based measurements (i.e., Terrestrial Instrument System (TIS) measurements) and in-situ field sampling and observations (i.e., Terrestrial Observation System (TOS) activities) will be conducted across 20 ecoclimatic domains in the U.S. where NEON is establishing 60 terrestrial research sites. NEON's TIS measurements and TOS activities are designed to observe the temporal and spatial dynamics of key drivers and ecological processes and responses to change within each of the 60 terrestrial research sites. The TIS measurements are non-destructive and designed to provide in-situ, continuous, and areally integrated observations of the surrounding ecosystem and environment, while TOS sampling and observation activities are designed to encompass a hierarchy of measurable biological states and processes including diversity, abundance, phenology, demography, infectious disease prevalence, ecohydrology, and biogeochemistry. To establish valid relationships between these drivers and site-specific responses, two contradicting requirements must be fulfilled: (i) both types of observations shall be representative of the same ecosystem, and (ii) they shall not significantly influence one another. Here we outline the theoretical background and algorithmic process for determining areas of mutual representativeness and exclusion around NEON's TIS measurements and develop a procedure which quantitatively optimizes this trade-off through: (i) quantifying the source area distributions of TIS measurements, (ii) determining the ratio of user-defined impact threshold to effective impact area for different TOS activities, and (iii) determining the range of feasible distances between TIS locations and TOS activities. This approach provides an evidence-based and repeatable method for combining sensor-based measurements and field sampling and observations at predefined levels of disturbance and spatial representativeness. The developed approach represents a general framework which is applicable to other environmental research sites where similar collocation is desired.
Simulation of Wind Profile Perturbations for Launch Vehicle Design
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
2004-01-01
Ideally, a statistically representative sample of measured high-resolution wind profiles with wavelengths as small as tens of meters is required in design studies to establish aerodynamic load indicator dispersions and vehicle control system capability. At most potential launch sites, high-resolution wind profiles may not exist. Representative samples of Rawinsonde wind profiles to altitudes of 30 km are more likely to be available from the extensive network of measurement sites established for routine sampling in support of weather observing and forecasting activity. Such a sample, large enough to be statistically representative of relatively large wavelength perturbations, would be inadequate for launch vehicle design assessments because the Rawinsonde system accurately measures wind perturbations with wavelengths no smaller than 2000 m (1000 m altitude increment). The Kennedy Space Center (KSC) Jimsphere wind profiles (150/month and seasonal 2 and 3.5-hr pairs) are the only adequate samples of high-resolution profiles (approx. 150 to 300 m effective resolution, but over-sampled at 25 m intervals) that have been used extensively for launch vehicle design assessments. Therefore, a simulation process has been developed for enhancement of measured low-resolution Rawinsonde profiles that would be applicable in preliminary launch vehicle design studies at launch sites other than KSC.
Re-evaluation of Moisture Controls During ARIES Oxide Processing, Packaging and Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karmiol, Benjamin; Wayne, David Matthew
DOE-STD-3013 [1] requires limiting the relative humidity (RH) in the glovebox during processing of the oxide product for specific types of plutonium oxides. This requirement is mandated in order to limit corrosion of the stainless steel containers by deliquescence of chloride salts, if present, in the PuO2. DOE-STD-3013 also specifies the need to limit and monitor internal pressure buildup in the 3013 containers due to the potential for the generation of free H2 and O2 gas from the radiolysis of surface-adsorbed water. DOE-STD-3013 requires that the oxide sample taken for moisture content verification be representative of the stabilized material in the 3013 container. This is accomplished by either limiting the time between sampling and packaging, or by control of the glovebox relative humidity (%RH). This requirement ensures that the sample is not only representative, but also conservative from the standpoint of moisture content.
The software peculiarities of pattern recognition in track detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starkov, N.
Different kinds of nuclear track recognition algorithms are presented. Several complex examples of their use in physics experiments are considered. Some methods for processing complicated images are described.
Highway engineers assess the public hearing process.
DOT National Transportation Integrated Search
1976-01-01
Representatives from the eight construction districts of the Virginia Department of Highways and Transportation were interviewed concerning their involvement in public hearings. The sample consisted of seven district engineers and eighteen other dist...
This procedure describes the process for collecting and analyzing blood and urine samples. The presence of chemical contaminants in biological specimens such as blood, urine, and hair represents a measure of the internal dose or body burden for a given individual derived from the ...
ERIC Educational Resources Information Center
Wickrama, K. A. S.; Elder, Glen H.; Abraham, W. Todd
2007-01-01
Context and Purpose: This study's objectives are to: investigate potential additive and multiplicative influences of rurality and race/ethnicity on chronic physical illness in a nationally representative sample of youth; and examine intra-Latino processes using a Latino sub-sample. Specifically, we examine how rurality and individual psychosocial…
The American Teacher, 1984-1995, Metropolitan Life Survey. Old Problems, New Challenges.
ERIC Educational Resources Information Center
Harris (Louis) and Associates, Inc., New York, NY.
During the past decade there have been considerable efforts to reform the American public school system. This survey, based on 15-minute telephone interviews with a nationally representative sample of 1,011 public school teachers in the United States, duplicates the sampling and interviewing process used in a similar study in 1984 and 1985. In…
Schillaci, Michael A; Schillaci, Mario E
2009-02-01
The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process is dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small (n < 10) or very small (n ≤ 5) sample sizes. This method can be used by researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
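A minimal version of this kind of calculation, assuming the sample mean is approximately normally distributed with standard error sigma/sqrt(n), is sketched below; the authors' estimator may differ in detail.

```python
from math import erf, sqrt

def prob_mean_within(k, n):
    """Probability that the sample mean lies within k population standard
    deviations of the true mean, assuming the sample mean is approximately
    normal with standard error sigma/sqrt(n). An illustration of the general
    idea only; the published estimator may differ in detail."""
    z = k * sqrt(n)
    return erf(z / sqrt(2.0))          # equals 2*Phi(z) - 1

print(round(prob_mean_within(0.5, 5), 3))   # n = 5, k = 0.5 -> ~0.736
```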
Visell, Yon
2015-04-01
This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
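The core idea of generating a time-domain jump process by inverse-transform sampling can be sketched as follows; the event rate and magnitude distribution used here are illustrative stand-ins, not the lattice-fracture statistics derived in the paper.

```python
import numpy as np

def jump_process(rate, t_max, mag_cdf_inv, seed=0):
    """Minimal time-domain stochastic jump process via inverse-transform sampling.

    Waiting times are drawn by inverting the exponential CDF, and jump
    magnitudes by inverting a user-supplied CDF (mag_cdf_inv). This is a
    generic illustration of the inverse transform method, not the specific
    construction used for the fracture model."""
    rng = np.random.default_rng(seed)
    times, stress, t, s = [], [], 0.0, 0.0
    while True:
        t += -np.log(rng.uniform()) / rate        # inverse exponential CDF
        if t > t_max:
            break
        s += mag_cdf_inv(rng.uniform())           # inverse magnitude CDF
        times.append(t)
        stress.append(s)
    return np.array(times), np.array(stress)

# Example: power-law distributed jump magnitudes, F^-1(u) = (1-u)^(-1/alpha).
times, stress = jump_process(rate=200.0, t_max=1.0,
                             mag_cdf_inv=lambda u: (1.0 - u) ** (-1.0 / 2.5))
```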
Active Parent Consent for Health Surveys with Urban Middle School Students: Processes and Outcomes
ERIC Educational Resources Information Center
Secor-Turner, Molly; Sieving, Renee; Widome, Rachel; Plowman, Shari; Vanden Berk, Eric
2010-01-01
Background: To achieve high participation rates and a representative sample, active parent consent procedures require a significant investment of study resources. The purpose of this article is to describe processes and outcomes of utilizing active parent consent procedures with sixth-grade students from urban, ethnically diverse, economically…
NASA Technical Reports Server (NTRS)
Ferrario, Joseph; Byrne, Christian
2002-01-01
Processed ball clay samples used in the production of ceramics and samples of the ceramic products were collected and analyzed for the presence and concentration of the 2,3,7,8-Cl substituted polychlorinated dibenzo-p-dioxins and -furans (PCDDs/PCDFs). The processed ball clay had average PCDD concentrations of 3.2 ng/g toxic equivalents, a congener profile, and isomer distribution consistent with those found previously in raw ball clay. The PCDF concentrations were below the average limit of detection (LOD) of 0.5 pg/g. The final fired ceramic products were found to be free of PCDDs/PCDFs at the LODs. A consideration of the conditions involved in the firing process suggests that the PCDDs, if not destroyed, may be released to the atmosphere and could represent an as yet unidentified source of dioxins to the environment. In addition, the PCDDs in clay dust generated during manufacturing operations may represent a potential occupational exposure.
SEIPS-based process modeling in primary care.
Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T
2017-04-01
Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.
Evaluation of Sampling Methods for Bacillus Spore ...
Following a wide-area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.
Bioactive lipids in the butter production chain from Parmigiano Reggiano cheese area.
Verardo, Vito; Gómez-Caravaca, Ana M; Gori, Alessandro; Losi, Giuseppe; Caboni, Maria F
2013-11-01
Bovine milk contains hundreds of diverse components, including proteins, peptides, amino acids, lipids, lactose, vitamins and minerals. Specifically, the lipid composition is influenced by different variables such as breed, feed and technological process. In this study the fatty acid and phospholipid compositions of different samples of butter and its by-products from the Parmigiano Reggiano cheese area, produced by industrial and traditional churning processes, were determined. The fatty acid composition of samples manufactured by the traditional method showed higher levels of monounsaturated and polyunsaturated fatty acids compared with industrial samples. In particular, the contents of n-3 fatty acids and conjugated linoleic acids were higher in samples produced by the traditional method than in samples produced industrially. Sample phospholipid composition also varied between the two technological processes. Phosphatidylethanolamine was the major phospholipid in cream, butter and buttermilk samples obtained by the industrial process as well as in cream and buttermilk samples from the traditional process, while phosphatidylcholine was the major phospholipid in traditionally produced butter. This result may be explained by the different churning processes causing different types of membrane disruption. Generally, samples produced traditionally had higher contents of total phospholipids; in particular, butter produced by the traditional method had a total phospholipid content 33% higher than that of industrially produced butter. The samples studied represent the two types of products present in the Parmigiano Reggiano cheese area, where the industrial churning process is widespread compared with the traditional processing of Reggiana cow's milk. This is because Reggiana cow's milk production is lower than that of other breeds and the traditional churning process is time-consuming and economically disadvantageous. However, its products have been demonstrated to contain more bioactive lipids compared with products obtained from other breeds and by the industrial process. © 2013 Society of Chemical Industry.
A Reexamination of the Carnivora Malleus (Mammalia, Placentalia)
Wible, John R.; Spaulding, Michelle
2012-01-01
Authoritative anatomical references depict domestic dogs and cats as having a malleus with a short rostral (anterior) process that is connected via a ligament to the ectotympanic of the auditory bulla. Similar mallei have been reported for representatives of each of the 15 extant families of Carnivora, the placental order containing dogs and cats. This morphology is in contrast to a malleus with a long rostral process anchored to the ectotympanic that is considered to be primitive for mammals. Our reexamination of extant carnivorans found representatives from 12 families that possess an elongate rostral process anchored to the ectotympanic. Consequently, the malleus also is a component of the bulla. In a subset of our carnivoran sample, we confirmed that the elongate rostral process on the ectotympanic is continuous with the rest of the malleus through a thin osseous lamina. This morphology is reconstructed as primitive for Carnivora. Prior inaccurate descriptions of the taxa in our sample having mallei continuous with the bulla were based on damaged mallei. In addition to coupling to the ectotympanic, the rostral process of the malleus was found to have a hook-like process that fits in a facet on the skull base in representatives from seven families (felids, nandiniids, viverrids, canids, ursids, procyonids, and mustelids); its occurrence in the remaining families could not be ascertained. This feature is named herein the mallear hook and is likewise reconstructed to be primitive for Carnivora. We also investigated mallei in one additional placental order reported to have mallei not connected to the ectotympanic, Pholidota (pangolins), the extant sister group of Carnivora. We found pholidotans to also have anchored mallei with long rostral processes, but lacking mallear hooks. In light of our results, other mammals previously reported to have short rostral processes should be reexamined. PMID:23209753
Sampling criteria in multicollection searching.
NASA Astrophysics Data System (ADS)
Gilio, A.; Scozzafava, R.; Marchetti, P. G.
In the first stage of the document retrieval process, no information concerning relevance of a particular document is available. On the other hand, computer implementation requires that the analysis be made only for a sample of retrieved documents. This paper addresses the significance and suitability of two different sampling criteria for a multicollection online search facility. The inevitability of resorting to a logarithmic criterion in order to achieve a "spread of representativeness" from the multicollection is demonstrated.
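One plausible reading of a logarithmic sampling criterion is to allocate a fixed analysis budget across collections in proportion to the logarithm of each collection's retrieved-document count, so that small collections are not swamped by large ones; the sketch below illustrates this interpretation rather than the paper's exact formula.

```python
import math

def log_allocation(retrieved_counts, budget):
    """Allocate a fixed sampling budget across collections in proportion to
    the logarithm of each collection's retrieved-document count. This is one
    possible reading of a "logarithmic criterion" for spreading
    representativeness; the paper's exact formula is not reproduced here."""
    weights = [math.log(1 + n) for n in retrieved_counts]
    total = sum(weights)
    return [round(budget * w / total) for w in weights]

# Three collections returning very different numbers of documents:
print(log_allocation([10, 1000, 100000], budget=60))   # -> [7, 20, 33]
```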
Allogenic and Autogenic Signals in the Detrital Zircon U-Pb Record of the Deep-Sea Bengal Fan
NASA Astrophysics Data System (ADS)
Blum, M. D.; Rogers, K. G.; Gleason, J. D.; Najman, Y.
2017-12-01
The Himalayan-sourced Ganges-Brahmaputra river system and the deep-sea Bengal Fan represent Earth's largest sediment-dispersal system. This presentation summarizes a new detrital zircon U-Pb (DZ) provenance record from the Bengal Fan from cores collected during IODP Expedition 354, with coring sites located 1350 km downdip from the shelf margin. Each of our 15 samples was collected from medium- to fine-grained turbidite sand and, based on shipboard biostratigraphic analyses, our samples are late Miocene to late Pleistocene in age. Each sample was analyzed by LA-ICPMS at the Arizona Laserchron facility, with an average of n=270 concordant U-Pb ages per sample. Our goal is to use these data to evaluate the influence of allogenic controls vs. autogenic processes on signal propagation from source to sink. To first order, large-scale sediment transfer to the Bengal Fan clearly records the strong tectonic and climatic forcing associated with the Himalayas and Ganges-Brahmaputra system: after up to 2500 km of river transport, and 1350 km of transport in turbidity currents, the DZ record faithfully represents Himalayan source terrains. The sand-rich turbidite part of the record is nevertheless biased towards glacial periods when rivers extended across the shelf in response to climate-forced sea-level fall and discharged directly to slope canyons. However, only part of the Bengal Fan DZ record represents either the Ganges or the Brahmaputra, with most samples representing varying degrees of mixing of sediments from the two systems: this mixing, or the lack thereof, represents the signal of autogenic avulsions on the delta plain that result in the two river systems delivering sediment separately to the shelf margin, or together as they do today. Within the allogenic framework established by tectonic processes, the climatic system, and global climate-forced sea-level change, the DZ U-Pb record of sediment mixing or the lack thereof provides a fingerprint of autogenic avulsions on signal transfer from source to sink in the world's largest sediment-dispersal system.
Evidence for a Global Sampling Process in Extraction of Summary Statistics of Item Sizes in a Set.
Tokita, Midori; Ueda, Sachiyo; Ishiguchi, Akira
2016-01-01
Several studies have shown that our visual system may construct a "summary statistical representation" over groups of visual objects. Although there is a general understanding that human observers can accurately represent sets of a variety of features, many questions on how summary statistics, such as an average, are computed remain unanswered. This study investigated sampling properties of visual information used by human observers to extract two types of summary statistics of item sets, average and variance. We presented three models of ideal observers to extract the summary statistics: a global sampling model without sampling noise, global sampling model with sampling noise, and limited sampling model. We compared the performance of an ideal observer of each model with that of human observers using statistical efficiency analysis. Results suggest that summary statistics of items in a set may be computed without representing individual items, which makes it possible to discard the limited sampling account. Moreover, the extraction of summary statistics may not necessarily require the representation of individual objects with focused attention when the sets of items are larger than 4.
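The three ideal-observer models can be mocked up in a few lines to show how their estimation errors differ; the set size, noise level, and subset size below are hypothetical parameters, not those used in the experiments.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(set_size=8, n_trials=10000, noise_sd=0.1, subset_n=4):
    """Compare three simplified ideal-observer models for estimating the
    mean item size of a set (hypothetical parameterization): global sampling
    without noise, global sampling with late noise, and limited sampling of
    a random subset of items."""
    items = rng.normal(1.0, 0.2, (n_trials, set_size))
    true_mean = items.mean(axis=1)
    global_noiseless = true_mean
    global_noisy = true_mean + rng.normal(0.0, noise_sd, n_trials)
    idx = rng.integers(0, set_size, (n_trials, subset_n))
    limited = np.take_along_axis(items, idx, axis=1).mean(axis=1)
    for name, est in [("global", global_noiseless),
                      ("global+noise", global_noisy),
                      ("limited", limited)]:
        print(name, np.sqrt(np.mean((est - true_mean) ** 2)))   # RMS error

simulate()
```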
ERIC Educational Resources Information Center
Kaufman, Alan S.; Kamphaus, Randy W.
1984-01-01
The construct validity of the Sequential Processing, Simultaneous Processing and Achievement scales of the Kaufman Assessment Battery for Children was supported by factor-analytic investigations of a representative national stratified sample of 2,000 children. Correlations provided insight into the relationship of sequential/simultaneous…
2013-09-01
sequence dataset. All procedures were performed by personnel in the IIMT UT Southwestern Genomics and Microarray Core using standard protocols. More... sequencing run, samples were demultiplexed using standard algorithms in the Genomics and Microarray Core and processed into individual sample Illumina single... Sequencing (RNA-Seq), using Illumina’s multiplexing mRNA-Seq to generate full sequence libraries from the poly-A tailed RNA to a read depth of 30
Studies in astronomical time series analysis: Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1979-01-01
Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
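A minimal numerical analogue of this workflow is to estimate AR coefficients from sampled data (here via the Yule-Walker equations) and convert the fitted AR model to its equivalent MA (pulse) representation through the impulse response; this sketch stands in for, but does not reproduce, the FORTRAN algorithm described.

```python
import numpy as np

def yule_walker(x, p):
    """Estimate AR(p) coefficients from sampled data via the Yule-Walker
    equations (a simplified stand-in for the fitting procedure described)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:])            # a_1 ... a_p

def ar_to_ma(a, n_terms=50):
    """Impulse response of the AR filter, i.e. the equivalent MA (pulse)
    representation x_t = sum_k h_k e_{t-k}."""
    h = np.zeros(n_terms)
    h[0] = 1.0
    for t in range(1, n_terms):
        h[t] = sum(a[k] * h[t - 1 - k] for k in range(min(len(a), t)))
    return h

# Example: synthetic AR(2) series, recover coefficients, convert to MA.
rng = np.random.default_rng(0)
e = rng.normal(size=5000)
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]
a_hat = yule_walker(x, 2)          # approximately [0.6, -0.3]
h = ar_to_ma(a_hat)
```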
NASA Technical Reports Server (NTRS)
Neal, C. R.; Lawrence, S. J.
2017-01-01
There have been 11 missions to the Moon this century, 10 of which have been orbital, from 5 different space agencies. China became the third country to successfully soft-land on the Moon in 2013, and the second to successfully remotely operate a rover on the lunar surface. We now have significant global datasets that, coupled with the 1990s Clementine and Lunar Prospector missions, show that the sample collection is not representative of the lithologies present on the Moon. The M3 data from the Indian Chandrayaan-1 mission have identified lithologies that are not present/under-represented in the sample collection. LRO datasets show that volcanism could be as young as 100 Ma and that significant felsic complexes exist within the lunar crust. A multi-decadal sample return campaign is the next logical step in advancing our understanding of lunar origin and evolution and Solar System processes.
Ultralow-Power Digital Correlator for Microwave Polarimetry
NASA Technical Reports Server (NTRS)
Piepmeier, Jeffrey R.; Hass, K. Joseph
2004-01-01
A recently developed high-speed digital correlator is especially well suited for processing readings of a passive microwave polarimeter. This circuit computes the autocorrelations of, and the cross-correlations among, data in four digital input streams representing samples of in-phase (I) and quadrature (Q) components of two intermediate-frequency (IF) signals, denoted A and B, that are generated in heterodyne reception of two microwave signals. The IF signals arriving at the correlator input terminals have been digitized to three levels (-1, 0, +1) at a sampling rate up to 500 MHz. Two bits (representing sign and magnitude) are needed to represent the instantaneous datum in each input channel; hence, eight bits are needed to represent the four input signals during any given cycle of the sampling clock. The accumulation (integration) time for the correlation is programmable in increments of 2^8 cycles of the sampling clock, up to a maximum of 2^24 cycles. The basic functionality of the correlator is embodied in 16 correlation slices, each of which contains identical logic circuits and counters. The first stage of each correlation slice is a logic gate that computes one of the desired correlations (for example, the autocorrelation of the I component of A, or the negative of the cross-correlation of the I component of A and the Q component of B). The sampling of the logic-gate output is controlled by the sampling-clock signal, and an 8-bit counter increments in every clock cycle in which the logic gate produces an output. The most significant bit of the 8-bit counter is sampled by a 16-bit counter clocked at 1/2^8 of the sampling-clock frequency; the 16-bit counter is thus incremented every time the 8-bit counter rolls over.
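A software model of one correlation slice is straightforward: quantize each stream to three levels, multiply sample by sample, and accumulate over the programmed integration length. The sketch below compresses the 8-bit/16-bit counter chain into a single accumulator and uses hypothetical thresholds and signals.

```python
import numpy as np

def quantize3(x, thresh):
    """Quantize a real-valued stream to three levels (-1, 0, +1) with a dead zone of +/-thresh."""
    return np.where(x > thresh, 1, np.where(x < -thresh, -1, 0))

def correlate_three_level(a, b, n_accum):
    """Software model of one correlation slice: multiply 3-level streams sample
    by sample and accumulate over a programmable integration length (a
    simplification of the hardware's 8-bit / 16-bit counter chain)."""
    return int(np.sum(np.asarray(a[:n_accum]) * np.asarray(b[:n_accum])))

# Example: two noisy observations of a common signal, quantized and correlated.
rng = np.random.default_rng(0)
common = rng.normal(size=2**16)
a = quantize3(common + 0.5 * rng.normal(size=common.size), 0.5)
b = quantize3(common + 0.5 * rng.normal(size=common.size), 0.5)
print(correlate_three_level(a, b, n_accum=2**16))   # large positive count
```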
NASA Astrophysics Data System (ADS)
Wiederhold, J. G.; Jew, A. D.; Brown, G. E.; Bourdon, B.; Kretzschmar, R.
2010-12-01
The seven stable isotopes of Hg are fractionated in the environment as a result of mass-dependent (MDF) and mass-independent (MIF) fractionation processes that can be studied in parallel by analyzing the ratios of even and odd mass Hg isotopes. MDF and MIF Hg isotope signatures of natural samples may provide a new tool to trace sources and transformations in environmental Hg cycling. However, the mechanisms controlling the extent of kinetic and equilibrium Hg isotope fractionations are still only partially understood. Thus, development of this promising tracer requires experimental calibration of relevant fractionation factors as well as assessment of natural variations of Hg isotope ratios under different environmental conditions. The inoperative Hg mine in New Idria (California, USA) represents an ideal case study to explore Hg isotope fractionation during Hg transformation and transport processes. More than a century of Hg mining and on-site thermal refining to obtain elemental Hg until 1972 produced large volumes of contaminated mine wastes which now represent sources of Hg pollution for the surrounding ecosystems. Here, we present Hg isotope data from various materials collected at New Idria using Cold-Vapor-MC-ICPMS with a long-term δ202Hg reproducibility of ±0.1‰ (2SD). Uncalcined mine waste samples were isotopically similar to NIST-3133 and did not exhibit any MIF signatures. In contrast, calcine samples, which represent the residue of the thermal ore processing at 700°C, had significantly heavier δ202Hg values of up to +1.5‰. In addition, we observed small negative MIF anomalies of the odd-mass Hg isotopes in the calcine samples, which could be caused either by nuclear volume fractionation or a magnetic isotope effect during or after the roasting process. The mass-dependent enrichment of heavy Hg isotopes in the calcine materials indicates that light Hg isotopes were preferentially removed during the roasting process, in agreement with a previous study by Stetson et al. (ES&T, 2009, 43:7331-7336). In order to further elucidate the Hg isotope signatures of the New Idria samples, we performed a three-step sequential extraction procedure to separate different Hg pools. The calcine samples exhibited a higher proportion of leachable Hg phases compared with the unrefined ore waste samples. The most soluble Hg pool (HAc/HCl, pH 2) had a significantly heavier MDF and more negative MIF signature than the bulk calcine samples, suggesting that the dissolution of more soluble Hg phases from calcine materials results in an enhanced flux of leached Hg which is isotopically distinct from the original ore. Moreover, this finding demonstrates that the Hg isotope fractionation during the ore roasting cannot be solely explained by a kinetic Rayleigh-type process which removes light Hg isotopes, but must additionally involve the formation of isotopically heavy secondary Hg phases in the calcine. The analysis of additional samples will enable us to test this hypothesis and to gain further insights into the applicability of stable Hg isotope ratios as source and process tracers in Hg-contaminated environments.
Evaluating Core Quality for a Mars Sample Return Mission
NASA Technical Reports Server (NTRS)
Weiss, D. K.; Budney, C.; Shiraishi, L.; Klein, K.
2012-01-01
Sample return missions, including the proposed Mars Sample Return (MSR) mission, propose to collect core samples from scientifically valuable sites on Mars. These core samples would undergo extreme forces during the drilling process, and during the reentry process if the EEV (Earth Entry Vehicle) performed a hard landing on Earth. Because of the foreseen damage to the stratigraphy of the cores, it is important to evaluate each core for rock quality. However, because no core sample return mission has yet been conducted to another planetary body, it remains unclear how to assess the cores for rock quality. In this report, we describe the development of a metric designed to quantitatively assess the mechanical quality of any rock cores returned from Mars (or other planetary bodies). We report on the process by which we tested the metric on core samples of Mars analogue materials, and the effectiveness of the core assessment metric (CAM) in assessing rock core quality before and after the cores were subjected to shocking (g forces representative of an EEV landing).
Identification of Sources of Endotoxin Exposure as Input for Effective Exposure Control Strategies.
van Duuren-Stuurman, Birgit; Gröllers-Mulderij, Mariska; van de Runstraat, Annemieke; Duisterwinkel, Anton; Terwoert, Jeroen; Spaan, Suzanne
2018-02-13
The aim of the present study was to investigate the levels of endotoxins on product samples from potatoes, onions, and seeds, representing a relevant part of the agro-food industry in the Netherlands, in order to gather valuable insights into possibilities for exposure control measures early in the industrial processing of these products. Endotoxin levels on 330 product samples from companies representing the potato, onion, and seed (processing) industry (four potato-packaging companies, five potato-processing companies, five onion-packaging companies, and four seed-processing companies) were assessed using the Limulus Amebocyte Lysate (LAL) assay. As variation in growth conditions (type of soil, growth type) and product characteristics (surface roughness, dustiness, size, species) are assumed to influence the level of endotoxin on products, different types and growth conditions were considered when collecting the samples. Additionally, waste material, rotten products, felt material (used for drying), and process water were collected. A large variation in endotoxin levels was found on samples of potatoes, onions, and seeds (overall geometric standard deviation 17), ranging from 0.7 EU g-1 to 16,400,000 EU g-1. The highest geometric mean endotoxin levels were found in plant material (319,600 EU g-1), followed by soil material (49,100 EU g-1) and the outer side of products (9,300 EU g-1), indicating that removal of plant and soil material early in the process would be an effective exposure control strategy. The high levels of endotoxins found in the limited number of samples from rotten onions indicate that these rotten onions should also be removed early in the process. Mean endotoxin levels found in waste material (only available for seed processing) are similar to the level found in soil material, although the range is much larger. On uncleaned seeds, higher endotoxin levels were found than on cleaned seeds, indicating that cleaning processes are important control measures and also that the waste material should be handled with care. Although endotoxin levels in batches of to-be-processed potatoes, onions, and seeds vary quite dramatically, it could be concluded that rotten products, plant material, and waste material contain particularly high endotoxin levels. This information was used to propose control measures to reduce workers' exposure to endotoxins during the production process. © The Author(s) 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
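The summary statistics quoted above (geometric means and an overall geometric standard deviation) can be reproduced from raw endotoxin levels as follows; the example values are hypothetical.

```python
import numpy as np

def geo_stats(levels):
    """Geometric mean and geometric standard deviation of endotoxin levels
    (EU g-1), the summary statistics quoted in the abstract."""
    logs = np.log(np.asarray(levels, dtype=float))
    return float(np.exp(logs.mean())), float(np.exp(logs.std(ddof=1)))

# Hypothetical levels from a handful of product samples:
gm, gsd = geo_stats([120.0, 950.0, 18.0, 46000.0, 3300.0])
print(gm, gsd)
```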
Introduction to the Apollo collections: Part 2: Lunar breccias
NASA Technical Reports Server (NTRS)
Mcgee, P. E.; Simonds, C. H.; Warner, J. L.; Phinney, W. C.
1979-01-01
Basic petrographic, chemical and age data for a representative suite of lunar breccias are presented for students and potential lunar sample investigators. Emphasis is on sample description and data presentation. Samples are listed, together with a classification scheme based on matrix texture and mineralogy and the nature and abundance of glass present both in the matrix and as clasts. A discussion of the classification scheme describes the characteristic features of each of the breccia groups. The cratering process, which comprises the sequence of events immediately following an impact event, is discussed, especially the thermal and material transport processes affecting the two major components of lunar breccias (clastic debris and fused material).
NASA Astrophysics Data System (ADS)
Sammartano, G.; Spanò, A.
2017-09-01
Delineating accurate surface water quality levels (SWQLs) always presents a great challenge to researchers. Existing methods of assessing surface water quality only provide individual concentrations at monitoring stations without providing overall SWQLs, so their results are usually difficult for decision-makers to understand. Conversely, the water quality index (WQI) can simplify the surface water quality assessment process and make it accessible to decision-makers. However, in most cases the WQI reflects inaccurate SWQLs due to the lack of representative water samples, which are costly and time consuming to obtain. To solve this problem, we introduce a cost-effective method that combines Landsat-8 imagery and artificial intelligence to develop models that derive representative water-quality concentrations by correlating concentrations from ground-truth water samples with satellite spectral information. Our method was validated, and the correlation between concentrations of ground-truth water samples and concentrations predicted by the developed models reached a high coefficient of determination (R2 > 0.80). Afterwards, the predicted concentrations over each pixel of the study area were used as input to the WQI developed by the Canadian Council of Ministers of the Environment to extract accurate SWQLs, for drinking purposes, in the Saint John River. The results indicated SWQLs of 67 (Fair) and 59 (Marginal) for the lower and middle basins of the river, respectively. These findings demonstrate the potential of our approach for surface water quality management.
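For reference, the CCME index aggregates the scope (F1), frequency (F2), and amplitude (F3) of guideline exceedances into a single score; a compact sketch is given below with hypothetical variables and guideline values, handling only maximum (not-to-exceed) objectives.

```python
import numpy as np

def ccme_wqi(tests, objectives):
    """CCME Water Quality Index from per-variable test values and maximum
    objectives (guideline values not to be exceeded). `tests` maps a
    variable name to a list of measured (or model-predicted) values."""
    variables = list(tests)
    n_tests = sum(len(v) for v in tests.values())
    failed_vars = [v for v in variables if any(x > objectives[v] for x in tests[v])]
    failed_tests = [(v, x) for v in variables for x in tests[v] if x > objectives[v]]

    f1 = 100.0 * len(failed_vars) / len(variables)                 # scope
    f2 = 100.0 * len(failed_tests) / n_tests                       # frequency
    excursions = [x / objectives[v] - 1.0 for v, x in failed_tests]
    nse = sum(excursions) / n_tests                                # normalized sum of excursions
    f3 = nse / (0.01 * nse + 0.01)                                 # amplitude
    return 100.0 - np.sqrt(f1**2 + f2**2 + f3**2) / 1.732

# Hypothetical drinking-water example with two variables:
wqi = ccme_wqi({"turbidity": [0.8, 1.5, 5.2], "nitrate": [6.0, 12.0, 9.0]},
               {"turbidity": 1.0, "nitrate": 10.0})
print(round(wqi, 1))
```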
Have We Really Been Analyzing Terminating Simulations Incorrectly All These Years?
2013-12-01
Sánchez, Paul J.
... measure. If that observation directly represents an end state, such as the number of failed components after a week's operation or the number of patients ... processed in 24 hours of emergency room operations, there's no problem: the set of values obtained by replication represents a random sample from the
ERIC Educational Resources Information Center
Shen, Jianping; Leslie, Jeffrey M.; Spybrook, Jessaca K.; Ma, Xin
2012-01-01
Using nationally representative samples for public school teachers and principals, the authors inquired into whether principal background and school processes are related to teacher job satisfaction. Employing hierarchical linear modeling (HLM), the authors were able to control for background characteristics at both the teacher and school levels.…
Redfern, Julie; Adedoyin, Rufus Adesoji; Ofori, Sandra; Anchala, Raghupathy; Ajay, Vamadevan S; De Andrade, Luciano; Zelaya, Jose; Kaur, Harparkash; Balabanova, Dina; Sani, Mahmoud U
2016-01-01
Background: Prevention and optimal management of hypertension in the general population is paramount to the achievement of the World Heart Federation (WHF) goal of reducing premature cardiovascular disease (CVD) mortality by 25% by the year 2025, and widespread access to good-quality antihypertensive medicines is a critical component for achieving the goal. Despite research and evidence relating to other medicines such as antimalarials and antibiotics, very little is known about the quality of generic antihypertensive medicines in low-income and middle-income countries. The aim of this study was to determine the physicochemical equivalence (percentage of active pharmaceutical ingredient, API) of generic antihypertensive medicines available in the retail market of a developing country. Methods: An observational design will be adopted, which includes a literature search, landscape assessment, and collection and analysis of medicine samples. To determine physicochemical equivalence, a multistage sampling process will be used, including (1) identification of the 2 most commonly prescribed classes of antihypertensive medicines in Nigeria; (2) identification of a random sample of 10 generics from within each of the 2 most commonly prescribed classes; (3) a geographically representative sampling process to identify a random sample of 24 retail outlets in Nigeria; (4) representative sample purchasing, processing to assess the quality of medicines, storage, and transport; and (5) assessment of the physical and chemical equivalence of the collected samples compared with the API in the relevant class. In total, 20 samples from each of 24 pharmacies will be tested (total of 480 samples). Discussion: Widespread availability of and access to quality antihypertensive medicines is therefore vital to achieving the WHF 25×25 targets. However, there is currently a scarcity of knowledge about the quality of antihypertensive medicines available in developing countries. Such information is important for enforcing and ensuring the quality of antihypertensive medicines. PMID:28588941
The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center
NASA Astrophysics Data System (ADS)
Zeigler, R. A.; Blumenfeld, E. H.; Srinivasan, P.; McCubbin, F. M.; Evans, C. A.
2018-04-01
The Astromaterials Curation Office has recently begun incorporating X-ray CT data into the curation processes for lunar and meteorite samples, and long-term curation of that data and serving it to the public represent significant technical challenges.
Technology for return of planetary samples, 1977
NASA Technical Reports Server (NTRS)
1978-01-01
Recent progress on the development of a basic warning system (BWS) proposed to assess the biohazard of a Mars sample returned to earth, an earth orbiting spacecraft, or to a moon base was presented. The BWS package consists of terrestrial microorganisms representing major metabolic pathways. A vital processes component of the BWS will examine the effects of a Mars sample at terrestrial atmospheric conditions while a hardy organism component will examine the effects of a Mars sample under conditions approaching those of the Martian environment. Any deleterious insult on terrestrial metabolism effected by the Mars sample could be indicated long before the sample reached earth proximity.
Neupane, Ghanashyam; McLing, Travis
2017-04-01
These brine samples were collected from the Soda Geyser (a thermal feature, temperature ~30 °C) in Soda Springs, Idaho. These samples also represent the overthrust brines typical of oil and gas plays in western Wyoming. Samples were collected from the source and along the flow channel at different distances from the source. By collecting and analyzing these samples we are able to increase the density and quality of data from the western Wyoming oil and gas plays. Furthermore, the sampling approach also helped determine the systematic variation in REE concentration with sampling distance from the source. Several geochemical processes are at work along the flow channels, such as degassing, precipitation, and sorption.
Iachan, Ronaldo; H. Johnson, Christopher; L. Harding, Richard; Kyle, Tonja; Saavedra, Pedro; L. Frazier, Emma; Beer, Linda; L. Mattson, Christine; Skarbinski, Jacek
2016-01-01
Background: Health surveys of the general US population are inadequate for monitoring human immunodeficiency virus (HIV) infection because the relatively low prevalence of the disease (<0.5%) leads to small subpopulation sample sizes. Objective: To collect a nationally and locally representative probability sample of HIV-infected adults receiving medical care to monitor clinical and behavioral outcomes, supplementing the data in the National HIV Surveillance System. This paper describes the sample design and weighting methods for the Medical Monitoring Project (MMP) and provides estimates of the size and characteristics of this population. Methods: To develop a method for obtaining valid, representative estimates of the in-care population, we implemented a cross-sectional, three-stage design that sampled 23 jurisdictions, then 691 facilities, then 9,344 HIV patients receiving medical care, using probability-proportional-to-size methods. The data weighting process followed standard methods, accounting for the probabilities of selection at each stage and adjusting for nonresponse and multiplicity. Nonresponse adjustments accounted for differing response at both facility and patient levels. Multiplicity adjustments accounted for visits to more than one HIV care facility. Results: MMP used a multistage stratified probability sampling design that was approximately self-weighting in each of the 23 project areas and nationally. The probability sample represents the estimated 421,186 HIV-infected adults receiving medical care during January through April 2009. Methods were efficient (i.e., induced small, unequal weighting effects and small standard errors for a range of weighted estimates). Conclusion: The information collected through MMP allows monitoring trends in clinical and behavioral outcomes and informs resource allocation for treatment and prevention activities. PMID:27651851
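In outline, a patient's analysis weight in such a design is the inverse of the product of the stage-wise selection probabilities, inflated for facility- and patient-level nonresponse and deflated for multiplicity; the sketch below is a simplified illustration with hypothetical inputs, not the MMP weighting specification itself.

```python
def three_stage_weight(p_area, p_facility, p_patient,
                       facility_response_rate, patient_response_rate,
                       n_care_facilities):
    """Illustrative three-stage survey weight along the lines described:
    inverse selection probabilities at each stage, inflated for facility-
    and patient-level nonresponse, then deflated for multiplicity (patients
    seen at more than one HIV care facility). Simplified; the actual MMP
    adjustment cells are more detailed."""
    base = 1.0 / (p_area * p_facility * p_patient)
    nonresponse = 1.0 / (facility_response_rate * patient_response_rate)
    multiplicity = 1.0 / n_care_facilities
    return base * nonresponse * multiplicity

# Example: a patient sampled with the hypothetical probabilities below.
w = three_stage_weight(p_area=0.5, p_facility=0.2, p_patient=0.05,
                       facility_response_rate=0.85, patient_response_rate=0.6,
                       n_care_facilities=2)   # -> ~196
```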
A Stochastic Diffusion Process for the Dirichlet Distribution
Bakosi, J.; Ristorcelli, J. R.
2013-03-01
The method of potential solutions of Fokker-Planck equations is used to develop a transport equation for the joint probability of N coupled stochastic variables with the Dirichlet distribution as its asymptotic solution. To ensure a bounded sample space, a coupled nonlinear diffusion process is required: the Wiener processes in the equivalent system of stochastic differential equations are multiplicative with coefficients dependent on all the stochastic variables. Individual samples of a discrete ensemble, obtained from the stochastic process, satisfy a unit-sum constraint at all times. The process may be used to represent realizations of a fluctuating ensemble of N variables subject to a conservation principle. Similar to the multivariate Wright-Fisher process, whose invariant is also Dirichlet, the univariate case yields a process whose invariant is the beta distribution. As a test of the results, Monte Carlo simulations are used to evolve numerical ensembles toward the invariant Dirichlet distribution.
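For intuition, the univariate (beta-invariant) case can be simulated with a simple Euler-Maruyama scheme; the drift and diffusion below are those of the standard Wright-Fisher diffusion with mutation, used here as a stand-in for the paper's parameterization.

```python
import numpy as np

def simulate_beta_invariant(theta1=2.0, theta2=5.0, dt=1e-3, n_steps=100000, seed=0):
    """Euler-Maruyama sketch of a Wright-Fisher-type diffusion on [0, 1],
    dX = 0.5*(theta1*(1-X) - theta2*X) dt + sqrt(X*(1-X)) dW,
    whose invariant distribution is Beta(theta1, theta2). This stands in for
    the univariate case discussed above, not the paper's exact process."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = 0.5
    for t in range(1, n_steps):
        drift = 0.5 * (theta1 * (1.0 - x[t-1]) - theta2 * x[t-1])
        diff = np.sqrt(max(x[t-1] * (1.0 - x[t-1]), 0.0))
        x[t] = x[t-1] + drift * dt + diff * np.sqrt(dt) * rng.normal()
        x[t] = min(max(x[t], 0.0), 1.0)          # keep samples in the bounded space
    return x

samples = simulate_beta_invariant()
print(samples.mean())   # approximately theta1/(theta1+theta2) = 0.29
```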
Time resolved quantitative imaging of charring in materials at temperatures above 1000 K
NASA Astrophysics Data System (ADS)
Böhrk, Hannah; Jemmali, Raouf
2016-07-01
A device is presented that allows for in situ investigation of chemically changing materials by means of X-ray imaging. A representative cork ablator sample, additionally instrumented with thermocouples, is encapsulated in an evacuated cell in which the sample surface is heated with a heat flux of 230 kW/m2. The images show the sample surface and the in-depth progression of the char front dividing the char layer from the virgin material. Correlating the images to thermocouple data allows for the deduction of a reaction temperature. For the representative cork ablator investigated at the present conditions, the progression rate of the pyrolysis layer is determined to be 0.0285 mm/s and the pyrolysis temperature is 770 or 737 K, depending on the pre-existing conditions. It is found that the novel device is ideally suited for volume process imaging.
Mueller, Amy V; Hemond, Harold F
2016-05-18
Knowledge of ionic concentrations in natural waters is essential to understand watershed processes. Inorganic nitrogen, in the form of nitrate and ammonium ions, is a key nutrient as well as a participant in redox, acid-base, and photochemical processes of natural waters, leading to spatiotemporal patterns of ion concentrations at scales as small as meters or hours. Current options for measurement in situ are costly, relying primarily on instruments adapted from laboratory methods (e.g., colorimetric, UV absorption); free-standing and inexpensive ISE sensors for NO3(-) and NH4(+) could be attractive alternatives if interferences from other constituents were overcome. Multi-sensor arrays, coupled with appropriate non-linear signal processing, offer promise in this capacity but have not yet successfully achieved signal separation for NO3(-) and NH4(+)in situ at naturally occurring levels in unprocessed water samples. A novel signal processor, underpinned by an appropriate sensor array, is proposed that overcomes previous limitations by explicitly integrating basic chemical constraints (e.g., charge balance). This work further presents a rationalized process for the development of such in situ instrumentation for NO3(-) and NH4(+), including a statistical-modeling strategy for instrument design, training/calibration, and validation. Statistical analysis reveals that historical concentrations of major ionic constituents in natural waters across New England strongly covary and are multi-modal. This informs the design of a statistically appropriate training set, suggesting that the strong covariance of constituents across environmental samples can be exploited through appropriate signal processing mechanisms to further improve estimates of minor constituents. Two artificial neural network architectures, one expanded to incorporate knowledge of basic chemical constraints, were tested to process outputs of a multi-sensor array, trained using datasets of varying degrees of statistical representativeness to natural water samples. The accuracy of ANN results improves monotonically with the statistical representativeness of the training set (error decreases by ∼5×), while the expanded neural network architecture contributes a further factor of 2-3.5 decrease in error when trained with the most representative sample set. Results using the most statistically accurate set of training samples (which retain environmentally relevant ion concentrations but avoid the potential interference of humic acids) demonstrated accurate, unbiased quantification of nitrate and ammonium at natural environmental levels (±20% down to <10 μM), as well as the major ions Na(+), K(+), Ca(2+), Mg(2+), Cl(-), and SO4(2-), in unprocessed samples. These results show promise for the development of new in situ instrumentation for the support of scientific field work.
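The central idea of building chemical knowledge into the signal processor can be sketched as a loss function that penalizes charge imbalance in the predicted ion concentrations; the ion list, charge ordering, and penalty weight below are illustrative assumptions, not the architecture of the published instrument.

```python
import numpy as np

# Ion charges in the order a hypothetical network predicts concentrations:
# [NO3-, NH4+, Na+, K+, Ca2+, Mg2+, Cl-, SO4^2-]
CHARGES = np.array([-1, +1, +1, +1, +2, +2, -1, -2], dtype=float)

def constrained_loss(pred_conc, true_conc, lam=0.1):
    """Training loss for a sensor-array network that, in addition to the
    usual squared error, penalizes violations of charge balance (the sum of
    charge times concentration over all ions should be near zero). A sketch
    of how a basic chemical constraint can be built into the signal
    processor; the architecture in the paper differs in detail."""
    mse = np.mean((pred_conc - true_conc) ** 2)
    charge_imbalance = pred_conc @ CHARGES          # per-sample imbalance
    penalty = np.mean(charge_imbalance ** 2)
    return mse + lam * penalty

# Example: predicted vs. reference concentrations (micromolar) for two samples.
pred = np.array([[20.0, 5.0, 300.0, 40.0, 150.0, 60.0, 400.0, 120.0],
                 [10.0, 8.0, 250.0, 30.0, 120.0, 50.0, 350.0, 100.0]])
true = pred + 1.0
print(constrained_loss(pred, true))
```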
Opto-electrochemical spectroscopy of metals in aqueous solutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, K., E-mail: khaledhabib@usa.net
In the present investigation, holographic interferometry was utilized for the first time to determine the rate of change of the electrical resistance of aluminium samples during the initial stage of anodisation processes in aqueous solution. Because the resistance values in this investigation were obtained by holographic interferometry, an electromagnetic method rather than an electronic method, the abrupt rate of change of the resistance was termed electrical resistance–emission spectroscopy. The anodisation process of the aluminium samples was carried out by electrochemical impedance spectroscopy (EIS) in different sulphuric acid concentrations (1.0%–2.5% H2SO4) at room temperature. In the meantime, real-time holographic interferometry was used to determine the difference between two subsequent electrical resistance values, dR, as a function of the elapsed time of the EIS experiment for the aluminium samples in 1.0%, 1.5%, 2.0%, and 2.5% H2SO4 solutions. The electrical resistance–emission spectra of the present investigation represent a detailed picture not only of the rate of change of the electrical resistance throughout the anodisation processes but also of the rate of change of the growth of the oxide films on the aluminium samples in the different solutions. As a result, a new spectrometer was developed, based on the combination of holographic interferometry and electrochemical impedance spectroscopy, for studying in situ the electrochemical behavior of metals in aqueous solutions.
γ-Oryzanol and tocopherol contents in residues of rice bran oil refining.
Pestana-Bauer, Vanessa Ribeiro; Zambiazi, Rui C; Mendonça, Carla R B; Beneito-Cambra, Miriam; Ramis-Ramos, Guillermo
2012-10-01
Rice bran oil (RBO) contains significant amounts of the natural antioxidants γ-oryzanol and tocopherols, which are lost to a large degree during oil refining. This results in a number of industrial residues with high contents of these phytochemicals. With the aim of supporting the development of profitable industrial procedures for γ-oryzanol and tocopherol recovery, the contents of these phytochemicals in all the residues produced during RBO refining were evaluated. The samples included residues from the degumming, soap precipitation, bleaching earth filtering, dewaxing and deodorisation distillation steps. The highest phytochemical concentrations were found in the precipitated soap for γ-oryzanol (14.2 mg g(-1), representing 95.3% of total γ-oryzanol in crude RBO), and in the deodorisation distillate for tocopherols (576 mg 100 g(-1), representing 6.7% of total tocopherols in crude RBO). Therefore, among the residues of RBO processing, the deodorisation distillate was the best source of tocopherols. As the soap is further processed for the recovery of fatty acids, samples taken from every step of this secondary process, including hydrosoluble fraction, hydrolysed soap, distillation residue and purified fatty acid fraction, were also analyzed. The distillation residue left after fatty acid recovery from soap was found to be the best source of γ-oryzanol (43.1 mg g(-1), representing 11.5% of total γ-oryzanol in crude RBO). Copyright © 2012 Elsevier Ltd. All rights reserved.
Auerbach, Scott S; Phadke, Dhiral P; Mav, Deepak; Holmgren, Stephanie; Gao, Yuan; Xie, Bin; Shin, Joo Heon; Shah, Ruchir R; Merrick, B Alex; Tice, Raymond R
2015-07-01
Formalin-fixed, paraffin-embedded (FFPE) pathology specimens represent a potentially vast resource for transcriptomic-based biomarker discovery. We present here a comparison of results from a whole transcriptome RNA-Seq analysis of RNA extracted from fresh frozen and FFPE livers. The samples were derived from rats exposed to aflatoxin B1 (AFB1 ) and a corresponding set of control animals. Principal components analysis indicated that samples were separated in the two groups representing presence or absence of chemical exposure, both in fresh frozen and FFPE sample types. Sixty-five percent of the differentially expressed transcripts (AFB1 vs. controls) in fresh frozen samples were also differentially expressed in FFPE samples (overlap significance: P < 0.0001). Genomic signature and gene set analysis of AFB1 differentially expressed transcript lists indicated highly similar results between fresh frozen and FFPE at the level of chemogenomic signatures (i.e., single chemical/dose/duration elicited transcriptomic signatures), mechanistic and pathology signatures, biological processes, canonical pathways and transcription factor networks. Overall, our results suggest that similar hypotheses about the biological mechanism of toxicity would be formulated from fresh frozen and FFPE samples. These results indicate that phenotypically anchored archival specimens represent a potentially informative resource for signature-based biomarker discovery and mechanistic characterization of toxicity. Copyright © 2014 John Wiley & Sons, Ltd.
Shelton, Larry R.; Capel, Paul D.
1994-01-01
A major component of the U.S. Geological Survey's National Water-Quality Assessment program is to assess the occurrence and distribution of trace elements and organic contaminants in streams. The first phase of the strategy for the assessment is to analyze samples of bed sediments from depositional zones. Fine-grained particles deposited in these zones are natural accumulators of trace elements and hydrophobic organic compounds. For the information to be comparable among studies in many different parts of the Nation, strategies for selecting stream sites and depositional zones are critical. Fine-grained surficial sediments are obtained from several depositional zones within a stream reach and composited to yield a sample representing average conditions. Sample collection and processing must be done consistently and by procedures specifically designed to separate the fine material into fractions that yield uncontaminated samples for trace-level analytes in the laboratory. Special coring samplers and other instruments made of Teflon are used for collection. Samples are processed through a 2.0-millimeter stainless-steel mesh sieve for organic contaminate analysis and a 63-micrometer nylon-cloth sieve for trace-element analysis. Quality assurance is maintained by strict collection and processing procedures, duplicate samplings, and a rigid cleaning procedure.
Results of Hg speciation testing on DWPF SMECT-8, OGCT-1, AND OGCT-2 samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bannochie, C.
2016-02-22
The Savannah River National Laboratory (SRNL) was tasked with preparing and shipping samples for Hg speciation by Eurofins Frontier Global Sciences, Inc. in Seattle, WA on behalf of the Savannah River Remediation (SRR) Mercury Task Team. The sixteenth shipment of samples was designated to include a Defense Waste Processing Facility (DWPF) Slurry Mix Evaporator Condensate Tank (SMECT) sample from Sludge Receipt and Adjustment Tank (SRAT) Batch 738 processing and two Off-Gas Condensate Tank (OGCT) samples, one following Batch 736 and one following Batch 738. The DWPF sample designations for the three samples analyzed are provided. The Batch 738 ‘End of SME Cycle’ SMECT sample was taken at the conclusion of Slurry Mix Evaporator (SME) operations for this batch and represents the fourth SMECT sample examined from Batch 738. Batch 738 experienced a sludge slurry carryover event, which introduced sludge solids to the SMECT that were particularly evident in the SMECT-5 sample, but less evident in the ‘End of SME Cycle’ SMECT-8 sample.
Robert H. McAlister; Alexander Clark; Joseph R. Saucier
1997-01-01
The effect of rotation age on strength and stiffness of lumber produced from unthinned loblolly pine stands in the Coastal Plain of Georgia was examined. Six stands representing 22-, 28-, and 40-year-old rotations were sampled. A stratified random sample of trees 8 to 16 inches in diameter at breast height was selected from each stand and processed into lumber....
Shelton, Larry R.
1997-01-01
For many years, stream samples for analysis of volatile organic compounds have been collected without specific guidelines or a sampler designed to avoid analyte loss. In 1996, the U.S. Geological Survey's National Water-Quality Assessment Program began aggressively monitoring urban stream-water for volatile organic compounds. To assure representative samples and consistency in collection procedures, a specific sampler was designed to collect samples for analysis of volatile organic compounds in stream water. This sampler, and the collection procedures, were tested in the laboratory and in the field for compound loss, contamination, sample reproducibility, and functional capabilities. This report describes that sampler and its use, and outlines field procedures specifically designed to provide contaminant-free, reproducible volatile organic compound data from stream-water samples. These guidelines and the equipment described represent a significant change in U.S. Geological Survey instructions for collecting and processing stream-water samples for analysis of volatile organic compounds. They are intended to produce data that are both defensible and interpretable, particularly for concentrations below the microgram-per-liter level. The guidelines also contain detailed recommendations for quality-control samples.
Cottin, Hervé; Guan, Yuan Yong; Noblet, Audrey; Poch, Olivier; Saiagh, Kafila; Cloix, Mégane; Macari, Frédérique; Jérome, Murielle; Coll, Patrice; Raulin, François; Stalport, Fabien; Szopa, Cyril; Bertrand, Marylène; Chabin, Annie; Westall, Frances; Chaput, Didier; Demets, René; Brack, André
2012-05-01
The PROCESS (PRebiotic Organic ChEmistry on the Space Station) experiment was part of the EXPOSE-E payload outside the European Columbus module of the International Space Station from February 2008 to August 2009. During this interval, organic samples were exposed to space conditions to simulate their evolution in various astrophysical environments. The samples used represent organic species related to the evolution of organic matter on the small bodies of the Solar System (carbonaceous asteroids and comets), the photolysis of methane in the atmosphere of Titan, and the search for organic matter at the surface of Mars. This paper describes the hardware developed for this experiment as well as the results for the glycine solid-phase samples and the gas-phase samples that were used with regard to the atmosphere of Titan. Lessons learned from this experiment are also presented for future low-Earth orbit astrochemistry investigations.
Francy, Donna S; Stelzer, Erin A; Bushon, Rebecca N; Brady, Amie M G; Williston, Ashley G; Riddell, Kimberly R; Borchardt, Mark A; Spencer, Susan K; Gellner, Terry M
2012-09-01
Log removals of bacterial indicators, coliphage, and enteric viruses were studied in three membrane bioreactor (MBR) activated-sludge and two conventional secondary activated-sludge municipal wastewater treatment plants during three recreational seasons (May-Oct.) when disinfection of effluents is required. In total, 73 regular samples were collected from key locations throughout treatment processes: post-preliminary, post-MBR, post-secondary, post-tertiary, and post-disinfection (UV or chlorine). Out of 19 post-preliminary samples, adenovirus by quantitative polymerase chain reaction (qPCR) was detected in all 19, enterovirus by quantitative reverse transcription polymerase chain reaction (qRT-PCR) was detected in 15, and norovirus GI by qRT-PCR was detected in 11. Norovirus GII and Hepatitis A virus were not detected in any samples, and rotavirus was detected in one sample but could not be quantified. Although culturable viruses were found in 12 out of 19 post-preliminary samples, they were not detected in any post-secondary, post-MBR, post-ultraviolet, or post-chlorine samples. Median log removals for all organisms were higher for MBR secondary treatment (3.02 to >6.73) than for conventional secondary (1.53-4.19) treatment. Ultraviolet disinfection after MBR treatment provided little additional log removal of any organism except for somatic coliphage (>2.18), whereas ultraviolet or chlorine disinfection after conventional secondary treatment provided significant log removals (above the analytical variability) of all bacterial indicators (1.18-3.89) and somatic and F-specific coliphage (0.71 and >2.98). Median log removals of adenovirus across disinfection were low in both MBR and conventional secondary plants (no removal detected and 0.24), and few removals of individual samples were near or above the analytical variability of 1.2 log genomic copies per liter. Based on qualitative examinations of plots showing reductions of organisms throughout treatment processes, somatic coliphage may best represent the removal of viruses across secondary treatment in both MBR and conventional secondary plants. F-specific coliphage and Escherichia coli may best represent the removal of viruses across the disinfection process in MBR facilities, but none of the indicators represented the removal of viruses across disinfection in conventional secondary plants. Published by Elsevier Ltd.
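As a side note for readers reproducing this kind of calculation, a log removal across a treatment step is simply the difference of the base-10 logarithms of the paired influent and effluent concentrations. The short Python sketch below illustrates this with made-up concentrations; it does not use the study's data, and censored results such as the '>' values reported above would need additional handling.

    import math
    from statistics import median

    # Hypothetical paired concentrations (organisms per liter) entering and
    # leaving one treatment step; these are illustrative values only.
    influent = [2.4e6, 8.1e5, 5.5e6, 1.2e6]
    effluent = [3.1e2, 9.0e1, 4.4e2, 2.0e2]

    # Log removal per paired sample: log10(C_in) - log10(C_out).
    log_removals = [math.log10(c_in) - math.log10(c_out)
                    for c_in, c_out in zip(influent, effluent)]

    print("per-sample log removals:", [round(x, 2) for x in log_removals])
    print("median log removal:", round(median(log_removals), 2))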
ERIC Educational Resources Information Center
Peter, Beate; Matsushita, Mark; Raskind, Wendy H.
2011-01-01
Purpose: To investigate processing speed as a latent dimension in children with dyslexia and children and adults with typical reading skills. Method: Exploratory factor analysis (FA) was based on a sample of multigenerational families, each ascertained through a child with dyslexia. Eleven measures--6 of them timed--represented verbal and…
Experiments in concept modeling for radiographic image reports.
Bell, D S; Pattison-Gordon, E; Greenes, R A
1994-01-01
OBJECTIVE: Development of methods for building concept models to support structured data entry and image retrieval in chest radiography. DESIGN: An organizing model for chest-radiographic reporting was built by analyzing manually a set of natural-language chest-radiograph reports. During model building, clinician-informaticians judged alternative conceptual structures according to four criteria: content of clinically relevant detail, provision for semantic constraints, provision for canonical forms, and simplicity. The organizing model was applied in representing three sample reports in their entirety. To explore the potential for automatic model discovery, the representation of one sample report was compared with the noun phrases derived from the same report by the CLARIT natural-language processing system. RESULTS: The organizing model for chest-radiographic reporting consists of 62 concept types and 17 relations, arranged in an inheritance network. The broadest types in the model include finding, anatomic locus, procedure, attribute, and status. Diagnoses are modeled as a subtype of finding. Representing three sample reports in their entirety added 79 narrower concept types. Some CLARIT noun phrases suggested valid associations among subtypes of finding, status, and anatomic locus. CONCLUSIONS: A manual modeling process utilizing explicitly stated criteria for making modeling decisions produced an organizing model that showed consistency in early testing. A combination of top-down and bottom-up modeling was required. Natural-language processing may inform model building, but algorithms that would replace manual modeling were not discovered. Further progress in modeling will require methods for objective model evaluation and tools for formalizing the model-building process. PMID:7719807
A stochastic diffusion process for Lochner's generalized Dirichlet distribution
Bakosi, J.; Ristorcelli, J. R.
2013-10-01
The method of potential solutions of Fokker-Planck equations is used to develop a transport equation for the joint probability of N stochastic variables with Lochner’s generalized Dirichlet distribution as its asymptotic solution. Individual samples of a discrete ensemble, obtained from the system of stochastic differential equations, equivalent to the Fokker-Planck equation developed here, satisfy a unit-sum constraint at all times and ensure a bounded sample space, similarly to the process developed previously for the Dirichlet distribution. Consequently, the generalized Dirichlet diffusion process may be used to represent realizations of a fluctuating ensemble of N variables subject to a conservation principle. Compared to the Dirichlet distribution and process, the additional parameters of the generalized Dirichlet distribution allow a more general class of physical processes to be modeled with a more general covariance matrix.
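To make the idea of a bounded, unit-sum-preserving diffusion concrete, the sketch below simulates the simplest member of this family in Python: the two-component (Beta) case, whose Wright-Fisher-type drift and diffusion coefficients are standard results. It is only an illustration of the construction and does not use the authors' generalized Dirichlet coefficients.

    import numpy as np

    rng = np.random.default_rng(0)

    # Two-component case: x lives on [0, 1], so (x, 1 - x) satisfies the
    # unit-sum constraint automatically. With drift 0.5*(w1*(1-x) - w2*x) and
    # diffusion sqrt(x*(1-x)), the stationary density is Beta(w1, w2).
    w1, w2 = 3.0, 5.0
    dt, n_steps = 1e-3, 200_000

    x = 0.5
    samples = np.empty(n_steps)
    for k in range(n_steps):
        drift = 0.5 * (w1 * (1.0 - x) - w2 * x)
        diffusion = np.sqrt(max(x * (1.0 - x), 0.0))
        x += drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        x = min(max(x, 1e-9), 1.0 - 1e-9)  # keep the discretized path in bounds
        samples[k] = x

    burn = n_steps // 10
    print("empirical mean:", samples[burn:].mean())  # approaches w1/(w1+w2)
    print("Beta mean     :", w1 / (w1 + w2))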
Marin, Tania; Taylor, Anne Winifred; Grande, Eleonora Dal; Avery, Jodie; Tucker, Graeme; Morey, Kim
2015-05-19
The considerably lower average life expectancy of Aboriginal and Torres Strait Islander Australians, compared with non-Aboriginal and non-Torres Strait Islander Australians, has been widely reported. Prevalence data for chronic disease and health risk factors are needed to provide evidence-based estimates for Australian Aboriginal and Torres Strait Islander population health planning. Representative surveys for these populations are difficult due to complex methodology. The focus of this paper is to describe in detail the methodological challenges and resolutions of a representative South Australian Aboriginal population-based health survey. Using a stratified multi-stage sampling methodology based on the Australian Bureau of Statistics 2006 Census with culturally appropriate and epidemiologically rigorous methods, 11,428 randomly selected dwellings were approached from a total of 209 census collection districts. All persons eligible for the survey identified as Aboriginal and/or Torres Strait Islander and were selected from dwellings identified as having one or more Aboriginal person(s) living there at the time of the survey. Overall, the 399 interviews from an eligible sample of 691 SA Aboriginal adults yielded a response rate of 57.7%. These face-to-face interviews were conducted by ten interviewers retained from a total of 27 trained Aboriginal interviewers. Challenges were found in three main areas: identification and recruitment of participants; interviewer recruitment and retention; and appropriate engagement with communities. These challenges were resolved, or at least largely overcome, by following local protocols with communities and their representatives, and reaching agreement on the process of research for Aboriginal people. Obtaining a representative sample of Aboriginal participants in a culturally appropriate way was methodologically challenging and required high levels of commitment and resources. Adhering to these principles has resulted in a rich and unique data set that provides an overview of the self-reported health status of Aboriginal people living in South Australia. This process provides some important principles to be followed when engaging with Aboriginal people and their communities for the purpose of health research.
Dzieciol, Monika; Schornsteiner, Elisa; Muhterem-Uyar, Meryem; Stessl, Beatrix; Wagner, Martin; Schmitz-Esser, Stephan
2016-04-16
Sanitation protocols are applied on a daily basis in food processing facilities to prevent the risk of cross-contamination with spoilage organisms. Floor drain water serves, along with product-associated samples (slicer dust, brine or cheese smear), as an important hygiene indicator in monitoring Listeria monocytogenes in food processing facilities. Microbial communities of floor drains are representative of each processing area and are influenced to a large degree by food residues, liquid effluents and washing water. The microbial communities of drain water are steadily changing, whereas drain biofilms provide more stable niches. Bacterial communities of four floor drains were characterized using 16S rRNA gene pyrosequencing to better understand the composition and exchange of drain water (DW) and drain biofilm (DB) communities. Furthermore, the L. monocytogenes contamination status of each floor drain was determined by applying cultivation-independent real-time PCR quantification and cultivation-dependent detection according to ISO11290-1. Pyrosequencing of 16S rRNA genes of drain water and drain biofilm bacterial communities yielded 50,611 reads, which were clustered into 641 operational taxonomic units (OTUs), affiliated to 16 phyla dominated by Proteobacteria, Firmicutes and Bacteroidetes. The most abundant OTUs represented either product- (Lactococcus lactis) or fermentation- and food spoilage-associated phylotypes (Pseudomonas mucidolens, Pseudomonas fragi, Leuconostoc citreum, and Acetobacter tropicalis). The microbial communities in DW and DB samples were distinct in each sample type and throughout the whole processing plant, indicating the presence of indigenous specific microbial communities in each processing compartment. The microbiota of drain biofilms was largely different from the microbiota of the drain water. A sampling approach based on drain water alone may thus only provide reliable information on planktonic bacterial cells but might not allow conclusions on the bacterial composition of the microbiota in biofilms. Copyright © 2016. Published by Elsevier B.V.
Yeheyis, Likawent; Kijora, Claudia; Wink, Michael; Peters, Kurt J
2011-01-01
The effect of a traditional Ethiopian lupin processing method on the chemical composition of lupin seed samples was studied. Two sampling districts, namely Mecha and Sekela, representing the mid- and high-altitude areas of north-western Ethiopia, respectively, were randomly selected. Different types of traditionally processed and marketed lupin seed samples (raw, roasted, and finished) were collected in six replications from each district. Raw samples are unprocessed, and roasted samples are roasted using firewood. Finished samples are those ready for human consumption as a snack. Thousand-seed weight for raw and roasted samples within a study district was similar (P > 0.05), but it was lower (P < 0.01) for finished samples compared to raw and roasted samples. The crude fibre content of the finished lupin seed sample from Mecha was lower (P < 0.01) than that of raw and roasted samples. However, the different lupin samples from Sekela had similar crude fibre content (P > 0.05). The crude protein and crude fat contents of finished samples within a study district were higher (P < 0.01) than those of raw and roasted samples, respectively. Roasting had no effect on the crude protein content of lupin seed samples. The crude ash content of raw and roasted lupin samples within a study district was higher (P < 0.01) than that of finished lupin samples of the respective study districts. The content of quinolizidine alkaloids of finished lupin samples was lower than that of raw and roasted samples. There was also an interaction effect between location and lupin sample type. The traditional processing method of lupin seeds in Ethiopia makes a positive contribution, improving the crude protein and crude fat contents and lowering the alkaloid content of the finished product. The study showed the possibility of adopting the traditional processing method to process bitter white lupin for use as a protein supplement in livestock feed in Ethiopia, but further work has to be done on the processing method and animal evaluation.
Marketing Norm Perception Among Medical Representatives in Indian Pharmaceutical Industry
Nagashekhara, Molugulu; Agil, Syed Omar Syed; Ramasamy, Ravindran
2012-01-01
Study of marketing norm perception among medical representatives is an under-portrayed component that deserves further perusal in the pharmaceutical industry. The purpose of this study is to find out the perception of marketing norms among medical representatives. The research design is a quantitative and cross-sectional study with medical representatives as the unit of analysis. Data were collected from medical representatives (n=300) using simple random and cluster sampling with a structured questionnaire. Results indicate that there is no difference in the perception of marketing norms between male and female medical representatives. But there is a difference in opinion between domestic and multinational companies’ medical representatives. Educational background of medical representatives also shows a difference in opinion among medical representatives. Degree holders and multinational company medical representatives have a higher perception of marketing norms compared to their counterparts. The researchers strongly believe that mandatory training on marketing norms is beneficial in the decision-making process during dilemmas in the sales field. PMID:24826035
Representativeness of direct observations selected using a work-sampling equation.
Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas
2015-01-01
Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.
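For orientation, the classic work-sampling sample-size relation is shown below in a short Python sketch. This is the textbook formula for estimating a proportion to a chosen relative precision and is given here as an assumption, not as the specific equation used in the study.

    import math

    def work_sampling_n(p_est, rel_error, z=1.96):
        """Number of momentary time samples needed so that the estimated
        proportion of intervals containing the behavior (p_est) is within
        +/- rel_error * p_est at confidence level z (textbook work-sampling
        formula: n = z**2 * (1 - p) / (rel_error**2 * p))."""
        return math.ceil(z**2 * (1.0 - p_est) / (rel_error**2 * p_est))

    # Low-duration behaviors demand far more samples for the same relative
    # precision, mirroring the impractically high sample counts noted above.
    for p in (0.05, 0.25, 0.50):
        print(f"p = {p:.2f} -> n = {work_sampling_n(p, rel_error=0.10)}")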
Challenges in creating an opt-in biobank with a registrar-based consent process and a commercial EHR
Corsmo, Jeremy; Barnes, Michael G; Pollick, Carrie; Chalfin, Jamie; Nix, Jeremy; Smith, Christopher; Ganta, Rajesh
2012-01-01
Residual clinical samples represent a very appealing source of biomaterial for translational and clinical research. We describe the implementation of an opt-in biobank, with consent being obtained at the time of registration and the decision stored in our electronic health record, Epic. Information on that decision, along with laboratory data, is transferred to an application that signals to biobank staff whether a given sample can be kept for research. Investigators can search for samples using our i2b2 data warehouse. Patient participation has been overwhelmingly positive and much higher than anticipated. Over 86% of patients provided consent and almost 83% requested to be notified of any incidental research findings. In 6 months, we obtained decisions from over 18 000 patients and processed 8000 blood samples for storage in our research biobank. However, commercial electronic health records like Epic lack key functionality required by a registrar-based consent process, although workarounds exist. PMID:22878682
Parental Divorce and Child Mental Health Trajectories
ERIC Educational Resources Information Center
Strohschein, Lisa
2005-01-01
A process-oriented approach to parental divorce locates the experience within the social and developmental context of children's lives, providing greater insight into how parental divorce produces vulnerability in some children and resiliency in others. The current study involves prospectively tracking a nationally representative sample of…
The contribution of temporary storage and executive processes to category learning.
Wang, Tengfei; Ren, Xuezhu; Schweizer, Karl
2015-09-01
Three distinctly different working memory processes, temporary storage, mental shifting and inhibition, were proposed to account for individual differences in category learning. A sample of 213 participants completed a classic category learning task and two working memory tasks that were experimentally manipulated for tapping specific working memory processes. Fixed-links models were used to decompose data of the category learning task into two independent components representing basic performance and improvement in performance in category learning. Processes of working memory were also represented by fixed-links models. In a next step the three working memory processes were linked to components of category learning. Results from modeling analyses indicated that temporary storage had a significant effect on basic performance and shifting had a moderate effect on improvement in performance. In contrast, inhibition showed no effect on any component of the category learning task. These results suggest that temporary storage and the shifting process play different roles in the course of acquiring new categories. Copyright © 2015 Elsevier B.V. All rights reserved.
The effect of membrane filtration on dissolved trace element concentrations
Horowitz, A.J.; Lum, K.R.; Garbarino, J.R.; Hall, G.E.M.; Lemieux, C.; Demas, C.R.
1996-01-01
The almost universally accepted operational definition for dissolved constituents is based on processing whole-water samples through a 0.45-µm membrane filter. Results from field and laboratory experiments indicate that a number of factors associated with filtration, other than just pore size (e.g., diameter, manufacturer, volume of sample processed, amount of suspended sediment in the sample), can produce substantial variations in the 'dissolved' concentrations of such elements as Fe, Al, Cu, Zn, Pb, Co, and Ni. These variations result from the inclusion/exclusion of colloidally-associated trace elements. Thus, 'dissolved' concentrations quantitated by analyzing filtrates generated by processing whole-water through similar pore-sized membrane filters may not be equal/comparable. As such, simple filtration through a 0.45-µm membrane filter may no longer represent an acceptable operational definition for dissolved chemical constituents. This conclusion may have important implications for environmental studies and regulatory agencies.
Yang, Z Janet; McComas, Katherine A; Gay, Geri K; Leonard, John P; Dannenberg, Andrew J; Dillon, Hildy
2012-01-01
This study extends a risk information seeking and processing model to explore the relative effect of cognitive processing strategies, positive and negative emotions, and normative beliefs on individuals' decision making about potential health risks. Most previous research based on this theoretical framework has examined environmental risks. Applying this risk communication model to study health decision making presents an opportunity to explore theoretical boundaries of the model, while also bringing this research to bear on a pressing medical issue: low enrollment in clinical trials. Comparative analysis of data gathered from 2 telephone surveys of a representative national sample (n = 500) and a random sample of cancer patients (n = 411) indicated that emotions played a more substantive role in cancer patients' decisions to enroll in a potential trial, whereas cognitive processing strategies and normative beliefs had greater influences on the decisions of respondents from the national sample.
Micron-Scale Differential Scanning Calorimeter on a Chip
Cavicchi, Richard E; Poirier, Gregory Ernest; Suehle, John S; Gaitan, Michael; Tea, Nim H
1998-06-30
A differential scanning microcalorimeter produced on a silicon chip enables microscopic scanning calorimetry measurements of small samples and thin films. The chip may be fabricated using standard CMOS processes. The microcalorimeter includes a reference zone and a sample zone. The reference and sample zones may be at opposite ends of a suspended platform or may reside on separate platforms. An integrated polysilicon heater provides heat to each zone. A thermopile consisting of a succession of thermocouple junctions generates a voltage representing the temperature difference between the reference and sample zones. Temperature differences between the zones provide information about the chemical reactions and phase transitions that occur in a sample placed in the sample zone.
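For readers unfamiliar with thermopile readout, the differential temperature is recovered from the output voltage through the Seebeck relation V ≈ N·S·ΔT. The Python sketch below uses purely illustrative values for N and S; it is not calibration data for the device described.

    # Seebeck relation V ~= N * S * dT; N and S below are assumed,
    # illustrative values, not the chip's calibration constants.
    N_JUNCTIONS = 40           # thermocouple junction pairs in the thermopile
    SEEBECK_V_PER_K = 100e-6   # effective Seebeck coefficient per pair (V/K)

    def delta_t_from_voltage(v_thermopile):
        """Temperature difference between sample and reference zones (K)."""
        return v_thermopile / (N_JUNCTIONS * SEEBECK_V_PER_K)

    print(delta_t_from_voltage(8e-3), "K")  # an 8 mV reading maps to 2.0 K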
Food and Feed Safety Assessment: The Importance of Proper Sampling.
Kuiper, Harry A; Paoletti, Claudia
2015-03-24
The general principles for safety and nutritional evaluation of foods and feed and the potential health risks associated with hazardous compounds are described as developed by the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) and further elaborated in the European Union-funded project Safe Foods. We underline the crucial role of sampling in foods/feed safety assessment. High quality sampling should always be applied to ensure the use of adequate and representative samples as test materials for hazard identification, toxicological and nutritional characterization of identified hazards, as well as for estimating quantitative and reliable exposure levels of foods/feed or related compounds of concern for humans and animals. The importance of representative sampling is emphasized through examples of risk analyses in different areas of foods/feed production. The Theory of Sampling (TOS) is recognized as the only framework within which to ensure accuracy and precision of all sampling steps involved in the field-to-fork continuum, which is crucial to monitor foods and feed safety. Therefore, TOS must be integrated in the well-established FAO/WHO risk assessment approach in order to guarantee a transparent and correct frame for the risk assessment and decision making process.
Weber, Daniela; Davies, Michael J.; Grune, Tilman
2015-01-01
Protein oxidation is involved in regulatory physiological events as well as in damage to tissues and is thought to play a key role in the pathophysiology of diseases and in the aging process. Protein-bound carbonyls represent a marker of global protein oxidation, as they are generated by multiple different reactive oxygen species in blood, tissues and cells. Sample preparation and stabilization are key steps in the accurate quantification of oxidation-related products and examination of physiological/pathological processes. This review therefore focuses on the sample preparation processes used in the most relevant methods to detect protein carbonyls after derivatization with 2,4-dinitrophenylhydrazine with an emphasis on measurement in plasma, cells, organ homogenates, isolated proteins and organelles. Sample preparation, derivatization conditions and protein handling are presented for the spectrophotometric and HPLC method as well as for immunoblotting and ELISA. An extensive overview covering these methods in previously published articles is given for researchers who plan to measure protein carbonyls in different samples. PMID:26141921
'Peeling a comet': Layering of comet analogues
NASA Astrophysics Data System (ADS)
Kaufmann, E.; Hagermann, A.
2017-09-01
Using a simple comet analogue, we investigate the influence of subsurface solar light absorption by dust. We found that a sample initially consisting of loose water ice grains and carbon particles becomes significantly harder after being irradiated with artificial sunlight for several hours. Further, a drastic change of the sample surface was observed. These results suggest that models should treat the nucleus surface as an interactive transitional zone to better represent cometary processes.
Spectroscopic analyses of soil samples outside Nile Delta of Egypt
NASA Astrophysics Data System (ADS)
Fakhry, Ahmed; Osman, Osama; Ezzat, Hend; Ibrahim, Medhat
2016-11-01
Soil in Egypt, especially around the Delta, is exposed to various pollutants which adversely affect soil fertility and stability. Humic acids (HA), as a main part of soil organic matter (SOM), represent the heart of the interaction process of inorganic pollutants with soil. Consequently, Fourier transform infrared spectroscopy (FTIR) and nuclear magnetic resonance (NMR) were used to characterize soil, sediment and extracted HA. The resulting data confirmed that the HA was responsible for transporting inorganic pollutants from the surface to the subsurface, reaching the groundwater, which may represent a high risk to public health. In this transport process, carboxyl groups in the surface soil are changed into metal carboxylates and then transferred back to carboxyl in the deeper soil.
Method and system for determining precursors of health abnormalities from processing medical records
None, None
2013-06-25
Medical reports are converted to document vectors in computing apparatus and sampled by applying a maximum variation sampling function, including a fitness function, to the document vectors to reduce the number of medical records being processed and to increase the diversity of the medical records being processed. Linguistic phrases are extracted from the medical records and converted to s-grams. A Haar wavelet function is applied to the s-grams over a preselected time interval, and the coefficient results of the Haar wavelet function are examined for patterns representing the likelihood of health abnormalities. This confirms certain s-grams as precursors of the health abnormality, and a parameter can be calculated in relation to the occurrence of such a health abnormality.
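The patented pipeline itself cannot be reproduced from the abstract, but the wavelet step can be illustrated. The Python sketch below assumes each s-gram has already been reduced to a per-interval occurrence count and applies a multi-level Haar decomposition with PyWavelets (an assumed dependency); large detail coefficients flag abrupt frequency changes that would then be screened as candidate precursors.

    import numpy as np
    import pywt  # PyWavelets, assumed available

    # Hypothetical per-interval occurrence counts of one s-gram extracted from
    # a patient's reports over the preselected time interval.
    counts = np.array([0, 1, 0, 0, 1, 0, 1, 2, 3, 3, 4, 5, 4, 5, 6, 6], float)

    # Multi-level Haar decomposition; detail coefficients (coarsest to finest)
    # highlight step-like changes in the s-gram's frequency over time.
    coeffs = pywt.wavedec(counts, 'haar')
    for d in coeffs[1:]:
        print(np.round(d, 2))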
Digital audio watermarking using moment-preserving thresholding
NASA Astrophysics Data System (ADS)
Choi, DooSeop; Jung, Hae Kyung; Choi, Hyuk; Kim, Taejeong
2007-09-01
The moment-preserving thresholding (MPT) technique for digital images has been used in digital image processing for decades, especially in image binarization and image compression. Its main strength is that the binary values that MPT produces, called representative values, are usually unaffected when the thresholded signal goes through a signal processing operation. The two representative values in MPT, together with the threshold value, are obtained by solving the system of preservation equations for the first, second, and third moments. Relying on this robustness of the representative values to various signal processing attacks considered in the watermarking context, this paper proposes a new watermarking scheme for audio signals. The watermark is embedded in the root-sum-square (RSS) of the two representative values of each signal block using the quantization technique. As a result, the RSS values are modified by scaling the signal according to the watermark bit sequence under the constraint of inaudibility relative to the human psycho-acoustic model. We also address and suggest solutions to the problems of synchronization and power-scaling attacks. Experimental results show that the proposed scheme maintains high audio quality and robustness to various attacks including MP3 compression, re-sampling, jittering, and DA/AD conversion.
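A compact illustration of the embedding quantities is given below: the standard bilevel moment-preserving thresholding closed form (Tsai-style) applied to one hypothetical audio block, followed by the root-sum-square of the two representative values that the scheme quantizes. The closed form is assumed to match the paper's formulation; the quantization-index step and the psycho-acoustic constraint are not shown.

    import numpy as np

    def mpt_representative_values(block):
        """Bilevel moment-preserving thresholding (standard closed form).
        Returns (z0, z1, p0): the two representative values and the fraction
        p0 of samples mapped to z0, chosen so the two-level approximation
        preserves the block's first three sample moments."""
        x = np.asarray(block, dtype=float)
        m1, m2, m3 = x.mean(), (x**2).mean(), (x**3).mean()
        cd = m2 - m1**2
        c0 = (m1 * m3 - m2**2) / cd
        c1 = (m1 * m2 - m3) / cd
        disc = np.sqrt(c1**2 - 4.0 * c0)
        z0, z1 = (-c1 - disc) / 2.0, (-c1 + disc) / 2.0
        p0 = (z1 - m1) / (z1 - z0)
        return z0, z1, p0

    rng = np.random.default_rng(1)
    block = 0.2 * rng.standard_normal(1024)   # one hypothetical audio block
    z0, z1, p0 = mpt_representative_values(block)
    rss = np.hypot(z0, z1)                    # quantity carrying the watermark bit
    print(round(z0, 4), round(z1, 4), round(p0, 3), round(rss, 4))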
NASA Technical Reports Server (NTRS)
Morris, Penny A.; Wentworth, Susan J.; Nelman, Mayra; Byrne, Monica; Longazo, Teresa; Galindo, Charles; McKay, David S.; Sams, Clarence
2003-01-01
Terrestrial biotas from microbially dominated hypersaline environments will help us understand microbial fossilization processes. Hypersaline-tolerant biota from Storr's Lake, San Salvador Island (Bahamas), Mono Lake (California), and the Dead Sea (Israel) represent marine and nonmarine sites for comparative studies of potential analogs for interpreting some Mars meteorites and Mars sample return rocks [1,2,3,4,5,6]. The purpose of this study is to compare microbial fossilization processes, the dominant associated minerals, and potential diagenetic implications.
Rix, Catherine S; Sims, Mark R; Cullen, David C
2011-11-01
The proposed ExoMars mission, due to launch in 2018, aims to look for evidence of extant and extinct life in martian rocks and regolith. Previous attempts to detect organic molecules of biological or abiotic origin on Mars have been unsuccessful, which may be attributable to destruction of these molecules by perchlorate salts during pyrolysis sample extraction techniques. Organic molecules can also be extracted and measured with solvent-based systems. The ExoMars payload includes the Life Marker Chip (LMC) instrument, capable of detecting biomarker molecules of extant and extinct Earth-like life in liquid extracts of martian samples with an antibody microarray assay. The aim of the work reported here was to investigate whether the presence of perchlorate salts, at levels similar to those at the NASA Phoenix landing site, would compromise the LMC extraction and detection method. To test this, we implemented an LMC-representative sample extraction process with an LMC-representative antibody assay and used these to extract and analyze a model sample that consisted of a Mars analog sample matrix (JSC Mars-1) spiked with a representative organic molecular target (pyrene, an example of abiotic meteoritic infall targets) in the presence of perchlorate salts. We found no significant change in immunoassay function when using pyrene standards with added perchlorate salts. When model samples spiked with perchlorate salts were subjected to an LMC-representative liquid extraction, immunoassays functioned in a liquid extract and detected extracted pyrene. For the same model sample matrix without perchlorate salts, we observed anomalous assay signals that coincided with yellow coloration of the extracts. This unexpected observation is being studied further. This initial study indicates that the presence of perchlorate salts, at levels similar to those detected at the NASA Phoenix landing site, is unlikely to prevent the LMC from extracting and detecting organic molecules from martian samples.
Mathematics Learning Development: The Role of Long-Term Retrieval
ERIC Educational Resources Information Center
Calderón-Tena, Carlos O.; Caterino, Linda C.
2016-01-01
This study assessed the relation between long-term memory retrieval and mathematics calculation and mathematics problem solving achievement among elementary, middle, and high school students in nationally representative sample of US students, when controlling for fluid and crystallized intelligence, short-term memory, and processing speed. As…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-18
... Information Collection: Quality Control for Rental Assistance Subsidy Determinations AGENCY: Office of the... Collection Title of Information Collection: Quality Control for Rental Assistance Subsidy Determinations. OMB... Quality Control process involves selecting a nationally representative sample of assisted households to...
Simulation and flavor compound analysis of dealcoholized beer via one-step vacuum distillation.
Andrés-Iglesias, Cristina; García-Serna, Juan; Montero, Olimpio; Blanco, Carlos A
2015-10-01
The coupled operation of a lab-scale vacuum distillation process to produce alcohol-free beer and Aspen HYSYS simulation software was studied to define the chemical changes in the aroma profiles of 2 different lager beers during dealcoholization. In the lab-scale process, 2 different parameter sets were chosen to dealcoholize beer samples: 102 mbar at 50°C and 200 mbar at 67°C. Samples taken at different steps of the process were analyzed by HS-SPME-GC-MS, focusing on the concentration of 7 flavor compounds, 5 alcohols and 2 esters. For the simulation, the EoS parameters of the Wilson-2 property package were adjusted to the experimental data and one more pressure was tested (60 mbar). Simulation methods represent a viable alternative for predicting the volatile compound composition of the final dealcoholized beer. Copyright © 2015 Elsevier Ltd. All rights reserved.
Modeling unobserved sources of heterogeneity in animal abundance using a Dirichlet process prior
Dorazio, R.M.; Mukherjee, B.; Zhang, L.; Ghosh, M.; Jelks, H.L.; Jordan, F.
2008-01-01
In surveys of natural populations of animals, a sampling protocol is often spatially replicated to collect a representative sample of the population. In these surveys, differences in abundance of animals among sample locations may induce spatial heterogeneity in the counts associated with a particular sampling protocol. For some species, the sources of heterogeneity in abundance may be unknown or unmeasurable, leading one to specify the variation in abundance among sample locations stochastically. However, choosing a parametric model for the distribution of unmeasured heterogeneity is potentially subject to error and can have profound effects on predictions of abundance at unsampled locations. In this article, we develop an alternative approach wherein a Dirichlet process prior is assumed for the distribution of latent abundances. This approach allows for uncertainty in model specification and for natural clustering in the distribution of abundances in a data-adaptive way. We apply this approach in an analysis of counts based on removal samples of an endangered fish species, the Okaloosa darter. Results of our data analysis and simulation studies suggest that our implementation of the Dirichlet process prior has several attractive features not shared by conventional, fully parametric alternatives. © 2008, The International Biometric Society.
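To illustrate the prior only (not the authors' removal-sampling likelihood or their fitting algorithm), the Python sketch below draws site-level abundances from a truncated stick-breaking Dirichlet-process mixture of Poisson rates with a Gamma base measure; all parameter values are assumptions.

    import numpy as np

    rng = np.random.default_rng(42)

    def dp_abundance_prior_draw(n_sites, alpha=1.0, trunc=25,
                                base_shape=2.0, base_rate=0.5):
        """One draw from a truncated stick-breaking Dirichlet-process mixture
        of Poisson abundance rates (Gamma base measure)."""
        v = rng.beta(1.0, alpha, size=trunc)              # stick-breaking fractions
        w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
        w /= w.sum()                                      # renormalize the truncation
        lam = rng.gamma(base_shape, 1.0 / base_rate, size=trunc)  # cluster rates
        z = rng.choice(trunc, size=n_sites, p=w)          # cluster label per site
        return rng.poisson(lam[z]), z                     # latent abundance per site

    abundance, labels = dp_abundance_prior_draw(n_sites=30)
    print("site abundances:", abundance)
    print("distinct clusters used:", len(set(labels)))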
Lippi, Giuseppe; Montagnana, Martina; Giavarina, Davide
2006-01-01
Owing to remarkable advances in automation, laboratory technology and informatics, the pre-analytical phase has become the major source of variability in laboratory testing. The present survey investigated the development of several pre-analytical processes within a representative cohort of Italian clinical laboratories. A seven-point questionnaire was designed to investigate the following issues: 1a) the mean outpatient waiting time before check-in and 1b) the mean time from check-in to sample collection; 2) the mean time from sample collection to analysis; 3) the type of specimen collected for clinical chemistry testing; 4) the degree of pre-analytical automation; 5a) the number of samples shipped to other laboratories and 5b) the availability of standardised protocols for transportation; 6) the conditions for specimen storage; and 7) the availability and type of guidelines for management of unsuitable specimens. The questionnaire was administered to 150 laboratory specialists attending the SIMEL (Italian Society of Laboratory Medicine) National Meeting in June 2006. A total of 107 questionnaires (71.3%) were returned. Data analysis revealed a high degree of variability among laboratories for the time required for check-in, outpatient sampling, sample transportation to the referral laboratory and analysis upon arrival. Only 31% of laboratories have automated some pre-analytical steps. Of the 87% of laboratories that ship specimens to other facilities without sample preparation, 19% have no standardised protocol for transportation. For conventional clinical chemistry testing, 74% of the laboratories use serum evacuated tubes (59% with and 15% without serum separator), whereas the remaining 26% use lithium-heparin evacuated tubes (11% with and 15% without plasma separator). The storage period and conditions for rerun/retest vary widely. Only 63% of laboratories have a codified procedure for the management of unsuitable specimens, which are recognised by visual inspection (69%) or automatic detection (29%). Only 56% of the laboratories have standardised procedures for the management of unsuitable specimens, which vary widely on a local basis. The survey highlights broad heterogeneity in several pre-analytical processes among Italian laboratories. The lack of reliable guidelines encompassing evidence-based practice is a major problem for the standardisation of this crucial part of the testing process and represents a major challenge for laboratory medicine in the 2000s.
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools into pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
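As one concrete example of the PAT-plus-chemometrics pairing described above, the sketch below fits a partial least squares model from synthetic spectra to a quality attribute using scikit-learn (an assumed dependency); the data, band shapes, and component count are invented for illustration only.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)

    # Synthetic "spectra": 120 samples x 400 wavelengths in which the quality
    # attribute y (e.g. a concentration) drives two broad spectral bands.
    n, p = 120, 400
    y = rng.uniform(0.1, 1.0, size=n)
    wl = np.linspace(0, 1, p)
    band = np.exp(-((wl - 0.3) / 0.05) ** 2) + 0.5 * np.exp(-((wl - 0.7) / 0.08) ** 2)
    X = np.outer(y, band) + rng.normal(scale=0.02, size=(n, p))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
    print("held-out R^2:", round(pls.score(X_te, y_te), 3))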
NETWORK DESIGN FACTORS FOR ASSESSING TEMPORAL VARIABILITY IN GROUND-WATER QUALITY
A 1.5-year benchmark data set was collected at biweekly frequency from two sites in shallow sand and gravel deposits in West Central Illinois. One site was near a hog-processing facility and the other represented uncontaminated conditions. Consistent sampling and analytical protoco...
40 CFR Appendix I to Part 261 - Representative Sampling Methods
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Representative Sampling Methods I...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...
40 CFR Appendix I to Part 261 - Representative Sampling Methods
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Representative Sampling Methods I...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the form and consistency of the waste materials to be sampled. Samples collected using the sampling...
1998-01-01
bleaching. Dr. Rowan and I have taken samples from these colonies and Dr. Rowan will process these samples with the hypothesis that zooxanthellae from...non-bleached colonies will be represented by clade A zooxanthellae, a species more resistant to temperature stress. IMPACT/APPLICATIONS Two specific
Stevens, C. M. [Chemical Technology Division, Argonne National Laboratory, Argonne, Illinois (USA)]
2012-01-01
This data package presents atmospheric CH4 concentration and 13C isotopic abundance data derived from air samples collected over the period 1978-1989 at globally distributed clean-air sites. The data set comprises 201 records, 166 from the Northern Hemisphere and 35 from the Southern Hemisphere. The air samples were collected mostly in rural or marine locations remote from large sources of CH4 and are considered representative of tropospheric background conditions. The air samples were processed by isolation of CH4 from air and conversion to CO2 for isotopic analysis by isotope ratio mass spectrometry. These data represent one of the earliest records of 13C isotopic measurements for atmospheric methane and have been used to refine estimates of CH4 emissions, calculate annual growth rates of emissions from changing sources, and provide evidence for changes in the rate of atmospheric removal of CH4. The data records consist of sample collection date; number of samples combined for analysis; sampling location; analysis date; CH4 concentration; 13C isotopic abundance; and flag codes to indicate outliers, repeated analyses, and other information.
Some thoughts on problems associated with various sampling media used for environmental monitoring
Horowitz, A.J.
1997-01-01
Modern analytical instrumentation is capable of measuring a variety of trace elements at concentrations down into the single or double digit parts-per-trillion (ng l-1) range. This holds for the three most common sample media currently used in environmental monitoring programs: filtered water, whole-water and separated suspended sediment. Unfortunately, current analytical capabilities have exceeded the current capacity to collect both uncontaminated and representative environmental samples. The success of any trace element monitoring program requires that this issue be both understood and addressed. The environmental monitoring of trace elements requires the collection of calendar- and event-based dissolved and suspended sediment samples. There are unique problems associated with the collection and chemical analyses of both types of sample media. Over the past 10 years, reported ambient dissolved trace element concentrations have declined. Generally, these decreases do not reflect better water quality, but rather improvements in the procedures used to collect, process, preserve and analyze these samples without contaminating them during these steps. Further, recent studies have shown that the currently accepted operational definition of dissolved constituents (material passing a 0.45 µm membrane filter) is inadequate owing to sampling and processing artifacts. The existence of these artifacts raises questions about the generation of accurate, precise and comparable 'dissolved' trace element data. Suspended sediment and associated trace elements can display marked short- and long-term spatial and temporal variability. This implies that spatially representative samples can only be obtained by generating composites using depth- and width-integrated sampling techniques. Additionally, temporal variations have led to the view that the determination of annual trace element fluxes may require nearly constant (e.g., high-frequency) sampling and subsequent chemical analyses. Ultimately, sampling frequency for flux estimates becomes dependent on the time period of concern (daily, weekly, monthly, yearly) and the amount of acceptable error associated with these estimates.
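The flux-versus-sampling-frequency point lends itself to a small numerical illustration. The Python sketch below builds a synthetic, storm-dominated year of discharge and concentration, computes the "true" annual load as the sum of C·Q·Δt, and shows how sparser sampling schedules change the estimate; all numbers are invented and the simple scale-up estimator is only one of many possible choices.

    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic daily record for one year: discharge Q (m3/s) with a dozen
    # storm events and a concentration C (ug/L) that rises sharply with flow.
    days = 365
    q = 10.0 + rng.normal(0, 0.5, days)
    storms = rng.choice(days, size=12, replace=False)
    q[storms] += rng.uniform(50, 200, size=12)
    c = 2.0 + 0.4 * q + rng.normal(0, 0.5, days)

    # Daily load in grams: (ug/L) * (m3/s) * 86400 s/d * 1000 L/m3 * 1e-6 g/ug
    daily_load_g = c * q * 86_400.0 * 1e-3
    true_annual_t = daily_load_g.sum() / 1e6          # tonnes per year

    for every_n_days in (1, 7, 30):
        est = daily_load_g[::every_n_days].mean() * days / 1e6
        print(f"sampling every {every_n_days:2d} d -> {est:7.1f} t/yr "
              f"(true {true_annual_t:.1f})")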
Korbel, Kathryn; Chariton, Anthony; Stephenson, Sarah; Greenfield, Paul; Hose, Grant C.
2017-01-01
When compared to surface ecosystems, groundwater sampling has unique constraints, including limited access to ecosystems through wells. In order to monitor groundwater, a detailed understanding of groundwater biota and what biological sampling of wells truly reflects, is paramount. This study aims to address this uncertainty, comparing the composition of biota in groundwater wells prior to and after purging, with samples collected prior to purging reflecting a potentially artificial environment and samples collected after purging representing the surrounding aquifer. This study uses DNA community profiling (metabarcoding) of 16S rDNA and 18S rDNA, combined with traditional stygofauna sampling methods, to characterise groundwater biota from four catchments within eastern Australia. Aquifer waters were dominated by Archaea and bacteria (e.g. Nitrosopumilales) that are often associated with nitrification processes, and contained a greater proportion of bacteria (e.g. Anaerolineales) associated with fermenting processes compared to well waters. In contrast, unpurged wells contained greater proportions of pathogenic bacteria and bacteria often associated with denitrification processes. In terms of eukaryotes, the abundances of copepods, syncarids and oligochaetes and total abundances of stygofauna were greater in wells than aquifers. These findings highlight the need to consider sampling requirements when completing groundwater ecology surveys. PMID:28102290
West, A G; Goldsmith, G R; Matimati, I; Dawson, T E
2011-08-30
Previous studies have demonstrated the potential for large errors to occur when analyzing waters containing organic contaminants using isotope ratio infrared spectroscopy (IRIS). In an attempt to address this problem, IRIS manufacturers now provide post-processing spectral analysis software capable of identifying samples with the types of spectral interference that compromises their stable isotope analysis. Here we report two independent tests of this post-processing spectral analysis software on two IRIS systems, OA-ICOS (Los Gatos Research Inc.) and WS-CRDS (Picarro Inc.). Following a similar methodology to a previous study, we cryogenically extracted plant leaf water and soil water and measured the δ(2)H and δ(18)O values of identical samples by isotope ratio mass spectrometry (IRMS) and IRIS. As an additional test, we analyzed plant stem waters and tap waters by IRMS and IRIS in an independent laboratory. For all tests we assumed that the IRMS value represented the "true" value against which we could compare the stable isotope results from the IRIS methods. Samples showing significant deviations from the IRMS value (>2σ) were considered to be contaminated and representative of spectral interference in the IRIS measurement. Over the two studies, 83% of plant species were considered contaminated on OA-ICOS and 58% on WS-CRDS. Post-analysis, spectra were analyzed using the manufacturer's spectral analysis software, in order to see if the software correctly identified contaminated samples. In our tests the software performed well, identifying all the samples with major errors. However, some false negatives indicate that user evaluation and testing of the software are necessary. Repeat sampling of plants showed considerable variation in the discrepancies between IRIS and IRMS. As such, we recommend that spectral analysis of IRIS data must be incorporated into standard post-processing routines. Furthermore, we suggest that the results from spectral analysis be included when reporting stable isotope data from IRIS. Copyright © 2011 John Wiley & Sons, Ltd.
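The screening rule described above (flagging samples whose IRIS result deviates from the paired IRMS value by more than 2σ) is easy to express directly; the Python sketch below uses invented δ2H values and an assumed analytical precision.

    # Flag samples whose IRIS value deviates from the paired IRMS ("true")
    # value by more than 2 sigma. All values below are illustrative.
    sigma_d2h = 2.0  # assumed analytical precision for delta-2H, in per mil

    paired = {        # sample id: (IRMS delta-2H, IRIS delta-2H), per mil
        "leaf-01": (-62.1, -60.9),
        "leaf-02": (-58.4, -49.7),   # large offset: likely spectral interference
        "stem-01": (-70.2, -69.8),
    }

    for sample_id, (irms, iris) in paired.items():
        flagged = abs(iris - irms) > 2 * sigma_d2h
        print(sample_id, "contaminated" if flagged else "ok",
              f"(offset {iris - irms:+.1f} per mil)")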
Caititu: a tool to graphically represent peptide sequence coverage and domain distribution.
Carvalho, Paulo C; Junqueira, Magno; Valente, Richard H; Domont, Gilberto B
2008-10-07
Here we present Caititu, an easy-to-use proteomics software tool for graphically representing peptide sequence coverage and domain distribution for different correlated samples (e.g. originating from 2D gel spots) relative to the full sequence of the known protein they are related to. Although Caititu has broad applicability, we exemplify its usefulness in toxinology using snake venom as a model. For example, proteolytic processing may lead to inactivation or loss of domains. Therefore, our proposed graphic representation for peptides identified by two-dimensional electrophoresis followed by mass spectrometric identification of excised spots can aid in inferring what kind of processing happened to the toxins, if any. Caititu is freely available to download at: http://pcarvalho.com/things/caititu.
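The underlying bookkeeping that such a tool visualizes is straightforward; the Python sketch below (not Caititu's actual code) marks the residues covered by identified peptides and reports the covered fraction of the full protein sequence, using an invented toxin fragment and peptide list.

    def sequence_coverage(protein_seq, peptides):
        """Mark residues covered by identified peptides and return the
        fraction of the full-length sequence they span."""
        covered = [False] * len(protein_seq)
        for pep in peptides:
            start = protein_seq.find(pep)
            while start != -1:                       # mark every occurrence
                for i in range(start, start + len(pep)):
                    covered[i] = True
                start = protein_seq.find(pep, start + 1)
        return sum(covered) / len(protein_seq), covered

    # Hypothetical toxin sequence and peptides identified from one 2D gel spot.
    protein = "MKTLLLTLVVVTIVCLDLGYTLKCNKLVPLFYKTCPAGKNLCYKMFM"
    peptides = ["LKCNKLVPLFYK", "TCPAGKNLCYK"]
    frac, mask = sequence_coverage(protein, peptides)
    print(f"sequence coverage: {100 * frac:.1f}%")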
Atomic diffusion in laser surface modified AISI H13 steel
NASA Astrophysics Data System (ADS)
Aqida, S. N.; Brabazon, D.; Naher, S.
2013-07-01
This paper presents a laser surface modification process for AISI H13 steel using laser spot sizes of 0.09 and 0.4 mm, with the aim of increasing surface hardness and investigating element diffusion in the laser-modified surface. A Rofin DC-015 diffusion-cooled CO2 slab laser was used to process AISI H13 steel samples. Samples of 10 mm diameter were sectioned to 100 mm length in order to process a predefined circumferential area. The parameters selected for examination were laser peak power, pulse repetition frequency (PRF), and overlap percentage. The hardness properties were tested at 981 mN force. Metallographic study and energy dispersive X-ray spectroscopy (EDXS) were performed to observe the presence of elements and their distribution in the sample surface. The maximum hardness achieved in the modified surface was 1017 HV0.1. Changes in element composition in the modified layer region were detected in the laser-modified samples. Diffusion possibly occurred for the C, Cr, Cu, Ni, and S elements. The potential for increased surface hardness represents an important means of extending tooling life. The EDXS findings clarify the effect of the processing parameters on the modified surface composition.
Weller, Daniel; Andrus, Alexis; Wiedmann, Martin; den Bakker, Henk C
2015-01-01
Sampling of seafood and dairy processing facilities in the north-eastern USA produced 18 isolates of Listeria spp. that could not be identified at the species-level using traditional phenotypic and genotypic identification methods. Results of phenotypic and genotypic analyses suggested that the isolates represent two novel species with an average nucleotide blast identity of less than 92% with previously described species of the genus Listeria. Phylogenetic analyses based on whole genome sequences, 16S rRNA gene and sigB gene sequences confirmed that the isolates represented by type strain FSL M6-0635(T) and FSL A5-0209 cluster phylogenetically with Listeria cornellensis. Phylogenetic analyses also showed that the isolates represented by type strain FSL A5-0281(T) cluster phylogenetically with Listeria riparia. The name Listeria booriae sp. nov. is proposed for the species represented by type strain FSL A5-0281(T) ( =DSM 28860(T) =LMG 28311(T)), and the name Listeria newyorkensis sp. nov. is proposed for the species represented by type strain FSL M6-0635(T) ( =DSM 28861(T) =LMG 28310(T)). Phenotypic and genotypic analyses suggest that neither species is pathogenic. © 2015 IUMS.
Liberto, Erica; Cagliero, Cecilia; Cordero, Chiara; Rubiolo, Patrizia; Bicchi, Carlo; Sgorbini, Barbara
2017-03-17
Recent technological advances in dynamic headspace sampling (D-HS) and the possibility to automate this sampling method have led to a marked improvement in its performance, a strong renewal of interest in it, and an extension of its fields of application. The introduction of in-parallel and in-series automatic multi-sampling and of new trapping materials, plus the possibility of designing an effective sampling process by correctly applying breakthrough volume theory, have made profiling more representative and have enhanced selectivity and flexibility, also offering the possibility of fractionated enrichment, in particular for high-volatility compounds. This study deals with the ability of fractionated D-HS to produce a sample representative of the volatile fraction of solid or liquid matrices. Experiments were carried out on a model equimolar (0.5 mM) EtOH/water solution, comprising 16 compounds with different polarities and volatilities, structures ranging from C5 to C15 and vapor pressures from 4.15 kPa (2,3-pentandione) to 0.004 kPa (t-β-caryophyllene), and on an Arabica roasted coffee powder. Three trapping materials were considered: Tenax TA™ (TX), polydimethylsiloxane foam (PDMS), and a three-carbon cartridge Carbopack B/Carbopack C/Carbosieve S-III™ (CBS). The influence of several parameters on the design of successful fractionated D-HS sampling, including the physical and chemical characteristics of the analytes and matrix, trapping material, analyte breakthrough, purge gas volumes, and sampling temperature, was investigated. The results show that, by appropriately choosing sampling conditions, fractionated D-HS sampling based on component volatility can produce a fast and representative profile of the matrix volatile fraction, with total recoveries comparable to those obtained by full-evaporation D-HS for liquid samples, and very high concentration factors for solid samples. Copyright © 2017 Elsevier B.V. All rights reserved.
Studies of erosion of solar max samples of Kapton and Teflon
NASA Technical Reports Server (NTRS)
Fristrom, R. M.; Benson, R. C.; Bargeron, C. B.; Phillips, T. E.; Vest, C. E.; Hoshall, C. H.; Satkiewicz, F. G.; Uy, O. M.
1985-01-01
Several samples of Kapton and Teflon which were exposed to solar radiation were examined. The samples represent material behavior in near-Earth space. Clues to the identity of erosive processes and the responsible species were searched for. Interest centered on oxygen atoms, which are ubiquitous at these altitudes and are known to erode some metal surfaces. Three diagnostic methods were employed: optical microscopy, scanning electron microscopy, and Fourier transform infrared spectroscopy. Two types of simulation were used: a flow containing low-energy oxygen atoms and bombardment with 3000 volt Ar ions. Results and conclusions are presented.
A Surrogate-based Adaptive Sampling Approach for History Matching and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Weixuan; Zhang, Dongxiao; Lin, Guang
A critical procedure in reservoir simulations is history matching (or data assimilation in a broader sense), which calibrates model parameters such that the simulation results are consistent with field measurements, and hence improves the credibility of the predictions given by the simulations. Often there exist non-unique combinations of parameter values that all yield simulation results matching the measurements. For such ill-posed history matching problems, Bayes' theorem provides a theoretical foundation to represent different solutions and to quantify the uncertainty with the posterior PDF. Lacking an analytical solution in most situations, the posterior PDF may be characterized with a sample of realizations, each representing a possible scenario. A novel sampling algorithm is presented here for the Bayesian solutions to history matching problems. We aim to deal with two commonly encountered issues: 1) as a result of the nonlinear input-output relationship in a reservoir model, the posterior distribution could be in a complex form, such as multimodal, which violates the Gaussian assumption required by most of the commonly used data assimilation approaches; 2) a typical sampling method requires intensive model evaluations and hence may cause unaffordable computational cost. In the developed algorithm, we use a Gaussian mixture model as the proposal distribution in the sampling process, which is simple but also flexible enough to approximate non-Gaussian distributions and is particularly efficient when the posterior is multimodal. Also, a Gaussian process is utilized as a surrogate model to speed up the sampling process. Furthermore, an iterative scheme of adaptive surrogate refinement and re-sampling ensures sampling accuracy while keeping the computational cost at a minimum level. The developed approach is demonstrated with an illustrative example and shows its capability in handling the above-mentioned issues. The multimodal posterior of the history matching problem is captured and used to give a reliable production prediction with uncertainty quantification. The new algorithm shows a great improvement in computational efficiency compared with previously studied approaches for the sample problem.
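The interplay described in this abstract can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy (a one-dimensional forward model, a flat prior, and plain importance resampling rather than the authors' exact scheme) that only shows how a Gaussian-mixture proposal and a Gaussian-process surrogate interact under iterative refinement; none of the function names or settings come from the record above.

# Toy surrogate-based adaptive sampling for a Bayesian calibration problem.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def forward_model(theta):                 # stand-in for an expensive simulator
    return np.sin(3.0 * theta) + 0.5 * theta**2

d_obs, sigma = 0.8, 0.1                   # hypothetical "field measurement" and noise

def log_post(pred):                       # flat prior on [-2, 2]; Gaussian likelihood
    return -0.5 * ((pred - d_obs) / sigma) ** 2

X = rng.uniform(-2, 2, size=(8, 1))       # initial design: a few expensive runs
y = forward_model(X[:, 0])
proposal = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
samples = rng.uniform(-2, 2, size=(500, 1))

for it in range(5):
    gp = GaussianProcessRegressor().fit(X, y)          # surrogate of the simulator
    proposal.fit(samples)                               # GMM proposal from current sample
    cand, _ = proposal.sample(2000)
    cand = np.clip(cand, -2, 2)
    logw = log_post(gp.predict(cand)) - proposal.score_samples(cand)
    w = np.exp(logw - logw.max()); w /= w.sum()
    idx = rng.choice(len(cand), size=500, p=w)          # importance resampling
    samples = cand[idx]
    new_x = samples[rng.choice(len(samples), size=4)]   # refine surrogate where posterior mass is
    X = np.vstack([X, new_x]); y = np.append(y, forward_model(new_x[:, 0]))

print("posterior mean estimate:", samples.mean())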
Under-sampling in a Multiple-Channel Laser Vibrometry System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corey, Jordan
2007-03-01
Laser vibrometry is a technique used to detect vibrations on objects using the interference of coherent light with itself. Most vibrometry systems process only one target location at a time, but processing multiple locations simultaneously provides improved detection capabilities. Traditional laser vibrometry systems employ oversampling to sample the incoming modulated-light signal; however, as the number of channels increases in these systems, issues arise such as higher computational cost, excessive heat, increased power requirements, and increased component cost. This thesis describes a novel approach to laser vibrometry that utilizes undersampling to control the undesirable issues associated with oversampled systems. Undersampling allows significantly fewer samples to represent the modulated-light signals, which offers several advantages in the overall system design. These advantages include an improvement in thermal efficiency, lower processing requirements, and a higher immunity to the relative intensity noise inherent in laser vibrometry applications. A unique feature of this implementation is the use of a parallel architecture to increase the overall system throughput. This parallelism is realized using a hierarchical multi-channel architecture based on off-the-shelf programmable logic devices (PLDs).
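For context, the admissible undersampling rates for a band-limited modulated signal follow from standard bandpass-sampling theory. The sketch below is a generic illustration of that constraint only, not the thesis's multi-channel design, and the example frequencies are hypothetical.

# Valid bandpass (under)sampling rates for a band [f_lo, f_hi]:
#   2*f_hi/n <= fs <= 2*f_lo/(n-1),   n = 1 .. floor(f_hi / (f_hi - f_lo))
def valid_undersampling_rates(f_lo, f_hi):
    """Return (n, fs_min, fs_max) windows of admissible sample rates in Hz."""
    bw = f_hi - f_lo
    n_max = int(f_hi // bw)
    windows = []
    for n in range(1, n_max + 1):
        fs_min = 2.0 * f_hi / n
        fs_max = 2.0 * f_lo / (n - 1) if n > 1 else float("inf")
        if fs_min <= fs_max:
            windows.append((n, fs_min, fs_max))
    return windows

# Example: a 1 MHz-wide band centred at 40 MHz may be sampled far below the
# 81 MHz Nyquist rate, e.g. anywhere in the n = 40 window (about 2.025-2.026 MHz).
for n, lo, hi in valid_undersampling_rates(39.5e6, 40.5e6):
    print(f"n={n:2d}: {lo/1e6:.3f} - {hi/1e6:.3f} MHz")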
19 CFR 151.52 - Sampling procedures.
Code of Federal Regulations, 2012 CFR
2012-04-01
.... Representative commercial moisture and assay samples shall be taken under Customs supervision for testing by the Customs laboratory. The samples used for the moisture test shall be representative of the shipment at the... verified commercial moisture sample and prepared assay sample certified to be representative of the...
1992-09-01
and collecting and processing data. They were at the front line in interacting with the subjects and maintaining morale. They did an excellent job. They...second for 16 parameter channels, and the data were processed to produce a single root mean square (RMS) error value for each channel appropriate to...represented in the final analysis. Physiological data The physiological data on the VAX were processed by sampling them at 5-minute intervals throughout the
Oligosaccharide formation during commercial pear juice processing.
Willems, Jamie L; Low, Nicholas H
2016-08-01
The effect of enzyme treatment and processing on the oligosaccharide profile of commercial pear juice samples was examined by high performance anion exchange chromatography with pulsed amperometric detection and capillary gas chromatography with flame ionization detection. Industrial samples representing the major stages of processing produced with various commercial enzyme preparations were studied. Through the use of commercially available standards and laboratory scale enzymatic hydrolysis of pectin, starch and xyloglucan; galacturonic acid oligomers, glucose oligomers (e.g., maltose and cellotriose) and isoprimeverose were identified as being formed during pear juice production. It was found that the majority of polysaccharide hydrolysis and oligosaccharide formation occurred during enzymatic treatment at the pear mashing stage and that the remaining processing steps had minimal impact on the carbohydrate-based chromatographic profile of pear juice. Also, all commercial enzyme preparations and conditions (time and temperature) studied produced similar carbohydrate-based chromatographic profiles. Copyright © 2016 Elsevier Ltd. All rights reserved.
A method for development of a system of identification for Appalachian coal-bearing rocks
Ferm, J.C.; Weisenfluh, G.A.; Smith, G.C.
2002-01-01
The number of observable properties of sedimentary rocks is large, and numerous classifications have been proposed for describing them. Some rock classifications, however, may be disadvantageous in situations such as logging rock core during coal exploration programs, where speed and simplicity are of the essence. After experimenting with a number of formats for logging rock core in the Appalachian coal fields, a method of using color photographs accompanied by a rock name and numeric code was selected. In order to generate a representative collection of rocks to be photographed, sampling methods were devised, and empirically based techniques were developed to identify repeatedly recognizable rock types. A number of cores representing the stratigraphic and geographic range of the region were sampled so that every megascopically recognizable variety was included in the collection; the frequency of samples of any variety reflects the frequency with which it would be encountered during logging. In order to generate repeatedly recognizable rock classes, the samples were sorted to display variation in grain size, mineral composition, color, and sedimentary structures. Class boundaries for each property were selected on the basis of existing, widely accepted limits and the precision with which these limits could be recognized. The process of sorting the core samples demonstrated relationships between rock properties and indicated that similar methods, applied to other groups of rocks, could yield more widely applicable field classifications. © 2002 Elsevier Science B.V. All rights reserved.
Bat community species richness and composition in a restinga protected area in Southeastern Brazil.
Oprea, M; Esbérard, C E L; Vieira, T B; Mendes, P; Pimenta, V T; Brito, D; Ditchfield, A D
2009-11-01
In Brazil, restingas are under severe human-induced impacts resulting in habitat degradation and loss, and they remain one of the less frequently studied ecosystems. The main objective of the present study is to describe the bat community in a restinga in Paulo Cesar Vinha State Park, Guarapari municipality, state of Espírito Santo, southeastern Brazil. Fieldwork was conducted twice a month from August 2004 to September 2005. The total sampling effort of 40,300 m²/h represents the largest sampling effort for bats in restingas to date. Bats were sampled in five different vegetation types in the area. Captured bats were processed, recording information on species, sex, age, forearm length and weight. Shannon diversity and Jaccard indexes were used to analyse diversity and similarity among habitats in the Park. A total of 554 captures belonging to 14 species and two families were obtained. Noctilio leporinus was recorded through direct observation, and an ultrasound detector also registered the presence of individuals from the family Molossidae, although it was not possible to distinguish them at the species level. Frugivores were the most representative guild. Richness was higher in Clusia shrubs (11 species) and Caraís lagoon (10 species). The Shannon diversity index was estimated at H' = 1.43 for the overall sample, with Caraís lagoon representing the most diverse habitat (H' = 1.60). The greatest similarity (J = 0.714) was observed for the two areas under high human influence.
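Both indexes named in this abstract have simple closed forms; the sketch below shows them computed on hypothetical counts, not on the study's actual capture data.

import math

def shannon_H(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species abundances."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def jaccard(a, b):
    """Jaccard similarity J = |A intersect B| / |A union B| between species sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

captures = [200, 150, 80, 50, 30, 20, 10, 5, 4, 2, 2, 1]   # hypothetical abundances
print(round(shannon_H(captures), 2))
print(jaccard({"A", "B", "C", "D"}, {"B", "C", "D", "E"}))  # 3/5 = 0.6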
Linking Vocational Education to Business/Industry Training Needs. Final Report.
ERIC Educational Resources Information Center
Gilbertson, Alan; And Others
A study investigated the processes Wisconsin's businesses and industries use to identify training and retraining needs and the mechanisms they use to communicate these needs to the state's vocational, technical, and adult education (VTAE) system. Data were collected by a survey questionnaire sent to a representative sample of 361 Wisconsin firms.…
Joint Book Reading and Receptive Vocabulary: A Parallel Process Model
ERIC Educational Resources Information Center
Meng, Christine
2016-01-01
The purpose of the present study was to understand the reciprocal, bidirectional longitudinal relation between joint book reading and English receptive vocabulary. To address the research goals, a nationally representative sample of Head Start children, the Head Start Family and Child Experiences Survey (2003 cohort), was used for analysis. The…
Impact of a spring defoliator on common oak
Victor V. Rubtsov; Irina A. Utkina
1991-01-01
We have investigated the population dynamics of some common phyllophagous insects in oak stands of the forest-steppe zone and their impact on common oak (Quercus robur L). Considerable attention has also been paid to mathematical modeling of the studied processes. All field data represent samples taken from the Tellerman oak grove in the Voronezh...
User's guide to the SEPHIS computer code for calculating the Thorex solvent extraction system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, S.B.; Rainey, R.H.
1979-05-01
The SEPHIS computer program was developed to simulate the countercurrent solvent extraction process. The code has now been adapted to model the Acid Thorex flow sheet. This report represents a practical user's guide to SEPHIS - Thorex containing a program description, user information, program listing, and sample input and output.
Letter-Sound Reading: Teaching Preschool Children Print-to-Sound Processing
ERIC Educational Resources Information Center
Wolf, Gail Marie
2016-01-01
This intervention study investigated the growth of letter sound reading and growth of consonant-vowel-consonant (CVC) word decoding abilities for a representative sample of 41 US children in preschool settings. Specifically, the study evaluated the effectiveness of a 3-step letter-sound teaching intervention in teaching preschool children to…
USDA-ARS?s Scientific Manuscript database
Laser induced breakdown spectroscopy (LIBS) is used as the basis for discrimination between 2 genera of gram-negative bacteria and 2 genera of gram-positive bacteria representing pathogenic threats commonly found in poultry processing rinse waters. Because LIBS-based discrimination relies primarily ...
ERIC Educational Resources Information Center
Gee, Gilbert C.; Pavalko, Eliza K.; Long, J. Scott
2007-01-01
Self-reported discrimination is linked to diminished well-being, but the processes generating these reports remain poorly understood. Employing the life course perspective, this paper examines the correspondence between expected age preferences for workers and perceived age discrimination among a nationally representative sample of 7,225 working…
Marchetti, Igor; Shumake, Jason; Grahek, Ivan; Koster, Ernst H W
2018-08-01
Temperamental effortful control and attentional networks are increasingly viewed as important underlying processes in depression and anxiety. However, it is still unknown whether these factors facilitate depressive and anxiety symptoms in the general population and, more specifically, in remitted depressed individuals. We investigated to what extent effortful control and attentional networks (i.e., Attention Network Task) explain concurrent depressive and anxious symptoms in healthy individuals (n = 270) and remitted depressed individuals (n = 90). Both samples were highly representative of the US population. Increased effortful control predicted a substantial decrease in symptoms of both depression and anxiety in the whole sample, whereas decreased efficiency of executive attention predicted a modest increase in depressive symptoms. Remitted depressed individuals did not show less effortful control nor less efficient attentional networks than healthy individuals. Moreover, clinical status did not moderate the relationship between temperamental factors and either depressive or anxiety symptoms. Limitations include the cross-sectional nature of the study. Our study shows that temperamental effortful control represents an important transdiagnostic process for depressive and anxiety symptoms in adults. Copyright © 2018 Elsevier B.V. All rights reserved.
Scarano, Christian; Giacometti, Federica; Manfreda, Gerardo; Lucchi, Alex; Pes, Emanuela; Spanu, Carlo; De Santis, Enrico Pietro Luigi; Serraino, Andrea
2014-11-01
This study aimed to evaluate Arcobacter species contamination of industrial sheep ricotta cheese purchased at retail and to establish if the dairy plant environment may represent a source of contamination. A total of 32 sheep ricotta cheeses (1.5 kg/pack) packed in a modified atmosphere were purchased at retail, and 30 samples were collected in two sampling sessions performed in the cheese factory from surfaces in contact with food and from surfaces not in contact with food. Seven out of 32 samples (21.9%) of ricotta cheese collected at retail tested positive for Arcobacter butzleri at cultural examination; all positive samples were collected during the same sampling and belonged to the same batch. Ten surface samples (33.3%) collected in the dairy plant were positive for A. butzleri. Cluster analysis identified 32 pulsed-field gel electrophoresis (PFGE) patterns. The same PFGE pattern was isolated from more than one ricotta cheese sample, indicating a common source of contamination, while more PFGE patterns could be isolated in single samples, indicating different sources of contamination. The results of the environmental sampling showed that A. butzleri may be commonly isolated from the dairy processing plant investigated and may survive over time, as confirmed by the isolation of the same PFGE pattern in different industrial plant surface samples. Floor contamination may represent a source of A. butzleri spread to different areas of the dairy plant, as demonstrated by isolation of the same PFGE pattern in different production areas. Isolation of the same PFGE pattern from surface samples in the dairy plant and from ricotta cheese purchased at retail showed that plant surfaces may represent a source of A. butzleri postprocessing contamination in cheeses produced in industrial dairy plants. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Advanced core-analyses for subsurface characterization
NASA Astrophysics Data System (ADS)
Pini, R.
2017-12-01
The heterogeneity of geological formations varies over a wide range of length scales and represents a major challenge for predicting the movement of fluids in the subsurface. Although they are inherently limited in the accessible length-scale, laboratory measurements on reservoir core samples still represent the only way to make direct observations on key transport properties. Yet, properties derived on these samples are of limited use and should be regarded as sample-specific (or `pseudos'), if the presence of sub-core scale heterogeneities is not accounted for in data processing and interpretation. The advent of imaging technology has significantly reshaped the landscape of so-called Special Core Analysis (SCAL) by providing unprecedented insight on rock structure and processes down to the scale of a single pore throat (i.e. the scale at which all reservoir processes operate). Accordingly, improved laboratory workflows are needed that make use of such wealth of information by e.g., referring to the internal structure of the sample and in-situ observations, to obtain accurate parameterisation of both rock- and flow-properties that can be used to populate numerical models. We report here on the development of such workflow for the study of solute mixing and dispersion during single- and multi-phase flows in heterogeneous porous systems through a unique combination of two complementary imaging techniques, namely X-ray Computed Tomography (CT) and Positron Emission Tomography (PET). The experimental protocol is applied to both synthetic and natural porous media, and it integrates (i) macroscopic observations (tracer effluent curves), (ii) sub-core scale parameterisation of rock heterogeneities (e.g., porosity, permeability and capillary pressure), and direct 3D observation of (iii) fluid saturation distribution and (iv) the dynamic spreading of the solute plumes. Suitable mathematical models are applied to reproduce experimental observations, including both 1D and 3D numerical schemes populated with the parameterisation above. While it validates the core-flooding experiments themselves, the calibrated mathematical model represents a key element for extending them to conditions prevalent in the subsurface, which would be otherwise not attainable in the laboratory.
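As one concrete example of the kind of 1D scheme mentioned above, a continuous-injection tracer effluent curve can be modeled with the Ogata-Banks solution of the advection-dispersion equation. The sketch below is a generic illustration; the core length, pore velocity and dispersion coefficient are hypothetical and not taken from the study.

# Ogata-Banks solution for relative concentration at the core outlet.
import numpy as np
from scipy.special import erfc

def effluent_curve(t, L, v, D):
    """C/C0 at distance L for pore velocity v and dispersion coefficient D."""
    t = np.asarray(t, dtype=float)
    a = (L - v * t) / (2.0 * np.sqrt(D * t))
    b = (L + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * (erfc(a) + np.exp(v * L / D) * erfc(b))

t = np.linspace(0.01, 4.0, 200) * 3600.0          # seconds
c = effluent_curve(t, L=0.1, v=1e-5, D=1e-7)      # 10 cm core, hypothetical v and D
print(f"C/C0 = 0.5 reached near t = {t[np.argmin(abs(c - 0.5))]/3600:.2f} h")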
Fong, Sophia Yui Kau; Poulsen, Jessie; Brandl, Martin; Bauer-Brandl, Annette
2017-01-01
A novel microdialysis-dissolution/permeation (M-D/P) system was developed for the biopharmaceutical assessment of oral drug formulations. This system consists of a side-by-side diffusion chamber, a microdialysis unit fixed within the dissolution chamber for continuous sampling, and a biomimetic Permeapad® as the intestinal barrier. In the M-D/P system, the concentration of the molecularly dissolved drug (MWCO < 20 kDa) was measured over time in the dissolution compartment (representing the gastrointestinal tract) while the concentration of the permeated drug was measured in the acceptor compartment (representing the blood). The kinetics of both the dissolution process and the permeation process were simultaneously quantified under circumstances that mimic physiological conditions. For the current proof-of-concept study, hydrocortisone (HCS) in the form of slowly dissolving solvate crystals, and buffer and the biorelevant fasted state simulated intestinal fluid (FaSSIF), were employed as the model drug and dissolution media, respectively. The applicability of the M-D/P system to dissolution and permeation profiling of HCS in buffer and in FaSSIF has been successfully demonstrated. Compared to the conventional direct sampling method (using filters of 0.1-0.45 μm), sampling by the M-D/P system exhibited distinct advantages, including (1) minimal disturbance of the permeation process, (2) differentiation of "molecularly" dissolved drug from "apparently" dissolved drug during dissolution of HCS in FaSSIF, and (3) being less laborious with better sampling temporal resolution. The M-D/P system appears to be a promising, simple and routine tool that allows researchers to better understand the interplay of dissolution and permeation, thus supporting better oral formulation screening and, ultimately, better dosage form assessment. Copyright © 2016. Published by Elsevier B.V.
Decision making and sequential sampling from memory
Shadlen, Michael N.; Shohamy, Daphna
2016-01-01
Decisions take time, and as a rule more difficult decisions take more time. But this only raises the question of what consumes the time. For decisions informed by a sequence of samples of evidence, the answer is straightforward: more samples are available with more time. Indeed the speed and accuracy of such decisions are explained by the accumulation of evidence to a threshold or bound. However, the same framework seems to apply to decisions that are not obviously informed by sequences of evidence samples. Here we proffer the hypothesis that the sequential character of such tasks involves retrieval of evidence from memory. We explore this hypothesis by focusing on value-based decisions and argue that mnemonic processes can account for regularities in choice and decision time. We speculate on the neural mechanisms that link sampling of evidence from memory to circuits that represent the accumulated evidence bearing on a choice. We propose that memory processes may contribute to a wider class of decisions that conform to the regularities of choice-reaction time predicted by the sequential sampling framework. PMID:27253447
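The accumulation-to-bound account summarized in this abstract is commonly illustrated with a drift-diffusion simulation. The sketch below is such a generic illustration with arbitrary parameter values; it is not a model fitted or reported by the authors.

# Bounded evidence accumulation: choice and decision time from noisy samples.
import numpy as np

def run_trial(drift=0.3, noise=1.0, bound=3.0, dt=0.01, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < bound:                       # accumulate evidence to a threshold
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return (x > 0), t                           # choice (correct if positive) and time

rng = np.random.default_rng(1)
trials = [run_trial(rng=rng) for _ in range(500)]
acc = np.mean([c for c, _ in trials])
rt = np.mean([t for _, t in trials])
print(f"accuracy ~ {acc:.2f}, mean decision time ~ {rt:.2f} (arbitrary units)")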
Mendoza, Lucía M; Neef, Alexander; Vignolo, Graciela; Belloch, Carmela
2017-10-01
Diversity and dynamics of yeasts associated with the fermentation of Argentinian maize-based beverage chicha was investigated. Samples taken at different stages from two chicha productions were analyzed by culture-dependent and culture-independent methods. Five hundred and ninety six yeasts were isolated by classical microbiological methods and 16 species identified by RFLPs and sequencing of D1/D2 26S rRNA gene. Genetic typing of isolates from the dominant species, Saccharomyces cerevisiae, by PCR of delta elements revealed up to 42 different patterns. High-throughput sequencing (HTS) of D1/D2 26S rRNA gene amplicons from chicha samples detected more than one hundred yeast species and almost fifty filamentous fungi taxa. Analysis of the data revealed that yeasts dominated the fermentation, although, a significant percentage of filamentous fungi appeared in the first step of the process. Statistical analysis of results showed that very few taxa were represented by more than 1% of the reads per sample at any step of the process. S. cerevisiae represented more than 90% of the reads in the fermentative samples. Other yeast species dominated the pre-fermentative steps and abounded in fermented samples when S. cerevisiae was in percentages below 90%. Most yeasts species detected by pyrosequencing were not recovered by cultivation. In contrast, the cultivation-based methodology detected very few yeast taxa, and most of them corresponded with very few reads in the pyrosequencing analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
Akhtar, Naveed; Mian, Ajmal
2017-10-03
We present a principled approach to learn a discriminative dictionary along with a linear classifier for hyperspectral classification. Our approach places Gaussian process priors over the dictionary to account for the relative smoothness of natural spectra, whereas the classifier parameters are sampled from multivariate Gaussians. We employ two Beta-Bernoulli processes to jointly infer the dictionary and the classifier. These processes are coupled under the same sets of Bernoulli distributions. In our approach, these distributions signify the frequency of dictionary atom usage in representing class-specific training spectra, which also makes the dictionary discriminative. Due to the coupling between the dictionary and the classifier, the popularity of the atoms for representing different classes gets encoded into the classifier. This helps in predicting the class labels of test spectra that are first represented over the dictionary by solving a simultaneous sparse optimization problem. The labels of the spectra are predicted by feeding the resulting representations to the classifier. Our approach exploits the nonparametric Bayesian framework to automatically infer the dictionary size, the key parameter in discriminative dictionary learning. Moreover, it also has the desirable property of adaptively learning the association between the dictionary atoms and the class labels by itself. We use Gibbs sampling to infer the posterior probability distributions over the dictionary and the classifier under the proposed model, for which we derive analytical expressions. To establish the effectiveness of our approach, we test it on benchmark hyperspectral images. The classification performance is compared with the state-of-the-art dictionary learning-based classification methods.
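As a rough, decoupled baseline for the general pipeline described (sparse codes over a learned dictionary feeding a linear classifier), the sketch below uses off-the-shelf dictionary learning and logistic regression on synthetic spectra. It does not implement the coupled Beta-Bernoulli model, the Gibbs sampler, or the automatic dictionary-size inference of the paper.

# Simplified dictionary-then-classifier baseline on synthetic "spectra".
import numpy as np
from sklearn.decomposition import DictionaryLearning
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_bands, n_per_class = 50, 100
proto = rng.normal(size=(3, n_bands))                       # three synthetic class prototypes
X = np.vstack([p + 0.3 * rng.normal(size=(n_per_class, n_bands)) for p in proto])
y = np.repeat(np.arange(3), n_per_class)

dico = DictionaryLearning(n_components=20, transform_algorithm="omp",
                          transform_n_nonzero_coefs=5, random_state=0)
codes = dico.fit_transform(X)                               # sparse codes over the atoms
clf = LogisticRegression(max_iter=1000).fit(codes, y)
print("training accuracy:", clf.score(codes, y))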
Ultra-accelerated natural sunlight exposure testing
Jorgensen, Gary J.; Bingham, Carl; Goggin, Rita; Lewandowski, Allan A.; Netter, Judy C.
2000-06-13
Process and apparatus for providing ultra-accelerated natural sunlight exposure testing of samples under controlled weathering, without introducing unrealistic failure mechanisms in exposed materials and without breaking reciprocity relationships between flux exposure levels and cumulative dose, that includes multiple concurrent levels of temperature and relative humidity at high levels of natural sunlight, comprising: a) concentrating solar flux uniformly; b) directing the controlled uniform sunlight onto sample materials in a chamber enclosing multiple concurrent levels of temperature and relative humidity to allow the sample materials to be subjected to accelerated irradiance exposure factors for a sufficient period of time, in days, to provide the equivalent of at least about a year's worth of representative weathering of the sample materials.
Results from tests of TFL Hydragard sampling loop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steimke, J.L.
When the Defense Waste Processing Facility (DWPF) is operational, processed radioactive sludge will be transferred in batches to the Slurry Mix Evaporator (SME), where glass frit will be added and the contents concentrated by boiling. Batches of the slurry mixture are transferred from the SME to the Melter Feed Tank (MFT). Hydragard® sampling systems are used on the SME and the MFT for collecting slurry samples in vials for chemical analysis. An accurate replica of the Hydragard sampling system was built and tested in the Thermal Fluids Laboratory (TFL) to determine the Hydragard accuracy. It was determined that the original Hydragard valve frequently drew a non-representative sample stream through the sample vial that ranged from frit-enriched to frit-depleted. The Hydragard valve was modified by moving the plunger and its seat backwards so that the outer surface of the plunger was flush with the inside diameter of the transfer line when the valve was open. The slurry flowing through the vial then accurately represented the composition of the slurry in the reservoir for two types of slurries, different dilution factors, a range of transfer flows and a range of vial flows. It was then found that the 15 ml of slurry left in the vial when the Hydragard valve was closed, which is what will be analyzed at DWPF, had a lower ratio of frit to sludge, as characterized by the lithium to iron ratio, than the slurry flowing through it. The reason for these differences is not understood at this time, but it is recommended that additional experimentation be performed with the TFL Hydragard loop to determine the cause.
Lu, David; Graf, Ryon P.; Harvey, Melissa; Madan, Ravi A.; Heery, Christopher; Marte, Jennifer; Beasley, Sharon; Tsang, Kwong Y.; Krupa, Rachel; Louw, Jessica; Wahl, Justin; Bales, Natalee; Landers, Mark; Marrinucci, Dena; Schlom, Jeffrey; Gulley, James L.; Dittamore, Ryan
2015-01-01
Retrospective analysis of patient tumour samples is a cornerstone of clinical research. CTC biomarker characterization offers a non-invasive method to analyse patient samples. However, current CTC technologies require prospective blood collection, thereby reducing the ability to utilize archived clinical cohorts with long-term outcome data. We sought to investigate CTC recovery from frozen, archived patient PBMC pellets. Matched samples from both mCRPC patients and mock samples, which were prepared by spiking healthy donor blood with cultured prostate cancer cell line cells, were processed "fresh" via the Epic CTC Platform or from "frozen" PBMC pellets. Samples were analysed for CTC enumeration and biomarker characterization via immunofluorescent (IF) biomarkers, fluorescence in-situ hybridization (FISH) and CTC morphology. In the frozen patient PBMC samples, the median CTC recovery was 18%, compared to the freshly processed blood. However, abundance and localization of cytokeratin (CK) and androgen receptor (AR) protein, as measured by IF, were largely concordant between the fresh and frozen CTCs. Furthermore, a FISH analysis of PTEN loss showed high concordance in fresh vs. frozen samples. The observed data indicate that CTC biomarker characterization from frozen archival samples is feasible and representative of prospectively collected samples. PMID:28936240
NASA Astrophysics Data System (ADS)
Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke
2017-08-01
In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2, respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be below 4.3% and 2.1%, respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.
Staphylococcus aureus Entrance into the Dairy Chain: Tracking S. aureus from Dairy Cow to Cheese
Kümmel, Judith; Stessl, Beatrix; Gonano, Monika; Walcher, Georg; Bereuter, Othmar; Fricker, Martina; Grunert, Tom; Wagner, Martin; Ehling-Schulz, Monika
2016-01-01
Staphylococcus aureus is one of the most important contagious mastitis pathogens in dairy cattle. Due to its zoonotic potential, control of S. aureus is not only of great economic importance in the dairy industry but also a significant public health concern. The aim of this study was to decipher the potential of bovine udder associated S. aureus as a reservoir for S. aureus contamination in dairy production and processing. From 18 farms delivering their milk to an alpine dairy plant for the production of smeared semi-hard and hard cheese, 1176 quarter milk (QM) samples of all cows in lactation (n = 294) and representative samples from bulk tank milk (BTM) of all farms were surveyed for coagulase positive (CPS) and coagulase negative Staphylococci (CNS). Furthermore, samples from different steps of the cheese manufacturing process were tested for CPS and CNS. As revealed by chemometric-assisted FTIR spectroscopy and molecular subtyping (spa typing and multi locus sequence typing), dairy cattle represent indeed an important, yet underreported, entrance point of S. aureus into the dairy chain. Our data clearly show that certain S. aureus subtypes are present in primary production as well as in the cheese processing at the dairy plant. However, although a considerable diversity of S. aureus subtypes was observed in QM and BTM at the farms, only certain S. aureus subtypes were able to enter and persist in the cheese manufacturing at the dairy plant and could be isolated from cheese until day 14 of ripening. Farm strains belonging to the FTIR clusters B1 and B3, which show genetic characteristics (t2953, ST8, enterotoxin profile: sea/sed/sej) of the recently described S. aureus genotype B, most successfully contaminated the cheese production at the dairy plant. Thus, our study fosters the hypothesis that genotype B S. aureus represents a specific challenge in control of S. aureus in the dairy chain that requires effective clearance strategies and hygienic measures already in primary production to avoid a potential transfer of enterotoxic strains or enterotoxins into the dairy processing and the final retail product. PMID:27790200
Progressive sample processing of band selection for hyperspectral imagery
NASA Astrophysics Data System (ADS)
Liu, Keng-Hao; Chien, Hung-Chang; Chen, Shih-Yu
2017-10-01
Band selection (BS) is one of the most important topics in hyperspectral image (HSI) processing. The objective of BS is to find a set of representative bands that can represent the whole image with lower inter-band redundancy. Many types of BS algorithms have been proposed in the past. However, most of them can only be carried out in an off-line manner, meaning they can only be implemented on pre-collected data. Such off-line methods are of little use for time-critical applications, particularly disaster prevention and target detection. To tackle this issue, a new concept, called progressive sample processing (PSP), was proposed recently. PSP is an "on-line" framework in which a specific type of algorithm can process the currently collected data during data transmission under the band-interleaved-by-sample/pixel (BIS/BIP) protocol. This paper proposes an on-line BS method that integrates sparse-based BS into the PSP framework, called PSP-BS. In PSP-BS, BS can be carried out by updating the BS result recursively pixel by pixel, in the same way that a Kalman filter updates data information in a recursive fashion. The sparse regression is solved by the orthogonal matching pursuit (OMP) algorithm, and the recursive equations of PSP-BS are derived using matrix decomposition. Experiments conducted on a real hyperspectral image show that PSP-BS can progressively output the BS status with very low computing time. Convergence of the BS results during transmission can be quickly achieved by using a rearranged pixel transmission sequence. This significant advantage allows BS to be implemented in a real-time manner when the HSI data are transmitted pixel by pixel.
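A minimal offline illustration of sparse, OMP-style band selection is sketched below; it shows only a greedy column-selection criterion on synthetic data and not the recursive, pixel-by-pixel PSP-BS update derived in the paper.

# Greedy band selection: pick bands whose span best reconstructs the data matrix.
import numpy as np

def greedy_band_selection(X, n_select):
    """X: (pixels x bands). Greedily select bands minimizing reconstruction residual."""
    residual = X.copy()
    selected = []
    for _ in range(n_select):
        scores = np.linalg.norm(X.T @ residual, axis=1)   # each band's correlation with residual
        scores[selected] = -np.inf                         # do not reselect a band
        j = int(np.argmax(scores))
        selected.append(j)
        B = X[:, selected]                                 # least-squares fit on selected bands
        coef, *_ = np.linalg.lstsq(B, X, rcond=None)
        residual = X - B @ coef
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 60))   # synthetic low-rank "HSI" matrix
X += 0.01 * rng.normal(size=X.shape)
print("selected bands:", greedy_band_selection(X, 4))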
Gumus, Abdurrahman; Ahsan, Syed; Dogan, Belgin; Jiang, Li; Snodgrass, Ryan; Gardner, Andrea; Lu, Zhengda; Simpson, Kenneth; Erickson, David
2016-01-01
The use of point-of-care (POC) devices in limited-resource settings, where access to commonly used infrastructure such as water and electricity can be restricted, represents simultaneously one of the best application fits for POC systems as well as one of the most challenging places to deploy them. Of the many challenges involved in these systems, the preparation and processing of complex samples like stool, vomit, and biopsies are particularly difficult due to the high number and varied nature of mechanical and chemical interferents present in the sample. Previously we have demonstrated the ability to use solar-thermal energy to perform PCR-based nucleic acid amplifications. In this work we demonstrate how the technique, using similar infrastructure, can also be used to perform solar-thermal sample processing for extracting and isolating Vibrio cholerae nucleic acids from fecal samples. The use of opto-thermal energy enables the use of sunlight to drive thermal lysing reactions in large volumes without the need for external electrical power. Using the system, we demonstrate the ability to reach a 95°C threshold in less than 5 minutes and maintain a stable sample temperature of ±2°C following the ramp-up. The system is demonstrated to provide linear results between 10^4 and 10^8 CFU/mL when the released nucleic acids were quantified via traditional means. Additionally, we couple the sample processing unit with our previously demonstrated solar-thermal PCR and tablet-based detection system to demonstrate very low power sample-in-answer-out detection. PMID:27231636
Li, Der-Chiang; Hu, Susan C; Lin, Liang-Sian; Yeh, Chun-Wu
2017-01-01
It is difficult for learning models to achieve high classification performance with imbalanced data sets, because when one of the classes is much larger than the others, most machine learning and data mining classifiers are overly influenced by the larger classes and ignore the smaller ones. As a result, the classification algorithms often have poor learning performance due to slow convergence in the smaller classes. To balance such data sets, this paper presents a strategy that involves reducing the size of the majority data and generating synthetic samples for the minority data. In the reducing operation, we use the box-and-whisker plot approach to exclude outliers and the Mega-Trend-Diffusion method to find representative data from the majority data. To generate the synthetic samples, we propose a counterintuitive hypothesis to find the distributed shape of the minority data, and then produce samples according to this distribution. Four real datasets were used to examine the performance of the proposed approach. We used paired t-tests to compare the Accuracy, G-mean, and F-measure scores of the proposed data pre-processing (PPDP) method merged into the D3C method (PPDP+D3C) with those of one-sided selection (OSS), the well-known SMOTEBoost (SB) method, the normal distribution-based oversampling (NDO) approach, and the proposed PPDP method alone. The results indicate that the classification performance of the proposed approach is better than that of the above-mentioned methods.
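The reduce-majority / synthesize-minority strategy can be sketched in a few lines. The snippet below uses a plain 1.5*IQR whisker rule and a per-feature normal fit on synthetic data; it stands in for, but does not reproduce, the Mega-Trend-Diffusion step and the paper's specific minority-shape hypothesis.

# Balance an imbalanced two-class dataset: trim majority outliers, synthesize minority points.
import numpy as np

rng = np.random.default_rng(0)
majority = rng.normal(0.0, 1.0, size=(500, 2))
minority = rng.normal(3.0, 0.5, size=(25, 2))

# 1) reduce the majority: drop rows outside the 1.5*IQR whiskers in any feature
q1, q3 = np.percentile(majority, [25, 75], axis=0)
iqr = q3 - q1
keep = np.all((majority >= q1 - 1.5 * iqr) & (majority <= q3 + 1.5 * iqr), axis=1)
majority_reduced = majority[keep]

# 2) oversample the minority: fit a per-feature normal and draw synthetic points
mu, sd = minority.mean(axis=0), minority.std(axis=0, ddof=1)
n_new = len(majority_reduced) - len(minority)
synthetic = rng.normal(mu, sd, size=(n_new, 2))
minority_balanced = np.vstack([minority, synthetic])

print(len(majority), "->", len(majority_reduced), "majority;",
      len(minority), "->", len(minority_balanced), "minority")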
Allman, Elizabeth S; Degnan, James H; Rhodes, John A
2011-06-01
Gene trees are evolutionary trees representing the ancestry of genes sampled from multiple populations. Species trees represent populations of individuals, each with many genes, splitting into new populations or species. The coalescent process, which models ancestry of gene copies within populations, is often used to model the probability distribution of gene trees given a fixed species tree. This multispecies coalescent model provides a framework for phylogeneticists to infer species trees from gene trees using maximum likelihood or Bayesian approaches. Because the coalescent models a branching process over time, all trees are typically assumed to be rooted in this setting. Often, however, gene trees inferred by traditional phylogenetic methods are unrooted. We investigate probabilities of unrooted gene trees under the multispecies coalescent model. We show that when there are four species with one gene sampled per species, the distribution of unrooted gene tree topologies identifies the unrooted species tree topology and some, but not all, information in the species tree edges (branch lengths). The location of the root on the species tree is not identifiable in this situation. However, for 5 or more species with one gene sampled per species, we show that the distribution of unrooted gene tree topologies identifies the rooted species tree topology and all its internal branch lengths. The length of any pendant branch leading to a leaf of the species tree is also identifiable for any species from which more than one gene is sampled.
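For orientation, the analogous and widely known result for rooted three-taxon gene trees (not the unrooted four- and five-taxon cases analyzed in this paper) can be written down directly: an internal species-tree branch of length t, in coalescent units, gives gene-tree topology probabilities P(matching) = 1 - (2/3)exp(-t) and P(each alternative) = (1/3)exp(-t).

# Rooted-triple gene tree topology probabilities under the multispecies coalescent.
import math

def rooted_triple_probs(t):
    p_alt = math.exp(-t) / 3.0
    return 1.0 - 2.0 * p_alt, p_alt, p_alt

for t in (0.1, 1.0, 3.0):
    match, alt, _ = rooted_triple_probs(t)
    print(f"t = {t}: P(match) = {match:.3f}, P(each alternative) = {alt:.3f}")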
IndeCut evaluates performance of network motif discovery algorithms.
Ansariola, Mitra; Megraw, Molly; Koslicki, David
2018-05-01
Genomic networks represent a complex map of molecular interactions which are descriptive of the biological processes occurring in living cells. Identifying the small over-represented circuitry patterns in these networks helps generate hypotheses about the functional basis of such complex processes. Network motif discovery is a systematic way of achieving this goal. However, a reliable network motif discovery outcome requires generating random background networks which are the result of a uniform and independent graph sampling method. To date, there has been no method to numerically evaluate whether any network motif discovery algorithm performs as intended on realistically sized datasets-thus it was not possible to assess the validity of resulting network motifs. In this work, we present IndeCut, the first method to date that characterizes network motif finding algorithm performance in terms of uniform sampling on realistically sized networks. We demonstrate that it is critical to use IndeCut prior to running any network motif finder for two reasons. First, IndeCut indicates the number of samples needed for a tool to produce an outcome that is both reproducible and accurate. Second, IndeCut allows users to choose the tool that generates samples in the most independent fashion for their network of interest among many available options. The open source software package is available at https://github.com/megrawlab/IndeCut. megrawm@science.oregonstate.edu or david.koslicki@math.oregonstate.edu. Supplementary data are available at Bioinformatics online.
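Background networks for motif discovery are most often generated by degree-preserving edge swaps; the sketch below shows that common sampler (using networkx's built-in swap routine) on a synthetic graph. This is exactly the kind of procedure whose uniformity and independence IndeCut is designed to evaluate; the snippet itself performs no such evaluation.

# Degree-preserving randomization of a network via double edge swaps.
import networkx as nx

G = nx.gnm_random_graph(50, 150, seed=1)            # stand-in for a genomic network
backgrounds = []
for i in range(10):
    R = G.copy()
    nx.double_edge_swap(R, nswap=10 * R.number_of_edges(), max_tries=10**5, seed=i)
    backgrounds.append(R)

# degrees are preserved while the wiring is randomized
assert sorted(dict(G.degree()).values()) == sorted(dict(backgrounds[0].degree()).values())
print(len(backgrounds), "randomized background networks generated")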
Unified Model for the Overall Efficiency of Inlets Sampling from Horizontal Aerosol Flows
NASA Astrophysics Data System (ADS)
Hangal, Sunil Pralhad
When sampling aerosols from ambient or industrial air environments, the sampled aerosol must be representative of the aerosol in the free stream. The changes that occur during sampling must be assessed quantitatively so that sampling errors can be compensated for. In this study, unified models have been developed for the overall efficiency of tubular sharp-edged inlets sampling from horizontal aerosol flows oriented at 0 to 90° relative to the wind direction in the vertical plane (pitch) and the horizontal plane (yaw). In the unified model, based on experimental data, the aspiration efficiency is represented by a single equation with different inertial parameters at 0 to 60° and 45 to 90°. The transmission efficiency is separated into two components: one due to gravitational settling in the boundary layer and the other due to impaction. The gravitational settling component is determined by extending a previously developed isoaxial sampling model to nonisoaxial sampling. The impaction component is determined by a new model that quantifies the particle losses caused by wall impaction. The model also quantifies the additional particle losses resulting from turbulent motion in the vena contracta, which is formed in the inlet when the inlet velocity is higher than the wind velocity. When sampling aerosols in ambient or industrial environments with an inlet, small changes in wind direction or physical constraints on positioning the inlet in the system necessitate assessment of the sampling efficiency in both the vertical and horizontal planes. The overall sampling efficiency of tubular inlets has been experimentally investigated in yaw and pitch orientations at 0 to 20° from horizontal aerosol flows using a wind tunnel facility. The model for overall sampling efficiency has been extended to include both yaw and pitch sampling based on the new data. In this model, the difference between yaw and pitch is expressed by the effect of gravity on the impaction process inside the inlet, described by a newly developed gravity effect angle. At yaw, the gravity effect angle on the wall impaction process does not change with sampling angle. At pitch, the gravity effect on the impaction process results in an increase in particle losses for upward sampling and a decrease for downward sampling. Using the unified model, graphical representations have been developed for sampling at small angles. These can be used in the field to determine the overall sampling efficiency of inlets at several operating conditions and to identify the operating conditions that result in an acceptable sampling error. Pitch and diameter factors have been introduced for relating the efficiency values over a wide range of conditions to those of a reference condition. The pitch factor determines the overall sampling efficiency at pitch from yaw values, and the diameter factor determines the overall sampling efficiency at different inlet diameters.
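As a point of reference only, the classic Belyaev-Levin correlation for isoaxial aspiration efficiency shows how a velocity ratio and a Stokes number typically enter such models. It is not the unified yaw/pitch model developed in this thesis, and the numbers below are hypothetical.

# Belyaev-Levin isoaxial aspiration efficiency for a thin-walled tubular inlet.
import math

def stokes_number(dp_um, rho_p, U0, d_inlet, mu=1.81e-5):
    """Stk = tau * U0 / d_inlet, with relaxation time tau = rho_p * dp^2 / (18 mu)."""
    dp = dp_um * 1e-6
    tau = rho_p * dp**2 / (18.0 * mu)
    return tau * U0 / d_inlet

def aspiration_efficiency(stk, R):
    """A = 1 + (R - 1)*(1 - 1/(1 + k*Stk)), k = 2 + 0.617/R, R = wind/inlet velocity ratio."""
    k = 2.0 + 0.617 / R
    return 1.0 + (R - 1.0) * (1.0 - 1.0 / (1.0 + k * stk))

U0, U, d = 5.0, 2.5, 0.01                  # wind speed, inlet velocity (m/s), inlet diameter (m)
stk = stokes_number(10.0, 1000.0, U0, d)   # 10 micrometre unit-density particle
print(f"Stk = {stk:.3f}, A = {aspiration_efficiency(stk, U0 / U):.2f}")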
Weiss, Agnes; Jérôme, Valérie; Freitag, Ruth
2007-06-15
The goal of the project was the extraction of PCR-compatible genomic DNA representative of the entire microbial community from municipal biogas plant samples (mash, bioreactor content, process water, liquid fertilizer). For the initial isolation of representative DNA from the respective lysates, methods were used that employed adsorption, extraction, or precipitation to specifically enrich the DNA. Since no dedicated method for biogas plant samples was available, preference was given to kits/methods suited to samples that resembled either the bioreactor feed, e.g. foodstuffs, or those intended for environmental samples including wastewater. None of the methods succeeded in preparing DNA that was directly PCR-compatible. Instead the DNA was found to still contain considerable amounts of difficult-to-remove enzyme inhibitors (presumably humic acids) that hindered the PCR reaction. Based on the isolation method that gave the highest yield/purity for all sample types, subsequent purification was attempted by agarose gel electrophoresis followed by electroelution, spermine precipitation, or dialysis through nitrocellulose membrane. A combination of phenol/chloroform extraction followed by purification via dialysis constituted the most efficient sample treatment. When such DNA preparations were diluted 1:100 they did no longer inhibit PCR reactions, while they still contained sufficient genomic DNA to allow specific amplification of specific target sequences.
Frequency of lucid dreaming in a representative German sample.
Schredl, Michael; Erlacher, Daniel
2011-02-01
Lucid dreams occur when a person is aware that he is dreaming while he is dreaming. In a representative sample of German adults (N = 919), 51% of the participants reported that they had experienced a lucid dream at least once. Lucid dream recall was significantly higher in women and negatively correlated with age. However, these effects might be explained by the frequency of dream recall, as there was a correlation of .57 between frequency of dream recall and frequency of lucid dreams. Other sociodemographic variables like education, marital status, or monthly income were not related to lucid dream frequency. Given the relatively high prevalence of lucid dreaming reported in the present study, research on lucid dreams might be pursued in the sleep laboratory to expand the knowledge about sleep, dreaming, and consciousness processes in general.
Filling in the Gaps: Xenoliths in Meteorites are Samples of "Missing" Asteroid Lithologies
NASA Technical Reports Server (NTRS)
Zolensky, Mike
2016-01-01
We know that the stones that fall to earth as meteorites are not representative of the full diversity of small solar system bodies, because of the peculiarities of the dynamical processes that send material into Earth-crossing paths [1] which result in severe selection biases. Thus, the bulk of the meteorites that fall are insufficient to understand the full range of early solar system processes. However, the situation is different for pebble- and smaller-sized objects that stream past the giant planets and asteroid belts into the inner solar system in a representative manner. Thus, micrometeorites and interplanetary dust particles have been exploited to permit study of objects that do not provide meteorites to earth. However, there is another population of materials that sample a larger range of small solar system bodies, but which have received little attention - pebble-sized foreign clasts in meteorites (also called xenoliths, dark inclusions, clasts, etc.). Unfortunately, most previous studies of these clasts have been misleading, in that these objects have simply been identified as pieces of CM or CI chondrites. In our work we have found this to be generally erroneous, and that CM and especially CI clasts are actually rather rare. We therefore test the hypothesis that these clasts sample the full range of small solar system bodies. We have located and obtained samples of clasts in 81 different meteorites, and have begun a thorough characterization of the bulk compositions, mineralogies, petrographies, and organic compositions of this unique sample set. In addition to the standard e-beam analyses, recent advances in technology now permit us to measure bulk O isotopic compositions, and major- though trace-element compositions of the sub-mm-sized discrete clasts. Detailed characterization of these clasts permit us to explore the full range of mineralogical and petrologic processes in the early solar system, including the nature of fluids in the Kuiper belt and the outer main asteroid belt, as revealed by the mineralogy of secondary phases.
Neuromimetic Sound Representation for Percept Detection and Manipulation
NASA Astrophysics Data System (ADS)
Zotkin, Dmitry N.; Chi, Taishih; Shamma, Shihab A.; Duraiswami, Ramani
2005-12-01
The acoustic wave received at the ears is processed by the human auditory system to separate different sounds along the intensity, pitch, and timbre dimensions. Conventional Fourier-based signal processing, while endowed with fast algorithms, is unable to easily represent a signal along these attributes. In this paper, we discuss the creation of maximally separable sounds in auditory user interfaces and use a recently proposed cortical sound representation, which performs a biomimetic decomposition of an acoustic signal, to represent and manipulate sound for this purpose. We briefly overview algorithms for obtaining, manipulating, and inverting a cortical representation of a sound and describe algorithms for manipulating signal pitch and timbre separately. The algorithms are also used to create the sound of an instrument between a "guitar" and a "trumpet." Excellent sound quality can be achieved if processing time is not a concern, and intelligible signals can be reconstructed in reasonable processing time (about ten seconds of computational time for a one-second signal). Work on bringing the algorithms into the real-time processing domain is ongoing.
Handling Heavenly Jewels - 35 Years of Antarctic Meteorite Processing at Johnson Space Center
NASA Technical Reports Server (NTRS)
Satterwhite, C. E.; McBridge, K. M.; Harrington, R.; Schwarz, C. M.
2011-01-01
The ANSMET program began in 1976, and since that time more than 18,000 meteorites have been processed in the Meteorite Processing Lab at Johnson Space Center in Houston, TX [1]. The meteorites are collected and returned to JSC on a freezer truck and remain frozen until they are initially processed. Initial Processing of Meteorites: Initial processing involves drying the meteorites in a nitrogen glove box for 24 to 48 hours, photographing, measuring, weighing and writing a description of the interior and exterior. The meteorite is broken and a representative sample is sent to the Smithsonian Institution for classification. Newsletter & Requests: Once initial processing is complete and the meteorites have been classified, the information is published in the Antarctic Meteorite Newsletter [2,3]. The newsletter is published twice yearly, is sent electronically to researchers around the world, and is also available online. Researchers are asked to fill out a request form and submit it to the Meteorite Working Group secretary. All sample requests are reviewed by either the meteorite curator or the Meteorite Working Group, depending on the type of meteorite and the research being conducted. Processing for Sample Requests: In the meteorite processing lab, meteorite samples are prepared several different ways. Most samples are prepared as chips obtained by use of stainless steel chisels in a chipping bowl or a rock splitter. In special situations where a researcher needs a slab, the meteorite samples can be bandsawed in a dry nitrogen glove box with a diamond blade; no liquids are ever introduced into the cabinet. The last type of sample preparation is thin/thick sections. The meteorite thin section lab at JSC can prepare standard 30-micron thin sections, thick sections of variable thickness (100 to 200 microns), or demountable sections using superglue. Information for researchers: It is important that researchers fill out the sample request form completely, in order to make sure the meteorite is processed correctly [4]. Researchers should list any special requirements on the form, i.e. packaging of samples (poly vs. stainless), thick sections and thickness needed, superglue needed, interior chips, exterior chips, fusion crust, contamination issues; all concerns should be listed so processing can be done accurately and any concerns the researcher has can be addressed before the meteorites are broken.
From in situ coal to the final coal product: A case study of the Danville Coal Member (Indiana)
Mastalerz, Maria; Padgett, P.L.
1999-01-01
A surface coal mine operation and preparation plant in southwestern Indiana was sampled to examine variations in coal quality and coal petrography parameters for the Danville Coal Member of the Dugger Formation (Pennsylvanian-Desmoinesian, Westphalian D). Representative samples from in situ coal, preparation plant feeds, and a final coal product were collected in order to compare coal quality, coal petrography, trace element concentrations, and ash chemistry of the coal to those of the product. Coal quality parameters of the in situ samples and various feeds, coarse refuse, and final product were variable. The quality of the final coal product was best predicted by the coal quality of the clean coal feed (from the middle portions of the seam). Some trace element contents, especially lead and arsenic, varied between the coal feeds and the product. Lead contents increased in the feeds and product compared to the channel sample of the raw coal, possibly due to contamination in the handling process.
NASA Technical Reports Server (NTRS)
Haggerty, James J.
1986-01-01
The major programs that generate new technology and therefore expand the bank of knowledge available for future transfer are outlined. The focal point of this volume is a representative sampling of spinoff products and processes that resulted from technology utilization, or secondary application. The various mechanisms NASA employs to stimulate technology utilization are described, and contact sources for further information are listed in an appendix.
ERIC Educational Resources Information Center
Georgiades, Katholiki; Boyle, Michael H.; Duku, Eric
2007-01-01
Data from a nationally representative sample of 13,470 children aged 4-11 years were used to study contextual influences on children's mental health and school performance, the moderating effects of family immigrant status and underlying family processes that might explain these relationships. Despite greater socioeconomic disadvantage, children…
Quantifying Landscape Spatial Pattern: What Is the State of the Art?
Eric J. Gustafson
1998-01-01
Landscape ecology is based on the premise that there are strong links between ecological pattern and ecological function and process. Ecological systems are spatially heterogeneous, exhibiting considerable complexity and variability in time and space. This variability is typically represented by categorical maps or by a collection of samples taken at specific spatial...
ERIC Educational Resources Information Center
Gatti, Mario; Mereu, Maria Grazia; Tagliaferro, Claudio; Markowitsch, Jorg; Neuberger, Robert
Requirements for vocational skills in the engineering industry in Modena, Italy, and Vienna, Austria, were studied. In Modena, employees of a representative sample of 90 small, medium, and large firms in the mechanical processing, agricultural machinery, and sports car manufacturing sectors were interviewed. In Vienna, data were collected through…
Analysis of the Integration of Skill Standards into Community College Curriculum
ERIC Educational Resources Information Center
Aragon, Steven R.; Woo, Hui-Jeong; Marvel, Matthew R.
2005-01-01
The utilization of skill standards in the curriculum development process has become an increasingly prominent aspect of the reform movement in career and technical education (CTE) over the past 10 years. Data were collected across 10 CTE program areas from a nationally representative sample of community colleges. The authors discuss the extent to…
NASA Astrophysics Data System (ADS)
Belov, M. Ye.; Shayko-Shaykovskiy, O. G.; Makhrova, Ye. G.; Kramar, V. M.; Oleksuik, I. S.
2018-01-01
We present the theoretical justification, block diagram, and an experimental prototype of a new automated complex, "Thermodyn", for remote contactless diagnostics of inflammatory processes on the surface and in subcutaneous areas of the human body. We also describe the measurement methods, the results of diagnostic measurements, and the results of practical applications of this complex.
Interests as a Component of Adult Course Preferences: Four Australian Case Studies
ERIC Educational Resources Information Center
Athanasou, James A.
2013-01-01
The purpose of this paper is to examine the subliminal role of interest in preferences for 50 courses available at a community college. This is an idiographic study of educational decisions. It employed a sample of situations and a representative design. Four adults participated in an educational-vocational assessment and in the process of…
Modeling Signal-Noise Processes Supports Student Construction of a Hierarchical Image of Sample
ERIC Educational Resources Information Center
Lehrer, Richard
2017-01-01
Grade 6 (modal age 11) students invented and revised models of the variability generated as each measured the perimeter of a table in their classroom. To construct models, students represented variability as a linear composite of true measure (signal) and multiple sources of random error. Students revised models by developing sampling…
Exploring the Emotional Side of Job Search Behavior for Younger Workforce Entrants.
ERIC Educational Resources Information Center
Linnehan, Frank; Blau, Gary
1998-01-01
A sample of 18- to 23-year-old workforce entrants (N=332) was broken into subsamples. Study 1 found support for detached and interactive job-search behavior which seemed to represent different levels of emotional involvement in the job-search process. Study 2 involved working college students (N=117) and found that extroverts favored interactive…
ERIC Educational Resources Information Center
Freeney, Yseult; O'Connell, Michael
2012-01-01
Early school-leaving exerts substantial costs on the individual and society. The literature indicates that quitting school early is predicted by an enmeshed group of indicators including academic and behavioural difficulties in school, deprived economic background and disengagement with the educational process. The attitudes and background of a…
Saldaña, Erick; Castillo, Luiz Saldarriaga; Sánchez, Jorge Cabrera; Siche, Raúl; de Almeida, Marcio Aurélio; Behrens, Jorge H; Selani, Miriam Mabel; Contreras-Castillo, Carmen J
2018-06-01
The aim of this study was to perform a descriptive analysis (DA) of bacons smoked with woods from reforestation and with liquid smokes in order to investigate their sensory profile. Six samples of bacon were selected: three bacons smoked with different wood species (Eucalyptus citriodora, Acacia mearnsii, and Bambusa vulgaris), two artificially smoked bacon samples (liquid smoke) and one negative control (unsmoked bacon). Additionally, a commercial bacon sample was also evaluated. The DA was developed successfully, showing good performance in terms of discrimination, consensus and repeatability. The study revealed that the smoking process modified the sensory profile by intensifying the "saltiness" and differentiating the unsmoked from the smoked samples. The results of the current research represent the first methodological development of descriptive analysis of bacon and may be used by food companies and other stakeholders to understand the changes in sensory characteristics of bacon due to the traditional smoking process. Copyright © 2018 Elsevier Ltd. All rights reserved.
Cocchi, Marina; Durante, Caterina; Grandi, Margherita; Manzini, Daniela; Marchetti, Andrea
2008-01-15
The present research is aimed at monitoring the evolution of the volatile organic compounds of different samples of Aceto Balsamico Tradizionale di Modena (ABTM) during ageing. The flavouring compounds in the headspace fraction of the vinegars of four batterie were sampled by the solid phase microextraction technique (SPME) and subsequently analysed by gas chromatography, yielding a data set characterized by different sources of variability, such as different producers, samples of different ages and differing chromatographic profiles. The gas chromatographic signals were processed by a three-way data analysis method (Tucker3), which allows an easy visualisation of the data by furnishing a distinct set of graphs for each source of variability. The obtained results indicate that the samples can be separated according to their age, highlighting the chemical constituents which play a major role in their differentiation. The present study represents an example of how the application of Tucker3 models to gas chromatographic signals may help to follow the transformation processes of food products.
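As a hedged illustration of the kind of three-way decomposition described in this abstract, the sketch below fits a Tucker3 model to a synthetic samples × retention-time × source array; the tensorly library, the array shape, and the chosen ranks are assumptions for illustration, not the authors' software or data.

```python
# Minimal Tucker3 sketch on a synthetic three-way data array (assumption: tensorly,
# synthetic shape and ranks; not the ABTM study's actual pipeline or data).
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)
# hypothetical array: 16 vinegar samples x 200 retention-time points x 4 sources
X = tl.tensor(rng.random((16, 200, 4)))

# fit a Tucker3 model with a small number of components per mode
core, factors = tucker(X, rank=[3, 5, 2])

# one loading matrix per mode; plotting the sample-mode loadings against age would
# give the kind of per-source-of-variability graphs described in the abstract
for mode, A in enumerate(factors):
    print(f"mode {mode}: loadings shape {A.shape}")
```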
Erickson, Collin B; Ankenman, Bruce E; Sanchez, Susan M
2018-06-01
This data article provides the summary data from tests comparing various Gaussian process software packages. Each spreadsheet represents a single function or type of function using a particular input sample size. In each spreadsheet, a row gives the results for a particular replication using a single package. Within each spreadsheet there are the results from eight Gaussian process model-fitting packages on five replicates of the surface. There is also one spreadsheet comparing the results from two packages performing stochastic kriging. These data enable comparisons between the packages to determine which package will give users the best results.
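The spreadsheets summarize per-replicate accuracy of Gaussian process fits. A minimal sketch of that kind of test loop is shown below, using scikit-learn's GaussianProcessRegressor as a stand-in for one of the compared packages; the test surface, sample size, and error metric are illustrative assumptions.

```python
# Fit a GP to a sampled test function and record one error value per replicate,
# mirroring the one-row-per-replicate layout described for the spreadsheets.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def test_surface(x):
    # simple smooth 1-D surface (assumption, not one of the data article's functions)
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(1)
errors = []
for replicate in range(5):                      # five replicates of the surface
    X = rng.uniform(0, 5, size=(20, 1))         # design points (input sample size 20)
    y = test_surface(X).ravel()
    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X, y)
    Xtest = np.linspace(0, 5, 200).reshape(-1, 1)
    rmse = np.sqrt(np.mean((gp.predict(Xtest) - test_surface(Xtest).ravel()) ** 2))
    errors.append(rmse)

print("per-replicate RMSE:", np.round(errors, 4))
```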
Forensic discrimination of copper wire using trace element concentrations.
Dettman, Joshua R; Cassabaum, Alyssa A; Saunders, Christopher P; Snyder, Deanna L; Buscaglia, JoAnn
2014-08-19
Copper may be recovered as evidence in high-profile cases such as thefts and improvised explosive device incidents; comparison of copper samples from the crime scene and those associated with the subject of an investigation can provide probative associative evidence and investigative support. A solution-based inductively coupled plasma mass spectrometry method for measuring trace element concentrations in high-purity copper was developed using standard reference materials. The method was evaluated for its ability to use trace element profiles to statistically discriminate between copper samples considering the precision of the measurement and manufacturing processes. The discriminating power was estimated by comparing samples chosen on the basis of the copper refining and production process to represent the within-source (samples expected to be similar) and between-source (samples expected to be different) variability using multivariate parametric- and empirical-based data simulation models with bootstrap resampling. If the false exclusion rate is set to 5%, >90% of the copper samples can be correctly determined to originate from different sources using a parametric-based model and >87% with an empirical-based approach. These results demonstrate the potential utility of the developed method for the comparison of copper samples encountered as forensic evidence.
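The within-source versus between-source comparison described above can be illustrated with a small bootstrap sketch: simulate elemental profiles for pairs of copper samples, set a distance threshold from the within-source distribution at a 5% false exclusion rate, and estimate how often different-source pairs exceed it. All numbers below are synthetic assumptions, not the study's measurements or models.

```python
# Bootstrap-style estimate of discriminating power at a fixed false exclusion rate,
# on synthetic trace-element profiles (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
n_elements = 6          # number of trace elements in a profile
n_boot = 2000

def pair_distance(a, b):
    return np.linalg.norm(a - b)

# within-source pairs: two measurements of the same (simulated) source
within = np.array([
    pair_distance(*(rng.normal(loc=10.0, scale=0.5, size=(2, n_elements))))
    for _ in range(n_boot)
])
# between-source pairs: measurements drawn from sources with shifted means
between = np.array([
    pair_distance(rng.normal(10.0, 0.5, n_elements), rng.normal(11.0, 0.5, n_elements))
    for _ in range(n_boot)
])

threshold = np.quantile(within, 0.95)            # fixes the false exclusion rate at 5%
discriminating_power = np.mean(between > threshold)
print(f"fraction of different-source pairs separated: {discriminating_power:.2%}")
```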
40 CFR Appendix I to Part 261 - Representative Sampling Methods
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 27 2013-07-01 2013-07-01 false Representative Sampling Methods I Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Click, D. R.; Edwards, T. B.; Wiedenman, B. J.
2013-03-18
This report contains the results and comparison of data generated from inductively coupled plasma – atomic emission spectroscopy (ICP-AES) analysis of Aqua Regia (AR), Sodium Peroxide/Sodium Hydroxide Fusion Dissolution (PF) and Cold Chem (CC) method digestions and Cold Vapor Atomic Absorption analysis of Hg digestions from the DWPF Hg digestion method of Sludge Batch 8 (SB8) Sludge Receipt and Adjustment Tank (SRAT) Receipt and SB8 SRAT Product samples. The SB8 SRAT Receipt and SB8 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB8 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 7b (SB7b), to form the SB8 Blend composition.
Horowitz, A.J.; Lum, K.R.; Garbarino, J.R.; Hall, G.E.M.; Lemieux, C.; Demas, C.R.
1996-01-01
Field and laboratory experiments indicate that a number of factors associated with filtration other than just pore size (e.g., diameter, manufacturer, volume of sample processed, amount of suspended sediment in the sample) can produce significant variations in the 'dissolved' concentrations of such elements as Fe, Al, Cu, Zn, Pb, Co, and Ni. The bulk of these variations result from the inclusion/exclusion of colloidally associated trace elements in the filtrate, although dilution and sorption/desorption from filters also may be factors. Thus, dissolved trace element concentrations quantitated by analyzing filtrates generated by processing whole water through similar pore-sized filters may not be equal or comparable. As such, simple filtration of unspecified volumes of natural water through unspecified 0.45-µm membrane filters may no longer represent an acceptable operational definition for a number of dissolved chemical constituents.
137Cs as a tracer of recent sedimentary processes in Lake Michigan
Cahill, R.A.; Steele, J.D.
1986-01-01
To determine recent sediment movement, we measured the levels of 137Cs (an artificial radionuclide produced during nuclear weapons testing) in 118 samples from southern Lake Michigan and 27 from Green Bay. These samples, taken from 286 grab samples of the upper 3 cm of sediment, were collected in 1975 as part of a systematic study of Lake Michigan sediment. 137Cs levels correlated well with concentrations of organic carbon, lead, and other anthropogenic trace metals in the sediment. 137Cs had a higher correlation with silt-sized than with clay-sized sediment (0.55 and 0.46, respectively). Atmospherically derived 137Cs and trace metals are being redistributed by sedimentary processes in Lake Michigan after being incorporated in suspended sediment. We determined a distribution pattern of 137Cs that represents areas of southern Lake Michigan where sediment deposition is occurring. © 1986 Dr W. Junk Publishers.
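A minimal sketch of the grain-size correlation reported above (Pearson correlation of 137Cs activity with silt- and clay-sized fractions) is shown below on hypothetical per-sample values; the real analysis used the grab-sample measurements described in the abstract.

```python
# Pearson correlation of 137Cs activity with grain-size fractions on synthetic data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
silt_fraction = rng.uniform(0.1, 0.8, 50)
clay_fraction = rng.uniform(0.05, 0.5, 50)
cs137 = 2.0 * silt_fraction + 1.2 * clay_fraction + rng.normal(0, 0.3, 50)  # synthetic activity

for name, frac in [("silt", silt_fraction), ("clay", clay_fraction)]:
    r, p = pearsonr(cs137, frac)
    print(f"137Cs vs {name}-sized fraction: r = {r:.2f} (p = {p:.3g})")
```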
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarley, Brooke A.; Manero, Albert; Cotelo, Jose
2017-01-01
Selective laser melting (SLM) is an additive manufacturing process that uses laser scanning to achieve melting and solidification of a metal powder bed. This process, when applied to develop high temperature material systems, holds great promise for more efficient manufacturing of turbine components that withstand extreme temperatures, heat fluxes, and high mechanical stresses associated with engine environments. These extreme operational conditions demand stringent tolerances and an understanding of the material evolution under thermal loading. This work presents a real-time approach to elucidating the evolution of precipitate phases in SLM Inconel 718 (IN718) under high temperatures using high-energy synchrotron x-ray diffraction. Four representative samples (taken along variable build height) were studied in room temperature conditions. Two samples were studied as-processed (samples 1 and 4) and two samples after different thermal treatments (samples 2 and 3). The as-processed samples were found to contain greater amounts of the weakening phase, δ. Precipitation hardening of Sample 2 reduced the detectable volume of δ, while also promoting growth of γ″ in the γ matrix. Inversely, solution treatment of Sample 3 produced an overall decrease in precipitate phases. High-temperature, in-situ synchrotron scans during ramp-up, hold, and cool down of two different thermal cycles show the development of precipitate phases. Sample 1 was held at 870°C and subsequently ramped up to 1100°C, during which the high temperature instability of the strengthening precipitate, γ″, was seen. γ″ dissolution occurred after 15 minutes at 870°C and was followed by an increase of δ-phase. Sample 4 was held at 800°C and exhibited growth of γ″ after 20 minutes at this temperature. These experiments use in-situ observations to understand the intrinsic thermal effect of the SLM process and the use of heat treatment to manipulate the phase composition of SLM IN718.
Methanoculleus spp. as a biomarker of methanogenic activity in swine manure storage tanks.
Barret, Maialen; Gagnon, Nathalie; Morissette, Bruno; Topp, Edward; Kalmokoff, Martin; Brooks, Stephen P J; Matias, Fernando; Massé, Daniel I; Masse, Lucie; Talbot, Guylaine
2012-05-01
Greenhouse gas emissions represent a major problem associated with manure management in the livestock industry. A prerequisite to mitigate methane emissions occurring during manure storage is a clearer understanding of how the microbial consortia involved in methanogenesis function. Here, we have examined manure stored in outdoor tanks from two different farms, at different locations and depths. Physico-chemical and microbiological characterization of these samples indicated differences between each tank, as well as differences within each tank dependent on the depth of sampling. The dynamics of both the bacterial and archaeal communities within these samples were monitored over a 150-day period of anaerobic incubation to identify and track emerging microorganisms, which may be temporally important in the methanogenesis process. Analyses based on DNA fingerprinting of microbial communities identified trends common among all samples as well as trends specific to certain samples. All archaeal communities became enriched with Methanoculleus spp. over time, indicating that the hydrogenotrophic pathway of methanogenesis predominated. Although the emerging species differed in samples obtained from shallow depths compared to deep samples, the temporal enrichment of Methanoculleus suggests that this genus may represent a relevant indicator of methanogenic activity in swine manure storage tanks. © Her Majesty the Queen in Right of Canada 2012. Reproduced with the permission of the Minister of Agriculture and Agri-food Canada.
COMPARISON OF ECOLOGICAL COMMUNITIES: THE PROBLEM OF SAMPLE REPRESENTATIVENESS
Obtaining an adequate, representative sample of ecological communities to make taxon richness (TR) or compositional comparisons among sites is a continuing challenge. Sample representativeness literally means the similarity in species composition and relative abundance between a ...
Mirzaee, Seyyed Abbas; Nikaeen, Mahnaz; Hajizadeh, Yaghob; Nabavi, BiBi Fatemeh; Hassanzadeh, Akbar
2015-01-01
Background: Wastewater contains a variety of pathogens and bio-aerosols generated during the wastewater treatment process, which could be a potential health risk for exposed individuals. This study was carried out to detect Legionella spp. in the bio-aerosols generated from different processes of a wastewater treatment plant (WWTP) in Isfahan, Iran, and the downwind distances. Materials and Methods: A total of 54 air samples were collected and analyzed for the presence of Legionella spp. by a nested polymerase chain reaction (PCR) assay. A liquid impingement biosampler was used to capture bio-aerosols. The weather conditions were also recorded. Results: Legionella were detected in 6% of the samples, including air samples above the aeration tank (1/9), belt filter press (1/9), and 250 m downwind (1/9). Conclusion: The result of this study revealed the presence of Legionella spp. in air samples of a WWTP and downwind distance, which consequently represent a potential health risk to the exposed individuals. PMID:25802817
Instrument to collect fogwater for chemical analysis
NASA Astrophysics Data System (ADS)
Jacob, Daniel J.; Waldman, Jed M.; Haghi, Mehrdad; Hoffmann, Michael R.; Flagan, Richard C.
1985-06-01
An instrument is presented which collects large samples of ambient fogwater by impaction of droplets on a screen. The collection efficiency of the instrument is determined as a function of droplet size, and it is shown that fog droplets in the range 3-100-μm diameter are efficiently collected. No significant evaporation or condensation occurs at any stage of the collection process. Field testing indicates that samples collected are representative of the ambient fogwater. The instrument may easily be automated, and is suitable for use in routine air quality monitoring programs.
DeLeon-Rodriguez, Natasha; Lathem, Terry L; Rodriguez-R, Luis M; Barazesh, James M; Anderson, Bruce E; Beyersdorf, Andreas J; Ziemba, Luke D; Bergin, Michael; Nenes, Athanasios; Konstantinidis, Konstantinos T
2013-02-12
The composition and prevalence of microorganisms in the middle-to-upper troposphere (8-15 km altitude) and their role in aerosol-cloud-precipitation interactions represent important, unresolved questions for biological and atmospheric science. In particular, airborne microorganisms above the oceans remain essentially uncharacterized, as most work to date is restricted to samples taken near the Earth's surface. Here we report on the microbiome of low- and high-altitude air masses sampled onboard the National Aeronautics and Space Administration DC-8 platform during the 2010 Genesis and Rapid Intensification Processes campaign in the Caribbean Sea. The samples were collected in cloudy and cloud-free air masses before, during, and after two major tropical hurricanes, Earl and Karl. Quantitative PCR and microscopy revealed that viable bacterial cells represented on average around 20% of the total particles in the 0.25- to 1-μm diameter range and were at least an order of magnitude more abundant than fungal cells, suggesting that bacteria represent an important and underestimated fraction of micrometer-sized atmospheric aerosols. The samples from the two hurricanes were characterized by significantly different bacterial communities, revealing that hurricanes aerosolize a large amount of new cells. Nonetheless, 17 bacterial taxa, including taxa that are known to use C1-C4 carbon compounds present in the atmosphere, were found in all samples, indicating that these organisms possess traits that allow survival in the troposphere. The findings presented here suggest that the microbiome is a dynamic and underappreciated aspect of the upper troposphere with potentially important impacts on the hydrological cycle, clouds, and climate.
Wayland, Karen G.; Long, David T.; Hyndman, David W.; Pijanowski, Bryan C.; Woodhams, Sarah M.; Haak, Sheridan K.
2003-01-01
The relationship between land use and stream chemistry is often explored through synoptic sampling of rivers at baseflow conditions. However, baseflow chemistry is likely to vary temporally and spatially with land use. The purpose of our study is to examine the usefulness of the synoptic sampling approach for identifying the relationship between complex land use configurations and stream water quality. This study compares biogeochemical data from three synoptic sampling events representing the temporal variability of baseflow chemistry and land use using R-mode factor analysis. Separate R-mode factor analyses of the data from individual sampling events yielded only two consistent factors. Agricultural activity was associated with elevated levels of Ca2+, Mg2+, alkalinity, and frequently K+, SO42-, and NO3-. Urban areas were associated with higher concentrations of Na+, K+, and Cl-. Other retained factors were not consistent among sampling events, and some factors were difficult to interpret in the context of biogeochemical sources and processes. When all data were combined, further associations were revealed, such as an inverse relationship between the proportion of wetlands and stream nitrate concentrations. We also found that barren lands were associated with elevated sulfate levels. This research suggests that an individual sampling event is unlikely to characterize adequately the complex processes controlling interactions between land uses and stream chemistry. Combining data collected over two years during three synoptic sampling events appears to enhance our ability to understand processes linking stream chemistry and land use.
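The R-mode factor analysis step described above can be sketched as follows, using scikit-learn's FactorAnalysis on a synthetic site-by-analyte table as a stand-in for the authors' statistical software; the analyte list follows the abstract, the values do not.

```python
# R-mode style factor analysis on a synthetic water-chemistry table (illustrative only).
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
analytes = ["Ca", "Mg", "Alk", "K", "SO4", "NO3", "Na", "Cl"]
# hypothetical matrix: one row per stream site, one column per analyte
X = pd.DataFrame(rng.lognormal(mean=1.0, sigma=0.4, size=(60, len(analytes))),
                 columns=analytes)

Z = StandardScaler().fit_transform(X)            # standardize before factoring
fa = FactorAnalysis(n_components=2, random_state=0).fit(Z)

loadings = pd.DataFrame(fa.components_.T, index=analytes,
                        columns=["factor1", "factor2"])
print(loadings.round(2))   # an "agricultural" vs "urban" loading pattern would appear here
```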
Rothrock, M J; Locatelli, A; Glenn, T C; Thomas, J C; Caudill, A C; Kiepper, B H; Hiett, K L
2016-10-01
The commercial poultry processing environment plays a significant role in reducing foodborne pathogens and spoilage organisms from poultry products prior to being supplied to consumers. While understanding the microbiological quality of these products is essential, little is known about the microbiota of processing water tanks within the processing plant. Therefore, the goal of this study was to assess the microbiomes of the scalder and chiller tanks during a typical commercial processing day, and determine how bacterial populations, including foodborne pathogens and spoilage organisms, change during the processing day in relation to the bacterial communities as a whole. Additionally, considering this is the first microbiomic analysis of processing tank waters, 2 water sampling methods also were compared. Results of this study show that Proteobacteria and Firmicutes represented over half of the sequences recovered from both tanks at the phylum level, but the microbiomic profiles needed to be analyzed at the genus level to observe more dynamic population shifts. Bacteria known to predominate in the live production environment were found to increase in the scalder tank and gram-negative spoilage-related bacteria were found to decrease in the chiller tank throughout the processing day. Directly sampling the scalder water, as compared to analyzing filtered samples, resulted in significantly different microbiomic profiles dominated by Anoxybacillus species. While no sequences related to major foodborne pathogens were found, further sample collection and processing optimization should provide researchers and the poultry industry a new tool to understand the ecological role of spoilage and pathogenic bacteria within processing tank waters. Published by Oxford University Press on behalf of Poultry Science Association 2016. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Zaveri, Toral; Running, Cordelia A; Surapaneni, Lahari; Ziegler, Gregory R; Hayes, John E
2016-01-01
Vaginal microbicides are a promising means to prevent the transmission of HIV, empowering women by putting protection under their control. We have been using gel technology to develop microbicides in the intermediate texture space to overcome shortcomings of current solid and liquid forms. We recently formulated semisoft ovules from mixed polymer combinations of carrageenan and Carbopol 940P to overcome some of the flaws with our previous generation of formulations based solely on carrageenan. To determine the user acceptability of the reformulated gels, women first evaluated intact semisoft ovules before evaluating ovules that had been subjected to mechanical crushing to simulate samples that represent post-use discharge. Women then evaluated combinations of intact and discharge samples to understand how ovule textures correlated with texture of the resulting discharge samples. Carbopol concentration directly and inversely correlated with willingness to try for discharge samples and intact samples respectively. When evaluating intact samples, women focused on the ease of inserting the product and preferred firmer samples; conversely, when evaluating discharge samples, softer samples that resulted in a smooth paste were preferred. Significant differences between samples were lost when evaluating pairs as women made varying tradeoffs between their preference for ease of inserting intact ovules and acceptability of discharge appearance. Evaluating samples that represent different stages of the use cycle reveals a more holistic measure of product acceptability. Studying sensory acceptability in parallel with biophysical performance enables an iterative design process that considers what women prefer in terms of insertion as well as possibility of leakage. PMID:27357703
Zaveri, Toral; Running, Cordelia A; Surapaneni, Lahari; Ziegler, Gregory R; Hayes, John E
2016-10-01
Vaginal microbicides are a promising means to prevent the transmission of HIV, empowering women by putting protection under their control. We have been using gel technology to develop microbicides in the intermediate texture space to overcome shortcomings of current solid and liquid forms. We recently formulated semisoft ovules from mixed polymer combinations of carrageenan and Carbopol 940P to overcome some of the flaws with our previous generation of formulations based solely on carrageenan. To determine the user acceptability of the reformulated gels, women first evaluated intact semisoft ovules before evaluating ovules that had been subjected to mechanical crushing to simulate samples that represent post-use discharge. Women then evaluated combinations of intact and discharge samples to understand how ovule textures correlated with texture of the resulting discharge samples. Carbopol concentration directly and inversely correlated with willingness to try for discharge samples and intact samples, respectively. When evaluating intact samples, women focused on the ease of inserting the product and preferred firmer samples; conversely, when evaluating discharge samples, softer samples that resulted in a smooth paste were preferred. Significant differences between samples were lost when evaluating pairs as women made varying trade-offs between their preference for ease of inserting intact ovules and acceptability of discharge appearance. Evaluating samples that represent different stages of the use cycle reveals a more holistic measure of product acceptability. Studying sensory acceptability in parallel with biophysical performance enables an iterative design process that considers what women prefer in terms of insertion as well as possibility of leakage.
Inference from clustering with application to gene-expression microarrays.
Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M
2002-01-01
There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. A large amount of generated output is available over the web.
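The core evaluation loop described in this abstract (generate points as model means plus independent noise, cluster them, count misassigned points) can be sketched briefly; KMeans, the hand-picked means, and the noise level below are illustrative assumptions rather than the toolbox's implementation.

```python
# Model-based evaluation of clustering error on synthetic "expression pattern" data.
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(5)
means = np.array([[0.0, 0.0], [3.0, 3.0], [0.0, 4.0]])   # hand-picked pattern means
n_per_class, sigma = 50, 1.0

X = np.vstack([m + rng.normal(0, sigma, (n_per_class, 2)) for m in means])
true_labels = np.repeat(np.arange(len(means)), n_per_class)

pred = KMeans(n_clusters=len(means), n_init=10, random_state=0).fit_predict(X)

# match predicted cluster ids to generating processes before counting errors
confusion = np.zeros((len(means), len(means)), dtype=int)
for t, p in zip(true_labels, pred):
    confusion[t, p] += 1
rows, cols = linear_sum_assignment(-confusion)           # best label permutation
n_correct = confusion[rows, cols].sum()
print(f"clustering error: {len(X) - n_correct} of {len(X)} points")
```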
Evidentiary, extraevidentiary, and deliberation process predictors of real jury verdicts.
Devine, Dennis J; Krouse, Paige C; Cavanaugh, Caitlin M; Basora, Jaime Colon
2016-12-01
In contrast to the extensive literature based on mock jurors, large-sample studies of decision making by real juries are relatively rare. In this field study, we examined relationships between jury verdicts and variables representing 3 classes of potential determinants-evidentiary, extraevidentiary, and deliberation process-using a sample of 114 criminal jury trials. Posttrial data were collected from 11 presiding judges, 31 attorneys, and 367 jurors using a Web-based questionnaire. The strength of the prosecution's evidence was strongly related to the occurrence of a conviction, whereas most extraevidentiary and deliberation process variables were only weakly to modestly related in bivariate form and when the prosecution's evidence strength was controlled. Notable exceptions to this pattern were jury demographic diversity as represented by the number of different race-gender subgroups (e.g., Black males) present in the jury, and several deliberation process variables reflecting advocacy for acquittal (e.g., presence of an identifiable proacquittal faction within the jury and proacquittal advocacy by the foreperson). Variables reflecting advocacy for conviction were essentially unrelated to jury verdict. Sets of extraevidentiary and deliberation variables were each able to modestly improve the explanation of jury verdicts over prosecution evidence strength in multivariate models. This study highlights the predictive efficacy of prosecution evidence strength with respect to jury verdicts, as well as the potential importance of jury demographic diversity and advocacy for acquittal during deliberation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
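As a loose illustration of the multivariate modelling implied above, the sketch below asks whether adding extraevidentiary and deliberation variables improves prediction of a conviction beyond evidence strength. The data are simulated and the variable names and effect sizes are assumptions, so it mirrors only the structure of the analysis, not its results.

```python
# Compare an evidence-only model with an evidence-plus-extraevidentiary model on
# simulated trial data (illustrative structure only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 114                                           # matches the number of trials studied
evidence = rng.normal(0, 1, n)                    # prosecution evidence strength
diversity = rng.integers(1, 7, n)                 # number of race-gender subgroups on the jury
acquittal_faction = rng.integers(0, 2, n)         # identifiable pro-acquittal faction present?
logit = 1.5 * evidence - 0.3 * acquittal_faction - 0.1 * diversity
conviction = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_evidence = evidence.reshape(-1, 1)
X_full = np.column_stack([evidence, diversity, acquittal_faction])
for label, X in [("evidence only", X_evidence), ("evidence + extra", X_full)]:
    acc = cross_val_score(LogisticRegression(), X, conviction, cv=5).mean()
    print(f"{label}: mean CV accuracy = {acc:.2f}")
```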
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du Preez, S. P.; Beukes, J. P.; Van Dalen, W. P. J.
The production of ferrochrome (FeCr) is a reducing process. However, it is impossible to completely exclude oxygen from all of the high-temperature production process steps, which may lead to unintentional formation of small amounts of Cr(VI). The majority of Cr(VI) is associated with particles found in the off-gas of the high-temperature processes, which are cleaned by means of venturi scrubbers or bag filter dust (BFD) systems. BFD contains the highest concentration of Cr(VI) of all FeCr wastes. In this study, the solubility of Cr(VI) present in BFD was determined by evaluating four different BFD samples. The results indicate that the currently applied Cr(VI) treatment strategies of the FeCr producer (with process water pH ≤ 9) only effectively extract and treat the water-soluble Cr(VI) compounds, which merely represented approximately 31% of the total Cr(VI) present in the BFD samples evaluated. Extended extraction time, within the afore-mentioned pH range, proved futile in extracting sparingly-soluble and water-insoluble Cr(VI) species, which represented approximately 34% and 35% of the total Cr(VI), respectively. Due to the deficiencies of the current treatment strategies, it is highly likely that sparingly water-soluble Cr(VI) compounds will leach from waste storage facilities (e.g. slimes dams) over time. Therefore, it is critical that improved Cr(VI) treatment strategies be formulated, which should be an important future perspective for FeCr producers and researchers alike.
Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online
Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary
2018-01-01
Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)–mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574
Advanced Interrogation of Fiber-Optic Bragg Grating and Fabry-Perot Sensors with KLT Analysis
Tosi, Daniele
2015-01-01
The Karhunen-Loeve Transform (KLT) is applied to accurate detection of optical fiber sensors in the spectral domain. By processing an optical spectrum, although coarsely sampled, through the KLT, and subsequently processing the obtained eigenvalues, it is possible to decode a plurality of optical sensor results. The KLT returns higher accuracy than other demodulation techniques, despite coarse sampling, and exhibits higher resilience to noise. Three case studies of KLT-based processing are presented, representing most of the current challenges in optical fiber sensing: (1) demodulation of individual sensors, such as Fiber Bragg Gratings (FBGs) and Fabry-Perot Interferometers (FPIs); (2) demodulation of dual (FBG/FPI) sensors; (3) application of reverse KLT to isolate different sensors operating on the same spectrum. A simulative outline is provided to demonstrate the KLT operation and estimate performance; a brief experimental section is also provided to validate accurate FBG and FPI decoding. PMID:26528975
Advanced Interrogation of Fiber-Optic Bragg Grating and Fabry-Perot Sensors with KLT Analysis.
Tosi, Daniele
2015-10-29
The Karhunen-Loeve Transform (KLT) is applied to accurate detection of optical fiber sensors in the spectral domain. By processing an optical spectrum, although coarsely sampled, through the KLT, and subsequently processing the obtained eigenvalues, it is possible to decode a plurality of optical sensor results. The KLT returns higher accuracy than other demodulation techniques, despite coarse sampling, and exhibits higher resilience to noise. Three case studies of KLT-based processing are presented, representing most of the current challenges in optical fiber sensing: (1) demodulation of individual sensors, such as Fiber Bragg Gratings (FBGs) and Fabry-Perot Interferometers (FPIs); (2) demodulation of dual (FBG/FPI) sensors; (3) application of reverse KLT to isolate different sensors operating on the same spectrum. A simulative outline is provided to demonstrate the KLT operation and estimate performance; a brief experimental section is also provided to validate accurate FBG and FPI decoding.
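A generic sketch of the first KLT step described in these two records is given below: form an autocorrelation matrix from a coarsely sampled spectrum and examine its eigenvalues, here for simulated Fabry-Perot fringe patterns of different periods. The matrix construction and the use of the leading eigenvalues are assumptions; the paper's exact formulation, and the subsequent eigenvalue processing that actually decodes the sensor, are in the full text.

```python
# Generic KLT-of-spectrum sketch: autocorrelation matrix eigenvalues of a coarsely
# sampled spectrum (assumed construction; not the paper's exact algorithm).
import numpy as np
from scipy.linalg import toeplitz

def klt_eigenvalues(spectrum, order=16):
    # autocorrelation of the mean-removed spectrum, truncated to 'order' lags
    x = spectrum - spectrum.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1 : len(x) - 1 + order]
    R = toeplitz(acf)                       # Toeplitz autocorrelation matrix
    return np.sort(np.linalg.eigvalsh(R))[::-1]

wavelengths = np.linspace(1545.0, 1555.0, 64)    # nm, coarse sampling
for fringe_period in (0.5, 0.8, 1.2):            # nm, simulated FPI fringe spacings
    spectrum = 0.5 * (1 + np.cos(2 * np.pi * wavelengths / fringe_period))
    lam = klt_eigenvalues(spectrum)
    print(f"fringe period {fringe_period} nm -> leading eigenvalues {lam[:3].round(3)}")
```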
NASA Technical Reports Server (NTRS)
Figueredo, P. H.; Tanaka, K.; Senske, D.; Greeley, R.
2003-01-01
Knowledge of the geology, style and time history of crustal processes on the icy Galilean satellites is necessary to understanding how these bodies formed and evolved. Data from the Galileo mission have provided a basis for detailed geologic and geophysical analysis. Due to constrained downlink, Galileo Solid State Imaging (SSI) data consisted of global coverage at a ~1 km/pixel ground sampling and representative, widely spaced regional maps at ~200 m/pixel. These two data sets provide a general means to extrapolate units identified at higher resolution to lower resolution data. A sampling of key sites at much higher resolution (10s of m/pixel) allows evaluation of processes on local scales. We are currently producing the first global geological map of Europa using Galileo global and regional-scale data. This work is demonstrating the necessity and utility of planet-wide contiguous image coverage at global, regional, and local scales.
Kovacevik, Biljana; Boev, Blazo; Panova, Vesna Zajkova; Mitrev, Sasa
2016-12-05
The aim of this study was to investigate the groundwater pollution from alluvial aquifers lying under surface agriculture activities in two geologically different areas: alluvial and prolluvial. The groundwater in investigated areas is neutral to alkaline (pH 7.05-8.45), and the major dissolved ions are bicarbonate and calcium. Groundwater samples from the alluvial area are characterized by nitrate concentration above the national maximum concentration limit (MCL) at 20.5% of samples [mean value (Me) 6.31 mg/L], arsenic concentrations greater than the national MCL at 35.6% of investigated samples (Me 12.12 µg/L) and elevated concentrations of iron (Me 202.37 µg/L) and manganese (Me 355.22 µg/L) at 22.7% and 81% of investigated samples, respectively. Groundwater samples from the prolluvial area did not show significantly elevated concentrations of heavy metals, but the concentration of nitrate was considerably higher (Me 65.06 mg/L). Factor analysis positively correlates As with Mn and Fe, suggesting its natural origin. Nitrate was found in positive correlation with SO4(2-) and Ni but in negative correlation with NH4(+), suggesting its anthropogenic origin and the relationship of these ions in the process of denitrification. The t-test analysis showed a significant difference between nitrate pollution of groundwater from alluvial and prolluvial areas. According to the chemical composition of groundwater, the process of denitrification is considered to be the main reason for the reduced presence of nitrate in the groundwater lying under alluvial deposits represented by chalk and sandstones. Denitrification in groundwater lying under prolluvial deposits represented by magmatic and metamorphic rock formations was not observed.
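The group comparison reported above can be sketched as a two-sample t-test on nitrate concentrations from the two areas; the vectors below are synthetic placeholders shaped to echo the reported means, and the sample sizes and use of Welch's correction are assumptions.

```python
# Two-sample t-test of nitrate in alluvial vs prolluvial groundwater (synthetic data).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
nitrate_alluvial = rng.normal(loc=6.3, scale=3.0, size=44).clip(min=0)      # mg/L
nitrate_prolluvial = rng.normal(loc=65.1, scale=20.0, size=30).clip(min=0)  # mg/L

t, p = ttest_ind(nitrate_prolluvial, nitrate_alluvial, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3g}")
```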
Alessandria, Valentina; Rantsiou, Kalliopi; Dolci, Paola; Cocolin, Luca
2010-07-31
In this study we investigated the occurrence of Listeria monocytogenes in a dairy processing plant during two sampling campaigns in 2007 and 2008. Samples represented by semifinished and finished cheeses, swabs from the equipment and brines from the salting step were subjected to analysis by using traditional and molecular methods, represented mainly by quantitative PCR. Comparing the results obtained by the application of the two approaches, it became evident how traditional microbiological analysis underestimated the presence of L. monocytogenes in the dairy plant. In particular, samples of the brines and the equipment swabs were positive only with qPCR. For some equipment swabs it was possible to detect a load of 10^4-10^5 cfu/cm^2, while the modified ISO method employed gave negative results both before and after the enrichment step. The evidence collected during the first sampling year, highlighting heavy contamination of the brines and of the equipment, led to the implementation of specific actions that decreased the contamination in these samples during the 2008 campaign. However, no reduction in the number of L. monocytogenes positive final products was observed, suggesting that a more strict control is necessary to avoid the presence of the pathogen. All the isolates of L. monocytogenes were able to attach to abiotic surfaces, and, interestingly, considering the results obtained from their molecular characterization it became evident how strains present in the brines were genetically connected with isolates from the equipment and from the final product, suggesting a clear route of contamination of the pathogen in the dairy plant. This study underlines the necessity to use appropriate analytical tools, such as molecular methods, to fully understand the spread and persistence of L. monocytogenes in food producing companies. Copyright 2010 Elsevier B.V. All rights reserved.
Fatoyinbo, Henry O; McDonnell, Martin C; Hughes, Michael P
2014-07-01
Detection of pathogens from environmental samples is often hampered by sensors interacting with environmental particles such as soot, pollen, or environmental dust such as soil or clay. These particles may be of similar size to the target bacterium, preventing removal by filtration, but may non-specifically bind to sensor surfaces, fouling them and causing artefactual results. In this paper, we report the selective manipulation of soil particles using an AC electrokinetic microfluidic system. Four heterogeneous soil samples (smectic clay, kaolinitic clay, peaty loam, and sandy loam) were characterised using dielectrophoresis to identify the electrical difference to a target organism. A flow-cell device was then constructed to evaluate dielectrophoretic separation of bacteria and clay in a continuous flow-through mode. The average separation efficiency of the system across all soil types was found to be 68.7%, with a maximal separation efficiency for kaolinitic clay at 87.6%. This represents the first attempt to separate soil particles from bacteria using dielectrophoresis and indicates that the technique shows significant promise; with appropriate system optimisation, we believe that this preliminary study represents an opportunity to develop a simple yet highly effective sample processing system.
Kumar, Pawan; Yadav, Sudesh
2013-03-01
Atmospheric condensate (AC) and rainwater samples were collected during the 2010-2011 winter season from Delhi and characterized for major cations and anions. The observed order of abundance of cations and anions in AC samples was NH4(+) > Ca(2+) > Na(+) > K(+) > Mg(2+) and HCO3(-) > SO4(2-) > Cl(-) > NO2(-) > NO3(-) > F(-), respectively. All samples were alkaline in nature and the Σ(cation)/Σ(anion) ratio was found to be close to one. NH4(+) emissions followed by Ca(2+) and Mg(2+) were largely responsible for neutralization of acidity caused by high NOx and SO2 emissions from vehicles and thermal power plants in the region. Interestingly, AC samples show low nitrate content compared with its precursor nitrite, which is commonly reversed in the case of rainwater. It could be due to (1) slow light-mediated oxidation of HONO; (2) larger emission of NO2 and temperature inversion conditions entrapping them; and (3) formation and dissociation of ammonium nitrite, which seems to be possible as both carry close correlation in our data set. Principal component analysis indicated three factors (marine mixed with biomass burning, anthropogenic and terrestrial, and carbonates) for all ionic species. A significantly higher sulfate/nitrate ratio indicates greater anthropogenic contributions in AC samples compared with rainwater. Compared with rainwater, AC samples show higher abundance of all ionic species except SO4, NO3, and Ca, suggesting inclusion of these ions by the washout process during rain events. Ionic composition and related variations in AC and rainwater samples indicate that the two represent different processes in time and space coordinates. AC represents the near-surface interaction whereas rainwater chemistry is indicative of regional patterns. AC could be a suitable way to understand atmospheric water interactions with gas and solid particle species in the lower atmosphere.
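A short sketch of the principal component step described above is given below, applied to a synthetic ion-concentration table; the ion names follow the abstract, while the values and the three-component choice are illustrative assumptions.

```python
# PCA of a synthetic ion-concentration table for atmospheric condensate samples.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
ions = ["NH4", "Ca", "Na", "K", "Mg", "HCO3", "SO4", "Cl", "NO2", "NO3", "F"]
X = pd.DataFrame(rng.lognormal(mean=0.5, sigma=0.6, size=(40, len(ions))), columns=ions)

pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
loadings = pd.DataFrame(pca.components_.T, index=ions, columns=["PC1", "PC2", "PC3"])
print(loadings.round(2))          # ions grouping on the same PC suggest common sources
print("explained variance:", pca.explained_variance_ratio_.round(2))
```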
RNA extraction from decaying wood for (meta)transcriptomic analyses.
Adamo, Martino; Voyron, Samuele; Girlanda, Mariangela; Marmeisse, Roland
2017-10-01
Wood decomposition is a key step of the terrestrial carbon cycle and is of economic importance. It is essentially a microbiological process performed by fungi and, to an unknown extent, by bacteria. To gain access to the genes expressed by the diverse microbial communities participating in wood decay, we developed an RNA extraction protocol for this recalcitrant material rich in polysaccharides and phenolic compounds. This protocol was implemented on 22 wood samples representing as many tree species from 11 plant families in the Angiosperms and Gymnosperms. RNA was successfully extracted from all samples and converted into cDNAs, from which both fungal and bacterial protein coding genes were amplified, including genes encoding hydrolytic enzymes participating in lignocellulose hydrolysis. This protocol, applicable to a wide range of decomposing wood types, represents a first step towards a metatranscriptomic analysis of wood degradation under natural conditions.
Wills, Thomas A.; Sargent, James D.; Stoolmiller, Mike; Gibbons, Frederick X.; Gerrard, Meg
2009-01-01
The authors tested 2 mechanisms for the relation of movie smoking exposure with onset of cigarette smoking in adolescence. Longitudinal data with 8-month follow-up were obtained from a representative sample of 6,522 U.S. adolescents, ages 10–14 years. Structural modeling analysis based on initial nonsmokers, which controlled for 10 covariates associated with movie exposure, showed that viewing more smoking in movies was related to increases in positive expectancies about smoking and increases in affiliation with smoking peers, and these variables were both related to smoking onset. A direct effect of movie exposure on smoking onset was also noted. Mediation findings were replicated across cross-sectional and longitudinal analyses. Tests for gender differences indicated that girls showed larger effects of movie exposure for some variables. Implications for policy and prevention research are discussed. PMID:18540724
Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.
Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner
2016-01-01
Sampling limitations in electron microscopy call into question whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
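One building block of large-area map stitching is estimating the offset between partially overlapping tiles. The sketch below does this by phase correlation on synthetic tiles under a pure-translation assumption; it is not a description of the TU/e toolbox's actual pipeline.

```python
# Estimate the translation between two overlapping image tiles by phase correlation.
import numpy as np

def phase_correlation_shift(a, b):
    # peak of the inverse FFT of the normalized cross-power spectrum gives the shift
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-12
    corr = np.abs(np.fft.ifft2(R))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map shifts larger than half the tile size to negative offsets
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return dy, dx

rng = np.random.default_rng(9)
scene = rng.random((256, 256))
tile_a = scene[:200, :200]
tile_b = scene[30:230, 40:240]          # overlaps tile_a, shifted by (30, 40)
print(phase_correlation_shift(tile_a, tile_b))   # expected approximately (30, 40)
```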
Fasoli, Marianna; Dal Santo, Silvia; Zenoni, Sara; Tornielli, Giovanni Battista; Farina, Lorenzo; Zamboni, Anita; Porceddu, Andrea; Venturini, Luca; Bicego, Manuele; Murino, Vittorio; Ferrarini, Alberto; Delledonne, Massimo; Pezzotti, Mario
2012-09-01
We developed a genome-wide transcriptomic atlas of grapevine (Vitis vinifera) based on 54 samples representing green and woody tissues and organs at different developmental stages as well as specialized tissues such as pollen and senescent leaves. Together, these samples expressed ∼91% of the predicted grapevine genes. Pollen and senescent leaves had unique transcriptomes reflecting their specialized functions and physiological status. However, microarray and RNA-seq analysis grouped all the other samples into two major classes based on maturity rather than organ identity, namely, the vegetative/green and mature/woody categories. This division represents a fundamental transcriptomic reprogramming during the maturation process and was highlighted by three statistical approaches identifying the transcriptional relationships among samples (correlation analysis), putative biomarkers (O2PLS-DA approach), and sets of strongly and consistently expressed genes that define groups (topics) of similar samples (biclustering analysis). Gene coexpression analysis indicated that the mature/woody developmental program results from the reiterative coactivation of pathways that are largely inactive in vegetative/green tissues, often involving the coregulation of clusters of neighboring genes and global regulation based on codon preference. This global transcriptomic reprogramming during maturation has not been observed in herbaceous annual species and may be a defining characteristic of perennial woody plants.
On the Representativeness of Behavior Observation Samples in Classrooms
ERIC Educational Resources Information Center
Tiger, Jeffrey H.; Miller, Sarah J.; Mevers, Joanna Lomas; Mintz, Joslyn Cynkus; Scheithauer, Mindy C.; Alvarez, Jessica
2013-01-01
School consultants who rely on direct observation typically conduct observational samples (e.g., 1 30-min observation per day) with the hopes that the sample is representative of performance during the remainder of the day, but the representativeness of these samples is unclear. In the current study, we recorded the problem behavior of 3 referred…
Vizentin-Bugoni, Jeferson; Maruyama, Pietro K; Debastiani, Vanderlei J; Duarte, L da S; Dalsgaard, Bo; Sazima, Marlies
2016-01-01
Virtually all empirical ecological interaction networks to some extent suffer from undersampling. However, how limitations imposed by sampling incompleteness affect our understanding of ecological networks is still poorly explored, which may hinder further advances in the field. Here, we use a plant-hummingbird network with unprecedented sampling effort (2716 h of focal observations) from the Atlantic Rainforest in Brazil, to investigate how sampling effort affects the description of network structure (i.e. widely used network metrics) and the relative importance of distinct processes (i.e. species abundances vs. traits) in determining the frequency of pairwise interactions. By dividing the network into time slices representing a gradient of sampling effort, we show that quantitative metrics, such as interaction evenness, specialization (H2′), weighted nestedness (wNODF) and modularity (Q; QuanBiMo algorithm) were less biased by sampling incompleteness than binary metrics. Furthermore, the significance of some network metrics changed along the sampling effort gradient. Nevertheless, the higher importance of traits in structuring the network was apparent even with small sampling effort. Our results (i) warn against using very poorly sampled networks as this may bias our understanding of networks, both their patterns and structuring processes, (ii) encourage the use of quantitative metrics little influenced by sampling when performing spatio-temporal comparisons and (iii) indicate that in networks strongly constrained by species traits, such as plant-hummingbird networks, even small sampling is sufficient to detect their relative importance for the frequencies of interactions. Finally, we argue that similar effects of sampling are expected for other highly specialized subnetworks. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.
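The sampling-effort exercise described above can be mimicked with a small sketch: subsample a synthetic interaction count matrix at increasing fractions of the observations and recompute two simple quantitative metrics (connectance and a Shannon-type interaction evenness); these are simplified stand-ins for the indices used in the paper.

```python
# Effect of sampling effort on simple interaction-network metrics (synthetic matrix).
import numpy as np

rng = np.random.default_rng(10)
full = rng.poisson(lam=rng.gamma(0.5, 4.0, size=(12, 20)))   # plants x hummingbirds counts

def metrics(mat):
    links = mat > 0
    connectance = links.sum() / links.size
    p = mat[mat > 0].ravel() / mat.sum()
    evenness = -(p * np.log(p)).sum() / np.log(links.sum())  # Shannon evenness of interactions
    return connectance, evenness

events = np.repeat(np.arange(full.size), full.ravel())       # one entry per observed interaction
for frac in (0.1, 0.25, 0.5, 1.0):
    sub = rng.choice(events, size=int(frac * len(events)), replace=False)
    mat = np.bincount(sub, minlength=full.size).reshape(full.shape)
    c, e = metrics(mat)
    print(f"effort {frac:>4.0%}: connectance = {c:.2f}, interaction evenness = {e:.2f}")
```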
Abel, Gary A; Saunders, Catherine L; Lyratzopoulos, Georgios
2016-04-01
Surveys of the experience of cancer patients are increasingly being introduced in different countries and used in cancer epidemiology research. Sampling processes, post-sampling mortality and survey non-response can influence the representativeness of cancer patient surveys. We examined predictors of post-sampling mortality and non-response among patients initially included in the sampling frame of the English Cancer Patient Experience Survey. We also compared the respondents' diagnostic case-mix to other relevant populations of cancer patients, including incident and prevalent cases. Of 109,477 initially sampled cancer patients, 6273 (5.7%) died between sampling and survey mail-out. Older age and diagnosis of brain, lung and pancreatic cancer were associated with higher risk of post-sampling mortality. The overall response rate was 67% (67,713 respondents), being >70% for the most affluent patients and those diagnosed with colon or breast cancer and <50% for Asian or Black patients, those under 35 and those diagnosed with brain cancer. The diagnostic case-mix of respondents varied substantially from incident or prevalent cancer cases. Respondents to the English Cancer Patient Experience Survey represent a population of recently treated cancer survivors. Although patient survey data can provide unique insights for improving cancer care quality, features of survey populations need to be acknowledged when analysing and interpreting findings from studies using such data. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Production of continuous glass fiber using lunar simulant
NASA Technical Reports Server (NTRS)
Tucker, Dennis S.; Ethridge, Edwin C.; Curreri, Peter A.
1991-01-01
The processing parameters and mechanical properties of glass fibers pulled from simulated lunar basalt are tested. The simulant was prepared using a plasma technique. The composition is representative of a low titanium mare basalt (Apollo sample 10084). Lunar gravity experiments are to be performed utilizing parabolic aircraft free-fall maneuvers which yield 30 seconds of 1/6-g per maneuver.
Does the public notice visual resource problems on the federal estate?
John D. Peine
1979-01-01
Results of the 1977 survey of recreation on the Federal estate are highlighted. This survey represents a unique data set which was uniformly collected across all Federal land managing agencies and sections of the country. The on-site sampling procedures utilized in this survey process have never before been applied on such a large scale. Procedures followed and...
ERIC Educational Resources Information Center
Henderson, Charles; Dancy, Melissa; Niewiadomska-Bugaj, Magdalena
2012-01-01
During the fall of 2008 a web survey, designed to collect information about pedagogical knowledge and practices, was completed by a representative sample of 722 physics faculty across the United States (50.3% response rate). This paper presents partial results to describe how 20 potential predictor variables correlate with faculty knowledge about…
NASA Technical Reports Server (NTRS)
Haggerty, J. J.
1984-01-01
A pictorial resume that underlines the challenging nature of NASA programs and their extraordinary demands for technological input is presented. Also, NASA's current mainline programs, which require development of new technology, are given. A representative sampling of spinoff products and processes resulting from technology utilization, or secondary application, and the mechanisms NASA employs to stimulate technology utilization are provided. Contact sources for further information are presented.
Rare earth element geochemistry of outcrop and core samples from the Marcellus Shale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noack, Clinton W.; Jain, Jinesh C.; Stegmeier, John
2015-06-26
In this paper, we studied the geochemistry of the rare earth elements (REE) in eleven outcrop samples and six depth-interval samples of a core from the Marcellus Shale. The REE are classically applied analytes for investigating depositional environments and inferring geochemical processes, making them of interest as potential, naturally occurring indicators of fluid sources as well as indicators of geochemical processes in solid waste disposal. However, little is known of the REE occurrence in the Marcellus Shale or its produced waters, and this study represents one of the first thorough characterizations of the REE in the Marcellus Shale. In these samples, the abundance of REE and the fractionation of REE profiles were correlated with different mineral components of the shale. Namely, samples with a larger clay component were inferred to have higher absolute concentrations of REE but have less distinctive patterns. Conversely, samples with larger carbonate fractions exhibited a greater degree of fractionation, albeit with lower total abundance. Further study is necessary to determine release mechanisms, as well as REE fate-and-transport; however, these results have implications for future brine and solid waste management applications.
Tang, Gang; Hou, Wei; Wang, Huaqing; Luo, Ganggang; Ma, Jianwei
2015-01-01
The Shannon sampling principle requires substantial amounts of data to ensure the accuracy of on-line monitoring of roller bearing fault signals. Challenges are often encountered as a result of this cumbersome data monitoring; thus, a novel method focused on compressed vibration signals for detecting roller bearing faults is developed in this study. Considering that harmonics often represent the fault characteristic frequencies in vibration signals, a compressive sensing frame of characteristic harmonics is proposed to detect bearing faults. A compressed vibration signal is first acquired from a sensing matrix with information preserved through a well-designed sampling strategy. A reconstruction process of the under-sampled vibration signal is then pursued as attempts are conducted to detect the characteristic harmonics from sparse measurements through a compressive matching pursuit strategy. In the proposed method, bearing fault features depend on the existence of characteristic harmonics, which are typically detected directly from compressed data well before reconstruction is complete. The process of sampling and detection may then be performed simultaneously without complete recovery of the under-sampled signals. The effectiveness of the proposed method is validated by simulations and experiments. PMID:26473858
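The core idea in this abstract — recovering a handful of fault harmonics from far fewer measurements than Shannon sampling would demand — can be sketched with a random sensing matrix and orthogonal matching pursuit over a Fourier dictionary. The signal model, fault frequency, and matrix sizes below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

# Illustrative bearing fault signal: a few harmonics of a fault frequency.
fs, n = 2048, 2048                      # sampling rate (Hz) and full-length signal
t = np.arange(n) / fs
f_fault = 60.0                          # assumed fault characteristic frequency (Hz)
x = sum(a * np.sin(2 * np.pi * k * f_fault * t)
        for k, a in [(1, 1.0), (2, 0.6), (3, 0.3)])
x += 0.1 * np.random.randn(n)           # measurement noise

# Compressed acquisition: m << n random projections of the vibration signal.
m = 256
phi = np.random.randn(m, n) / np.sqrt(m)
y = phi @ x

# Sparse recovery in a real Fourier (cosine/sine) dictionary.
freqs = np.arange(1, n // 2) * fs / n
psi = np.hstack([np.cos(2 * np.pi * np.outer(t, freqs)),
                 np.sin(2 * np.pi * np.outer(t, freqs))])
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=6).fit(phi @ psi, y)
coefs = omp.coef_.reshape(2, -1)
amplitude = np.hypot(coefs[0], coefs[1])

# The largest coefficients should sit at the fault harmonics (~60, 120, 180 Hz),
# detected from the compressed data without reconstructing the full signal.
print(freqs[np.argsort(amplitude)[-3:]])
```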
Aspects of the tribological behaviour of powders recycled from rapid steel treated sub-zero
NASA Astrophysics Data System (ADS)
Radu, S.; Ciobanu, M.
2017-02-01
The recycling of high-alloyed steels represents a significant opportunity in powder metallurgy, as it permits the use of raw materials at relatively low prices compared with conventional methods. Recycling can be achieved by two routes: from spray debris resulting from worn cutting tools, and from chips resulting from drilling and the re-sharpening of tools. The research aims to confirm that waste from rapid steels can become, by successive processing, metal powders that can thereafter be used for removable-plate lathe cutting tools. After pressing and sintering the recycled powder, cylindrical samples were obtained and subsequently subjected to subcritical annealing. Wear tests conducted on a TRB-01-02541 tribometer confirmed that their wear resistance is superior to that of the same samples that were sintered, hardened and tempered in oil. This paper was accepted for publication in the Proceedings after a double peer-review process but was not presented at the ROTRIB'16 Conference.
Q-Sample Construction: A Critical Step for a Q-Methodological Study.
Paige, Jane B; Morin, Karen H
2016-01-01
Q-sample construction is a critical step in Q-methodological studies. Prior to conducting Q-studies, researchers start with a population of opinion statements (concourse) on a particular topic of interest from which a sample is drawn. These sampled statements are known as the Q-sample. Although literature exists on methodological processes to conduct Q-methodological studies, limited guidance exists on the practical steps to reduce the population of statements to a Q-sample. A case exemplar illustrates the steps to construct a Q-sample in preparation for a study that explored the perspectives nurse educators and nursing students hold about simulation design. Experts in simulation and Q-methodology evaluated the Q-sample for readability, clarity, and representativeness of the opinions contained within the concourse. The Q-sample was piloted, and feedback resulted in statement refinement. Researchers, especially those undertaking Q-method studies for the first time, may benefit from the practical considerations for constructing a Q-sample offered in this article. © The Author(s) 2014.
Processes in scientific workflows for information seeking related to physical sample materials
NASA Astrophysics Data System (ADS)
Ramdeen, S.
2014-12-01
The majority of State Geological Surveys have repositories containing cores, cuttings, fossils, or other physical sample material. State surveys maintain these collections to support their own research as well as research conducted by external users from other organizations, such as government agencies (state and federal), academia, industry, and the public. The preliminary results presented in this paper examine the research processes of these external users: in particular, how they discover, access, and use the digital surrogates that they rely on to evaluate and gain access to the physical items in these collections. Data such as physical samples are materials that cannot be completely replaced with digital surrogates. Digital surrogates may be represented as metadata, which enable discovery and ultimately access to these samples. These surrogates may be found in records, databases, publications, etc., but they do not eliminate the need for access to the physical item, which alone can be subjected to chemical testing and other similar analyses. The goal of this research is to document the various processes external users perform in order to access physical materials. Data for this study will be collected by conducting interviews with these external users. During the interviews, participants will be asked to describe the workflow that led them to interact with state survey repositories, and what steps they took afterward. High-level processes and categories of behavior will be identified. These processes will be used in the development of an information-seeking behavior model. This model may be used to facilitate the development of management tools and other aspects of cyberinfrastructure related to physical samples.
Reed, M.F.; Bartholomay, R.C.; Hughes, S.S.
1997-01-01
Thirty-nine samples of basaltic core were collected from wells 121 and 123, located approximately 1.8 km apart north and south of the Idaho Chemical Processing Plant at the Idaho National Engineering Laboratory. Samples were collected from depths ranging from 15 to 221 m below land surface for the purpose of establishing stratigraphic correlations between these two wells. Elemental analyses indicate that the basalts consist of three principal chemical types. Two of these types are each represented by a single basalt flow in each well. The third chemical type is represented by many basalt flows and includes a broad range of chemical compositions that is distinguished from the other two types. Basalt flows within the third type were identified by hierarchical K-cluster analysis of 14 representative elements: Fe, Ca, K, Na, Sc, Co, La, Ce, Sm, Eu, Yb, Hf, Ta, and Th. Cluster analyses indicate correlations of basalt flows between wells 121 and 123 at depths of approximately 38-40 m, 125-128 m, 131-137 m, 149-158 m, and 183-198 m. Probable correlations also are indicated for at least seven other depth intervals. Basalt flows in several depth intervals do not correlate on the basis of chemical compositions, thus reflecting possible flow margins in the sequence between the wells. Multi-element chemical data provide a useful method for determining stratigraphic correlations of basalt in the upper 1-2 km of the eastern Snake River Plain.
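A correlation exercise of this kind can be prototyped with ordinary hierarchical clustering on standardized element concentrations. The synthetic concentration matrix, the number of clusters, and the Ward linkage below are assumptions made only to illustrate the workflow, not the analysis actually used for wells 121 and 123.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

elements = ["Fe", "Ca", "K", "Na", "Sc", "Co", "La", "Ce",
            "Sm", "Eu", "Yb", "Hf", "Ta", "Th"]

# Hypothetical concentration matrix: rows are core samples from the two wells,
# columns are the 14 elements used in the cluster analysis.
rng = np.random.default_rng(0)
conc = rng.lognormal(mean=1.0, sigma=0.3, size=(39, len(elements)))

# Standardize each element so that abundant majors do not dominate the distances.
z = (conc - conc.mean(axis=0)) / conc.std(axis=0)

# Agglomerative (hierarchical) clustering; flows that fall in the same cluster
# across both wells are candidate stratigraphic correlations.
tree = linkage(z, method="ward")
flow_groups = fcluster(tree, t=3, criterion="maxclust")
print(flow_groups)
```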
SLUDGE BATCH 7B QUALIFICATION ACTIVITIES WITH SRS TANK FARM SLUDGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pareizs, J.; Click, D.; Lambert, D.
2011-11-16
Waste Solidification Engineering (WSE) has requested that characterization and a radioactive demonstration of the next batch of sludge slurry - Sludge Batch 7b (SB7b) - be completed in the Shielded Cells Facility of the Savannah River National Laboratory (SRNL) via a Technical Task Request (TTR). This characterization and demonstration, or sludge batch qualification process, is required prior to transfer of the sludge from Tank 51 to the Defense Waste Processing Facility (DWPF) feed tank (Tank 40). The current WSE practice is to prepare sludge batches in Tank 51 by transferring sludge from other tanks. Discharges of nuclear materials from H Canyon are often added to Tank 51 during sludge batch preparation. The sludge is washed and transferred to Tank 40, the current DWPF feed tank. Prior to transfer of Tank 51 to Tank 40, SRNL typically simulates the Tank Farm and DWPF processes with a Tank 51 sample (referred to as the qualification sample). With the tight schedule constraints for SB7b and the potential need for caustic addition to allow for an acceptable glass processing window, the qualification for SB7b was approached differently than past batches. For SB7b, SRNL prepared a Tank 51 and a Tank 40 sample for qualification. SRNL did not receive the qualification sample from Tank 51 nor did it simulate all of the Tank Farm washing and decanting operations. Instead, SRNL prepared a Tank 51 SB7b sample from samples of Tank 7 and Tank 51, along with a wash solution to adjust the supernatant composition to the final SB7b Tank 51 Tank Farm projections. SRNL then prepared a sample to represent SB7b in Tank 40 by combining portions of the SRNL-prepared Tank 51 SB7b sample and a Tank 40 Sludge Batch 7a (SB7a) sample. The blended sample was 71% Tank 40 (SB7a) and 29% Tank 7/Tank 51 on an insoluble solids basis. This sample is referred to as the SB7b Qualification Sample. The blend represented the highest projected Tank 40 heel (as of May 25, 2011), and thus, the highest projected noble metals content for SB7b. Characterization was performed on the Tank 51 SB7b samples and SRNL performed DWPF simulations using the Tank 40 SB7b material. This report documents: (1) The preparation and characterization of the Tank 51 SB7b and Tank 40 SB7b samples. (2) The performance of a DWPF Chemical Process Cell (CPC) simulation using the SB7b Tank 40 sample. The simulation included a Sludge Receipt and Adjustment Tank (SRAT) cycle, where acid was added to the sludge to destroy nitrite and reduce mercury, and a Slurry Mix Evaporator (SME) cycle, where glass frit was added to the sludge in preparation for vitrification. The SME cycle also included replication of five canister decontamination additions and concentrations. Processing parameters were based on work with a nonradioactive simulant. (3) Vitrification of a portion of the SME product and characterization and durability testing (as measured by the Product Consistency Test (PCT)) of the resulting glass. (4) Rheology measurements of the SRAT receipt, SRAT product, and SME product. This program was controlled by a Task Technical and Quality Assurance Plan (TTQAP), and analyses were guided by an Analytical Study Plan. This work is Technical Baseline Research and Development (R&D) for the DWPF. It should be noted that much of the data in this document has been published in interoffice memoranda.
The intent of this technical report is to bring all of the SB7b-related data together in a single permanent record and to discuss the overall aspects of SB7b processing.
NASA Astrophysics Data System (ADS)
Yan, Yifang; Yang, Chunyu; Ma, Xiaoping; Zhou, Linna
2018-02-01
In this paper, the sampled-data H∞ filtering problem is considered for Markovian jump singularly perturbed systems with time-varying delay and missing measurements. The sampled-data system is represented by a time-delay system, and the missing-measurement phenomenon is described by an independent Bernoulli random process. By constructing an ɛ-dependent stochastic Lyapunov-Krasovskii functional, delay-dependent sufficient conditions are derived such that the filter error system satisfies the prescribed H∞ performance for all possible missing measurements. Then, an H∞ filter design method is proposed in terms of linear matrix inequalities. Finally, numerical examples are given to illustrate the feasibility and advantages of the obtained results.
Investigation of digital encoding techniques for television transmission
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1983-01-01
Composite color television signals are sampled at four times the color subcarrier and transformed using intraframe two-dimensional Walsh functions. It is shown that by properly sampling a composite color signal and employing a Walsh transform, the YIQ time signals which sum to produce the composite color signal can be represented, in the transform domain, by three component signals in space. By suitable zonal quantization of the transform coefficients, the YIQ signals can be processed independently to achieve data compression and obtain the same results as component coding. Computer simulations of three bandwidth compressors operating at 1.09, 1.53 and 1.8 bits/sample are presented. The above results can also be applied to the PAL color system.
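The transform-and-quantize step described above can be illustrated with a two-dimensional Walsh–Hadamard transform of a small block followed by zonal quantization that keeps a fixed coefficient zone. The block size, the retained zone, and the use of the naturally ordered (rather than sequency-ordered) Hadamard matrix are simplifications for the sketch, not parameters from the study.

```python
import numpy as np
from scipy.linalg import hadamard

def walsh_2d(block):
    """Two-dimensional Walsh-Hadamard transform of a square block."""
    n = block.shape[0]
    h = hadamard(n)
    return h @ block @ h.T / n

def inverse_walsh_2d(coeffs):
    """Inverse of walsh_2d (H is orthogonal up to the factor n)."""
    n = coeffs.shape[0]
    h = hadamard(n)
    return h.T @ coeffs @ h / n

# Hypothetical 8x8 block of sampled composite video (values are arbitrary).
block = np.random.default_rng(1).uniform(0, 255, size=(8, 8))

coeffs = walsh_2d(block)

# Zonal quantization: retain only a fixed zone of coefficients (upper-left
# corner of the naturally ordered transform) and discard the rest.
zone = np.add.outer(np.arange(8), np.arange(8)) < 4
compressed = np.where(zone, coeffs, 0.0)

reconstructed = inverse_walsh_2d(compressed)
print(np.abs(block - reconstructed).mean())   # reconstruction error after compression
```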
New Noble Gas Studies on Popping Rocks from the Mid-Atlantic Ridge near 14°N
NASA Astrophysics Data System (ADS)
Kurz, M. D.; Curtice, J.; Jones, M.; Péron, S.; Wanless, V. D.; Mittelstaedt, E. L.; Soule, S. A.; Klein, F.; Fornari, D. J.
2017-12-01
New Popping Rocks were recovered in situ on the Mid-Atlantic Ridge (MAR) near 13.77° N, using HOV Alvin on cruise AT33-03 in 2016 on RV Atlantis. We report new helium, neon, argon, and CO2 step-crushing measurements on a subset of the glass samples, with a focus on a new procedure to collect seafloor samples with minimal exposure to air. Glassy seafloor basalts were collected in sealed containers using the Alvin mechanical arm and transported to the surface without atmospheric exposure. On the ship, the seawater was drained, the volcanic glass was transferred to stainless steel ultra-high-vacuum containers (in an oxygen-free glove box), which were then evacuated using a turbo-molecular pump and sealed for transport under vacuum. All processing was carried out under a nitrogen atmosphere. A control sample was collected from each pillow outcrop and processed normally in air. The preliminary step-crushing measurements show that the anaerobically collected samples have systematically higher 20Ne/22Ne, 21Ne/22Ne and 40Ar/36Ar than the control samples. Helium abundances and isotopes are consistent between anaerobically collected samples and control samples. These results suggest that minimizing atmospheric exposure during sample processing can significantly reduce air contamination for heavy noble gases, providing a new option for seafloor sampling. Higher vesicle abundances appear to yield a greater difference in neon and argon isotopes between the anaerobic and control samples, suggesting that atmospheric contamination is related to vesicle abundance, possibly through micro-fractures. The new data show variability in the maximum mantle neon and argon isotopic compositions, and abundance ratios, suggesting that the samples experienced variable outgassing prior to eruption, and may represent different phases of a single eruption, or multiple eruptions.
Review of image processing fundamentals
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1985-01-01
Image processing through convolution, transform coding, spatial frequency alterations, sampling, and interpolation are considered. It is postulated that convolution in one domain (real or frequency) is equivalent to multiplication in the other (frequency or real), and that the relative amplitudes of the Fourier components must be retained to reproduce any waveshape. It is suggested that all digital systems may be considered equivalent, with a frequency content approximately at the Nyquist limit, and with a Gaussian frequency response. An optimized cubic version of the interpolation continuum image is derived as a set of cubic splines. Pixel replication has been employed to enlarge the visible area of digital samples; however, suitable elimination of the extraneous high frequencies involved in the visible edges, by defocusing, is necessary to allow the underlying object represented by the data values to be seen.
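The postulated equivalence between convolution in one domain and multiplication in the other is the convolution theorem; a few lines of numerical checking make it concrete. The signals below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(64)   # arbitrary signal
h = rng.standard_normal(64)   # arbitrary kernel

# Circular convolution computed directly in the "real" (spatial/time) domain.
direct = np.array([sum(x[m] * h[(k - m) % 64] for m in range(64)) for k in range(64)])

# The same result obtained by multiplication in the frequency domain.
via_fft = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

print(np.allclose(direct, via_fft))   # True: convolution <-> multiplication
```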
Chandarana, Keval; Drew, Megan E; Emmanuel, Julian; Karra, Efthimia; Gelegen, Cigdem; Chan, Philip; Cron, Nicholas J; Batterham, Rachel L
2009-06-01
Gut hormones represent attractive therapeutic targets for the treatment of obesity and type 2 diabetes. However, controversy surrounds the effects that adiposity, dietary manipulations, and bariatric surgery have on their circulating concentrations. We sought to determine whether these discrepancies are due to methodologic differences. Ten normal-weight males participated in a 4-way crossover study investigating whether fasting appetite scores, plasma acyl-ghrelin, active glucagon-like peptide-1 (GLP-1), and peptide YY3-36 (PYY3-36) levels are altered by study-induced stress, prior food consumption, and sample processing. Study visit order affected anxiety, plasma cortisol, and temporal profiles of appetite and plasma PYY3-36, with increased anxiety and cortisol concentrations on the first study day. Plasma cortisol area under the curve (AUC) correlated positively with plasma PYY3-36 AUC. Despite a 14-hour fast, baseline hunger, PYY3-36 concentrations, temporal appetite profiles, PYY3-36 AUC, and active GLP-1 were affected by the previous evening's meal. Sample processing studies revealed that sample acidification and esterase inhibition are required when measuring acyl-ghrelin and dipeptidyl-peptidase IV inhibitor addition for active GLP-1. However, plasma PYY3-36 concentrations were unaffected by addition of dipeptidyl-peptidase IV. Accurate assessment of appetite, feeding behavior, and gut hormone concentrations requires standardization of prior food consumption and subject acclimatization to the study protocol. Moreover, because of the labile nature of acyl-ghrelin and active GLP-1, specialized sample processing needs to be undertaken.
Ghrefat, H.A.; Goodell, P.C.; Hubbard, B.E.; Langford, R.P.; Aldouri, R.E.
2007-01-01
Visible and Near-Infrared (VNIR) through Short Wavelength Infrared (SWIR) (0.4–2.5 μm) AVIRIS data, along with laboratory spectral measurements and analyses of field samples, were used to characterize grain size variations in aeolian gypsum deposits across barchan-transverse, parabolic, and barchan dunes at White Sands, New Mexico, USA. All field samples contained a mineralogy of ∼100% gypsum. In order to document grain size variations at White Sands, surficial gypsum samples were collected along three transects parallel to the prevailing downwind direction. Grain size analyses were carried out on the samples by sieving them into seven size fractions ranging from 45 to 621 μm, which were subjected to spectral measurements. Absorption band depths of the size fractions were determined after applying an automated continuum-removal procedure to each spectrum. Then, the relationship between absorption band depth and gypsum size fraction was established using a linear regression. Three software processing steps were carried out to measure the grain size variations of gypsum in the Dune Area using AVIRIS data. AVIRIS mapping results, field work and laboratory analysis all show that the interdune areas have lower absorption band depth values and consist of finer grained gypsum deposits. In contrast, the dune crest areas have higher absorption band depth values and consist of coarser grained gypsum deposits. Based on laboratory estimates, a representative barchan-transverse dune (Transect 1) has a mean grain size of 1.16 φ (449 μm). The error bar results show that the error ranges from −50 to +50 μm. Mean grain size for a representative parabolic dune (Transect 2) is 1.51 φ (352 μm), and 1.52 φ (347 μm) for a representative barchan dune (Transect 3). T-test results confirm that there are differences in the grain size distributions between barchan and parabolic dunes and between interdune and dune crest areas. The t-test results also show that there are no significant differences between modeled and laboratory-measured grain size values. Hyperspectral grain size modeling can help to determine dynamic processes shaping the formation of the dunes, such as wind directions and the relative strengths of winds through time. This has implications for studying such processes on other planetary landforms that have mineralogy with unique absorption bands in VNIR-SWIR hyperspectral data. © 2006 Elsevier B.V. All rights reserved.
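The calibration step — a linear regression between continuum-removed absorption band depth and gypsum grain size, later inverted to map band depths measured in AVIRIS pixels back to grain size — reduces to a short calculation. The band-depth values below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical laboratory calibration: band depth of the continuum-removed
# absorption feature for each sieved gypsum size fraction (um).
grain_size = np.array([45, 75, 125, 212, 300, 450, 621], dtype=float)
band_depth = np.array([0.12, 0.16, 0.22, 0.29, 0.35, 0.43, 0.50])

# Linear regression of band depth on grain size (the calibration curve).
slope, intercept = np.polyfit(grain_size, band_depth, deg=1)

# Invert the calibration to estimate grain size from a band depth measured
# in an AVIRIS pixel after the same continuum-removal procedure.
pixel_band_depth = 0.31
estimated_size = (pixel_band_depth - intercept) / slope
print(round(estimated_size), "um")
```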
Vongkamjan, Kitiya; Benjakul, Soottawat; Kim Vu, Hue Thi; Vuddhakul, Varaporn
2017-09-01
Listeria monocytogenes is a foodborne pathogen commonly found in seafood processing environments, thus presenting a challenge for eradication from seafood processing facilities. Monitoring the prevalence and subtype diversity of L. monocytogenes together with phages that are specific to Listeria spp. ("Listeria phages") will provide knowledge on the bacteria-phage ecology in food processing plants. In this work, a total of 595 samples were collected from raw material, finished seafood products and environmental samples from different sites of a seafood processing plant during 17 sampling visits in 1.5 years of study. L. monocytogenes and Listeria spp. (non-monocytogenes) were found in 22 (3.7%) and 43 (7.2%) samples, respectively, whereas 29 Listeria phages were isolated from 9 (1.5%) phage-positive samples. DNA fingerprint analysis of L. monocytogenes isolates revealed 11 Random Amplified Polymorphic DNA (RAPD) profiles, with two subtypes frequently observed over time. Our data reveal the presence of Listeria phages within the same seafood processing environments where a diverse set of L. monocytogenes subtypes was also found. Although serotype 4b was observed at lower frequency, the data indicate that isolates from this seafood processing plant belonged to both epidemiologically important serotypes 1/2a and 4b, which may suggest a potential public health risk. Phages (all showing a uniform genome size of 65 ± 2 kb) were classified into 9 host range groups, representing both broad and narrow host ranges. While most L. monocytogenes isolates from this facility were susceptible to phages, five isolates showed resistance to 12-20 phages. Variations in phage host range among Listeria phages isolated from the food processing plant may affect the presence of a diverse set of L. monocytogenes isolates derived from the same processing environment in Thailand. Copyright © 2017 Elsevier Ltd. All rights reserved.
Remote sensing of coal mine pollution in the upper Potomac River basin
NASA Technical Reports Server (NTRS)
1974-01-01
A survey of remote sensing data pertinent to locating and monitoring sources of pollution resulting from surface and shaft mining operations was conducted in order to determine the various methods by which ERTS and aircraft remote sensing data can be used as a replacement for, or a supplement to traditional methods of monitoring coal mine pollution of the upper Potomac Basin. The gathering and analysis of representative samples of the raw and processed data obtained during the survey are described, along with plans to demonstrate and optimize the data collection processes.
2007-11-15
Picking a Representative Sample for CMMI® Enterprise Appraisals. Raytheon Intelligence and Information Systems (IIS) Enterprise CMMI® ML3 SCAMPI(SM) SE/SW/IPPD/SS #5382. Raymond L. Kile, SEI Authorized Lead Appraiser; Kathryn Kirby, Raytheon IIS Process Assessments IPT Lead.
Kerfriden, P.; Schmidt, K.M.; Rabczuk, T.; Bordas, S.P.A.
2013-01-01
We propose to identify process zones in heterogeneous materials by tailored statistical tools. The process zone is redefined as the part of the structure where the random process cannot be correctly approximated in a low-dimensional deterministic space. Such a low-dimensional space is obtained by a spectral analysis performed on pre-computed solution samples. A greedy algorithm is proposed to identify both process zone and low-dimensional representative subspace for the solution in the complementary region. In addition to the novelty of the tools proposed in this paper for the analysis of localised phenomena, we show that the reduced space generated by the method is a valid basis for the construction of a reduced order model. PMID:27069423
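The central test — flag the region where a low-dimensional basis built from pre-computed solution samples fails to approximate the random process — can be sketched with a plain singular value decomposition. The snapshot data, basis size, and error tolerance below are illustrative assumptions rather than the authors' greedy algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical snapshots: each column is one pre-computed solution sample over
# 200 spatial points; points 80-120 behave randomly (the "process zone").
n_points, n_samples = 200, 40
smooth = np.outer(np.sin(np.linspace(0, np.pi, n_points)), rng.standard_normal(n_samples))
snapshots = smooth.copy()
snapshots[80:120, :] += 0.5 * rng.standard_normal((120 - 80, n_samples))

# Spectral analysis: build a low-dimensional basis from the leading left
# singular vectors of the snapshot matrix.
u, s, vt = np.linalg.svd(snapshots, full_matrices=False)
basis = u[:, :3]

# Pointwise approximation error of the reduced (low-dimensional) representation.
approx = basis @ (basis.T @ snapshots)
error = np.sqrt(np.mean((snapshots - approx) ** 2, axis=1))

# Points whose error exceeds a tolerance are assigned to the process zone;
# the complementary region is where a reduced-order model is considered valid.
process_zone = np.where(error > 0.1)[0]
print(process_zone.min(), process_zone.max())
```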
Possible Detection of Perchlorates by Evolved Gas Analysis of Rocknest Soils: Global Implication
NASA Technical Reports Server (NTRS)
Archer, P. D., Jr.; Sutter, B.; Ming, D. W.; McKay, C. P.; Navarro-Gonzalez, R.; Franz, H. B.; McAdam, A.; Mahaffy, P. R.
2013-01-01
The Sample Analysis at Mars (SAM) instrument suite on board the Mars Science Laboratory (MSL) recently ran four samples from an aeolian bedform named Rocknest. Rocknest was selected as the source of the first samples analyzed because it is representative of both windblown material in Gale crater as well as the globally-distributed dust. The four samples analyzed by SAM were portioned from the fifth scoop at this location. The material delivered to SAM passed through a 150 μm sieve and should have been well mixed during the sample acquisition/preparation/handoff process. Rocknest samples were heated to 835 °C at a 35 °C/minute ramp rate with a He carrier gas flow rate of 1.5 standard cubic centimeters per minute and at an oven pressure of 30 mbar. Evolved gases were detected by a quadrupole mass spectrometer (QMS).
Alpha Matting with KL-Divergence Based Sparse Sampling.
Karacan, Levent; Erdem, Aykut; Erdem, Erkut
2017-06-22
In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on the KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
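One way to realize the proposed dissimilarity is to fit a Gaussian to the features collected in the vicinity of each sample and evaluate the closed-form KL-divergence between the two Gaussians; the feature vectors below are synthetic and the Gaussian assumption is ours, chosen only to keep the sketch compact.

```python
import numpy as np

def gaussian_kl(mu_p, cov_p, mu_q, cov_q):
    """KL( N(mu_p, cov_p) || N(mu_q, cov_q) ) in closed form."""
    d = mu_p.size
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_q - mu_p
    return 0.5 * (np.trace(cov_q_inv @ cov_p)
                  + diff @ cov_q_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p)))

rng = np.random.default_rng(4)

# Hypothetical RGB features extracted in the vicinity of two candidate samples.
features_a = rng.normal(loc=[0.8, 0.2, 0.2], scale=0.05, size=(50, 3))
features_b = rng.normal(loc=[0.7, 0.3, 0.2], scale=0.08, size=(50, 3))

def fit(f):
    # Sample mean and covariance, with a small ridge for numerical stability.
    return f.mean(axis=0), np.cov(f, rowvar=False) + 1e-6 * np.eye(f.shape[1])

mu_a, cov_a = fit(features_a)
mu_b, cov_b = fit(features_b)

# Symmetrized KL-divergence used as the dissimilarity between the two samples.
dissimilarity = gaussian_kl(mu_a, cov_a, mu_b, cov_b) + gaussian_kl(mu_b, cov_b, mu_a, cov_a)
print(dissimilarity)
```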
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robbins, G.A.; Brandes, S.D.; Winschel, R.A.
1995-05-01
The objectives of this project are to support the DOE direct coal liquefaction process development program and to improve the useful application of analytical chemistry to direct coal liquefaction process development. Independent analyses by well-established methods will be obtained of samples produced in direct coal liquefaction processes under evaluation by DOE. Additionally, analytical instruments and techniques which are currently underutilized for the purpose of examining coal-derived samples will be evaluated. The data obtained from this study will be used to help guide current process development and to develop an improved data base on coal and coal liquids properties. A sample bank will be established and maintained for use in this project and will be available for use by other researchers. The reactivity of the non-distillable resids toward hydrocracking at liquefaction conditions (i.e., resid reactivity) will be examined. From the literature and data experimentally obtained, a mathematical kinetic model of resid conversion will be constructed. It is anticipated that such a model will provide insights useful for improving process performance and thus the economics of direct coal liquefaction. During this quarter, analyses were completed on 65 process samples from representative periods of HRI Run POC-2 in which coal, coal/plastics, and coal/rubber were the feedstocks. A sample of the oil phase of the oil/water separator from HRI Run POC-1 was analyzed to determine the types and concentrations of phenolic compounds. Chemical analyses and microautoclave tests were performed to monitor the oxidation and measure the reactivity of the standard coal (Old Ben Mine No. 1) which has been used for the last six years to determine solvent quality of process oils analyzed in this and previous DOE contracts.
Kiepper, B H; Merka, W C; Fletcher, D L
2008-08-01
An experiment was conducted to compare the proximate composition of particulate matter recovered from poultry processing wastewater (PPW) generated by broiler slaughter plants. Poultry processing wastewater is the cumulative wastewater stream generated during the processing of poultry following primary and secondary physical screening (typically to 500 μm) that removes gross offal. Composite samples of PPW from 3 broiler slaughter plants (southeast United States) were collected over 8 consecutive weeks. All 3 broiler slaughter plants process young chickens with an average live weight of 2.0 kg. At each plant, a single 72-L composite sample was collected using an automatic sampler programmed to collect 1 L of wastewater every 20 min for 24 h during one normal processing day each week. Each composite sample was thoroughly mixed, and 60 L was passed through a series of sieves (2.0 mm, 1.0 mm, 500 μm, and 53 μm). The amount of particulate solids collected on the 2.0 mm, 1.0 mm, and 500 μm sieves was insignificant. The solids recovered from the 53-μm sieve were subjected to proximate analysis to determine percent moisture, fat, protein, ash, and fiber. The average percentages of fat, protein, ash, and fiber for all samples on a dry-weight basis were 55.3, 27.1, 6.1, and 4.1, respectively. Fat made up over half of the dry-weight matter recovered, representing PPW particulate matter between 500 and 53 μm. Despite the variation in number of birds processed daily, further processing operations, and number and type of wastewater screens utilized, there were no significant differences in percentage of fat and fiber between the slaughter plants. There were significant differences in percent protein and ash between the slaughter plants.
Transactional processes in the development of adult personality disorder symptoms.
Carlson, Elizabeth A; Ruiz, Sarah K
2016-08-01
The development of adult personality disorder symptoms, including transactional processes of relationship representational and behavioral experience from infancy to early adolescence, was examined using longitudinal data from a risk sample (N = 162). Significant preliminary correlations were found between early caregiving experience and adult personality disorder symptoms and between representational and behavioral indices across time and adult symptomatology. Significant correlations were also found among diverse representational assessments (e.g., interview, drawing, and projective narrative) and between concurrent representational and observational measures of relationship functioning. Path models were analyzed to investigate the combined relations of caregiving experience in infancy; relationship representation and experience in early childhood, middle childhood, and early adolescence; and personality disorder symptoms in adulthood. The hypothesized model representing interactive contributions of representational and behavioral experience represented the data significantly better than competing models representing noninteractive contributions. Representational and behavioral indicators mediated the link between early caregiving quality and personality disorder symptoms. The findings extend previous studies of normative development and support an organizational developmental view that early relationship experiences contribute to socioemotional maladaptation as well as adaptation through the progressive transaction of mutually informing expectations and experience.
Phylogenetic Copy-Number Factorization of Multiple Tumor Samples.
Zaccaria, Simone; El-Kebir, Mohammed; Klau, Gunnar W; Raphael, Benjamin J
2018-04-16
Cancer is an evolutionary process driven by somatic mutations. This process can be represented as a phylogenetic tree. Constructing such a phylogenetic tree from genome sequencing data is a challenging task due to the many types of mutations in cancer and the fact that nearly all cancer sequencing is of a bulk tumor, measuring a superposition of somatic mutations present in different cells. We study the problem of reconstructing tumor phylogenies from copy-number aberrations (CNAs) measured in bulk-sequencing data. We introduce the Copy-Number Tree Mixture Deconvolution (CNTMD) problem, which aims to find the phylogenetic tree with the fewest number of CNAs that explain the copy-number data from multiple samples of a tumor. We design an algorithm for solving the CNTMD problem and apply the algorithm to both simulated and real data. On simulated data, we find that our algorithm outperforms existing approaches that either perform deconvolution/factorization of mixed tumor samples or build phylogenetic trees assuming homogeneous tumor samples. On real data, we analyze multiple samples from a prostate cancer patient, identifying clones within these samples and a phylogenetic tree that relates these clones and their differing proportions across samples. This phylogenetic tree provides a higher resolution view of copy-number evolution of this cancer than published analyses.
Gaussian process based intelligent sampling for measuring nano-structure surfaces
NASA Astrophysics Data System (ADS)
Sun, L. J.; Ren, M. J.; Yin, Y. H.
2016-09-01
Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopy, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore intelligent sampling strategies are required to improve the scanning efficiency when measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method makes use of Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Then each sampling point is adaptively selected by determining the position which is the most likely to lie outside the required tolerance zone among the candidates, and is then inserted to update the model iteratively. Simulations on both the nominal surface and a manufactured surface have been conducted on nano-structure surfaces to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency in measuring large-area structured surfaces.
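A minimal version of the adaptive loop — fit a Gaussian process to the points measured so far, then measure next wherever the predicted surface is most likely to fall outside the tolerance zone — might look like the sketch below. The surface function, tolerance, kernel, and iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def surface(x):
    """Stand-in for the measured nano-structure height (nm) at position x."""
    return 5.0 * np.sin(3.0 * x) + 2.0 * np.sin(11.0 * x)

tolerance = 6.0                                    # +/- tolerance zone (nm)
candidates = np.linspace(0.0, 2.0, 400)[:, None]   # candidate probe positions

# Start from a coarse set of measured points.
x_meas = np.linspace(0.0, 2.0, 5)[:, None]
y_meas = surface(x_meas.ravel())

kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-3)
for _ in range(15):
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x_meas, y_meas)
    mean, std = gp.predict(candidates, return_std=True)

    # Probability that the surface at each candidate lies outside the tolerance zone.
    p_out = norm.cdf(-tolerance, mean, std) + 1.0 - norm.cdf(tolerance, mean, std)

    # Adaptively measure the most critical candidate and update the model.
    nxt = candidates[np.argmax(p_out)]
    x_meas = np.vstack([x_meas, nxt])
    y_meas = np.append(y_meas, surface(nxt[0]))

print(len(x_meas), "points sampled adaptively")
```

The same loop could instead stop once the maximum out-of-tolerance probability drops below a preset threshold, which is one natural way to trade measurement time against confidence.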
Temporal Variability of Microplastic Concentrations in Freshwater Streams
NASA Astrophysics Data System (ADS)
Watkins, L.; Walter, M. T.
2016-12-01
Plastic pollution, specifically the size fraction less than 5mm known as microplastics, is an emerging contaminant in waterways worldwide. The ability of microplastics to adsorb and transport contaminants and microbes, as well as be ingested by organisms, makes them a concern in both freshwater and marine ecosystems. Recent efforts to determine the extent of microplastic pollution are increasingly focused on freshwater systems, but most studies have reported concentrations at a single time-point; few have begun to uncover how plastic concentrations in riverine systems may change through time. We hypothesize the time of day and season of sampling influences the concentrations of microplastics in water samples and more specifically, that daytime stormflow samples contain the highest microplastic concentrations due to maximized runoff and wastewater discharge. In order to test this hypothesis, we sampled in two similar streams in Ithaca, New York using a 333µm mesh net deployed within the thalweg. Repeat samples were collected to identify diurnal patterns as well as monthly variation. Samples were processed in the laboratory following the NOAA wet peroxide oxidation protocol. This work improves our ability to interpret existing single-time-point survey results by providing information on how microplastic concentrations change over time and whether concentrations in existing stream studies are likely representative of their location. Additionally, these results will inform future studies by providing insight into representative sample timing and capturing temporal trends for the purposes of modeling and of developing regulations for microplastic pollution.
Gobrecht, Alexia; Bendoula, Ryad; Roger, Jean-Michel; Bellon-Maurel, Véronique
2015-01-01
Visible and Near Infrared (Vis-NIR) Spectroscopy is a powerful non destructive analytical method used to analyze major compounds in bulk materials and products and requiring no sample preparation. It is widely used in routine analysis and also in-line in industries, in-vivo with biomedical applications or in-field for agricultural and environmental applications. However, highly scattering samples subvert Beer-Lambert law's linear relationship between spectral absorbance and the concentrations. Instead of spectral pre-processing, which is commonly used by Vis-NIR spectroscopists to mitigate the scattering effect, we put forward an optical method, based on Polarized Light Spectroscopy to improve the absorbance signal measurement on highly scattering samples. This method selects part of the signal which is less impacted by scattering. The resulted signal is combined in the Absorption/Remission function defined in Dahm's Representative Layer Theory to compute an absorbance signal fulfilling Beer-Lambert's law, i.e. being linearly related to concentration of the chemicals composing the sample. The underpinning theories have been experimentally evaluated on scattering samples in liquid form and in powdered form. The method produced more accurate spectra and the Pearson's coefficient assessing the linearity between the absorbance spectra and the concentration of the added dye improved from 0.94 to 0.99 for liquid samples and 0.84-0.97 for powdered samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Stelpflug, Scott C.; Sekhon, Rajandeep S.; Vaillancourt, Brieanne; ...
2015-12-30
Comprehensive and systematic transcriptome profiling provides valuable insight into biological and developmental processes that occur throughout the life cycle of a plant. We have enhanced our previously published microarray-based gene atlas of maize (Zea mays L.) inbred B73 to now include 79 distinct replicated samples that have been interrogated using RNA sequencing (RNA-seq). The current version of the atlas includes 50 original array-based gene atlas samples, a time-course of 12 stalk and leaf samples postflowering, and an additional set of 17 samples from the maize seedling and adult root system. The entire dataset contains 4.6 billion mapped reads, with an average of 20.5 million mapped reads per biological replicate, allowing for detection of genes with lower transcript abundance. As the new root samples represent key additions to the previously examined tissues, we highlight insights into the root transcriptome, which is represented by 28,894 (73.2%) annotated genes in maize. Additionally, we observed remarkable expression differences across both the longitudinal (four zones) and radial gradients (cortical parenchyma and stele) of the primary root supported by fourfold differential expression of 9353 and 4728 genes, respectively. Among the latter were 1110 genes that encode transcription factors, some of which are orthologs of previously characterized transcription factors known to regulate root development in Arabidopsis thaliana (L.) Heynh., while most are novel, and represent attractive targets for reverse genetics approaches to determine their roles in this important organ. As a result, this comprehensive transcriptome dataset is a powerful tool toward understanding maize development, physiology, and phenotypic diversity.
DeLeon-Rodriguez, Natasha; Lathem, Terry L.; Rodriguez-R, Luis M.; Barazesh, James M.; Anderson, Bruce E.; Beyersdorf, Andreas J.; Ziemba, Luke D.; Bergin, Michael; Nenes, Athanasios; Konstantinidis, Konstantinos T.
2013-01-01
The composition and prevalence of microorganisms in the middle-to-upper troposphere (8–15 km altitude) and their role in aerosol-cloud-precipitation interactions represent important, unresolved questions for biological and atmospheric science. In particular, airborne microorganisms above the oceans remain essentially uncharacterized, as most work to date is restricted to samples taken near the Earth’s surface. Here we report on the microbiome of low- and high-altitude air masses sampled onboard the National Aeronautics and Space Administration DC-8 platform during the 2010 Genesis and Rapid Intensification Processes campaign in the Caribbean Sea. The samples were collected in cloudy and cloud-free air masses before, during, and after two major tropical hurricanes, Earl and Karl. Quantitative PCR and microscopy revealed that viable bacterial cells represented on average around 20% of the total particles in the 0.25- to 1-μm diameter range and were at least an order of magnitude more abundant than fungal cells, suggesting that bacteria represent an important and underestimated fraction of micrometer-sized atmospheric aerosols. The samples from the two hurricanes were characterized by significantly different bacterial communities, revealing that hurricanes aerosolize a large amount of new cells. Nonetheless, 17 bacterial taxa, including taxa that are known to use C1–C4 carbon compounds present in the atmosphere, were found in all samples, indicating that these organisms possess traits that allow survival in the troposphere. The findings presented here suggest that the microbiome is a dynamic and underappreciated aspect of the upper troposphere with potentially important impacts on the hydrological cycle, clouds, and climate. PMID:23359712
Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis
Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín
2010-01-01
Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
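The two standard routes for propagating sampling and analysis errors into a derived fuel property — first-order (Taylor) propagation and Monte Carlo resampling — can be compared in a few lines. The example quantity (ash content corrected to a dry basis) and the uncertainty figures are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical prompt-analysis results: ash fraction as received and moisture,
# each with an absolute standard uncertainty from sampling plus analysis.
ash_ar, u_ash = 0.062, 0.004        # ash, as-received basis
moist, u_moist = 0.085, 0.006       # moisture fraction

# Derived quantity: ash on a dry basis, f = ash_ar / (1 - moist).
f = ash_ar / (1.0 - moist)

# Approach 1: first-order (Taylor) propagation using partial derivatives.
df_dash = 1.0 / (1.0 - moist)
df_dmoist = ash_ar / (1.0 - moist) ** 2
u_taylor = np.sqrt((df_dash * u_ash) ** 2 + (df_dmoist * u_moist) ** 2)

# Approach 2: Monte Carlo propagation, assuming independent normal errors.
rng = np.random.default_rng(5)
samples = rng.normal(ash_ar, u_ash, 100_000) / (1.0 - rng.normal(moist, u_moist, 100_000))
u_mc = samples.std()

print(f, u_taylor, u_mc)   # the two uncertainty estimates should nearly agree
```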
Karraker, Amelia; Schoeni, Robert F.; Cornman, Jennifer C.
2015-01-01
Growing evidence suggests that psychological factors, such as conscientiousness and anger, as well as cognitive ability are related to mortality. Less is known about 1) the relative importance of each of these factors in predicting mortality, 2) through what social, economic, and behavioral mechanisms these factors influence mortality, and 3) how these processes unfold over long periods of time in nationally-representative samples. We use 35 years (1972–2007) of data from men (ages 20–40) in the Panel Study of Income Dynamics (PSID), a nationally representative sample in the United States, and discrete time event history analysis (n=27,373 person-years) to examine the importance of measures of follow-through (a dimension of conscientiousness), anger, and cognitive ability in predicting mortality. We also assess the extent to which income, marriage, and smoking explain the relationship between psychological and cognitive factors with mortality. We find that while follow-through, anger, and cognitive ability are all associated with subsequent mortality when modeled separately, when they are modeled together and baseline demographic characteristics are controlled, only anger remains associated with mortality: being in the top quartile for anger is associated with a 1.57 fold increase in the risk of dying at follow-up compared with those in the bottom quartile. This relationship is robust to the inclusion of income, marriage, and smoking as mediators. PMID:26397865
Escalante, H; Castro, L; Amaya, M P; Jaimes, L; Jaimes-Estévez, J
2018-01-01
Cheese whey (CW) is the main waste generated in the cheesemaking process and has high organic matter content and acidity. Therefore, CW disposal is a challenge for small to medium enterprises (SMEs) in the dairy industry that do not have any type of treatment plant. Anaerobic digestion (AD) is an attractive process for solving this problem. The aim of this research was to determine the biomethane and struvite precipitation potentials of CW from four dairy SMEs. First, changes in CW properties (organic matter and pH) were evaluated. Second, biomethane and struvite potentials were assessed using cattle slurry as inoculum. The organic matter in CW varied from 40 to 65 g VS/kg, 65 to 140 g COD/L, and 2 to 10 g/L for VFAs depending on the sampling time and type of sample. The pH of the CW samples ranged from 3 to 6.5. In the anaerobic biodegradability analysis, methane yields reached 0.51 to 0.60 L CH₄/g VS added, which represented electrical and caloric potentials of 54 and 108 kWh/m³ of CW, respectively. Organic matter removal in all experiments was above 83%. Moreover, anaerobic digestates presented NH₄⁺/PO₄³⁻ molar ratios between 2.6 and 4.0, which are adequate for struvite precipitation, with potential production of 8.5-10.4 g struvite/L CW. Finally, the use of biogas as an energetic supplement and struvite as a soil fertilizer represents economic savings of US$ 6.91/m³ CW and US$ 5.75/m³ CW in terms of electricity and fertilizer use, respectively. The energetic, agricultural and economic potentials evidence that the AD process is a feasible alternative for cheese whey treatment. Copyright © 2017. Published by Elsevier Ltd.
Raman spectroscopy as a PAT for pharmaceutical blending: Advantages and disadvantages.
Riolo, Daniela; Piazza, Alessandro; Cottini, Ciro; Serafini, Margherita; Lutero, Emilio; Cuoghi, Erika; Gasparini, Lorena; Botturi, Debora; Marino, Iari Gabriel; Aliatis, Irene; Bersani, Danilo; Lottici, Pier Paolo
2018-02-05
Raman spectroscopy has been positively evaluated as a tool for the in-line and real-time monitoring of powder blending processes and it has been proved to be effective in the determination of the endpoint of the mixing, showing its potential role as a process analytical technology (PAT). The aim of this study is to show the advantages and disadvantages of Raman spectroscopy with respect to the more traditional HPLC analysis. The spectroscopic results, obtained directly on raw powders sampled from a two-axis blender in real case conditions, were compared with the chromatographic data obtained on the same samples. The formulation blend used for the experiment consists of active pharmaceutical ingredient (API, concentrations 6.0% and 0.5%), lactose and magnesium stearate (as excipients). The first step of the monitoring process was selecting the appropriate wavenumber region where the Raman signal of the API is maximal and interference from the spectral features of the excipients is minimal. Blend profiles were created by plotting the area ratios of the Raman peak of the API (A_API) at 1598 cm⁻¹ and the Raman bands of the excipients (A_EXC), in the spectral range between 1560 and 1630 cm⁻¹, as a function of mixing time: the API content can be considered homogeneous when the time-dependent dispersion of the area ratio is minimized. In order to achieve representative sampling with Raman spectroscopy, each sample was mapped in a motorized XY stage by a defocused laser beam of a micro-Raman apparatus. Good correlation between the two techniques has been found only for the composition at 6.0% (w/w). However, standard deviation analysis, applied to both HPLC and Raman data, showed that the Raman results are more substantial than the HPLC ones, since Raman spectroscopy enables generating data-rich blend profiles. In addition, the relative standard deviation calculated from a single map (30 points) turned out to be representative of the degree of homogeneity for that blend time. Copyright © 2017 Elsevier B.V. All rights reserved.
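The blend-profile logic — track the API/excipient area ratio across the mapped points at each mixing time and declare the blend homogeneous once the relative standard deviation of the map falls below an acceptance limit — reduces to a short calculation. The area ratios and the 5% limit below are made-up numbers, not data from the study.

```python
import numpy as np

# Hypothetical Raman map results: 30 area ratios A_API / A_EXC per blend time (min).
rng = np.random.default_rng(6)
blend_times = [2, 5, 10, 15, 20]
spread = {2: 0.30, 5: 0.18, 10: 0.08, 15: 0.03, 20: 0.025}  # shrinking heterogeneity
ratios = {t: rng.normal(0.45, 0.45 * spread[t], size=30) for t in blend_times}

# Relative standard deviation (%) of the 30-point map at each time point.
rsd = {t: 100.0 * r.std(ddof=1) / r.mean() for t, r in ratios.items()}

# Declare the blending endpoint when the map RSD drops below an acceptance limit.
limit = 5.0  # percent, illustrative acceptance criterion
endpoint = next(t for t in blend_times if rsd[t] < limit)
print({t: round(v, 1) for t, v in rsd.items()}, "endpoint at", endpoint, "min")
```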
Noise Power Spectrum in PROPELLER MR Imaging.
Ichinoseki, Yuki; Nagasaka, Tatsuo; Miyamoto, Kota; Tamura, Hajime; Mori, Issei; Machida, Yoshio
2015-01-01
The noise power spectrum (NPS), an index for noise evaluation, represents the frequency characteristics of image noise. We measured the NPS in PROPELLER (Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction) magnetic resonance (MR) imaging, a nonuniform data sampling technique, as an initial study for practical MR image evaluation using the NPS. The 2-dimensional (2D) NPS reflected the k-space sampling density and showed agreement with the shape of the k-space trajectory as expected theoretically. Additionally, the 2D NPS allowed visualization of a part of the image reconstruction process, such as filtering and motion correction.
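A minimal sketch of how a 2D noise power spectrum can be estimated from repeated noise-only images, assuming a simple averaged |FFT|² convention; the normalization and region-of-interest handling here are generic choices, not the procedure used in the paper.

```python
import numpy as np

def nps_2d(noise_images, pixel_size_x=1.0, pixel_size_y=1.0):
    """Estimate a 2D noise power spectrum from a stack of noise-only images.

    noise_images: array of shape (n_images, ny, nx); each image should contain
    noise only (e.g., the scaled difference of two repeated acquisitions).
    Returns the ensemble-averaged squared magnitude of the 2D DFT, scaled by
    pixel area over the number of pixels (one common NPS convention).
    """
    imgs = np.asarray(noise_images, dtype=float)
    imgs = imgs - imgs.mean(axis=(1, 2), keepdims=True)   # remove per-image DC offset
    n, ny, nx = imgs.shape
    spectra = np.abs(np.fft.fftshift(np.fft.fft2(imgs), axes=(1, 2))) ** 2
    return spectra.mean(axis=0) * (pixel_size_x * pixel_size_y) / (nx * ny)
```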
NASA Global Atmospheric Sampling Program (GASP) data report for tape VL0014
NASA Technical Reports Server (NTRS)
Briehl, D.; Dudzinski, T. J.; Liu, D. C.
1980-01-01
The data currently available from GASP, including flight routes and dates, instrumentation, data processing procedures, and data tape specifications, are described. Measurements of atmospheric ozone, cabin ozone, carbon monoxide, water vapor, particles, clouds, condensation nuclei, filter samples and related meteorological and flight information obtained during 562 flights of aircraft N533PA, N4711U, N655PA, and VH-EBE from October 3, 1977 through January 5, 1978 are reported. Data representing tropopause pressures obtained from time and space interpolation of National Meteorological Center archived data for the dates of the flights are included.
Jairo A. Diaz; Robert J. Moon; Jeffrey P. Youngblood
2014-01-01
Thermal expansion represents a vital indicator of the processing history and dimensional stability of materials. Solvent-sensitive, thin, and compliant samples are particularly challenging to test. Here we describe how textures highlighted by contrast enhanced optical microscopy modes (i.e., polarized light (PL), phase contrast (PC)) and bright field (BF) can be used...
Insights in groundwater organic matter from Liquid Chromatography-Organic Carbon Detection
NASA Astrophysics Data System (ADS)
Rutlidge, H.; Oudone, P.; McDonough, L.; Andersen, M. S.; Baker, A.; Meredith, K.; O'Carroll, D. M.
2017-12-01
Understanding the processes that control the concentration and characteristics of organic matter in groundwater has important implications for the terrestrial global carbon budget. Liquid Chromatography - Organic Carbon Detection (LC-OCD) is a size-exclusion chromatography technique that separates organic carbon into molecular weight fractions of biopolymers, humic substances, building blocks (degradation products of humic substances), low molecular weight acids, and low molecular weight neutrals. Groundwater and surface water samples were collected from a range of locations in Australia representing different surface soils, land cover, recharge types and hydrological properties. At one site, hyporheic zone samples were also collected from beneath a stream. The results showed a general decrease in the aromaticity and molecular weight indices going from surface water to hyporheic downwelling to groundwater samples. The aquifer substrate also affected the organic composition. For example, groundwater samples collected from a zone of fractured rock showed a relative decrease in the proportion of humic substances, suggestive of sorption or degradation of humic substances. This work demonstrates the potential for using LC-OCD in elucidating the processes that control the concentration and characteristics of organic matter in groundwater.
Concreteness of idiographic worry and anticipatory processing.
McGowan, Sarah Kate; Stevens, Elizabeth S; Behar, Evelyn; Judah, Matt R; Mills, Adam C; Grant, DeMond M
2017-03-01
Worry and anticipatory processing are forms of repetitive negative thinking (RNT) that are associated with maladaptive characteristics and negative consequences. One key maladaptive characteristic of worry is its abstract nature (Goldwin & Behar, 2012; Stöber & Borkovec, 2002). Several investigations have relied on inductions of worry that are social-evaluative in nature, which precludes distinctions between worry and RNT about social-evaluative situations. The present study examined similarities and distinctions between worry and anticipatory processing on potentially important maladaptive characteristics. Participants (N = 279) engaged in idiographic periods of uninstructed mentation, worry, and anticipatory processing and provided thought samples during each minute of each induction. Thought samples were assessed for concreteness, degree of verbal-linguistic activity, and degree of imagery-based activity. Both worry and anticipatory processing were characterized by reduced concreteness, increased abstraction of thought over time, and a predominance of verbal-linguistic activity. However, worry was more abstract, more verbal-linguistic, and less imagery-based relative to anticipatory processing. Finally, worry demonstrated reductions in verbal-linguistic activity over time, whereas anticipatory processing demonstrated reductions in imagery-based activity over time. Worry was limited to non-social topics to distinguish worry from anticipatory processing, and may not represent worry that is social in nature. Generalizability may also be limited by use of an undergraduate sample. Results from the present study provide support for Stöber's theory regarding the reduced concreteness of worry, and suggest that although worry and anticipatory processing share some features, they also contain characteristics unique to each process. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Rivera, J. D.; Moraes, B.; Merson, A. I.; Jouvel, S.; Abdalla, F. B.; Abdalla, M. C. B.
2018-07-01
We perform an analysis of photometric redshifts estimated by using non-representative training sets in magnitude space. We use the ANNz2 and GPz algorithms to estimate the photometric redshift both in simulations and in real data from the Sloan Digital Sky Survey (DR12). We show that for the representative case, the results obtained with both algorithms have the same quality, using either magnitudes or colours as input. In order to reduce the errors when estimating the redshifts with a non-representative training set, we perform the training in colour space. We estimate the quality of our results by using a mock catalogue which is split by cuts in the r band between 19.4 < r < 20.8. We obtain slightly better results with GPz on single-point z-phot estimates in the complete training set case; however, the photometric redshifts estimated with the ANNz2 algorithm allow us to obtain mildly better results in deeper r-band cuts when estimating the full redshift distribution of the sample in the incomplete training set case. By using a cumulative distribution function and a Monte Carlo process, we manage to define a photometric estimator which fits the spectroscopic distribution of galaxies in the mock testing set well, but with a larger scatter. To complete this work, we perform an analysis of the impact on the detection of clusters via the density of galaxies in a field by using the photometric redshifts obtained with a non-representative training set.
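ANNz2 and GPz are the codes actually used in this work; the sketch below is only a generic stand-in (scikit-learn) illustrating the idea of training a photometric-redshift regressor on colours rather than raw magnitudes, so that a training set that is non-representative in depth maps onto a more comparable feature space. All function and parameter names are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def magnitudes_to_colours(mags):
    """Convert magnitudes (n_objects, n_bands) to adjacent-band colours,
    e.g. u-g, g-r, r-i, i-z, which are less sensitive to overall survey depth."""
    mags = np.asarray(mags, dtype=float)
    return mags[:, :-1] - mags[:, 1:]

def train_photoz(train_mags, train_specz, use_colours=True):
    """Fit a simple photometric-redshift regressor (a stand-in for ANNz2/GPz)."""
    X = magnitudes_to_colours(train_mags) if use_colours else np.asarray(train_mags)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X, train_specz)
    return model

def predict_photoz(model, mags, use_colours=True):
    """Predict point z-phot estimates for a target sample."""
    X = magnitudes_to_colours(mags) if use_colours else np.asarray(mags)
    return model.predict(X)
```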
Code of Federal Regulations, 2014 CFR
2014-07-01
... turbidity level of representative samples of a system's filtered water must be less than or equal to 0.5 NTU....74 (a)(1) and (c)(1). (2) The turbidity level of representative samples of a system's filtered water... filtration, the turbidity level of representative samples of a system's filtered water must be less than or...
Impact glasses from the ultrafine fraction of lunar soils
NASA Technical Reports Server (NTRS)
Norris, J. A.; Keller, L. P.; Mckay, D. S.
1993-01-01
The chemical compositions of microscopic glasses produced during meteoroid impacts on the lunar surface provide information regarding the various fractionation processes which accompany these events. To learn more about these fractionation processes, we studied the compositions of submicrometer glass spheres from two Apollo 17 sampling sites using electron microscopy. The majority of the analyzed glasses show evidence for varying degrees of impact-induced chemical fractionation. Among these are HASP glasses (high-Al, Si-poor), which are believed to represent the refractory residuum left after the loss of volatile elements (e.g., Si, Fe, Na) from the precursor material. In addition to HASP-type glasses, we also observed a group of VRAP glasses (volatile-rich, Al-poor) that represent condensates of vaporized volatile constituents and are complementary to the HASP compositions. High-Ti glasses were also found during the course of the study, and are documented here for the first time.
Attentional Modulation of Brain Responses to Primary Appetitive and Aversive Stimuli
Field, Brent A.; Buck, Cara L.; McClure, Samuel M.; Nystrom, Leigh E.; Kahneman, Daniel; Cohen, Jonathan D.
2015-01-01
Studies of subjective well-being have conventionally relied upon self-report, which directs subjects' attention to their emotional experiences. This method presumes that attention itself does not influence emotional processes, which could bias sampling. We tested whether attention influences experienced utility (the moment-by-moment experience of pleasure) by using functional magnetic resonance imaging (fMRI) to measure the activity of brain systems thought to represent hedonic value while manipulating attentional load. Subjects received appetitive or aversive solutions orally while alternately executing a low or high attentional load task. Brain regions associated with hedonic processing, including the ventral striatum, showed a response to both juice and quinine. This response decreased during the high-load task relative to the low-load task. Thus, attentional allocation may influence experienced utility by modulating (either directly or indirectly) the activity of brain mechanisms thought to represent hedonic value. PMID:26158468
NASA Astrophysics Data System (ADS)
Alkhorayef, M.; Mansour, A.; Sulieman, A.; Alnaaimi, M.; Alduaij, M.; Babikir, E.; Bradley, D. A.
2017-12-01
Butylated hydroxytoluene (BHT) rods represent a potential dosimeter in radiation processing, with readout via electron paramagnetic resonance (EPR) spectroscopy. Among the possible sources of uncertainty are those associated with the performance of the dosimetric medium and the conditions under which measurements are made, including sampling and environmental conditions. The present study estimates these uncertainties, investigating the physical response in different resonance regions. BHT, a white crystalline solid with a melting point between 70 and 73 °C, was investigated using 60Co gamma irradiation over the dose range 0.1-100 kGy. The intensity of the EPR signal increases linearly in the range 0.1-35 kGy, the uncertainty budget for high doses being 3.3% at the 2σ confidence level. The rod form represents an excellent alternative dosimeter for high-level dosimetry, with smaller uncertainty than the powder form.
Microbial Characterization and Comparison of Isolates During the Mir and ISS Missions
NASA Technical Reports Server (NTRS)
Fontenot, Sondra L.; Castro, Victoria; Bruce, Rebekah; Ott, C. Mark; Pierson, Duane L.
2004-01-01
Spacecraft represent a semi-closed ecosystem that provides a unique model of microbial interaction with other microbes, potential hosts, and their environment. Environmental samples from the Mir Space Station (1995-1998) and the International Space Station (ISS) (2000-present) were collected and processed to provide insight into the characterization of microbial diversity aboard spacecraft over time and to assess any potential health risks to the crew. All microbiota were isolated using standard media-based methodologies. Isolates from Mir and ISS were processed using various methods of analysis, including VITEK biochemical analysis, 16S ribosomal identification, and fingerprinting using rep-PCR analysis. Over the first 41 months of habitation, the diversity of the microbiota from air and surface samples aboard ISS increased from an initial six to 53 different bacterial species. During the same period, fungal diversity increased from 2 to 24 species. Based upon rep-PCR analysis, the majority of isolates were unique, suggesting the need for increased sampling frequency and a more thorough analysis of samples to properly characterize the ISS microbiota. These limited fungal and bacterial data from environmental samples acquired during monitoring do not currently indicate a microbial hazard to ISS or any trends suggesting potential health risks.
Online Recruitment: Feasibility, Cost, and Representativeness in a Study of Postpartum Women.
Leach, Liana S; Butterworth, Peter; Poyser, Carmel; Batterham, Philip J; Farrer, Louise M
2017-03-08
Online recruitment is feasible, low-cost, and can provide high-quality epidemiological data. However, little is known about the feasibility of recruiting postpartum women online, or sample representativeness. The current study investigates the feasibility of recruiting a population of postpartum women online for health research and examines sample representativeness. Two samples of postpartum women were compared: those recruited online as participants in a brief survey of new mothers (n=1083) and those recruited face-to-face as part of a nationally representative study (n=579). Sociodemographic, general health, and mental health characteristics were compared between the two samples. Obtaining a sample of postpartum women online for health research was highly efficient and low-cost. The online sample over-represented those who were younger (aged 25-29 years), were in a de facto relationship, had higher levels of education, spoke only English at home, and were first-time mothers. Members of the online sample were significantly more likely to have poor self-rated health and poor mental health than the nationally representative sample. Health differences remained after adjusting for sociodemographic differences. Potential exists for feasible and low-cost e-epidemiological research with postpartum populations; however, researchers should consider the potential influence of sample nonrepresentativeness. ©Liana S Leach, Peter Butterworth, Carmel Poyser, Philip J Batterham, Louise M Farrer. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 08.03.2017.
NASA Technical Reports Server (NTRS)
Stern, Jennifer Claire; Mcadam, Amy Catherine; Ten Kate, Inge L.; Bish, David L.; Blake, David F.; Morris, Richard V.; Bowden, Roxane; Fogel, Marilyn L.; Glamoclija, Mihaela; Mahaffy, Paul R.;
2013-01-01
The 2010 Arctic Mars Analog Svalbard Expedition (AMASE) investigated two distinct geologic settings on Svalbard, using methodologies and techniques to be deployed on the Mars Science Laboratory (MSL). AMASE-related research comprises both analyses conducted during the expedition and further analyses of collected samples using laboratory facilities at a variety of institutions. The Sample Analysis at Mars (SAM) instrument suite on MSL includes pyrolysis ovens, a gas-processing manifold, a quadrupole mass spectrometer (QMS), several gas chromatography columns, and a Tunable Laser Spectrometer (TLS). An integral part of SAM development is the deployment of SAM-like instrumentation in the field. During AMASE 2010, two parts of SAM participated as stand-alone instruments. A Hiden Evolved Gas Analysis-Mass Spectrometer (EGA-QMS) system represented the EGA-QMS component of SAM, and a Picarro Cavity Ring-Down Spectrometer (EGA-CRDS) represented the EGA-TLS component of SAM. A field analog of CheMin, the XRD/XRF instrument on MSL, was also deployed as part of this field campaign. Carbon isotopic measurements of CO2 evolved during thermal decomposition of carbonates were used together with EGA-QMS geochemical data, mineral composition information, and contextual observations made during sample collection to distinguish carbonate formation associated with chemosynthetic activity at a fossil methane seep from abiotic processes forming carbonates associated with subglacial basaltic eruptions. Carbon and oxygen isotopes of the basalt-hosted carbonates suggest cryogenic carbonate formation, though more research is necessary to clarify the history of these rocks.
ACQUISITION OF REPRESENTATIVE GROUND WATER QUALITY SAMPLES FOR METALS
R.S. Kerr Environmental Research Laboratory (RSKERL) personnel have evaluated sampling procedures for the collection of representative, accurate, and reproducible ground water quality samples for metals for the past four years. Intensive sampling research at three different field...
Aqueous solubility of Cr(VI) compounds in ferrochrome bag filter dust and the implications thereof
Du Preez, S. P.; Beukes, J. P.; Van Dalen, W. P. J.; ...
2017-04-21
The production of ferrochrome (FeCr) is a reducing process. However, it is impossible to completely exclude oxygen from all of the high-temperature production process steps, which may lead to unintentional formation of small amounts of Cr(VI). The majority of Cr(VI) is associated with particles found in the off-gas of the high-temperature processes, which are cleaned by means of venturi scrubbers or bag filter dust (BFD) systems. BFD contains the highest concentration of Cr(VI) of all FeCr wastes. In this study, the solubility of Cr(VI) present in BFD was determined by evaluating four different BFD samples. The results indicate that the currently applied Cr(VI) treatment strategies of the FeCr producer (with process water pH ≤ 9) only effectively extract and treat the water-soluble Cr(VI) compounds, which merely represented approximately 31% of the total Cr(VI) present in the BFD samples evaluated. Extended extraction time, within the afore-mentioned pH range, proved futile in extracting sparingly-soluble and water-insoluble Cr(VI) species, which represented approximately 34% and 35% of the total Cr(VI), respectively. Due to the deficiencies of the current treatment strategies, it is highly likely that sparingly water-soluble Cr(VI) compounds will leach from waste storage facilities (e.g. slimes dams) over time. Therefore, it is critical that improved Cr(VI) treatment strategies be formulated, which should be an important future perspective for FeCr producers and researchers alike.
Interpolation of Water Quality Along Stream Networks from Synoptic Data
NASA Astrophysics Data System (ADS)
Lyon, S. W.; Seibert, J.; Lembo, A. J.; Walter, M. T.; Gburek, W. J.; Thongs, D.; Schneiderman, E.; Steenhuis, T. S.
2005-12-01
Effective catchment management requires water quality monitoring that identifies major pollutant sources and transport and transformation processes. While traditional monitoring schemes involve regular sampling at fixed locations in the stream, there is interest in synoptic or 'snapshot' sampling to quantify water quality throughout a catchment. This type of sampling enables insights into biogeochemical behavior throughout a stream network at low flow conditions. Since baseflow concentrations are temporally persistent, they are indicative of the health of the ecosystems. A major problem with snapshot sampling is the lack of analytical techniques to represent the spatially distributed data in a manner that is 1) easily understood, 2) representative of the stream network, and 3) capable of being used to develop land management scenarios. This study presents a kriging application using the landscape composition of the contributing area along a stream network to define a new distance metric. This allows locations that are more 'similar' to stay spatially close together while less similar locations 'move' further apart. We analyze a snapshot sampling campaign consisting of 125 manually collected grab samples taken during a summer recession flow period in the Townbrook Research Watershed. The watershed is located in the Catskill region of New York State and represents the mixed forest-agriculture land uses of the region. Our initial analysis indicated that stream nutrient (nitrogen and phosphorus) and chemical (major cations and anions) concentrations are controlled by the composition of landscape characteristics (land use classes and soil types) surrounding the stream. Based on these relationships, an intuitively defined distance metric is developed by combining the traditional distance between observations with the relative difference in composition of the contributing area. This metric is used to interpolate between the sampling locations with traditional geostatistical techniques (semivariograms and ordinary kriging). The resulting interpolations provide continuous stream nutrient and chemical concentrations with reduced kriging RMSE (i.e., the interpolation fits the actual data better) compared to interpolation performed without path restriction to the stream channel (i.e., the current default for most geostatistical packages) or performed with an in-channel, Euclidean distance metric (i.e., 'as the fish swims' distance). In addition to being quantifiably better, the new metric also produces maps of stream concentrations that match expected continuous stream concentrations based on expert knowledge of the watershed. This analysis and its resulting stream concentration maps provide a representation of spatially distributed synoptic data that can be used to quantify water quality for more effective catchment management that focuses on pollutant sources and transport and transformation processes.
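A minimal sketch of the idea of kriging with a composition-aware distance metric, assuming simple Euclidean separation as a proxy for stream-network distance; with a non-Euclidean metric the covariance matrix is not guaranteed to be positive definite, so this is an illustration rather than the authors' implementation, and all weights and variogram parameters are placeholders.

```python
import numpy as np

def combined_distance(xy, comp, w_space=1.0, w_comp=1.0):
    """Distance metric mixing spatial separation with the difference in
    contributing-area composition (land use / soil-type fractions).

    xy:   (n, 2) sample coordinates
    comp: (n, k) composition fractions of the contributing area for each sample
    """
    d_xy = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    d_cp = np.abs(comp[:, None, :] - comp[None, :, :]).sum(axis=-1)
    return w_space * d_xy / d_xy.max() + w_comp * d_cp / max(d_cp.max(), 1e-12)

def ordinary_kriging(values, dist, dist_to_target, sill=1.0, corr_len=1.0, nugget=1e-6):
    """Ordinary kriging with an exponential covariance defined on an arbitrary metric.

    dist:           (n, n) pairwise distances between sampled locations
    dist_to_target: (n,) distances from the samples to the prediction location
    """
    cov = lambda h: sill * np.exp(-h / corr_len)
    n = len(values)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = cov(dist) + nugget * np.eye(n)
    K[n, :n] = 1.0                      # unbiasedness (Lagrange) constraint
    K[:n, n] = 1.0
    K[n, n] = 0.0
    rhs = np.append(cov(dist_to_target), 1.0)
    weights = np.linalg.solve(K, rhs)[:n]
    return weights @ values
```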
Assessment of Normal Variability in Peripheral Blood Gene Expression
Campbell, Catherine; Vernon, Suzanne D.; Karem, Kevin L.; ...
2002-01-01
Peripheral blood is representative of many systemic processes and is an ideal sample for expression profiling of diseases that have no known or accessible lesion. Peripheral blood is a complex mixture of cell types and some differences in peripheral blood gene expression may reflect the timing of sample collection rather than an underlying disease process. For this reason, it is important to assess study design factors that may cause variability in gene expression not related to what is being analyzed. Variation in the gene expression of circulating peripheral blood mononuclear cells (PBMCs) from three healthy volunteers sampled three times one day each week for one month was examined for 1,176 genes printed on filter arrays. Less than 1% of the genes showed any variation in expression that was related to the time of collection, and none of the changes were noted in more than one individual. These results suggest that observed variation was due to experimental variability.
Processing AIRS Scientific Data Through Level 3
NASA Technical Reports Server (NTRS)
Granger, Stephanie; Oliphant, Robert; Manning, Evan
2010-01-01
The Atmospheric Infra-Red Sounder (AIRS) Science Processing System (SPS) is a collection of computer programs, known as product generation executives (PGEs). The AIRS SPS PGEs are used for processing measurements received from the AIRS suite of infrared and microwave instruments orbiting the Earth onboard NASA's Aqua spacecraft. Early stages of the AIRS SPS development were described in a prior NASA Tech Briefs article: Initial Processing of Infrared Spectral Data (NPO-35243), Vol. 28, No. 11 (November 2004), page 39. In summary: starting from Level 0 (representing raw AIRS data), the AIRS SPS PGEs and the data products they produce are identified by alphanumeric labels (1A, 1B, 2, and 3) representing successive stages or levels of processing. The previous NASA Tech Briefs article described processing through Level 2, the output of which comprises geolocated atmospheric data products such as temperature and humidity profiles, among others. The AIRS Level 3 PGE samples selected information from the Level 2 standard products to produce a single global gridded product. One Level 3 product is generated for each day's collection of Level 2 data. In addition, daily Level 3 products are aggregated into two multiday products: an eight-day (half the orbital repeat cycle) product and a monthly (calendar month) product.
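As an illustration of what a Level 3 gridding step does, here is a minimal sketch that averages one day of geolocated Level 2 retrievals onto a regular latitude-longitude grid; the grid spacing and averaging rule are assumptions for illustration, not the AIRS PGE's actual algorithm.

```python
import numpy as np

def grid_level2_daily(lat, lon, values, lat_step=1.0, lon_step=1.0):
    """Average one day of Level 2 retrievals onto a regular global grid.

    lat, lon, values: 1-D arrays of geolocated retrievals (e.g., temperature at
    one pressure level). Returns (grid, counts); cells with no samples are NaN.
    """
    lat_edges = np.arange(-90.0, 90.0 + lat_step, lat_step)
    lon_edges = np.arange(-180.0, 180.0 + lon_step, lon_step)
    counts, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges])
    sums, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges], weights=values)
    with np.errstate(invalid="ignore", divide="ignore"):
        grid = np.where(counts > 0, sums / counts, np.nan)
    return grid, counts
```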
Sampling Methods in Cardiovascular Nursing Research: An Overview.
Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie
2014-01-01
Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
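A minimal sketch of three of the probability sampling methods mentioned above (simple random, systematic, and stratified); the population and strata are placeholders, and the stratified example uses a fixed number of selections per stratum for simplicity.

```python
import random

def simple_random_sample(population, n, seed=0):
    """Each element has an equal, independent chance of selection."""
    rng = random.Random(seed)
    return rng.sample(list(population), n)

def systematic_sample(population, n, seed=0):
    """Pick every k-th element after a random start (k = N // n)."""
    pop = list(population)
    k = len(pop) // n
    start = random.Random(seed).randrange(k)
    return pop[start::k][:n]

def stratified_sample(population, strata_of, n_per_stratum, seed=0):
    """Sample separately within each stratum (e.g., clinical site or sex)."""
    rng = random.Random(seed)
    strata, out = {}, []
    for item in population:
        strata.setdefault(strata_of(item), []).append(item)
    for members in strata.values():
        out.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return out
```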
Design of Microstructured Conducting Polymer Films for Enhanced Trace Explosives Detection
NASA Astrophysics Data System (ADS)
Laster, Jennifer S.
The detection of trace amounts of explosive material is critical to national security. Ion mobility spectrometer (IMS)-based contact sampling continues to be a common method employed for the detection of explosives in high-security checkpoint applications, such as airport security. In this process a surface of interest, such as a passenger's hands or luggage, is probed by a swab or particle trap to collect and transfer residue to an IMS for analysis. The collection of residue on a sampling swab has been shown to be a limiting step in this detection process. As such, there is significant need to develop new materials with increased adhesion to explosive analytes and with superior particle removal abilities. Here, the design of novel sampling swabs is presented for the enhanced collection of trace explosive residue from surfaces. First, the influence of the swab microstructure on the ability to remove particles from representative substrates is demonstrated. Free-standing microstructured polypyrrole (PPy) films of a variety of dimensions and form factors are fabricated using a templated electropolymerization process. The removal of polystyrene fluorescent particles from an aluminum substrate of varying surface roughness is examined as a function of the polymer microstructure. PPy microstructured films display enhanced particle removal abilities compared to PPy non-structured and current commercial films. This increase in particle removal is attributed to the increased particle-swab contact from the microstructured films. Next, the influence of the surface chemistry of sampling swabs on the collection of a representative explosive analyte, trinitrotoluene (TNT), is explored. The surface chemistry of PPy films is modified by electropolymerization of an N-substituted pyrrole monomer. The surface chemistries examined include a methyl, carboxylic acid, and amino-phenyl functionality. The vapor deposition of TNT on the surface of the functionalized PPy films is quantified through ultraviolet-visible (UV-vis) absorption and compared to commercial swabbing materials of varying chemistry and surface roughness. The PPy modified films with potential sites for hydrogen bonding display the highest deposition of TNT, while the Teflon-coated commercial films display the lowest interaction with TNT. Finally, the desorption and release of TNT from sampling swabs is studied as an effect of temperature and of applied bias. For successful analyte detection within an IMS, the residue collected on a sampling swab must be released from the swab, typically through a thermal desorption process. In this work the release of TNT from sampling swabs is determined through solid-phase microextraction-gas chromatography mass spectrometry (SPME-GCMS). The results of this thesis provide important information on the design considerations for the development of novel particle sampling swabs with increased performance.
Hydrocarbons in particulate samples from wildfire events in central Portugal in summer 2010.
Vicente, Ana; Calvo, Ana; Fernandes, Ana P; Nunes, Teresa; Monteiro, Cristina; Pio, Casimiro; Alves, Célia
2017-03-01
In summer 2010, twenty-eight smoke samples (14 PM2.5 samples plus 14 PM2.5-10 samples) were collected during wildfires that occurred in central Portugal. A portable high-volume sampler was used to collect the coarse (PM2.5-10) and fine (PM2.5) smoke samples on quartz fibre filters. The carbonaceous content (elemental and organic carbon) of the particulate matter was analysed by a thermal-optical technique. Subsequently, the particulate samples were solvent extracted and fractionated by vacuum flash chromatography into three different classes of organic compounds (aliphatics, polycyclic aromatic hydrocarbons (PAHs) and carbonyl compounds). The organic speciation was performed by gas chromatography-mass spectrometry (GC-MS). Emissions were dominated by the fine particles, which represented around 92% of the PM10. A clear predominance of carbonaceous constituents was observed, with organic to elemental carbon (OC/EC) ratios ranging between 1.69 and 245 in both size fractions. The isoprenoid ketone 6,10,14-trimethyl-2-pentadecanone, a tracer for secondary organic aerosol formation, was one of the dominant constituents in both fine and coarse particles. Retene was the most abundant compound in all samples. Good correlations were obtained between OC and both aliphatic and PAH compounds. Pyrogenic processes, thermal release of biogenic compounds and secondary processing accounted for 97% of the apportioned PM2.5 levels. Copyright © 2016. Published by Elsevier B.V.
Online Deviation Detection for Medical Processes
Christov, Stefan C.; Avrunin, George S.; Clarke, Lori A.
2014-01-01
Human errors are a major concern in many medical processes. To help address this problem, we are investigating an approach for automatically detecting when performers of a medical process deviate from the acceptable ways of performing that process as specified by a detailed process model. Such deviations could represent errors and, thus, detecting and reporting deviations as they occur could help catch errors before harm is done. In this paper, we identify important issues related to the feasibility of the proposed approach and empirically evaluate the approach for two medical procedures, chemotherapy and blood transfusion. For the evaluation, we use the process models to generate sample process executions that we then seed with synthetic errors. The process models describe the coordination of activities of different process performers in normal, as well as in exceptional situations. The evaluation results suggest that the proposed approach could be applied in clinical settings to help catch errors before harm is done. PMID:25954343
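A toy sketch of the kind of online deviation check described above, assuming the process model can be reduced to a map from the last accepted activity to the set of acceptable next activities; the real approach uses much richer process models, and the activity names below are illustrative only.

```python
def detect_deviations(event_stream, allowed_next, start_state="start"):
    """Online check of an observed activity stream against a simple process model.

    allowed_next maps a state (here, the last accepted activity) to the set of
    activities acceptable next; anything else is flagged as a deviation as soon
    as it is observed, so it can be reported before the process continues.
    """
    state = start_state
    deviations = []
    for step, activity in enumerate(event_stream):
        if activity in allowed_next.get(state, set()):
            state = activity                              # accept and advance
        else:
            deviations.append((step, state, activity))    # report immediately
    return deviations

# Example: a toy blood-transfusion fragment (activities are illustrative only)
model = {
    "start": {"verify_patient_id"},
    "verify_patient_id": {"check_blood_product"},
    "check_blood_product": {"administer_transfusion"},
}
print(detect_deviations(["verify_patient_id", "administer_transfusion"], model))
# -> [(1, 'verify_patient_id', 'administer_transfusion')]  (a skipped check)
```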
NASA Astrophysics Data System (ADS)
Tully, B. J.; Heidelberg, J. F.; Kraft, B.; Girguis, P. R.; Huber, J. A.
2016-12-01
The oceanic crust contains the largest aquifer on Earth, with a volume approximately 2% of the global ocean. Ongoing research at the North Pond (NP) site, west of the Mid-Atlantic Ridge, provides an environment representative of oxygenated crustal aquifers beneath oligotrophic surface waters. Using subseafloor CORK observatories with multiple sampling depths beneath the seafloor, crustal fluids were sampled along the predicted aquifer fluid flow path over a two-year period. DNA was extracted and sequenced for metagenomic analysis from 22 crustal fluid samples, along with the overlying bottom seawater. At broad taxonomic groupings, the aquifer system is highly dynamic over time and space, with shifts in dominant taxa and "blooms" of transient groups that appear at discrete time points and sample depths. We were able to reconstruct 194 high-quality, low-contamination bacterial and archaeal metagenome-assembled genomes (MAGs) with estimated completeness >50% (429 MAGs >20% complete). Environmental genomes were assigned to phylogenies from the major bacterial phyla, putative novel groups, and poorly sampled phylogenetic groups, including the Marinimicrobia, Candidate Phyla Radiation, and Planctomycetes. Biogeochemically relevant processes were assigned to MAGs, including denitrification, dissimilatory sulfur and hydrogen cycling, and carbon fixation. Collectively, the oxic NP aquifer system represents a diverse, dynamic microbial habitat with the metabolic potential to impact multiple globally relevant biogeochemical cycles, including nitrogen, sulfur, and carbon.
230Th/U ages Supporting Hanford Site‐Wide Probabilistic Seismic Hazard Analysis
Paces, James B.
2014-01-01
This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.
Listeria spp. in Street-Vended Ready-to-Eat Foods
El-Shenawy, Moustafa; El-Shenawy, Mohamed; Mañes, Jordi; Soriano, Jose M.
2011-01-01
Street-vended ready-to-eat food sold in Egypt, including sandwiches and dishes of traditional food, was examined for the presence of Listeria species. Out of 576 samples, 24% were found to contain Listeria species. L. monocytogenes and L. innocua were isolated from 57% and 39% of the contaminated samples, respectively. Other Listeria spp. were detected with lower frequency. L. monocytogenes at ≥10³ CFU/g was detected in 7% of the total examined samples, which represents 49% of the contaminated food samples (meat, poultry, seafood, dairy products, and products of plant origin). Most of the samples contaminated by L. monocytogenes had high levels of total viable bacterial counts. The results obtained may help to clarify the epidemiology of listeriosis in the country and draw the attention of the decision makers to issue hygienic regulations for food processing industries as well as street vendors in order to ensure safe street-vended ready-to-eat food. PMID:22194742
Segments from red blood cell units should not be used for quality testing.
Kurach, Jayme D R; Hansen, Adele L; Turner, Tracey R; Jenkins, Craig; Acker, Jason P
2014-02-01
Nondestructive testing of blood components could permit in-process quality control and reduce discards. Tubing segments, generated during red blood cell (RBC) component production, were tested to determine their suitability as a sample source for quality testing. Leukoreduced RBC components were produced from whole blood (WB) by two different methods: WB filtration and buffy coat (BC). Components and their corresponding segments were tested on Days 5 and 42 of hypothermic storage (HS) for spun hematocrit (Hct), hemoglobin (Hb) content, percentage hemolysis, hematologic indices, and adenosine triphosphate concentration to determine whether segment quality represents unit quality. Segment samples overestimated hemolysis on Days 5 and 42 of HS in both BC- and WB filtration-produced RBCs (p < 0.001 for all). Hct and Hb levels in the segments were also significantly different from the units at both time points for both production methods (p < 0.001 for all). Indeed, for all variables tested different results were obtained from segment and unit samples, and these differences were not consistent across production methods. The quality of samples from tubing segments is not representative of the quality of the corresponding RBC unit. Segments are not suitable surrogates with which to assess RBC quality. © 2013 American Association of Blood Banks.
Fantuzzi, G; Righi, E; Predieri, G; Giacobazzi, P; Mastroianni, K; Aggazzotti, G
2010-01-01
The aim of the present study was to investigate the environmental and health aspects of a representative sample of indoor swimming pools located in the Emilia Romagna region. During the sampling sessions, the occupational environment was evaluated in terms of microclimate parameters and thermal comfort/discomfort conditions. Moreover, the chemical risk was assessed by analyzing the pool water for the presence of disinfection by-products (DBPs), such as trihalomethanes (THMs), haloacetic acids (HAAs), chlorite, chlorate and bromate. The analytical results are in agreement with the Italian legislation (Accordo Stato-Regioni, 2003), even if in some of the sampled indoor swimming pools the measured combined chlorine levels were greater than the Italian limit. With regard to the evaluation of microclimate conditions, the thermal indices considered, Predicted Mean Vote (PMV) and Predicted Percentage of Dissatisfied (PPD%), described a satisfactory occupational environment. Among DBPs, the mean THM levels (41.4 +/- 30.0 microg/l) were close to the values of the current Italian drinking water legislation and do not seem to represent a health issue. The chlorate levels in pool water (range: 5-19537 microg/l) need further investigation, as recent epidemiological studies on drinking water have hypothesized a potential genotoxic effect of these compounds, which are involved in cellular oxidative processes.
Fish tracking by combining motion based segmentation and particle filtering
NASA Astrophysics Data System (ADS)
Bichot, E.; Mascarilla, L.; Courtellemont, P.
2006-01-01
In this paper, we suggest a new importance sampling scheme to improve a particle-filtering-based tracking process. This scheme relies on the exploitation of motion segmentation. More precisely, we propagate hypotheses from particle filtering to blobs whose motion is similar to that of the target. Hence, the search is driven toward regions of interest in the state space and prediction is more accurate. We also propose to exploit segmentation to update the target model. Once the moving target has been identified, a representative model is learnt from its spatial support. We refer to this model in the correction step of the tracking process. The importance sampling scheme and the strategy to update the target model improve the performance of particle filtering in complex occlusion situations compared to a simple bootstrap approach, as shown by our experiments on real fish tank sequences.
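A bootstrap-style sketch of a particle-filter step whose proposal mixes random-walk dynamics with hypotheses placed on motion-segmented blobs, in the spirit of the scheme above; for brevity the proposal-density correction in the importance weights is omitted, and all parameters and function names are illustrative.

```python
import numpy as np

def resample(particles, weights, rng):
    """Multinomial resampling proportional to the importance weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

def pf_step(particles, weights, blob_centers, likelihood, rng,
            p_blob=0.5, motion_std=5.0, blob_std=2.0):
    """One particle-filter step: resample, propose, and reweight.

    particles: (n, 2) positions; blob_centers: (m, 2) centers from motion
    segmentation; likelihood(pos) returns a non-negative observation likelihood.
    """
    n = len(particles)
    particles = resample(particles, weights, rng)
    # decide, per particle, whether to draw its proposal near a motion blob
    from_blob = rng.random(n) < (p_blob if len(blob_centers) else 0.0)
    prop = particles + rng.normal(0.0, motion_std, particles.shape)
    if from_blob.any():
        picks = rng.choice(len(blob_centers), size=int(from_blob.sum()))
        prop[from_blob] = blob_centers[picks] + rng.normal(
            0.0, blob_std, (int(from_blob.sum()), 2))
    w = np.array([likelihood(p) for p in prop])
    w = w / w.sum() if w.sum() > 0 else np.full(n, 1.0 / n)
    return prop, w
```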
Copper/solder intermetallic growth studies.
Kirchner, K W; Lucey, G K; Geis, J
1993-08-01
Copper samples, hot solder (eutectic) dipped and thermally aged, were cross-sectioned and placed in an environmental scanning electron microscope (ESEM). While in the ESEM, the samples were heated for approximately 2.5 h at 170 degrees C to stimulate the growth of additional Cu/Sn intermetallic compound. The intent of the study was to obtain a continuous real-time videotape record of the diffusion process and compare the observations to static SEM images reported to represent long-term, naturally aged intermetallic growth. The video obtained allows the observation of the diffusion process and relative growth phenomena at the Cu, Cu3Sn, Cu6Sn5, and solder interfaces, as well as effects on the bulk Cu and solder. Effects contrary to earlier reports were observed; for example, growth rates of Cu3Sn were found to greatly exceed those of Cu6Sn5.
Raman spectroscopic analysis of real samples: Brazilian bauxite mineralogy
NASA Astrophysics Data System (ADS)
Faulstich, Fabiano Richard Leite; Castro, Harlem V.; de Oliveira, Luiz Fernando Cappa; Neumann, Reiner
2011-10-01
In this investigation, Raman spectroscopy with 1064 and 632.8 nm excitation was used to investigate real mineral samples of bauxite ore from mines of Northern Brazil, together with Raman mapping and X-ray diffraction. The results show clearly that micro-Raman spectroscopy is a powerful tool for the identification of all the minerals usually found in bauxites: gibbsite, kaolinite, goethite, hematite, anatase and quartz. Bulk samples can also be analysed, and FT-Raman is more suitable for them owing to its better signal-to-noise ratio and representativity, although it is not efficient for kaolinite. The identification of fingerprint vibrations for all the minerals allows the acquisition of Raman-based chemical maps, potentially powerful tools for process mineralogy applied to bauxite ores.
Soil Geochemical Data for the Wyoming Landscape Conservation Initiative Study Area
Smith, David B.; Ellefsen, Karl J.
2010-01-01
In 2008, soil samples were collected at 139 sites throughout the Wyoming Landscape Conservation Initiative study area in southwest Wyoming. These samples, representing a density of 1 site per 440 square kilometers, were collected from a depth of 0-5 cm and analyzed for a suite of more than 40 major and trace elements following a near-total multi-acid extraction. In addition, soil pH, electrical conductivity, total nitrogen, total and organic carbon, and sodium adsorption ratio were determined. The resulting data set provides a baseline for detecting changes in soil composition that might result from natural processes or anthropogenic activities. This report describes the sampling and analytical protocols used, and makes available all the soil geochemical data generated in the study.
Neutron Tomography at the Los Alamos Neutron Science Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, William Riley
Neutron imaging is an incredibly powerful tool for non-destructive sample characterization and materials science. Neutron tomography is one technique that results in a three-dimensional model of the sample, representing the interaction of the neutrons with the sample. This relies both on reliable data acquisition and on image processing after acquisition. Over the course of the project, the focus has changed from the former to the latter, culminating in a large-scale reconstruction of a meter-long fossilized skull. The full reconstruction is not yet complete, though tools have been developed to improve the speed and accuracy of the reconstruction. This project helps to improve the capabilities of LANSCE and LANL with regards to imaging large or unwieldy objects.
Hu, Jia; Liden, Robert C
2011-07-01
Integrating theories of self-regulation with team and leadership literatures, this study investigated goal and process clarity and servant leadership as 3 antecedents of team potency and subsequent team effectiveness, operationalized as team performance and organizational citizenship behavior. Our sample of 304 employees represented 71 teams in 5 banks. Results showed that team-level goal and process clarity as well as team servant leadership served as 3 antecedents of team potency and subsequent team performance and team organizational citizenship behavior. Furthermore, we found that servant leadership moderated the relationships between both goal and process clarity and team potency, such that the positive relationships between both goal and process clarity and team potency were stronger in the presence of servant leadership.
The Apollo Lunar Sample Image Collection: Digital Archiving and Online Access
NASA Technical Reports Server (NTRS)
Todd, Nancy S.; Lofgren, Gary E.; Stefanov, William L.; Garcia, Patricia A.
2014-01-01
The primary goal of the Apollo Program was to land human beings on the Moon and bring them safely back to Earth. This goal was achieved during six missions - Apollo 11, 12, 14, 15, 16, and 17 - that took place between 1969 and 1972. Among the many noteworthy engineering and scientific accomplishments of these missions, perhaps the most important in terms of scientific impact was the return of 382 kg (842 lb.) of lunar rocks, core samples, pebbles, sand, and dust from the lunar surface to Earth. Returned samples were curated at JSC (then known as the Manned Spacecraft Center) and, as part of the original processing, high-quality photographs were taken of each sample. The top, bottom, and sides of each rock sample were photographed, along with 16 stereo image pairs taken at 45-degree intervals. Photographs were also taken whenever a sample was subdivided and when thin sections were made. This collection of lunar sample images consists of roughly 36,000 photographs; all six Apollo missions are represented.
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
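A small sketch of the degenerate functional H(p) = -∑_i p_i log p_i and of a sample-space-reducing process, whose visitation frequencies approach Zipf's law (p_i ~ 1/i); the simulation parameters below are illustrative and not taken from the paper.

```python
import numpy as np

def shannon_entropy(p):
    """The functional H(p) = -sum_i p_i log p_i shared by the three entropy
    concepts for equilibrium / ergodic / multinomial systems (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def ssr_visit_distribution(n_states=100, n_runs=20000, seed=0):
    """Simulate a sample-space-reducing process: from state i jump uniformly to
    a state in {1, ..., i-1}; restart at a random state once state 1 is reached.
    The visitation frequencies approach Zipf's law, p_i ~ 1/i."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(n_states + 1)
    for _ in range(n_runs):
        i = rng.integers(2, n_states + 1)   # restart anywhere above the ground state
        counts[i] += 1
        while i > 1:
            i = rng.integers(1, i)          # uniform on {1, ..., i-1}
            counts[i] += 1
    return counts[1:] / counts[1:].sum()

p = ssr_visit_distribution()
print(shannon_entropy(p))                   # entropy of the observed distribution
print(p[:5] * np.arange(1, 6))              # roughly constant if p_i ~ 1/i
```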
ERIC Educational Resources Information Center
Kalindi, Sylvia Chanda; McBride, Catherine; Tong, Xiuhong; Wong, Natalie Lok Lee; Chung, Kien Hoa Kevin; Lee, Chia-Ying
2015-01-01
To examine cognitive correlates of dyslexia in Chinese and reading difficulties in English as a foreign language, a total of 14 Chinese dyslexic children (DG), 16 poor readers of English (PE), and 17 poor readers of both Chinese and English (PB) were compared to a control sample (C) of 17 children, drawn from a statistically representative sample…
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Div. of Human Resources.
This document comprises the General Accounting Office's (GAO) report to the House Subcommittee on Intergovernmental Relations concerning the GAO's review of the contract awarding process of the National Institute of Education (NIE) for fiscal year 1983. The review focused on 39 of NIE's 52 newly negotiated procurement contracts; a sample of…
Dobrokhotskiĭ, O N; Diatlov, I A; Orlov, O I; Novikova, N D; Khamidullina, N M; Deshevaia, E A
2012-01-01
The necessity of microbiological study of the soil from Phobos and of terrestrial biological objects that have spent a long time in deep space and may represent a biological hazard has been shown. The medical and technical requirements developed for such laboratories comply with Russian regulations and with the recommendations of international instruments for ensuring biosafety on the basis of process (continuous) biorisk management.
Ružičková, Silvia; Remeteiová, Dagmar; Mičková, Vladislava; Dirner, Vojtech
2018-02-21
In this work, the matrix characterization (mineralogy, total and local chemical composition, and total organic (TOC) and inorganic carbon (TIC) contents) of different types of sediments from mining- and metallurgy-influenced areas and the assessment of the impact of the matrix on the association of potentially hazardous metals with the mineral phases of these samples, which affect their mobility in the environment, are presented. For these purposes, sediment samples with different origins and from different locations in the environment were analyzed. Anthropogenic sediments from metal-rich post-flotation tailings (Lintich, Slovakia) represent waste from ore processing, natural river sediments from the Hornád River (Košice, Slovakia) represent areas influenced predominantly by the metallurgical industry, and lake sediments from a water reservoir Ružín (inflow from the Hornád and Hnilec Rivers, Slovakia) represent the impact of the metallurgical and/or mining industries. The total metal contents were determined by X-ray fluorescence (XRF) analysis, the local chemical and morphological microanalysis by scanning electron microscopy with energy-dispersive spectroscopy (SEM-EDS), and the TOC and TIC contents by infrared (IR) spectrometry. The mobility/bioavailability of Cu, Pb, and Zn in/from sediments at the studied areas was assessed by ethylenediaminetetraacetic acid (EDTA) and acetic acid (AA) extraction and is discussed in the context of the matrix composition. The contents of selected potentially hazardous elements in the extracts were determined by the high-resolution continuum source flame atomic absorption spectrometry (HR-CS FAAS).
Nagy, Attila; Kovács, Nóra; Pálinkás, Anita; Sipos, Valéria; Vincze, Ferenc; Szőllősi, Gergő; Csenteri, Orsolya; Ádány, Róza; Sándor, János
2018-06-01
The study aimed to launch a T2DM adult cohort representative of Hungary through a cross-sectional study, to produce the most important quality indicators for T2DM care, to describe social inequalities, and to estimate the absolute number of T2DM adult patients with uncontrolled HbA1c levels in Hungary. A representative sample of Hungarian T2DM adults (N=1280) was selected in 2016. GPs collected data on socio-demographic status by questionnaire, and on history and laboratory parameters from medical records. The process and outcome indicators used in international monitoring practice were calculated. The influence of socio-economic status was determined by multivariate logistic regression models. Target achievement was 61.66%, 53.48%, and 54.00% for HbA1c, LDL-C, and blood pressure, respectively, in the studied sample (N=1176). In Hungary, 294,534 patients have an above-target HbA1c value out of 495,801 T2DM adults. The education-dependent positive association with the majority of process indicators was not reflected in HbA1c, LDL-C, and blood pressure target achievements. The risk of microvascular complications and the requirement for insulin treatment were higher among the less educated. According to our observations, education-independent target achievement for HbA1c and LDL-C is similar to, and for blood pressure less effective than, that in Europe. Copyright © 2017 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.
Permuting input for more effective sampling of 3D conformer space
NASA Astrophysics Data System (ADS)
Carta, Giorgio; Onnis, Valeria; Knox, Andrew J. S.; Fayne, Darren; Lloyd, David G.
2006-03-01
SMILES strings and other classic 2D structural formats offer a convenient way to represent molecules as a simplistic connection table, with the inherent advantages of ease of handling and storage. In the context of virtual screening, chemical databases to be screened are often initially represented by canonicalised SMILES strings that can be filtered and pre-processed in a number of ways, resulting in molecules that occupy similar regions of chemical space to active compounds of a therapeutic target. A wide variety of software exists to convert molecules into SMILES format, namely Mol2smi (Daylight Inc.), MOE (Chemical Computing Group) and Babel (OpenEye Scientific Software). Depending on the algorithm employed, the atoms of a SMILES string defining a molecule can be ordered differently. Upon conversion to 3D coordinates they result in the production of ostensibly the same molecule. In this work we show how different permutations of a SMILES string can affect conformer generation, affecting the reliability and repeatability of the results. Furthermore, we propose a novel procedure for the generation of conformers, taking advantage of the permutation of the input strings (both SMILES and other 2D formats), leading to more effective sampling of conformation space in the output, and also implementing a fingerprint and principal component analysis step to post-process and visualise the results.
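A minimal sketch of the effect described above using RDKit (not one of the software packages cited in the abstract, and assuming a version recent enough to support randomized SMILES output): atom-order permutations of one SMILES string are generated and conformers are embedded from each, so that conformer counts and geometries can be compared across permutations. The molecule, conformer count, and seeds are illustrative.

```python
from rdkit import Chem
from rdkit.Chem import AllChem

def random_smiles_permutations(smiles, n=5):
    """Generate alternative (randomly atom-ordered) SMILES strings for one molecule."""
    mol = Chem.MolFromSmiles(smiles)
    out = set()
    for _ in range(n * 10):                     # oversample; duplicates collapse in the set
        out.add(Chem.MolToSmiles(mol, canonical=False, doRandom=True))
        if len(out) >= n:
            break
    return sorted(out)

def count_conformers(smiles, num_confs=20, seed=42):
    """Embed conformers starting from a given SMILES permutation; the number kept
    after embedding (and their geometries) can differ between permutations."""
    mol = Chem.AddHs(Chem.MolFromSmiles(smiles))
    ids = AllChem.EmbedMultipleConfs(mol, numConfs=num_confs, randomSeed=seed)
    return len(ids)

for s in random_smiles_permutations("CC(=O)Oc1ccccc1C(=O)O"):   # aspirin as an example
    print(s, count_conformers(s))
```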
Distance-Based Configurational Entropy of Proteins from Molecular Dynamics Simulations
Fogolari, Federico; Corazza, Alessandra; Fortuna, Sara; Soler, Miguel Angel; VanSchouwen, Bryan; Brancolini, Giorgia; Corni, Stefano; Melacini, Giuseppe; Esposito, Gennaro
2015-01-01
Estimation of configurational entropy from molecular dynamics trajectories is a difficult task which is often performed using quasi-harmonic or histogram analysis. An entirely different approach, proposed recently, estimates the local density distribution around each conformational sample by measuring the distance from its nearest neighbors. In this work we show that this theoretically well-grounded method can be easily applied to estimate the entropy from conformational sampling. We consider a set of systems that are representative of important biomolecular processes. In particular: reference entropies for amino acids in unfolded proteins are obtained from a database of residues not participating in secondary structure elements; the conformational entropy of folding of β2-microglobulin is computed from molecular dynamics simulations using reference entropies for the unfolded state; backbone conformational entropy is computed from molecular dynamics simulations of four different states of the EPAC protein and compared with order parameters (often used as a measure of entropy); the conformational and rototranslational entropy of binding is computed from simulations of 20 tripeptides bound to the peptide binding protein OppA and of β2-microglobulin bound to a citrate-coated gold surface. This work shows the potential of the method in the most representative biological processes involving proteins, and provides a valuable alternative, principally in the cases shown, where other approaches are problematic. PMID:26177039
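A minimal sketch of a nearest-neighbour entropy estimate in the spirit of the approach above, using one common form of the Kozachenko-Leonenko estimator; the choice of coordinates and the exact estimator variant used by the authors may differ.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def nn_entropy(samples):
    """Nearest-neighbour (Kozachenko-Leonenko-type) differential entropy estimate.

    samples: (N, d) array of conformational coordinates (e.g., selected internal
    coordinates or interatomic distances). Uses one common form of the estimator,
        H ~ (d/N) * sum_i ln r_i + ln V_d + ln(N - 1) + gamma,
    where r_i is the distance from sample i to its nearest neighbour and V_d is
    the volume of the d-dimensional unit ball. Result is in nats.
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # distance to the nearest neighbour other than the point itself
    r = cKDTree(x).query(x, k=2)[0][:, 1]
    r = np.maximum(r, 1e-12)                       # guard against duplicate samples
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
    euler_gamma = 0.5772156649015329
    return d * np.mean(np.log(r)) + log_vd + np.log(n - 1) + euler_gamma
```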
Impacts of variability in geomechanical properties on hydrate bearing sediment responses
NASA Astrophysics Data System (ADS)
Lin, J. S.; Uchida, S.; Choi, J. H.; Seol, Y.
2017-12-01
Hydrate bearing sediments (HBS) may become unstable during gas production operations or from natural processes such as changes in the landform or temperature. Geomechanical modeling is a rational way to assess HBS stability regardless of the process involved. At the present time, such modeling is laced with uncertainties. The uncertainties come from many sources, which include the adequacy of a modeling framework to accurately project the response of HBS, gaps in the available field information, and the variability in laboratory test results from limited samples. For a reasonable stability assessment, the impact of the various uncertainties has to be addressed. This study looks into one particular aspect of the uncertainty, namely, the uncertainty caused by the scatter in laboratory tests and the ability of a constitutive model to adequately represent them. Specifically, this study focuses on the scatter in the results from laboratory tests on high-quality pressured core samples from a marine site, and uses a critical state constitutive model to represent them. The study investigates how the HBS responses shift when the parameters of the constitutive model are varied to reflect different aspects of the experimental results. Also investigated are the impacts on the responses of altering certain formulations of the constitutive model to suit particular sets of results.
Untargeted analysis to monitor metabolic changes of garlic along heat treatment by LC-QTOF MS/MS.
Molina-Calle, María; Sánchez de Medina, Verónica; Calderón-Santiago, Mónica; Priego-Capote, Feliciano; Luque de Castro, María D
2017-09-01
Black garlic is increasing in popularity in cuisine around the world; however, scant information exists on the composition of this processed product. In this study, polar compounds in fresh garlic and in samples taken at different times during the heat treatment process to obtain black garlic have been characterized by liquid chromatography coupled to tandem mass spectrometry in high resolution mode. Ninety-five compounds (mainly amino acids and metabolites, organosulfur compounds, and saccharides and derivatives) were tentatively identified in all the analysed samples and classified as a function of the family they belong to. Statistical analysis of the results established that the major changes in garlic occur during the first days of treatment, and that they mainly affect the three representative families. The main pathways involved in the synthesis of the compounds affected by heat treatment, and their evolution during the process, were studied. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Microbiological survey of five poultry processing plants in the UK.
Mead, G C; Hudson, W R; Hinton, M H
1993-07-01
1. Neck skin samples were taken from chickens and turkeys at all the main stages of processing to monitor changes in total viable count (TVC) and counts of coliforms and pseudomonads. 2. Processing reduced TVC by up to 100-fold. Geometric mean counts after packaging were log10 4.4 to 5.3 CFU/g whilst corresponding counts of coliforms were 2.7 to 3.8 CFU/g. 3. Increases in mean TVC or coliforms as a result of either defeathering or evisceration did not exceed 0.6 log. 4. Pseudomonads represented only a minor fraction of the initial microflora of the bird and were often reduced by scalding to a figure which could not be detected by direct plating of samples; however, subsequent contamination resulted in means between log10 2.9 and 4.0 CFU/g for packaged carcases. 5. Although Staphylococcus aureus was readily isolated from defeathering equipment, mean counts from defeathered carcases were always below log10 3.0 CFU/g.
NASA Astrophysics Data System (ADS)
Cen, Huoshi; Liu, Wenlong; Nan, Zhaodong
In situ microcalorimetry was first used to study the crystal formation processes of porous metal-organic frameworks (MOFs), [((CH3)2NH2)Cd(MIPA)]4·xG, where H3MIPA is 5-mercaptoisophthalic acid and G represents guest DMA and H2O. An endothermic process occurred first, corresponding to the chemical reaction among the reactants. The exothermic processes that followed the endothermic process correspond to crystal nucleation and growth. The experimental results demonstrate that a solid sample was first obtained at 150 °C after 24 h. X-ray powder diffraction (XRD) peaks of the samples became stronger as the experimental time increased from 24 to 60 h (24, 35, 48 and 60 h). The adsorption properties of the crystal obtained at 150 °C for 60 h are better than those previously reported for the same MOF synthesized at 150 °C for 72 h. This study may provide a method for investigating MOF formation mechanisms and help in synthesizing this kind of functional material.
Personality Processes: Mechanisms by which Personality Traits “Get Outside the Skin”
Hampson, Sarah E.
2011-01-01
It is time to better understand why personality traits predict consequential outcomes, which calls for a closer look at personality processes. Personality processes are mechanisms that unfold over time to produce the effects of personality traits. They include reactive and instrumental processes that moderate or mediate the association between traits and outcomes. These mechanisms are illustrated here by a selection of studies of traits representing the three broad domains of personality and temperament: negative emotionality, positive emotionality, and constraint. Personality processes are studied over the short-term, as in event-sampling studies, and over the long-term, as in lifespan research. Implications of findings from the study of processes are considered for resolving issues in models of personality structure, improving and extending methods of personality assessment, and identifying targets for personality interventions. PMID:21740225
Predictors of Physical Altercation among Adolescents in Residential Substance Abuse Treatment
Crawley, Rachel D.; Becan, Jennifer Edwards; Knight, Danica Kalling; Joe, George W.; Flynn, Patrick M.
2014-01-01
This study tested the hypothesis that basic social information-processing components represented by family conflict, peer aggression, and pro-aggression cognitive scripts are related to aggression and social problems among adolescents in substance abuse treatment. The sample consisted of 547 adolescents in two community-based residential facilities. Correlation results indicated that more peer aggression is related to more pro-aggression scripts; scripts, peer aggression, and family conflict are associated with social problems; and in-treatment physical altercation involvement is predicted by higher peer aggression. Findings suggest that social information-processing components are valuable for treatment research. PMID:26622072
Invited Review Small is beautiful: The analysis of nanogram-sized astromaterials
NASA Astrophysics Data System (ADS)
Zolensky, M. E.; Pieters, C.; Clark, B.; Papike, J. J.
2000-01-01
The capability of modern methods to characterize ultra-small samples is well established from analysis of interplanetary dust particles (IDPs), interstellar grains recovered from meteorites, and other materials requiring ultra-sensitive analytical capabilities. Powerful analytical techniques are available that require, under favorable circumstances, single particles of only a few nanograms for entire suites of fairly comprehensive characterizations. A returned sample of >1,000 particles with total mass of just one microgram permits comprehensive quantitative geochemical measurements that are impractical to carry out in situ by flight instruments. The main goal of this paper is to describe the state-of-the-art in microanalysis of astromaterials. Given that we can analyze fantastically small quantities of asteroids and comets, etc., we have to ask ourselves how representative microscopic samples are of bodies that measure a few to many km across. With the Galileo flybys of Gaspra and Ida, it is now recognized that even very small airless bodies have indeed developed a particulate regolith. Acquiring a sample of the bulk regolith, a simple sampling strategy, provides two critical pieces of information about the body. Regolith samples are excellent bulk samples since they normally contain all the key components of the local environment, albeit in particulate form. Furthermore, since this fine fraction dominates remote measurements, regolith samples also provide information about surface alteration processes and are a key link to remote sensing of other bodies. Studies indicate that a statistically significant number of nanogram-sized particles should be able to characterize the regolith of a primitive asteroid, although the presence of larger components within even primitive meteorites (e.g., Murchison), such as chondrules, CAIs, and large crystal fragments, points out the limitations of using data obtained from nanogram-sized samples to characterize entire primitive asteroids. However, most important asteroidal geological processes have left their mark on the matrix, since this is the finest-grained portion and therefore most sensitive to chemical and physical changes. Thus, the following information can be learned from this fine grain size fraction alone: (1) mineral paragenesis; (2) regolith processes; (3) bulk composition; (4) conditions of thermal and aqueous alteration (if any); (5) relationships to planets, comets, and meteorites (via isotopic analyses, including oxygen); (6) abundance of water and hydrated material; (7) abundance of organics; (8) history of volatile mobility; (9) presence and origin of presolar and/or interstellar material. Most of this information can even be obtained from dust samples from bodies for which nanogram-sized samples are not truly representative. Future advances in sensitivity and accuracy of laboratory analytical techniques can be expected to enhance the science value of nano- to microgram sized samples even further. This highlights a key advantage of sample returns: the most advanced analysis techniques can always be applied in the laboratory, and well-preserved samples remain available for future investigations.
NASA Astrophysics Data System (ADS)
Purwoko, Saad, Noor Shah; Tajudin, Nor'ain Mohd
2017-05-01
This study aims to: i) develop problem solving questions of the Linear Equations System of Two Variables (LESTV) based on levels of the IPT Model; ii) explain the level of students' skill of information processing in solving LESTV problems; iii) explain students' skill in information processing in solving LESTV problems; and iv) explain students' cognitive process in solving LESTV problems. This study involves three phases: i) development of LESTV problem questions based on the Tessmer Model; ii) a quantitative survey method for analyzing students' skill level of information processing; and iii) a qualitative case study method for analyzing students' cognitive process. The population of the study was 545 eighth grade students represented by a sample of 170 students from five Junior High Schools in Hilir Barat Zone, Palembang (Indonesia), chosen using cluster sampling. Fifteen students among them were drawn as a sample for the interview session, with saturated information obtained. The data were collected using the LESTV problem solving test and the interview protocol. The quantitative data were analyzed using descriptive statistics, while the qualitative data were analyzed using content analysis. The findings of this study indicated that students' cognitive process only reached the step of identifying external sources and fluently executing algorithms in short-term memory. Only 15.29% of students could retrieve type A information and 5.88% of students could retrieve type B information from long-term memory. The implication is that the developed LESTV problems validated the IPT Model in modelling students' assessment at different levels of the hierarchy.
Kauppinen, Ari; Toiviainen, Maunu; Korhonen, Ossi; Aaltonen, Jaakko; Järvinen, Kristiina; Paaso, Janne; Juuti, Mikko; Ketolainen, Jarkko
2013-02-19
During the past decade, near-infrared (NIR) spectroscopy has been applied for in-line moisture content quantification during a freeze-drying process. However, NIR has been used as a single-vial technique and thus is not representative of the entire batch. This has been considered as one of the main barriers for NIR spectroscopy becoming widely used in process analytical technology (PAT) for freeze-drying. Clearly it would be essential to monitor samples that reliably represent the whole batch. The present study evaluated multipoint NIR spectroscopy for in-line moisture content quantification during a freeze-drying process. Aqueous sucrose solutions were used as model formulations. NIR data was calibrated to predict the moisture content using partial least-squares (PLS) regression with Karl Fischer titration being used as a reference method. PLS calibrations resulted in root-mean-square error of prediction (RMSEP) values lower than 0.13%. Three noncontact, diffuse reflectance NIR probe heads were positioned on the freeze-dryer shelf to measure the moisture content in a noninvasive manner, through the side of the glass vials. The results showed that the detection of unequal sublimation rates within a freeze-dryer shelf was possible with the multipoint NIR system in use. Furthermore, in-line moisture content quantification was reliable especially toward the end of the process. These findings indicate that the use of multipoint NIR spectroscopy can achieve representative quantification of moisture content and hence a drying end point determination to a desired residual moisture level.
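The calibration step, PLS regression of NIR spectra against Karl Fischer moisture with an RMSEP check, can be sketched with scikit-learn; the spectra and moisture values below are synthetic placeholders, not the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical data: rows are NIR spectra, y is Karl Fischer moisture content (%)
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 200))                 # 120 spectra x 200 wavelengths
y = 0.5 + 0.1 * X[:, 40] + rng.normal(scale=0.02, size=120)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=1)
pls = PLSRegression(n_components=3).fit(X_cal, y_cal)
y_hat = pls.predict(X_val).ravel()

rmsep = np.sqrt(np.mean((y_val - y_hat) ** 2))  # root-mean-square error of prediction
print(f"RMSEP = {rmsep:.3f} %")
```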
VLC-beacon detection with an under-sampled ambient light sensor
NASA Astrophysics Data System (ADS)
Green, Jacob; Pérez-Olivas, Huetzin; Martínez-Díaz, Saúl; García-Márquez, Jorge; Domínguez-González, Carlos; Santiago-Montero, Raúl; Guan, Hongyu; Rozenblat, Marc; Topsu, Suat
2017-08-01
LEDs will replace current worldwide lighting in the near future, mainly due to their low production cost and energy savings. Visible light communications (VLC) will gradually turn the existing lighting network into a communication network. Nowadays VLC transceivers can be found in some commercial centres in Europe; some of them continuously broadcast an identification tag that contains their coordinate position. In such a case, the transceiver acts as a geolocation beacon. Nevertheless, mobile transceivers represent a challenge in the VLC communication chain, as smartphones have not yet integrated a customized VLC detection stage. In order to make current smartphones capable of detecting VLC broadcast signals, their Ambient Light Sensor (ALS) is adapted as a VLC detector. For this to be achieved, lighting transceivers need to adapt their modulation scheme. For instance, frequencies representing the start bit and the logic values 1 and 0 can be set to avoid flicker from illumination and to permit detection of the under-sampled signal. Decoding the signal requires multiple-step real-time signal processing, as shown here.
Data Analysis with Graphical Models: Software Tools
NASA Technical Reports Server (NTRS)
Buntine, Wray L.
1994-01-01
Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
Ma, Ruoshui; Zhang, Xiumei; Wang, Yi; Zhang, Xiao
2018-04-27
The heterogeneous and complex structural characteristics of lignin present a significant challenge to predicting its processability (e.g., depolymerization, modification) to valuable products. This study provides a detailed characterization and comparison of the structural properties of seven representative biorefinery lignin samples derived from forest and agricultural residues, which were subjected to representative pretreatment methods. A range of wet chemistry and spectroscopy methods were applied to determine specific lignin structural characteristics such as functional groups, inter-unit linkages and peak molecular weight. In parallel, oxidative depolymerization of these lignin samples to either monomeric phenolic compounds or dicarboxylic acids was conducted, and the product yields were quantified. Based on these results (lignin structural characteristics and monomer yields), we demonstrated for the first time the application of a multiple-variable linear estimation (MVLE) approach, using R statistics, to gain insight toward a quantitative correlation between lignin structural properties and their conversion reactivity toward oxidative depolymerization to monomers. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
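The correlation step is essentially a multiple linear regression of monomer yield on structural descriptors. The authors used R; the statsmodels sketch below is an equivalent illustration on invented descriptor values (the column names for β-O-4 linkages, phenolic OH content and peak molecular weight are assumptions of this example).

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical table: one row per lignin sample, columns are structural descriptors
df = pd.DataFrame({
    "beta_O_4":      [58, 42, 35, 61, 50, 39, 47],             # linkages per 100 units
    "phenolic_OH":   [1.1, 1.9, 2.3, 0.9, 1.4, 2.0, 1.6],      # mmol/g
    "Mp_kDa":        [4.2, 2.8, 2.1, 5.0, 3.6, 2.5, 3.1],      # peak molecular weight
    "monomer_yield": [12.4, 8.1, 6.3, 13.9, 10.2, 7.0, 9.5],   # wt%
})

# Ordinary least squares: monomer yield as a linear function of the descriptors
X = sm.add_constant(df[["beta_O_4", "phenolic_OH", "Mp_kDa"]])
model = sm.OLS(df["monomer_yield"], X).fit()
print(model.summary())
```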
NASA Astrophysics Data System (ADS)
Krueger, Martin; Straaten, Nontje; Mazzini, Adriano
2015-04-01
The Lusi eruption represents one of the largest ongoing sedimentary-hosted geothermal systems. This eruption started in 2006 following a magnitude 6.3 earthquake that struck Java Island. Since then it has been spewing boiling mud from a central crater, with peaks reaching 180,000 m3 per day. Today an area of about 8 km2 is covered by locally dried mud breccia where a network of hundreds of satellite seeping pools is active. Numerous investigations have focused on the study of microbial colonies that commonly thrive at offshore methane seeps and mud volcanoes; however, very little has been done for onshore seeping structures. Lusi represents a unique opportunity to complete a comprehensive study of onshore microbial communities fed by the seepage of CH4 and CO2 as well as of heavier liquid hydrocarbons originating from several km below the surface. We conducted a sampling campaign at the Lusi site, collecting samples of fresh mud close to the erupting crater using a remote-controlled drone. In addition we completed a transect towards the outer parts of the crater to collect older, weathered samples for comparison. Active microorganisms were present in all samples. The highest activities for CO2 and CH4 production as well as for CH4 oxidation and hydrocarbon degradation were observed in medium-age mud samples collected roughly in the middle of the transect. Rates of aerobic methane oxidation were high, as was the potential of the microbial communities to degrade hydrocarbons (oils, alkanes and BTEX were tested). The data suggest a transition of microbial populations from an anaerobic, hydrocarbon-driven metabolism in fresher samples from the center or from small seeps to more generalistic, aerobic microbial communities in older, more consolidated sediments. Currently, the microbial communities in the different sediment samples are being analyzed using quantitative PCR and T-RFLP combined with MiSeq sequencing. This study represents an initial step to better understand onshore seepage systems and provides an ideal analogue for comparison with the better-investigated offshore structures.
Design and evaluation of a nondestructive fissile assay device for HTGR fuel samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNeany, S. R.; Knoll, R. W.; Jenkins, J. D.
1979-02-01
Nondestructive assay of fissile material plays an important role in nuclear fuel processing facilities. Information for product quality control, plant criticality safety, and nuclear materials accountability can be obtained from assay devices. All of this is necessary for a safe, efficient, and orderly operation of a production plant. Presented here is a design description and an operational evaluation of a device developed to nondestructively assay small samples of High-Temperature Gas-Cooled Reactor (HTGR) fuel. The measurement technique employed consists of thermal-neutron irradiation of a sample followed by pneumatic transfer to a high-efficiency neutron detector where delayed neutrons are counted. In general, samples undergo several irradiation and count cycles during a measurement. The total number of delayed-neutron counts accumulated is translated into grams of fissile mass through comparison with the counts accumulated in an identical irradiation and count sequence of calibration standards. Successful operation of the device through many experiments over a one-year period indicates high operational reliability. Tests of assay precision show this to be better than 0.25% for measurements of 10 min. Assay biases may be encountered if calibration standards are not representative of unknown samples, but reasonable care in construction and control of standards should lead to no more than 0.2% bias in the measurements. Nondestructive fissile assay of HTGR fuel samples by thermal-neutron irradiation and delayed-neutron detection has been demonstrated to be a rapid and accurate analysis technique. However, careful attention and control must be given to calibration standards to see that they remain representative of unknown samples.
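The comparative assay reduces to scaling the sample's delayed-neutron counts by those of a calibration standard of known fissile mass; a one-function sketch with invented count numbers is shown below.

```python
def fissile_mass(sample_counts, standard_counts, standard_mass_g):
    """Comparative assay: delayed-neutron counts scaled against a calibration standard
    measured under an identical irradiation and count sequence."""
    return standard_mass_g * (sample_counts / standard_counts)

# Hypothetical numbers: a 2.000 g fissile standard gave 151200 counts,
# the unknown sample gave 98750 counts in the same sequence
print(f"{fissile_mass(98750, 151200, 2.000):.3f} g fissile")
```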
Antle, John M.; Stoorvogel, Jetse J.; Valdivia, Roberto O.
2014-01-01
This article presents conceptual and empirical foundations for new parsimonious simulation models that are being used to assess future food and environmental security of farm populations. The conceptual framework integrates key features of the biophysical and economic processes on which the farming systems are based. The approach represents a methodological advance by coupling important behavioural processes, for example, self-selection in adaptive responses to technological and environmental change, with aggregate processes, such as changes in market supply and demand conditions or environmental conditions as climate. Suitable biophysical and economic data are a critical limiting factor in modelling these complex systems, particularly for the characterization of out-of-sample counterfactuals in ex ante analyses. Parsimonious, population-based simulation methods are described that exploit available observational, experimental, modelled and expert data. The analysis makes use of a new scenario design concept called representative agricultural pathways. A case study illustrates how these methods can be used to assess food and environmental security. The concluding section addresses generalizations of parametric forms and linkages of regional models to global models. PMID:24535388
Sample design effects in landscape genetics
Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.
2012-01-01
An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), numbers of alleles per locus (5 and 10), numbers of individuals sampled (10-300), and generational times after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration of sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
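For intuition about the sampling regimes, the snippet below draws random, systematic, and cluster samples from a simulated set of individual locations; the landscape size, sample sizes, and cluster radius are arbitrary illustrative choices, not the study's simulation settings.

```python
import numpy as np

rng = np.random.default_rng(42)
coords = rng.uniform(0, 100, size=(1000, 2))   # hypothetical individuals on a 100 x 100 landscape
n = 200

# Random regime: draw individuals uniformly without replacement
random_idx = rng.choice(len(coords), n, replace=False)

# Systematic regime: nearest individual to each node of a regular grid
grid = np.array([(x, y) for x in np.linspace(5, 95, 20) for y in np.linspace(5, 95, 10)])
systematic_idx = np.array([np.argmin(((coords - g) ** 2).sum(axis=1)) for g in grid])

# Cluster regime: all individuals within a radius of a few random centres
centres = coords[rng.choice(len(coords), 4, replace=False)]
cluster_idx = np.unique(np.concatenate(
    [np.where(((coords - c) ** 2).sum(axis=1) < 15 ** 2)[0] for c in centres]))

print(len(random_idx), len(systematic_idx), len(cluster_idx))
```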
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1986-01-01
Representatives of US Gypsum Co., Pfizer Inc., and the Environmental Protection Agency (EPA) testified on the Asbestos Information Clearinghouse Act of 1986 (H.R. 5078), which calls for an information center with samples of materials containing asbestos to simplify the task of identifying their characteristics. The goal of the bill is to make judiciary processes more efficient. EPA opposes the bill on the grounds that the rulemaking and collection of samples from building owners and asbestos manufacturers and processors for analysis would shift the inefficiency from the judiciary arena to EPA. EPA argued that the identification of defendants is a private-sector issue, and that the activities that would be assigned to EPA would be outside its mission. Pfizer supported the legislation, while the spokesman for US Gypsum pointed out that if the purpose is to remove those defendants from litigation who are not involved, extensive sampling would be a waste of time. Additional material submitted for the record follows the text of H.R. 5078 and the testimony of four witnesses.
Christian, Julie; Armitage, Christopher J; Abrams, Dominic
2007-09-01
This article reports findings from two studies (N = 88, N = 100) using Ajzen's theory of planned behaviour (TPB) to predict homeless people's uptake of service programmes. Study 1 was conducted with an opportunity sample whereas Study 2 employed a representative sample. Both studies provide support for the application of the TPB, and demonstrate that the effects of demographic characteristics on behaviour were mediated by TPB variables. The discussion focuses on the role of attitudinal and normative components in actual behaviour, and on the potential role of social normative processes and stigmatization in homeless people's uptake of services.
Asynchronous sampling of speech with some vocoder experimental results
NASA Technical Reports Server (NTRS)
Babcock, M. L.
1972-01-01
The method of asynchronously sampling speech is based upon the derivatives of the acoustical speech signal. The following results are apparent from experiments to date: (1) It is possible to represent speech by a string of pulses of uniform amplitude, where the only information contained in the string is the spacing of the pulses in time; (2) the string of pulses may be produced in a simple analog manner; (3) the first derivative of the original speech waveform is the most important for the encoding process; (4) the resulting pulse train can be utilized to control an acoustical signal production system to regenerate the intelligence of the original speech.
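A loose digital analogue of the encoding idea, emitting one pulse at each extremum of the waveform (i.e., at zero-crossings of the first derivative), is sketched below; the original work used analog circuitry, so this is only an illustration under that simplifying assumption.

```python
import numpy as np

def derivative_pulse_times(signal, fs):
    """Emit one pulse per zero-crossing of the first derivative (signal extrema).
    The encoded 'message' is just the spacing of the pulses in time."""
    d = np.diff(signal)                                   # discrete first derivative
    crossings = np.where(np.diff(np.sign(d)) != 0)[0] + 1
    return crossings / fs                                 # pulse times in seconds

# Hypothetical toy waveform: a 200 Hz tone sampled at 8 kHz
fs = 8000
t = np.arange(0, 0.02, 1 / fs)
x = np.sin(2 * np.pi * 200 * t)
print(derivative_pulse_times(x, fs))
```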
NASA Astrophysics Data System (ADS)
Robson, E. I.; Stevens, J. A.; Jenness, T.
2001-11-01
Calibrated data for 65 flat-spectrum extragalactic radio sources are presented at a wavelength of 850μm, covering a three-year period from 1997 April. The data, obtained from the James Clerk Maxwell Telescope using the SCUBA camera in pointing mode, were analysed using an automated pipeline process based on the Observatory Reduction and Acquisition Control-Data Reduction (orac-dr) system. This paper describes the techniques used to analyse and calibrate the data, and presents the data base of results along with a representative sample of the better-sampled light curves.
Kaufman, John A; Brown, Mary Jean; Umar-Tsafe, Nasir T; Adbullahi, Muhammad Bashir; Getso, Kabiru I; Kaita, Ibrahim M; Sule, Binta Bako; Ba'aba, Ahmed; Davis, Lora; Nguku, Patrick M; Sani-Gwarzo, Nasir
2016-09-01
In March 2010, Medecins Sans Frontieres/Doctors Without Borders detected an outbreak of acute lead poisoning in Zamfara State, northwestern Nigeria, linked to low-technology gold ore processing. The outbreak killed more than 400 children ≤5 years of age in the first half of 2010 and has left more than 2,000 children with permanent disabilities. The aims of this study were to estimate the statewide prevalence of children ≤5 years old with elevated blood lead levels (BLLs) in gold ore processing and non-ore-processing communities, and to identify factors associated with elevated blood lead levels in children. A representative, population-based study of ore processing and non-ore-processing villages was conducted throughout Zamfara in 2012. Blood samples from children, outdoor soil samples, indoor dust samples, and survey data on ore processing activities and other lead sources were collected from 383 children ≤5 years old in 383 family compounds across 56 villages. 17.2% of compounds reported that at least one member had processed ore in the preceding 12 months (95% confidence interval (CI): 9.7, 24.7). The prevalence of BLLs ≥10 µg/dL in children ≤5 years old was 38.2% (95% CI: 26.5, 51.4) in compounds with members who processed ore and 22.3% (95% CI: 17.8, 27.7) in compounds where no one processed ore. Ore processing activities were associated with higher lead concentrations in soil, dust, and blood samples. Other factors associated with elevated BLL were a child's age and sex, breastfeeding, drinking water from a piped tap, and exposure to eye cosmetics. Childhood lead poisoning is widespread in Zamfara State in both ore processing and non-ore-processing settings, although it is more prevalent in ore processing areas. Although most children's BLLs were below the recommended level for chelation therapy, environmental remediation and use of safer ore processing practices are needed to prevent further exposures. Consent was obtained, and the study protocol was approved by the US Centers for Disease Control Institutional Review Board-A and the National Health Research Ethics Committee of Nigeria. The authors declare no competing financial interests.
Survival and Risk Comparison of Campylobacter jejuni on Various Processed Meat Products
Hong, Soo Hyeon; Kim, Han Sol; Yoon, Ki Sun
2016-01-01
The objective of this study was to investigate the survival kinetics of Campylobacter jejuni on various processed meat products (dry-cured ham, round ham with/without sodium nitrite, garlic seasoned ham with/without sodium nitrite, and sausage without sodium nitrite). Additionally, a semi-quantitative risk assessment of C. jejuni on various processed meat products was conducted using FDA-iRISK 1.0. Processed meat products inoculated with 6.0 ± 0.5 log CFU/g of C. jejuni were vacuum packed and stored at 4, 10, 17, 24, 30, and 36 °C. Survival curves were fitted to the Weibull model to obtain the delta values of C. jejuni on the various processed meat products. The most rapid death of C. jejuni was observed on dry-cured ham, followed by sausage without sodium nitrite. The results of the semi-quantitative risk assessment indicate that dry-cured ham represented the lowest risk among all samples. C. jejuni on processed meats presented a greater risk at 4 °C than at 10 °C. The risk of ham was greater than the risk of sausage, regardless of type. Among all samples, the highest risk of C. jejuni was observed in round ham without sodium nitrite. Overall, our data indicate that the risk posed by C. jejuni on processed meat products is relatively low. PMID:27294947
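Fitting the Weibull survival model to obtain a delta value (the time to the first decimal reduction) can be sketched with scipy's curve_fit; the storage times and log reductions below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_reduction(t, delta, p):
    """Mafart-type Weibull survival model: log10(N/N0) = -(t/delta)**p."""
    return -(t / delta) ** p

# Hypothetical survival data: storage time (days) vs. log10 reduction of C. jejuni
t = np.array([0, 2, 4, 7, 10, 14], dtype=float)
logred = np.array([0.0, -0.6, -1.1, -1.9, -2.6, -3.4])

(delta, p), _ = curve_fit(weibull_log_reduction, t, logred, p0=(3.0, 1.0))
print(f"delta = {delta:.2f} days (time to first decimal reduction), p = {p:.2f}")
```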
Karraker, Amelia; Schoeni, Robert F; Cornman, Jennifer C
2015-11-01
Growing evidence suggests that psychological factors, such as conscientiousness and anger, as well as cognitive ability are related to mortality. Less is known about 1) the relative importance of each of these factors in predicting mortality, 2) through what social, economic, and behavioral mechanisms these factors influence mortality, and 3) how these processes unfold over long periods of time in nationally-representative samples. We use 35 years (1972-2007) of data from men (ages 20-40) in the Panel Study of Income Dynamics (PSID), a nationally representative sample in the United States, and discrete time event history analysis (n = 27,373 person-years) to examine the importance of measures of follow-through (a dimension of conscientiousness), anger, and cognitive ability in predicting mortality. We also assess the extent to which income, marriage, and smoking explain the relationship between psychological and cognitive factors with mortality. We find that while follow-through, anger, and cognitive ability are all associated with subsequent mortality when modeled separately, when they are modeled together and baseline demographic characteristics are controlled, only anger remains associated with mortality: being in the top quartile for anger is associated with a 1.57 fold increase in the risk of dying at follow-up compared with those in the bottom quartile. This relationship is robust to the inclusion of income, marriage, and smoking as mediators. Copyright © 2015 Elsevier Ltd. All rights reserved.
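Discrete-time event history analysis of person-year data amounts to a logistic regression on a long file with one row per person per year at risk; the self-contained sketch below simulates such a file, so the covariate names and effect sizes are hypothetical rather than PSID values.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-year file: one row per person per year at risk
rng = np.random.default_rng(7)
n = 2000
py = pd.DataFrame({
    "age": rng.integers(20, 60, n),
    "anger_q4": rng.integers(0, 2, n),   # indicator for top anger quartile
    "smoker": rng.integers(0, 2, n),
})
logit_p = -6 + 0.05 * (py["age"] - 20) + 0.45 * py["anger_q4"] + 0.5 * py["smoker"]
py["died"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Discrete-time hazard model: logistic regression on the person-year rows
fit = smf.logit("died ~ age + anger_q4 + smoker", data=py).fit(disp=0)
print(np.exp(fit.params))  # odds ratios for dying in a given year
```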
Do Hf isotopes in magmatic zircons represent those of their host rocks?
NASA Astrophysics Data System (ADS)
Wang, Di; Wang, Xiao-Lei; Cai, Yue; Goldstein, Steven L.; Yang, Tao
2018-04-01
The Lu-Hf isotopic system in zircon is a powerful and widely used geochemical tracer for studying the petrogenesis of magmatic rocks and crustal evolution, under the assumption that zircon Hf isotopes represent the initial Hf isotopes of their parental whole rock. However, this assumption may not always be valid. Disequilibrium partial melting of continental crust would preferentially melt out non-zircon minerals with high time-integrated Lu/Hf ratios and generate partial melts with Hf isotope compositions that are more radiogenic than those of their magma source. Dissolution experiments (with hotplate, bomb and sintering procedures) of zircon-bearing samples demonstrate this disequilibrium effect, where partial dissolution yielded variable and more radiogenic Hf isotope compositions than fully dissolved samples. A case study from the Neoproterozoic Jiuling batholith in southern China shows that about half of the investigated samples have decoupled Hf isotopes between zircons and the bulk rocks. This decoupling could reflect complex and prolonged magmatic processes, such as crustal assimilation, magma mixing, and disequilibrium melting, which are consistent with the wide temperature spectrum from ∼630 °C to ∼900 °C given by Ti-in-zircon thermometry. We suggest that magmatic zircons may only record the Hf isotopic composition of their surrounding melt during crystallization, and it is uncertain whether their Hf isotopic compositions can represent the primary Hf isotopic compositions of the bulk magmas. In this regard, using zircon Hf isotopic compositions to trace crustal evolution may be biased, since many of these compositions could originally derive from disequilibrium partial melts.
Increased instrument intelligence--can it reduce laboratory error?
Jekelis, Albert W
2005-01-01
Recent literature has focused on the reduction of laboratory errors and the potential impact on patient management. This study assessed the intelligent, automated preanalytical process-control abilities of newer-generation analyzers as compared with older analyzers, and the impact on error reduction. Three generations of immuno-chemistry analyzers were challenged with pooled human serum samples for a 3-week period. One of the three analyzers had an intelligent process of fluidics checks, including bubble detection. Bubbles can cause erroneous results due to incomplete sample aspiration. This variable was chosen because it is the most easily controlled sample defect that can be introduced. Traditionally, lab technicians have had to visually inspect each sample for the presence of bubbles. This is time consuming and introduces the possibility of human error. Instruments with bubble detection may be able to eliminate the human factor and reduce errors associated with the presence of bubbles. Specific samples were vortexed daily to introduce a visible quantity of bubbles, then immediately placed in the daily run. Errors were defined as a reported result greater than three standard deviations below the mean and associated with incomplete sample aspiration of the analyte on the individual analyzer. Three standard deviations represented the target limits of proficiency testing. The results of the assays were examined for accuracy and precision. Efficiency, measured as process throughput, was also measured to associate a cost factor and the potential impact of error detection on the overall process. Analyzer performance stratified according to the level of internal process control. The older analyzers without bubble detection reported 23 erroneous results. The newest analyzer, with bubble detection, reported one specimen incorrectly. The precision and accuracy of the nonvortexed specimens were excellent and acceptable for all three analyzers. No errors were found in the nonvortexed specimens. There were no significant differences in overall process time for any of the analyzers when tests were arranged in an optimal configuration. The analyzer with advanced fluidic intelligence demonstrated the greatest ability to appropriately deal with an incomplete aspiration by not processing and reporting a result for the sample. This study suggests that preanalytical process-control capabilities could reduce errors. By association, it implies that similar intelligent process controls could favorably impact the error rate and, in the case of this instrument, do so without negatively impacting process throughput. Other improvements may be realized as a result of having an intelligent error-detection process, including further reduction in misreported results, fewer repeats, less operator intervention, and less reagent waste.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeMott, Paul J.; Hill, Thomas C. J.
This campaign augmented measurements obtained via deployment of the Atmospheric Radiation Measurement (ARM) Climate Research Facility's ARM Mobile Facility (AMF) in the Marine ARM GPCI Investigation of Clouds (MAGIC) field campaign. The measurements, comprised of shipboard aerosol collections obtained during the five legs of the summer 2013 cruises, were sent for offline processing to measure ice nucleating particle (INP) number concentrations. The forty-three sample periods each represented, nominally, 24-hour segments during outbound and inbound transits of the Horizon Spirit. The samples were collected at locations between Los Angeles and Hawaii. Eight samples have been analyzed for immersion freezing temperature spectra thus far, using funding from other grants. Remaining samples are being frozen until support for further processing is obtained. Future analyses will investigate the inorganic/organic proportions of ice nuclei, in addition to determining the genetic composition of the overall biological community associated with INPs. Resulting correlations will be compared with other archived aerosol quantities, meteorological and ocean data (e.g., temperature, wind speed, sea surface temperature, etc.) and satellite ocean color products. These findings will ultimately aid in parameterizing oceanic (e.g., sea spray) INP emissions in regional and global scale models, when illustrating aerosol connections to cloud phases and properties. Independent future analyses of frozen filter samples, as proposed by collaborating investigators at the time of this report, will include single particle analyses of marine boundary layer aerosol compositions and morphology. The MAGIC-IN data are considered representative of the oligotrophic, low Chlorophyll-a (with the exception of near-shore) ocean regions which exist along the MAGIC transect. Current analyses suggest that INP numbers in the marine boundary layer over this region are typically low compared to existing measurements over marine areas and those collected in the laboratory as the result of realistic sea spray particle generation. These findings, along with separate studies, confirm the existence of highly variable emission sources for INP from oceans (though weaker than land-based emissions at modestly cooled temperatures).
Tucker, Jalie A; Simpson, Cathy A; Chandler, Susan D; Borch, Casey A; Davies, Susan L; Kerbawy, Shatomi J; Lewis, Terri H; Crawford, M Scott; Cheong, JeeWon; Michael, Max
2016-01-01
Emerging adulthood often entails heightened risk-taking with potential life-long consequences, and research on risk behaviors is needed to guide prevention programming, particularly in under-served and difficult to reach populations. This study evaluated the utility of Respondent Driven Sampling (RDS), a peer-driven methodology that corrects limitations of snowball sampling, to reach at-risk African American emerging adults from disadvantaged urban communities. Initial "seed" participants from the target group recruited peers, who then recruited their peers in an iterative process (110 males, 234 females; M age = 18.86 years). Structured field interviews assessed common health risk factors, including substance use, overweight/obesity, and sexual behaviors. Established gender-and age-related associations with risk factors were replicated, and sample risk profiles and prevalence estimates compared favorably with matched samples from representative U.S. national surveys. Findings supported the use of RDS as a sampling method and grassroots platform for research and prevention with community-dwelling risk groups.
Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana
2018-01-01
To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.
Natural occurrence of aflatoxins and ochratoxin A in processed spices marketed in Malaysia.
Ali, Norhayati; Hashim, Noor Hasani; Shuib, Nor Shifa
2015-01-01
The analysis of aflatoxins (B1, B2, G1 and G2) and ochratoxin A (OTA) was performed in processed spices marketed in Penang, Malaysia, using immunoaffinity columns and HPLC equipped with fluorescence detector (HPLC-FD). The processed powdered spices analysed include dried chilli, fennel, cumin, turmeric, black and white pepper, poppy seed, coriander, 'garam masala', and mixed spices for fish, meat and chicken curry. Two different studies were carried out. The limit of detection (LOD) was 0.01 ng g(-1) for each aflatoxin (AF) and 0.10 ng g(-1) for OTA (signal-to-noise ratio = 3:1). In the first study, 34 commercial processed spices analysed with a mean level, range and incidence of positive samples for total AF were 1.61 ng g(-1), 0.01-9.34 ng g(-1) and 85%, respectively, and for AFB1 were 1.38 ng g(-1), 0.01-7.68 ng g(-1) and 85%, respectively. The mean level, range and incidence of positive samples for OTA were 2.21 ng g(-1), 0.14-20.40 ng g(-1) and 79%, respectively. Natural co-occurrence of AF and OTA was found in 25 (74%) samples. In the second study of 24 commercial processed spices, the mean level, range and incidence of positive samples for total AF were 8.38 ng g(-1), 0.32-31.17 ng g(-1) and 88%, respectively, and for AFB1 were 7.31 ng g(-1), 0.32-28.43 ng g(-1) and 83%, respectively. Fifteen positive samples for total AF and two positive samples for OTA exceeded the permissible Malaysian limit of 5 ng g(-1). Contamination of both mycotoxins in spices may represent another route of exposure to consumers due to their frequent and prolonged consumption, as spices are common ingredients in popular dishes among Asian countries.
NASA Astrophysics Data System (ADS)
Moores, John E.; Francis, Raymond; Mader, Marianne; Osinski, G. R.; Barfoot, T.; Barry, N.; Basic, G.; Battler, M.; Beauchamp, M.; Blain, S.; Bondy, M.; Capitan, R.-D.; Chanou, A.; Clayton, J.; Cloutis, E.; Daly, M.; Dickinson, C.; Dong, H.; Flemming, R.; Furgale, P.; Gammel, J.; Gharfoor, N.; Hussein, M.; Grieve, R.; Henrys, H.; Jaziobedski, P.; Lambert, A.; Leung, K.; Marion, C.; McCullough, E.; McManus, C.; Neish, C. D.; Ng, H. K.; Ozaruk, A.; Pickersgill, A.; Preston, L. J.; Redman, D.; Sapers, H.; Shankar, B.; Singleton, A.; Souders, K.; Stenning, B.; Stooke, P.; Sylvester, P.; Tornabene, L.
2012-12-01
A Mission Control Architecture is presented for a Robotic Lunar Sample Return Mission, building upon the experience of the landed missions of the NASA Mars Exploration Program. This architecture consists of four separate processes working in parallel at Mission Control and achieving buy-in for plans sequentially instead of simultaneously from all members of the team. These four processes were: Science Processing, Science Interpretation, Planning, and Mission Evaluation. Science Processing was responsible for creating products from data downlinked from the field and was organized by instrument. Science Interpretation was responsible for determining whether or not science goals were being met and what measurements needed to be taken to satisfy these goals. These were assisted by the Planning process, responsible for scheduling and sequencing observations, and by the Evaluation process, which fostered inter-process communications, reporting and documentation. This organization is advantageous for its flexibility, as shown by the ability of the structure to produce plans for the rover every two hours, by the rapidity with which Mission Control team members may be trained, and by the relatively small size of each individual team. This architecture was tested in an analogue mission to the Sudbury impact structure from June 6-17, 2011. A rover was used which was capable of developing a network of locations that could be revisited using a teach and repeat method. This allowed the science team to process several different outcrops in parallel, downselecting at each stage to ensure that the samples selected for caching were the most representative of the site. Over the course of 10 days, 18 rock samples were collected from 5 different outcrops, 182 individual field activities - such as roving or acquiring an image mosaic or other data product - were completed within 43 command cycles, and the rover travelled over 2200 m. Data transfer capacity during communications passes was filled to 74%. Sample triage was simulated to allow down-selection to 1 kg of material for return to Earth.
Thornber, Carl R.; Budahn, James R.; Ridley, W. Ian; Unruh, Daniel M.
2003-01-01
This open-file report serves as a repository for geochemical data referred to in U.S. Geological Survey Professional Paper 1676 (Heliker, Swanson, and Takahashi, eds., 2003), which includes multidisciplinary research papers pertaining to the first twenty years of Puu Oo Kupaianaha eruption activity. Details of eruption characteristics and nomenclature are provided in the introductory chapter of that volume (Heliker and Mattox, 2003). Geochemical relations of this data are depicted and interpreted by Thornber (2003), Thornber and others (2003a) and Thornber (2001). This report supplements Thornber and others (2003b) in which whole-rock and glass major-element data on ~1000 near-vent lava samples collected during the 1983 to 2001 eruptive interval of Kilauea Volcano, Hawai'i, are presented. Herein, we present whole-rock trace element compositions of 85 representative samples collected from January 1983 to May 2001; glass trace-element compositions of 39 Pele’s Tear (tephra) samples collected from September 1995 to September 1996, and whole-rock Nd, Sr and Pb isotopic analyses of 10 representative samples collected from September 1983 to September 1993. Thornber and others (2003b) provide a specific record of sample characteristics, location, etc., for each of the samples reported here. Spreadsheets of both reports may be integrated and sorted based upon time of formation or sample numbers. General information pertaining to the selectivity and petrologic significance of this sample suite is presented by Thornber and others (2003b). As justified in that report, this select suite of time-constrained geochemical data is suitable for constructing petrologic models of pre-eruptive magmatic processes associated with prolonged rift zone eruption of Hawaiian shield volcanoes.
Ferrero, Giulio; Cordero, Francesca; Tarallo, Sonia; Arigoni, Maddalena; Riccardo, Federica; Gallo, Gaetano; Ronco, Guglielmo; Allasia, Marco; Kulkarni, Neha; Matullo, Giuseppe; Vineis, Paolo; Calogero, Raffaele A; Pardini, Barbara; Naccarati, Alessio
2018-01-09
The role of non-coding RNAs in different biological processes and diseases is continuously expanding. Next-generation sequencing, together with the parallel improvement of bioinformatics analyses, allows the accurate detection and quantification of an increasing number of RNA species. With the aim of exploring new potential biomarkers for disease classification, a clear overview of the expression levels of common/unique small RNA species among different biospecimens is necessary. However, except for miRNAs in plasma, there are no substantial indications about the pattern of expression of various small RNAs in multiple specimens among healthy humans. By analysing small RNA-sequencing data from 243 samples, we have identified and compared the most abundantly and uniformly expressed miRNAs and non-miRNA species of a size compatible with the library preparation in four different specimens (plasma exosomes, stool, urine, and cervical scrapes). Eleven miRNAs were commonly detected among all different specimens while 231 miRNAs were globally unique across them. Classification analysis using these miRNAs provided an accuracy of 99.6% in recognizing the sample types. piRNAs and tRNAs were the most represented non-miRNA small RNAs detected in all specimen types that were analysed, particularly in urine samples. With the present data, the most uniformly expressed small RNAs in each sample type were also identified. A signature of small RNAs for each specimen could represent a reference gene set in validation studies by RT-qPCR. Overall, the data reported hereby provide an insight into the constitution of the human miRNome and of other small non-coding RNAs in various specimens from healthy individuals.
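The specimen-classification step can be sketched as cross-validated classification on a samples-by-miRNA expression matrix; the random forest below runs on placeholder data (with random values the score sits near chance, whereas the authors report 99.6% accuracy on their real matrix), and the classifier choice is an assumption of this sketch.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder matrix: 243 samples x 11 shared miRNAs (log-normalised counts),
# labels are the four specimen types
rng = np.random.default_rng(3)
X = rng.normal(size=(243, 11))
y = rng.integers(0, 4, size=243)  # 0=plasma exosomes, 1=stool, 2=urine, 3=cervical scrape

clf = RandomForestClassifier(n_estimators=200, random_state=3)
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
```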
Orani, Anna Maria; Barats, Aurélie; Zitte, Wendy; Morrow, Christine; Thomas, Olivier P
2018-06-01
The bioaccumulation and biotransformation of arsenic (As) were studied in six representative marine sponges from the French Mediterranean and Irish Atlantic coasts. Methodologies were carefully optimized on one of the species, Haliclona fulva, for two critical steps: sample mineralization for total As analysis by ICP-MS, and extraction of As species for HPLC-ICP-MS analysis. During the optimization, extractions performed with 0.6 mol L-1 H3PO4 were shown to be the most efficient. An extraction recovery of 81% was obtained, which represents the best result obtained so far for sponge samples. Total As analyses and As speciation were performed on certified reference materials, confirming the measurement quality both during sample preparation and analysis. Additionally, this study represents an environmental survey demonstrating a high variability of total As concentrations among the different species, probably related to different physiological or microbial features. As speciation results showed the predominance of arsenobetaine (AsB) regardless of the sponge species, as well as the occurrence of low amounts of dimethylarsinic acid (DMA), arsenate (As(+V)), and unknown As species in some samples. The process responsible for As transformation in sponges is most likely related to the sponge's own metabolism or to the action of symbiont organisms. AsB is thought to be involved in protection against osmotic stress. This study demonstrates the ability of sponges to accumulate and biotransform As, proving that sponges are relevant bio-monitors for As contamination in the marine environment, and potential tools in environmental bio-remediation. Copyright © 2018 Elsevier Ltd. All rights reserved.
Minor soil erosion contribution to denudation in Central Nepal Himalaya.
NASA Astrophysics Data System (ADS)
Morin, Guillaume; France-Lanord, Christian; Gallo, Florian; Lupker, Maarten; Lavé, Jérôme; Gajurel, Ananta
2013-04-01
In order to decipher river sediment provenance in terms of erosion processes, we characterized the geochemical compositions of hillslope material from soils, glaciers and landslides, and compared them to river sediments. We focused our study on two south-flank Himalayan catchments: (1) the Khudi khola, an example of a small High Himalayan catchment (150 km2) undergoing severe precipitation and rapid erosion (≈ 3.5 mm/yr [A]), and (2) the Narayani-Gandak Transhimalayan basin (52000 km2), which drains the whole of central Nepal. To address the question, systematic sampling of hillslope material from different erosion processes was conducted in the basins. River sediment sampling included daily sampling during the 2010 monsoon at two stations and bank samples in different parts of the basins. Source rock, soil and landslide samples are compared to river sediments using mobile to immobile element ratios, complemented by hydration degree (H2O+) analysis [2]. The data show that soils are clearly depleted in the mobile elements Na, K and Ca, and are highly hydrated compared to source rocks and other erosion products. In the Khudi basin, the contrast between soil and river sediment signatures allows us to estimate that soil erosion represents less than 5% of the total sediment exported by the river. Most of the river sediment therefore derives from landslide inputs and, to a lesser extent, from barren high-elevation sub-basins. This is further consistent with the direct observation that, during the monsoon, significant tributaries of the Khudi river do not export sediments. Considering that active landslide zones represent less than 0.5% of the total watershed area, this implies that the erosion distribution is highly heterogeneous. The landslide erosion rate could reach more than 50 cm/yr in the landslide areas. Sediments of the Narayani river are not significantly different from those of the Khudi in spite of the more diverse geomorphology and larger area of the basin. Only H2O+ and total organic carbon concentrations normalised to Al/Si ratios show distinctly higher values. This suggests that the contribution of soil erosion is higher than in the Khudi basin. Nevertheless, soil erosion remains a minor source of sediments, implying that physical processes such as landslides and glaciers dominate the erosional flux. In spite of extensive deforestation and agricultural land use [B], soil erosion does not represent an important source of sediments in the Nepal Himalaya. [A] Gabet et al. (2008) Earth and Planetary Science Letters 267, 482-494. [B] Gardner et al. (2003) Applied Geography 23, 23-45.
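The soil-contribution estimate rests on a contrast between the geochemical signatures of soil and of unweathered source material. A minimal sketch of such a two-endmember mass balance is given below; the mobile/immobile ratio values are invented for illustration and are not data from this study, and linear mixing of the ratio is a simplifying assumption.

```python
# Two-endmember mixing sketch for bounding the soil contribution to river
# sediment, assuming linear mixing of a mobile/immobile element ratio (e.g.
# Na/Si). All numbers below are hypothetical.
def soil_fraction(r_river, r_soil, r_rock):
    """Fraction of sediment derived from soil under linear two-endmember mixing."""
    return (r_river - r_rock) / (r_soil - r_rock)

r_rock = 0.12    # hypothetical Na/Si of fresh bedrock / landslide material
r_soil = 0.04    # hypothetical Na/Si of strongly leached soil
r_river = 0.116  # hypothetical Na/Si of exported river sediment

f = soil_fraction(r_river, r_soil, r_rock)
print(f"Soil fraction of exported sediment: {f:.1%}")   # ~5% with these toy values
```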
Family level phylogenies reveal modes of macroevolution in RNA viruses.
Kitchen, Andrew; Shackelton, Laura A; Holmes, Edward C
2011-01-04
Despite advances in understanding the patterns and processes of microevolution in RNA viruses, little is known about the determinants of viral diversification at the macroevolutionary scale. In particular, the processes by which viral lineages assigned as different "species" are generated remain largely uncharacterized. To address this issue, we use a robust phylogenetic approach to analyze patterns of lineage diversification in five representative families of RNA viruses. We ask whether the process of lineage diversification primarily occurs when viruses infect new host species, either through cross-species transmission or codivergence, which are defined here as analogous to allopatric speciation in animals, or by acquiring new niches within the same host species, analogous to sympatric speciation. By mapping probable primary host species onto family level viral phylogenies, we reveal a strong clustering among viral lineages that infect groups of closely related host species. Although this is consistent with lineage diversification within individual hosts, we argue that this pattern more likely represents strong biases in our knowledge of viral biodiversity, because we also find that better-sampled human viruses rarely cluster together. Hence, although closely related viruses tend to infect related host species, it is unlikely that they often infect the same host species, such that evolutionary constraints hinder lineage diversification within individual host species. We conclude that the colonization of new but related host species may represent the principal mode of macroevolution in RNA viruses.
General method of pattern classification using the two-domain theory
NASA Technical Reports Server (NTRS)
Rorvig, Mark E. (Inventor)
1993-01-01
Human beings judge patterns (such as images) by complex mental processes, some of which may not be known, while computing machines extract features. By representing the human judgements with simple measurements and reducing them and the machine extracted features to a common metric space and fitting them by regression, the judgements of human experts rendered on a sample of patterns may be imposed on a pattern population to provide automatic classification.
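A minimal sketch of the two-domain idea follows, assuming machine-extracted feature vectors and scalar human judgements for a training sample of patterns; the synthetic data and the choice of ordinary least squares are illustrative assumptions, not the patented method itself.

```python
# Fit a regression from machine-extracted features to human judgements on a
# judged sample, then impose the fitted judgement on an unjudged population.
# All data and dimensions here are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
features_sample = rng.normal(size=(60, 8))            # machine-extracted features, judged sample
human_scores = features_sample @ rng.normal(size=8) + rng.normal(scale=0.1, size=60)

model = LinearRegression().fit(features_sample, human_scores)

features_population = rng.normal(size=(1000, 8))      # unjudged pattern population
predicted_judgement = model.predict(features_population)
labels = predicted_judgement > np.median(predicted_judgement)   # crude two-class split
print(int(labels.sum()), "patterns assigned to the higher-judgement class")
```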
ERIC Educational Resources Information Center
Simsek, Ali; Nuss, Shirley
2010-01-01
The purpose of this study was to investigate how elementary students can learn about the culture of another country and how technology can play a role in this process. The sample of the study included 135 fifth grade students from the United States and Turkey. Initial knowledge and information sources of students were assessed at the beginning of…
EDRN Standard Operating Procedures (SOP) — EDRN Public Portal
The NCI’s Early Detection Research Network is developing a number of standard operating procedures for assays, methods, and protocols for the collection and processing of biological samples, and other reference materials, to assist investigators in conducting experiments in a consistent, reliable manner. These SOPs are established by the investigators of the Early Detection Research Network to maintain consistency throughout the Network. These SOPs neither represent a consensus nor constitute recommendations of the NCI.
Gannoun, Abdelmouhcine; Boyet, Maud; Rizo, Hanika; El Goresy, Ahmed
2011-05-10
The short-lived 146Sm-142Nd chronometer (T1/2 = 103 Ma) is used to constrain the early silicate evolution of planetary bodies. The composition of bulk terrestrial planets is considered to be similar to that of primitive chondrites, which represent the building blocks of rocky planets. However, for many elements chondrites preserve small isotope differences. In this case it is not always clear to what extent these variations reflect isotope heterogeneity of the protosolar nebula rather than being produced by the decay of parent isotopes. Here we present Sm-Nd isotope data measured in a comprehensive suite of enstatite chondrites (EC). The EC preserve 142Nd/144Nd ratios that range from those of ordinary chondrites to values similar to terrestrial samples. The EC having terrestrial 142Nd/144Nd ratios are also characterized by small excesses in 144Sm, which is a pure p-process nuclide. The correlation between 144Sm and 142Nd for chondrites may indicate a heterogeneous distribution in the solar nebula of p-process matter synthesized in supernovae. However, to explain the difference in 142Nd/144Nd ratios, a 20% p-process contribution to 142Nd is required, at odds with the value of 4% currently proposed in stellar models. This study highlights the necessity of obtaining high-precision 144Sm measurements to properly interpret measured 142Nd signatures. Another explanation could be that the chondrites sampled material formed during different pulses in the lifetime of asymptotic giant branch stars. In that case the isotope signature measured in presolar SiC grains would not represent the unique s-process signature of the material present in the solar nebula during accretion.
NASA Astrophysics Data System (ADS)
Ferraretto, Daniele; Heal, Kate
2017-04-01
Temperate forest ecosystems are significant sinks for nitrogen deposition (Ndep) yielding benefits such as protection of waterbodies from eutrophication and enhanced sequestration of atmospheric CO2. Previous studies have shown evidence of biological nitrification and Ndep processing and retention in forest canopies. However, this was reported only at sites with high environmental or experimentally enhanced rates of Ndep (~18 kg N ha-1 y-1) and has not yet been demonstrated in low Ndep environments. We have used bulk field hydrochemical measurements and labelled isotopic experiments to assess canopy processing in a lower Ndep environment (~7 kg N ha-1 year-1) at a Sitka spruce plantation in Perthshire, Scotland, representing the dominant tree species (24%) in woodlands in Great Britain. Analysis of 4.5 years of measured N fluxes in rainfall (RF) and fogwater onto the canopy and throughfall (TF) and stemflow (SF) below the canopy suggests strong transformation and uptake of Ndep in the forest canopy. Annual canopy Ndep uptake was ~4.7 kg N ha-1 year-1, representing 60-76% of annual Ndep. To validate these plot-scale results and track N uptake within the forest canopy in different seasons, double 15N-labelled NH4NO3 (98%) solution was sprayed in summer and winter onto the canopy of three trees at the measurement site. RF, TF and SF samples have been collected and analysed for 15NH4 and 15NO3. Comparing the amount of labelled N recovered under the sample trees with the measured δ15N signal is expected to provide further evidence of the role of forest canopies in actively processing and retaining atmospheric N deposition.
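The plot-scale uptake figure follows from a simple canopy budget: deposition onto the canopy (rainfall plus fogwater) minus the fluxes measured below it (throughfall plus stemflow). The sketch below illustrates that arithmetic; the individual flux values are hypothetical and only the approximate totals quoted above (~7 and ~4.7 kg N ha-1 yr-1) come from the abstract.

```python
# Canopy nitrogen budget sketch: uptake = deposition onto the canopy minus
# fluxes below it. Individual flux values are illustrative assumptions.
rainfall_n = 5.5      # kg N ha-1 yr-1 (hypothetical split of ~7 total)
fogwater_n = 1.5
throughfall_n = 2.0
stemflow_n = 0.3

deposition = rainfall_n + fogwater_n
canopy_uptake = deposition - (throughfall_n + stemflow_n)
print(f"Canopy N uptake: {canopy_uptake:.1f} kg N ha-1 yr-1 "
      f"({canopy_uptake / deposition:.0%} of deposition)")
```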
NASA Astrophysics Data System (ADS)
Rahbaralam, Maryam; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier
2015-12-01
Random walk particle tracking methods are a computationally efficient family of methods to solve reactive transport problems. While the number of particles in most realistic applications is on the order of 10^6-10^9, the number of reactive molecules even in dilute systems might be on the order of fractions of the Avogadro number. Thus, each particle actually represents a group of potentially reactive molecules. The use of a low number of particles may result not only in loss of accuracy, but may also lead to an improper reproduction of the mixing process, limited by diffusion. Recent works have used this effect as a proxy to model incomplete mixing in porous media. In this work, we propose using a Kernel Density Estimation (KDE) of the concentrations that allows obtaining the expected results for a well-mixed solution with a limited number of particles. The idea consists of treating each particle as a sample drawn from the pool of molecules that it represents; this way, the actual location of a tracked particle is seen as a sample drawn from the density function of the location of the molecules represented by that particle, rigorously represented by a kernel density function. The probability of reaction can be obtained by combining the kernels associated with two potentially reactive particles. We demonstrate that the observed deviation in the reaction vs. time curves in numerical experiments reported in the literature could be attributed to the statistical method used to reconstruct concentrations (fixed particle support) from discrete particle distributions, and not to the occurrence of true incomplete mixing. We further explore the evolution of the kernel size with time, linking it to the diffusion process. Our results show that KDEs are powerful tools to improve computational efficiency and robustness in reactive transport simulations, and indicate that incomplete mixing in dilute systems should be modeled based on alternative mechanistic models and not on a limited number of particles.
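A minimal sketch of the kernel idea is shown below, under the simplifying assumptions of one spatial dimension, Gaussian kernels of equal bandwidth, and an illustrative rate constant, particle mass and time step; it is not the authors' implementation.

```python
# Each particle is treated as a Gaussian density of the molecules it represents;
# the co-location density of an A-B pair is the convolution of the two kernels
# evaluated at the particle separation, which scales the reaction probability.
import numpy as np

def pair_reaction_prob(xa, xb, h, k_f, m_b, dt):
    """Probability that particle A reacts with particle B in one time step (1D).

    The convolution of two Gaussian kernels of bandwidth h is a Gaussian with
    variance 2*h**2, evaluated here at the separation xa - xb."""
    d = xa - xb
    co_location = np.exp(-d**2 / (4.0 * h**2)) / np.sqrt(4.0 * np.pi * h**2)
    return min(1.0, k_f * m_b * co_location * dt)

p = pair_reaction_prob(xa=0.0, xb=0.02, h=0.05, k_f=1.0, m_b=1e-3, dt=0.1)
print(f"Reaction probability for this pair and step: {p:.3e}")
```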
Three dimensional mapping of strontium in bone by dual energy K-edge subtraction imaging
NASA Astrophysics Data System (ADS)
Cooper, D. M. L.; Chapman, L. D.; Carter, Y.; Wu, Y.; Panahifar, A.; Britz, H. M.; Bewer, B.; Zhouping, W.; Duke, M. J. M.; Doschak, M.
2012-09-01
The bones of many terrestrial vertebrates, including humans, are continually altered through an internal process of turnover known as remodeling. This process plays a central role in bone adaptation and disease. The uptake of fluorescent tetracyclines within bone mineral is widely exploited as a means of tracking new tissue formation. While investigation of bone microarchitecture has undergone a dimensional shift from 2D to 3D in recent years, we lack a 3D equivalent to fluorescent labeling. In the current study we demonstrate the ability of synchrotron radiation dual energy K-edge subtraction (KES) imaging to map the 3D distribution of elemental strontium within rat vertebral samples. This approach has great potential for ex vivo analysis of preclinical models and human tissue samples. KES also represents a powerful tool for investigating the pharmacokinetics of strontium-based drugs recently approved in many countries around the globe for the treatment of osteoporosis.
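As a hedged sketch of the dual-energy K-edge subtraction arithmetic, the example below performs the standard two-material decomposition from measurements just above and below the Sr K-edge; the attenuation coefficients and log-transmission values are placeholders, not tabulated or experimental values, and the real reconstruction involves far more than this single 2x2 solve.

```python
# Two-material decomposition from projections above/below the Sr K-edge:
# ln(I0/I) at each energy is a linear combination of the Sr and matrix areal
# densities, so two energies give a 2x2 linear system. Values are placeholders.
import numpy as np

mu_sr = np.array([18.0, 3.0])       # (mu/rho) of Sr above / below the edge, cm^2/g (hypothetical)
mu_matrix = np.array([0.35, 0.36])  # (mu/rho) of the bone matrix at the same energies (hypothetical)
log_transmission = np.array([1.10, 0.95])   # measured ln(I0/I) above / below edge (hypothetical)

A = np.column_stack([mu_sr, mu_matrix])     # one row per energy
areal_density = np.linalg.solve(A, log_transmission)   # [rho*t of Sr, rho*t of matrix], g/cm^2
print(f"Sr areal density: {areal_density[0]:.4f} g/cm^2")
```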
Soil sedimentology at Gusev Crater from Columbia Memorial Station to Winter Haven
Cabrol, N.A.; Herkenhoff, K. E.; Greeley, R.; Grin, E.A.; Schroder, C.; d'Uston, C.; Weitz, C.; Yingst, R.A.; Cohen, B. A.; Moore, J.; Knudson, A.; Franklin, B.; Anderson, R.C.; Li, R.
2008-01-01
A total of 3140 individual particles were examined in 31 soils along Spirit's traverse. Their size, shape, and texture were quantified and classified. They represent a unique record of 3 years of sedimentologic exploration from landing to sol 1085, covering the Plains Unit to Winter Haven, where Spirit spent the Martian winter of 2006. Samples in the Plains Unit and Columbia Hills appear to reflect contrasting textural domains. One is heterogeneous, with a continuum of angular-to-round particles of fine sand to pebble sizes that are generally dust covered and locally cemented in place. The second shows the effect of a dominant and ongoing dynamic aeolian process that redistributes a uniform population of medium-size sand. The texture of particles observed in the samples at Gusev Crater results from volcanic, aeolian, impact, and water-related processes. Copyright 2008 by the American Geophysical Union.
Rapid Radiochemical Methods for Asphalt Paving Material ...
Technical Brief: Validated rapid radiochemical methods for alpha and beta emitters in solid matrices that are commonly encountered in urban environments were previously unavailable for public use by responding laboratories. A lack of tested rapid methods would delay the quick determination of contamination levels and the assessment of acceptable site-specific exposure levels. Of special concern are matrices with rough and porous surfaces, which allow the movement of radioactive material deep into the building material, making it difficult to detect. This research focuses on methods that address the preparation, radiochemical separation, and analysis of asphalt paving materials and asphalt roofing shingles. These matrices, common to outdoor environments, challenge the capability and capacity of very experienced radiochemistry laboratories. Generally, routine sample preparation and dissolution techniques produce liquid samples (representative of the original sample material) that can be processed using available radiochemical methods. The asphalt materials are especially difficult because they do not readily lend themselves to these routine sample preparation and dissolution techniques. The HSRP and ORIA coordinate radiological reference laboratory priorities and activities in conjunction with HSRP’s Partner Process. As part of the collaboration, the HSRP worked with ORIA to publish rapid radioanalytical methods for selected radionuclides in building material matrices.
Microbiological Quality Assessment of Game Meats at Retail in Japan.
Asakura, Hiroshi; Kawase, Jun; Ikeda, Tetsuya; Honda, Mioko; Sasaki, Yoshimasa; Uema, Masashi; Kabeya, Hidenori; Sugiyama, Hiromu; Igimi, Shizunobu; Takai, Shinji
2017-12-01
In this study, we examined the prevalence of Shiga toxin-producing Escherichia coli and Salmonella spp. and the distribution of indicator bacteria in 248 samples of game meats (120 venison and 128 wild boar) retailed between November 2015 and March 2016 in Japan. No Salmonella spp. were detected in any of the samples, whereas Shiga toxin-producing Escherichia coli serotype OUT:H25 (stx2d+, eae-) was isolated from one deer meat sample, suggesting a possible source of human infection. Plate count assays indicated a greater prevalence of coliforms and E. coli in wild boar meat than in venison, whereas prevalence varied more among processing facilities than between animal species. 16S rRNA ion semiconductor sequencing analysis of 24 representative samples revealed that the abundances of Acinetobacter and Arthrobacter spp. correlated significantly with the prevalence of E. coli, and quantitative PCR analyses in combination with selective plate count assays verified these correlations. To our knowledge, this is the first report to characterize the diversity of microorganisms in game meats at retail in Japan, together with identification of the dominant microbiota. Our data suggest the necessity of bottom-up hygienic assessment of slaughtering and processing facilities to improve microbiological safety.
Wickrama, K A S; Elder, Glen H; Todd Abraham, W
2007-01-01
This study's objectives are to: investigate potential additive and multiplicative influences of rurality and race/ethnicity on chronic physical illness in a nationally representative sample of youth; and examine intra-Latino processes using a Latino sub-sample. Specifically, we examine how rurality and individual psychosocial processes reflected by acculturation proxies (generational status and use of the English language at home) are linked to chronic physical illness among Latino youth. Finally, we examine whether these associations and the levels of chronic illness differ across Latino subgroups. Logistic-normal (binomial) modeling analyses examine multilevel influences on physical health using longitudinal data from a nationally representative sample (N = 13,905) of white, African American, Latino, Asian, and Native American adolescents between the ages of 12 and 19 participating in the National Longitudinal Study of Adolescent Health. Prevalence rates of certain chronic illnesses (obesity, asthma, and high cholesterol) among Latino adolescents exceed the rates for the same illnesses among white adolescents. Comparisons between rural and non-rural youth reveal a rurality disadvantage in the likelihood of any chronic illness among Latino, Asian, and Native American youth that is not evident among whites or African Americans. Among Latino youth (N = 2,505), Mexican Americans show a lower risk of any chronic illness compared to other Latino groups. However, third-generation Latinos and those who primarily speak English at home experience a higher risk of any chronic illness than do those of first- or second-generation status, with amplification of the risk linked to English use at home among Latino youth living in rural areas.
Instance-based learning: integrating sampling and repeated decisions from experience.
Gonzalez, Cleotilde; Dutt, Varun
2011-10-01
In decisions from experience, there are 2 experimental paradigms: sampling and repeated-choice. In the sampling paradigm, participants sample between 2 options as many times as they want (i.e., the stopping point is variable), observe the outcome with no real consequences each time, and finally select 1 of the 2 options that cause them to earn or lose money. In the repeated-choice paradigm, participants select 1 of the 2 options for a fixed number of times and receive immediate outcome feedback that affects their earnings. These 2 experimental paradigms have been studied independently, and different cognitive processes have often been assumed to take place in each, as represented in widely diverse computational models. We demonstrate that behavior in these 2 paradigms relies upon common cognitive processes proposed by the instance-based learning theory (IBLT; Gonzalez, Lerch, & Lebiere, 2003) and that the stopping point is the only difference between the 2 paradigms. A single cognitive model based on IBLT (with an added stopping point rule in the sampling paradigm) captures human choices and predicts the sequence of choice selections across both paradigms. We integrate the paradigms through quantitative model comparison, where IBLT outperforms the best models created for each paradigm separately. We discuss the implications for the psychology of decision making. © 2011 American Psychological Association
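A stripped-down instance-based learning sketch for the repeated-choice paradigm is given below; the decay law, the optimistic prepopulated instance (which drives early exploration) and the toy gamble are simplifying assumptions and do not reproduce the full IBL model described above.

```python
# Instances (observed outcomes) are stored per option; a blended value is a
# recency-weighted average; choice takes the highest blended value.
import random

def blended_value(instances, decay=0.5):
    """Recency-weighted average of stored outcomes (simplified activation weighting)."""
    weights = [(len(instances) - i) ** -decay for i in range(len(instances))]
    return sum(w * o for w, o in zip(weights, instances)) / sum(weights)

def play(option):
    """Hypothetical binary gamble: a safe payoff vs. a risky one."""
    return {"safe": 3.0, "risky": 4.0 if random.random() < 0.8 else 0.0}[option]

memory = {"safe": [10.0], "risky": [10.0]}    # one optimistic prepopulated instance each
for _ in range(50):                           # repeated-choice paradigm: 50 consequential trials
    choice = max(memory, key=lambda o: blended_value(memory[o]))
    memory[choice].append(play(choice))

print({option: round(blended_value(memory[option]), 2) for option in memory})
```

In the sampling paradigm, the same loop would run without consequences until a stopping rule fires, followed by a single consequential choice, which is the only structural difference the theory posits between the two paradigms.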
Meso-Scale Wetting of Paper Towels
NASA Astrophysics Data System (ADS)
Abedsoltan, Hossein
In this study, a new experimental approach is proposed to investigate the absorption properties of selected retail paper towels. The samples were selected from two important manufacturing processes: conventional wet pressing (CWP), considered value products, and through-air drying (TAD), considered high-end or premium products. The tested liquids were water, decane, dodecane, and tetradecane, with total volumes in the microliter range. The method involves point-source injection of liquid at different volumetric flow rates in the nanoliter-per-second range. The local site for injection was chosen arbitrarily on the sample surface. The absorption process was monitored and recorded as the liquid advanced, using two distinct imaging methods, infrared imaging and optical imaging. The microscopic images were analyzed to calculate the wetted regions during the absorption test, and absorption diagrams were generated. These absorption diagrams were dissected to illustrate the absorption phenomenon and the absorption properties of the samples. The local (regional) absorption rates were computed for Mardi Gras and Bounty Basic, as the representative samples for CWP and TAD respectively, in order to compare them with the absorption capacity of these two samples. The absorption capacity was then chosen as an index factor to compare the absorption properties of all the tested paper towels.
Szilágyi, Tamás Gábor; Vecseri, Beáta Hegyesné; Kiss, Zsuzsanna; Hajba, László; Guttman, András
2018-08-01
Determination of the oligosaccharide composition of different wort samples is important to monitor their change during the brewing process with different yeast types. In our work, the concentrations of fermentable and non-fermentable sugars were monitored by capillary electrophoresis to observe the effect of two different types of yeast, Saccharomyces pastorianus and Saccharomycodes ludwigii. The former first ferments the monosaccharides and then the higher sugar oligomers, such as maltose and maltotriose, to ethanol, while the latter fully ferments the monosaccharides but ferments only a very low percentage of the oligosaccharides. Therefore, breweries use Saccharomycodes ludwigii to produce beers with low alcohol content. The CE-LIF traces of the wort samples represented unique oligosaccharide signatures. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kasina, Monika; Morozova, Daria; Pellizzari, Linda; Würdemann, Hilke
2013-04-01
Microorganisms are very effective geochemical catalysts and may significantly influence the process of CO2 storage. The goal of this study is to characterize the interactions between minerals and microorganisms during exposure to CO2 in a long-term experiment in high-pressure vessels, in order to better understand the influence of biological processes on the composition of the reservoir sandstones and the long-term stability of CO2 storage. The natural gas reservoir proposed for CO2 storage is characterized by high salinity (up to 420 g/l) and temperatures around 130°C at a depth of approximately 3.5 km. The microbial community of the reservoir fluid samples was dominated by various H2-oxidising, thiosulfate-oxidising and biocorrosive thermophilic bacteria, as well as microorganisms similar to representatives from other deep environments that have not previously been cultivated. The cells were attached to particles and were difficult to detect because of low cell numbers (Morozova et al., 2011). For the long-term experiments, autoclaved rock core samples from the core deposit were crushed, milled to a size of 0.5 mm, and incubated with fresh reservoir fluids as an inoculum of indigenous microorganisms in a N2/CH4/H2 atmosphere in high-pressure vessels at a temperature of 80°C and a pressure of 40 bar. Incubation was performed at a lower temperature than in situ in order to favor the growth of dormant microorganisms. After three months of incubation, the samples were exposed to high CO2 concentrations by injecting CO2 into the vessels. Rock and fluid material was sampled 10 and 21 months after the start of the experiment. Mineralogical analyses performed using XRD and SEM-EDS showed that the main mineral components are quartz, feldspars, dolomite, anhydrite and calcite. Chemical fluid analyses using ICP-MS and ICP-OES showed that, after CO2 exposure, an increased Si4+ content in the fluid was noted at the first sampling (ca. 25% relative), whereas at the second sampling it had decreased (by 31% relative) in comparison to the reservoir fluid sample. This may suggest dissolution of silicate minerals at first and secondary precipitation in the second stage of the experiment. In addition, immobilization of heavy metals dispersed within silicate minerals was detected. Increases in Ca (3.2 up to 13% relative), SO4 (up to 14% relative) and total Fe (47% and 24% relative) were also detected after the first and second sampling, respectively, and may suggest dissolution of cements and iron-rich minerals. The concentration of organic acids increased by 12.5% and 25% relative after the first and second sampling, respectively, which might indicate metabolic activity of microorganisms or mobilisation due to CO2 exposure. The presence of newly formed mineral phases was detected using SEM-EDS. Precipitation of quartz, albite and illite is common to all studied samples. However, only illite is considered to be of possible bacterial origin, although its crystallization can also occur as a consequence of inorganic diagenetic processes. Further analyses of the microbial community composition, quantity and activity will provide more insight into the effects of CO2 exposure. Reference: Morozova, D., Kock, D., Krüger, M., and Würdemann, H. (2011). Biogeochemical and microbial characterization of reservoir fluids from a gas field (Altmark). Geotechnologien.
Service-Learning General Chemistry: Lead Paint Analyses
NASA Astrophysics Data System (ADS)
Kesner, Laya; Eyring, Edward M.
1999-07-01
Houses painted with lead-based paints are ubiquitous in the United States because the houses and the paint have not worn out two decades after federal regulations prohibited inclusion of lead in paint. Remodeling older homes thus poses a health threat for infants and small children living in those homes. In a service-learning general chemistry class, students disseminate information about this health threat in an older neighborhood. At some of the homes they collect paint samples that they analyze for lead both qualitatively and quantitatively. This service-learning experience generates enthusiasm for general chemistry through the process of working on a "real" problem. Sample collection familiarizes the students with the concept of "representative" sampling. The sample preparation for atomic absorption spectroscopic (AAS) analysis enhances their laboratory skills. The focus of this paper is on the mechanics of integrating this particular service project into the first-term of the normal general chemistry course.
Xiong, Zheng; He, Yinyan; Hattrick-Simpers, Jason R; Hu, Jianjun
2017-03-13
The creation of composition-processing-structure relationships currently represents a key bottleneck in data analysis for high-throughput experimental (HTE) materials studies. Here we propose an automated phase diagram attribution algorithm for HTE data analysis that uses a graph-based segmentation algorithm and Delaunay tessellation to create a crystal phase diagram from high-throughput libraries of X-ray diffraction (XRD) patterns. We also propose sample-pair-based objective evaluation measures for the phase diagram prediction problem. Our approach was validated using 278 diffraction patterns from a Fe-Ga-Pd composition spread sample, with a prediction precision of 0.934 and a Matthews correlation coefficient of 0.823. The algorithm was then applied to the open Ni-Mn-Al thin-film composition spread sample to obtain the first predicted phase diagram mapping for that sample.
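A sketch of what a sample-pair evaluation can look like is shown below: every pair of samples is scored on whether the predicted labelling agrees with the ground truth about that pair sharing a phase region, and pairwise precision and Matthews correlation are computed. The toy labels and this exact pairing rule are assumptions; the paper's scoring may differ in detail.

```python
# Pairwise evaluation of phase-region labels: 1 = the two samples share a
# region, 0 = they do not, compared between prediction and ground truth.
from itertools import combinations
from sklearn.metrics import precision_score, matthews_corrcoef

true_labels = [0, 0, 1, 1, 2, 2, 2]   # ground-truth phase regions (toy data)
pred_labels = [0, 0, 1, 2, 2, 2, 2]   # predicted phase regions (toy data)

pairs = list(combinations(range(len(true_labels)), 2))
y_true = [int(true_labels[i] == true_labels[j]) for i, j in pairs]
y_pred = [int(pred_labels[i] == pred_labels[j]) for i, j in pairs]

print("pairwise precision:", round(precision_score(y_true, y_pred), 3))
print("pairwise MCC:      ", round(matthews_corrcoef(y_true, y_pred), 3))
```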
Dried haematic microsamples and LC-MS/MS for the analysis of natural and synthetic cannabinoids.
Protti, Michele; Rudge, James; Sberna, Angelo Eliseo; Gerra, Gilberto; Mercolini, Laura
2017-02-15
Synthetic cannabinoids are new psychoactive substances (NPS) with effects similar to those of the natural cannabinoids found in Cannabis derivatives. They have rapidly entered the illicit market, often sold as alternatives to substances under international control. The need to identify and quantify an unprecedented and growing number of new compounds represents a unique challenge for toxicological, forensic and anti-doping analysis. Dried blood spots (DBS) have been used within the bioanalytical framework in place of plasma or serum in order to reduce invasiveness, lower sample volumes, simplify the handling, storage and shipping of samples, and facilitate home-based and on-field applications. However, DBS implementation has been limited mainly by concerns related to the effect of haematocrit on method accuracy. Volumetric absorptive microsampling (VAMS™), a second-generation dried miniaturized sampling technology, has been developed specifically to eliminate the haematocrit effect, thus providing accurate sampling while still allowing practical sample processing. An original LC-MS/MS method was herein developed and validated for the analysis of THC and its 2 main metabolites, together with 10 representative synthetic cannabinoids, in both DBS and VAMS dried microsamples. The ultimate goal of this work is to provide highly innovative DBS and VAMS analytical protocols, whose performances were extensively optimized and compared, in order to provide effective alternative tools that can be applied for natural and synthetic cannabinoid determination in place of classical analytical strategies. Copyright © 2016 Elsevier B.V. All rights reserved.
The future of Stardust science
NASA Astrophysics Data System (ADS)
Westphal, A. J.; Bridges, J. C.; Brownlee, D. E.; Butterworth, A. L.; de Gregorio, B. T.; Dominguez, G.; Flynn, G. J.; Gainsforth, Z.; Ishii, H. A.; Joswiak, D.; Nittler, L. R.; Ogliore, R. C.; Palma, R.; Pepin, R. O.; Stephan, T.; Zolensky, M. E.
2017-09-01
Recent observations indicate that >99% of the small bodies in the solar system reside in its outer reaches, in the Kuiper Belt and Oort Cloud. Kuiper Belt bodies are probably the best-preserved representatives of the icy planetesimals that dominated the bulk of the solid mass in the early solar system. They likely contain preserved materials inherited from the protosolar cloud, held in cryogenic storage since the formation of the solar system. Despite their importance, they are relatively underrepresented in our extraterrestrial sample collections by many orders of magnitude (~10^13 by mass) as compared with the asteroids, represented by meteorites, which are composed of materials that have generally been strongly altered by thermal and aqueous processes. We have only begun to scratch the surface in understanding Kuiper Belt objects, but it is already clear that the very limited samples of them that we have in our laboratories hold the promise of dramatically expanding our understanding of the formation of the solar system. Stardust returned the first samples from a known small solar system body, the Jupiter-family comet 81P/Wild 2, and, in a separate collector, the first solid samples from the local interstellar medium. The first decade of Stardust research resulted in more than 142 peer-reviewed publications, including 15 papers in Science. Analyses of these amazing samples continue to yield unexpected discoveries and to raise new questions about the history of the early solar system. We identify nine high-priority scientific objectives for future Stardust analyses that address important unsolved problems in planetary science.
Lunar Meteorites: A Global Geochemical Dataset
NASA Technical Reports Server (NTRS)
Zeigler, R. A.; Joy, K. H.; Arai, T.; Gross, J.; Korotev, R. L.; McCubbin, F. M.
2017-01-01
To date, the world's meteorite collections contain over 260 lunar meteorite stones representing at least 120 different lunar meteorites. Additionally, there are 20-30 as yet unnamed stones currently in the process of being classified. Collectively these lunar meteorites likely represent 40-50 distinct sampling locations from random locations on the Moon. Although the exact provenance of each individual lunar meteorite is unknown, collectively the lunar meteorites represent the best global average of the lunar crust. The Apollo sites are all within or near the Procellarum KREEP Terrane (PKT), thus lithologies from the PKT are overrepresented in the Apollo sample suite. Nearly all of the lithologies present in the Apollo sample suite are found within the lunar meteorites (high-Ti basalts are a notable exception), and the lunar meteorites contain several lithologies not present in the Apollo sample suite (e.g., magnesian anorthosite). This chapter will not be a sample-by-sample summary of each individual lunar meteorite. Rather, the chapter will summarize the different types of lunar meteorites and their relative abundances, comparing and contrasting the lunar meteorite sample suite with the Apollo sample suite. This chapter will act as one of the introductory chapters to the volume, introducing lunar samples in general and setting the stage for more detailed discussions in later more specialized chapters. The chapter will begin with a description of how lunar meteorites are ejected from the Moon, from what depth samples are being excavated, what the likely pairing relationships are among the lunar meteorite samples, and how the lunar meteorites can help to constrain the impactor flux in the inner solar system. There will be a discussion of the biases inherent to the lunar meteorite sample suite in terms of underrepresented lithologies or regions of the Moon, and an examination of the contamination and limitations of lunar meteorites due to terrestrial weathering. The bulk of the chapter will use examples from the lunar meteorite suite to examine important recent advances in lunar science, including (but not limited to) the following: (1) Understanding the global compositional diversity of the lunar surface; (2) Understanding the formation of the ancient lunar primary crust; (3) Understanding the diversity and timing of mantle melting, and secondary crust formation; (4) Comparing KREEPy lunar meteorites to KREEPy Apollo samples as evidence of variability within the PKT; and (5) A better understanding of the South Pole Aitken Basin through lunar meteorites whose provenance is within that terrane.
Sample Results From The Extraction, Scrub, And Strip Test For The Blended NGS Solvent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washington, A. L. II; Peters, T. B.
This report summarizes the results of the extraction, scrub, and strip testing for the September 2013 sampling of the Next Generation Solvent (NGS) blended solvent from the Modular Caustic-Side Solvent Extraction Unit (MCU) Solvent Hold Tank. MCU is in the process of transitioning from the BOBCalixC6 solvent to the NGS Blend solvent. As part of that transition, MCU has intentionally created a blended solvent to be processed using the Salt Batch program. This sample represents the first sample received from that blended solvent. Two ESS tests were performed in which the NGS blended solvent performance was assessed using either the Tank 21 material utilized in the Salt Batch 7 analyses or a simulant waste material used in the V-5/V-10 contactor testing. This report tabulates the temperature-corrected cesium distribution (DCs) values, step recovery percentages, and actual temperatures recorded during the experiments. This report also identifies the sample receipt date, preparation method, and analyses performed in the accumulation of the listed values. The calculated extraction DCs values using the Tank 21H material and the simulant are 59.4 and 53.8, respectively. The DCs values for the two scrub and three strip processes for the Tank 21 material are 4.58, 2.91, 0.00184, 0.0252, and 0.00575, respectively. The DCs values for the two scrub and three strip processes for the simulant are 3.47, 2.18, 0.00468, 0.00057, and 0.00572, respectively. These values are similar to previous measurements of Salt Batch 7 feed with lab-prepared blended solvent. These numbers are considered sufficiently comparable to allow simulant testing to be completed in place of actual waste, given the limited availability of feed material.
Phosphatized algal-bacterial assemblages in Late Cretaceous phosphorites of the Voronezh Anteclise
NASA Astrophysics Data System (ADS)
Maleonkina, Svetlana Y.
2003-01-01
Late Cretaceous phosphogenesis on the Voronezh Anteclise occurred during the Cenomanian and Early Campanian. SEM studies show the presence of phosphatized algal-bacterial assemblages in both Cenomanian and Campanian phosphorites. Some Cenomanian nodular phosphorite samples revealed empty tubes 1-5 microns in diameter, which are most likely trichomes of cyanobacterial filaments. Other samples contained accumulations of spheres 0.5-3 microns across, similar to coccoidal bacteria. Complicated tubular forms of variable diameter (2-5 microns) occur on the surfaces of some quartz grains in the nodules. They are probably pseudomorphs after algae. We found similar formations in the Campanian phosphate grains. Frequently, a grain represents a cyanobacterial mat, which is sometimes concentrically coated by phosphatic films. The films of some grains retain the primary structure; their concentric layers are formed by pseudomorphs after different bacterial types and obviously represent oncolites. In other cases, the primary structure is not observable because recrystallization has erased it. Occasionally, the central part retains the coccoidal structure and the recrystallization affects only the films. Moreover, the core of such an oncolite can be represented not only by a phosphatic grain but also by grains of other minerals, such as quartz, glauconite and heavy minerals, which serve as a substrate for cyanobacterial colonies. Bacteria could also settle on cavity surfaces and the interior frames of sponge fragments, teeth and bones.
Siefen, Georg; Kirkcaldy, Bruce; Adam, Hubertus; Schepker, Renate
2015-03-01
How does the German child and adolescent psychiatry system respond to the increasing number of migrant children and adolescents? Senior doctors from German child and adolescent psychiatric hospitals (Association of Medical Hospital Directors in Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy in Germany, BAG) completed a specially constructed questionnaire about the treatment needs of migrant children, while a «random, representative» sample of child and adolescent psychiatrists in private practice (German Professional Association for Child and Adolescent Psychiatry, Psychosomatic Medicine and Psychotherapy, BKJPP) was administered a slightly modified version. The 100 psychiatrists in private practice represented only about one-eighth of their group, whereas the 55 medical directors comprised a representative sample. One-third of the hospitals offer treatments tailored to the specific needs of migrants. In both settings, however, competent interpreters were rarely available, despite treatment problems arising from the parents' understanding of the illness, language barriers, and limited clinical knowledge of the patient. Cultural diversity is perceived as enriching. The migration background and the sex of child and adolescent psychiatrists influence the treatment of migrants. Facilitating the process of «cultural opening» in child and adolescent psychiatry involves enacting concrete steps, such as the funding of interpreter costs.
Likelihood inference of non-constant diversification rates with incomplete taxon sampling.
Höhna, Sebastian
2014-01-01
Large-scale phylogenies provide a valuable source for studying background diversification rates and investigating whether the rates have changed over time. Unfortunately, most large-scale, dated phylogenies are sparsely sampled (fewer than 5% of the described species) and taxon sampling is not uniform. Instead, taxa are frequently sampled to obtain at least one representative per subgroup (e.g. family) and thus to maximize diversity (diversified sampling). So far, such complications have been ignored, potentially biasing the conclusions that have been reached. In this study I derive the likelihood of a birth-death process with non-constant (time-dependent) diversification rates and diversified taxon sampling. Using simulations I test whether the true parameters and the sampling method can be recovered when the trees are small or medium sized (fewer than 200 taxa). The results show that the diversification rates can be inferred and the estimates are unbiased for large trees but are biased for small trees (fewer than 50 taxa). Furthermore, model selection by means of Akaike's Information Criterion favors the true model if the true rates differ sufficiently from those of alternative models (e.g. the birth-death model is recovered over a pure-birth model if the extinction rate is large). Finally, I applied six different diversification rate models, ranging from a constant-rate pure-birth process to a birth-death process with decreasing speciation rate, but excluding any rate-shift models, to three large-scale empirical phylogenies (ants, mammals and snakes, with 149, 164 and 41 sampled species respectively). All three phylogenies were constructed by diversified taxon sampling, as stated by the authors; however, only the snake phylogeny supported diversified taxon sampling. Moreover, a parametric bootstrap test revealed that none of the tested models provided a good fit to the observed data. The model assumptions, such as homogeneous rates across species or no rate shifts, appear to be violated.
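As a much simpler, hedged illustration of likelihood-based rate inference (without the time-dependent rates or diversified sampling derived in the paper), the sketch below fits a constant-rate pure-birth (Yule) model to inter-speciation waiting times and reports its AIC, the same criterion used above for model selection; the waiting times are toy values.

```python
# Constant-rate pure-birth (Yule) model: with k lineages, the waiting time to
# the next speciation event is exponential with rate k*lambda.
import math

def yule_loglik(waiting_times, lam, k0=2):
    """Log-likelihood of waiting times t_k ~ Exp(k*lam), k = k0, k0+1, ..."""
    ll = 0.0
    for i, t in enumerate(waiting_times):
        k = k0 + i                       # number of lineages before this event
        ll += math.log(k * lam) - k * lam * t
    return ll

def yule_mle(waiting_times, k0=2):
    """Closed-form MLE: lambda_hat = n_events / sum_k (k * t_k)."""
    return len(waiting_times) / sum((k0 + i) * t for i, t in enumerate(waiting_times))

waiting_times = [0.50, 0.30, 0.28, 0.15, 0.12, 0.10, 0.08]   # toy data (Myr)
lam_hat = yule_mle(waiting_times)
aic = 2 * 1 - 2 * yule_loglik(waiting_times, lam_hat)        # one free parameter
print(f"lambda_hat = {lam_hat:.3f} per lineage per Myr, AIC = {aic:.2f}")
```

Candidate models would each yield their own AIC, and the model with the lowest value would be preferred, as in the study.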
Multi-laboratory survey of qPCR enterococci analysis method performance
Quantitative polymerase chain reaction (qPCR) has become a frequently used technique for quantifying enterococci in recreational surface waters, but there are several methodological options. Here we evaluated how three method permutations, type of mastermix, sample extract dilution and use of controls in results calculation, affect method reliability among multiple laboratories with respect to sample interference. Multiple samples from each of 22 sites representing an array of habitat types were analyzed using EPA Method 1611 and 1609 reagents with full strength and five-fold diluted extracts. The presence of interference was assessed three ways: using sample processing and PCR amplifications controls; consistency of results across extract dilutions; and relative recovery of target genes from spiked enterococci in water sample compared to control matrices with acceptable recovery defined as 50 to 200%. Method 1609, which is based on an environmental mastermix, was found to be superior to Method 1611, which is based on a universal mastermix. Method 1611 had over a 40% control assay failure rate with undiluted extracts and a 6% failure rate with diluted extracts. Method 1609 failed in only 11% and 3% of undiluted and diluted extracts analyses. Use of sample processing control assay results in the delta-delta Ct method for calculating relative target gene recoveries increased the number of acceptable recovery results. Delta-delta tended to bias recoveries fr
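A hedged sketch of a delta-delta Ct style relative-recovery calculation is shown below; the Ct values are invented, and the exact control layout and acceptance rules (50 to 200% recovery) are defined by the EPA methods rather than by this sketch.

```python
# Relative target-gene recovery from a sample processing control (SPC) assayed
# alongside the target in both the water sample and a calibrator. Ct values are
# hypothetical; layout and acceptance limits follow the cited methods.
def ddct_recovery(ct_target_sample, ct_spc_sample, ct_target_cal, ct_spc_cal):
    d_sample = ct_target_sample - ct_spc_sample
    d_cal = ct_target_cal - ct_spc_cal
    ddct = d_sample - d_cal
    return 2.0 ** (-ddct)               # recovery ratio relative to the calibrator

recovery = ddct_recovery(ct_target_sample=31.2, ct_spc_sample=27.5,
                         ct_target_cal=30.0, ct_spc_cal=27.0)
print(f"Relative recovery: {recovery:.0%}  (acceptable window: 50 to 200%)")
```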
Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes.
Montenegro-Burke, J Rafael; Aisporna, Aries E; Benton, H Paul; Rinehart, Duane; Fang, Mingliang; Huan, Tao; Warth, Benedikt; Forsberg, Erica; Abe, Brian T; Ivanisevic, Julijana; Wolan, Dennis W; Teyton, Luc; Lairson, Luke; Siuzdak, Gary
2017-01-17
The speed and throughput of analytical platforms has been a driving force in recent years in the "omics" technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.
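A minimal, standard-library sketch of the "stream while acquiring" idea follows: poll the acquisition directory, compress each newly completed file and drop it into a processing-server folder, so transfer overlaps with acquisition. The directory names, the .mzML extension, the polling interval and the shared drop folder are assumptions for illustration, not part of XCMS Stream itself.

```python
# Poll an acquisition directory and forward newly finished files, compressed,
# to a server inbox so that data transfer runs concurrently with acquisition.
import gzip
import shutil
import time
from pathlib import Path

ACQ_DIR = Path("acquisition")        # hypothetical instrument output directory
SERVER_DIR = Path("server_inbox")    # hypothetical processing-server drop folder

def stream_new_files(seen):
    """Compress and forward any acquisition file not handled yet."""
    for f in sorted(ACQ_DIR.glob("*.mzML")):
        if f.name in seen:
            continue
        with open(f, "rb") as src, gzip.open(SERVER_DIR / (f.name + ".gz"), "wb") as dst:
            shutil.copyfileobj(src, dst)   # compress and transfer in one pass
        seen.add(f.name)

if __name__ == "__main__":
    ACQ_DIR.mkdir(exist_ok=True)
    SERVER_DIR.mkdir(exist_ok=True)
    handled = set()
    for _ in range(120):                   # poll for roughly an hour of acquisition time
        stream_new_files(handled)
        time.sleep(30)
```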
Miler, Miloš; Gosar, Mateja
2013-12-01
Solid particles in snow deposits, sampled in the mining and Pb-processing area of Žerjav, Slovenia, were investigated using scanning electron microscopy/energy-dispersive X-ray spectroscopy (SEM/EDS). The identified particles were classified as geogenic-anthropogenic, anthropogenic, and secondary weathering products. Geogenic-anthropogenic particles were represented by scarce Zn- and Pb-bearing ore minerals originating from the mine waste deposit. The most important anthropogenic metal-bearing particles in snow were Pb-, Sb- and Sn-bearing oxides and sulphides. The morphology of these particles showed that they formed at temperatures above their melting points. They were most abundant in snow sampled closest to the Pb-processing plant and least abundant in snow taken farthest from the plant, indicating that Pb processing was their predominant source between the last snowfall and the time of sampling. SEM/EDS analysis showed that the Sb and Sn contents in these anthropogenic phases were higher and more variable than in natural Pb-bearing ore minerals. The most important secondary weathering products were Pb- and Zn-containing Fe-oxy-hydroxides, whose elemental composition and morphology indicated that they mostly resulted from oxidation of metal-bearing sulphides emitted from the Pb-processing plant. This study demonstrates the importance of single-particle analysis using SEM/EDS for differentiating between various sources of metals in the environment.
Schwientek, Marc; Guillet, Gaëlle; Rügner, Hermann; Kuch, Bertram; Grathwohl, Peter
2016-01-01
Increasing numbers of organic micropollutants are emitted into rivers via municipal wastewaters. Due to their persistence many pollutants pass wastewater treatment plants without substantial removal. Transport and fate of pollutants in receiving waters and export to downstream ecosystems is not well understood. In particular, a better knowledge of processes governing their environmental behavior is needed. Although a lot of data are available concerning the ubiquitous presence of micropollutants in rivers, accurate data on transport and removal rates are lacking. In this paper, a mass balance approach is presented, which is based on the Lagrangian sampling scheme, but extended to account for precise transport velocities and mixing along river stretches. The calculated mass balances allow accurate quantification of pollutants' reactivity along river segments. This is demonstrated for representative members of important groups of micropollutants, e.g. pharmaceuticals, musk fragrances, flame retardants, and pesticides. A model-aided analysis of the measured data series gives insight into the temporal dynamics of removal processes. The occurrence of different removal mechanisms such as photooxidation, microbial degradation, and volatilization is discussed. The results demonstrate, that removal processes are highly variable in time and space and this has to be considered for future studies. The high precision sampling scheme presented could be a powerful tool for quantifying removal processes under different boundary conditions and in river segments with contrasting properties. Copyright © 2015. Published by Elsevier B.V.
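A sketch of the mass-balance logic is shown below: the pollutant load of the tracked water parcel measured downstream is compared with the upstream load plus known lateral inputs, and the loss is converted into a first-order removal rate using the travel time derived from transport velocity. All numbers are illustrative, not measurements from the study.

```python
# Lagrangian-style reach mass balance with a first-order removal rate.
import math

load_upstream = 12.0     # g of micropollutant in the tracked parcel, upstream station
load_lateral = 1.5       # g added by tributaries / WWTP effluent along the reach
load_downstream = 10.8   # g measured downstream in the same parcel
travel_time_h = 6.0      # h, from measured transport velocity and reach length

expected = load_upstream + load_lateral
removal_fraction = 1.0 - load_downstream / expected
k = -math.log(load_downstream / expected) / travel_time_h   # first-order rate, 1/h
print(f"Removed along the reach: {removal_fraction:.1%}, k = {k:.3f} h^-1")
```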
Latin American Study of Nutrition and Health (ELANS): rationale and study design.
Fisberg, M; Kovalskys, I; Gómez, G; Rigotti, A; Cortés, L Y; Herrera-Cuenca, M; Yépez, M C; Pareja, R G; Guajardo, V; Zimberg, I Z; Chiavegatto Filho, A D P; Pratt, M; Koletzko, B; Tucker, K L
2016-01-30
Obesity is growing at an alarming rate in Latin America. Lifestyle behaviours such as physical activity and dietary intake have been largely associated with obesity in many countries; however studies that combine nutrition and physical activity assessment in representative samples of Latin American countries are lacking. The aim of this study is to present the design rationale of the Latin American Study of Nutrition and Health/Estudio Latinoamericano de Nutrición y Salud (ELANS) with a particular focus on its quality control procedures and recruitment processes. The ELANS is a multicenter cross-sectional nutrition and health surveillance study of a nationally representative sample of urban populations from eight Latin American countries (Argentina, Brazil, Chile, Colombia, Costa Rica, Ecuador, Perú and Venezuela). A standard study protocol was designed to evaluate the nutritional intakes, physical activity levels, and anthropometric measurements of 9000 enrolled participants. The study was based on a complex, multistage sample design and the sample was stratified by gender, age (15 to 65 years old) and socioeconomic level. A small-scale pilot study was performed in each country to test the procedures and tools. This study will provide valuable information and a unique dataset regarding Latin America that will enable cross-country comparisons of nutritional statuses that focus on energy and macro- and micronutrient intakes, food patterns, and energy expenditure. Clinical Trials NCT02226627.
Scalable approximate policies for Markov decision process models of hospital elective admissions.
Zhu, George; Lizotte, Dan; Hoey, Jesse
2014-05-01
To demonstrate the feasibility of using stochastic simulation methods for the solution of a large-scale Markov decision process model of on-line patient admissions scheduling. The problem of admissions scheduling is modeled as a Markov decision process in which the states represent numbers of patients using each of a number of resources. We investigate current state-of-the-art real-time planning methods to compute solutions to this Markov decision process. Due to the complexity of the model, traditional model-based planners are limited in scalability since they require an explicit enumeration of the model dynamics. To overcome this challenge, we apply sample-based planners along with efficient simulation techniques that, given an initial start state, generate an action on demand while avoiding portions of the model that are irrelevant to the start state. We also propose a novel variant of a popular sample-based planner that is particularly well suited to the elective admissions problem. Results show that the stochastic simulation methods allow the problem size to be scaled by a factor of almost 10 in the action space, and exponentially in the state space. We have demonstrated our approach on a problem with 81 actions, four specialities and four treatment patterns, and shown that we can generate solutions that are near-optimal in about 100 s. Sample-based planners are a viable alternative to state-based planners for large Markov decision process models of elective admissions scheduling. Copyright © 2014 Elsevier B.V. All rights reserved.
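A rough sketch of the sample-based planning idea referred to above (not the authors' planner or their admissions model): action values at a start state are estimated from Monte Carlo rollouts through a generative simulator, so the transition model is never enumerated. The simulate and policy callables are assumed to be supplied by the user.

```python
# Minimal sketch of a one-step Monte Carlo rollout planner for a generative
# MDP model. Given a start state, each action's value is estimated by sampling
# trajectories from a simulator instead of enumerating transitions.

def rollout_value(simulate, state, action, depth, gamma, policy):
    """Discounted return of one sampled trajectory that starts with `action`."""
    total, discount = 0.0, 1.0
    s, a = state, action
    for _ in range(depth):
        s, r = simulate(s, a)      # generative model: (next_state, reward)
        total += discount * r
        discount *= gamma
        a = policy(s)              # follow a default rollout policy afterwards
    return total

def plan(simulate, state, actions, policy, n_samples=100, depth=20, gamma=0.95):
    # Average sampled returns per action at the current state and act greedily.
    values = {a: sum(rollout_value(simulate, state, a, depth, gamma, policy)
                     for _ in range(n_samples)) / n_samples
              for a in actions}
    return max(values, key=values.get)
```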
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francis, A.J.; Gillow, J.B.
1993-09-01
Microbial processes involved in gas generation from degradation of the organic constituents of transuranic waste under conditions expected at the Waste Isolation Pilot Plant (WIPP) repository are being investigated at Brookhaven National Laboratory. These laboratory studies are part of the Sandia National Laboratories -- WIPP Gas Generation Program. Gas generation due to microbial degradation of representative cellulosic waste was investigated in short-term (< 6 months) and long-term (> 6 months) experiments by incubating representative paper (filter paper, paper towels, and tissue) in WIPP brine under initially aerobic (air) and anaerobic (nitrogen) conditions. Samples from the WIPP surficial environment and underground workings harbor gas-producing halophilic microorganisms, the activities of which were studied in short-term experiments. The microorganisms metabolized a variety of organic compounds including cellulose under aerobic, anaerobic, and denitrifying conditions. In long-term experiments, the effects of added nutrients (trace amounts of ammonium nitrate, phosphate, and yeast extract), no nutrients, and nutrients plus excess nitrate on gas production from cellulose degradation are being investigated.
Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R
2014-01-01
Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
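One common way QC samples are used for batch correction is to rescale each feature within a batch by that batch's QC level; the sketch below shows a median-ratio variant under the assumption of a samples-by-features intensity matrix. It is illustrative only and is not necessarily the batch-correction algorithm evaluated with this dataset.

```python
# Minimal sketch of a QC-based batch correction: scale every feature within a
# batch by the ratio of the overall QC median to that batch's QC median.
import numpy as np

def qc_batch_correct(intensities, batch_ids, is_qc):
    """intensities: (n_samples, n_features); batch_ids: (n_samples,); is_qc: bool mask."""
    intensities = np.asarray(intensities, dtype=float)
    corrected = intensities.copy()
    overall_qc_median = np.median(intensities[is_qc], axis=0)
    for b in np.unique(batch_ids):
        in_batch = batch_ids == b
        batch_qc_median = np.median(intensities[in_batch & is_qc], axis=0)
        corrected[in_batch] *= overall_qc_median / batch_qc_median
    return corrected
```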
40 CFR 264.97 - General ground-water monitoring requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-water samples from the uppermost aquifer that: (1) Represent the quality of background ground water that... quality may include sampling of wells that are not hydraulically upgradient of the waste management area... quality that is representative or more representative than that provided by the upgradient wells; and (2...
White, James M.; Faber, Vance; Saltzman, Jeffrey S.
1992-01-01
An image population having a large number of attributes is processed to form a display population with a predetermined smaller number of attributes which represent the larger number of attributes. In a particular application, the color values in an image are compressed for storage in a discrete lookup table (LUT) where an 8-bit data signal is enabled to form a display of 24-bit color values. The LUT is formed in a sampling and averaging process from the image color values with no requirement to define discrete Voronoi regions for color compression. Image color values are assigned 8-bit pointers to their closest LUT value whereby data processing requires only the 8-bit pointer value to provide 24-bit color values from the LUT.
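A minimal sketch of the general colour-quantization scheme described above (not the patented sampling-and-averaging procedure itself): build a 256-entry LUT from the image, then store for each pixel the 8-bit index of its nearest LUT entry, from which 24-bit colour values are recovered by indexing.

```python
# Illustrative colour quantization with an 8-bit pointer image and a 256-entry LUT.
import numpy as np

def build_lut(image_rgb, n_colors=256, rng=np.random.default_rng(0)):
    # Placeholder sample-and-average scheme (the patent describes its own):
    # sample pixels and average small groups to form the LUT entries.
    pixels = image_rgb.reshape(-1, 3).astype(float)
    sample = pixels[rng.choice(len(pixels), size=n_colors * 16)]
    return sample.reshape(n_colors, 16, 3).mean(axis=1)

def quantize(image_rgb, lut):
    pixels = image_rgb.reshape(-1, 3).astype(float)
    # Index of the closest LUT colour for every pixel (Euclidean distance).
    d = ((pixels[:, None, :] - lut[None, :, :]) ** 2).sum(axis=2)
    indices = d.argmin(axis=1).astype(np.uint8)
    return indices.reshape(image_rgb.shape[:2])

# Reconstruction: lut[indices] returns 24-bit colour values for every pixel.
```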
Diffusion Decision Model: Current Issues and History
Ratcliff, Roger; Smith, Philip L.; Brown, Scott D.; McKoon, Gail
2016-01-01
There is growing interest in diffusion models to represent the cognitive and neural processes of speeded decision making. Sequential-sampling models like the diffusion model have a long history in psychology. They view decision making as a process of noisy accumulation of evidence from a stimulus. The standard model assumes that evidence accumulates at a constant rate during the second or two it takes to make a decision. This process can be linked to the behaviors of populations of neurons and to theories of optimality. Diffusion models have been used successfully in a range of cognitive tasks and as psychometric tools in clinical research to examine individual differences. In this article, we relate the models to both earlier and more recent research in psychology. PMID:26952739
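The constant-drift accumulation process described above can be illustrated with a few lines of simulation; the drift, boundary, and noise values below are arbitrary choices, not fitted parameters.

```python
# Minimal sketch of the standard drift-diffusion process: noisy evidence
# accumulates at a constant mean rate until it hits an upper or lower boundary,
# yielding a choice and a response time.
import numpy as np

def simulate_ddm(drift=0.3, boundary=1.0, start=0.0, noise=1.0,
                 dt=0.001, max_t=2.0, rng=np.random.default_rng(1)):
    x, t = start, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    choice = "upper" if x >= boundary else "lower" if x <= -boundary else "none"
    return choice, t

choices, rts = zip(*(simulate_ddm() for _ in range(1000)))
print("P(upper) =", choices.count("upper") / len(choices),
      "mean RT =", float(np.mean(rts)), "s")
```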
Symbol-string sensitivity and adult performance in lexical decision.
Pammer, Kristen; Lavis, Ruth; Cooper, Charity; Hansen, Peter C; Cornelissen, Piers L
2005-09-01
In this study of adult readers, we used a symbol-string task to assess participants' sensitivity to the position of briefly presented, non-alphabetic but letter-like symbols. We found that sensitivity in this task explained a significant proportion of sample variance in visual lexical decision. Based on a number of controls, we show that this relationship cannot be explained by other factors including: chronological age, intelligence, speed of processing and/or concentration, short term memory consolidation, or fixation stability. This approach represents a new way to elucidate how, and to what extent, individual variation in pre-orthographic visual and cognitive processes impinge on reading skills, and the results suggest that limitations set by visuo-spatial processes constrain visual word recognition.
Aghamohammadi, Amirhossein; Ang, Mei Choo; A Sundararajan, Elankovan; Weng, Ng Kok; Mogharrebi, Marzieh; Banihashem, Seyed Yashar
2018-01-01
Visual tracking in aerial videos is a challenging task in computer vision and remote sensing technologies due to appearance variation difficulties. Appearance variations are caused by camera and target motion, low-resolution noisy images, scale changes, and pose variations. Various approaches have been proposed to deal with appearance variation difficulties in aerial videos, and amongst these methods, the spatiotemporal saliency detection approach reported promising results in the context of moving target detection. However, it is not accurate for moving target detection when visual tracking is performed under appearance variations. In this study, a visual tracking method is proposed based on spatiotemporal saliency and discriminative online learning methods to deal with appearance variation difficulties. Temporal saliency is used to represent moving target regions, and it was extracted based on the frame difference with Sauvola local adaptive thresholding algorithms. The spatial saliency is used to represent the target appearance details in candidate moving regions. SLIC superpixel segmentation, color, and moment features can be used to compute feature uniqueness and spatial compactness of saliency measurements to detect spatial saliency. This is a time-consuming process, which prompted the development of a parallel algorithm to optimize and distribute the saliency detection processes that are loaded into the multi-processors. Spatiotemporal saliency is then obtained by combining the temporal and spatial saliencies to represent moving targets. Finally, a discriminative online learning algorithm was applied to generate a sample model based on spatiotemporal saliency. This sample model is then incrementally updated to detect the target in appearance variation conditions. Experiments conducted on the VIVID dataset demonstrated that the proposed visual tracking method is effective and is computationally efficient compared to state-of-the-art methods.
ERIC Educational Resources Information Center
Kogan, Steven M.; Wejnert, Cyprian; Chen, Yi-fu; Brody, Gene H.; Slater, LaTrina M.
2011-01-01
Obtaining representative samples from populations of emerging adults who do not attend college is challenging for researchers. This article introduces respondent-driven sampling (RDS), a method for obtaining representative samples of hard-to-reach but socially interconnected populations. RDS combines a prescribed method for chain referral with a…
NASA Astrophysics Data System (ADS)
Brodic, D.
2011-01-01
Text line segmentation is a key element of the optical character recognition process, so testing of text line segmentation algorithms has substantial relevance. All previously proposed testing methods deal mainly with a text database used as a template, which serves both for testing and for evaluation of the text segmentation algorithm. In this manuscript, a methodology for evaluating text segmentation algorithms based on extended binary classification is proposed. It is built on various multiline text samples linked with text segmentation, whose results are distributed according to the binary classification; the final result is obtained by comparative analysis of the cross-linked data. Its suitability for different types of scripts is its main advantage.
Holowenko, Fervone M; MacKinnon, Michael D; Fedorak, Phillip M
2002-06-01
The water produced during the extraction of bitumen from oil sands is toxic to aquatic organisms due largely to a group of naturally occurring organic acids, naphthenic acids (NAs), that are solubilized from the bitumen during processing. NAs are a complex mixture of alkyl-substituted acyclic and cycloaliphatic carboxylic acids, with the general chemical formula CnH(2n + Z)O2, where n is the carbon number and Z specifies a homologous family. Gas chromatography-electron impact mass spectrometry was used to characterize NAs in nine water samples derived from oil sands extraction processes. For each sample, the analysis provided the relative abundances for up to 156 base peaks, with each representing at least one NA structure. Plotting the relative abundances of NAs as three-dimensional bar graphs showed differences among samples. The relative abundance of NAs with carbon numbers < or = 21 to those in the "C22 + cluster" (sum of all NAs with carbon numbers > or = 22 in Z families 0 to -12) proved useful for comparing the water samples that had a range of toxicities. A decrease in toxicity of process-affected waters accompanied an increase in the proportion of NAs in the "C22 + cluster", likely caused by biodegradation of NAs with carbon numbers of < or = 21. In addition, an increase in the proportion of NAs in the "C22 + cluster" accompanied a decrease in the total NAs in the process-affected waters, again suggesting the selective removal of NAs with carbon numbers of < or = 21. This is the first investigation in which changes in the fingerprint of the NA fraction of process-affected waters from the oil sands operations has corresponded with measured toxicity in these waters.
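The fingerprint metric discussed above, the fraction of total NA abundance falling in the "C22 + cluster", can be computed directly from the tabulated relative abundances; the sketch below uses hypothetical abundances rather than the paper's data.

```python
# Minimal sketch (illustrative data only): proportion of total naphthenic acid
# abundance in the "C22+ cluster" (carbon number >= 22, Z families 0 to -12).
def c22_plus_fraction(abundances):
    """abundances: dict mapping (carbon_number, Z) -> relative abundance."""
    total = sum(abundances.values())
    cluster = sum(v for (n, z), v in abundances.items()
                  if n >= 22 and -12 <= z <= 0)
    return cluster / total if total else 0.0

# Hypothetical fingerprint with three homologues:
example = {(15, -2): 40.0, (18, -4): 35.0, (23, -6): 25.0}
print(f"C22+ fraction: {c22_plus_fraction(example):.2f}")  # 0.25
```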
Faulting processes in active faults - Evidences from TCDP and SAFOD drill core samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janssen, C.; Wirth, R.; Wenk, H. -R.
The microstructures, mineralogy and chemistry of representative samples collected from the cores of the San Andreas Fault drill hole (SAFOD) and the Taiwan Chelungpu-Fault Drilling project (TCDP) have been studied using optical microscopy, TEM, SEM, XRD and XRF analyses. SAFOD samples provide a transect across undeformed host rock, the fault damage zone and currently active deforming zones of the San Andreas Fault. TCDP samples are retrieved from the principal slip zone (PSZ) and from the surrounding damage zone of the Chelungpu Fault. Substantial differences exist in the clay mineralogy of SAFOD and TCDP fault gouge samples. Amorphous material has been observed in SAFOD as well as TCDP samples. In line with previous publications, we propose that melt, observed in TCDP black gouge samples, was produced by seismic slip (melt origin) whereas amorphous material in SAFOD samples was formed by comminution of grains (crush origin) rather than by melting. Dauphiné twins in quartz grains of SAFOD and TCDP samples may indicate high seismic stress. The differences in the crystallographic preferred orientation of calcite between SAFOD and TCDP samples are significant. Microstructures resulting from dissolution–precipitation processes were observed in both faults but are more frequently found in SAFOD samples than in TCDP fault rocks. As already described for many other fault zones, clay-gouge fabrics are quite weak in SAFOD and TCDP samples. Clay-clast aggregates (CCAs), proposed to indicate frictional heating and thermal pressurization, occur in material taken from the PSZ of the Chelungpu Fault, as well as within and outside of the SAFOD deforming zones, indicating that these microstructures were formed over a wide range of slip rates.
A High Resolution Microprobe Study of EETA79001 Lithology C
NASA Technical Reports Server (NTRS)
Schrader, Christian M.; Cohen, B. A.; Donovan, J. J.; Vicenzi, E. P.
2010-01-01
Antarctic meteorite EETA79001 has received substantial attention for possibly containing a component of Martian soil in its impact glass (Lithology C) [1]. The composition of Martian soil can illuminate near-surface processes such as impact gardening [2] and hydrothermal and volcanic activity [3,4]. Impact melts in meteorites represent our most direct samples of Martian regolith. We present the initial findings from a high-resolution electron microprobe study of Lithology C from Martian meteorite EETA79001. As this study develops we aim to extract details of a potential soil composition and to examine Martian surface processes using elemental ratios and correlations.
Fowler, J C; Hilsenroth, M J; Handler, L
2000-08-01
In this article, we describe Martin Mayman's approach to early childhood memories as a projective technique, beginning with his scientific interest in learning theory, coupled with his interest in ego psychology and object relations theory. We describe Mayman's contributions to the use of the early memories technique to inform the psychotherapy process, tying assessment closely to psychotherapy and making assessment more useful in treatment. In this article, we describe a representative sample of research studies that demonstrate the reliability and validity of early memories, followed by case examples in which the early memories informed the therapy process, including issues of transference and countertransference.
Investigating the role of personal and context-related factors in convenience foods consumption.
Contini, Caterina; Boncinelli, Fabio; Gerini, Francesca; Scozzafava, Gabriele; Casini, Leonardo
2018-07-01
In the scenario of food consumption, consumers show a growing consideration for the "convenience" attribute. Our study intends to understand consumer behaviour towards convenience-processed foods by analysing in a single model the role of beliefs, personal traits, social influence and market availability. We applied a Structural Equation Model (SEM) to a representative sample of 426 Italian consumers. The results show a correlation between intention to consume convenience-processed foods and social influence, market availability and several personal traits, suggesting strategies for the development of the convenience food market. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, Jiyoun; Akin, Heather; Brossard, Dominique; Xenos, Michael; Scheufele, Dietram A.
2017-05-01
This study examines how familiarity with an issue—nanotechnology—moderates the effect of exposure to science information on how people process mediated messages about a complex issue. In an online experiment, we provide a nationally representative sample with one of three definitions of nanotechnology (technical, technical applications, and technical risk/benefit definitions). We then ask them to read an article about the topic. We find significant interactions between perceived nano-familiarity and the definition received in terms of how respondents perceive favorable information conveyed in the stimulus. People less familiar with nanotechnology were more strongly affected by the type of definition they received.
Coherent radar imaging: Signal processing and statistical properties
NASA Astrophysics Data System (ADS)
Woodman, Ronald F.
1997-11-01
The recently developed technique for imaging radar scattering irregularities has opened a great scientific potential for ionospheric and atmospheric coherent radars. These images are obtained by processing the diffraction pattern of the backscattered electromagnetic field at a finite number of sampling points on the ground. In this paper, we review the mathematical relationship between the statistical covariance of these samples, (ff†), and that of the radiating object field to be imaged, (FF†), in a self-contained and comprehensive way. It is shown that these matrices are related in a linear way by (ff†) = aM(FF†)M†a*, where M is a discrete Fourier transform operator and a is a matrix operator representing the discrete and limited sampling of the field. The image, or brightness distribution, is the diagonal of (FF†). The equation can be linearly inverted only in special cases. In most cases, inversion algorithms which make use of a priori information or maximum entropy constraints must be used. A naive (biased) "image" can be estimated in a manner analogous to an optical camera by simply applying an inverse DFT operator to the sampled field f and evaluating the average power of the elements of the resulting vector. Such a transformation can be obtained either digitally or in an analog way. For the latter we can use a Butler matrix consisting of properly interconnected transmission lines. The case of radar targets in the near field is included as a new contribution. This case involves an additional matrix operator b, which is an analog of an optical lens used to compensate for the curvature of the phase fronts of the backscattered field. This "focusing" can be done after the statistics have been obtained. The formalism is derived for brightness distributions representing total powers. However, the derived expressions have been extended to include "color" images for each of the frequency components of the sampled time series. The frequency filtering is achieved by estimating spectra and cross spectra of the sample time series, in lieu of the power and cross correlations used in the derivation.
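The "naive" image estimate described above can be sketched in a few lines: apply an inverse DFT to each snapshot of the sampled field and average the power of the resulting vector elements. Symbols and array shapes are generic assumptions; the paper's full treatment inverts the covariance relation subject to constraints.

```python
# Minimal sketch of the naive (biased) brightness estimate: inverse DFT of the
# sampled field per snapshot, then average power per element.
import numpy as np

def naive_image(field_samples):
    """field_samples: complex array, shape (n_snapshots, n_antennas)."""
    beams = np.fft.ifft(field_samples, axis=1)   # inverse DFT per snapshot
    return (np.abs(beams) ** 2).mean(axis=0)     # average power -> brightness

rng = np.random.default_rng(0)
snapshots = rng.standard_normal((200, 16)) + 1j * rng.standard_normal((200, 16))
print(naive_image(snapshots).shape)  # one brightness value per direction bin
```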
Rarity and Incomplete Sampling in DNA-Based Species Delimitation.
Ahrens, Dirk; Fujisawa, Tomochika; Krammer, Hans-Joachim; Eberle, Jonas; Fabrizi, Silvia; Vogler, Alfried P
2016-05-01
DNA-based species delimitation may be compromised by limited sampling effort and species rarity, including "singleton" representatives of species, which hampers estimates of intra- versus interspecies evolutionary processes. In a case study of southern African chafers (beetles in the family Scarabaeidae), many species and subclades were poorly represented and 48.5% of species were singletons. Using cox1 sequences from >500 specimens and ∼100 species, the Generalized Mixed Yule Coalescent (GMYC) analysis as well as various other approaches for DNA-based species delimitation (Automatic Barcode Gap Discovery (ABGD), Poisson tree processes (PTP), Species Identifier, Statistical Parsimony), frequently produced poor results if analyzing a narrow target group only, but the performance improved when several subclades were combined. Hence, low sampling may be compensated for by "clade addition" of lineages outside of the focal group. Similar findings were obtained in reanalysis of published data sets of taxonomically poorly known species assemblages of insects from Madagascar. The low performance of undersampled trees is not due to high proportions of singletons per se, as shown in simulations (with 13%, 40% and 52% singletons). However, the GMYC method was highly sensitive to variable effective population size (Ne), which was exacerbated by variable species abundances in the simulations. Hence, low sampling success and rarity of species affect the power of the GMYC method only if they reflect great differences in Ne among species. Potential negative effects of skewed species abundances and prevalence of singletons are ultimately an issue about the variation in Ne and the degree to which this is correlated with the census population size and sampling success. Clade addition beyond a limited study group can overcome poor sampling for the GMYC method, in particular under variable Ne. This effect was less pronounced for methods of species delimitation not based on coalescent models. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Hilner, Joan E; Perdue, Letitia H; Sides, Elizabeth G; Pierce, June J; Wägner, Ana M; Aldrich, Alan; Loth, Amanda; Albret, Lotte; Wagenknecht, Lynne E; Nierras, Concepcion; Akolkar, Beena
2010-01-01
The Type 1 Diabetes Genetics Consortium (T1DGC) is an international project whose primary aims are to: (a) discover genes that modify type 1 diabetes risk; and (b) expand upon the existing genetic resources for type 1 diabetes research. The initial goal was to collect 2500 affected sibling pair (ASP) families worldwide. T1DGC was organized into four regional networks (Asia-Pacific, Europe, North America, and the United Kingdom) and a Coordinating Center. A Steering Committee, with representatives from each network, the Coordinating Center, and the funding organizations, was responsible for T1DGC operations. The Coordinating Center, with regional network representatives, developed study documents and data systems. Each network established laboratories for: DNA extraction and cell line production; human leukocyte antigen genotyping; and autoantibody measurement. Samples were tracked from the point of collection, processed at network laboratories and stored for deposit at National Institute for Diabetes and Digestive and Kidney Diseases (NIDDK) Central Repositories. Phenotypic data were collected and entered into the study database maintained by the Coordinating Center. T1DGC achieved its original ASP recruitment goal. In response to research design changes, the T1DGC infrastructure also recruited trios, cases, and controls. Results of genetic analyses have identified many novel regions that affect susceptibility to type 1 diabetes. T1DGC created a resource of data and samples that is accessible to the research community. Participation in T1DGC was declined by some countries due to study requirements for the processing of samples at network laboratories and/or final deposition of samples in NIDDK Central Repositories. Re-contact of participants was not included in informed consent templates, preventing collection of additional samples for functional studies. T1DGC implemented a distributed, regional network structure to reach ASP recruitment targets. The infrastructure proved robust and flexible enough to accommodate additional recruitment. T1DGC has established significant resources that provide a basis for future discovery in the study of type 1 diabetes genetics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haudebourg, Raphael; Fichet, Pascal; Goutelard, Florence
The detection (location and quantification) of possible contamination of nuclear facilities to be dismantled with low-range particle emitters (³H, other low-energy β emitters, α emitters) remains a tedious and expensive task. Indeed, usual remote counters show too low a sensitivity to these non-penetrating radiations, while conventional wipe tests are irrelevant for fixed radioactivity evaluation. The only method to accurately measure activity levels consists in sampling and running advanced laboratory analyses (spectroscopy, liquid scintillation counting, pyrolysis...). Such measurements generally induce sample preparation, waste production (destructive analyses, solvents), nuclear material transportation, long durations, and significant labor mobilization. Therefore, the search for a limitation of their number and cost easily conflicts with the necessity to perform a dense screening for sampling (to maximize the representativeness of the samples) in installations of thousands of square meters (floors, wells, ceilings), plus furniture, pipes, and other wastes. To overcome this contradiction, Digital Autoradiography (D.A.) was re-routed from biomolecular research to radiological mapping of nuclear installations under dismantling and to waste and sample analysis. After in-situ exposure to the possibly contaminated areas to investigate, commercial reusable radiosensitive phosphor screens (of a few 100 cm²) were scanned in the proper laboratory device and sharp quantitative images of the radioactivity could be obtained. The implementation of geostatistical tools in the data processing software enabled the exhaustive characterization of concrete floors at a rate of 2 weeks per 100 m², at the lowest cost. Various samples such as drilled cores, or tank and wood pieces, were also successfully evaluated with this method, with decisive results. Thanks to the accurate location of potential contamination spots, this approach ensures relevant and representative sampling for further laboratory analyses and should be inserted in the range of common tools used in dismantling. (authors)
Bushel, Pierre R; Wolfinger, Russell D; Gibson, Greg
2007-01-01
Background: Commonly employed clustering methods for analysis of gene expression data do not directly incorporate phenotypic data about the samples. Furthermore, clustering of samples with known phenotypes is typically performed in an informal fashion. The inability of clustering algorithms to incorporate biological data in the grouping process can limit proper interpretation of the data and its underlying biology. Results: We present a more formal approach, the modk-prototypes algorithm, for clustering biological samples based on simultaneously considering microarray gene expression data and classes of known phenotypic variables such as clinical chemistry evaluations and histopathologic observations. The strategy involves constructing an objective function with the sum of the squared Euclidean distances for numeric microarray and clinical chemistry data and simple matching for histopathology categorical values in order to measure dissimilarity of the samples. Separate weighting terms are used for microarray, clinical chemistry and histopathology measurements to control the influence of each data domain on the clustering of the samples. The dynamic validity index for numeric data was modified with a category utility measure for determining the number of clusters in the data sets. A cluster's prototype, formed from the mean of the values for numeric features and the mode of the categorical values of all the samples in the group, is representative of the phenotype of the cluster members. The approach is shown to work well with a simulated mixed data set and two real data examples containing numeric and categorical data types: one from a heart disease study and another from acetaminophen (an analgesic) exposure in rat liver, which causes centrilobular necrosis. Conclusion: The modk-prototypes algorithm partitioned the simulated data into clusters with samples in their respective class group and the heart disease samples into two groups (sick and buff denoting samples having pain type representative of angina and non-angina respectively) with an accuracy of 79%. This is on par with, or better than, the assignment accuracy of the heart disease samples by several well-known and successful clustering algorithms. Following modk-prototypes clustering of the acetaminophen-exposed samples, informative genes from the cluster prototypes were identified that are descriptive of, and phenotypically anchored to, levels of necrosis of the centrilobular region of the rat liver. The biological processes cell growth and/or maintenance, amine metabolism, and stress response were shown to discern between no and moderate levels of acetaminophen-induced centrilobular necrosis. The use of well-known and traditional measurements directly in the clustering provides some guarantee that the resulting clusters will be meaningfully interpretable. PMID:17408499
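The mixed dissimilarity at the heart of the modk-prototypes strategy, weighted squared Euclidean distance for numeric values plus weighted simple matching for categorical values, can be sketched as follows; the weights and sample values are illustrative, and this is not the published implementation.

```python
# Minimal sketch of a k-prototypes-style mixed dissimilarity: weighted squared
# Euclidean distance for numeric features plus weighted simple matching for
# categorical features.
import numpy as np

def mixed_dissimilarity(x_num, p_num, x_cat, p_cat, w_num=1.0, w_cat=1.0):
    numeric = np.sum((np.asarray(x_num) - np.asarray(p_num)) ** 2)
    mismatch = sum(a != b for a, b in zip(x_cat, p_cat))   # simple matching
    return w_num * numeric + w_cat * mismatch

# Sample vs. cluster prototype (hypothetical values):
d = mixed_dissimilarity([2.1, 0.4], [1.8, 0.9], ["necrosis:moderate"],
                        ["necrosis:none"])
print(d)
```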
Reckel, Frank; Melzer, Roland R
2004-04-01
In order to comparatively analyze curtain-like septa in the eyes of visually orientated "close-to-surface-predators" among atherinomorph teleosts, we examined the eyes of 24 atherinomorph species under a binocular microscope with regard to the falciform process and related structures in the vitreous cavity. Additionally, falciform process samples were analyzed by transmission electron microscopy. All the studied representatives of the Cyprinodontiformes and Atheriniformes, and of one of the beloniform suborder, Adrianichthyioidei, possess a "typical" processus falciformis. In the eyes of the representatives of the other beloniform suborder, Belonoidei, however, pigmented structures that originate in the region of the optic disc and protrude into the vitreous cavity were noted. In the Hemiramphidae (halfbeaks) and Exocoetidae (flying fishes) these pigmented structures have a more cone-like shape, whereas in the Belonidae (needlefishes) and Scomberesocidae (sauries) horizontally oriented heavily pigmented curtain-like septa occur that divide the vitreous cavity dorsoventrally. It is suggested that the "typical" processus falciformis represents a plesiomorphic feature within the Atherinomorpha, whereas the pigmented modifications of the falciform process must be seen as a synapomorphic character state of the Belonoidei. The curtain-like septum of the Belonidae and Scomberesocidae might have evolved from the cone-like structures that are found in the Exocoetoidea. The functional significance of the pigmented structures in the eye is as yet not clear, except for the curtain-like septum found in Belonidae. It might play a role in visual orientation near the water surface at Snell's window. Copyright 2004 Wiley-Liss, Inc.
Cipolat-Gotet, C; Cecchinato, A; De Marchi, M; Bittante, G
2013-01-01
Cheese yield (CY) is the most important technological trait of milk, because cheese-making uses a very high proportion of the milk produced worldwide. Few studies have been carried out at the level of individual milk-producing animals due to a scarcity of appropriate procedures for model-cheese production, the complexity of cheese-making, and the frequent use of the fat and protein (or casein) contents of milk as a proxy for cheese yield. Here, we report a high-throughput cheese manufacturing process that mimics all phases of cheese-making, uses 1.5-L samples of milk from individual animals, and allows the simultaneous processing of 15 samples per run. Milk samples were heated (35°C for 40 min), inoculated with starter culture (90 min), mixed with rennet (51.2 international milk-clotting units/L of milk), and recorded for gelation time. Curds were cut twice (10 and 15 min after gelation), separated from the whey, drained (for 30 min), pressed (3 times, 20 min each, with the wheel turned each time), salted in brine (for 60 min), weighed, and sampled. Whey was collected, weighed, and sampled. Milk, curd, and whey samples were analyzed for pH, total solids, fat content, and protein content, and energy content was estimated. Three measures of percentage cheese yield (%CY) were calculated: %CY(CURD), %CY(SOLIDS), and %CY(WATER), representing the ratios between the weight of fresh curd, the total solids of the curd, and the water content of the curd, respectively, and the weight of the milk processed. In addition, 3 measures of daily cheese yield (dCY, kg/d) were defined, considering the daily milk yield. Three measures of nutrient recovery (REC) were computed: REC(FAT), REC(PROTEIN), and REC(SOLIDS), which represented the ratio between the weights of the fat, protein, and total solids in the curd, respectively, and the corresponding components in the milk. Energy recovery, REC(ENERGY), represented the energy content of the cheese compared with that in the milk. This procedure was used to process individual milk samples obtained from 1,167 Brown Swiss cows reared in 85 herds of the province of Trento (Italy). The assessed traits exhibited almost normal distributions, with the exception of REC(FAT). The average values (± SD) were as follows: %CY(CURD)=14.97±1.86, %CY(SOLIDS)=7.18±0.92, %CY(WATER)=7.77±1.27, dCY(CURD)=3.63±1.17, dCY(SOLIDS)=1.74±0.57, dCY(WATER)=1.88±0.63, REC(FAT)=89.79±3.55, REC(PROTEIN)=78.08±2.43, REC(SOLIDS)=51.88±3.52, and REC(ENERGY)=67.19±3.29. All traits were highly influenced by herd-test-date and days in milk of the cow, moderately influenced by parity, and weakly influenced by the utilized vat. Both %CY(CURD) and dCY(CURD) depended not only on the fat and protein (casein) contents of the milk, but also on their proportions retained in the curd; the water trapped in curd presented a higher variability than that of %CY(SOLIDS). All REC traits were variable and affected by days in milk and parity of the cows. The described model cheese-making procedure and the results obtained provided new insight into the phenotypic variation of cheese yield and recovery traits at the individual level. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
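The yield and recovery definitions above are simple ratios of curd to milk quantities; the sketch below computes a subset of them from illustrative weights chosen only to fall near the reported averages (it omits protein and energy recovery for brevity).

```python
# Minimal sketch of the %CY and REC ratios; all inputs are illustrative weights,
# not data from the study.
def cheese_traits(milk_kg, curd_kg, milk_fat_kg, curd_fat_kg,
                  milk_solids_kg, curd_solids_kg):
    cy_curd = 100 * curd_kg / milk_kg                      # %CY(CURD)
    cy_solids = 100 * curd_solids_kg / milk_kg             # %CY(SOLIDS)
    cy_water = 100 * (curd_kg - curd_solids_kg) / milk_kg  # %CY(WATER)
    rec_fat = 100 * curd_fat_kg / milk_fat_kg              # REC(FAT)
    rec_solids = 100 * curd_solids_kg / milk_solids_kg     # REC(SOLIDS)
    return cy_curd, cy_solids, cy_water, rec_fat, rec_solids

print(cheese_traits(milk_kg=1.5, curd_kg=0.225, milk_fat_kg=0.060,
                    curd_fat_kg=0.054, milk_solids_kg=0.195,
                    curd_solids_kg=0.105))
```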
LMI design method for networked-based PID control
NASA Astrophysics Data System (ADS)
Souza, Fernando de Oliveira; Mozelli, Leonardo Amaral; de Oliveira, Maurício Carvalho; Palhares, Reinaldo Martinez
2016-10-01
In this paper, we propose a methodology for the design of networked PID controllers for second-order delayed processes using linear matrix inequalities. The proposed procedure takes into account time-varying delay on the plant, time-varying delays induced by the network, and packet dropouts. The design is carried out entirely using a continuous-time model of the closed-loop system, in which time-varying delays are used to represent the sampling and holding performed by a discrete-time digital PID controller.
Instrumental neutron activation analysis for studying size-fractionated aerosols
NASA Astrophysics Data System (ADS)
Salma, Imre; Zemplén-Papp, Éva
1999-10-01
Instrumental neutron activation analysis (INAA) was utilized for studying aerosol samples collected into a coarse and a fine size fraction on Nuclepore polycarbonate membrane filters. As a result of the panoramic INAA, 49 elements were determined in an amount of about 200-400 μg of particulate matter by two irradiations and four γ-spectrometric measurements. The analytical calculations were performed by the absolute (k0) standardization method. The calibration procedures, application protocol and the data evaluation process are described and discussed. They now make it possible to analyse a considerable number of samples while assuring the quality of the results. As a means of demonstrating the system's analytical capabilities, the concentration ranges, median or mean atmospheric concentrations and detection limits are presented for an extensive series of aerosol samples collected within the framework of an urban air pollution study in Budapest. For most elements, the precision of the analysis was found to be beyond the uncertainty represented by the sampling techniques and sample variability.
Use of porosity to estimate hydraulic properties of volcanic tuffs
Flint, L.E.; Selker, J.S.
2003-01-01
Correlations of hydraulic properties with easily measured physical properties are useful for purposes of site characterization in heterogeneous sites. Approximately 600 samples of volcanic rocks from Yucca Mountain, Nevada, representing lithologies with a large range of hydraulic properties, were analyzed to develop correlations of effective porosity with saturated hydraulic conductivity and moisture-retention curve-fit parameters that relate to lithologies of varying depositional history and alteration processes. Effective porosity, φe, defined as the porosity calculated using drying at a relative humidity of -70 MPa, is used in a generalized Kozeny-Carman equation to predict saturated hydraulic conductivity, Ks = b·φe^n, where b and n are constants. The entire dataset has an R2 of 0.36. When samples are grouped according to general lithology, correlations result in an R2 of 0.71 for the crystallized/vitric samples, 0.24 for samples with mineral alteration, and 0.34 for samples with microfractures, thus increasing the predictive capability over that of the total dataset. Published by Elsevier Science Ltd.
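The generalized Kozeny-Carman relation Ks = b·φe^n can be fitted by ordinary least squares on log-transformed values; the sketch below uses hypothetical porosity-conductivity pairs, not the Yucca Mountain data.

```python
# Minimal sketch: fit Ks = b * phi_e**n by least squares in log-log space.
import numpy as np

phi_e = np.array([0.05, 0.10, 0.20, 0.30, 0.40])        # effective porosity (hypothetical)
ks = np.array([1e-9, 2e-8, 5e-7, 4e-6, 2e-5])            # m/s, illustrative

n, log_b = np.polyfit(np.log(phi_e), np.log(ks), 1)      # log Ks = n*log(phi_e) + log b
b = np.exp(log_b)
pred = b * phi_e ** n
r2 = 1 - np.sum((np.log(ks) - np.log(pred)) ** 2) / np.sum(
    (np.log(ks) - np.log(ks).mean()) ** 2)
print(f"n = {n:.2f}, b = {b:.3g}, R^2 (log space) = {r2:.2f}")
```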
Standard deviation and standard error of the mean.
Lee, Dong Kyu; In, Junyong; Lee, Sangseok
2015-06-01
In most clinical and experimental studies, the standard deviation (SD) and the estimated standard error of the mean (SEM) are used to present the characteristics of sample data and to explain statistical analysis results. However, some authors occasionally muddle the distinctive usage between the SD and SEM in medical literature. Because the process of calculating the SD and SEM includes different statistical inferences, each of them has its own meaning. SD is the dispersion of data in a normal distribution. In other words, SD indicates how accurately the mean represents sample data. However, the meaning of SEM includes statistical inference based on the sampling distribution. SEM is the SD of the theoretical distribution of the sample means (the sampling distribution). While either SD or SEM can be applied to describe data and statistical results, one should be aware of reasonable methods with which to use SD and SEM. We aim to elucidate the distinctions between SD and SEM and to provide proper usage guidelines for both, which summarize data and describe statistical results.
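A minimal worked example of the two statistics being contrasted: the sample SD describes the spread of the data, while SEM = SD/√n describes the uncertainty of the sample mean; the data values are arbitrary.

```python
# Sample SD (with n-1 denominator) and SEM for a small illustrative dataset.
import math

data = [4.2, 5.1, 4.8, 5.6, 4.9, 5.3]
n = len(data)
mean = sum(data) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
sem = sd / math.sqrt(n)
print(f"mean = {mean:.2f}, SD = {sd:.2f}, SEM = {sem:.2f}")
```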
Maiorino, Leonardo; Farke, Andrew A; Kotsakis, Tassos; Teresi, Luciano; Piras, Paolo
2015-11-01
Ceratopsidae represents a group of quadrupedal herbivorous dinosaurs that inhabited western North America and eastern Asia during the Late Cretaceous. Although horns and frills of the cranium are highly variable across species, the lower jaw historically has been considered to be relatively conservative in morphology. Here, the lower jaws from 58 specimens representing 21 ceratopsoid taxa were sampled, using geometric morphometrics and 2D finite element analysis (FEA) to explore differences in morphology and mechanical performance across Ceratopsoidea (the clade including Ceratopsidae, Turanoceratops and Zuniceratops). Principal component analyses and non-parametric permuted manovas highlight Triceratopsini as a morphologically distinct clade within the sample. A relatively robust and elongate dentary, a larger and more elongated coronoid process, and a small and dorso-ventrally compressed angular characterize this clade, as well as the absolutely larger size. By contrast, non-triceratopsin chasmosaurines, Centrosaurini and Pachyrhinosaurini have similar morphologies to each other. Zuniceratops and Avaceratops are distinct from other taxa. No differences in size between Pachyrhinosaurini and Centrosaurini are recovered using non-parametric permuted anovas. Structural performance, as evaluated using a 2D FEA, is similar across all groups as measured by overall stress, with the exception of Triceratopsini. Shape, size and stress are phylogenetically constrained. A longer dentary as well as a long coronoid process result in a lower jaw that is reconstructed as relatively much more stressed in triceratopsins. © 2015 Anatomical Society.
Sub-seafloor Processes and the Composition of Diffuse Hydrothermal Fluids
NASA Astrophysics Data System (ADS)
Butterfield, D. A.; Lilley, M. D.; Huber, J. A.; Baross, J. A.
2002-12-01
High-temperature water/rock reactions create the primary hydrothermal fluids that are diluted with cool, "crustal seawater" to produce low-temperature, diffuse hydrothermal vent fluids. By knowing the composition of each of the components that combine to produce diffuse fluids, one can compare the composition of calculated mixtures with the composition of sampled fluids, and thereby infer what chemical constituents have been affected by processes other than simple conservative mixing. Although there is always uncertainty in the composition of fluids from the sub-seafloor, some processes are significant enough to alter diffuse fluid compositions from the expected conservative mixtures of hot,primary fluid and "crustal seawater." When hydrothermal vents with a wide range of temperature are sampled, processes occurring in different thermal and chemical environments potentially can be discerned. At Axial Volcano (AV) on the Juan de Fuca ridge, methane clearly is produced in warm sub-seafloor environments at temperatures of ~ 100° or less. Based on culturing and phylogenetic analysis from the same water samples at AV, hyperthermophilic methanogens are present in water samples taken from vents ranging in temperature from 15 to 78° C. Ratios of hydrogen sulfide to pseudo-conservative tracers (dissolved silica or heat) at AV decrease when primary fluids are highly diluted with oxygenated seawater. Phylogenetic signatures of microbes closely related to sulfide-oxidizers are present in these same fluids. Hydrogen sulfide oxidation represents the dominant source of energy for chemosynthesis at AV, as in most hydrothermal systems, but a relatively small proportion of the total hydrogen sulfide available is actually oxidized, except at the very lowest temperatures.
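The conservative-mixing check described above can be sketched as a two-endmember calculation: a conservative tracer fixes the mixing fraction of primary fluid, which predicts the concentration expected for any other species in the absence of reaction; all concentrations below are illustrative, not measured values.

```python
# Minimal sketch of a conservative two-endmember mixing calculation for a
# diffuse vent fluid; deviations from the prediction point to non-conservative
# processes such as sulfide oxidation or microbial consumption.
def mixing_fraction(tracer_mix, tracer_hot, tracer_sw):
    """Fraction of hot primary fluid in the mixture, from a conservative tracer."""
    return (tracer_mix - tracer_sw) / (tracer_hot - tracer_sw)

def conservative_prediction(f_hot, conc_hot, conc_sw):
    return f_hot * conc_hot + (1 - f_hot) * conc_sw

# Dissolved silica as the conservative tracer (mmol/kg), H2S as the reactive species:
f = mixing_fraction(tracer_mix=2.0, tracer_hot=16.0, tracer_sw=0.2)
h2s_expected = conservative_prediction(f, conc_hot=7.0, conc_sw=0.0)
h2s_measured = 0.5
print(f"f_hot = {f:.3f}, expected H2S = {h2s_expected:.2f}, "
      f"deficit = {h2s_expected - h2s_measured:.2f} mmol/kg")
```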
NASA Astrophysics Data System (ADS)
Willingham, David; Naes, Benjamin E.; Tarolli, Jay G.; Schemer-Kohrn, Alan; Rhodes, Mark; Dahl, Michael; Guzman, Anthony; Burkes, Douglas E.
2018-01-01
Uranium-molybdenum (U-Mo) monolithic fuels represent one option for converting civilian research and test reactors operating with high enriched uranium (HEU) to low enriched uranium (LEU), effectively reducing the threat of nuclear proliferation world-wide. However, processes associated with fabrication of U-Mo monolithic fuels result in regions of elemental heterogeneity, observed as bands traversing the cross-section of representative samples. Isotopic variations (e.g., 235U and 238U) could also be introduced because of associated processing steps, particularly since HEU feedstock is melted with natural or depleted uranium diluent to produce LEU. This study demonstrates the utility of correlative analysis of Energy-Dispersive X-ray Spectroscopy (EDS) and Secondary Ion Mass Spectrometry (SIMS) with their image data streams using image fusion, resulting in a comprehensive microanalytical characterization toolbox. Elemental and isotopic measurements were made on a sample from the Advanced Test Reactor (ATR) Full-sized plate In-center flux trap Position (AFIP)-7 experiment and compared to previous optical and electron microscopy results. The image fusion results are characteristic of SIMS isotopic maps, but with the spatial resolution of EDS images and, therefore, can be used to increase the effective spatial resolution of the SIMS imaging results to better understand homogeneity or heterogeneity that persists because of processing selections. Visual inspection using the image fusion methodology indicated slight variations in the 235U/238U ratio and quantitative analysis using the image intensities across several FoVs revealed an average 235U atom percent value of 17.9 ± 2.4%, which was indicative of a non-uniform U isotopic distribution in the area sampled. Further development of this capability is useful for understanding the connections between the properties of LEU fuel alternatives and the ability to predict performance under irradiation.
Representativeness-based sampling network design for the State of Alaska
Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove
2013-01-01
Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...
Precipitation of hydrides in high purity niobium after different treatments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barkov, F.; Romanenko, A.; Trenikhina, Y.
Precipitation of lossy non-superconducting niobium hydrides represents a known problem for high purity niobium in superconducting applications. Using cryogenic optical and laser confocal scanning microscopy we have directly observed surface precipitation and evolution of niobium hydrides in samples after different treatments used for superconducting RF cavities for particle acceleration. Precipitation is shown to occur throughout the sample volume, and the growth of hydrides is well described by the fast diffusion-controlled process in which almost all hydrogen is precipitated at T = 140 K within ~30 min. 120 °C baking and mechanical deformation are found to affect hydride precipitation through their influence on the number of nucleation and trapping centers.
Berant, Ety; Wald, Yaarit
2009-07-01
In this study, we addressed associations between self-reported attachment scales (anxiety and avoidance) and Rorschach (1921/1942) indexes indicating ego-boundary perception (barrier and penetration), use of projective identification, devaluation and splitting defenses, and Comprehensive System (Exner, 2005) scores that represent boundary blurring (incongruous and fabulized combinations). In this study, we extended the sample and findings described by Berant, Mikulincer, Shaver, and Segal (2005) using a nonclinical sample of 89 Israeli adults. We found significant associations between attachment orientations and Rorschach indexes dynamically related to anxiety. We also found a trend toward association between attachment orientation and avoidance. We discuss the theoretical and clinical implications of these findings.
FR Performance of New Fire-off on PET/CO blend fabrics
NASA Astrophysics Data System (ADS)
Atakan, R.; Çelebi, E.; Ozcan, G.; Soydan, N.; Sarac, A. S.
2017-10-01
This paper presents an investigation of the flame-retardancy performance and durability of polyester/cotton (P/C) fabrics treated with a novel halogen- and formaldehyde-free, P-N synergistic FR finishing agent called New Fire-off. 100% cotton, 100% polyester, and three different P/C blend fabrics were chosen for this study. Fabric samples were treated with New Fire-off through a pad-dry-cure process. The flammability and thermal properties of the samples treated with New Fire-off were tested according to the relevant ISO standards and procedures. The obtained results showed that this new finishing formulation is a good char-forming agent. However, further studies are required to achieve washing durability for the P/C blends.
Actinomycetal complex of light sierozem on the Kopet-Dag piedmont plain
NASA Astrophysics Data System (ADS)
Zenova, G. M.; Zvyagintsev, D. G.; Manucharova, N. A.; Stepanova, O. A.; Chernov, I. Yu.
2016-10-01
The population density of actinomycetes in the samples of light sierozem from the Kopet Dag piedmont plain (75 km from Ashkhabad, Turkmenistan) reaches hundreds of thousands of CFU/g soil. The actinomycetal complex is represented by two genera: Streptomyces and Micromonospora. Representatives of the Streptomyces genus predominate and comprise 73 to 87% of the actinomycetal complex. In one sample, representatives of the Micromonospora genus predominated in the complex (75%). The Streptomyces genus in the studied soil samples is represented by species from several sections and series: the species of section Helvolo-Flavus, series Helvolus, are the dominant component of the streptomycetal complex; their portion is up to 77% of all isolated actinomycetes. The species of other sections and series are much less abundant. Thus, the percentage of the Cinereus Achromogenes section in the actinomycetal complex does not exceed 28%; representatives of the Albus section Albus series, Roseus section Lavendulae-Roseus series, and Imperfectus section belong to rare species; they were not isolated from all the studied samples of light sierozem, and their portion does not exceed 10% of the actinomycetal complex.
Toward an Integration of Cognitive and Genetic Models of Risk for Depression
Gibb, Brandon E.; Beevers, Christopher G.; McGeary, John E.
2012-01-01
There is growing interest in integrating cognitive and genetic models of depression risk. We review two ways in which these models can be meaningfully integrated. First, information-processing biases may represent intermediate phenotypes for specific genetic influences. These genetic influences may represent main effects on specific cognitive processes or may moderate the impact of environmental influences on information-processing biases. Second, cognitive and genetic influences may combine to increase reactivity to environmental stressors, increasing risk for depression in a gene × cognition × environment model of risk. There is now growing support for both of these ways of integrating cognitive and genetic models of depression risk. Specifically, there is support for genetic influences on information-processing biases, particularly the link between 5-HTTLPR and attentional biases, from both genetic association and gene × environment (G × E) studies. There is also initial support for gene × cognition × environment models of risk in which specific genetic influences contribute to increased reactivity to environmental influences. We review this research and discuss important areas of future research, particularly the need for larger samples that allow for a broader examination of genetic and epigenetic influences as well as the combined influence of variability across a number of genes. PMID:22920216
Gamma and Beta Oscillations Define a Sequence of Neurocognitive Modes Present in Odor Processing
Frederick, Donald E.; Brown, Austin; Brim, Elizabeth; Mehta, Nisarg; Vujovic, Mark; Kay, Leslie M.
2016-01-01
Olfactory system beta (15–35 Hz) and gamma (40–110 Hz) oscillations of the local field potential in mammals have both been linked to odor learning and discrimination. Gamma oscillations represent the activity of a local network within the olfactory bulb, and beta oscillations represent engagement of a systemwide network. Here, we test whether beta and gamma oscillations represent different cognitive modes using the different demands of go/no-go and two-alternative choice tasks that previously were suggested to favor beta or gamma oscillations, respectively. We reconcile previous studies and show that both beta and gamma oscillations occur in both tasks, with gamma dominating the early odor sampling period (2–4 sniffs) and beta dominating later. The relative power and coherence of both oscillations depend separately on multiple factors within both tasks without categorical differences across tasks. While the early/gamma-associated period occurs in all trials, rats can perform above chance without the later/beta-associated period. Longer sampling, which includes beta oscillations, is associated with better performance. Gamma followed by beta oscillations therefore represents a sequence of cognitive and neural states during odor discrimination, which can be separately modified depending on the demands of a task and odor discrimination. Additionally, fast (85 Hz) and slow (70 Hz) olfactory bulb gamma oscillation sub-bands have been hypothesized to represent tufted and mitral cell networks, respectively (Manabe and Mori, 2013). We find that fast gamma favors the early and slow gamma the later (beta-dominated) odor-sampling period and that the relative contributions of these oscillations are consistent across tasks. SIGNIFICANCE STATEMENT Olfactory system gamma (40–110 Hz) and beta (15–35 Hz) oscillations of the local field potential indicate different neural firing statistics and functional circuits. We show that gamma and beta oscillations occur in stereotyped sequence during odor sampling in associative tasks, with local gamma dominating the first 250 ms of odor sniffing, followed by systemwide beta as behavioral responses are prepared. Oscillations and coupling strength between brain regions are modulated by task, odor, and learning, showing that task features can dramatically adjust the dynamics of a cortical sensory system, which changes state every ∼250 ms. Understanding cortical circuits, even at the biophysical level, depends on careful use of multiple behavioral contexts and stimuli. PMID:27445151
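As an illustration of the band definitions used above, relative power in the beta (15-35 Hz) and gamma (40-110 Hz) bands can be estimated from an FFT power spectrum; the signal below is synthetic and the sampling rate is an assumption, not the study's recording parameters.

```python
# Minimal sketch: fraction of LFP power in the beta and gamma bands, estimated
# from the FFT power spectrum of a synthetic signal.
import numpy as np

def band_power(signal, fs, lo, hi):
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return power[band].sum() / power.sum()      # fraction of total power

fs = 1000.0                                      # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
lfp = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 70 * t) \
      + 0.3 * rng.standard_normal(t.size)
print("beta:", band_power(lfp, fs, 15, 35), "gamma:", band_power(lfp, fs, 40, 110))
```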
On the bandwidth of the plenoptic function.
Do, Minh N; Marchand-Maillet, Davy; Vetterli, Martin
2012-02-01
The plenoptic function (POF) provides a powerful conceptual tool for describing a number of problems in image/video processing, vision, and graphics. For example, image-based rendering is shown as sampling and interpolation of the POF. In such applications, it is important to characterize the bandwidth of the POF. We study a simple but representative model of the scene where band-limited signals (e.g., texture images) are "painted" on smooth surfaces (e.g., of objects or walls). We show that, in general, the POF is not band limited unless the surfaces are flat. We then derive simple rules to estimate the essential bandwidth of the POF for this model. Our analysis reveals that, in addition to the maximum and minimum depths and the maximum frequency of painted signals, the bandwidth of the POF also depends on the maximum surface slope. With a unifying formalism based on multidimensional signal processing, we can verify several key results in POF processing, such as induced filtering in space and depth-corrected interpolation, and quantify the necessary sampling rates. © 2011 IEEE
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed with deliberate, conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document describes the process that will be used to verify analytical data generated throughout the test period; identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed; and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Colzi, Ilaria; Taiti, Cosimo; Marone, Elettra; Magnelli, Susanna; Gonnelli, Cristina; Mancuso, Stefano
2017-12-15
This work was performed to evaluate the possible application of the PTR-ToF-MS technique in distinguishing between Coffea arabica (Arabica) and Coffea canephora var. robusta (Robusta) commercial stocks in each step of the processing chain (green beans, roasted beans, ground coffee, brews). Volatile organic compound (VOC) spectra from coffee samples of 7 Arabica and 6 Robusta commercial stocks were recorded and submitted to multivariate statistical analysis. Results clearly showed that, in each stage of the coffee processing, the volatile composition of coffee is highly influenced by the species. Indeed, with the exception of green beans, the PTR-ToF-MS technique was able to correctly recognize Arabica and Robusta samples. In particular, among 134 tentatively identified VOCs, some masses (16 for roasted coffee, 12 for ground coffee and 12 for brewed coffee) were found to significantly discriminate the two species. Therefore, headspace VOC analysis was shown to be a valuable tool to distinguish between Arabica and Robusta. Copyright © 2017 Elsevier Ltd. All rights reserved.
Schmitt, Christopher J.; Finger, Susan E.
1987-01-01
The influence of sample preparation on measured concentrations of eight elements in the edible tissues of two black basses (Centrarchidae), two catfishes (Ictaluridae), and the black redhorse, Moxostoma duquesnei (Catostomidae), from two rivers in southeastern Missouri contaminated by mining and related activities was investigated. Concentrations of Pb, Cd, Cu, Zn, Fe, Mn, Ba, and Ca were measured in two skinless, boneless samples of axial muscle from individual fish prepared in a clean room. One sample (normally-processed) was removed from each fish with a knife in a manner typically used by investigators to process fish for elemental analysis and presumably representative of methods employed by anglers when preparing fish for home consumption. A second sample (clean-processed) was then prepared from each normally-processed sample by cutting away all surface material with acid-cleaned instruments under ultraclean conditions. The samples were analyzed as a single group by atomic absorption spectrophotometry. Of the elements studied, only Pb regularly exceeded current guidelines for elemental contaminants in foods. Concentrations were high in black redhorse from contaminated sites, regardless of preparation method; for the other fishes, whether or not Pb guidelines were exceeded depended on preparation technique. Except for Mn and Ca, concentrations of all elements measured were significantly lower in clean- than in normally-processed tissue samples. Absolute differences in measured concentrations between clean- and normally-processed samples were most evident for Pb and Ba in bass and catfish and for Cd and Zn in redhorse. Regardless of preparation method, concentrations of Pb, Ca, Mn, and Ba in individual fish were closely correlated; samples that were high or low in one of these four elements were correspondingly high or low in the other three. In contrast, correlations between Zn, Fe, and Cd occurred only in normally-processed samples, suggesting that these correlations resulted from high concentrations on the surfaces of some samples. Concentrations of Pb and Ba in edible tissues of fish from contaminated sites were highly correlated with Ca content, which was probably determined largely by the amount of tissue other than muscle in the sample because fish muscle contains relatively little Ca. Accordingly, variation within a group of similar samples can be reduced by normalizing Pb and Ba concentrations to a standard Ca concentration. When sample size (N) is large, this can be accomplished statistically by analysis of covariance; when N is small, molar ratios of [Pb]/[Ca] and [Ba]/[Ca] can be computed. Without such adjustments, unrealistically large Ns are required to yield statistically reliable estimates of Pb concentrations in edible tissues. Investigators should acknowledge that reported concentrations of certain elements are only estimates, and that regardless of the care exercised during the collection, preparation, and analysis of samples, results should be interpreted with the awareness that contamination from external sources may have occurred.
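The Ca-normalization suggested above can be illustrated with a minimal sketch (the concentration values and µg/g units below are hypothetical, chosen only to show the molar-ratio arithmetic; the atomic weights are standard values):

    # Normalize Pb and Ba concentrations to Ca by computing molar ratios,
    # as proposed for small sample sizes (N). All concentrations are hypothetical.
    ATOMIC_WEIGHT = {"Pb": 207.2, "Ba": 137.33, "Ca": 40.08}  # g/mol

    def molar_ratio(conc_metal_ug_g, conc_ca_ug_g, metal):
        """Return the [metal]/[Ca] molar ratio from mass concentrations (ug/g)."""
        mol_metal = conc_metal_ug_g / ATOMIC_WEIGHT[metal]
        mol_ca = conc_ca_ug_g / ATOMIC_WEIGHT["Ca"]
        return mol_metal / mol_ca

    # Example: a muscle sample with 0.8 ug/g Pb, 1.5 ug/g Ba, and 250 ug/g Ca
    print(molar_ratio(0.8, 250.0, "Pb"))   # [Pb]/[Ca]
    print(molar_ratio(1.5, 250.0, "Ba"))   # [Ba]/[Ca]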
Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel
2014-07-07
The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various game scores, reported on WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of samples in online surveys is warranted.
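The kind of comparison described above (self-selected respondents versus randomly drawn avatars on a game score) can be sketched as follows; the score distributions and group sizes are invented placeholders, and the Mann-Whitney U test is used here as one reasonable nonparametric choice, not necessarily the analysis the authors performed:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical achievement scores: self-selected respondents vs. random avatars
    self_selected = rng.normal(loc=5200, scale=600, size=762)
    random_sample = rng.normal(loc=4800, scale=600, size=478)

    # Nonparametric comparison of the two groups
    u, p = stats.mannwhitneyu(self_selected, random_sample, alternative="greater")
    print(f"U = {u:.0f}, one-sided p = {p:.3g}")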
NASA Astrophysics Data System (ADS)
Bristow, Tony W. T.; Ray, Andrew D.; O'Kearney-McMullan, Anne; Lim, Louise; McCullough, Bryan; Zammataro, Alessio
2014-10-01
For on-line monitoring of chemical reactions (batch or continuous flow), mass spectrometry (MS) can provide data to (1) determine the fate of starting materials and reagents, (2) confirm the presence of the desired product, (3) identify intermediates and impurities, (4) determine steady state conditions and point of completion, and (5) speed up process optimization. Recent developments in small footprint atmospheric pressure ionization portable mass spectrometers further enable this coupling, as the mass spectrometer can be easily positioned with the reaction system to be studied. A major issue for this combination is the transfer of a sample that is representative of the reaction and also compatible with the mass spectrometer. This is particularly challenging as high concentrations of reagents and products can be encountered in organic synthesis. The application of a portable mass spectrometer for on-line characterization of flow chemical synthesis has been evaluated by coupling a Microsaic 4000 MiD to the Future Chemistry Flow Start EVO chemistry system. Specifically, the Hofmann rearrangement has been studied using the on-line mass spectrometry approach. Sample transfer from the flow reactor is achieved using a mass rate attenuator (MRA) and a sampling make-up flow from a high pressure pump. This enables the appropriate sample dilution, transfer, and preparation for electrospray ionization. The capability of this approach to provide process understanding is described using an industrial pharmaceutical process that is currently under development. The effect of a number of key experimental parameters, such as the composition of the sampling make-up flow and the dilution factor on the mass spectrometry data, is also discussed.
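The effect of the make-up flow on sample concentration can be reasoned about with simple flow arithmetic (an illustrative sketch only; the flow rates and reactor concentration below are hypothetical, and the linear dilution relation is an assumption rather than a description of the cited instrument setup):

    # Approximate dilution when a small split flow from the reactor is merged with
    # a make-up flow before electrospray ionization (all values hypothetical).
    def dilution_factor(split_flow_ul_min, makeup_flow_ul_min):
        return (split_flow_ul_min + makeup_flow_ul_min) / split_flow_ul_min

    reaction_conc_mol_l = 0.5                                   # concentration in the flow reactor
    d = dilution_factor(split_flow_ul_min=1.0, makeup_flow_ul_min=999.0)
    print(d)                                                    # 1000-fold dilution
    print(reaction_conc_mol_l / d)                              # concentration reaching the ion source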
NASA Astrophysics Data System (ADS)
Dietze, Michael; Fuchs, Margret; Kreutzer, Sebastian
2016-04-01
Many modern approaches to radiometric dating or geochemical fingerprinting rely on sampling sedimentary deposits. A key assumption of most concepts is that the extracted grain-size fraction of the sampled sediment adequately represents the actual process to be dated or the source area to be fingerprinted. However, these assumptions are not always well constrained. Rather, they have to align with arbitrary, method-determined size intervals, such as "coarse grain" or "fine grain", which are in part defined differently from method to method. Such arbitrary intervals violate fundamental process-based concepts of sediment transport and can thus introduce significant bias to the analysis outcome (i.e., a deviation of the measured from the true value). We present a flexible numerical framework (numOlum) for the statistical programming language R that allows quantifying the bias due to any given analysis size interval for different types of sediment deposits. This framework is applied to synthetic samples from the realms of luminescence dating and geochemical fingerprinting, i.e., a virtual reworked loess section. We show independent validation data from artificially dosed and subsequently mixed grain-size proportions, and we present a statistical approach (end-member modelling analysis, EMMA) that allows accounting for the effect of measuring the compound dosimetric history or geochemical composition of a sample. EMMA separates polymodal grain-size distributions into the underlying transport-process-related distributions and their contribution to each sample. These underlying distributions can then be used to adjust grain-size preparation intervals to minimise the incorporation of "undesired" grain-size fractions.
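The unmixing idea behind EMMA can be sketched with a stand-in technique (the numOlum/EMMA tools mentioned above are R packages and are not reproduced here; this sketch uses non-negative matrix factorization as a simple surrogate, and the grain-size grid, end-member modes, and mixing weights are all invented for illustration):

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(1)
    phi = np.linspace(0, 10, 100)                       # grain-size classes (arbitrary units)

    def gauss(mu, sd):
        return np.exp(-0.5 * ((phi - mu) / sd) ** 2)

    # Two synthetic transport-related end members (e.g., a coarse and a fine mode)
    em_true = np.vstack([gauss(3.0, 0.6), gauss(6.5, 0.9)])
    em_true /= em_true.sum(axis=1, keepdims=True)

    # Synthetic samples = random mixtures of the end members (weights sum to 1)
    weights = rng.dirichlet([2.0, 2.0], size=30)
    samples = weights @ em_true

    # Unmix the polymodal distributions with NMF as a surrogate for EMMA
    model = NMF(n_components=2, init="nndsvda", max_iter=2000, random_state=0)
    scores = model.fit_transform(samples)               # per-sample contributions
    end_members = model.components_                     # recovered distributions
    print(scores.shape, end_members.shape)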
Petroleum storage tank cleaning using commercial microbial culture products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, D.R.; Entzeroth, L.C.; Timmis, A.
1995-12-31
The removal of paraffinic bottom accumulations from refinery storage tanks represents an increasingly costly area of petroleum storage management. Microorganisms can be used to reduce paraffinic bottoms by increasing the solubility of bottom material and by increasing the wax-carrying capacity of carrier oil used in the cleaning process. The economic savings of such treatments are considerable. The process is also intrinsically safer than alternative methods, as it reduces and even eliminates the need for personnel to enter the tank during the cleaning process. Both laboratory and field sample analyses can be used to document changes in tank material during the treatment process. These changes include increases in volatile content and changes in wax distribution. Several case histories illustrating these physical and chemical changes are presented along with the economics of treatment.
Lee, Sun Hee; Kim, Jae Hee; Chung, Chung-Wook; Kim, Do Young; Rhee, Young Ha
2018-04-01
Analysis of mixed microbial populations responsible for the production of medium-chain-length polyhydroxyalkanoates (MCL-PHAs) under periodic substrate feeding in a sequencing batch reactor (SBR) was conducted. Regardless of activated sludge samples and the different MCL alkanoic acids used as the sole external carbon substrate, denaturing gradient gel electrophoresis analysis indicated that Pseudomonas aeruginosa was the dominant bacterium enriched during the SBR process. Several P. aeruginosa strains were isolated from the enriched activated sludge samples. The isolates were subdivided into two groups, one that produced only MCL-PHAs and another that produced both MCL- and short-chain-length PHAs. The SBR periodic feeding experiments with five representative MCL-PHA-producing Pseudomonas species revealed that P. aeruginosa has an advantage over other species that enables it to become dominant in the bacterial community.
Applications of DART-MS for food quality and safety assurance in food supply chain.
Guo, Tianyang; Yong, Wei; Jin, Yong; Zhang, Liya; Liu, Jiahui; Wang, Sai; Chen, Qilong; Dong, Yiyang; Su, Haijia; Tan, Tianwei
2017-03-01
Direct analysis in real time (DART) represents a new generation of ion source which is used for rapid ionization of small molecules under ambient conditions. The combination of DART and various mass spectrometers allows analyzing multiple food samples with simple or no sample treatment, or in conjunction with prevailing protocolized sample preparation methods. Abundant applications of DART-MS are reviewed in this paper. The DART-MS strategy applied to the food supply chain (FSC), including production, processing, and storage and transportation, provides a comprehensive solution to various food components, contaminants, authenticity, and traceability. Additionally, typical applications available in food analysis by other ambient ionization mass spectrometers are summarized, and fundamentals mainly including mechanisms, devices, and parameters are discussed as well. © 2015 Wiley Periodicals, Inc. Mass Spec Rev. 36:161-187, 2017.
The developmental basis for germline mosaicism in mouse and Drosophila melanogaster.
Drost, J B; Lee, W R
1998-01-01
Data involving germline mosaics in Drosophila melanogaster and mouse are reconciled with developmental observations. Mutations that become fixed in the early embryo before separation of soma from the germline may, by the sampling process of development, continue as part of germline and/or differentiate into any somatic tissue. The cuticle of adult D. melanogaster, because of segmental development, can be used to estimate the proportion of mutant nuclei in the early embryo, but most somatic tissues and the germlines of both species continue from samples too small to be representative of the early embryo. Because of the small sample of cells/nuclei that remain in the germline after separation of soma in both species, mosaic germlines have percentages of mutant cells that vary widely, with a mean of 50% and an unusual platykurtic, flat-topped distribution. While the sampling process leads to similar statistical results for both species, their patterns of development are very different. In D. melanogaster the first differentiation is the separation of soma from germline with the germline continuing from a sample of only two to four nuclei, whereas the adult cuticle is a representative sample of cleavage nuclei. The presence of mosaicism in D. melanogaster germline is independent of mosaicism in the eye, head, and thorax. This independence was used to determine that mutations can occur at any of the early embryonic cell divisions and still average 50% mutant germ cells when the germline is mosaic; however, the later the mutation occurs, the higher the proportion of completely nonmutant germlines. In contrast to D. melanogaster, the first differentiation in the mouse does not separate soma from germline but produces the inner cell mass that is representative of the cleavage nuclei. Following formation of the primitive streak, the primordial germ cells develop at the base of the allantois and among a clonally related sample of cells, providing the same statistical distribution in the mouse germlines as in D. melanogaster. The proportion of mutations that are fixed during early embryonic development is greatly underestimated. For example, a DNA lesion in a postmeiotic gamete that becomes fixed as a dominant mutation during early embryonic development of the F1 may produce an individual completely mutant in the germ line and relevant somatic tissue or, alternatively, the F1 germline may be completely mutant but with no relevant somatic tissues for detecting the mutation until the F2. In both cases the mutation would be classified as complete in the F1 and F2, respectively, and not recognized as embryonic in origin. Because germ cells differentiate later in mammalian development, there are more opportunities for correlation between germline and soma in the mammal than Drosophila. However, because the germ cells and any somatic tissue, like blood, are derived from small samples, there may be many individuals that test negative in blood but have germlines that are either mosaic or entirely mutant.
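The sampling argument above can be illustrated for its simplest case with a short simulation (a sketch only: the founder count, the independent-draw assumption, and the restriction to a mutation fixed at the first cleavage, which leaves half of the embryonic nuclei mutant, are simplifications chosen for illustration, not the developmental model of either species):

    import numpy as np

    rng = np.random.default_rng(0)

    def mosaic_germline_fractions(n_trials=200_000, founders=4, embryo_fraction=0.5):
        """Mutant-cell fraction in germlines founded by a few nuclei, keeping only mosaics.

        Simplest case: the mutation was fixed at the first cleavage, so half of the
        embryonic nuclei are mutant; the germline then continues from `founders`
        nuclei drawn independently (an idealization of the sampling process).
        """
        mutants = rng.binomial(founders, embryo_fraction, size=n_trials)
        mosaic = (mutants > 0) & (mutants < founders)          # exclude all-mutant / all-normal
        return mutants[mosaic] / founders

    frac = mosaic_germline_fractions()
    print(frac.mean())                                          # close to 0.5
    values, counts = np.unique(frac, return_counts=True)
    print(dict(zip(values.round(2), (counts / counts.sum()).round(3))))  # broad, flat-topped spread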
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thien, Mike G.; Barnes, Steve M.
2013-07-01
The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives, and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and define specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics important to mixing, sampling, and transfer performance is described. (authors)
Anco, Corey; Kolokotronis, Sergios-Orestis; Henschel, Philipp; Cunningham, Seth W; Amato, George; Hekkala, Evon
2018-04-01
Once found throughout Africa and Eurasia, the leopard (Panthera pardus) was recently uplisted from Near Threatened to Vulnerable by the International Union for the Conservation of Nature (IUCN). Historically, more than 50% of the leopard's global range occurred in continental Africa, yet sampling from this part of the species' distribution is only sparsely represented in prior studies examining patterns of genetic variation at the continental or global level. Broad sampling to determine baseline patterns of genetic variation throughout the leopard's historical distribution is important, as these measures are currently used by the IUCN to direct conservation priorities and management plans. By including data from 182 historical museum specimens, faecal samples from ongoing field surveys, and published sequences representing sub-Saharan Africa, we identify previously unrecognized genetic diversity in African leopards. Our mtDNA data indicates high levels of divergence among regional populations and strongly differentiated lineages in West Africa on par with recent studies of other large vertebrates. We provide a reference benchmark of genetic diversity in African leopards against which future monitoring can be compared. These findings emphasize the utility of historical museum collections in understanding the processes that shape present biodiversity. Additionally, we suggest future research to clarify African leopard taxonomy and to differentiate between delineated units requiring monitoring or conservation action.
DALMATIAN: An Algorithm for Automatic Cell Detection and Counting in 3D.
Shuvaev, Sergey A; Lazutkin, Alexander A; Kedrov, Alexander V; Anokhin, Konstantin V; Enikolopov, Grigori N; Koulakov, Alexei A
2017-01-01
Current 3D imaging methods, including optical projection tomography, light-sheet microscopy, block-face imaging, and serial two-photon tomography, enable visualization of large samples of biological tissue. Large volumes of data obtained at high resolution require development of automatic image processing techniques, such as algorithms for automatic cell detection or, more generally, point-like object detection. Current approaches to automated cell detection suffer from difficulties in detecting particular cell types, cell populations of different brightness, non-uniformly stained cells, and overlapping cells. In this study, we present a set of algorithms for robust automatic cell detection in 3D. Our algorithms are suitable for, but not limited to, whole brain regions and individual brain sections. We used a watershed procedure to split regional maxima representing overlapping cells. We developed a bootstrap Gaussian fit procedure to evaluate the statistical significance of detected cells. We compared the cell detection quality of our algorithm and other software using 42 samples, representing 6 staining and imaging techniques. The results provided by our algorithm matched manual expert quantification with signal-to-noise-dependent confidence, including samples with cells of different brightness, non-uniform staining, and overlapping cells, for whole brain regions and individual tissue sections. Our algorithm provided the best cell detection quality among the tested free and commercial software.
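The point-like object detection step described above can be approximated in a few lines (a simplified stand-in using Gaussian smoothing and local-maximum detection on a synthetic 3D volume; it does not reproduce the watershed splitting or the bootstrap Gaussian-fit significance test of the published algorithm, and all sizes and thresholds are arbitrary):

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)

    # Synthetic 3D volume: three blob-like "cells" on a noisy background
    volume = rng.normal(0.0, 0.1, size=(64, 64, 64))
    zz, yy, xx = np.indices(volume.shape)
    for cz, cy, cx in [(20, 20, 20), (40, 30, 50), (10, 50, 35)]:
        volume += 3.0 * np.exp(-((zz - cz) ** 2 + (yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 2.0 ** 2))

    # Smooth, then keep local maxima that clear a global intensity threshold
    smoothed = ndimage.gaussian_filter(volume, sigma=1.0)
    is_peak = smoothed == ndimage.maximum_filter(smoothed, size=5)
    candidates = is_peak & (smoothed > smoothed.mean() + 4 * smoothed.std())
    centers = np.argwhere(candidates)
    print(len(centers), "candidate cells at", centers.tolist())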
Ungulate management in national parks of the United States and Canada
Demarais, S.; Cornicelli, L.; Kahn, R.; Merrill, E.; Miller, C.; Peek, J.M.; Porter, W.F.; Sargeant, G.A.
2012-01-01
Enabling legislation—that which gives appropriate officials the authority to implement or enforce the law—impacts management of ungulates in national parks of Canada and the United States (U.S.). The initial focus of such legislation in both countries centered on preserving natural and culturally significant areas for posterity. Although this objective remains primary, philosophies and practices have changed. A Canadian vision for ungulate management emerged during the latter half of the 20th century to protect and maintain or restore the ecological integrity of representative samples of the country’s 39 distinct landscapes, and to include provisions for traditional hunting and fishing practices representative of past cultural impacts on the environment. The current ungulate management approach in the U.S. relies on natural (ecological) processes, as long as normal conditions are promoted and there is no impairment of natural resources. Emphasizing natural processes as the basis has been a challenge because ecosystem dynamics are complex and management is multi-jurisdictional. Additionally, natural regulation typically will not prevent ungulates from reaching and sustaining densities that are incompatible with preservation or restoration of native flora and fauna, natural processes, or historical landscapes.
Information Processing from Infancy to 11 Years: Continuities and Prediction of IQ
Rose, Susan A.; Feldman, Judith F.; Jankowski, Jeffery J.; Van Rossem, Ronan
2012-01-01
This study provides the first direct evidence of cognitive continuity for multiple specific information-processing abilities from infancy and toddlerhood to pre-adolescence, and provides support for the view that infant abilities form the basis of later childhood abilities. Data from a large sample of children (N = 131) were obtained at five different time points (7, 12, 24, 36 months, and 11 years) for a large battery of tasks representing four cognitive domains (attention, processing speed, memory, and representational competence). Structural equation models of continuity were assessed for each domain, in which it was assumed that infant abilities → toddler abilities → 11-year abilities. Abilities at each age were represented by latent variables, which minimize task-specific variance and measurement error. The model for each domain fit the data. Moreover, abilities from the three age periods predicted global outcome, with infant, toddler, and contemporaneous 11-year measures, respectively, accounting for 12.3%, 18.5%, and 45.2% of the variance in 11-year IQ. These findings strengthen contentions that specific cognitive abilities that can be identified in infancy show long-term continuity and contribute importantly to later cognitive competence. PMID:23162179
Appearance-based representative samples refining method for palmprint recognition
NASA Astrophysics Data System (ADS)
Wen, Jiajun; Chen, Yan
2012-07-01
Sparse representation can cope with the shortage-of-samples problem because it utilizes all the training samples. However, the discrimination ability degrades when more training samples are used for representation. We propose a novel appearance-based palmprint recognition method that seeks a compromise between discrimination ability and the shortage-of-samples problem in order to obtain a proper representation scheme. Under the assumption that the test sample can be well represented by a linear combination of a certain number of training samples, we first select the representative training samples according to their contributions. We then further refine the training samples by an iterative procedure that, at each step, excludes the training sample with the least contribution to the test sample. Experiments on the PolyU multispectral palmprint database and the two-dimensional and three-dimensional palmprint database show that the proposed method outperforms conventional appearance-based palmprint recognition methods. Moreover, we explore how the key parameters of the proposed algorithm should be chosen, which helps to obtain high recognition accuracy.
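The representative-sample refining idea described above (represent the test sample as a linear combination of training samples, then iteratively drop the sample contributing least) can be sketched as follows; the contribution score, stopping rule, and random feature data are simplifications chosen for illustration, not the exact scheme of the paper:

    import numpy as np

    def refine_training_samples(X_train, y_test, keep=10):
        """Iteratively drop the training sample contributing least to representing y_test.

        X_train: (n_samples, n_features) matrix; y_test: (n_features,) vector.
        Returns indices of the retained "representative" training samples.
        """
        idx = list(range(X_train.shape[0]))
        while len(idx) > keep:
            A = X_train[idx].T                                  # columns = training samples
            coef, *_ = np.linalg.lstsq(A, y_test, rcond=None)   # linear representation
            contrib = np.abs(coef) * np.linalg.norm(A, axis=0)  # crude contribution score
            idx.pop(int(np.argmin(contrib)))                    # drop the least useful sample
        return idx

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 64))        # 40 training palmprint feature vectors
    y = 0.7 * X[3] + 0.3 * X[17]         # test sample built from two training samples
    print(refine_training_samples(X, y, keep=5))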
Sampling device for withdrawing a representative sample from single and multi-phase flows
Apley, Walter J.; Cliff, William C.; Creer, James M.
1984-01-01
A fluid stream sampling device has been developed for the purpose of obtaining a representative sample from a single or multi-phase fluid flow. This objective is carried out by means of a probe which may be inserted into the fluid stream. Individual samples are withdrawn from the fluid flow by sampling ports with particular spacings, and the sampling ports are coupled to various analytical systems for characterization of the physical, thermal, and chemical properties of the fluid flow as a whole and also individually.
On the representativeness of behavior observation samples in classrooms.
Tiger, Jeffrey H; Miller, Sarah J; Mevers, Joanna Lomas; Mintz, Joslyn Cynkus; Scheithauer, Mindy C; Alvarez, Jessica
2013-01-01
School consultants who rely on direct observation typically conduct observational samples (e.g., one 30-min observation per day) with the hope that the sample is representative of performance during the remainder of the day, but the representativeness of these samples is unclear. In the current study, we recorded the problem behavior of 3 referred students for 4 consecutive school days between 9:30 a.m. and 2:30 p.m. using duration recording in consecutive 10-min sessions. We then culled 10-min, 20-min, 30-min, and 60-min observations from the complete record and compared these observations to the true daily mean to assess their accuracy (i.e., how well individual observations represented the daily occurrence of target behaviors). The results indicated that when behavior occurred with low variability, the majority of brief observations were representative of the overall levels; however, when behavior occurred with greater variability, even 60-min observations did not accurately capture the true levels of behavior. © Society for the Experimental Analysis of Behavior.
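The culling-and-comparison procedure described above can be sketched with simulated data (all numbers are invented; a 5-hour school day is treated as thirty 10-min sessions, and contiguous windows of different lengths are compared with the daily mean):

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated percentage of each 10-min session spent in problem behavior
    # (thirty sessions between 9:30 a.m. and 2:30 p.m.); larger scale = more variable behavior
    daily_record = np.clip(rng.normal(loc=20, scale=15, size=30), 0, 100)
    print(f"true daily mean = {daily_record.mean():.1f}%")

    def observation_errors(record, sessions_per_obs):
        """Absolute error of every contiguous observation window vs. the daily mean."""
        means = [record[i:i + sessions_per_obs].mean()
                 for i in range(len(record) - sessions_per_obs + 1)]
        return np.abs(np.array(means) - record.mean())

    for minutes, n in [(10, 1), (20, 2), (30, 3), (60, 6)]:
        err = observation_errors(daily_record, n)
        print(f"{minutes:>3}-min windows: median |error| = {np.median(err):.1f} percentage points")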
Salmonella contamination risk points in broiler carcasses during slaughter line processing.
Rivera-Pérez, Walter; Barquero-Calvo, Elías; Zamora-Sanabria, Rebeca
2014-12-01
Salmonella is one of the foodborne pathogens most commonly associated with poultry products. The aim of this work was to identify and analyze key sampling points creating risk of Salmonella contamination in a chicken processing plant in Costa Rica and perform a salmonellosis risk analysis. Accordingly, the following examinations were performed: (i) qualitative testing (presence or absence of Salmonella), (ii) quantitative testing (Salmonella CFU counts), and (iii) salmonellosis risk analysis, assuming consumption of contaminated meat from the processing plant selected. Salmonella was isolated in 26% of the carcasses selected, indicating 60% positive in the flocks sampled. The highest Salmonella counts were observed after bleeding (6.1 log CFU per carcass), followed by a gradual decrease during the subsequent control steps. An increase in the percentage of contamination (10 to 40%) was observed during evisceration and spray washing (after evisceration), with Salmonella counts increasing from 3.9 to 5.1 log CFU per carcass. According to the prevalence of Salmonella-contaminated carcasses released to trade (20%), we estimated a risk of 272 cases of salmonellosis per year as a result of the consumption of contaminated chicken. Our study suggests that the processes of evisceration and spray washing represent a risk of Salmonella cross-contamination and/or recontamination in broilers during slaughter line processing.
ACID EVAPORATION OF ULTIMA GOLD TM AB LIQUID SCINTILLATION COCKTAIL RESIDUE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyser, E.; Fondeur, F.; Crump, S.
2011-12-21
Prior analyses of samples from the F/H Lab solutions showed the presence of diisopropylnaphthalene (DIN), a major component of Ultima Gold™ AB liquid scintillation cocktail (LSC). These solutions are processed through H-Canyon Tank 10.5 and ultimately through the 17.8E evaporator. Similar solutions originated in SRNL streams sent to the same H Canyon tanks. This study examined whether the presence of these organics poses a process-significant hazard for the evaporator. Evaporation and calorimetry testing of surrogate samples containing 2000 ppm of Ultima Gold™ AB LSC in 8 M nitric acid have been completed. These experiments showed that although reactions between nitric acid and the organic components do occur, they do not appear to pose a significant hazard for runaway reactions or generation of energetic compounds in canyon evaporators. The amount of off-gas generated was relatively modest and appeared to be well within the venting capacity of the H-Canyon evaporators. A significant fraction of the organic components likely survives the evaporation process primarily as non-volatile components that are not expected to represent any new process concerns during downstream operations such as neutralization. Laboratory Waste solutions containing minor amounts of DIN can be safely received, stored, transferred, and processed through the canyon waste evaporator.
Unsupervised object segmentation with a hybrid graph model (HGM).
Liu, Guangcan; Lin, Zhouchen; Yu, Yong; Tang, Xiaoou
2010-05-01
In this work, we address the problem of performing class-specific unsupervised object segmentation, i.e., automatic segmentation without annotated training images. Object segmentation can be regarded as a special data clustering problem where both class-specific information and local texture/color similarities have to be considered. To this end, we propose a hybrid graph model (HGM) that can make effective use of both symmetric and asymmetric relationship among samples. The vertices of a hybrid graph represent the samples and are connected by directed edges and/or undirected ones, which represent the asymmetric and/or symmetric relationship between them, respectively. When applied to object segmentation, vertices are superpixels, the asymmetric relationship is the conditional dependence of occurrence, and the symmetric relationship is the color/texture similarity. By combining the Markov chain formed by the directed subgraph and the minimal cut of the undirected subgraph, the object boundaries can be determined for each image. Using the HGM, we can conveniently achieve simultaneous segmentation and recognition by integrating both top-down and bottom-up information into a unified process. Experiments on 42 object classes (9,415 images in total) show promising results.
NASA Astrophysics Data System (ADS)
Terán Mita, Tania; Faz Cano, Angel; Muñoz, Maria Angeles; Millán Gómez, Rocio; Chincheros Paniagua, Jaime
2010-05-01
In Bolivia, metal mining activities have been one of the most important sources of environmental pollution since historical times. This is the case in the National Area of Apolobamba Integrated Management (ANMIN of Apolobamba) in La Paz, Bolivia, where intense gold mining has been carried out from former times to the present with very little gold recovery and very primitive mineral processing technology; in fact, mercury is still used in the amalgamation step of gold concentration, and the amalgam is burned outdoors to recover the gold. Sunchullí is a representative mining district in the ANMIN of Apolobamba where the mining activity is mainly gold extraction and the water effluents drain to the Amazon basin; in this district the productivity of the extracted mineral is very low, but the processes can result in heavy-metal contamination of the air, water, soils and plants. Because of their high toxicity, contamination by arsenic and mercury creates the most critical environmental problems. In addition, other heavy metals such as lead, copper, zinc and cadmium may also be present. In their bio-available and soluble forms, these heavy metals can enter the trophic chain through the flora and fauna. Inhabitants of this area consume food crops and fish from lakes and rivers, and use the waters for livestock, domestic use, and irrigation. The aim of this work was to evaluate heavy-metal pollution caused by gold mining activities in the Sunchullí area. In Sunchullí, two representative zones were distinguished and sampled: the zone near the mining operation site was considered affected by mineral extraction processes, while zones farther away represented areas not affected by the mining operation. In each zone, 3 plots were established; in each plot, 3 soil sampling points were selected at random and analysed separately. At each sampling point, two samples were taken, one at the surface, from 0-5 cm depth (topsoil), and the other between 5 and 15 cm (subsurface). In addition, surface soils from mercury burn areas were also collected. Total, DTPA-extractable and water-extractable concentrations of arsenic, mercury, lead, copper, zinc and cadmium were determined. In both zones, the results show that mining activities do not increase heavy-metal levels except for arsenic (17.20-69.25 mg/kg), whose high concentrations surpass the Belgian reference level (19.00 mg/kg); in some cases high mercury values stand out in the affected zone (2.07 mg/kg, 1.18 mg/kg, 1.93 mg/kg). The most polluted soils are the mercury burn areas, with high mercury levels (4.21-21.79 mg/kg) surpassing the Dutch regulatory level (0.3 mg/kg). Workers and the local population are in close contact with these soils without any type of protection.
Kaufman, John A.; Brown, Mary Jean; Umar-Tsafe, Nasir T.; Adbullahi, Muhammad Bashir; Getso, Kabiru I.; Kaita, Ibrahim M.; Sule, Binta Bako; Ba’aba, Ahmed; Davis, Lora; Nguku, Patrick M.; Sani-Gwarzo, Nasir
2018-01-01
Background In March 2010, Medecins Sans Frontieres/Doctors Without Borders detected an outbreak of acute lead poisoning in Zamfara State, northwestern Nigeria, linked to low-technology gold ore processing. The outbreak killed more than 400 children ≤5 years of age in the first half of 2010 and has left more than 2,000 children with permanent disabilities. Objectives The aims of this study were to estimate the statewide prevalence of children ≤5 years old with elevated blood lead levels (BLLs) in gold ore processing and non-ore-processing communities, and to identify factors associated with elevated blood lead levels in children. Methods A representative, population-based study of ore processing and non-ore-processing villages was conducted throughout Zamfara in 2012. Blood samples from children, outdoor soil samples, indoor dust samples, and survey data on ore processing activities and other lead sources were collected from 383 children ≤5 years old in 383 family compounds across 56 villages. Results 17.2% of compounds reported that at least one member had processed ore in the preceding 12 months (95% confidence intervals (CI): 9.7, 24.7). The prevalence of BLLs ≥10 µg/dL in children ≤5 years old was 38.2% (95% CI: 26.5, 51.4) in compounds with members who processed ore and 22.3% (95% CI: 17.8, 27.7) in compounds where no one processed ore. Ore processing activities were associated with higher lead concentrations in soil, dust, and blood samples. Other factors associated with elevated BLL were a child’s age and sex, breastfeeding, drinking water from a piped tap, and exposure to eye cosmetics. Conclusions Childhood lead poisoning is widespread in Zamfara State in both ore processing and non-ore-processing settings, although it is more prevalent in ore processing areas. Although most children’s BLLs were below the recommended level for chelation therapy, environmental remediation and use of safer ore processing practices are needed to prevent further exposures. PMID:29416933
Isotopes in North American Rocky Mountain Snowpack 1993-2014
NASA Astrophysics Data System (ADS)
Anderson, Lesleigh; Berkelhammer, Max; Mast, M. Alisa
2016-01-01
We present ∼1300 new isotopic measurements (δ18O and δ2H) from a network of snowpack sites in the Rocky Mountains that have been sampled since 1993. The network includes 177 locations where depth-integrated snow samples are collected each spring near peak accumulation. At 57 of these locations snowpack samples were obtained for 10-21 years, and their isotopic measurements provide unprecedented spatial and temporal documentation of snowpack isotope values at mid-latitudes. For environments where snowfall accounts for the majority of annual precipitation, snowmelt is likely to have the strongest influence on isotope values retained in proxy archives. In this first presentation of the dataset we (1) describe the basic features of the isotope values in relation to the Global Meteoric Water Line (GMWL), (2) evaluate space-for-time substitutions traditionally used to establish δ18O-temperature relations, (3) evaluate site-to-site similarities across the network and identify those that are the most regionally representative, (4) examine atmospheric circulation patterns for several years with spatially coherent isotope patterns, and (5) provide examples of the implications this new dataset has for interpreting paleoclimate records (Bison Lake, Colorado and Minnetonka Cave, Idaho). Results indicate that snowpack δ18O is rarely a simple proxy of temperature. Instead, it exhibits a high degree of spatial heterogeneity and temporal variance that reflect additional processes such as vapor transport and post-depositional modification. Despite these complexities we identify consistent climate-isotope patterns and regionally representative locations that serve to better define Holocene hydroclimate estimates and their uncertainty. Climate change has affected and will continue to affect western U.S. snowpack, and we suggest these changes can be better understood and anticipated by oxygen- and hydrogen-isotope-based reconstructions of Holocene hydroclimate using a process-based understanding of the controls on snowpack isotope ratios.
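For reference, the Global Meteoric Water Line mentioned above, together with the deuterium-excess parameter commonly derived from the same two isotope ratios, can be written as follows (standard textbook relations in ‰ relative to VSMOW, not equations taken from this dataset):

    \delta^{2}\mathrm{H} = 8\,\delta^{18}\mathrm{O} + 10 \qquad \text{(GMWL)}
    d\text{-excess} = \delta^{2}\mathrm{H} - 8\,\delta^{18}\mathrm{O}

Departures of a sample from the GMWL, summarized by the deuterium excess, are one common way to diagnose the vapor-source and post-depositional effects the abstract describes.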
Lavilla Lerma, Leyre; Benomar, Nabil; Gálvez, Antonio; Abriouel, Hikmate
2013-02-01
In order to investigate the prevalence of bacteria resistant to biocides and/or antibiotics throughout the meat production chain, from slaughter to the end of the production line, samples from various surfaces of a goat and lamb slaughterhouse representative of the region were analyzed by a culture-dependent approach. Resistant psychrotrophs (n=255 strains), Pseudomonas sp. (n=166 strains), E. coli (n=23 strains), Staphylococcus sp. (n=17 strains) and LAB (n=82, represented mainly by Lactobacillus sp.) were isolated. Psychrotrophs and pseudomonads resistant to different antimicrobials (47% and 29%, respectively) were frequently detected in almost all areas of the meat processing plant regardless of the antimicrobial used, although there was a clear shift in the spectrum of the other bacterial groups; resistance was therefore examined with respect to several parameters: the antimicrobial tested, the sampling zone and the bacterial group. The parameters were correlated using principal component analysis (PCA), which indicated that quaternary ammonium compounds and hexadecylpyridinium were the most relevant biocides for resistance in Pseudomonas sp., while ciprofloxacin and hexachlorophene were more relevant for psychrotrophs, LAB and, to a lesser extent, Staphylococcus sp. and Escherichia coli. PCA of the sampling zones showed that the sacrifice room (SR) and cutting room (CR), considered the main sources of antibiotic- and/or biocide-resistant bacteria, behaved in opposite ways with respect to which antimicrobials were most relevant for resistance: hexadecylpyridinium, cetrimide and chlorhexidine were the most relevant in the CR, while hexachlorophene, Oxonia 6P and PHMG were the most relevant in the SR. In conclusion, rotational use of the relevant biocides as disinfectants in the CR and SR is recommended in an environment that is frequently disinfected. Copyright © 2012 Elsevier B.V. All rights reserved.
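A minimal sketch of the kind of PCA used above to relate antimicrobials, sampling zones, and bacterial groups (the resistance matrix is random placeholder data; the antimicrobial names are taken from the abstract, and standardizing before PCA is an assumption about preprocessing rather than a detail reported by the authors):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    antimicrobials = ["QAC", "hexadecylpyridinium", "cetrimide", "chlorhexidine",
                      "hexachlorophene", "Oxonia 6P", "PHMG", "ciprofloxacin"]

    # Placeholder matrix: rows = isolates, columns = a resistance measure per antimicrobial
    X = rng.normal(size=(120, len(antimicrobials)))

    pca = PCA(n_components=2)
    scores = pca.fit_transform(StandardScaler().fit_transform(X))   # per-isolate coordinates
    loadings = pca.components_                                      # antimicrobial loadings

    print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
    for name, pc1, pc2 in zip(antimicrobials, loadings[0], loadings[1]):
        print(f"{name:>20}: PC1 {pc1:+.2f}  PC2 {pc2:+.2f}")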
Occurrence of invertebrates at 38 stream sites in the Mississippi Embayment study unit, 1996-99
Caskey, Brian J.; Justus, B.G.; Zappia, Humbert
2002-01-01
A total of 88 invertebrate species and 178 genera representing 59 families, 8 orders, 6 classes, and 3 phyla was identified at 38 stream sites in the Mississippi Embayment Study Unit from 1996 through 1999 as part of the National Water-Quality Assessment Program. Sites were selected based on land use within the drainage basins and the availability of long-term streamflow data. Invertebrates were sampled as part of an overall sampling design to provide information related to the status and trends in water quality in the Mississippi Embayment Study Unit, which includes parts of Arkansas, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. Invertebrate sampling and processing were conducted using nationally standardized techniques developed for the National Water-Quality Assessment Program. These techniques included both a semi-quantitative method, which targeted habitats where invertebrate diversity is expected to be highest, and a qualitative multihabitat method, which samples all available habitat types possible within a sampling reach. All invertebrate samples were shipped to the USGS National Water-Quality Laboratory (NWQL) where they were processed. Of the 365 taxa identified, 156 were identified with the semi-quantitative method that involved sampling a known quantity of what was expected to be the richest habitat, woody debris. The qualitative method, which involved sampling all available habitats, identified 345 taxa. The number of organisms identified in the semi-quantitative samples ranged from 74 to 3,295, whereas the number of taxa identified ranged from 9 to 54. The number of organisms identified in the qualitative samples ranged from 42 to 29,634, whereas the number of taxa ranged from 18 to 81. Of all the organisms identified, chironomid taxa were the most frequently identified, and plecopteran taxa were among the least frequently identified.
MacDonnell, Judith A.; Dastjerdi, Mahdieh; Bokore, Nimo; Khanlou, Nazilla
2012-01-01
This paper reports on grounded theory findings that are relevant to promoting the mental health and well-being of immigrant women in Canada. The findings illustrate how relationships among settlement factors and dynamics of empowerment had implications for “becoming resilient” as immigrant women and how various health promotion approaches enhanced their well-being. Dimensions of empowerment were embedded in the content and process of the feminist health promotion approach used in this study. Four focus groups were completed in Toronto, Ontario, Canada with 35 racialized immigrant women who represented diverse countries of origin: 25 were from Africa; others were equally represented from South Asia (5), Asia (5), and Central or South America and the Caribbean (5). Participants represented diverse languages, family dynamics, and educational backgrounds. One focus group was conducted in Somali; three were conducted in English. Constructivist grounded theory, theoretical sampling, and a critical feminist approach were chosen to be congruent with health promotion research that fostered women's empowerment. Findings foreground women's agency in the study process, the ways that immigrant women name and frame issues relevant to their lives, and the interplay among individual, family, community, and structural dynamics shaping their well-being. Implications for mental health promotion are discussed. PMID:22754696
Volatile element chemistry of selected lunar, meteoritic, and terrestrial samples
NASA Technical Reports Server (NTRS)
Simoneit, B. R.; Christiansen, P. C.; Burlingame, A. L.
1973-01-01
Using vacuum pyrolysis and high resolution mass spectrometry, a study is made of the gas release patterns of representative lunar samples, meteorites, terrestrial samples, and synthetic samples doped with various sources of carbon and nitrogen. The pyrolytic gas evolution patterns were intercorrelated, allowing an assessment of the possible sources of the volatilizable material in the lunar samples to be made. Lightly surface adsorbed species and more strongly chemisorbed species are released from ambient to 300 C and from 300 to 500 C, respectively. The low-temperature volatiles (less than 500 C) derived from various chondrites correlate well with the gas evolution patterns of volatile-rich samples, as for example 74220 and 61221. Solar wind entrapped species and molecules derived from reactions probably in the grain surfaces are evolved from about 500 to 700 C, respectively. Solar wind implanted C, N, and S species are generated from 750 to 1150 C, probably by reaction with the mineral matrix during the annealing process. Possible indigenous and/or refractory carbide, nitride, and sulfide C, N, and S are released in the region from 1200 C to fusion.
Some physical properties of naturally irradiated fluorite
Berman, Robert
1955-01-01
Five samples of purple fluorite found in association with radioactive materials, and a synthetic colorless control sample were studied and compared. Before and after heating, observations were made on specific gravity, index of refraction, unit-cell size, breadth of X-ray diffraction lines, and fluorescence. The purple samples became colorless on heating above 175° C. During the process, observations were made on color, thermoluminescence, and differential thermal analysis curves. There were strong correlations between the various physical properties, and it was found possible to arrange the samples in order of increasing difference in their physical properties from the control sample. This order apparently represents increasing structural damage by radiation; if so, it correlates with decreasing specific gravity, increasing index of refraction, broadening of X-ray lines, and increasingly strong exothermic reactions on annealing. The differences between the samples in index of refraction and X-ray pattern are largely eliminated on annealing. Annealing begins at 175° C; thermoluminescence at lower temperatures is due to electrons escaping from the metastable potential traps, not the destruction of those traps which takes place on annealing.
Compaction of Railway Ballast During Tamping Process: a Parametric Study
NASA Astrophysics Data System (ADS)
Saussine, G.; Azéma, E.; Perales, R.; Radjaï, F.
2009-06-01
We characterize an industrial process currently used on railway track: the tamping operation. This process is employed to restore the geometry of railway track distorted by train traffic. The main goal is to compact the granular material under the sleepers supporting the rails by means of squeezing and vibration. We focus on different phases of the tamping process, namely the penetration of tamping tines into the ballast and the squeezing of ballast between tines. Our numerical simulations of three-dimensional discrete polyhedral grains allow us to investigate the influence of the vibration frequency on the compaction level at the end of the process, the role of the velocity of the tamping tines during the penetration phase, and the mechanism of compaction of a confined granular layer under horizontal vibrations. For each tamping phase, an optimal frequency is proposed, and an analysis of the full process on samples representing a portion of the railway track enables us to assess the influence of the various parameters required to optimize the process.
NASA Astrophysics Data System (ADS)
Raleigh, M. S.; Smyth, E.; Small, E. E.
2017-12-01
The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.
Modern and Unconventional Approaches to Karst Hydrogeology
NASA Astrophysics Data System (ADS)
Sukop, M. C.
2017-12-01
Karst hydrogeology is frequently approached from a hydrograph/statistical perspective where precipitation/recharge inputs are converted to output hydrographs and the conversion process reflects the hydrology of the system. Karst catchments show hydrological response to short-term meteorological events and to long-term variation of large-scale atmospheric circulation. Modern approaches to analysis of these data include, for example, multiresolution wavelet techniques applied to understand relations between karst discharge and climate fields. Much less effort has been directed towards direct simulation of flow fields and transport phenomena in karst settings. This is primarily due to the lack of information on the detailed physical geometry of most karst systems. New mapping, sampling, and modeling techniques are beginning to enable direct simulation of flow and transport. A Conduit Flow Process (CFP) add-on to the USGS ModFlow model became available in 2007. FEFLOW and similar models are able to represent flows in individual conduits. Lattice Boltzmann models have also been applied to flow modeling in karst systems. Regarding quantitative measurement of karst system geometry, at scales to 0.1 m, X-ray computed tomography enables good detection of detailed (sub-millimeter) pore space in karstic rocks. Three-dimensional printing allows reconstruction of fragile high porosity rocks, and surrogate samples generated this way can then be subjected to laboratory testing. Borehole scales can be accessed with high-resolution ( 0.001 m) Digital Optical Borehole Imaging technologies and can provide virtual samples more representative of the true nature of karst aquifers than can obtained from coring. Subsequent extrapolation of such samples can generate three-dimensional models suitable for direct modeling of flow and transport. Finally, new cave mapping techniques are beginning to provide information than can be applied to direct simulation of flow. Due to flow rates and cave diameter, very high Reynolds number flows may be encountered.
Who Are We Studying? Sample Diversity in Teaching of Psychology Research
ERIC Educational Resources Information Center
Richmond, Aaron S.; Broussard, Kristin A.; Sterns, Jillian L.; Sanders, Kristina K.; Shardy, Justin C.
2015-01-01
The purpose of the current study was to examine the sample diversity of empirical articles published in four premier teaching of psychology journals from 2008 to 2013. We investigated which demographic information was commonly reported and if samples were ethnically representative and whether gender was representative compared to National…
Prevalence and Predictors of Sexual Assault among a College Sample
ERIC Educational Resources Information Center
Conley, A. H.; Overstreet, C. M.; Hawn, S. E.; Kendler, K. S.; Dick, D. M.; Amstadter, A. B.
2017-01-01
Objective: This study examined the prevalence and correlates of precollege, college-onset, and repeat sexual assault (SA) within a representative student sample. Participants: A representative sample of 7,603 students. Methods: Incoming first-year students completed a survey about their exposure to broad SA prior to college, prior trauma,…
Han, Hongyan; Wang, Chao; Li, Yanbing; Yu, Zhu; Xu, Qingfang; Li, Guangpeng; Minh, Tang Thuy; Nishino, Naoki
2018-01-01
In order to assess the survival of lactic acid bacteria (LAB) in whole crop maize silage in the gut of dairy cows, one representative silage sample and three different feces samples were collected from dairy cows on three dairy farms in Hua Bei, China and three dairy farms in Kyushu, Japan. The composition of the bacterial community was examined by denaturing gradient gel electrophoresis and quantitative polymerase chain reaction. Lactobacillus acetotolerans was detected in all bunker-made maize silage samples, regardless of the dairy farm or sampling region from which they were sourced. A total of eight LAB species were detected in the maize silage samples, of which three (L. acetotolerans, L. pontis and L. casei) appeared to survive digestion. The populations of L. acetotolerans in silage and feces were 10^6-10^7 and 10^3-10^4 copies/g, respectively, indicating that, even for the LAB species showing potential survival in the gut, competition in this niche may be harsh and the population may substantially decrease during the digestion process. It may be difficult for silage LAB to survive in the gut of silage-fed dairy cows, because a marked decrease in population can take place during the digestion process, even for surviving species. © 2017 Japanese Society of Animal Science.
A Search for X-ray Emission in Isolated Compact Triplets
NASA Technical Reports Server (NTRS)
Brown, Beth A.; Williams, Barbara
2006-01-01
We describe preliminary results of an exploratory search for diffuse X-ray emission in a sample of the poorest galaxy groups, i.e., isolated compact triplets of galaxies. These systems represent the simplest forms of galaxy clustering while manifesting all the complexities inherent in other groups. We have selected 20 compact triplets for this initial study. The component galaxies are expected to interact with each other and with the group's intergalactic medium, if present, in complex ways that trigger high-energy processes.
Distributed Spacing Stochastic Feature Selection and its Application to Textile Classification
2011-09-01
[Abstract unavailable; the extracted text consists of figure-caption fragments referencing representative samples from a 12-class textile data set used in the feature selection experiments, e.g., 65% Polyester / 35% Cotton woven, 80% Nylon / 20% Spandex knit, 94% Polyester / 6% Spandex, and 100% Cotton.]
Mechanically controllable break junctions for molecular electronics.
Xiang, Dong; Jeong, Hyunhak; Lee, Takhee; Mayer, Dirk
2013-09-20
A mechanically controllable break junction (MCBJ) represents a fundamental technique for the investigation of molecular electronic junctions, especially for the study of the electronic properties of single molecules. With unique advantages, the MCBJ technique has provided substantial insight into charge transport processes in molecules. In this review, the techniques for sample fabrication, operation and the various applications of MCBJs are introduced and the history, challenges and future of MCBJs are discussed. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fukasawa, Lucila Okuyama; Gonçalves, Maria Gisele; Higa, Fábio Takenori; Castilho, Euclides Ayres; Ibarz-Pavón, Ana Belén; Sacchi, Claudio Tavares
2017-01-01
The lack of information regarding the burden of acute bacterial meningitis in Latin America leads to a reduction in the estimated incidence rates of the disease, and impairs public health decisions on the use and follow-up of preventive interventions, particularly, the evaluation of existing vaccination policies. The use of the real-time PCR in diagnostic routine procedures has resulted in a substantial increase in confirmed bacterial meningitis cases. However, in resource-poor countries, these assays are only available in reference laboratories. Sample transportation to these laboratories is a critical constraint, as it requires specialized, high cost courier services. To overcome this barrier we evaluated the use of FTA™ Elute filter paper cards for the conservation and processing of samples under normal environmental conditions, as they would be when transported from remote and under-equipped healthcare facilities to the reference centers. A total of 401 samples received in 2015 as part of Sao Paulo's national surveillance for routine diagnosis were selected for this study. The sensitivity and specificity of real-time PCR were evaluated using fresh serum and cerebrospinal fluid (CSF) samples processed using our laboratory's standard DNA extraction, and processing the same samples after being dried and stored on FTA™ card, and DNA extracted following the manufacturer's instructions. The sensitivities for detection of Neisseria meningitidis, Streptococcus pneumoniae, and Haemophilus influenzae from CSF dried and stored on FTA™ cards were 98%, 92%, and 100%, respectively, and with serum samples were 73%, 88%, and 100%, respectively. When compared to our laboratory's standard methodology, results showed high concordance, with Kappa index ranges of 0.9877-1.00 for CSF, and 0.8004-1.00 for serum samples. The use of FTA™ cards for CSF and serum conservation and transport represents a rapid, reliable, and cost-effective alternative that will allow obtaining valuable epidemiological information that would otherwise be lost.
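The agreement statistics reported above (sensitivity against the reference method, Cohen's kappa for paired results) can be computed from a simple 2x2 table of paired qualitative PCR calls; the sketch below uses hypothetical counts, not the study's data.

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of reference-positive samples also detected by the card method."""
    return true_pos / (true_pos + false_neg)

def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Kappa for a 2x2 table of paired results:
       a = both methods positive, b = reference pos / card neg,
       c = reference neg / card pos, d = both negative."""
    n = a + b + c + d
    observed = (a + d) / n
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical paired counts for one pathogen (illustrative only).
print(f"sensitivity = {sensitivity(true_pos=49, false_neg=1):.2f}")
print(f"kappa = {cohens_kappa(a=49, b=1, c=0, d=350):.3f}")
```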
Kraschnewski, Jennifer L; Keyserling, Thomas C; Bangdiwala, Shrikant I; Gizlice, Ziya; Garcia, Beverly A; Johnston, Larry F; Gustafson, Alison; Petrovic, Lindsay; Glasgow, Russell E; Samuel-Hodge, Carmen D
2010-01-01
Studies of type 2 translation, the adaption of evidence-based interventions to real-world settings, should include representative study sites and staff to improve external validity. Sites for such studies are, however, often selected by convenience sampling, which limits generalizability. We used an optimized probability sampling protocol to select an unbiased, representative sample of study sites to prepare for a randomized trial of a weight loss intervention. We invited North Carolina health departments within 200 miles of the research center to participate (N = 81). Of the 43 health departments that were eligible, 30 were interested in participating. To select a representative and feasible sample of 6 health departments that met inclusion criteria, we generated all combinations of 6 from the 30 health departments that were eligible and interested. From the subset of combinations that met inclusion criteria, we selected 1 at random. Of 593,775 possible combinations of 6 counties, 15,177 (3%) met inclusion criteria. Sites in the selected subset were similar to all eligible sites in terms of health department characteristics and county demographics. Optimized probability sampling improved generalizability by ensuring an unbiased and representative sample of study sites.
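The enumerate-and-filter selection described in the preceding abstract (all combinations of 6 sites from 30, filtered by inclusion criteria, then one random draw) is easy to reproduce in outline; the inclusion criterion below is only a placeholder for the study's real criteria.

```python
import itertools
import math
import random

eligible_interested = list(range(30))   # stand-ins for the 30 interested health departments

def meets_inclusion_criteria(combo) -> bool:
    # Placeholder: the real criteria involved department and county characteristics.
    return sum(combo) % 20 == 0

all_combos = list(itertools.combinations(eligible_interested, 6))
print(len(all_combos), math.comb(30, 6))    # both 593775, matching the abstract

feasible = [c for c in all_combos if meets_inclusion_criteria(c)]
selected = random.choice(feasible)          # one unbiased draw from the feasible subset
print(f"{len(feasible)} feasible combinations; selected sites: {selected}")
```

Drawing uniformly from the feasible subset is what keeps the final sample unbiased even after feasibility constraints are imposed.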
Lozada, Mariana; Marcos, Magalí S.; Commendatore, Marta G.; Gil, Mónica N.; Dionisi, Hebe M.
2014-01-01
The aim of this study was to design a molecular biological tool, using information provided by amplicon pyrosequencing of 16S rRNA genes, that could be suitable for environmental assessment and bioremediation in marine ecosystems. We selected 63 bacterial genera that were previously linked to hydrocarbon biodegradation, representing a minimum sample of the bacterial guild associated with this process. We defined an ecological indicator (ecological index of hydrocarbon exposure, EIHE) using the relative abundance values of these genera obtained by pyrotag analysis. This index reflects the proportion of the bacterial community that is potentially capable of biodegrading hydrocarbons. When the bacterial community structures of intertidal sediments from two sites with different pollution histories were analyzed, 16 of the selected genera (25%) were significantly overrepresented with respect to the pristine site, in at least one of the samples from the polluted site. Although the relative abundances of individual genera associated with hydrocarbon biodegradation were generally low in samples from the polluted site, EIHE values were 4 times higher than those in the pristine sample, with at least 5% of the bacterial community in the sediments being represented by the selected genera. EIHE values were also calculated in other oil-exposed marine sediments as well as in seawater using public datasets from experimental systems and field studies. In all cases, the EIHE was significantly higher in oiled than in unpolluted samples, suggesting that this tool could be used as an estimator of the hydrocarbon-degrading potential of microbial communities. PMID:24964812
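An EIHE-style indicator is, in essence, the summed relative abundance of a predefined list of genera within one sample's community profile; a minimal sketch with illustrative genus names and read counts (not the study's 63-genus list or its data) is shown below.

```python
# Hypothetical per-sample read counts by genus (illustrative only).
sample_counts = {
    "Alcanivorax": 120,
    "Cycloclasticus": 45,
    "Marinobacter": 80,
    "Pelagibacter": 2300,
    "Flavobacterium": 610,
}

# Subset of genera previously linked to hydrocarbon biodegradation (illustrative).
hydrocarbon_degraders = {"Alcanivorax", "Cycloclasticus", "Marinobacter"}

def eihe_index(counts: dict, target_genera: set) -> float:
    """Summed relative abundance of the target genera in one sample."""
    total = sum(counts.values())
    hits = sum(n for genus, n in counts.items() if genus in target_genera)
    return hits / total

print(f"EIHE = {eihe_index(sample_counts, hydrocarbon_degraders):.3f}")
```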
Frequency position modulation using multi-spectral projections
NASA Astrophysics Data System (ADS)
Goodman, Joel; Bertoncini, Crystal; Moore, Michael; Nousain, Bryan; Cowart, Gregory
2012-10-01
In this paper we present an approach to harness multi-spectral projections (MSPs) to carefully shape and locate tones in the spectrum, enabling a new and robust modulation in which a signal's discrete frequency support is used to represent symbols. This method, called Frequency Position Modulation (FPM), is an innovative extension to MT-FSK and OFDM and can be non-uniformly spread over many GHz of instantaneous bandwidth (IBW), resulting in a communications system that is difficult to intercept and jam. The FPM symbols are recovered using adaptive projections that in part employ an analog polynomial nonlinearity paired with an analog-to-digital converter (ADC) sampling at a rate that is only a fraction of the IBW of the signal. MSPs also facilitate using commercial off-the-shelf (COTS) ADCs with uniform sampling, standing in sharp contrast to random linear projections obtained by random sampling, which require a full Nyquist-rate sample-and-hold. Our novel communication system concept provides an order of magnitude improvement in processing gain over conventional LPI/LPD communications (e.g., FH- or DS-CDMA) and facilitates the ability to operate in interference-laden environments where conventional compressed sensing receivers would fail. We quantitatively analyze the bit error rate (BER) and processing gain (PG) for a maximum-likelihood-based FPM demodulator and demonstrate its performance in interference-laden conditions.
Hawash, Y; Ghonaim, M; Hussein, Y; Alhazmi, A; Alturkistani, A
2015-06-01
The presence of Cryptosporidium and/or Giardia in drinking water represents a major public health problem. This study is the first report on the occurrence of these protozoa in drinking water in Saudi Arabia. The study was undertaken in Al-Taif, a high altitude region in Western Saudi Arabia. Water samples of 10 liters each were collected monthly between May 2013 and April 2014 from eight underground wells, six desalinated water sources, and five domestic brands of bottled water. All samples (n = 228) were processed using an automated wash/elution station (IDEXX Laboratories, Inc.). Genomic DNA was directly isolated and purified from sample concentrates with the QIAamp® Stool Mini Kit (Qiagen). The target protozoan DNA sequences were amplified using two previously published nested-PCR protocols. Of all the analyzed water, 31 samples (≈14%) were found contaminated with the target protozoa. Giardia lamblia was detected in ≈10% (7/72) of desalinated water samples and in ≈9% (9/96) of well water samples. On the other hand, Cryptosporidium was identified in ≈8% (8/72) of desalinated water samples and in ≈7% (7/96) of well water samples. All bottled water samples (n = 60) were (oo)cyst-free. Protozoan (oo)cysts were more frequently identified in water samples collected in the spring than in other seasons. The methodology established in our study proved sensitive and cost-effective, and is amenable to future automation or semi-automation. For a better understanding of the current situation, which represents an important health threat to the local inhabitants, further studies concerned with (oo)cyst viability, infectivity, concentration and genotype identification are recommended.
Diffusion Decision Model: Current Issues and History.
Ratcliff, Roger; Smith, Philip L; Brown, Scott D; McKoon, Gail
2016-04-01
There is growing interest in diffusion models to represent the cognitive and neural processes of speeded decision making. Sequential-sampling models like the diffusion model have a long history in psychology. They view decision making as a process of noisy accumulation of evidence from a stimulus. The standard model assumes that evidence accumulates at a constant rate during the second or two it takes to make a decision. This process can be linked to the behaviors of populations of neurons and to theories of optimality. Diffusion models have been used successfully in a range of cognitive tasks and as psychometric tools in clinical research to examine individual differences. In this review, we relate the models to both earlier and more recent research in psychology. Copyright © 2016. Published by Elsevier Ltd.
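The noisy evidence-accumulation process described above can be sketched as a standard two-boundary drift-diffusion simulation; the drift, bound, and noise values below are generic illustrative parameters, not fits to any dataset.

```python
import numpy as np

def simulate_ddm(drift=0.3, bound=1.0, noise=1.0, dt=0.001, max_t=5.0, n_trials=2000, seed=0):
    """Two-boundary drift-diffusion: noisy evidence accumulates until it hits +bound or -bound."""
    rng = np.random.default_rng(seed)
    n_steps = int(max_t / dt)
    x = np.zeros(n_trials)                      # accumulated evidence per trial
    rt = np.full(n_trials, np.nan)              # response times (s)
    choice = np.full(n_trials, -1)              # 1 = upper bound, 0 = lower bound, -1 = no decision
    active = np.ones(n_trials, dtype=bool)
    for step in range(1, n_steps + 1):
        x[active] += drift * dt + noise * np.sqrt(dt) * rng.standard_normal(active.sum())
        hit_up = active & (x >= bound)
        hit_lo = active & (x <= -bound)
        rt[hit_up | hit_lo] = step * dt
        choice[hit_up] = 1
        choice[hit_lo] = 0
        active &= ~(hit_up | hit_lo)
        if not active.any():
            break
    return choice, rt

choice, rt = simulate_ddm()
done = choice >= 0
print(f"P(upper) = {(choice[done] == 1).mean():.2f}, mean RT = {np.nanmean(rt[done]):.3f} s")
```

Higher drift rates shift choices toward the corresponding boundary and shorten response times, which is the basic behavioral signature the model is fit to.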
Processing and properties of Titanium alloy based materials with tailored porosity and composition
NASA Astrophysics Data System (ADS)
Cabezas-Villa, Jose Luis; Olmos, Luis; Lemus-Ruiz, Jose; Bouvard, Didier; Chavez, Jorge; Jimenez, Omar; Manuel Solorio, Victor
2017-06-01
This paper deals with powder processing of Ti6Al4V titanium alloy based materials with tailored porosity and composition. Ti6Al4V powder was mixed either with salt particles acting as space holder, so as to provide two-scale porosity, or with hard TiN particles that significantly modified the microstructure of the material and increased its hardness. Finally an original three-layer component was produced. Sample microstructure was observed by SEM and micro-tomography with special interest in pore size and shape, inclusion distribution and connectivity. Compression tests provided elastic modulus and yield stress as functions of density. These materials are representative of bone implants subjected to complex biological and mechanical conditions. These results thus open avenues for processing personalized implants by powder metallurgy.
Total recovery of the waste of two-phase olive oil processing: isolation of added-value compounds.
Fernández-Bolaños, Juan; Rodríguez, Guillermo; Gómez, Esther; Guillén, Rafael; Jiménez, Ana; Heredia, Antonia; Rodríguez, Rocío
2004-09-22
A process for adding value to the solid waste from two-phase olive oil extraction, or "alperujo", that includes a hydrothermal treatment has been proposed. In this treatment an autohydrolysis process occurs and the solid olive byproduct is partially solubilized. Besides the antioxidant hydroxytyrosol, several other compounds of high added value can be obtained from this water-soluble fraction. In this paper three different samples of alperujo were characterized and subjected to a hydrothermal treatment with and without an acid catalyst. The main soluble compounds after the hydrolysis were the monosaccharides xylose, arabinose, and glucose; oligosaccharides; mannitol; and products of sugar degradation. Oligosaccharides were separated by size exclusion chromatography. It was possible to obtain highly purified mannitol by applying a simple purification method.
Blome, C.D.; Reed, K.M.
1993-01-01
Destruction of radiolarians during both diagenesis and HF processing severely reduces faunal abundance and diversity and affects the taxonomic and biostratigraphic utility of chert residues. The robust forms that survive the processing represent only a small fraction of the death assemblage, and delicate skeletal structures used for species differentiation are either poorly preserved or dissolved in many coeval chert residues. First and last occurrences of taxa in chert sequences are likely to be coarse approximations of their true stratigraphic ranges. Precise correlation is difficult between biozonations based solely on index species from cherts and those constructed from limestone faunas. Careful selection of samples in sequence, use of weaker HF solutions, and study of both chert and limestone faunas should yield better biostratigraphic information. -from Authors
NASA Astrophysics Data System (ADS)
Ambrose, Jesse L.
2017-12-01
Atmospheric Hg measurements are commonly carried out using Tekran® Instruments Corporation's model 2537 Hg vapor analyzers, which employ gold amalgamation preconcentration sampling and detection by thermal desorption (TD) and atomic fluorescence spectrometry (AFS). A generally overlooked and poorly characterized source of analytical uncertainty in those measurements is the method by which the raw Hg atomic fluorescence (AF) signal is processed. Here I describe new software-based methods for processing the raw signal from the Tekran® 2537 instruments, and I evaluate the performances of those methods together with the standard Tekran® internal signal processing method. For test datasets from two Tekran® instruments (one 2537A and one 2537B), I estimate that signal processing uncertainties in Hg loadings determined with the Tekran® method are within ±[1% + 1.2 pg] and ±[6% + 0.21 pg], respectively. I demonstrate that the Tekran® method can produce significant low biases (≥5%) not only at low Hg sample loadings (<5 pg) but also at tropospheric background concentrations of gaseous elemental mercury (GEM) and total mercury (THg) (~1 to 2 ng m⁻³) under typical operating conditions (sample loadings of 5-10 pg). Signal processing uncertainties associated with the Tekran® method can therefore represent a significant unaccounted-for addition to the overall ~10 to 15% uncertainty previously estimated for Tekran®-based GEM and THg measurements. Signal processing bias can also add significantly to uncertainties in Tekran®-based gaseous oxidized mercury (GOM) and particle-bound mercury (PBM) measurements, which often derive from Hg sample loadings <5 pg. In comparison, estimated signal processing uncertainties associated with the new methods described herein are low, ranging from within ±0.053 pg, when the Hg thermal desorption peaks are defined manually, to within ±[2% + 0.080 pg] when peak definition is automated. Mercury limits of detection (LODs) decrease by 31 to 88% when the new methods are used in place of the Tekran® method. I recommend that signal processing uncertainties be quantified in future applications of the Tekran® 2537 instruments.
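One of the basic signal-processing choices the abstract refers to, defining a desorption peak and integrating it above a locally estimated baseline, can be sketched on a synthetic signal; this is an illustrative peak-area calculation, not the Tekran® internal algorithm or the paper's specific method.

```python
import numpy as np

# Synthetic raw AF signal: flat baseline plus a Gaussian desorption peak plus noise.
rng = np.random.default_rng(1)
dt = 0.1                                           # sample spacing, seconds
t = np.arange(0, 30.0, dt)
baseline = 0.05
peak = 2.0 * np.exp(-0.5 * ((t - 15.0) / 1.5) ** 2)
signal = baseline + peak + 0.01 * rng.standard_normal(t.size)

# Estimate the baseline from regions well outside the peak window,
# then integrate the baseline-corrected signal over the peak window.
baseline_est = np.concatenate([signal[t < 5.0], signal[t > 25.0]]).mean()
window = (t >= 10.0) & (t <= 20.0)
peak_area = np.sum(signal[window] - baseline_est) * dt

print(f"estimated baseline = {baseline_est:.3f}, peak area = {peak_area:.3f} (arb. units * s)")
```

How the baseline and the integration window are defined (manually versus automatically) is exactly the kind of choice that produces the small but systematic differences in reported Hg loadings discussed above.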
Sekula, W; Nelson, M; Figurska, K; Oltarzewski, M; Weisell, R; Szponar, L
2005-06-01
Household budget survey (HBS) data are used regularly for nutritional epidemiological purposes. The validity of HBS data, however, is not well established. The aim of this project was to compare HBS and individual nutrition survey (INS) data in a nationally representative sample of Polish households. Estimates of food consumption and nutrient intake were compared between household food acquisition data collected over 1 month and a single 24-hour recall collected from every household member in a nationally representative sample of Polish households surveyed between September and November 2000. To facilitate the comparison, INS food consumption data excluded food eaten away from home and were modified using a computer program to estimate food 'as purchased' (including disaggregation of recipe data) and to allow for wastage. The study setting was Poland. Participants were 3716 individuals in 1215 households (representing co-operation rates of 86.2% and 89.2%, respectively). Good agreement was shown between median estimates of foods such as potatoes, vegetables (including processed), meat, meat products and poultry, and animal fats (excluding butter), but agreement was poor for bread and rolls, fruit, vegetable fats and oils, eggs and six other food groups. Estimates of energy and nutrient intake were within +/-10% with the exceptions of polyunsaturated fats, potassium and vitamin C. Possible reasons for differences in findings between the two surveys include survey bias (e.g. social approval bias leading to overreporting of fruit), seasonal variations (e.g. high potato purchases between September and November) and aspects of the methodology (e.g. HBS data were based on records collected over 1 month, whereas 24-hour recall data were based on recalls collected from all household respondents on only 1 day and averaged for each household type). HBSs provide useful data for epidemiological research, but findings need to be interpreted in the light of other data regarding consumption, and numerous factors that may affect consumption need to be taken into account.
Recruitment for Occupational Research: Using Injured Workers as the Point of Entry into Workplaces
Koehoorn, Mieke; Trask, Catherine M.; Teschke, Kay
2013-01-01
Objective To investigate the feasibility, costs and sample representativeness of a recruitment method that used workers with back injuries as the point of entry into diverse working environments. Methods Workers' compensation claims were used to randomly sample workers from five heavy industries and to recruit their employers for ergonomic assessments of the injured worker and up to 2 co-workers. Results The final study sample included 54 workers from the workers’ compensation registry and 72 co-workers. This sample of 126 workers was based on an initial random sample of 822 workers with a compensation claim, or a ratio of 1 recruited worker to approximately 7 sampled workers. The average recruitment cost was CND$262/injured worker and CND$240/participating worksite including co-workers. The sample was representative of the heavy industry workforce, and was successful in recruiting the self-employed (8.2%), workers from small employers (<20 workers, 38.7%), and workers from diverse working environments (49 worksites, 29 worksite types, and 51 occupations). Conclusions The recruitment rate was low but the cost per participant reasonable and the sample representative of workers in small worksites. Small worksites represent a significant portion of the workforce but are typically underrepresented in occupational research despite having distinct working conditions, exposures and health risks worthy of investigation. PMID:23826387
Self-organizing maps for learning the edit costs in graph matching.
Neuhaus, Michel; Bunke, Horst
2005-06-01
Although graph matching and graph edit distance computation have become areas of intensive research recently, the automatic inference of the cost of edit operations has remained an open problem. In the present paper, we address the issue of learning graph edit distance cost functions for numerically labeled graphs from a corpus of sample graphs. We propose a system of self-organizing maps (SOMs) that represent the distance measuring spaces of node and edge labels. Our learning process is based on the concept of self-organization. It adapts the edit costs in such a way that the similarity of graphs from the same class is increased, whereas the similarity of graphs from different classes decreases. The learning procedure is demonstrated on two different applications involving line drawing graphs and graphs representing diatoms, respectively.
Determination of geostatistically representative sampling locations in Porsuk Dam Reservoir (Turkey)
NASA Astrophysics Data System (ADS)
Aksoy, A.; Yenilmez, F.; Duzgun, S.
2013-12-01
Several factors such as wind action, bathymetry and shape of a lake/reservoir, inflows, outflows, and point and diffuse pollution sources result in spatial and temporal variations in the water quality of lakes and reservoirs. The guides by the United Nations Environment Programme and the World Health Organization on designing and implementing water quality monitoring programs suggest that even a single monitoring station near the center or at the deepest part of a lake will be sufficient to observe long-term trends if there is good horizontal mixing. In stratified water bodies, several samples can be required. According to the guide for sampling and analysis under the Turkish Water Pollution Control Regulation, a minimum of five sampling locations should be employed to characterize the water quality in a reservoir or a lake. The European Union Water Framework Directive (2000/60/EC) requires selecting a sufficient number of monitoring sites to assess the magnitude and impact of point and diffuse sources and hydromorphological pressures when designing a monitoring program. Although existing regulations and guidelines include frameworks for the determination of sampling locations in surface waters, most of them do not specify a procedure for establishing monitoring aims with representative sampling locations in lakes and reservoirs. In this study, geostatistical tools are used to determine representative sampling locations in the Porsuk Dam Reservoir (PDR). Kernel density estimation and kriging were used in combination to select the representative sampling locations. Dissolved oxygen and specific conductivity were measured at 81 points, sixteen of which were used for validation. In selecting the representative sampling locations, care was taken to preserve the spatial structure of the distributions of the measured parameters, and a procedure was proposed for that purpose. Results indicated that the spatial structure was lost with fewer than 30 sampling points. This was a result of varying water quality in the reservoir due to inflows, point and diffuse inputs, and reservoir hydromorphology. Moreover, hot spots were determined based on kriging and standard error maps. Locations for the minimum number of sampling points that represent the actual spatial structure of the DO distribution in the Porsuk Dam Reservoir were proposed accordingly.
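A compact ordinary-kriging sketch of the kind of interpolation used to build such spatial-structure and error maps is shown below; the exponential variogram parameters, coordinates, and dissolved-oxygen values are all hypothetical, and the study's own variogram modeling would differ.

```python
import numpy as np

def exp_variogram(h, nugget=0.05, sill=1.0, range_m=800.0):
    """Exponential variogram model gamma(h)."""
    return nugget + sill * (1.0 - np.exp(-h / range_m))

def ordinary_kriging(xy, z, x0):
    """Ordinary kriging estimate and variance at x0 from observations (xy, z)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d)
    np.fill_diagonal(A[:n, :n], 0.0)     # gamma(0) = 0 at the data points themselves
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)            # kriging weights plus Lagrange multiplier
    estimate = w[:n] @ z
    variance = w @ b                     # sum(w_i * gamma_i0) + mu
    return estimate, variance

# Hypothetical dissolved-oxygen observations (x, y in metres; DO in mg/L).
xy = np.array([[0, 0], [500, 100], [900, 800], [200, 700], [650, 400]], dtype=float)
do = np.array([7.9, 7.4, 6.1, 6.8, 6.5])

est, var = ordinary_kriging(xy, do, np.array([400.0, 500.0]))
print(f"DO estimate = {est:.2f} mg/L, kriging variance = {var:.3f}")
```

Evaluating the kriging variance on a grid is what produces the standard error maps from which hot spots and redundant sampling points can be identified.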
Rare Earth Element and Trace Element Data Associated with Hydrothermal Spring Reservoir Rock, Idaho
Quillinan, Scott; Bagdonas, Davin
2017-06-22
These data represent rock samples collected in Idaho that correspond with naturally occurring hydrothermal samples that were collected and analyzed by INL (Idaho Falls, ID). Representative samples of type rocks were selected to best represent the various regions of Idaho in which naturally occurring hydrothermal waters occur. This includes the Snake River Plain (SRP), Basin and Range type structures east of the SRP, and large scale/deep seated orogenic uplift of the Sawtooth Mountains, ID. Analysis includes ICP-OES and ICP-MS methods for Major, Trace, and REE concentrations.
Women: The New Providers. Whirlpool Foundation Study, Part One.
ERIC Educational Resources Information Center
Families and Work Inst., New York, NY.
A study conducted interviews with a nationally representative sample of 1,502 women, as well as focus groups across the country, to develop a new portrait of women's views on family, work, society, and the future. On several key questions, their views were compared to a representative group of 460 U.S. men and representative samples of 1,005 women…
CIHR Candrive Cohort Comparison with Canadian Household Population Holding Valid Driver's Licenses.
Gagnon, Sylvain; Marshall, Shawn; Kadulina, Yara; Stinchcombe, Arne; Bédard, Michel; Gélinas, Isabelle; Man-Son-Hing, Malcolm; Mazer, Barbara; Naglie, Gary; Porter, Michelle M; Rapoport, Mark; Tuokko, Holly; Vrkljan, Brenda
2016-06-01
We investigated whether convenience sampling is a suitable method to generate a sample of older drivers representative of the older-Canadian driver population. Using equivalence testing, we compared a large convenience sample of older drivers (the Candrive II prospective cohort study) to a similarly aged population of older Canadian drivers. The Candrive sample consists of 928 community-dwelling older drivers from seven metropolitan areas of Canada. The population data were obtained from the Canadian Community Health Survey - Healthy Aging (CCHS-HA), which is a representative sample of older Canadians. The data for drivers aged 70 and older were extracted from the CCHS-HA database, for a total of 3,899 older Canadian drivers. The two samples were shown to be equivalent on the socio-demographic, health, and driving variables that we compared, but not on driving frequency. We conclude that the convenience sampling used in the Candrive study created a fairly representative sample of Canadian older drivers, with a few exceptions.
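Equivalence testing of the kind used above asks whether two group means lie within a pre-specified margin of each other, typically via two one-sided tests (TOST); the sketch below uses simulated data and an arbitrary margin, not the Candrive/CCHS-HA variables or the study's exact procedure.

```python
import numpy as np
from scipy import stats

def tost_equivalence(x, y, margin):
    """Two one-sided t-tests: are the group means within +/- margin of each other?"""
    nx, ny = len(x), len(y)
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / nx + np.var(y, ddof=1) / ny)
    df = nx + ny - 2                       # simple approximation of degrees of freedom
    t_lower = (diff + margin) / se         # tests H0: diff <= -margin
    t_upper = (diff - margin) / se         # tests H0: diff >= +margin
    p_lower = 1 - stats.t.cdf(t_lower, df)
    p_upper = stats.t.cdf(t_upper, df)
    return max(p_lower, p_upper)           # equivalence concluded if this p-value is small

rng = np.random.default_rng(0)
cohort = rng.normal(75.0, 6.0, 928)        # e.g., ages in a convenience cohort (hypothetical)
survey = rng.normal(75.3, 6.5, 3899)       # e.g., ages in a population survey (hypothetical)

p = tost_equivalence(cohort, survey, margin=1.0)
print(f"TOST p = {p:.4f} -> {'equivalent' if p < 0.05 else 'not shown equivalent'} within +/-1.0")
```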
Levy, Ifat; Lazzaro, Stephanie C.; Rutledge, Robb B.; Glimcher, Paul W.
2011-01-01
Decision-making is often viewed as a two-stage process, where subjective values are first assigned to each option and then the option of the highest value is selected. Converging evidence suggests that these subjective values are represented in the striatum and medial prefrontal cortex (MPFC). A separate line of evidence suggests that activation in the same areas represents the values of rewards even when choice is not required, as in classical conditioning tasks. However, it is unclear whether the same neural mechanism is engaged in both cases. To address this question we measured brain activation with fMRI while human subjects passively viewed individual consumer goods. We then sampled activation from predefined regions of interest and used it to predict subsequent choices between the same items made outside of the scanner. Our results show that activation in the striatum and MPFC in the absence of choice predicts subsequent choices, suggesting that these brain areas represent value in a similar manner whether or not choice is required. PMID:21209196
CREPT-MCNP code for efficiency calibration of HPGe detectors with the representative point method.
Saegusa, Jun
2008-01-01
The representative point method for the efficiency calibration of volume samples has been previously proposed. To facilitate implementation of the method, a calculation code named CREPT-MCNP has been developed. The code estimates the position of the representative point, which is intrinsic to each shape of volume sample. Self-absorption correction factors are also given to correct the efficiencies measured at the representative point with a standard point source. Features of the CREPT-MCNP code are presented.
Lack of evidence for microplastic contamination in honey.
Mühlschlegel, Peter; Hauk, Armin; Walter, Ulrich; Sieber, Robert
2017-11-01
Honey samples from Switzerland were investigated with regard to their microplastic particle burden. Five representative honey samples of different origin were processed following a standardized protocol to separate plastic-based microparticles from particles of natural origin, such as pollen, propolis, wax, and bee-related debris. The procedure was optimized to minimize post-sampling microplastic cross-contamination in the laboratory. The isolated microplastic particles were characterized and grouped by means of light microscopy as well as chemically characterized by microscopically coupled Raman and Fourier transform infrared spectroscopy. Five particle classes with an abundance significantly above blank levels were identified: black particles (particle count between 1760/kg and 8680/kg), white transparent fibres (particle count between 132/kg and 728/kg), white transparent particles (particle count between 60/kg and 172/kg), coloured fibres (particle count between 32/kg and 108/kg), and coloured particles (particle count between 8/kg and 64/kg). The black particles, which represented the majority of particles, were identified as char or soot and most probably originated from the use of smokers, a widespread practice in beekeeping. The majority of fibres were identified as cellulose or polyethylene terephthalate and were most likely of textile origin. In addition to these particle and fibre groups lower numbers of fragments were detected that were related to glass, polysaccharides or chitin, and few bluish particles contained copper phthalocyanine pigment. We found no indications that the honey samples were significantly contaminated with microplastic particles.
NASA Technical Reports Server (NTRS)
Bozeman, Robert E.
1987-01-01
An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
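The density model referenced above is, at its core, an exponential law; a minimal sketch of evaluating such a model and the resulting drag deceleration is given below, with all constants (reference density, scale height, spacecraft properties) purely illustrative and the oblateness and diurnal terms of the actual model omitted.

```python
import math

def exponential_density(h_km, rho0=3.6e-10, h0_km=400.0, scale_height_km=60.0):
    """Simple exponential atmosphere: rho(h) = rho0 * exp(-(h - h0) / H). Units: kg/m^3."""
    return rho0 * math.exp(-(h_km - h0_km) / scale_height_km)

def drag_acceleration(h_km, v_m_s, cd=2.2, area_m2=1.0, mass_kg=500.0):
    """Magnitude of drag deceleration: a = 0.5 * Cd * (A/m) * rho * v^2."""
    rho = exponential_density(h_km)
    return 0.5 * cd * (area_m2 / mass_kg) * rho * v_m_s ** 2

for h in (300.0, 400.0, 500.0):
    a = drag_acceleration(h, v_m_s=7700.0)
    print(f"h = {h:.0f} km: rho = {exponential_density(h):.2e} kg/m^3, |a_drag| = {a:.2e} m/s^2")
```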
Birge, Max; Duffy, Stephen; Miler, Joanna Astrid; Hajek, Peter
2017-11-04
The 'conversion rate' from initial experimentation to daily smoking is a potentially important metric of smoking behavior, but estimates of it based on current representative data are lacking. The Global Health Data Exchange was searched for representative surveys conducted in English speaking, developed countries after year 2000 that included questions about ever trying a cigarette and ever smoking daily. The initial search identified 2776 surveys that were further screened for language, location, year, sample size, survey structure and representativeness. 44 surveys that passed the screening process were accessed and their codebooks were examined to see whether the two questions of interest were included. Eight datasets allowed extraction or estimation of relevant information. Survey quality was assessed with regards to response rates, sampling methods and data collection procedures. PRISMA guidelines were followed, with explicit rules for approaching derived variables and skip patterns. Proportions were pooled using random effects meta-analysis. The eight surveys used representative samples of the general adult population. Response rates varied from 45% to 88%. Survey methods were on par with the best practice in this field. Altogether 216,314 respondents were included of whom 60.3% (95%CI 51.3-69.3) ever tried a cigarette. Among those, 68.9% (95% CI 60.9-76.9%) progressed to daily smoking. Over two thirds of people who try one cigarette become, at least temporarily, daily smokers. The finding provides strong support for the current efforts to reduce cigarette experimentation among adolescents. The transition from trying the first cigarette through occasional to daily smoking usually implies that a recreational activity is turning into a compulsive need that has to be satisfied virtually continuously. The 'conversion rate' from initial experimentation to daily smoking is thus a potentially important metric of smoking behavior, but estimates of it based on representative data are lacking. The present meta analysis addressed this gap. Currently, about two thirds of non-smokers experimenting with cigarettes progress to daily smoking. The finding supports strongly the current efforts to reduce cigarette experimentation among adolescents. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
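The pooling step described above combines per-survey proportions with a random-effects model; a DerSimonian-Laird sketch with hypothetical survey counts is shown below (the published analysis may differ in details such as transformation of the proportions).

```python
import math

def pool_proportions_dersimonian_laird(events, totals):
    """Random-effects pooled proportion with a DerSimonian-Laird tau^2 estimate."""
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]     # within-study variances
    w = [1 / vi for vi in v]
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)                 # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se)

# Hypothetical per-survey counts of daily smokers among ever-triers (not the actual data).
events = [1200, 8300, 950, 4100]
totals = [1700, 12500, 1400, 5800]
pooled, ci = pool_proportions_dersimonian_laird(events, totals)
print(f"pooled proportion = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```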
78 FR 44355 - Semiannual Regulatory Agenda
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-23
... central nervous system (``CNS'') depression, decreased heart rate, and depressed ventilation in children... Product Certification Regarding Representative Samples for Periodic Testing of Children's Products. 303... Certification Regarding Representative Samples for Periodic Testing of Children's Products Legal Authority: 15 U...
Method and apparatus for data sampling
Odell, Daniel M. C.
1994-01-01
A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
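The discrimination step described above, keeping only digitized samples that represent detected events, can be sketched as a simple threshold-and-group operation on a synthetic pulse train; this is an illustrative reading of the idea, not the patented apparatus or its actual discrimination logic.

```python
import numpy as np

rng = np.random.default_rng(2)
samples = rng.normal(0.0, 1.0, 2000)                  # digitized baseline noise
samples[500:520] += np.hanning(20) * 40               # two synthetic detector pulses
samples[1300:1315] += np.hanning(15) * 25

THRESHOLD = 5.0                                        # samples below this are not events

def find_events(x, thr):
    """Group consecutive above-threshold samples into events; return (start, length, area)."""
    above = x > thr
    events, i = [], 0
    while i < len(x):
        if above[i]:
            j = i
            while j < len(x) and above[j]:
                j += 1
            events.append((i, j - i, float(x[i:j].sum())))
            i = j
        else:
            i += 1
    return events

for start, length, area in find_events(samples, THRESHOLD):
    print(f"event at sample {start}: {length} samples above threshold, area = {area:.1f}")
```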
Trudeau, Michaela P; Verma, Harsha; Sampedro, Fernando; Urriola, Pedro E; Shurson, Gerald C; McKelvey, Jessica; Pillai, Suresh D; Goyal, Sagar M
2016-01-01
Infection with porcine epidemic diarrhea virus (PEDV) causes diarrhea, vomiting, and high mortality in suckling pigs. Contaminated feed has been suggested as a vehicle of transmission for PEDV. The objective of this study was to compare thermal and electron beam processing, and the inclusion of feed additives on the inactivation of PEDV in feed. Feed samples were spiked with PEDV and then heated to 120-145°C for up to 30 min or irradiated at 0-50 kGy. Another set of feed samples spiked with PEDV and mixed with Ultracid P (Nutriad), Activate DA (Novus International), KEM-GEST (Kemin Agrifood), Acid Booster (Agri-Nutrition), sugar or salt was incubated at room temperature (~25°C) for up to 21 days. At the end of incubation, the virus titers were determined by inoculation of Vero-81 cells and the virus inactivation kinetics were modeled using the Weibull distribution model. The Weibull kinetic parameter delta represented the time or eBeam dose required to reduce virus concentration by 1 log. For thermal processing, delta values ranged from 16.52 min at 120°C to 1.30 min at 145°C. For eBeam processing, a target dose of 50 kGy reduced PEDV concentration by 3 log. All additives tested were effective in reducing the survival of PEDV when compared with the control sample (delta = 17.23 days). Activate DA (0.81) and KEM-GEST (3.28) produced the fastest inactivation. In conclusion, heating swine feed at temperatures over 130°C or eBeam processing of feed with a dose over 50 kGy are effective processing steps to reduce PEDV survival. Additionally, the inclusion of selected additives can decrease PEDV survivability.
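The Weibull model underlying the delta values above predicts the log reduction as log10(N/N0) = -(t/delta)^p, where delta is the time (or dose) needed for the first 1-log reduction; the sketch below uses the reported thermal delta values with an assumed shape parameter p = 1 purely for illustration.

```python
def weibull_log_reduction(t, delta, p=1.0):
    """Weibull inactivation model: log10(N/N0) = -(t / delta) ** p."""
    return -((t / delta) ** p)

# Delta values (min) reported for thermal processing at two temperatures;
# the shape parameter p = 1 is an assumption for this illustration.
for temp_c, delta in [(120, 16.52), (145, 1.30)]:
    for t in (5.0, 10.0, 30.0):
        red = weibull_log_reduction(t, delta)
        print(f"{temp_c} C, t = {t:>4.1f} min: predicted log10 reduction = {red:.2f}")
```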
The woman's birth experience---the effect of interpersonal relationships and continuity of care.
Dahlberg, Unn; Aune, Ingvild
2013-04-01
The aim of the present study was to gain a deeper understanding of how relational continuity in the childbearing process may influence the woman's birth experience. RESEARCH DESIGN/SETTING: A Q-methodological approach was chosen, as it allows the researcher to systematically assess subjectivity. 23 women were invited to sort a sample of 48 statements regarding their subjective view of the birth experience after having participated in a pilot project in Norway, where six midwifery students provided continuity of care to 58 women throughout the childbearing process. The sorting patterns were subsequently factor-analysed using the statistical software 'PQ', which revealed one strong and one weaker factor. The consensus statements and the defining statements for the two factors were then interpreted. Both factors seemed to represent experiences of psychological trust and a feeling of teamwork with the midwifery student. Both factors indicated the importance of quality in the relation. Factor one represented experiences of presence and emotional support in the relationship; it also represented a feeling of personal growth for the women. Factor two was defined by experiences of predictability in the relation and in the process, as well as a feeling of interdependency in the relation. With regard to quality in the relation, women defining factor two felt that the content of the relation, not only its continuity, was important for the birth experience. Relational continuity is a key concept in the context of a positive birth experience. Quality in the relation gives the woman a possibility to experience positivity during the childbearing process. Continuity of care and personal growth related to birth promote empowerment for both the woman and her partner. Relational continuity gives midwives an opportunity to provide care in a more holistic manner. Copyright © 2012 Elsevier Ltd. All rights reserved.
Moonrise: Sampling the South Pole-Aitken Basin to Address Problems of Solar System Significance
NASA Technical Reports Server (NTRS)
Zeigler, R. A.; Jolliff, B. L.; Korotev, R. L.; Shearer, C. K.
2016-01-01
A mission to land in the giant South Pole-Aitken (SPA) Basin on the Moon's southern farside and return a sample to Earth for analysis is a high priority for Solar System Science. Such a sample would be used to determine the age of the SPA impact; the chronology of the basin, including the ages of basins and large impacts within SPA, with implications for early Solar System dynamics and the magmatic history of the Moon; the age and composition of volcanic rocks within SPA; the origin of the thorium signature of SPA with implications for the origin of exposed materials and thermal evolution of the Moon; and possibly the magnetization that forms a strong anomaly especially evident in the northern parts of the SPA basin. It is well known from studies of the Apollo regolith that rock fragments found in the regolith form a representative collection of many different rock types delivered to the site by the impact process (Fig. 1). Such samples are well documented to contain a broad suite of materials that reflect both the local major rock formations, as well as some exotic materials from far distant sources. Within the SPA basin, modeling of the impact ejection process indicates that regolith would be dominated by SPA substrate, formed at the time of the SPA basin-forming impact and for the most part moved around by subsequent impacts. Consistent with GRAIL data, the SPA impact likely formed a vast melt body tens of km thick that took perhaps several million years to cool, but that nonetheless represents barely an instant in geologic time that should be readily apparent through integrated geochronologic studies involving multiple chronometers. It is anticipated that a statistically significant number of age determinations would yield not only the age of SPA but also the age of several prominent nearby basins and large craters within SPA. This chronology would provide a contrast to the Imbrium-dominated chronology of the nearside Apollo samples and an independent test of the timing of the lunar cataclysm.
Kaye, Gordon; Weber, Peter; Evans, Ann; Venezia, Richard
1998-05-01
The efficacy of alkaline hydrolysis as an alternative for incineration or autoclaving during treatment and disposal of infectious waste was evaluated by testing for the destruction of samples of pure cultures of selected infectious microorganisms during digestion of 114 to 136-kg loads of animal carcasses in an animal tissue digestor at the Albany Medical College. Ten milliliter samples of pure cultures of each microorganism were divided among 3 dialysis bags made from narrow diameter dialysis tubing, and each of these bags was placed inside another dialysis bag made from larger diameter dialysis tubing. Each double-bagged sample was suspended from the cover of the carcass basket of the tissue digestor so that it was completely covered by hot alkaline digestion solution during the carcass digestion process. The following organisms were required by the New York State Department of Health as representative pathogens for testing sterilization capabilities of the procedure: Staphylococcus aureus, Mycobacterium fortuitum, Candida albicans, Bacillus subtilis, Pseudomonas aeruginosa, Aspergillus fumigatus, Mycobacterium bovis BCG, MS-2 bacteriophage, and Giardia muris. Animal carcasses included pigs, sheep, rabbits, dogs, rats, mice, and guinea pigs. The tissue digestor was operated at 110 to 120 C and approximately 15 lb/in2 (gauge) for 18 h before the system was allowed to cool to 50 C and dialysis bags were retrieved and submitted for microbial culture. None of the samples obtained from the dialysis bags after the digestion process yielded indicator bacteria or yeast. Giardia cysts were completely destroyed; only small fragments of what appeared to be cyst wall could be recognized with light microscopic examination. No plaque-forming units were detected with MS-2 bacteriophage after digestion. Samples of the hydrolyzate also did not yield growth on culture media. Animal carcasses were completely solubilized and digested, with only the inorganic components of the bones and teeth remaining after draining and rinsing of the digestion vessel. Alkaline hydrolysis, as performed in this tissue digestor, completely destroyed all representative classes of potentially infectious agents as well as disposing of animal carcasses by solubilization and digestion.
Thomas, Jennifer L; Donnelly, Christopher C; Lloyd, Erin W; Mothershead, Robert F; Miller, Mark L
2018-03-01
An improved cleanup method has been developed for the recovery of trace levels of 12 nitro-organic explosives in soil, which is important not only for the forensic community, but also has environmental implications. A wide variety of explosives or explosive-related compounds were evaluated, including nitramines, nitrate esters, nitroaromatics, and a nitroalkane. Fortified soil samples were extracted with acetone, processed via solid phase extraction (SPE), and then analyzed by gas chromatography with electron capture detection. The following three SPE sorbents in cartridge format were compared: Empore™ SDB-XC, Oasis ® HLB, and Bond Elut NEXUS cartridges. The NEXUS cartridges provided the best overall recoveries for the 12 explosives in potting soil (average 48%) and the fastest processing times (<30min). It also rejected matrix components from spent motor oil on potting soil. The SPE method was validated by assessing limit of detection (LOD), processed sample stability, and interferences. All 12 compounds were detectable at 0.02μg explosive/gram of soil or lower in the three matrices tested (potting soil, sand, and loam) over three days. Seven explosives were stable up to seven days at 2μg/g and three were stable at 0.2μg/g, both in processed loam, which was the most challenging matrix. In the interference study, five interferences above the determined LOD for soil were detected in matrices collected across the United States and in purchased all-purpose sand, potting soil, and loam. This represented a 3.2% false positive rate for the 13 matrices processed by the screening method for interferences. The reported SPE cleanup method provides a fast and simple extraction process for separating organic explosives from matrix components, facilitating sample throughput and reducing instrument maintenance. In addition, a comparison study of the validated SPE method versus conventional syringe filtration was completed and highlighted the benefits of sample cleanup for removing matrix interferences, while also providing lower supply cost, order of magnitude lower LODs for most explosives, higher percent recoveries for complex matrices, and fewer instrument maintenance issues. Published by Elsevier B.V.
Body Fluids as a Source of Diagnostic Biomarkers: Prostate — EDRN Public Portal
Recent advances in high-throughput protein expression profiling of bodily fluids have generated great enthusiasm and hope for this approach as a potent diagnostic tool. At the center of these efforts is the application of SELDI-TOF-MS and artificial intelligence algorithms by the EDRN BDL site at Eastern Virginia Medical School and the DMCC, respectively. When the expression profiling process was applied to sera from individuals with prostate cancer (N=197), BPH (N=92) or from otherwise healthy donors (N=97), we achieved an overall sensitivity of 90%. Since this represents a noticeable improvement over the current clinical approach, we are proposing to embark upon a validation process. The described studies are designed to address validation issues and include three phases. Phase 1: Synchronization of SELDI output within the EDRN-Prostate-SELDI Investigational Collaboration (EPSIC), addressing portability. (A) Synchronize SELDI instrumentation and robotic sample processing across the EPSIC using pooled serum (QC); (B) establish the portability and reproducibility of the SELDI protein profiling approach within the EPSIC using normal and prostate cancer patients' serum from a single site; (C) establish robustness of the approach toward geographic, sample collection and processing differences within EPSIC using case and control serum from five different sites. Phase 2: Population validation; establish geographic variability and robustness in a large cross-sectional study among different sample populations. Phase 3: Clinical validation; validate serum protein expression profiling coupled with a learning algorithm as a means for early detection of prostate cancer using longitudinal PCPT samples. We have assembled a cohesive multi-institutional team for completing these studies in a timely and efficient manner. The team consists of five EDRN laboratories, the DMCC and the CBI, and the proposed budget reflects the total involvement.
Utilization of microwave energy for decontamination of oil polluted soils.
Iordache, Daniela; Niculae, Dumitru; Francisc, Ioan Hathazi
2010-01-01
Pollution of soil by oil (petroleum) products represents a great environmental threat, as it may contaminate neighboring soils as well as surface and underground water. Liquid fuel contamination may occur anywhere during oil (petroleum) product transportation, storing, handling and utilization. Polluted soil recovery represents a complex process due to the wide range of physical, chemical and biological properties of soils, which should be analyzed in connection with the study of the contaminated soil behavior under the microwave field action. The soil, like any other non-metallic material, can be heated through microwave energy absorption due to dielectric losses, expressed by its complex dielectric constant. Oil-polluted soil behaves differently in a microwave field depending on the nature, structure and amount of the polluting fuel. Decontamination is performed through volatilization and retrieval of organic contaminant volatile components. After decontamination only a fixed soil residue remains, which cannot penetrate the underground anymore. In carrying out the soil recovery process by means of this technology we should also consider soil characteristics such as soil type, temperature, and moisture. The first part of the paper presents the theoretical aspects relating to the behavior of the polluted soil samples in the microwave field, as well as the related experimental data. The experimental data resulting from the analysis of soils with different levels of pollution point out that the degree of pollutant recovery is high, contributing to changing the initial classification of soils from the point of view of pollution. The paper graphically presents the levels of microwave power generated and absorbed in soil samples, soil temperature during the experiments, and specific processing parameters in a microwave field. It also presents the constructive solution of the microwave equipment designed for in situ treatment of the contaminated soil.
Family climates: family factors specific to disturbed eating and bulimia nervosa.
Laliberté, M; Boland, F J; Leichner, P
1999-09-01
More than a decade of research has characterized the families of individuals with bulimia and bulimia anorexia (Anorexia Nervosa, Binge/Purging Type) as less expressive, less cohesive, and experiencing more conflicts than normal control families. This two-part study investigated variables believed to be more directly related to disturbed eating and bulimia as contributing to a "family climate for eating disorders." In Study 1, a nonclinical sample of 324 women who had just left home for college and a sample of 121 mothers evaluated their families. Principal-components analyses revealed the same factor structure for both students and mothers, with Family Body Satisfaction, Family Social Appearance Orientation, and Family Achievement Emphasis loading together, representing the hypothesized family climate for eating disorders; the remaining variables loaded with the more traditional family process variables (conflict, cohesion, expressiveness), representing a more general family dysfunction. As predicted, the family climate for eating disorders factor score was a more powerful predictor of disturbed eating. Study 2 extended these findings to a clinical population, examining whether the family climate for eating disorders variables would distinguish individuals with bulimia from both depressed and healthy controls. Groups of eating-disordered patients (n = 40) and depressed (n = 17) and healthy (n = 27) controls completed family measures. The eating-disordered group scored significantly higher on family climate variables than the control groups. Family process variables distinguished clinical groups (depressed and eating-disordered) from healthy controls, but not from one another. Controlling for depression removed group differences on family process variables, but family climate variables continued to distinguish the eating-disordered group from both control groups. Indications for further research are discussed.
Photoacoustic sensor for medical diagnostics
NASA Astrophysics Data System (ADS)
Wolff, Marcus; Groninga, Hinrich G.; Harde, Hermann
2004-03-01
The development of new optical sensor technologies has a major impact on the progress of diagnostic methods. Of the steadily increasing number of non-invasive breath tests, the 13C-Urea Breath Test (UBT) for the detection of Helicobacter pylori is the most prominent. However, many recent developments, like the detection of cancer by breath test, go beyond gastroenterological applications. We present a new detection scheme for breath analysis that employs an especially compact and simple set-up. Photoacoustic Spectroscopy (PAS) represents an offset-free technique that allows for short absorption paths and small sample cells. Using a single-frequency diode laser and taking advantage of acoustic resonances of the sample cell, we performed extremely sensitive and selective measurements. The smart data processing method also contributes to the extraordinary sensitivity and selectivity. In addition, the reasonable acquisition cost and low operational cost make this detection scheme attractive for many biomedical applications. The experimental set-up and data processing method, together with exemplary isotope-selective measurements on carbon dioxide, are presented.
André, L; Lamy, E; Lutz, P; Pernier, M; Lespinard, O; Pauss, A; Ribeiro, T
2016-02-01
The electrical resistivity tomography (ERT) method is a non-intrusive method widely used in landfills to detect and locate liquid content. An experimental set-up was built on a dry batch anaerobic digestion reactor to investigate the distribution of liquid within the process and to map the spatial distribution of the inoculum. Two electrode arrays were used: pole-dipole and gradient arrays. A technical adaptation of the ERT method was necessary. Measured resistivity data were inverted and modeled with the RES2DINV software to obtain resistivity sections. Continuous calibration along the resistivity sections, involving sampling and physicochemical analysis, was necessary to interpret the data. Samples were analyzed for both biochemical methane potential and fiber content. Correlations were established between the reactor preparation protocol, resistivity values, liquid content, methane potential and fiber content, representing liquid distribution, high-methane-potential zones and degradation zones. The ERT method proved highly relevant for monitoring and optimizing the dry batch anaerobic digestion process. Copyright © 2015 Elsevier Ltd. All rights reserved.
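One step of the calibration described above is relating inverted resistivity at sampling points to measured liquid content. A minimal sketch of that correlation check is shown below; the values are synthetic, not the study's data.

```python
import numpy as np

# Minimal sketch (synthetic values, not the study's data): correlate inverted
# resistivity at sampling points with measured liquid content, as one step of
# calibrating an ERT section against physicochemical analyses.

resistivity_ohm_m = np.array([12.0, 15.5, 22.1, 30.4, 41.8, 55.0, 70.2, 88.5])
liquid_content_pct = np.array([78.0, 74.5, 68.0, 60.2, 52.1, 45.0, 38.4, 31.0])

# Wetter material conducts better, so resistivity and liquid content are
# expected to be negatively correlated.
r = np.corrcoef(np.log(resistivity_ohm_m), liquid_content_pct)[0, 1]
print(f"Pearson r (log-resistivity vs liquid content): {r:.2f}")
```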
Cellular and Molecular Changes in Orthodontic Tooth Movement
Zainal Ariffin, Shahrul Hisham; Yamamoto, Zulham; Zainol Abidin, Intan Zarina; Megat Abdul Wahab, Rohaya; Zainal Ariffin, Zaidah
2011-01-01
Tooth movement induced by orthodontic treatment can cause sequential reactions involving the periodontal tissue and alveolar bone, resulting in the release of numerous substances from the dental tissues and surrounding structures. To better understand the biological processes involved in orthodontic treatment, improve treatment, and reduce adverse side effects, several of these substances have been proposed as biomarkers. Potential biological markers can be collected from different tissue samples, and suitable sampling is important to accurately reflect biological processes. This paper covers the tissue changes involved during orthodontic tooth movement, such as those in the compression region (involving osteoclasts), the tension region (involving osteoblasts), the dental root, and the pulp tissues. In addition, the involvement of stem cells and their differentiation into osteoblasts and osteoclasts during orthodontic treatment is also explained. Several possible biomarkers representing these biological changes during specific phenomena, that is, bone remodelling (formation and resorption), inflammation, and root resorption, have also been proposed. Knowledge of these biomarkers could be used to accelerate orthodontic treatment. PMID:22125437
Heuristic-driven graph wavelet modeling of complex terrain
NASA Astrophysics Data System (ADS)
Cioacǎ, Teodor; Dumitrescu, Bogdan; Stupariu, Mihai-Sorin; Pǎtru-Stupariu, Ileana; Nǎpǎrus, Magdalena; Stoicescu, Ioana; Peringer, Alexander; Buttler, Alexandre; Golay, François
2015-03-01
We present a novel method for building a multi-resolution representation of large digital surface models. The surface points coincide with the nodes of a planar graph, which can be processed using a critically sampled, invertible lifting scheme. To drive the lazy wavelet node partitioning, we employ an attribute-aware cost function based on the generalized quadric error metric. The resulting algorithm can be applied to multivariate data by storing additional attributes at the graph's nodes. We discuss how the cost computation mechanism can be coupled with the lifting scheme and examine the results by evaluating the root mean square error. The algorithm is experimentally tested using two multivariate LiDAR data sets representing terrain surface and vegetation structure with different sampling densities.
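The graph-based scheme itself is involved, but the underlying ingredients (a critically sampled, invertible lifting scheme and an RMSE check of a coarser level) can be sketched in one dimension. The sketch below is a generic illustration on a synthetic profile, not the authors' algorithm.

```python
import numpy as np

# Minimal 1-D illustration of a critically sampled, invertible lifting scheme
# (lazy split -> predict -> update), not the paper's graph-based algorithm.
# RMSE is evaluated after zeroing the detail coefficients, mimicking a coarser
# level of a multi-resolution terrain model. Periodic boundaries via np.roll.

def lift_forward(x):
    even, odd = x[0::2].astype(float), x[1::2].astype(float)
    # Predict odd samples from neighbouring even samples; details = residuals.
    detail = odd - 0.5 * (even + np.roll(even, -1))
    # Update even samples so the coarse signal preserves the local mean.
    coarse = even + 0.25 * (detail + np.roll(detail, 1))
    return coarse, detail

def lift_inverse(coarse, detail):
    even = coarse - 0.25 * (detail + np.roll(detail, 1))
    odd = detail + 0.5 * (even + np.roll(even, -1))
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

# Synthetic "terrain profile" with an even number of samples.
t = np.linspace(0, 4 * np.pi, 256)
profile = 100 + 10 * np.sin(t) + np.random.default_rng(1).normal(scale=0.5, size=t.size)

coarse, detail = lift_forward(profile)
assert np.allclose(lift_inverse(coarse, detail), profile)   # perfect reconstruction

approx = lift_inverse(coarse, np.zeros_like(detail))        # drop the details
rmse = np.sqrt(np.mean((approx - profile) ** 2))
print(f"RMSE after discarding one detail level: {rmse:.3f}")
```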
Recommendations for clinical biomarker specimen preservation and stability assessments.
Dakappagari, Naveen; Zhang, Hui; Stephen, Laurie; Amaravadi, Lakshmi; Khan, Masood U
2017-04-01
With the wide use of biomarkers to enable critical drug-development decisions, there is growing concern in the scientific community about the need for a 'standardized process' to ensure biomarker specimen stability and, hence, a strong desire to share best practices for preserving the integrity of biomarker specimens in clinical trials and for designing studies to evaluate analyte stability. By leveraging representative industry experience, we have attempted to provide an overview of critical aspects of biomarker specimen stability commonly encountered during clinical development, including planning of clinical sample collection procedures, clinical site training, selection of sample preservation buffers, shipping logistics, fit-for-purpose stability assessments in the analytical laboratory, and presentation of case studies covering widely utilized biomarker specimen types.
NASA Astrophysics Data System (ADS)
Ivanova, T. M.; Serebryany, V. N.
2017-12-01
The component fit method in quantitative texture analysis assumes that the texture of a polycrystalline sample can be represented by a superposition of weighted standard distributions that are characterized by their position in orientation space and by the shape and sharpness of their scattering. Components of peak and axial shape are usually used. It is known that an axial texture develops in materials subjected to direct pressing. In this paper we considered the possibility of modelling the texture of a magnesium sample subjected to equal-channel angular pressing (ECAP) with axial components only. The results obtained make it possible to conclude that ECAP is also a process leading to the appearance of an axial texture in magnesium alloys.
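A minimal one-dimensional sketch of the component-fit idea is given below: a measured orientation-distribution profile is modelled as a weighted superposition of Gaussian-shaped components, each defined by a position, a weight and a spread, and the weights and spreads are fitted by least squares. The data are synthetic and the fixed component positions are assumptions; this is not the authors' fitting code.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal 1-D sketch of the component-fit idea (synthetic data): represent a
# measured orientation-distribution profile as a weighted superposition of
# Gaussian-shaped components, each with a position, a weight and a spread
# ("sharpness of the scattering").

def component(angle_deg, position, weight, spread):
    return weight * np.exp(-0.5 * ((angle_deg - position) / spread) ** 2)

def model(angle_deg, w1, s1, w2, s2):
    # Two axial-type components with fixed, assumed positions (0 and 45 deg).
    return component(angle_deg, 0.0, w1, s1) + component(angle_deg, 45.0, w2, s2)

angles = np.linspace(-90, 90, 181)
truth = model(angles, 3.0, 8.0, 1.5, 12.0)
measured = truth + np.random.default_rng(2).normal(scale=0.05, size=angles.size)

popt, _ = curve_fit(model, angles, measured, p0=[1.0, 10.0, 1.0, 10.0])
print("fitted (w1, s1, w2, s2):", np.round(popt, 2))
```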
NASA Astrophysics Data System (ADS)
Bousquet, B.; Travaillé, G.; Ismaël, A.; Canioni, L.; Michel-Le Pierrès, K.; Brasseur, E.; Roy, S.; le Hecho, I.; Larregieu, M.; Tellier, S.; Potin-Gautier, M.; Boriachon, T.; Wazen, P.; Diard, A.; Belbèze, S.
2008-10-01
Principal Components Analysis (PCA) is successfully applied to full laser-induced breakdown spectroscopy (LIBS) spectra of soil samples, defining classes according to the concentrations of the major elements. The large variability of the LIBS data is related to the heterogeneity of the samples, and the representativeness of the data is discussed. The development of a mobile LIBS system dedicated to the in-situ analysis of soils polluted by heavy metals is then described. Based on the use of ten-meter-long optical fibers, the mobile system allows remote measurements. Finally, a laser-assisted drying process, studied with a customized laser, was not retained as a way to overcome the problem of moisture.
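The class-definition step described above can be sketched generically: apply PCA to a spectral matrix and inspect how the first components separate groups and how much variance they carry. The spectra below are synthetic and the line positions are arbitrary; this is not the paper's data or code.

```python
import numpy as np
from sklearn.decomposition import PCA

# Minimal sketch (synthetic spectra, not the paper's data): apply PCA to full
# LIBS-like spectra and inspect the variance carried by the first components,
# which underpins class separation and the assessment of sample heterogeneity.

rng = np.random.default_rng(3)
n_channels = 500
channel = np.arange(n_channels)

def fake_spectrum(line_positions, line_strengths):
    s = np.zeros(n_channels)
    for pos, amp in zip(line_positions, line_strengths):
        s += amp * np.exp(-0.5 * ((channel - pos) / 2.0) ** 2)
    return s + rng.normal(scale=0.02, size=n_channels)

# Two "soil classes" differing in the relative strength of two emission lines.
class_a = np.array([fake_spectrum([100, 300], [1.0, 0.3]) for _ in range(20)])
class_b = np.array([fake_spectrum([100, 300], [0.3, 1.0]) for _ in range(20)])
spectra = np.vstack([class_a, class_b])

pca = PCA(n_components=3).fit(spectra)
scores = pca.transform(spectra)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("PC1 scores, class A:", np.round([scores[:20, 0].min(), scores[:20, 0].max()], 2))
print("PC1 scores, class B:", np.round([scores[20:, 0].min(), scores[20:, 0].max()], 2))
```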
The fate of a toxigenic strain of Staphylococcus aureus in vacuum-packaged bacon.
Dempster, J F; Kelly, W R
1973-09-01
Pork was cured by (a) the Wiltshire method and (b) a hygienic sweet cure process. Representative samples of both bacons were inoculated at 'low' density (10^3 organisms/g) and 'high' density (10^6 organisms/g) with a toxin-producing strain of Staphylococcus aureus. 'High' and 'low' density samples of both bacons were each stored at 5 degrees C for 42 days and at 15 degrees C for 21 days. Results indicated that the test organism at high inoculum density grew slowly in both bacons at 5 degrees C. The organism survived at 5 degrees C in both 'low density' bacons. At 15 degrees C the test organism grew, growth being more pronounced in the 'hygienic' than in the Wiltshire bacon.
Method and apparatus for improved observation of in-situ combustion processes
Lee, D.O.; Montoya, P.C.; Wayland, J.R. Jr.
Method and apparatus are provided for obtaining accurate dynamic measurements for passage of phase fronts through a core sample in a test fixture. Flow-through grid structures are provided for electrodes to permit data to be obtained before, during and after passage of a front there-through. Such electrodes are incorporated in a test apparatus for obtaining electrical characteristics of the core sample. With the inventive structure a method is provided for measurement of instabilities in a phase front progressing through the medium. Availability of accurate dynamic data representing parameters descriptive of material characteristics before, during and after passage of a front provides a more efficient method for enhanced recovery of oil using a fire flood technique. 6 figures, 2 tables.
NASA Technical Reports Server (NTRS)
1981-01-01
Science Applications, Inc.'s ATP Photometer makes a rapid and accurate count of the bacteria in a body fluid sample. The instrument provides information on the presence and quantity of bacteria by measuring the amount of light emitted by the reaction between two substances, ATP (adenosine triphosphate) and luciferase. The reactants are applied to a human body fluid sample, and the ATP Photometer measures the intensity of the emitted light, displaying its findings as a numerical output. The total time lapse is usually less than 10 minutes, which represents a significant time saving compared with other techniques. Other applications include measuring organisms in fresh and ocean waters, determining bacterial contamination of foodstuffs, biological process control in the beverage industry, and assay of activated sewage sludge.
Evaporation process in histological tissue sections for neutron autoradiography.
Espector, Natalia M; Portu, Agustina; Santa Cruz, Gustavo A; Saint Martin, Gisela
2018-05-01
The analysis of the distribution and density of nuclear tracks forming an autoradiography in a nuclear track detector (NTD) allows the determination of the concentration and location of 10B (boron-10) atoms in tissue samples from Boron Neutron Capture Therapy (BNCT) protocols. This knowledge is of great importance for BNCT dosimetry and treatment planning. Tissue sections studied with this technique are obtained by cryosectioning frozen tissue specimens. After the slicing procedure, the tissue section is placed on the NTD and the sample starts drying. As the section dries, its thickness decreases from the original value, allowing more particles to reach the detector, and, as the mass of the sample decreases, the boron concentration in the sample increases. Therefore, corrective coefficients must be applied to determine the concentration present in the hydrated tissue. Evaporation mechanisms, as well as various factors that could affect the process of mass variation, are outlined in this work. The mass evolution of tissue samples from BDIX rats was recorded with a semimicro analytical balance, and the measurements were analyzed with software developed for that purpose. Ambient conditions were recorded simultaneously, yielding reproducible evaporation curves. Mathematical models found in the literature were applied for the first time to this type of sample and the best fit to the experimental data was determined. The correlation coefficients and the variability of the parameters were evaluated, pointing to Page's model as the one that best represented the evaporation curves. These studies will contribute to a more precise assessment of boron concentration in tissue samples by the neutron autoradiography technique.
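Page's thin-layer drying model expresses the moisture ratio as MR(t) = exp(-k * t^n). The sketch below fits k and n to a synthetic mass-loss record; the data are illustrative, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch (synthetic drying curve, not the paper's measurements):
# fit Page's thin-layer drying model, MR(t) = exp(-k * t**n), to a mass-loss
# record and report the fitted constants and the coefficient of determination.

def page_model(t_min, k, n):
    return np.exp(-k * t_min ** n)

t = np.linspace(1, 120, 40)                         # minutes
mr_true = page_model(t, 0.02, 1.3)                  # assumed "true" behaviour
mr_obs = mr_true + np.random.default_rng(4).normal(scale=0.01, size=t.size)

(k_fit, n_fit), _ = curve_fit(page_model, t, mr_obs, p0=[0.01, 1.0])

residuals = mr_obs - page_model(t, k_fit, n_fit)
r2 = 1 - np.sum(residuals ** 2) / np.sum((mr_obs - mr_obs.mean()) ** 2)
print(f"k = {k_fit:.4f} 1/min^n, n = {n_fit:.2f}, R^2 = {r2:.4f}")
```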
Haenssgen, Marco J
2015-01-01
The increasing availability of online maps, satellite imagery, and digital technology can ease common constraints of survey sampling in low- and middle-income countries. However, existing approaches require specialised software and user skills, professional GPS equipment, and/or commercial data sources; they tend to neglect spatial sampling considerations when using satellite maps; and they continue to face implementation challenges analogous to conventional survey implementation methods. This paper presents an alternative way of utilising satellite maps and digital aids that aims to address these challenges. Case studies of two rural household surveys in Rajasthan (India) and Gansu (China) compare conventional survey sampling and implementation techniques with the use of online map services such as Google, Bing, and HERE maps. Modern yet basic digital technology can be integrated into the processes of preparing, implementing, and monitoring a rural household survey. Satellite-aided systematic random sampling enhanced the spatial representativeness of the village samples and entailed savings of approximately £4000 compared to conventional household listing, while reducing the duration of the main survey by at least 25%. This low-cost/low-tech satellite-aided survey sampling approach can be useful for student researchers and resource-constrained research projects operating in low- and middle-income contexts with high survey implementation costs. While achieving transparent and efficient survey implementation at low cost, researchers aiming to adopt a similar process should be aware of the locational, technical, and logistical requirements as well as the methodological challenges of this strategy.
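The core of systematic random sampling is a random start followed by a fixed interval over an ordered frame, which spreads the sample across the settlement. A minimal sketch over a hypothetical list of dwellings identified on a satellite map is shown below; the dwelling identifiers are illustrative.

```python
import random

# Minimal sketch (hypothetical dwelling list): systematic random sampling of
# dwellings identified on a satellite map - a random start followed by a fixed
# sampling interval, which spreads the sample across the settlement.

def systematic_sample(dwelling_ids, n_sample, seed=None):
    rng = random.Random(seed)
    interval = len(dwelling_ids) / n_sample
    start = rng.uniform(0, interval)
    return [dwelling_ids[int(start + i * interval)] for i in range(n_sample)]

# Dwellings numbered in the order they appear along a sweep of the village map.
village = [f"dwelling_{i:03d}" for i in range(1, 241)]
print(systematic_sample(village, n_sample=12, seed=42))
```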
Gaspard, Philippe G; Schwartzbrod, Janine
2003-03-01
The use of sludge in agriculture must be carried out according to many guidelines, especially regarding precise knowledge of the pathogenic microorganisms it contains. Controlling the produced sludge requires a sampling strategy that is representative of the contamination present in the sludge. We therefore evaluated the distribution of helminth eggs in sludge to determine how to sample and at what frequency. Two plants were studied: first, sludge undergoing biological treatment (anaerobic digestion, prolonged aeration); second, the dewatering step (centrifugation and filter press). Helminth egg concentrations were measured over short periods (between 5 minutes and 7 hours) and over periods of more than 24 hours (7 to 28 days). The results showed considerable homogeneity over periods of less than 7 hours, so grab sampling was advisable. An appropriate sample weight was 30 g of dry matter, because this allowed analysis in triplicate when testing treatment processes against the French standard (less than 3 viable eggs/10 g dry matter). Determination of the egg concentration in the plants over periods of more than 24 hours showed that the parasite flow was stable. In some cases, large variations were due to the treatment processes (storage or thickening, mixing of different sludges). These results were confirmed by a study of 6 other plants over a one-year period. Thus, the recommended sampling frequency can be limited to every 3 to 6 months by adapting the sampling methods to the characteristics of the plant.
Methodology for Augmenting Existing Paths with Additional Parallel Transects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, John E.
2013-09-30
Visual Sample Plan (VSP) is sample planning software that is used, among other purposes, to plan transect sampling paths to detect areas that were potentially used for munition training. This module was developed for application on a large site where existing roads and trails were to be used as primary sampling paths. Gap areas between these primary paths needed to be found and covered with parallel transect paths. These gap areas represent areas of the site that are more than a specified distance from a primary path. The added parallel paths needed to optionally be connected together into a single path (the shortest path possible). The paths also needed to optionally be attached to existing primary paths, again with the shortest possible path. Finally, the process must be repeatable and predictable so that the same inputs (primary paths, specified distance, and path options) result in the same set of new paths every time. This methodology was developed to meet those specifications.
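The gap-detection step can be illustrated generically: flag the stretches of a candidate parallel transect that lie farther than the specified distance from every point of the existing primary paths. The sketch below is not VSP's implementation; the coordinates and distances are illustrative.

```python
import numpy as np

# Minimal, generic sketch (not VSP's implementation): find the stretches of a
# candidate parallel transect that lie farther than a specified distance from
# every point of the existing primary paths, and report them as "gap" segments
# needing added coverage.

def gap_segments(candidate_xy, primary_xy, max_dist):
    # Distance from each candidate point to the nearest primary-path point.
    d = np.sqrt(((candidate_xy[:, None, :] - primary_xy[None, :, :]) ** 2).sum(-1)).min(1)
    uncovered = d > max_dist
    # Group consecutive uncovered points into (start index, end index) segments.
    segments, start = [], None
    for i, flag in enumerate(uncovered):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            segments.append((start, i - 1))
            start = None
    if start is not None:
        segments.append((start, len(uncovered) - 1))
    return segments

# Primary path: points along an existing road; candidate transect: a parallel
# line offset 80 m to the north, sampled every 10 m.
road = np.column_stack([np.arange(0, 1000, 10.0), np.zeros(100)])
candidate = np.column_stack([np.arange(0, 1000, 10.0), np.full(100, 80.0)])
road[40:70, 1] = -150.0   # pretend the road bends away between x = 400 and 700 m

print(gap_segments(candidate, road, max_dist=100.0))
```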
Cultural variations in global versus local processing: a developmental perspective.
Oishi, Shigehiro; Jaswal, Vikram K; Lillard, Angeline S; Mizokawa, Ai; Hitokoto, Hidefumi; Tsutsui, Yoshiro
2014-12-01
We conducted 3 studies to explore cultural differences in global versus local processing and their developmental trajectories. In Study 1 (N = 363), we found that Japanese college students were less globally oriented in their processing than American or Argentine participants. We replicated this effect in Study 2 (N = 1,843) using a nationally representative sample of Japanese and American adults ages 20 to 69, and found further that adults in both cultures became more globally oriented with age. In Study 3 (N = 133), we investigated the developmental course of the cultural difference using Japanese and American children, and found it was evident by 4 years of age. Cultural variations in global versus local processing emerge by early childhood, and remain throughout adulthood. At the same time, both Japanese and Americans become increasingly global processors with age. PsycINFO Database Record (c) 2014 APA, all rights reserved.