Almutairy, Meznah; Torng, Eric
2018-01-01
Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require large amounts of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets: the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling typically requires half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than for fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
PMID:29389989
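The two sampling schemes compared above can be sketched in a few lines. This is an illustrative toy (window size `w`, plain lexicographic ordering of k-mers), not the authors' implementation:

```python
def fixed_sample(seq, k, w):
    """Fixed sampling: index every w-th k-mer position of the database."""
    return {i: seq[i:i + k] for i in range(0, len(seq) - k + 1, w)}

def minimizer_sample(seq, k, w):
    """Minimizer sampling: in every window of w consecutive k-mers, keep the
    lexicographically smallest one. Because the same rule applies to a query,
    query k-mers can be sampled too -- the property the abstract cites."""
    picked = {}
    n_kmers = len(seq) - k + 1
    for start in range(n_kmers - w + 1):
        best = min(range(start, start + w), key=lambda i: seq[i:i + k])
        picked[best] = seq[best:best + k]
    return picked

seq = "ACGTACGTAC"
print(sorted(fixed_sample(seq, 3, 3)))      # [0, 3, 6]
print(sorted(minimizer_sample(seq, 3, 3)))  # [0, 1, 4, 5]
```

Fixed sampling keeps roughly n/w positions, while minimizer sampling keeps about 2n/(w+1) in expectation, consistent with the roughly-factor-of-two index-size gap reported above.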
Shashidhar, Ravindranath; Dhokane, Varsha S; Hajare, Sachin N; Sharma, Arun; Bandekar, Jayant R
2007-04-01
The microbiological quality of market samples of minimally processed (MP) pineapple was examined. The effectiveness of radiation treatment in eliminating Salmonella Typhimurium from laboratory-inoculated ready-to-eat pineapple slices was also studied. The microbiological quality of minimally processed pineapple samples from the Mumbai market was poor; 8.8% of the samples were positive for Salmonella. The D(10) value (the radiation dose required to reduce the bacterial population by 90%) for S. Typhimurium inoculated in pineapple was 0.242 kGy. Inoculated pack studies in minimally processed pineapple showed that treatment with a 2-kGy dose of gamma radiation could eliminate 5 log CFU/g of S. Typhimurium. The pathogen was not detected in radiation-processed samples for up to 12 d during storage at 4 and 10 degrees C. Processing of market samples with 1 and 2 kGy was effective in improving the microbiological quality of these products.
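The D(10) value quoted above implies the standard log-linear dose-response model; the arithmetic behind the study's numbers can be sketched as follows (function names are invented for illustration):

```python
def log_reduction(dose_kgy, d10_kgy):
    # Each multiple of D10 reduces the surviving population by one log10 (90%).
    return dose_kgy / d10_kgy

def surviving_log_cfu(initial_log_cfu, dose_kgy, d10_kgy):
    # Log10 CFU/g remaining after the dose (negative means below 1 CFU/g).
    return initial_log_cfu - log_reduction(dose_kgy, d10_kgy)

# The study's numbers: D10 = 0.242 kGy, a 2-kGy dose, a 5 log CFU/g inoculum.
print(round(log_reduction(2.0, 0.242), 2))     # 8.26 log reduction
print(surviving_log_cfu(5.0, 2.0, 0.242) < 0)  # True: inoculum eliminated
```

A 2-kGy dose delivers more than eight D(10) multiples, which is why the 5 log CFU/g inoculum was undetectable after treatment.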
Kovačević, Mira; Burazin, Jelena; Pavlović, Hrvoje; Kopjar, Mirela; Piližota, Vlasta
2013-04-01
Minimally processed and refrigerated vegetables can be contaminated with Listeria species, including Listeria monocytogenes, due to extensive handling during processing or by cross contamination from the processing environment. The objective of this study was to examine the microbiological quality of ready-to-eat minimally processed and refrigerated vegetables from supermarkets in Osijek, Croatia. A total of 100 samples of ready-to-eat vegetables collected from different supermarkets in Osijek, Croatia, were analyzed for the presence of Listeria species and Listeria monocytogenes. The collected samples were cut iceberg lettuces (24 samples), other leafy vegetables (11 samples), delicatessen salads (23 samples), cabbage salads (19 samples), and salads from mixed (17 samples) and root vegetables (6 samples). Listeria species were found in 20 samples (20 %) and Listeria monocytogenes was detected in only 1 sample (1 %) of cut red cabbage (less than 100 CFU/g). According to Croatian and EU microbiological criteria these results are satisfactory. However, the presence of Listeria species and Listeria monocytogenes indicates poor hygienic quality. The study showed that these products are often improperly labeled, since 24 % of the analyzed samples lacked information about shelf life, and 60 % of samples lacked information about storage conditions. In view of these facts, interruption of the cold chain combined with use after the expiration date is a probable scenario. Therefore, the microbiological risk for consumers of ready-to-eat minimally processed and refrigerated vegetables is not completely eliminated.
Minimally processed vegetable salads: microbial quality evaluation.
Fröder, Hans; Martins, Cecília Geraldes; De Souza, Katia Leani Oliveira; Landgraf, Mariza; Franco, Bernadette D G M; Destro, Maria Teresa
2007-05-01
The increasing demand for fresh fruits and vegetables and for convenience foods is causing an expansion of the market share for minimally processed vegetables. Among the more common pathogenic microorganisms that can be transmitted to humans by these products are Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella. The aim of this study was to evaluate the microbial quality of a selection of minimally processed vegetables. A total of 181 samples of minimally processed leafy salads were collected from retailers in the city of Sao Paulo, Brazil. Counts of total coliforms, fecal coliforms, Enterobacteriaceae, psychrotrophic microorganisms, and Salmonella were conducted for 133 samples. L. monocytogenes was assessed in 181 samples using the BAX System and by plating the enrichment broth onto Palcam and Oxford agars. Suspected Listeria colonies were submitted to classical biochemical tests. Populations of psychrotrophic microorganisms >10(6) CFU/g were found in 51% of the 133 samples, and Enterobacteriaceae populations between 10(5) and 10(6) CFU/g were found in 42% of the samples. Fecal coliform concentrations higher than 10(2) CFU/g (Brazilian standard) were found in 97 (73%) of the samples, and Salmonella was detected in 4 (3%) of the samples. Two of the Salmonella-positive samples had <10(2) CFU/g concentrations of fecal coliforms. L. monocytogenes was detected in only 1 (0.6%) of the 181 samples examined. This positive sample was simultaneously detected by both methods. The other Listeria species identified by plating were L. welshimeri (one sample of curly lettuce) and L. innocua (2 samples of watercress). The results indicate that minimally processed vegetables had poor microbiological quality, and these products could be a vehicle for pathogens such as Salmonella and L. monocytogenes.
Irradiation treatment of minimally processed carrots for ensuring microbiological safety
NASA Astrophysics Data System (ADS)
Ashraf Chaudry, Muhammad; Bibi, Nizakat; Khan, Misal; Khan, Maazullah; Badshah, Amal; Jamil Qureshi, Muhammad
2004-09-01
Minimally processed fruits and vegetables are very common in developed countries and are gaining popularity in developing countries due to their convenience and freshness. However, minimal processing may result in undesirable changes in colour, taste and appearance due to the transfer of microbes from the skin to the flesh. Irradiation is a well-known technology for elimination of microbial contamination. Food irradiation has been approved by 50 countries and is being applied commercially in the USA. The purpose of this study was to evaluate the effect of irradiation on the quality of minimally processed carrots. Fresh carrots were peeled, sliced and PE packaged. The samples were irradiated (0, 0.5, 1.0, 2.0, 2.5, 3.0 kGy) and stored at 5°C for 2 weeks. The samples were analyzed for hardness, organoleptic acceptance and microbial load at 0, 7 and 15 days. The mean firmness of the control and all irradiated samples remained between 4.31 and 4.42 kg of force, showing no adverse effect of radiation dose. The effect of storage (2 weeks) was significant (P<0.05), with values ranging between 4.28 and 4.39 kg of force. After 2 weeks of storage, the total bacterial counts at 5°C were 6.3×10⁵ CFU/g for non-irradiated samples, 3.0×10² CFU/g for 0.5 kGy irradiated samples, and only a few colonies (>10) in all other irradiated samples (1.0, 2.0, 2.5 and 3.0 kGy). No coliforms or E. coli were detected in any of the samples (irradiated or control) immediately after irradiation or during the entire storage period. A dose of 2.0 kGy completely controlled the fungal and bacterial counts. The irradiated samples (2.0 kGy) were also sensorially acceptable.
Why minimally invasive skin sampling techniques? A bright scientific future.
Wang, Christina Y; Maibach, Howard I
2011-03-01
There is increasing interest in minimally invasive skin sampling techniques to assay markers of molecular biology and biochemical processes. This overview examines methodology strengths and limitations, and exciting developments pending in the scientific community. Publications were searched via PubMed, the U.S. Patent and Trademark Office Website, the DermTech Website and the CuDerm Website. The keywords used were noninvasive skin sampling, skin stripping, skin taping, detergent method, ring method, mechanical scrub, reverse iontophoresis, glucose monitoring, buccal smear, hair root sampling, mRNA, DNA, RNA, and amino acid. There is strong interest in finding methods to access internal biochemical, molecular, and genetic processes through noninvasive and minimally invasive external means. Minimally invasive techniques include the widely used skin tape stripping, the abrasion method that includes scraping and detergent, and reverse iontophoresis. The first two methods largely harvest the stratum corneum. Hair root sampling (material deeper than the epidermis), buccal smear, shave biopsy, punch biopsy, and suction blistering are also methods used to obtain cellular material for analysis, but involve some degree of increased invasiveness and thus are only briefly mentioned. Existing and new sampling methods are being refined and validated, offering exciting, different noninvasive means of quickly and efficiently obtaining molecular material with which to monitor bodily functions and responses, assess drug levels, and follow disease processes without subjecting patients to unnecessary discomfort and risk.
Where do the Field Plots Belong? A Multiple-Constraint Sampling Design for the BigFoot Project
NASA Astrophysics Data System (ADS)
Kennedy, R. E.; Cohen, W. B.; Kirschbaum, A. A.; Gower, S. T.
2002-12-01
A key component of a MODIS validation project is effective characterization of biophysical measures on the ground. Fine-grain ecological field measurements must be placed strategically to capture variability at the scale of the MODIS imagery. Here we describe the BigFoot project's revised sampling scheme, designed to simultaneously meet three important goals: capture landscape variability, avoid spatial autocorrelation between field plots, and minimize time and expense of field sampling. A stochastic process places plots in clumped constellations to reduce field sampling costs, while minimizing spatial autocorrelation. This stochastic process is repeated, creating several hundred realizations of plot constellations. Each constellation is scored and ranked according to its ability to match landscape variability in several Landsat-based spectral indices, and its ability to minimize field sampling costs. We show how this approach has recently been used to place sample plots at the BigFoot project's two newest study areas, one in a desert system and one in a tundra system. We also contrast this sampling approach to that already used at the four prior BigFoot project sites.
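The generate-score-rank loop described above can be sketched as follows. Everything here (the clump geometry, the stand-in spectral index, the weights) is invented for illustration under the stated design goals, not the BigFoot implementation:

```python
import random
import statistics

def make_constellation(n_clumps, plots_per_clump, extent, radius, rng):
    """One realization: plots placed in clumped groups around random centres."""
    plots = []
    for _ in range(n_clumps):
        cx, cy = rng.uniform(0, extent), rng.uniform(0, extent)
        plots += [(cx + rng.uniform(-radius, radius),
                   cy + rng.uniform(-radius, radius))
                  for _ in range(plots_per_clump)]
    return plots

def score(plots, spectral_index, landscape_sd, cost_weight=1e-4):
    """Lower is better: mismatch to landscape variability plus a field-cost proxy."""
    values = [spectral_index(x, y) for x, y in plots]
    mismatch = abs(statistics.pstdev(values) - landscape_sd)
    # Crude travel-cost proxy: mean pairwise distance between plots.
    pairs = [(p, q) for i, p in enumerate(plots) for q in plots[i + 1:]]
    cost = sum(((px - qx) ** 2 + (py - qy) ** 2) ** 0.5
               for (px, py), (qx, qy) in pairs) / len(pairs)
    return mismatch + cost_weight * cost

rng = random.Random(0)
index = lambda x, y: (x + y) / 2000.0   # stand-in for a Landsat spectral index
landscape_sd = 0.20                     # stand-in landscape-wide variability
candidates = [make_constellation(4, 5, 1000.0, 50.0, rng) for _ in range(300)]
best = min(candidates, key=lambda c: score(c, index, landscape_sd))
```

Clumping plots around shared centres is what keeps the travel-cost term low, while the variability term pushes the clump centres apart across the landscape; ranking hundreds of random realizations trades the two off without an explicit optimizer.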
NASA Astrophysics Data System (ADS)
Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.
2009-07-01
The marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens. In combination with minimal processing it could improve the safety and quality of MPV. A microbiological screening method based on the use of the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique in detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, with increasing dose, DEFT counts remained similar regardless of irradiation while APC counts decreased gradually. The difference between the two counts gradually increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could be a criterion for judging whether an MPV sample has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
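The screening rule suggested above is a simple log-difference threshold. A minimal sketch (the 2.0 log cut-off comes from the abstract; the function names and example counts are invented):

```python
import math

def deft_apc_gap(deft_cfu_g, apc_cfu_g):
    # DEFT counts both viable and radiation-killed cells; APC counts survivors.
    return math.log10(deft_cfu_g) - math.log10(apc_cfu_g)

def likely_irradiated(deft_cfu_g, apc_cfu_g, threshold=2.0):
    # The suggested criterion: a DEFT/APC gap above ~2.0 log flags irradiation.
    return deft_apc_gap(deft_cfu_g, apc_cfu_g) > threshold

print(likely_irradiated(5e6, 2e4))  # True  (gap ~ 2.4 log)
print(likely_irradiated(5e6, 1e6))  # False (gap ~ 0.7 log)
```

The gap grows with dose precisely because irradiation lowers APC but leaves DEFT counts roughly unchanged, as the abstract reports.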
Brandão, Marcelo L L; Almeida, Davi O; Bispo, Fernanda C P; Bricio, Silvia M L; Marin, Victor A; Miagostovich, Marize P
2014-05-01
This study aimed to assess the microbiological contamination of lettuces commercialized in Rio de Janeiro, Brazil, by investigating the presence of norovirus genogroup II (NoV GII), Salmonella spp., and total and fecal coliforms, including Escherichia coli. For NoV detection, samples were processed using the adsorption-elution concentration method associated with real-time quantitative polymerase chain reaction (qPCR). A total of 90 samples of lettuce, including 30 whole fresh lettuces, 30 minimally processed (MP) lettuces, and 30 raw ready-to-eat (RTE) lettuce salads, were randomly collected from different supermarkets (fresh and MP lettuce samples), food services, and self-service restaurants (RTE lettuce salads), all located in Rio de Janeiro, Brazil, from October 2010 to December 2011. NoV GII was not detected, and the PP7 bacteriophage used as internal control process (ICP) was recovered in 40.0%, 86.7%, and 76.7% of those samples, respectively. Salmonella spp. was not detected, although fecal contamination was observed, with fecal coliform concentrations higher than 10(2) most probable number/g. E. coli was detected in 70.0%, 6.7%, and 30.0% of fresh, MP, and RTE samples, respectively. This study highlights the need to improve hygiene procedures at all stages of vegetable production and demonstrates the use of the PP7 bacteriophage as an ICP in methods for recovering RNA viruses from MP and RTE lettuce samples, encouraging the evaluation of new protocols that facilitate the establishment of methodologies for NoV detection in a greater number of food microbiology laboratories.
Jeddi, Maryam Zare; Yunesian, Masud; Gorji, Mohamad Es'haghi; Noori, Negin; Pourmand, Mohammad Reza
2014-01-01
The aim of this study was to evaluate the bacterial and fungal quality of minimally processed vegetables (MPV) and sprouts. A total of 116 samples of fresh-cut vegetables, ready-to-eat salads, and mung bean and wheat sprouts were randomly collected and analyzed. The load of aerobic mesophilic bacteria was lowest in the fresh-cut vegetables and highest in the fresh mung bean sprouts, corresponding to populations of 5.3 and 8.5 log CFU/g, respectively. E. coli O157:H7 was absent from all samples; however, other E. coli strains were detected in 21 samples (18.1%), and Salmonella spp. were found in one mung bean sample (3.1%) and one ready-to-eat salad sample (5%). Yeasts were the predominant organisms and were found in 100% of the samples. Geotrichum, Fusarium, and Penicillium spp. were the most prevalent molds in mung sprouts while Cladosporium and Penicillium spp. were most frequently found in ready-to-eat salad samples. According to the results of the present study, effective control measures should be implemented to minimize the microbiological contamination of fresh produce sold in Tehran, Iran. PMID:25395902
Hepburn, Sophie; Cairns, David A; Jackson, David; Craven, Rachel A; Riley, Beverley; Hutchinson, Michelle; Wood, Steven; Smith, Matthew Welberry; Thompson, Douglas; Banks, Rosamonde E
2015-06-01
We have examined the impact of sample processing time delay, temperature, and the addition of protease inhibitors (PIs) on the urinary proteome and peptidome, an important aspect of biomarker studies. Ten urine samples from patients with varying pathologies were each divided and PIs added to one-half, with aliquots of each then processed and frozen immediately, or after a delay of 6 h at 4°C or room temperature (20-22°C), effectively yielding 60 samples in total. Samples were then analyzed by 2D-PAGE, SELDI-TOF-MS, and immunoassay. Interindividual variability in profiles was the dominant feature in all analyses. Minimal changes were observed by 2D-PAGE as a result of delay in processing, temperature, or PIs and no changes were seen in IgG, albumin, β2 -microglobulin, or α1 -microglobulin measured by immunoassay. Analysis of peptides showed clustering of some samples by presence/absence of PIs but the extent was very patient-dependent with most samples showing minimal effects. The extent of processing-induced changes and the benefit of PI addition are patient- and sample-dependent. A consistent processing methodology is essential within a study to avoid any confounding of the results.
NASA Astrophysics Data System (ADS)
Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel
2008-10-01
Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.
Solberg, Siri Løvsjø; Terragni, Laura; Granheim, Sabrina Ionata
2016-08-01
To identify the use of ultra-processed foods - vectors of salt, sugar and fats - in the Norwegian diet through an assessment of food sales. Sales data from a representative sample of food retailers in Norway, collected in September 2005 (n 150) and September 2013 (n 170), were analysed. Data consisted of barcode scans of individual food item purchases, reporting type of food, price, geographical region and retail concept. Foods were categorized as minimally processed, culinary ingredients, processed products and ultra-processed. Indicators were share of purchases and share of expenditure on food categories. Six geographical regions in Norway. The barcode data included 296 121 observations in 2005 and 501 938 observations in 2013. Ultra-processed products represented 58·8 % of purchases and 48·8 % of expenditure in 2013. Minimally processed foods accounted for 17·2 % of purchases and 33·0 % of expenditure. Every third purchase was a sweet ultra-processed product. Food sales changed marginally in favour of minimally processed foods and in disfavour of processed products between 2005 and 2013 (χ²(3) = 203·195, P<0·001; Cramér's V = 0·017, P<0·001). Ultra-processed products accounted for the majority of food sales in Norway, indicating a high consumption of such products. This could be contributing to rising rates of overweight, obesity and non-communicable diseases in the country, as findings from other countries indicate. Policy measures should aim at decreasing consumption of ultra-processed products and facilitating access (including economic) to minimally processed foods.
NASA Astrophysics Data System (ADS)
Hussain, Peerzada R.; Omeera, A.; Suradkar, Prashant P.; Dar, Mohd A.
2014-10-01
Gamma irradiation alone and in combination with ascorbic acid was tested for preventing the surface browning and maintaining the quality attributes of minimally processed eggplant. Eggplant samples after preparation were subjected to gamma irradiation in the dose range of 0.25-1.0 kGy and to combination treatments of an ascorbic acid dip at a concentration of 2.0% w/v and gamma irradiation (dose range 0.5-2.0 kGy), followed by storage at 3±1 °C and 80% RH. The studies revealed an inverse correlation (r=-0.93) between polyphenol oxidase (PPO) activity and browning index on the one hand and the ascorbic acid and gamma irradiation treatments on the other. The combined treatment of 2.0% w/v ascorbic acid and 1.0 kGy gamma irradiation proved significantly (p≤0.05) effective in inhibiting PPO activity, preventing surface browning and maintaining the creamy white color and other quality attributes of minimally processed eggplant for up to 6 days of refrigerated storage. Sensory evaluation revealed that control and 0.25 kGy irradiated samples were unacceptable after only 3 days of storage. Samples irradiated at 0.5 kGy and 0.75 kGy were unacceptable after 6 days of storage. Microbial analysis revealed that radiation processing of minimally processed eggplant at 1.0 kGy, with and without ascorbic acid, resulted in around 1 and 1.5 log reductions in yeast and mold counts as well as bacterial counts just after treatment and after 6 days of storage, thereby enhancing microbial safety.
Minimal Polynomial Method for Estimating Parameters of Signals Received by an Antenna Array
NASA Astrophysics Data System (ADS)
Ermolaev, V. T.; Flaksman, A. G.; Elokhin, A. V.; Kuptsov, V. V.
2018-01-01
The effectiveness of the projection minimal polynomial method for solving the problem of determining the number of signal sources acting on an antenna array (AA) with an arbitrary configuration, and their angular directions, has been studied. The method estimates the degree of the minimal polynomial of the correlation matrix (CM) of the input process in the AA on the basis of a statistically validated root-mean-square criterion. Special attention is paid to the case of an ultra-short sample of the input process, when the number of samples is considerably smaller than the number of AA elements, which is important for multielement AAs. It is shown that the proposed method is more effective in this case than methods based on the AIC (Akaike's Information Criterion) or the minimum description length (MDL) criterion.
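The idea behind the minimal-polynomial degree can be illustrated on an exact correlation matrix: with white noise, the CM has one noise eigenvalue cluster plus one distinct eigenvalue per source, so the number of sources is one less than the degree of its minimal polynomial. The sketch below simply counts eigenvalues above the noise cluster; the array geometry, threshold, and scenario are invented, and this plain eigenvalue count is a stand-in for the paper's statistically validated root-mean-square criterion, which is designed for the short-sample case:

```python
import numpy as np

def steering_vector(m, theta_deg, spacing=0.5):
    """Uniform linear array response (element spacing in wavelengths)."""
    phase = 2j * np.pi * spacing * np.arange(m) * np.sin(np.radians(theta_deg))
    return np.exp(phase)

def estimate_num_sources(R, margin=1.05):
    """Count eigenvalues above the noise cluster; with white noise this equals
    deg(minimal polynomial of R) - 1, i.e. the number of sources."""
    eig = np.linalg.eigvalsh(R)
    return int(np.sum(eig > margin * eig.min()))

# Exact CM for an 8-element array, two unit-power sources, noise power 0.1.
m = 8
a1, a2 = steering_vector(m, 10.0), steering_vector(m, -20.0)
R = np.outer(a1, a1.conj()) + np.outer(a2, a2.conj()) + 0.1 * np.eye(m)
print(estimate_num_sources(R))  # 2
```

With a finite (and especially ultra-short) sample, the noise eigenvalues spread out and a fixed margin fails, which is exactly the regime the paper's criterion addresses.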
Nasreddine, Lara; Tamim, Hani; Itani, Leila; Nasrallah, Mona P; Isma'eel, Hussain; Nakhoul, Nancy F; Abou-Rizk, Joana; Naja, Farah
2018-01-01
To (i) estimate the consumption of minimally processed, processed and ultra-processed foods in a sample of Lebanese adults; (ii) explore patterns of intakes of these food groups; and (iii) investigate the association of the derived patterns with cardiometabolic risk. Cross-sectional survey. Data collection included dietary assessment using an FFQ and biochemical, anthropometric and blood pressure measurements. Food items were categorized into twenty-five groups based on the NOVA food classification. The contribution of each food group to total energy intake (TEI) was estimated. Patterns of intakes of these food groups were examined using exploratory factor analysis. Multivariate logistic regression analysis was used to evaluate the associations of derived patterns with cardiometabolic risk factors. Greater Beirut area, Lebanon. Adults ≥18 years (n 302) with no prior history of chronic diseases. Of TEI, 36·53 and 27·10 % were contributed by ultra-processed and minimally processed foods, respectively. Two dietary patterns were identified: the 'ultra-processed' and the 'minimally processed/processed'. The 'ultra-processed' consisted mainly of fast foods, snacks, meat, nuts, sweets and liquor, while the 'minimally processed/processed' consisted mostly of fruits, vegetables, legumes, breads, cheeses, sugar and fats. Participants in the highest quartile of the 'minimally processed/processed' pattern had significantly lower odds for metabolic syndrome (OR=0·18, 95 % CI 0·04, 0·77), hyperglycaemia (OR=0·25, 95 % CI 0·07, 0·98) and low HDL cholesterol (OR=0·17, 95 % CI 0·05, 0·60). The study findings may be used for the development of evidence-based interventions aimed at encouraging the consumption of minimally processed foods.
NASA Astrophysics Data System (ADS)
Martins, C. G.; Behrens, J. H.; Destro, M. T.; Franco, B. D. G. M.; Vizeu, D. M.; Hutzler, B.; Landgraf, M.
2004-09-01
Consumer attitudes towards foods have changed in the last two decades, increasing the demand for fresh-like products. Consequently, less extreme treatments and fewer additives are being required. Minimally processed foods have fresh-like characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7, and Shigella spp. Minimal processing does not reduce the levels of pathogenic microorganisms to safe levels. Therefore, this study was carried out in order to improve the microbiological safety and shelf life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2 and 1.5 kGy. Irradiated samples were diluted 1:10 in saline peptone water and plated onto tryptic soy agar, which was incubated at 37°C for 24 h. D10 values for Salmonella spp. inoculated in watercress varied from 0.29 to 0.43 kGy. Therefore, a dose of 1.7 kGy will reduce the Salmonella population in watercress by 4 log10. The shelf life was increased by 1.5 days when the product was exposed to 1 kGy.
Sanz, S; Olarte, C; Ayala, F; Echávarri, J F
2009-08-01
The effect of different types of lighting (white, green, red, and blue light) on minimally processed asparagus during storage at 4 °C was studied. The gas concentrations in the packages, pH, mesophilic counts, and weight loss were also determined. Lighting caused an increase in physiological activity. Asparagus stored under lighting developed atmospheres with higher CO2 and lower O2 content than samples kept in the dark. This increase in activity explains the greater deterioration experienced by samples stored under lighting, which clearly affected texture and especially color, accelerating the appearance of greenish hues in the tips and reddish-brown hues in the spears. Exposure to light had a negative effect on the quality parameters of the asparagus and caused a significant reduction in shelf life. Hence, the 11-day shelf life of samples kept in the dark was reduced to only 3 days in samples kept under red and green light, and to 7 days in those kept under white and blue light. However, quality indicators such as tip color and texture behaved significantly better under blue light than under white light, which allows us to state that it is better to use this type of light, or blue-tinted packaging film, for the display of minimally processed asparagus to consumers.
Lombardo, Sara; Restuccia, Cristina; Muratore, Giuseppe; Barbagallo, Riccardo N; Licciardello, Fabio; Pandino, Gaetano; Scifò, Giovanna O; Mazzaglia, Agata; Ragonese, Francesca; Mauromicale, Giovanni
2017-01-01
Although nitrogen (N) fertilisation is essential for promoting crop yield, it may also affect produce quality. Here, the influence of three N fertiliser rates (0 kg ha⁻¹ as a control, 200 kg ha⁻¹ and 400 kg ha⁻¹, referred to as N0, N200 and N400, respectively) on the overall quality of minimally processed globe artichoke heads was investigated during refrigerated storage for 12 days. Throughout the storage time, N-fertilised samples had higher inulin contents than unfertilised ones. In addition, the respiratory quotient of N200 and N400 samples was 2-fold and 2.5-fold lower than that of N0 samples, whose values were close to the normal range for vegetables. All the samples met good microbiological standards, although N200 and N400 had lower mesophilic and psychrotrophic counts than N0 throughout the storage time. After 8 and 12 days of refrigerated storage, the N200 samples showed the highest scores for positive sensory descriptors. A fertiliser level of 200 kg N ha⁻¹ is suitable for obtaining minimally processed globe artichoke heads with good nutritional, sensory and microbiological quality, characterised by low endogenous oxidase activities. Proper packaging systems and procedures are, however, crucial for extending the product shelf life and, thus, promoting its exportation on a wider scale. © 2016 Society of Chemical Industry.
Aparecida de Oliveira, Maria; Abeid Ribeiro, Eliana Guimarães; Morato Bergamini, Alzira Maria; Pereira De Martinis, Elaine Cristina
2010-02-01
Modern lifestyle has markedly changed eating habits worldwide, with an increasing demand for ready-to-eat foods such as minimally processed fruits and leafy greens. Packaging and storage conditions of those products may favor the growth of psychrotrophic bacteria, including the pathogen Listeria monocytogenes. In this work, minimally processed leafy vegetable samples (n = 162) from the retail market of Ribeirão Preto, São Paulo, Brazil, were tested for the presence or absence of Listeria spp. using the Listeria Rapid Test immunoassay (Oxoid). Two L. monocytogenes-positive and six artificially contaminated samples of minimally processed leafy vegetables were evaluated by the Most Probable Number (MPN) technique, with detection by the classical culture method and also by the culture method combined with real-time PCR (RTi-PCR) for 16S rRNA genes of L. monocytogenes. Positive MPN enrichment tubes were analyzed by RTi-PCR with primers specific for L. monocytogenes using the commercial preparation ABSOLUTE QPCR SYBR Green Mix (ABgene, UK). The real-time PCR assay presented good exclusivity and inclusivity results, and no statistically significant difference was found in comparison with the conventional culture method (p < 0.05). Moreover, RTi-PCR was fast and easy to perform, with MPN results obtained in ca. 48 h for RTi-PCR in comparison to 7 days for the conventional method.
Thermal Design, Analysis, and Testing of the Quench Module Insert Bread Board
NASA Technical Reports Server (NTRS)
Breeding, Shawn; Khodabandeh, Julia; Turner, Larry D. (Technical Monitor)
2001-01-01
The science requirement for materials processing is to provide the thermal gradient, solid/liquid interface front velocity, and processing temperature desired by the Principal Investigator (PI). Processing is performed by translating the furnace while the sample remains stationary, to minimize disturbances to the solid/liquid interface front during steady-state processing. Typical sample materials for this metals-and-alloys furnace are lead-tin alloys, lead-antimony alloys, and aluminum alloys. Samples must be safe to process and are therefore typically contained within hermetically sealed (gas-tight) cartridge tubes with inner ceramic liners (liquid-tight) to prevent contamination and/or reaction of the sample material with the cartridge tube.
Ban, Zhaojun; Feng, Jianhua; Wei, Wenwen; Yang, Xiangzheng; Li, Jilan; Guan, Junfeng; Li, Jiang
2015-08-01
Edible coating has been an innovation within the bioactive packaging concept. A comparative analysis of the effect of edible coating, sodium chlorite (SC) and their combined application on quality maintenance of minimally processed pomelo (Citrus grandis) fruits during storage at 4 °C was conducted. Results showed that the combination of edible coating and SC dipping delayed microbial development, whereas the sole coating or dipping treatment was less efficient. The synergetic application of edible coating and SC treatment under modified atmosphere packaging (MAP, 10% O2, 10% CO2) was able to maintain the total soluble solids level and ascorbic acid content, while reducing weight loss as well as the development of mesophiles and psychrotrophs. Nonetheless, the samples coated solely with N,O-carboxymethyl chitosan showed a significantly higher level of weight loss during storage in comparison with the untreated sample. Furthermore, the combined application of edible coating and SC dipping under active MAP best maintained the sensory quality of minimally processed pomelo fruit during storage. © 2015 Institute of Food Technologists®
Ares, Gastón; Giménez, Ana; Gámbaro, Adriana
2008-01-01
The aim of the present work was to study the influence of context, particularly the stage of the decision-making process (purchase vs consumption stage), on the sensory shelf life of minimally processed lettuce. Leaves of butterhead lettuce were placed in common polypropylene bags and stored at 5, 10 and 15 °C. Periodically, a panel of six assessors evaluated the appearance of the samples, and a panel of 40 consumers evaluated their appearance and answered "yes" or "no" to the questions: "Imagine you are in a supermarket, you want to buy a minimally processed lettuce, and you find a package of lettuce with leaves like this, would you normally buy it?" and "Imagine you have this leaf of lettuce stored in your refrigerator, would you normally consume it?". Survival analysis was used to calculate the shelf lives of minimally processed lettuce, considering both decision-making stages. Shelf lives estimated considering rejection to purchase were significantly lower than those estimated considering rejection to consume. Therefore, in order to be conservative and assure product quality, shelf life should be estimated considering consumers' rejection to purchase instead of rejection to consume, as has traditionally been done. On the other hand, results from logistic regressions of consumers' rejection percentage as a function of the evaluated appearance attributes suggested that consumers weighed those attributes differently when deciding whether to purchase or to consume minimally processed lettuce.
Devices and process for high-pressure magic angle spinning nuclear magnetic resonance
Hoyt, David W; Sears, Jr., Jesse A; Turcu, Romulus V.F.; Rosso, Kevin M; Hu, Jian Zhi
2014-04-08
A high-pressure magic angle spinning (MAS) rotor is detailed that includes a high-pressure sample cell that maintains high pressures exceeding 150 bar. The sample cell design minimizes pressure losses due to penetration over an extended period of time.
Microfluidic-Based Robotic Sampling System for Radioactive Solutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack D. Law; Julia L. Tripp; Tara E. Smith
A novel microfluidic-based robotic sampling system has been developed for sampling and analysis of liquid solutions in nuclear processes. This system couples a microfluidic sample chip with a robotic system designed to allow remote, automated sampling of process solutions in-cell, and facilitates direct coupling of the microfluidic sample chip with analytical instrumentation. It provides the capability for near-real-time analysis, reduces analytical waste, and minimizes the potential for personnel exposure associated with traditional sampling methods. A prototype sampling system was designed, built and tested. System testing demonstrated operability of the microfluidic-based sample system and identified system modifications to optimize performance.
NASA Astrophysics Data System (ADS)
Frimpong, G. K.; Kottoh, I. D.; Ofosu, D. O.; Larbi, D.
2015-05-01
The effect of ionizing radiation on the microbiological quality of minimally processed carrot and lettuce was studied. The aim was to investigate the effect of irradiation as a sanitizing agent on the bacteriological quality of some raw-eaten salad vegetables obtained from retailers in Accra, Ghana. Minimally processed carrot and lettuce were analysed for total viable count, total coliform count and pathogenic organisms. The samples collected were treated and analysed over a 15-day period. The total viable count for carrot ranged from 1.49 to 14.01 log10 cfu/10 g, while that of lettuce was 0.70 to 8.57 log10 cfu/10 g. It was also observed that the total coliform count for carrot was 1.46-7.53 log10 cfu/10 g and 0.14-7.35 log10 cfu/10 g for lettuce. The predominant pathogenic organisms identified were Bacillus cereus, Cronobacter sakazakii, Staphylococcus aureus, and Klebsiella spp. It was concluded that 2 kGy was the most effective medium-dose treatment for minimally processed carrot and lettuce.
Prevalence and counts of Salmonella spp. in minimally processed vegetables in São Paulo, Brazil.
Sant'Ana, Anderson S; Landgraf, Mariza; Destro, Maria Teresa; Franco, Bernadette D G M
2011-09-01
Minimally processed vegetables (MPV) may be important vehicles of Salmonella spp. and cause disease. This study aimed at detecting and enumerating Salmonella spp. in MPV marketed in the city of São Paulo, Brazil. A total of 512 samples of MPV packages collected in retail stores were tested for Salmonella spp., with total coliforms and Escherichia coli as indicators of hygienic status. Salmonella spp. was detected in four samples, two by the detection method and two by the counting method, the latter with counts of 8.8 × 10(2) CFU/g and 2.4 × 10(2) CFU/g. The serovars were Salmonella Typhimurium (three samples) and Salmonella enterica subsp. enterica O:47:z4,z23:- (one sample). Fourteen samples (2.7%) presented counts of E. coli above the maximum limit established by the Brazilian regulation for MPV (10(2) CFU/g). Therefore, tightened surveillance and effective intervention strategies are necessary in order to address consumers' and governments' concerns about the safety of MPV. Copyright © 2011 Elsevier Ltd. All rights reserved.
Plaza, Lucía; Sánchez-Moreno, Concepción; de Pascual-Teresa, Sonia; de Ancos, Begoña; Cano, M Pilar
2009-04-22
Avocado (Persea americana Mill.) is a good source of bioactive compounds such as monounsaturated fatty acids and sterols. The impact of minimal processing on its health-promoting attributes was investigated. Avocados cut into slices or halves were packaged in plastic bags under nitrogen, air, or vacuum and stored at 8 °C for 13 days. The stabilities of fatty acids and sterols as well as the effect on antioxidant activity were evaluated. The main fatty acid identified and quantified in avocado was oleic acid (about 57% of total content), whereas beta-sitosterol was found to be the major sterol (about 89% of total content). In general, after refrigerated storage, a significant decrease in fatty acid content was observed. Vacuum/halves and air/slices were the samples that best maintained this content. With regard to phytosterols, there were no significant changes during storage. Antioxidant activity showed a slight positive correlation with stearic acid content. At the end of refrigerated storage, a significant increase in antiradical efficiency (AE) was found for vacuum samples. AE values were quite similar among treatments. Hence, minimal processing can be a useful tool to preserve the health-related properties of avocado fruit.
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2010-08-03
This document describes the data quality objectives used to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.
NASA Astrophysics Data System (ADS)
Beyhaghi, Pooriya
2016-11-01
This work considers the problem of the efficient minimization of the infinite time average of a stationary ergodic process in the space of a handful of independent parameters which affect it. Problems of this class, derived from physical or numerical experiments which are sometimes expensive to perform, are ubiquitous in turbulence research. In such problems, any given function evaluation, determined with finite sampling, is associated with a quantifiable amount of uncertainty, which may be reduced via additional sampling. This work proposes the first algorithm of this type. Our algorithm remarkably reduces the overall cost of the optimization process for problems of this class. Further, under certain well-defined conditions, rigorous proof of convergence is established to the global minimum of the problem considered.
Bachelli, Mara Lígia Biazotto; Amaral, Rívia Darla Álvares; Benedetti, Benedito Carlos
2013-01-01
Lettuce is a leafy vegetable widely used in the industry for minimally processed products, in which the sanitization step is the crucial moment for ensuring a food that is safe for consumption. Chlorinated compounds, mainly sodium hypochlorite, are the most used in Brazil, but the formation of trihalomethanes from this sanitizer is a drawback. Thus, the search for alternatives to sodium hypochlorite has emerged as a matter of great interest. The suitability of chlorine dioxide (60 mg L−1/10 min), peracetic acid (100 mg L−1/15 min) and ozonated water (1.2 mg L−1/1 min) as alternative sanitizers to sodium hypochlorite (150 mg L−1 free chlorine/15 min) was evaluated. Minimally processed lettuce washed with tap water for 1 min was used as a control. Microbiological analyses were performed in triplicate, before and after sanitization, and at 3, 6, 9 and 12 days of storage at 2 ± 1 °C, with the product packaged in LDPE bags of 60 μm. Total coliforms, Escherichia coli, Salmonella spp., psychrotrophic and mesophilic bacteria, and yeasts and molds were evaluated. All samples of minimally processed lettuce showed absence of E. coli and Salmonella spp. The chlorine dioxide, peracetic acid and ozonated water treatments promoted reductions of 2.5, 1.1 and 0.7 log cycles, respectively, in the microbial load of the minimally processed product and can be used as substitutes for sodium hypochlorite. These alternative compounds promoted a shelf life of six days for minimally processed lettuce, while the shelf life with sodium hypochlorite was 12 days. PMID:24516433
TEMPUS: A facility for containerless electromagnetic processing onboard spacelab
NASA Technical Reports Server (NTRS)
Lenski, H.; Willnecker, R.
1990-01-01
The electromagnetic containerless processing facility TEMPUS was recently assigned to a flight on the IML-2 mission. In comparison with the TEMPUS facility already flown on a sounding rocket, several improvements had to be implemented, in particular relating to safety, resource management, and the possibility of processing different samples with different requirements in one mission. The basic design of this facility as well as the expected processing capabilities are presented. Two operational aspects turned out to strongly influence the facility design: control of the sample motion (first experimental results indicate that crew or ground interaction will be necessary to minimize residual sample motions during processing); and exchange of RF coils (during processing in vacuum, evaporated sample materials will condense on the cold surface and may force a coil exchange when a critical thickness is exceeded).
Capel, P.D.; Larson, S.J.
1995-01-01
Minimizing the loss of target organic chemicals from environmental water samples between the time of sample collection and isolation is important to the integrity of an investigation. During this sample holding time, there is a potential for analyte loss through volatilization from the water to the headspace, sorption to the walls and cap of the sample bottle, and transformation through biotic and/or abiotic reactions. This paper presents a chemodynamic-based, generalized approach to estimate the most probable loss processes for individual target organic chemicals. The basic premise is that the investigator must know which loss process(es) are important for a particular analyte, based on its chemodynamic properties, when choosing the appropriate method(s) to prevent loss.
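The screening logic the abstract describes (mapping an analyte's chemodynamic properties to its most probable holding-time loss processes) can be sketched as a simple rule-based classifier. This is an illustrative sketch only: the property thresholds below are hypothetical placeholders, not values from the paper.

```python
def probable_loss_processes(henry_kh_atm_m3_mol: float,
                            log_kow: float,
                            biodegradable: bool) -> list:
    """Screen which holding-time loss processes are likely important for
    an analyte, from basic chemodynamic properties. Thresholds are
    illustrative placeholders, not the paper's values."""
    processes = []
    if henry_kh_atm_m3_mol > 1e-5:   # volatile enough to partition to headspace
        processes.append("volatilization")
    if log_kow > 4:                  # hydrophobic -> sorbs to bottle walls and cap
        processes.append("sorption")
    if biodegradable:                # susceptible to biotic transformation
        processes.append("biotic transformation")
    return processes

# e.g. a volatile, moderately hydrophilic, recalcitrant analyte:
print(probable_loss_processes(2e-3, 2.1, False))  # ['volatilization']
```

The point of such a screen is that the preservation method follows from the dominant loss process: headspace-free bottles for volatiles, solvent rinses or glass for sorbers, and biocides or chilling for biodegradable analytes.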
Method for Hot Real-Time Sampling of Gasification Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomeroy, Marc D
The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot-scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as Molecular Beam Mass Spectrometry (MBMS), and gas chromatographs with sulfur- and nitrogen-specific detectors can provide real-time analysis, yielding operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistry. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has identified several key factors to consider when designing and installing an analytical sampling system for biomass gasification products: minimizing sampling distance, effective filtering as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.
Ranjitha, K; Shivashankara, K S; Sudhakar Rao, D V; Oberoi, Harinder Singh; Roy, T K; Bharathamma, H
2017-04-15
The effect of integrating an optimized combination of pretreatment with packaging on the shelf life of minimally processed cilantro leaves (MPCL) was appraised through analysis of their sensory attributes, biochemical characteristics, microbial population and flavour profile during storage. Cilantro leaves pretreated with 50 ppm kinetin and packed in 25 μm polypropylene bags showed a shelf life of 21 days. The optimized combination helped in efficiently maintaining the sensory parameters, flavour profile, and retention of antioxidants in MPCL up to 21 days. Studies on the effect of the optimized combination on microbial population and flavour profile revealed that, among the different microorganisms, pectinolysers had a significant effect on spoilage of MPCL, and a population of ≤3.59 log cfu/g was found to be acceptable. Principal component analysis of headspace volatiles revealed that (E)-2-undecenal, (E)-2-hexadecenal, (E)-2-tetradecenal and (E)-2-tetradecen-1-ol in stored samples clustered with fresh samples and, therefore, could be considered freshness indicators for MPCL. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Grugel, R. N.; Fedoseyev, A. I.; Kim, S.; Curreri, Peter A. (Technical Monitor)
2002-01-01
Gravity-driven thermosolutal convection that arises during controlled directional solidification (DS) of dendritic alloys promotes detrimental macro-segregation (e.g. freckles and steepling) in products such as turbine blades. Considerable time and effort has been spent to investigate this phenomenon experimentally and theoretically; although our knowledge has advanced to the point where convection can be modeled and accurately compared to experimental results, little has been done to minimize its onset and deleterious effects. The experimental work demonstrates that segregation can be minimized and microstructural uniformity promoted when a slow axial rotation is applied to the sample crucible during controlled directional solidification processing. Numerical modeling utilizing continuation and bifurcation methods has been employed to develop accurate physical and mathematical models with the intent of identifying and optimizing processing parameters.
Theory of sampling: four critical success factors before analysis.
Wagner, Claas; Esbensen, Kim H
2015-01-01
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
Intelligent Sampling of Hazardous Particle Populations in Resource-Constrained Environments
NASA Astrophysics Data System (ADS)
McCollough, J. P.; Quinn, J. M.; Starks, M. J.; Johnston, W. R.
2017-10-01
Sampling of anomaly-causing space environment drivers is necessary for both real-time operations and satellite design efforts, and optimizing measurement sampling helps minimize resource demands. Relating these measurements to spacecraft anomalies requires the ability to resolve spatial and temporal variability in the energetic charged particle hazard of interest. Here we describe a method for sampling particle fluxes informed by magnetospheric phenomenology so that, along a given trajectory, the variations from both temporal dynamics and spatial structure are adequately captured while minimizing oversampling. We describe the coordinates, sampling method, and specific regions and parameters employed. We compare resulting sampling cadences with data from spacecraft spanning the regions of interest during a geomagnetically active period, showing that the algorithm retains the gross features necessary to characterize environmental impacts on space systems in diverse orbital regimes while greatly reducing the amount of sampling required. This enables sufficient environmental specification within a resource-constrained context, such as limited telemetry bandwidth, processing requirements, and timeliness.
Application of higher harmonic blade feathering for helicopter vibration reduction
NASA Technical Reports Server (NTRS)
Powers, R. W.
1978-01-01
Higher harmonic blade feathering for helicopter vibration reduction is considered. Recent wind tunnel tests confirmed the effectiveness of higher harmonic control in reducing articulated rotor vibratory hub loads. Several predictive analyses developed in support of the NASA program were shown to be capable of calculating single harmonic control inputs required to minimize a single 4P hub response. In addition, a multiple-input, multiple-output harmonic control predictive analysis was developed. All techniques developed thus far obtain a solution by extracting empirical transfer functions from sampled data. Algorithm data sampling and processing requirements are minimal to encourage adaptive control system application of such techniques in a flight environment.
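The control approach the abstract alludes to, an empirical transfer function extracted from sampled data, then inputs chosen to minimize the vibratory hub response, reduces to a linear least-squares problem if the response is modeled as linear in the control inputs. A minimal sketch with a made-up 2x2 transfer matrix, not data from the NASA program:

```python
import numpy as np

# Assumed linear model: measured vibratory hub response z = z0 + T @ theta,
# where T is the transfer matrix identified empirically from sampled data
# (the values below are arbitrary illustrations) and z0 is the baseline
# response with no higher harmonic control input.
T = np.array([[2.0, 0.5],
              [0.3, 1.5]])
z0 = np.array([1.0, -0.8])

# Control inputs minimizing ||z0 + T @ theta||^2 in the least-squares sense:
theta, *_ = np.linalg.lstsq(T, -z0, rcond=None)

residual = z0 + T @ theta
print(np.allclose(residual, 0.0))  # square, invertible T -> response nulled
```

With a square, well-conditioned T the least-squares solution reduces to theta = -T⁻¹ z0 and the modeled response is driven to zero; with more responses than inputs, the same call returns the input that minimizes the residual vibration instead.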
New Noble Gas Studies on Popping Rocks from the Mid-Atlantic Ridge near 14°N
NASA Astrophysics Data System (ADS)
Kurz, M. D.; Curtice, J.; Jones, M.; Péron, S.; Wanless, V. D.; Mittelstaedt, E. L.; Soule, S. A.; Klein, F.; Fornari, D. J.
2017-12-01
New Popping Rocks were recovered in situ on the Mid-Atlantic Ridge (MAR) near 13.77° N, using HOV Alvin on cruise AT33-03 in 2016 on RV Atlantis. We report new helium, neon, argon, and CO2 step-crushing measurements on a subset of the glass samples, with a focus on a new procedure to collect seafloor samples with minimal exposure to air. Glassy seafloor basalts were collected in sealed containers using the Alvin mechanical arm and transported to the surface without atmospheric exposure. On the ship, the seawater was drained, the volcanic glass was transferred to stainless steel ultra-high-vacuum containers (in an oxygen-free glove box), which were then evacuated using a turbo-molecular pump and sealed for transport under vacuum. All processing was carried out under a nitrogen atmosphere. A control sample was collected from each pillow outcrop and processed normally in air. The preliminary step-crushing measurements show that the anaerobically collected samples have systematically higher 20Ne/22Ne, 21Ne/22Ne and 40Ar/36Ar than the control samples. Helium abundances and isotopes are consistent between anaerobically collected samples and control samples. These results suggest that minimizing atmospheric exposure during sample processing can significantly reduce air contamination for heavy noble gases, providing a new option for seafloor sampling. Higher vesicle abundances appear to yield a greater difference in neon and argon isotopes between the anaerobic and control samples, suggesting that atmospheric contamination is related to vesicle abundance, possibly through micro-fractures. The new data show variability in the maximum mantle neon and argon isotopic compositions, and abundance ratios, suggesting that the samples experienced variable outgassing prior to eruption, and may represent different phases of a single eruption, or multiple eruptions.
Improving the performance of minimizers and winnowing schemes
Marçais, Guillaume; Pellow, David; Bork, Daniel; Orenstein, Yaron; Shamir, Ron; Kingsford, Carl
2017-01-01
Motivation: The minimizers scheme is a method for selecting k-mers from sequences. It is used in many bioinformatics software tools to bin comparable sequences or to sample a sequence in a deterministic fashion at approximately regular intervals, in order to reduce memory consumption and processing time. Although very useful, the minimizers selection procedure has undesirable behaviors (e.g. too many k-mers are selected when processing certain sequences). Some of these problems were already known to the authors of the minimizers technique, and the natural lexicographic ordering of k-mers used by minimizers was recognized as their origin. Many software tools using minimizers employ ad hoc variations of the lexicographic order to alleviate those issues. Results: We provide an in-depth analysis of the effect of k-mer ordering on the performance of the minimizers technique. By using small universal hitting sets (a recently defined concept), we show how to significantly improve the performance of minimizers and avoid some of their worst behaviors. Based on these results, we encourage bioinformatics software developers to use an ordering based on a universal hitting set or, if not possible, a randomized ordering, rather than the lexicographic order. This analysis also settles negatively a conjecture (by Schleimer et al.) on the expected density of minimizers in a random sequence. Availability and Implementation: The software used for this analysis is available on GitHub: https://github.com/gmarcais/minimizers.git. Contact: gmarcais@cs.cmu.edu or carlk@cs.cmu.edu PMID:28881970
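The window-minimizer selection this abstract analyzes, and the swap from lexicographic to a randomized (hash-based) ordering that it recommends, can be sketched in a few lines. The sequence and the k and w values below are arbitrary illustrations, not from the paper:

```python
import hashlib

def minimizers(seq: str, k: int, w: int, order=None) -> set:
    """Positions of window minimizers: in each window of w consecutive
    k-mers, select the smallest k-mer under the given ordering
    (ties broken by leftmost position)."""
    order = order or (lambda kmer: kmer)   # default: lexicographic order
    selected = set()
    n_kmers = len(seq) - k + 1
    for start in range(n_kmers - w + 1):
        pos = min(range(start, start + w),
                  key=lambda i: (order(seq[i:i + k]), i))
        selected.add(pos)
    return selected

def random_order(kmer: str) -> str:
    # A pseudo-random ordering via hashing, standing in for the
    # randomized ordering the paper recommends over lexicographic.
    return hashlib.md5(kmer.encode()).hexdigest()

seq = "AAAAAAAAAACGTACGTAGCTAGCAAAAAA"
lex = minimizers(seq, k=5, w=4)
rnd = minimizers(seq, k=5, w=4, order=random_order)
n_kmers = len(seq) - 5 + 1
print(len(lex) / n_kmers, len(rnd) / n_kmers)  # selected-k-mer densities
```

Comparing the two printed densities on low-complexity sequences such as the A-runs above illustrates the abstract's point: the density of selected k-mers, and hence index size, depends on the ordering, which is why a universal-hitting-set or randomized ordering is preferred to the lexicographic one.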
Optimizing Urine Processing Protocols for Protein and Metabolite Detection.
Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Thompson, J Will; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K
In urine, factors such as timing of voids and duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid with detection, but can add more complexity in sample collection or analysis. We aimed to identify the optimal urine processing protocol for clinically-obtained urine samples that allows for the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquoted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Of these aliquots, some were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35-65 years of age provided paired 1st morning and random voided urine samples. Normalized protein concentrations were slightly higher in 1st AM voids compared to random "spot" voids. The addition of BA did not significantly change proteins, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions we tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed in the same day, BA does not appear to be necessary while the addition of PI enhances protein yields, regardless of 4°C or RT storage temperature.
Three Dimensional Projection Environment for Molecular Design and Surgical Simulation
2011-08-01
bypasses the cumbersome meshing process. The deformation model is only comprised of mass nodes, which are generated by sampling the object volume before... force should minimize the penetration volume, the haptic feedback force is derived directly. Additionally, a post-processing technique is developed to... render distinct physical tissue properties across different interaction areas. The proposed approach does not require any pre-processing and is
A high-throughput core sampling device for the evaluation of maize stalk composition
2012-01-01
Background A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen maintaining the composition of the biomass sample for relatively long periods of time. Conclusions The CSD has been demonstrated to successfully produce homogeneous stalk core samples in a repeatable manner with a throughput substantially superior to the currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameter evaluated, it is expected that the CSD will have utility for other bioenergy crops as well. 
PMID:22548834
Allyl isothiocyanate enhances shelf life of minimally processed shredded cabbage.
Banerjee, Aparajita; Penna, Suprasanna; Variyar, Prasad S
2015-09-15
The effect of allyl isothiocyanate (AITC), in combination with low temperature (10°C) storage, on post-harvest quality of minimally processed shredded cabbage was investigated. An optimum concentration of 0.05 μL/mL AITC was found to be effective in maintaining the microbial and sensory quality of the product for a period of 12 days. Inhibition of browning was shown to result from a down-regulation (1.4-fold) of phenylalanine ammonia lyase (PAL) gene expression and a consequent decrease in PAL enzyme activity and o-quinone content. In the untreated control samples, PAL activity increased following up-regulation in PAL gene expression that could be linearly correlated with enhanced o-quinone formation and browning. The efficacy of AITC in extending the shelf life of minimally processed shredded cabbage and its role in down-regulation of PAL gene expression resulting in browning inhibition in the product is reported here for the first time. Copyright © 2015 Elsevier Ltd. All rights reserved.
Barbosa, Ana Andréa Teixeira; Silva de Araújo, Hyrla Grazielle; Matos, Patrícia Nogueira; Carnelossi, Marcelo Augusto Guitierrez; Almeida de Castro, Alessandra
2013-06-17
The aim of this study is to examine the effects of nisin-incorporated cellulose films on the physicochemical and microbiological qualities of minimally processed mangoes. The use of antimicrobial films did not affect the physicochemical characteristics of mangoes and showed antimicrobial activity against Staphylococcus aureus, Listeria monocytogenes, Alicyclobacillus acidoterrestris and Bacillus cereus. The mango slices were inoculated with S. aureus and L. monocytogenes (10^7 CFU/g), and the viable cell numbers remained at 10^5 and 10^6 CFU/g, respectively, after 12 days. In samples packed with antimicrobial films, the viable number of L. monocytogenes cells was reduced below the detection level after 4 days. After 6 days, a reduction of six log units was observed for S. aureus. In conclusion, nisin showed antimicrobial activity in mangoes without interfering with the organoleptic characteristics of the fruit. This result suggests that nisin could potentially be used in active packaging to improve the safety of minimally processed mangoes. Copyright © 2013 Elsevier B.V. All rights reserved.
Myelin Breakdown Mediates Age-Related Slowing in Cognitive Processing Speed in Healthy Elderly Men
ERIC Educational Resources Information Center
Lu, Po H.; Lee, Grace J.; Tishler, Todd A.; Meghpara, Michael; Thompson, Paul M.; Bartzokis, George
2013-01-01
Background: To assess the hypothesis that in a sample of very healthy elderly men selected to minimize risk for Alzheimer's disease (AD) and cerebrovascular disease, myelin breakdown in late-myelinating regions mediates age-related slowing in cognitive processing speed (CPS). Materials and methods: The prefrontal lobe white matter and the genu of…
Ultrasonic-based membrane aided sample preparation of urine proteomes.
Jesus, Jemmyson Romário; Santos, Hugo M; López-Fernández, H; Lodeiro, Carlos; Arruda, Marco Aurélio Zezzi; Capelo, J L
2018-02-01
A new ultrafast ultrasonic-based method for shotgun proteomics as well as label-free protein quantification in urine samples is developed. The method first separates the urine proteins using nitrocellulose-based membranes and then proteins are in-membrane digested using trypsin. The enzymatic digestion process is accelerated from overnight to four minutes using a sonoreactor ultrasonic device. Overall, the sample treatment pipeline comprising protein separation, digestion and identification is done in just 3 h. The process is assessed using urine of healthy volunteers. The method shows that males can be differentiated from females using the protein content of urine in a fast, easy and straightforward way. 232 and 226 proteins are identified in the urine of males and females, respectively. Of these, 162 are common to both genders, whilst 70 are unique to males and 64 to females. Of the 162 common proteins, 13 are present at levels statistically different (p < 0.05). The method matches the analytical minimalism concept as outlined by Halls, as each stage of this analysis is evaluated to minimize the time, cost, sample requirement, reagent consumption, energy requirements and production of waste products. Copyright © 2017 Elsevier B.V. All rights reserved.
40 CFR 63.694 - Testing methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... determine treatment process required HAP biodegradation efficiency (Rbio) for compliance with standards... procedures to minimize the loss of compounds due to volatilization, biodegradation, reaction, or sorption... compounds due to volatilization, biodegradation, reaction, or sorption during the sample collection, storage...
40 CFR 63.694 - Testing methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... determine treatment process required HAP biodegradation efficiency (Rbio) for compliance with standards... procedures to minimize the loss of compounds due to volatilization, biodegradation, reaction, or sorption... compounds due to volatilization, biodegradation, reaction, or sorption during the sample collection, storage...
40 CFR 63.694 - Testing methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... determine treatment process required HAP biodegradation efficiency (Rbio) for compliance with standards... procedures to minimize the loss of compounds due to volatilization, biodegradation, reaction, or sorption... compounds due to volatilization, biodegradation, reaction, or sorption during the sample collection, storage...
Taguchi, Masumi; Kanki, Masashi; Yamaguchi, Yuko; Inamura, Hideichi; Koganei, Yosuke; Sano, Tetsuya; Nakamura, Hiromi; Asakura, Hiroshi
2017-03-01
Incidences of food poisoning traced to nonanimal food products have been increasingly reported. One of these was a recent large outbreak of Shiga toxin-producing Escherichia coli (STEC) O157 infection from the consumption of lightly pickled vegetables, indicating the necessity of imposing hygienic controls during manufacturing. However, little is known about the bacterial contamination levels in these minimally processed vegetables. Here we examined the prevalence of STEC, Salmonella spp., and Listeria monocytogenes in 100 lightly pickled vegetable products manufactured at 55 processing factories. Simultaneously, we also performed quantitative measurements of representative indicator bacteria (total viable counts, coliform counts, and β-glucuronidase-producing E. coli counts). STEC and Salmonella spp. were not detected in any of the samples; L. monocytogenes was detected in 12 samples manufactured at five of the factories. Microbiological surveillance at two factories (two surveys at factory A and three surveys at factory B) between June 2014 and January 2015 determined that the areas predominantly contaminated with L. monocytogenes included the refrigerators and packaging rooms. Genotyping provided further evidence that the contaminants found in these areas were linked to those found in the final products. Taken together, we demonstrated the prevalence of L. monocytogenes in lightly pickled vegetables sold at the retail level. Microbiological surveillance at the manufacturing factories further clarified the sources of the contamination in the retail products. These data indicate the necessity of implementing adequate monitoring programs to minimize health risks attributable to the consumption of these minimally processed vegetables.
Shrivastava, Sajal; Lee, Won-Il; Lee, Nae-Eung
2018-06-30
A critical unmet need in the diagnosis of bacterial infections, which remain a major cause of human morbidity and mortality, is the detection of scarce bacterial pathogens in a variety of samples in a rapid and quantitative manner. Herein, we demonstrate smartphone-based detection of Staphylococcus aureus in a culture-free, rapid, quantitative manner from minimally processed liquid samples using aptamer-functionalized fluorescent magnetic nanoparticles. The tagged S. aureus cells were magnetically captured in a detection cassette, and then fluorescence was imaged using a smartphone camera with a light-emitting diode as the excitation source. Our results showed quantitative detection capability with a minimum detectable concentration as low as 10 cfu/ml by counting individual bacteria cells, efficiently capturing S. aureus cells directly from a peanut milk sample within 10 min. When the selectivity of detection was investigated using samples spiked with other pathogenic bacteria, no significant non-specific detection occurred. Furthermore, strains of S. aureus from various origins showed comparable results, ensuring that the approach can be widely adopted. Therefore, the quantitative fluorescence imaging platform on a smartphone could allow on-site detection of bacteria, providing great potential assistance during major infectious disease outbreaks in remote and resource-limited settings. Copyright © 2018 Elsevier B.V. All rights reserved.
Automated high-throughput protein purification using an ÄKTApurifier and a CETAC autosampler.
Yoo, Daniel; Provchy, Justin; Park, Cynthia; Schulz, Craig; Walker, Kenneth
2014-05-30
As the pace of drug discovery accelerates there is an increased focus on screening larger numbers of protein therapeutic candidates to identify those that are functionally superior and to assess manufacturability earlier in the process. Although there have been advances toward high throughput (HT) cloning and expression, protein purification is still an area where improvements can be made to conventional techniques. Current methodologies for purification often involve a tradeoff between HT automation or capacity and quality. We present an ÄKTA combined with an autosampler, the ÄKTA-AS, which has the capability of purifying up to 240 samples in two chromatographic dimensions without the need for user intervention. The ÄKTA-AS has been shown to be reliable with sample volumes between 0.5 mL and 100 mL, and the innovative use of a uniquely configured loading valve ensures reliability by efficiently removing air from the system as well as preventing sample cross contamination. Incorporation of a sample pump flush minimizes sample loss and enables recoveries ranging from the low tens of micrograms to milligram quantities of protein. In addition, when used in an affinity capture-buffer exchange format the final samples are formulated in a buffer compatible with most assays without requirement of additional downstream processing. The system is designed to capture samples in 96-well microplate format allowing for seamless integration of downstream HT analytic processes such as microfluidic or HPLC analysis. Most notably, there is minimal operator intervention to operate this system, thereby increasing efficiency, sample consistency and reducing the risk of human error. Copyright © 2014 Elsevier B.V. All rights reserved.
Minimal-assumption inference from population-genomic data
NASA Astrophysics Data System (ADS)
Weissman, Daniel; Hallatschek, Oskar
Samples of multiple complete genome sequences contain vast amounts of information about the evolutionary history of populations, much of it in the associations among polymorphisms at different loci. Current methods that take advantage of this linkage information rely on models of recombination and coalescence, limiting the sample sizes and populations that they can analyze. We introduce a method, Minimal-Assumption Genomic Inference of Coalescence (MAGIC), that reconstructs key features of the evolutionary history, including the distribution of coalescence times, by integrating information across genomic length scales without using an explicit model of recombination, demography or selection. Using simulated data, we show that MAGIC's performance is comparable to PSMC' on single diploid samples generated with standard coalescent and recombination models. More importantly, MAGIC can also analyze arbitrarily large samples and is robust to changes in the coalescent and recombination processes. Using MAGIC, we show that the inferred coalescence time histories of samples of multiple human genomes exhibit inconsistencies with a description in terms of an effective population size based on single-genome data.
Evaluation of process errors in bed load sampling using a Dune Model
Gomez, Basil; Troutman, Brent M.
1997-01-01
Reliable estimates of the streamwide bed load discharge obtained using sampling devices are dependent upon good at-a-point knowledge across the full width of the channel. Using field data and information derived from a model that describes the geometric features of a dune train in terms of a spatial process observed at a fixed point in time, we show that sampling errors decrease as the number of samples collected increases, and the number of traverses of the channel over which the samples are collected increases. It also is preferable that bed load sampling be conducted at a pace which allows a number of bed forms to pass through the sampling cross section. The situations we analyze and simulate pertain to moderate transport conditions in small rivers. In such circumstances, bed load sampling schemes typically should involve four or five traverses of a river, and the collection of 20–40 samples at a rate of five or six samples per hour. By ensuring that spatial and temporal variability in the transport process is accounted for, such a sampling design reduces both random and systematic errors and hence minimizes the total error involved in the sampling process.
Martini, Valeria; Bernardi, Serena; Marelli, Priscilla; Cozzi, Marzia; Comazzi, Stefano
2018-06-01
Objectives Flow cytometry (FC) is becoming increasingly popular among veterinary oncologists for the diagnosis of lymphoma or leukaemia. It is accurate, fast and minimally invasive. Several studies of FC have been carried out in canine oncology and applied with great results, whereas there is limited knowledge and use of this technique in feline patients. This is mainly owing to the high prevalence of intra-abdominal lymphomas in this species and the difficulty associated with the diagnostic procedures needed to collect the sample. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods Ninety-seven consecutive samples of suspected feline lymphoma were retrospectively selected from the authors' institution's FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, appearance of the lesion, features of the sampling procedure and the experience of veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of it being finally processed for FC. Results Sample cellularity is a major factor in the likelihood of the sample being processed. Moreover, sample cellularity was significantly influenced by the needle size, with 21 G needles providing the highest cellularity. Notably, the sample cellularity and the likelihood of being processed did not vary between peripheral and intra-abdominal lesions. Approximately half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions and relevance FC can be safely applied to cases of suspected feline lymphomas, including intra-abdominal lesions. A 21 G needle should be preferred for sampling. 
This study provides the basis for the increased use of this minimally invasive, fast and cost-effective technique in feline medicine.
Cornwell, Brittany; Villamor, Eduardo; Mora-Plazas, Mercedes; Marin, Constanza; Monteiro, Carlos A; Baylin, Ana
2018-01-01
To determine if processed and ultra-processed foods consumed by children in Colombia are associated with lower-quality nutrition profiles than less processed foods. We obtained information on sociodemographic and anthropometric variables and dietary information through dietary records and 24 h recalls from a convenience sample of the Bogotá School Children Cohort. Foods were classified into three categories: (i) unprocessed and minimally processed foods, (ii) processed culinary ingredients and (iii) processed and ultra-processed foods. We also examined the combination of unprocessed foods and processed culinary ingredients. Representative sample of children from low- to middle-income families in Bogotá, Colombia. Children aged 5-12 years in 2011 Bogotá School Children Cohort. We found that processed and ultra-processed foods are of lower dietary quality in general. Nutrients that were lower in processed and ultra-processed foods following adjustment for total energy intake included: n-3 PUFA, vitamins A, B12, C and E, Ca and Zn. Nutrients that were higher in energy-adjusted processed and ultra-processed foods compared with unprocessed foods included: Na, sugar and trans-fatty acids, although we also found that some healthy nutrients, including folate and Fe, were higher in processed and ultra-processed foods compared with unprocessed and minimally processed foods. Processed and ultra-processed foods generally have unhealthy nutrition profiles. Our findings suggest the categorization of foods based on processing characteristics is promising for understanding the influence of food processing on children's dietary quality. More studies accounting for the type and degree of food processing are needed.
Melo, Ingrid Sofia Vieira de; Costa, Clara Andrezza Crisóstomo Bezerra; Santos, João Victor Laurindo Dos; Santos, Aldenir Feitosa Dos; Florêncio, Telma Maria de Menezes Toledo; Bueno, Nassib Bezerra
2017-01-01
The consumption of ultra-processed foods may be associated with the development of chronic diseases, both in adults and in children/adolescents. This consumption is growing worldwide, especially in low- and middle-income countries. Nevertheless, its magnitude in small, poor cities from the countryside is not well characterized, especially in adolescents. This study aimed to assess the consumption of minimally processed, processed and ultra-processed foods by adolescents from a poor Brazilian city and to determine if it was associated with excess weight, high waist circumference and high blood pressure. Cross-sectional study, conducted at a public federal school that offers technical education together with high school, located in the city of Murici. Adolescents of both sexes, aged 14 to 19 years, were included. Anthropometric characteristics (weight, height, waist circumference), blood pressure, and dietary intake data were assessed. Associations were calculated using Poisson regression models, adjusted by sex and age. In total, 249 adolescents were included, 55.8% of them girls, with a mean age of 16 years. The consumption of minimally processed foods was inversely associated with excess weight (Adjusted Prevalence Ratio: 0.61, 95% Confidence Interval: [0.39-0.96], P = 0.03). Although the consumption of ultra-processed foods was not associated with excess weight, high blood pressure and high waist circumference, 46.2% of the sample reported eating these products more than weekly. Consumption of minimally processed food is inversely associated with excess weight in adolescents. Investments in nutritional education aiming at the prevention of chronic diseases associated with the consumption of these foods are necessary.
Wijnands, Lucas M; Delfgou-van Asch, Ellen H M; Beerepoot-Mensink, Marieke E; van der Meij-Florijn, Alice; Fitz-James, Ife; van Leusden, Frans M; Pielaat, Annemarie
2014-03-01
Recent outbreaks with vegetables or fruits as vehicles have raised interest in the characterization of the public health risk due to microbial contamination of these commodities. Because qualitative and quantitative data regarding prevalence and concentration of various microbes are lacking, we conducted a survey to estimate the prevalence and contamination level of raw produce and the resulting minimally processed packaged salads as sold in The Netherlands. A dedicated sampling plan accounted for the amount of processed produce in relation to the amount of products, laboratory capacity, and seasonal influences. Over 1,800 samples of produce and over 1,900 samples of ready-to-eat mixed salads were investigated for Salmonella enterica serovars, Campylobacter spp., Escherichia coli O157, and Listeria monocytogenes. The overall prevalence in raw produce varied between 0.11% for E. coli O157 and L. monocytogenes and 0.38% for Salmonella. Prevalence point estimates for specific produce/pathogen combinations ranged for Salmonella from 0.53% in iceberg lettuce to 5.1% in cucumber. For Campylobacter, this ranged from 0.83% in endive to 2.7% in oak tree lettuce. These data will be used to determine the public health risk posed by the consumption of ready-to-eat mixed salads in The Netherlands.
FPGA design for constrained energy minimization
NASA Astrophysics Data System (ADS)
Wang, Jianwei; Chang, Chein-I.; Cao, Mang
2004-02-01
The Constrained Energy Minimization (CEM) approach has been widely used for hyperspectral detection and classification. The feasibility of implementing the CEM as a real-time processing algorithm in systolic arrays has also been demonstrated. The main challenge of realizing the CEM in hardware architecture lies in the computation of the inverse of the data correlation matrix performed in the CEM, which requires a complete set of data samples. In order to cope with this problem, the data correlation matrix must be calculated in a causal manner, using only the data samples up to the sample at the time it is processed. This paper presents a Field Programmable Gate Array (FPGA) design of such a causal CEM. The main feature of the proposed FPGA design is its use of the COordinate Rotation DIgital Computer (CORDIC) algorithm, which can convert a Givens rotation of a vector to a set of shift-add operations. As a result, the CORDIC algorithm can be easily implemented in hardware architecture, and therefore in an FPGA. Since the computation of the inverse of the data correlation matrix involves a series of Givens rotations, the use of the CORDIC algorithm allows the causal CEM to perform real-time processing in an FPGA. In this paper, an FPGA implementation of the causal CEM is studied and its detailed architecture described.
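The CEM operator itself is compact, and the algorithmic content of the abstract can be illustrated in plain NumPy. This is only a hedged sketch of the mathematics, not the paper's systolic-array or CORDIC design; the small regularization term eps in the causal update is my addition for numerical stability.

```python
import numpy as np

def cem_filter(X, d):
    """Batch CEM: X is (num_samples, bands), d is the target signature.
    The filter w = R^-1 d / (d^T R^-1 d) minimizes output energy
    subject to the constraint w^T d = 1."""
    R = X.T @ X / X.shape[0]          # sample correlation matrix
    Rinv_d = np.linalg.solve(R, d)
    w = Rinv_d / (d @ Rinv_d)
    return X @ w

def causal_cem(X, d, eps=1e-3):
    """Causal CEM: the correlation matrix at time t uses only the
    samples seen up to and including t, as the paper requires."""
    bands = X.shape[1]
    S = eps * np.eye(bands)           # regularized running sum of outer products
    out = np.empty(X.shape[0])
    for t, x in enumerate(X, start=1):
        S += np.outer(x, x)
        R = S / t                     # causal correlation estimate
        Rinv_d = np.linalg.solve(R, d)
        out[t - 1] = (x @ Rinv_d) / (d @ Rinv_d)
    return out
```

The FPGA design replaces the explicit matrix inverse here with a sequence of Givens rotations realized as CORDIC shift-add operations, but the input/output behavior it targets is the causal loop above.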
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR BENCH-SCALE REFORMER TREATABILITY STUDIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2011-02-11
This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Bench-Scale Reforming testing. The type, quantity, and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluidized bed steam reformer. A determination of the adequacy of the fluidized bed steam reformer process to treat Hanford tank waste is required. The initial step in determining the adequacy of the fluidized bed steam reformer process is to select archived waste samples from the 222-S Laboratory that will be used in bench-scale tests. Analyses of the selected samples will be required to confirm that the samples meet the shipping requirements and for comparison to the bench-scale reformer (BSR) test sample selection requirements.
Three-level sampler having automated thresholds
NASA Technical Reports Server (NTRS)
Jurgens, R. F.
1976-01-01
A three-level sampler is described that has its thresholds controlled automatically so as to track changes in the statistics of the random process being sampled. In particular, the mean value is removed and the ratio of the standard deviation of the random process to the threshold is maintained constant. The system is configured in such a manner that slow drifts in the level comparators and digital-to-analog converters are also removed. The ratio of the standard deviation to threshold level may be chosen within the constraints of the ratios of two integers N and M. These may be chosen to minimize the quantizing noise of the sampled process.
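In software terms, the adaptation loop this abstract describes might look like the following sketch. The device itself is analog hardware with level comparators and digital-to-analog converters; the ratio and smoothing constant alpha below are illustrative values, and the integer-ratio (N/M) constraint on the threshold is not modeled here.

```python
def three_level_sample(samples, ratio=0.6, alpha=0.01):
    """Adaptive three-level quantizer sketch.

    Tracks the running mean and variance of the input with exponential
    smoothing, removes the mean, and keeps the threshold at `ratio`
    times the running standard deviation, so the quantizer follows
    slow drifts in the statistics of the sampled process."""
    mean, var = 0.0, 1.0
    out = []
    for x in samples:
        mean += alpha * (x - mean)              # track slow mean drift
        var += alpha * ((x - mean) ** 2 - var)  # track variance
        thr = ratio * var ** 0.5                # threshold tied to std dev
        centered = x - mean                     # mean value removed
        out.append(0 if abs(centered) <= thr else (1 if centered > 0 else -1))
    return out
```

Holding the threshold-to-standard-deviation ratio constant keeps the occupancy of the three output levels stable, which is what allows the ratio to be chosen to minimize the quantizing noise of the sampled process.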
Shelf life extension of minimally processed cabbage and cucumber through gamma irradiation.
Khattak, Amal Badshah; Bibi, Nizakat; Chaudry, Muhammad Ashraf; Khan, Misal; Khan, Maazullah; Qureshi, Muhammad Jamil
2005-01-01
The influence of irradiation of minimally processed cabbage and cucumber on microbial safety, texture, and sensory quality was investigated. Minimally processed, polyethylene-packed, and irradiated cabbage and cucumber were stored at refrigeration temperature (5 degrees C) for 2 weeks. The firmness values ranged from 3.23 kg (control) to 2.82 kg (3.0-kGy irradiated samples) for cucumbers, with a gradual decrease in firmness with increasing radiation dose (0 to 3 kGy). Cucumbers softened just after irradiation with a dose of 3.0 kGy and after 14 days storage, whereas the texture remained within acceptable limits up to a radiation dose of 2.5 kGy. The radiation treatment had no effect on the appearance scores of cabbage; however, scores decreased from 7.0 to 6.7 during storage. The appearance and flavor scores of cucumbers decreased with increasing radiation dose, and overall acceptability was better after radiation doses of 2.5 and 3.0 kGy. The aerobic plate counts per gram for cabbage increased from 3 to 5 log CFU (control), from 1.85 to 2.93 log CFU (2.5 kGy), and from a few colonies to 2.6 log CFU (3.0 kGy) after 14 days of storage at 5 degrees C. A similar trend was noted for cucumber samples. No coliform bacteria were detected at radiation doses greater than 2.0 kGy in either cabbage or cucumber samples. Total fungal counts per gram of sample were within acceptable limits for cucumbers irradiated at 3.0 kGy, and for cabbage no fungi were detected after 2.0-kGy irradiation. The D-values for Escherichia coli in cucumber and cabbage were 0.19 and 0.17 kGy, and those for Salmonella Paratyphi A were 0.25 and 0.29 kGy for cucumber and cabbage, respectively.
Ultra-processed foods and the nutritional dietary profile in Brazil
Louzada, Maria Laura da Costa; Martins, Ana Paula Bortoletto; Canella, Daniela Silva; Baraldi, Larissa Galastri; Levy, Renata Bertazzi; Claro, Rafael Moreira; Moubarac, Jean-Claude; Cannon, Geoffrey; Monteiro, Carlos Augusto
2015-01-01
OBJECTIVE To assess the impact of consuming ultra-processed foods on the nutritional dietary profile in Brazil. METHODS Cross-sectional study conducted with data from the module on individual food consumption from the 2008-2009 Pesquisa de Orçamentos Familiares (POF – Brazilian Family Budgets Survey). The sample, which represented the section of the Brazilian population aged 10 years or over, involved 32,898 individuals. Food consumption was evaluated by two 24-hour food records. The consumed food items were classified into three groups: natural or minimally processed, including culinary preparations with these foods used as a base; processed; and ultra-processed. RESULTS The average daily energy consumption per capita was 1,866 kcal, with 69.5% provided by natural or minimally processed foods, 9.0% by processed foods and 21.5% by ultra-processed foods. The nutritional profile of the fraction of ultra-processed food consumption showed higher energy density, higher overall fat content, higher saturated and trans fat, higher levels of free sugar and less fiber, protein, sodium and potassium when compared to the fraction of consumption related to natural or minimally processed foods. Ultra-processed foods presented generally unfavorable characteristics when compared to processed foods. Greater inclusion of ultra-processed foods in the diet resulted in a general deterioration of the dietary nutritional profile. With the exception of sodium, the dietary profile indicators of Brazilians who consumed less ultra-processed food were the closest to international recommendations for a healthy diet. CONCLUSIONS The results from this study highlight the damage to health arising from the observed trend in Brazil of replacing traditional meals, based on natural or minimally processed foods, with ultra-processed foods. These results also support the recommendation to avoid the consumption of these kinds of foods.
PMID:26176747
Barca, E; Castrignanò, A; Buttafuoco, G; De Benedetto, D; Passarella, G
2015-07-01
Soil survey is generally time-consuming, labor-intensive, and costly. Optimization of the sampling scheme allows one to reduce the number of sampling points without decreasing, or while even increasing, the accuracy of the investigated attribute. Maps of bulk soil electrical conductivity (ECa) recorded with electromagnetic induction (EMI) sensors could be effectively used to direct soil sampling design for assessing the spatial variability of soil moisture. A protocol using a field-scale bulk ECa survey has been applied in an agricultural field in the Apulia region (southeastern Italy). Spatial simulated annealing was used to optimize the spatial soil sampling scheme, taking into account sampling constraints, field boundaries, and preliminary observations. Three optimization criteria were used: the first criterion (minimization of the mean of the shortest distances, MMSD) optimizes the spreading of the point observations over the entire field by minimizing the expectation of the distance between an arbitrarily chosen point and its nearest observation; the second criterion (minimization of the weighted mean of the shortest distances, MWMSD) is a weighted version of the MMSD, which uses the digital gradient of the gridded ECa data as a weighting function; and the third criterion (mean of the average ordinary kriging variance, MAOKV) minimizes the mean kriging estimation variance of the target variable. The last criterion utilizes the variogram model of soil water content estimated in a previous trial. The procedures, or a combination of them, were tested and compared in a real case. Simulated annealing was implemented with the software MSANOS, which is able to define or redesign any sampling scheme by increasing or decreasing the original sampling locations. The output consists of the computed sampling scheme, the convergence time, and the cooling law, which can be an invaluable support to the process of sampling design. The proposed approach found the optimal solution in a reasonable computation time. The use of the bulk ECa gradient as an exhaustive variable, known at every node of an interpolation grid, has allowed the optimization of the sampling scheme, distinguishing among areas with different priority levels.
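The MMSD criterion and the annealing search can be sketched as follows; this is an illustrative toy in Python, not the MSANOS implementation, and the function names, cooling schedule, and parameter values are all assumptions:

```python
import math
import random

def mmsd(scheme, field_points):
    """Mean of the shortest distances: average distance from each field
    location to its nearest sampling point (the MMSD criterion)."""
    return sum(
        min(math.hypot(fx - sx, fy - sy) for sx, sy in scheme)
        for fx, fy in field_points
    ) / len(field_points)

def anneal_scheme(field_points, n_samples, steps=1000,
                  temp=1.0, cooling=0.995, seed=0):
    """Toy spatial simulated annealing: move one sampling point at a time,
    accept worse schemes with probability exp(-increase / temperature),
    and remember the best scheme visited."""
    rng = random.Random(seed)
    scheme = rng.sample(field_points, n_samples)
    cost = mmsd(scheme, field_points)
    best, best_cost = scheme, cost
    for _ in range(steps):
        cand = list(scheme)
        cand[rng.randrange(n_samples)] = rng.choice(field_points)
        c = mmsd(cand, field_points)
        if c < cost or rng.random() < math.exp((cost - c) / temp):
            scheme, cost = cand, c
            if cost < best_cost:
                best, best_cost = scheme, cost
        temp *= cooling
    return best, best_cost
```

The MWMSD variant would simply weight each term of the sum by the local ECa gradient, and the MAOKV criterion would replace the distance objective with the mean kriging variance under a fitted variogram.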
Burtis, C.A.; Johnson, W.F.; Walker, W.A.
1985-08-05
A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.
Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.
1988-01-01
A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.
Costa, Clara Andrezza Crisóstomo Bezerra; dos Santos, João Victor Laurindo
2017-01-01
Background The consumption of ultra-processed foods may be associated with the development of chronic diseases, both in adults and in children/adolescents. This consumption is growing worldwide, especially in low- and middle-income countries. Nevertheless, its magnitude in small, poor cities of the countryside is not well characterized, especially in adolescents. This study aimed to assess the consumption of minimally processed, processed and ultra-processed foods by adolescents from a poor Brazilian city and to determine whether it was associated with excess weight, high waist circumference and high blood pressure. Methods Cross-sectional study, conducted at a public federal school that offers technical education together with high school, located in the city of Murici. Adolescents of both sexes, aged between 14 and 19 years, were included. Anthropometric characteristics (weight, height, waist circumference), blood pressure, and dietary intake data were assessed. Associations were calculated using Poisson regression models, adjusted by sex and age. Results In total, 249 adolescents were included, 55.8% of whom were girls, with a mean age of 16 years. The consumption of minimally processed foods was inversely associated with excess weight (Adjusted Prevalence Ratio: 0.61, 95% Confidence Interval: [0.39–0.96], P = 0.03). Although the consumption of ultra-processed foods was not associated with excess weight, high blood pressure or high waist circumference, 46.2% of the sample reported eating these products more than weekly. Conclusion Consumption of minimally processed food is inversely associated with excess weight in adolescents. Investment in nutritional education aimed at preventing chronic diseases associated with the consumption of these foods is necessary. PMID:29190789
Corrêa, Elizabeth Nappi; Retondario, Anabelle; Alves, Mariane de Almeida; Bricarello, Liliana Paula; Rockenbach, Gabriele; Hinnig, Patrícia de Fragas; Neves, Janaina das; Vasconcelos, Francisco de Assis Guedes de
2018-03-29
Access to food retailers is an environmental determinant that influences what people consume. This study aimed to test the association between the use of food outlets and schoolchildren's intake of minimally processed and ultra-processed foods. This was a cross-sectional study conducted in public and private schools in Florianópolis, state of Santa Catarina, southern Brazil, from September 2012 to June 2013. The sample consisted of randomly selected clusters of schoolchildren aged 7 to 14 years, who were attending 30 schools. Parents or guardians provided socioeconomic and demographic data and answered questions about use of food outlets. Dietary intake was surveyed using a dietary recall questionnaire based on the previous day's intake. The foods or food groups were classified according to the level of processing. Negative binomial regression was used for data analysis. We included 2,195 schoolchildren in the study. We found that buying foods from snack bars or fast-food outlets was associated with the intake frequency of ultra-processed foods among 11-14 years old in an adjusted model (incidence rate ratio, IRR: 1.11; 95% confidence interval, CI: 1.01;1.23). Use of butchers was associated with the intake frequency of unprocessed/minimally processed foods among children 11-14 years old in the crude model (IRR: 1.11; 95% CI: 1.01;1.22) and in the adjusted model (IRR: 1.11; 95% CI: 1.06;1.17). Use of butchers was associated with higher intake of unprocessed/minimally processed foods while use of snack bars or fast-food outlets may have a negative impact on schoolchildren's dietary habits.
NASA Astrophysics Data System (ADS)
Usmanov, Dilshadbek T.; Ninomiya, Satoshi; Hiraoka, Kenzo
2013-11-01
In this paper, the important issue of the desorption of less- and nonvolatile compounds with minimal sample decomposition in ambient mass spectrometry is approached using ambient flash desorption mass spectrometry. The preheated stainless steel filament was driven down and up along the vertical axis in 0.3 s. At the lowest position, it touched the surface of the sample with an invasion depth of 0.1 mm for 50 ms (flash heating) and was then removed from the surface (fast cooling). The heating rate corresponds to ~10⁴ °C/s at the filament temperature of 500 °C. The desorbed gaseous molecules were ionized using a dielectric barrier discharge ion source, and the produced ions were detected by a time-of-flight (TOF) mass spectrometer. Less-volatile samples, such as pharmaceutical tablets, narcotics, explosives, and C60, gave molecular and protonated molecule ions as major ions, with minimal thermal decomposition. For synthetic polymers (PMMA, PLA, and PS), the mass spectra reflected their backbone structures because of the suppression of the sequential thermal decomposition of the primary products. The present technique appears to be suitable for high-throughput qualitative analyses of many types of solid samples in the range from a few ng to 10 μg with minimal sample consumption. Some contribution from tribodesorption, in addition to thermal desorption, was suggested for the desorption processes.
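The quoted heating rate follows directly from the stated temperature rise and contact time; a quick arithmetic check (function name is illustrative):

```python
def heating_rate(delta_t_celsius, contact_time_s):
    """Average heating rate (deg C per second) from a temperature rise
    applied over a given contact time."""
    return delta_t_celsius / contact_time_s

# 500 deg C reached during the 50 ms contact corresponds to roughly
# 1e4 deg C/s, the rate quoted in the abstract.
rate = heating_rate(500, 0.050)
```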
USDA-ARS's Scientific Manuscript database
Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. Field and laboratory experiments have demonstrated that mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, can lead to contamination of fed pred...
NASA Technical Reports Server (NTRS)
Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William
2015-01-01
Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g⁻¹); and the destruction of minimal quantities of sample (μg) compared to traditional solution and/or pyrolysis analyses (mg).
Berry, Elaine D; Millner, Patricia D; Wells, James E; Kalchayanand, Norasak; Guerini, Michael N
2013-08-01
Reducing Escherichia coli O157:H7 in livestock manures before application to cropland is critical for reducing the risk of foodborne illness associated with produce. Our objective was to determine the fate of naturally occurring E. coli O157:H7 and other pathogens during minimally managed on-farm bovine manure composting processes. Feedlot pen samples were screened to identify E. coli O157:H7-positive manure. Using this manure, four piles of each of three different composting formats were constructed in each of two replicate trials. Composting formats were (i) turned piles of manure plus hay and straw, (ii) static stockpiles of manure, and (iii) static piles of covered manure plus hay and straw. Temperatures in the tops, toes, and centers of the conical piles (ca. 6.0 m³ each) were monitored. Compost piles that were turned every 2 weeks achieved higher temperatures for longer periods in the tops and centers than did piles that were left static. E. coli O157:H7 was not recovered from top samples of turned piles of manure plus hay and straw at day 28 and beyond, but top samples from static piles were positive for the pathogen up to day 42 (static manure stockpiles) and day 56 (static covered piles of manure plus hay and straw). Salmonella, Campylobacter spp., and Listeria monocytogenes were not found in top or toe samples at the end of the composting period, but E. coli O157:H7 and Listeria spp. were recovered from toe samples at day 84. Our findings indicate that some minimally managed composting processes can reduce E. coli O157:H7 and other pathogens in bovine manure but may be affected by season and/or initial levels of indigenous thermophilic bacteria. Our results also highlight the importance of adequate C:N formulation of initial mixtures for the production of high temperatures and rapid composting, and the need for periodic turning of the piles to increase the likelihood that all parts of the mass are subjected to high temperatures.
Cross, Paul C.; Caillaud, Damien; Heisey, Dennis M.
2013-01-01
Many ecological and epidemiological studies occur in systems with mobile individuals and heterogeneous landscapes. Using a simulation model, we show that the accuracy of inferring an underlying biological process from observational data depends on movement and spatial scale of the analysis. As an example, we focused on estimating the relationship between host density and pathogen transmission. Observational data can result in highly biased inference about the underlying process when individuals move among sampling areas. Even without sampling error, the effect of host density on disease transmission is underestimated by approximately 50 % when one in ten hosts move among sampling areas per lifetime. Aggregating data across larger regions causes minimal bias when host movement is low, and results in less biased inference when movement rates are high. However, increasing data aggregation reduces the observed spatial variation, which would lead to the misperception that a spatially targeted control effort may not be very effective. In addition, averaging over the local heterogeneity will result in underestimating the importance of spatial covariates. Minimizing the bias due to movement is not just about choosing the best spatial scale for analysis, but also about reducing the error associated with using the sampling location as a proxy for an individual’s spatial history. This error associated with the exposure covariate can be reduced by choosing sampling regions with less movement, including longitudinal information of individuals’ movements, or reducing the window of exposure by using repeated sampling or younger individuals.
Stocka, Jolanta; Tankiewicz, Maciej; Biziuk, Marek; Namieśnik, Jacek
2011-01-01
Pesticides are among the most dangerous environmental pollutants because of their stability, mobility and long-term effects on living organisms. Their presence in the environment is a particular danger. It is therefore crucial to monitor pesticide residues using all available analytical methods. The analysis of environmental samples for the presence of pesticides is very difficult: the processes involved in sample preparation are labor-intensive and time-consuming. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solvent-less and solvent-minimized techniques are becoming popular. The application of Green Chemistry principles to sample preparation is primarily leading to the miniaturization of procedures and the use of solvent-less techniques, and these are discussed in the paper. PMID:22174632
X-Ray Computed Tomography: The First Step in Mars Sample Return Processing
NASA Technical Reports Server (NTRS)
Welzenbach, L. C.; Fries, M. D.; Grady, M. M.; Greenwood, R. C.; McCubbin, F. M.; Zeigler, R. A.; Smith, C. L.; Steele, A.
2017-01-01
The Mars 2020 rover mission will collect and cache samples from the martian surface for possible retrieval and subsequent return to Earth. If the samples are returned, that mission would likely present an opportunity to analyze returned Mars samples within a geologic context on Mars. In addition, it may provide definitive information about the existence of past or present life on Mars. Mars sample return presents unique challenges for the collection, containment, transport, curation and processing of samples [1]. Foremost in the processing of returned samples are the closely paired considerations of life detection and Planetary Protection. In order to achieve Mars Sample Return (MSR) science goals, reliable analyses will depend on overcoming some challenging signal/noise-related issues, where sparse martian organic compounds must be reliably analyzed against the contamination background. While reliable analyses will depend on initial clean acquisition and robust documentation of all aspects of developing and managing the cache [2], there needs to be a reliable sample handling and analysis procedure that accounts for a variety of materials which may or may not contain evidence of past or present martian life. A recent report [3] suggests that a defined set of measurements should be made to effectively inform both science and Planetary Protection, when applied in the context of the two competing null hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. The defined measurements would include a phased approach that would be accepted by the community to preserve the bulk of the material but provide unambiguous science data that can be used and interpreted by various disciplines. Foremost is the concern that the initial steps would ensure the pristine nature of the samples.
Preliminary, non-invasive techniques such as computed X-ray tomography (XCT) have been suggested as the first method to interrogate and characterize the cached samples without altering the materials [1,2]. A recent report [4] indicates that XCT may minimally alter samples for some techniques, and work is needed to quantify these effects so as to maximize science return from initial XCT analysis while minimizing alteration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amoroso, J.; Peeler, D.; Edwards, T.
2012-05-11
A recommendation to eliminate all characterization of pour stream glass samples, and the glass fabrication and Product Consistency Test (PCT) of the sludge batch qualification sample, was made by a Six Sigma team chartered to eliminate non-value-added activities for the Defense Waste Processing Facility (DWPF) sludge batch qualification program, and is documented in the report SS-PIP-2006-00030. That recommendation was supported through a technical data review by the Savannah River National Laboratory (SRNL) and is documented in the memorandums SRNL-PSE-2007-00079 and SRNL-PSE-2007-00080. At the time those memorandums were written, the DWPF was processing sludge-only waste but has since transitioned to a coupled operation (sludge and salt). The SRNL was recently tasked to perform a similar data review relevant to coupled operations and re-evaluate the previous recommendations. This report evaluates the validity of eliminating the characterization of pour stream glass samples and the glass fabrication and PCT of the sludge batch qualification samples based on sludge-only and coupled operations. The pour stream sample has confirmed the DWPF's ability to produce an acceptable waste form from Slurry Mix Evaporator (SME) blending and product composition/durability predictions for the previous sixteen years, but ultimately the pour stream analysis has added minimal value to the DWPF's waste qualification strategy. Similarly, the information gained from the glass fabrication and PCT of the sludge batch qualification sample was determined to add minimal value to the waste qualification strategy, since that sample is routinely not representative of the waste composition ultimately processed at the DWPF due to blending and salt processing considerations. Moreover, the qualification process has repeatedly confirmed minimal differences in glass behavior from actual radioactive waste to glasses fabricated from simulants or batch chemicals.
In contrast, the variability study has significantly added value to the DWPF's qualification strategy. The variability study has evolved to become the primary aspect of the DWPF's compliance strategy, as it has been shown to be versatile and capable of adapting to the DWPF's various and diverse waste streams and blending strategies. The variability study, which aims to ensure that the durability requirements and the PCT and chemical composition correlations are valid for the compositional region to be processed at the DWPF, must continue to be performed. Due to the importance of the variability study and its place in the DWPF's qualification strategy, it is also discussed in this report. An analysis of historical data and Production Records indicated that the Six Sigma team's recommendation to eliminate all characterization of pour stream glass samples and the glass fabrication and PCT performed with the qualification glass does not compromise the DWPF's current compliance plan. Furthermore, the DWPF should continue to produce an acceptable waste form by following the remaining elements of the Glass Product Control Program, regardless of a sludge-only or coupled operations strategy. If the DWPF does decide to eliminate the characterization of pour stream samples, pour stream samples should continue to be collected for archival reasons, which would allow testing to be performed should any issues arise or new repository test methods be developed.
Rawson, Ashish; Koidis, Anastasios; Rai, Dilip K; Tuohy, Maria; Brunton, Nigel
2010-07-14
The effect of blanching (95 +/- 3 degrees C) followed by sous vide (SV) processing (90 degrees C for 10 min) on levels of two polyacetylenes in parsnip disks immediately after processing and during chill storage was studied and compared with the effect of water immersion (WI) processing (70 degrees C for 2 min). Blanching had the greatest influence on the retention of polyacetylenes in sous vide processed parsnip disks, resulting in significant decreases of 24.5% and 24% in falcarinol (1) and falcarindiol (2), respectively (p < 0.05). Subsequent SV processing did not result in additional significant losses in polyacetylenes compared to blanched samples. Subsequent anaerobic storage of SV processed samples resulted in a significant decrease in 1 levels (p < 0.05), although no change in 2 levels was observed (p > 0.05). 1 levels in WI processed samples were significantly higher than in SV samples.
Wu, Zhi-shuang; Zhang, Min; Wang, Shao-jin
2012-08-30
High-pressure (HP) inert gas processing causes inert gas and water molecules to form clathrate hydrates that restrict intracellular water activity and enzymatic reactions. This technique can be used to preserve fruits and vegetables. In this study, minimally processed (MP) pineapples were treated with HP (∼10 MPa) argon (Ar) and nitrogen (N) for 20 min. The effects of these treatments on respiration, browning and antioxidant potential of MP pineapples were investigated after cutting and during 20 days of storage at 4 °C. Lower respiration rate and ethylene production were found in HP Ar- and HP N-treated samples compared with control samples. HP Ar and HP N treatments effectively reduced browning and loss of total phenols and ascorbic acid and maintained antioxidant capacity of MP pineapples. They did not cause a significant decline in tissue firmness or increase in juice leakage. HP Ar treatments had greater effects than HP N treatments on reduction of respiration rate and ethylene production and maintenance of phenolic compounds and DPPH• and ABTS•+ radical-scavenging activities. Both HP Ar and HP N processing had beneficial effects on MP pineapples throughout 20 days of storage at 4 °C. Copyright © 2012 Society of Chemical Industry.
Cardador, Maria Jose; Gallego, Mercedes
2012-07-25
Chlorine solutions are usually used to sanitize fruit and vegetables in the fresh-cut industry due to their efficacy, low cost, and simple use. However, disinfection byproducts such as haloacetic acids (HAAs) can be formed during this process, which can remain on minimally processed vegetables (MPVs). These compounds are toxic and/or carcinogenic and have been associated with human health risks; therefore, the U.S. Environmental Protection Agency has set a maximum contaminant level for five HAAs at 60 μg/L in drinking water. This paper describes the first method to determine the nine HAAs that can be present in MPV samples, with static headspace coupled with gas chromatography-mass spectrometry where the leaching and derivatization of the HAAs are carried out in a single step. The proposed method is sensitive, with limits of detection between 0.1 and 2.4 μg/kg and an average relative standard deviation of ∼8%. From the samples analyzed, we can conclude that about 23% of them contain at least two HAAs (<0.4-24 μg/kg), which showed that these compounds are formed during washing and then remain on the final product.
Generalization bounds of ERM-based learning processes for continuous-time Markov chains.
Zhang, Chao; Tao, Dacheng
2012-12-01
Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.
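As a minimal illustration of the ERM principle the paper analyzes (this sketch shows only the minimization step over a finite hypothesis set, not the paper's Markov-chain deviation or symmetrization inequalities; all names are illustrative):

```python
def empirical_risk(h, samples):
    """Average 0-1 loss of hypothesis h over an observed sample sequence;
    the sequence may be time-dependent, e.g. a Markov chain trajectory."""
    return sum(h(x) != y for x, y in samples) / len(samples)

def erm(hypotheses, samples):
    """Empirical risk minimization: return the hypothesis with the
    smallest empirical risk on the observed samples."""
    return min(hypotheses, key=lambda h: empirical_risk(h, samples))

# Toy usage: pick the best threshold classifier for a short sequence.
samples = [(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)]
thresholds = [lambda x, t=t: int(x > t) for t in (0.0, 0.5, 1.0)]
best = erm(thresholds, samples)
```

The generalization question the paper studies is how far the empirical risk of `best` can lie from its expected risk when the samples are dependent rather than i.i.d.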
Non-Contact Conductivity Measurement for Automated Sample Processing Systems
NASA Technical Reports Server (NTRS)
Beegle, Luther W.; Kirby, James P.
2012-01-01
A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also serve to optimize operational time and the use of consumables.
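The flush-and-trigger logic described above can be sketched as a small state machine. The threshold values below are hypothetical, chosen only for illustration:

```python
# Illustrative sketch (hypothetical thresholds): a state machine that advances
# a sample-preparation sequence only when inlet and outlet conductivity
# readings indicate the system has returned to its low-conductivity baseline.

LOW_THRESHOLD = 5.0     # uS/cm, assumed baseline after de-ionized water flush
HIGH_THRESHOLD = 500.0  # uS/cm, assumed level during acid/base washes

def next_step(step, inlet, outlet):
    """Advance the protocol only when both probes read low conductivity,
    i.e. the previous reagent has been fully flushed out."""
    flushed = inlet < LOW_THRESHOLD and outlet < LOW_THRESHOLD
    return step + 1 if flushed else step

step = 0
# Simulated (inlet, outlet) readings: wash, partial flush, flushed, acid, flushed
readings = [(800.0, 750.0), (300.0, 400.0), (3.0, 4.0), (600.0, 580.0), (2.0, 2.5)]
for inlet, outlet in readings:
    step = next_step(step, inlet, outlet)
print(step)  # prints 2: two complete flush events advanced the protocol twice
```

In a real apparatus the trigger would also gate reagent valves; here only the step counter is modeled.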
Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M
2016-02-01
In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.
2014-09-01
optimal diagonal loading which minimizes the MSE. The behavior of optimal diagonal loading when the arrival process is composed of plane waves embedded...observation vectors. The examples of the ensemble correlation matrix corresponding to the input process consisting of a single or multiple plane waves...Y*ij is a complex conjugate of Yij. This result is used in order to evaluate the expectations of different quadratic forms. The Poincaré-Nash
Multidimensional Normalization to Minimize Plate Effects of Suspension Bead Array Data.
Hong, Mun-Gwan; Lee, Woojoo; Nilsson, Peter; Pawitan, Yudi; Schwenk, Jochen M
2016-10-07
Enhanced by the growing number of biobanks, biomarker studies can now be performed with reasonable statistical power by using large sets of samples. Antibody-based proteomics by means of suspension bead arrays offers one attractive approach to analyze serum, plasma, or CSF samples for such studies in microtiter plates. To expand measurements beyond single batches, with either 96 or 384 samples per plate, suitable normalization methods are required to minimize the variation between plates. Here we propose two normalization approaches utilizing MA coordinates. The multidimensional MA (multi-MA) and MA-loess both consider all samples of a microtiter plate per suspension bead array assay and thus do not require any external reference samples. We demonstrate the performance of the two MA normalization methods with data obtained from the analysis of 384 samples including both serum and plasma. Samples were randomized across 96-well sample plates, processed, and analyzed in assay plates. Using principal component analysis (PCA), we could show that plate-wise clusters found in the first two components were eliminated by multi-MA normalization as compared with other normalization methods. Furthermore, we studied the correlation profiles between random pairs of antibodies and found that both MA normalization methods substantially reduced the inflated correlation introduced by plate effects. Normalization approaches using multi-MA and MA-loess minimized batch effects arising from the analysis of several assay plates with antibody suspension bead arrays. In a simulated biomarker study, multi-MA restored associations lost due to plate effects. Our normalization approaches, which are available as R package MDimNormn, could also be useful in studies using other types of high-throughput assay data.
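A much-simplified sketch of plate-wise MA-style normalization follows. The published method (R package MDimNormn) additionally fits a loess curve of M against A to remove intensity-dependent trends; the constant-shift correction below is an illustrative reduction on simulated data:

```python
import numpy as np

# Simplified sketch: for each plate, compute M = log-intensity minus the
# across-plate reference profile and subtract the plate's median M, removing
# a constant plate effect. (A loess fit of M against A = mean log-intensity
# would additionally remove intensity-dependent trends, as in MA-loess.)

def ma_normalize(log_x, plates):
    """log_x: (samples, antibodies) log-intensities; plates: plate id per sample."""
    log_x = np.asarray(log_x, dtype=float)
    ref = log_x.mean(axis=0)                  # across-plate reference profile
    out = log_x.copy()
    for p in np.unique(plates):
        idx = np.asarray(plates) == p
        m = log_x[idx] - ref                  # M coordinate per sample/antibody
        out[idx] = log_x[idx] - np.median(m)  # remove constant plate shift
    return out

rng = np.random.default_rng(0)
base = rng.normal(10.0, 1.0, size=(20, 5))    # simulated log-intensities
shifted = base.copy()
shifted[10:] += 2.0                           # simulated plate effect on plate B
plates = ["A"] * 10 + ["B"] * 10
norm = ma_normalize(shifted, plates)
```

After normalization the two plates' mean levels should nearly coincide, mimicking the removal of the plate-wise PCA clusters reported in the abstract.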
Filtration of water-sediment samples for the determination of organic compounds
Sandstrom, Mark W.
1995-01-01
This report describes the equipment and procedures used for on-site filtration of surface-water and ground-water samples for determination of organic compounds. Glass-fiber filters and a positive displacement pumping system are suitable for processing most samples for organic analyses. An optional system that uses disposable in-line membrane filters is suitable for a specific gas chromatography/mass spectrometry, selected-ion monitoring analytical method for determination of organonitrogen herbicides. General procedures to minimize contamination of the samples include preparing a clean workspace at the site, selecting appropriate sample-collection materials, and cleaning of the equipment with detergent, tap water, and methanol.
2014-09-30
resulted in the identification of metabolite patterns indicative of flight line exposure when compared to non-flight line control subjects...virtually non-invasive sample collection, minimal sample processing, robust and stable analytical platform, with excellent analytical and biological...identification of metabolite patterns indicative of flight line exposure when compared to non-flight line control subjects. Regardless of fuel (JP-4 or
Adaptive Local Linear Regression with Application to Printer Color Management
2008-01-01
values formed the test samples. This process guaranteed that the CIELAB test samples were in the gamut for each printer, but each printer had a...digital images has recently led to increased consumer demand for accurate color reproduction. Given a CIELAB color one would like to reproduce, the color...management problem is to determine what RGB color one must send the printer to minimize the error between the desired CIELAB color and the CIELAB
Koidis, Anastasios; Rawson, Ashish; Tuohy, Maria; Brunton, Nigel
2012-06-01
Carrots and parsnips are often consumed as minimally processed, ready-to-eat convenience foods and contain minor quantities of bioactive aliphatic C17-polyacetylenes (falcarinol, falcarindiol, falcarindiol-3-acetate). Their retention during minimal processing was evaluated in an industrial trial. Carrots and parsnips were prepared in four different forms (disc cutting, baton cutting, cubing and shredding), and samples were taken at every point of the processing line. The unit operations were peeling, cutting and washing with chlorinated water; retention during 7 days of storage was also evaluated. The results showed that the initial unit operations (mainly peeling) influence polyacetylene retention, which was attributed to the high polyacetylene content of the peels. In most cases, when washing was performed after cutting, lower retention was observed, possibly due to leakage from the tissue damage that occurred in the cutting step. The relatively high retention during storage indicates high plant matrix stability. Comparing the behaviour of polyacetylenes in the two vegetables during storage, the results showed that they were slightly better retained in parsnips than in carrots. Unit operations, especially abrasive peeling, might need further optimisation to make them gentler and minimise bioactive losses. Copyright © 2011 Elsevier Ltd. All rights reserved.
A novel approach for calculating shelf life of minimally processed vegetables.
Corbo, Maria Rosaria; Del Nobile, Matteo Alessandro; Sinigaglia, Milena
2006-01-15
Shelf life of minimally processed vegetables is often calculated by using the kinetic parameters of the Gompertz equation as modified by Zwietering et al. [Zwietering, M.H., Jongenburger, F.M., Roumbouts, M., van't Riet, K., 1990. Modelling of the bacterial growth curve. Applied and Environmental Microbiology 56, 1875-1881.], taking 5×10^7 CFU/g as the maximum contamination value consistent with acceptable quality of these products. As this method does not allow estimation of the standard error of the shelf life, in this paper the modified Gompertz equation was re-parameterized to include the shelf life directly as a fitting parameter among the Gompertz parameters. Since the shelf life is a fitting parameter, its confidence interval can be determined by fitting the proposed equation to the experimental data. The goodness-of-fit of the new equation was tested using mesophilic bacteria cell loads from different minimally processed vegetables (packaged fresh-cut lettuce, fennel and shredded carrots) that differed in some process operations or in package atmosphere. The new equation described the data well and estimated the shelf life. The results obtained emphasize the importance of using the standard errors of the shelf life value to show significant differences among the samples.
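The modified Gompertz curve and the shelf-life inversion it implies can be sketched as follows. Parameter values are hypothetical; the paper's re-parameterization instead estimates the shelf life, and hence its standard error, directly during fitting:

```python
import math

E = math.e

def gompertz(t, A, mu, lam):
    """Modified Gompertz (Zwietering et al. 1990), y(t) = log10(N/N0).
    A: asymptotic log-increase, mu: maximum growth rate, lam: lag time."""
    return A * math.exp(-math.exp(mu * E / A * (lam - t) + 1.0))

def shelf_life(A, mu, lam, y_limit):
    """Closed-form inversion: the time at which the curve reaches y_limit.
    The paper instead makes this time itself a fitting parameter."""
    b = math.log(-math.log(y_limit / A))
    return lam + (1.0 - b) * A / (mu * E)

# Hypothetical fitted parameters for a fresh-cut vegetable:
A, mu, lam = 4.0, 0.8, 2.0           # log10 units, log10/day, days
sl = shelf_life(A, mu, lam, y_limit=3.0)
print(round(sl, 2))                  # prints 6.13 (days until 3-log10 growth)
```

Because the re-parameterized equation carries the shelf life as a parameter, standard nonlinear regression then yields its standard error directly, which this closed-form inversion cannot provide.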
Minimizing microbial contamination of sperm samples
Jenkins, Jill A.; Tiersch, Terrence R.; Green, Christopher C.
2011-01-01
Taken from the Methods section: With the collection and translocation of gametes from aquatic species, a potential hazard exists for microbial transfer. Contamination of semen can occur during collection, processing, storage, and transport. Some preventative measures are described below for limiting the spread and amplification of microorganisms such as bacteria, viruses, fungi, mycoplasmas, and parasites. Generally, sanitation during collection is essential. Materials and equipment used to freeze semen should be sterile. Following good practice guidelines for handling and processing samples collected for freezing is especially important for non-domestic animals where disease-free status cannot be guaranteed and unsophisticated technology is used (Russell et al. 1977).
Pendant-Drop Surface-Tension Measurement On Molten Metal
NASA Technical Reports Server (NTRS)
Man, Kin Fung; Thiessen, David
1996-01-01
Method of measuring surface tension of molten metal based on pendant-drop method implemented in quasi-containerless manner and augmented with digital processing of image data. Electrons bombard lower end of sample rod in vacuum, generating hanging drop of molten metal. Surface tension of drop computed from its shape. Technique minimizes effects of contamination.
USDA-ARS?s Scientific Manuscript database
Molecular gut-content analysis enables direct detection of arthropod predation with minimal disruption of on-going ecosystem processes. Mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, could lead to regurgitation or even rupturing of predators along with uneaten ...
Atomistic minimal model for estimating profile of electrodeposited nanopatterns
NASA Astrophysics Data System (ADS)
Asgharpour Hassankiadeh, Somayeh; Sadeghi, Ali
2018-06-01
We develop a computationally efficient and methodologically simple approach to molecular dynamics simulations of electrodeposition. Our minimal model takes into account the nontrivial electric field due to a sharp electrode tip in order to simulate the controllable coating of a thin layer on a surface with atomic precision. On the atomic scale, highly site-selective electrodeposition of ions and charged particles by means of the sharp tip of a scanning probe microscope is possible. A better understanding of the microscopic process, obtained mainly from atomistic simulations, helps us enhance the quality of this nanopatterning technique and make it applicable to the fabrication of nanowires and nanocontacts. In the limit of screened inter-particle interactions, very fast simulations of the electrodeposition process are feasible within the framework of the proposed model, allowing us to investigate how the shape of the overlayer depends on the tip-sample geometry and dielectric properties, electrolyte viscosity, etc. Our calculation results reveal that the sharpness of the profile of a nanoscale deposited overlayer is dictated by the component of the electric field normal to the sample surface underneath the tip.
Intraoral Laser Welding (ILW): ultrastructural and mechanical analysis
NASA Astrophysics Data System (ADS)
Fornaini, Carlo; Passaretti, Francesca; Villa, Elena; Nammour, Samir
2010-05-01
Nd:YAG lasers, used since 1970 in dental laboratories to weld metals on dental prostheses, have some limits, such as large dimensions, high cost and a fixed delivery system. Recently, it was proposed that the Nd:YAG laser device commonly used in the dental office could repair broken fixed, removable and orthodontic prostheses and weld metals directly in the mouth. The aim of this work is to evaluate, through SEM (scanning electron microscopy), EDS (energy-dispersive X-ray spectroscopy) and DMA (dynamic mechanical analysis), the quality and mechanical strength of the welding process, comparing a device normally used in the dental laboratory with a device normally used in the dental office for oral surgery. Sixteen CoCrMo metal plates and twenty steel orthodontic wires were divided into four groups: one welded without metal apposition by the laboratory laser, one welded with metal apposition by the laboratory laser, one welded without metal apposition by the office laser, and one welded with metal apposition by the office laser. The welds were analysed by SEM, EDS and DMA to compare the differences between the samples. SEM analysis showed that the plates welded by the office laser without apposition metal had a greater number of fissures than the other samples. EDS analysis showed a homogeneous composition of the metals in all the samples. The mechanical tests showed similar elastic behaviour of the samples, with minimal differences between the two devices. No wire broke, even under the maximum strength applied by the analyser. This study seems to demonstrate that the welding processes of the office and laboratory Nd:YAG laser devices, analysed by SEM, EDS and DMA, show minimal and non-significant differences, although these data remain to be confirmed with a greater number of samples.
Remotely Controlled Mixers for Light Microscopy Module (LMM) Colloid Samples
NASA Technical Reports Server (NTRS)
Kurk, Michael A. (Andy)
2015-01-01
Developed by NASA Glenn Research Center, the LMM aboard the International Space Station (ISS) is enabling multiple biomedical science experiments. Techshot, Inc., has developed a series of colloid specialty cell systems (C-SPECS) for use in the colloid science experiment module on the LMM. These low-volume mixing devices will enable uniform particle density and remotely controlled repetition of LMM colloid experiments. By automating the experiment process, C-SPECS allow colloid samples to be processed more quickly. In addition, C-SPECS will minimize the time the crew will need to spend on colloid experiments as well as eliminate the need for multiple and costly colloid samples, which are expended after a single examination. This high-throughput capability will lead to more efficient and productive use of the LMM. As commercial launch vehicles begin routine visits to the ISS, C-SPECS could become a significant means to process larger quantities of high-value materials for commercial customers.
Strategies for informed sample size reduction in adaptive controlled clinical trials
NASA Astrophysics Data System (ADS)
Arandjelović, Ognjen
2017-12-01
Clinical trial adaptation refers to any adjustment of the trial protocol after the onset of the trial. The main goal is to make the process of introducing new medical interventions to patients more efficient. The principal challenge, which is an outstanding research problem, is to be found in the question of how adaptation should be performed so as to minimize the chance of distorting the outcome of the trial. In this paper, we propose a novel method for achieving this. Unlike most of the previously published work, our approach focuses on trial adaptation by sample size adjustment, i.e. by reducing the number of trial participants in a statistically informed manner. Our key idea is to select the sample subset for removal in a manner which minimizes the associated loss of information. We formalize this notion and describe three algorithms which approach the problem in different ways, respectively, using (i) repeated random draws, (ii) a genetic algorithm, and (iii) what we term pair-wise sample compatibilities. Experiments on simulated data demonstrate the effectiveness of all three approaches, with a consistently superior performance exhibited by the pair-wise sample compatibilities-based method.
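The repeated-random-draws strategy (approach (i) above) can be sketched on simulated data. The squared shift of the retained-sample mean is a simplified stand-in for the paper's information-loss criterion:

```python
import random

# Illustrative sketch: among many random candidate subsets to remove, keep
# the subset whose removal least perturbs a chosen cohort summary (here,
# the sample mean) -- a simplified proxy for minimizing information loss.

def choose_removal(samples, n_remove, n_draws=200, seed=0):
    rng = random.Random(seed)
    full_mean = sum(samples) / len(samples)
    best, best_loss = None, float("inf")
    for _ in range(n_draws):
        drop = set(rng.sample(range(len(samples)), n_remove))
        kept = [x for i, x in enumerate(samples) if i not in drop]
        loss = (sum(kept) / len(kept) - full_mean) ** 2
        if loss < best_loss:
            best, best_loss = drop, loss
    return sorted(best), best_loss

rng = random.Random(42)
samples = [rng.gauss(0.0, 1.0) for _ in range(30)]   # simulated trial measurements
removed, loss = choose_removal(samples, n_remove=5)
```

The genetic-algorithm and pair-wise-compatibility variants replace the blind random draws with guided search over the same loss, which is why the paper finds them more effective.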
Impact of ultra-processed foods on micronutrient content in the Brazilian diet
Louzada, Maria Laura da Costa; Martins, Ana Paula Bortoletto; Canella, Daniela Silva; Baraldi, Larissa Galastri; Levy, Renata Bertazzi; Claro, Rafael Moreira; Moubarac, Jean-Claude; Cannon, Geoffrey; Monteiro, Carlos Augusto
2015-01-01
OBJECTIVE To evaluate the impact of consuming ultra-processed foods on the micronutrient content of the Brazilian population’s diet. METHODS This cross-sectional study was performed using data on individual food consumption from a module of the 2008-2009 Brazilian Household Budget Survey. A representative sample of the Brazilian population aged 10 years or over was assessed (n = 32,898). Food consumption data were collected through two 24-hour food records. Linear regression models were used to assess the association between the nutrient content of the diet and the quintiles of ultra-processed food consumption – crude and adjusted for family income per capita. RESULTS Mean daily energy intake per capita was 1,866 kcal, with 69.5% coming from natural or minimally processed foods, 9.0% from processed foods and 21.5% from ultra-processed foods. For sixteen out of the seventeen evaluated micronutrients, their content was lower in the fraction of the diet composed of ultra-processed foods compared with the fraction of the diet composed of natural or minimally processed foods. The content of 10 micronutrients in ultra-processed foods did not reach half the content level observed in the natural or minimally processed foods. The higher consumption of ultra-processed foods was inversely and significantly associated with the content of vitamins B12, vitamin D, vitamin E, niacin, pyridoxine, copper, iron, phosphorus, magnesium, selenium and zinc. The reverse situation was only observed for calcium, thiamin and riboflavin. CONCLUSIONS The findings of this study highlight that reducing the consumption of ultra-processed foods is a natural way to promote healthy eating in Brazil and, therefore, is in line with the recommendations made by the Guia Alimentar para a População Brasileira (Dietary Guidelines for the Brazilian Population) to avoid these foods. PMID:26270019
Mixed feed and its ingredients electron beam decontamination
NASA Astrophysics Data System (ADS)
Bezuglov, V. V.; Bryazgin, A. A.; Vlasov, A. Yu; Voronin, L. A.; Ites, Yu V.; Korobeynikov, M. V.; Leonov, S. V.; Leonova, M. A.; Tkachenko, V. O.; Shtarklev, E. A.; Yuskov, Yu G.
2017-01-01
Electron beam treatment has been used in food processing for decades to prevent or minimize food losses and to prolong storage time; the process is also called cold pasteurization. Mixed feed ingredients supplied in Russia are regularly found to be contaminated. To reduce the contamination level, contaminated mixed feed ingredient samples were treated with an electron beam at doses from 2 to 12 kGy. Contamination levels were decreased to a level ensuring a storage time of up to 1 year.
Zhu, Ying; Piehowski, Paul D; Zhao, Rui; Chen, Jing; Shen, Yufeng; Moore, Ronald J; Shukla, Anil K; Petyuk, Vladislav A; Campbell-Thompson, Martha; Mathews, Clayton E; Smith, Richard D; Qian, Wei-Jun; Kelly, Ryan T
2018-02-28
Nanoscale or single-cell technologies are critical for biomedical applications. However, current mass spectrometry (MS)-based proteomic approaches require samples comprising a minimum of thousands of cells to provide in-depth profiling. Here, we report the development of a nanoPOTS (nanodroplet processing in one pot for trace samples) platform for small cell population proteomics analysis. NanoPOTS enhances the efficiency and recovery of sample processing by downscaling processing volumes to <200 nL to minimize surface losses. When combined with ultrasensitive liquid chromatography-MS, nanoPOTS allows identification of ~1500 to ~3000 proteins from ~10 to ~140 cells, respectively. By incorporating the Match Between Runs algorithm of MaxQuant, >3000 proteins are consistently identified from as few as 10 cells. Furthermore, we demonstrate quantification of ~2400 proteins from single human pancreatic islet thin sections from type 1 diabetic and control donors, illustrating the application of nanoPOTS for spatially resolved proteome measurements from clinical tissues.
NASA Astrophysics Data System (ADS)
Capozzoli, Amedeo; Curcio, Claudio; Liseno, Angelo; Savarese, Salvatore; Schipani, Pietro
2016-07-01
This communication presents an innovative method for the diagnosis of reflector antennas in radio astronomy applications. The approach optimizes the number and the distribution of the far-field sampling points used to retrieve the antenna status in terms of feed misalignments, in order to drastically reduce the duration of the measurement process, minimize the effects of variable environmental conditions, and simplify the tracking of the source. The feed misplacement is modeled in terms of an aberration function of the aperture field. The relationship between the unknowns and the far-field pattern samples is linearized thanks to a principal component analysis. The number and the positions of the field samples are then determined by optimizing the behaviour of the singular values of the relevant operator.
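The singular-value-driven selection of sampling points can be illustrated with a greedy sketch on stand-in data. The operator below is a random matrix, not an actual linearized antenna model:

```python
import numpy as np

# Greedy sketch of the idea: given a linearized relation y ~ T x between
# aberration unknowns x and candidate far-field samples y, choose sample
# points (rows of T) so the retained operator stays well conditioned, by
# monitoring its smallest singular value at each addition.

def pick_samples(T, m):
    chosen = []
    for _ in range(m):
        best, best_s = None, -1.0
        for i in range(T.shape[0]):
            if i in chosen:
                continue
            sub = T[chosen + [i], :]
            s_min = np.linalg.svd(sub, compute_uv=False)[-1]  # smallest sing. value
            if s_min > best_s:
                best, best_s = i, s_min
        chosen.append(best)
    return chosen

rng = np.random.default_rng(3)
T = rng.normal(size=(40, 4))      # 40 candidate sampling points, 4 unknowns
rows = pick_samples(T, 6)
s_min = np.linalg.svd(T[rows], compute_uv=False)[-1]
```

Keeping the smallest singular value large keeps the inversion from the measured samples to the misalignment parameters stable, which is the role the singular-value behaviour plays in the optimization described above.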
Plasma processing conditions substantially influence circulating microRNA biomarker levels.
Cheng, Heather H; Yi, Hye Son; Kim, Yeonju; Kroh, Evan M; Chien, Jason W; Eaton, Keith D; Goodman, Marc T; Tait, Jonathan F; Tewari, Muneesh; Pritchard, Colin C
2013-01-01
Circulating, cell-free microRNAs (miRNAs) are promising candidate biomarkers, but optimal conditions for processing blood specimens for miRNA measurement remain to be established. Our previous work showed that the majority of plasma miRNAs are likely blood cell-derived. In the course of profiling lung cancer cases versus healthy controls, we observed a broad increase in circulating miRNA levels in cases compared to controls and that higher miRNA expression correlated with higher platelet and particle counts. We therefore hypothesized that the quantity of residual platelets and microparticles remaining after plasma processing might impact miRNA measurements. To systematically investigate this, we subjected matched plasma from healthy individuals to stepwise processing with differential centrifugation and 0.22 µm filtration and performed miRNA profiling. We found a major effect on circulating miRNAs, with the majority (72%) of detectable miRNAs substantially affected by processing alone. Specifically, 10% of miRNAs showed 4-30x variation, 46% showed 30-1,000x variation, and 15% showed >1,000x variation in expression solely from processing. This was predominantly due to platelet contamination, which persisted despite using standard laboratory protocols. Importantly, we show that platelet contamination in archived samples could largely be eliminated by additional centrifugation, even in frozen samples stored for six years. To minimize confounding effects in circulating miRNA biomarker studies, additional steps to limit platelet contamination are necessary. We provide specific practical recommendations to help minimize confounding variation attributable to plasma processing and platelet contamination.
Exposure of unionid mussels to electric current: Assessing risks associated with electrofishing
Holliman, F.M.; Kwak, T.J.; Cope, W.G.; Levine, Jay F.
2007-01-01
Electric current is routinely applied in freshwater for scientific sampling of fish populations (i.e., electrofishing). Freshwater mussels (families Margaritiferidae and Unionidae) are distributed worldwide, but their recent declines in diversity and abundance constitute an imperilment of global significance. Freshwater mussels are not targeted for capture by electrofishing, and any exposure to electric current is unintentional. The effects of electric shock are not fully understood for mussels but could disrupt vital physiological processes and represent an additional threat to their survival. In a controlled laboratory environment, we examined the consequences of exposure to two typical electrofishing currents, 60-Hz pulsed DC and 60-Hz AC, for the survival of adult and early life stages of three unionid species; we included fish as a quality control measure. The outcomes suggest that electrical exposure associated with typical electrofishing poses little direct risk to freshwater mussels. That is, adult mussel survival and righting behaviors (indicators of sublethal stress) were not adversely affected by electrical exposure. Glochidia (larvae that attach to and become parasites on fish gills or fins) showed minimal immediate reduction in viability after exposure. Metamorphosis from glochidia to free-living juvenile mussels was not impaired after electric current simulated capture-prone behaviors (stunning) in infested host fish. In addition, the short-term survival of juvenile mussels was not adversely influenced by exposure to electric current. Any minimal risk to imperiled mussels must be weighed at the population level against the benefits gained by using the gear for scientific sampling of fish in the same waters. However, scientists sampling fish by electrofishing should be aware of mussel reproductive periods and processes in order to minimize the harmful effects to host fish, especially in areas where mussel conservation is a concern.
Copyright by the American Fisheries Society 2007.
Effect of sample inhomogeneity in K-Ar dating
Engels, J.C.; Ingamells, C.O.
1970-01-01
Error in K-Ar ages is often due more to deficiencies in the splitting process, whereby portions of the sample are taken for potassium and for argon determination, than to imprecision in the analytical methods. The effect of the grain size of a sample and of the composition of a contaminating mineral can be evaluated, and this provides a useful guide in attempts to minimize error. Rocks and minerals should be prepared for age determination with the effects of contaminants and grain size in mind. The magnitude of such effects can be much larger than intuitive estimates might indicate. © 1970.
Microfluidics-to-Mass Spectrometry: A review of coupling methods and applications
Wang, Xue; Yi, Lian; Mukhitov, Nikita; Schrell, Adrian M.; Dhumpa, Raghuram; Roper, Michael G.
2014-01-01
Microfluidic devices offer great advantages in integrating sample processes, minimizing sample and reagent volumes, and increasing analysis speed, while mass spectrometry detection provides high information content, is sensitive, and can be used in quantitative analyses. The coupling of microfluidic devices to mass spectrometers is becoming more common, with the strengths of both systems being combined to analyze precious and complex samples. This review summarizes select achievements published between 2010 and July 2014 in novel coupling between microfluidic devices and mass spectrometers. The review is subdivided by the types of ionization sources employed and the different microfluidic systems used. PMID:25458901
Image re-sampling detection through a novel interpolation kernel.
Hilal, Alaa
2018-06-01
Image re-sampling involved in re-size and rotation transformations is an essential building block of a typical digital image alteration. Fortunately, traces left by such processes are detectable, proving that the image has undergone a re-sampling transformation. Within this context, we present in this paper two original contributions. First, we propose a new re-sampling interpolation kernel. It depends on five independent parameters that control its amplitude, angular frequency, standard deviation, and duration. We then demonstrate its capacity to imitate the behavior of the most frequent interpolation kernels used in digital image re-sampling applications. Second, the proposed model is used to characterize and detect the correlation coefficients involved in re-sampling transformations. The process involves minimizing an error function using the gradient method. The proposed method is assessed over a large database of 11,000 re-sampled images. Additionally, it is implemented within an algorithm in order to assess images that have undergone complex transformations. The obtained results demonstrate better performance and reduced processing time when compared to a reference method, validating the suitability of the proposed approaches. Copyright © 2018 Elsevier B.V. All rights reserved.
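The gradient-based kernel fitting described in this record can be sketched as follows. The paper's actual five-parameter kernel is not reproduced in the abstract, so a hypothetical three-parameter Gaussian-windowed cosine stands in for it; the fitting loop (minimize a squared-error function by gradient steps) is the part being illustrated.

```python
import numpy as np

# Hypothetical stand-in kernel: amplitude a, angular frequency w, std s.
# The paper's five-parameter form is not given in the abstract.
def kernel(t, params):
    a, w, s = params
    return a * np.exp(-t**2 / (2 * s**2)) * np.cos(w * t)

def fit_kernel(t, y, params0, lr=0.01, steps=2000, eps=1e-6):
    """Minimize mean squared error between kernel(t) and samples y with
    plain gradient descent on forward-difference gradient estimates."""
    p = np.asarray(params0, dtype=float)
    err = lambda q: float(np.mean((kernel(t, q) - y) ** 2))
    best, best_e = p.copy(), err(p)
    for _ in range(steps):
        g = np.zeros_like(p)
        for i in range(p.size):          # numerical gradient, one axis at a time
            dp = np.zeros_like(p)
            dp[i] = eps
            g[i] = (err(p + dp) - err(p)) / eps
        p = p - lr * g
        e = err(p)
        if e < best_e:                   # track the best iterate seen
            best, best_e = p.copy(), e
    return best, best_e

# Recover known parameters from noiseless samples of the kernel itself.
t = np.linspace(-3.0, 3.0, 200)
y = kernel(t, (1.0, 2.0, 0.8))
p0 = (0.8, 1.8, 1.0)
p_hat, e_hat = fit_kernel(t, y, p0)
e0 = float(np.mean((kernel(t, np.asarray(p0, float)) - y) ** 2))
```

The same loop applies unchanged to a richer parameterization; only `kernel` would need to change.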
Martini, Marinna A.; Sherwood, Chris; Horwitz, Rachel; Ramsey, Andree; Lightsom, Fran; Lacy, Jessie; Xu, Jingping
2006-01-01
3. Preserving minimally processed and partially processed versions of data sets. STG usually deploys ADV and PCADP probes configured as downward-looking, mounted on bottom tripods, with the objective of measuring high-resolution near-bed currents. The velocity profiles are recorded with minimal internal data processing. Also recorded are parameters such as temperature, conductivity, optical backscatter, light transmission, and high-frequency pressure. Sampling consists of high-frequency (1–10 Hz) bursts of long duration (5–30 minutes) at regular, recurring intervals over deployments of 1 to 6 months. The result is very large data files, often 500 MB per Hydra, per deployment, in Sontek's compressed binary format. This section introduces the Hydratools toolbox and provides information about the history of the system's development. The USGS philosophy regarding data quality is discussed to provide an understanding of the motivation for creating the system. General information about the following topics is also discussed: hardware and software required for the system, basic processing steps, limitations of program usage, and features that are unique to the programs.
Optimal regulation in systems with stochastic time sampling
NASA Technical Reports Server (NTRS)
Montgomery, R. C.; Lee, P. S.
1980-01-01
An optimal control theory that accounts for stochastic variable time sampling in a distributed microprocessor based flight control system is presented. The theory is developed by using a linear process model for the airplane dynamics and the information distribution process is modeled as a variable time increment process where, at the time that information is supplied to the control effectors, the control effectors know the time of the next information update only in a stochastic sense. An optimal control problem is formulated and solved for the control law that minimizes the expected value of a quadratic cost function. The optimal cost obtained with a variable time increment Markov information update process where the control effectors know only the past information update intervals and the Markov transition mechanism is almost identical to that obtained with a known and uniform information update interval.
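The deterministic special case of the problem above (a known, uniform update interval) reduces to a standard discrete-time LQR design; a minimal sketch of that baseline follows, with invented toy dynamics and none of the paper's stochastic-interval machinery.

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Iterate the discrete-time Riccati recursion to convergence and
    return the state-feedback gain K minimizing sum(x'Qx + u'Ru)."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Toy double-integrator dynamics with a 0.1 s sampling interval (assumed).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[1.0]])
K = dlqr_gain(A, B, Q, R)
eigs = np.linalg.eigvals(A - B @ K)   # closed loop should be stable
```

The paper's contribution is to replace the fixed interval with a Markov update process; the Riccati backbone above stays recognizable in that setting.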
NASA Technical Reports Server (NTRS)
Fletcher, L. A.; Allen, C. C.; Bastien, R.
2008-01-01
NASA's Johnson Space Center (JSC) and the Astromaterials Curator are charged by NPD 7100.10D with the curation of all of NASA's extraterrestrial samples, including those from future missions. This responsibility includes the development of new sample handling and preparation techniques; therefore, the Astromaterials Curator must begin developing procedures to preserve, prepare and ship samples at sub-freezing temperatures in order to enable future sample return missions. Such missions might include the return of frozen samples from permanently-shadowed lunar craters, the nuclei of comets, the surface of Mars, etc. We are demonstrating the ability to curate samples under cold conditions by designing, installing and testing a cold curation glovebox. This glovebox will allow us to store, document, manipulate and subdivide frozen samples while quantifying and minimizing contamination throughout the curation process.
Intact preservation of environmental samples by freezing under an alternating magnetic field.
Morono, Yuki; Terada, Takeshi; Yamamoto, Yuhji; Xiao, Nan; Hirose, Takehiro; Sugeno, Masaya; Ohwada, Norio; Inagaki, Fumio
2015-04-01
The study of environmental samples requires a preservation system that stabilizes the sample structure, including cells and biomolecules. To address this fundamental issue, we tested the cell alive system (CAS)-freezing technique for subseafloor sediment core samples. In the CAS-freezing technique, an alternating magnetic field is applied during the freezing process to produce vibration of water molecules and achieve a stable, super-cooled liquid phase. Upon further cooling, the temperature decreases further, achieving uniform freezing of the sample with minimal ice crystal formation. In this study, samples were preserved using the CAS and conventional freezing techniques at 4, -20, -80 and -196 (liquid nitrogen) °C. After 6 months of storage, microbial cell counts in conventionally frozen samples decreased significantly (down to 10.7% of the initial count), whereas the decrease with CAS-freezing was minimal. When Escherichia coli cells were tested under the same freezing conditions and stored for 2.5 months, CAS-frozen E. coli cells showed higher viability than those frozen under the other conditions. In addition, the alternating magnetic field did not affect the direction of remanent magnetization in sediment core samples, although slight partial demagnetization in intensity due to freezing was observed. Consequently, our data indicate that the CAS technique is highly useful for the preservation of environmental samples. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Pishravian, Arash; Aghabozorgi Sahaf, Masoud Reza
2012-12-01
In this paper, speech-music separation using Blind Source Separation is discussed. The separating algorithm is based on mutual information minimization, where the natural gradient algorithm is used for the minimization. In order to do that, the score function must be estimated from samples of the observation signals (mixtures of speech and music). The accuracy and speed of this estimation affect the quality of the separated signals and the processing time of the algorithm. The score function estimation in the presented algorithm is based on a Gaussian-mixture-based kernel density estimation method. Experimental results of the presented algorithm on speech-music separation, compared against a separating algorithm based on the Minimum Mean Square Error estimator, indicate that it achieves better performance and less processing time.
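The core ingredient above, estimating a score function from samples via kernel density estimation, can be sketched in a few lines. This uses a plain Gaussian KDE with a fixed, hand-picked bandwidth rather than the paper's Gaussian-mixture variant; for a unit normal the true score is psi(x) = x, which makes the sketch easy to sanity-check.

```python
import numpy as np

def gaussian_kde_score(x_eval, samples, h=0.25):
    """Estimate the score psi(x) = -d/dx log p(x) from i.i.d. samples
    using a Gaussian-kernel density estimate with bandwidth h."""
    u = (x_eval[:, None] - samples[None, :]) / h
    phi = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    p = phi.sum(axis=1) / (samples.size * h)             # density estimate
    dp = (-u * phi).sum(axis=1) / (samples.size * h**2)  # its derivative
    return -dp / p

# Sanity check: for standard normal data, psi(x) should be close to x.
rng = np.random.default_rng(0)
s = rng.standard_normal(20000)
xs = np.linspace(-1.0, 1.0, 9)
psi = gaussian_kde_score(xs, s)
```

The fixed bandwidth introduces a small shrinkage bias (the KDE of N(0, 1) behaves like N(0, 1 + h²)), which is one motivation for the mixture-based refinement the paper uses.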
Dialysis Extraction for Chromatography
NASA Technical Reports Server (NTRS)
Jahnsen, V. J.
1985-01-01
Chromatographic-sample pretreatment by dialysis detects traces of organic contaminants in water samples analyzed in field with minimal analysis equipment and minimal quantities of solvent. Technique also of value wherever aqueous sample and solvent must not make direct contact.
A parallel Jacobson-Oksman optimization algorithm [parallel processing (computers)]
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Markos, A. T.
1975-01-01
A gradient-dependent optimization technique which exploits the vector-streaming or parallel-computing capabilities of some modern computers is presented. The algorithm, derived by assuming that the function to be minimized is homogeneous, is a modification of the Jacobson-Oksman serial minimization method. In addition to describing the algorithm, conditions ensuring the convergence of its iterates and the results of numerical experiments on a group of sample test functions are presented. The results of these experiments indicate that this algorithm will solve optimization problems in less computing time than conventional serial methods on machines having vector-streaming or parallel-computing capabilities.
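The vector-streaming idea, evaluating a batch of candidate iterates in one vectorized pass rather than serially, can be illustrated on a homogeneous quadratic. This is an illustration of that idea only, not the Jacobson-Oksman update itself; the step sizes and test matrix below are invented.

```python
import numpy as np

def batched_line_search_gd(A, x0, steps=100):
    """Minimize the homogeneous quadratic f(x) = x'Ax by gradient descent,
    evaluating all candidate step sizes in a single vectorized batch."""
    alphas = np.array([0.001, 0.01, 0.05, 0.1, 0.3])
    x = np.asarray(x0, dtype=float)
    f = lambda X: np.einsum('...i,ij,...j->...', X, A, X)
    for _ in range(steps):
        g = 2 * A @ x
        trials = x[None, :] - alphas[:, None] * g[None, :]  # batch of candidates
        x = trials[np.argmin(f(trials))]                    # keep the best one
    return x, float(f(x))

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # positive definite test matrix
x_min, f_min = batched_line_search_gd(A, x0=[2.0, -1.0])
```

On a vector machine the `trials` evaluation is the part that streams; a serial method would pay one function evaluation per candidate instead.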
Ahro, M; Hakala, M; Kauppinen, J; Kallio, H
2001-10-01
Four apple wine fermentation processes were observed by means of direct-inlet gas-phase FTIR spectroscopy. The apple juice concentrates were each fermented by two Saccharomyces cerevisiae starter strains, and the experiment was repeated. The development of the concentrations of 1-propanol, 4-methylpyridine, acetaldehyde, acetic acid, and ethyl acetate was monitored. Two different sampling methods were used: static headspace and direct injection of the must. The performance of the FTIR method is limited by the high ethanol concentration. It can be mathematically proven that the amount of sample can be selected so that any distortion due to ethanol is minimized. Headspace GC-MS was used for preliminary compound identification.
Davis, Ben; Grosvenor, Chriss; Johnk, Robert; Novotny, David; Baker-Jarvis, James; Janezic, Michael
2007-01-01
Building materials are often incorporated into complex, multilayer macrostructures that are simply not amenable to measurements using coax or waveguide sample holders. In response to this, we developed an ultra-wideband (UWB) free-field measurement system. This measurement system uses a ground-plane-based system and two TEM half-horn antennas to transmit and receive the RF signal. The material samples are placed between the antennas, and reflection and transmission measurements are made. Digital signal processing techniques are then applied to minimize environmental and systematic effects. The processed data are compared to a plane-wave model to extract the material properties with optimization software based on genetic algorithms.
Tappi, Silvia; Tylewicz, Urszula; Romani, Santina; Siroli, Lorenzo; Patrignani, Francesca; Dalla Rosa, Marco; Rocculi, Pietro
2016-10-05
Vacuum impregnation (VI) is a processing operation that permits the impregnation of fruit and vegetable porous tissues with a faster and more homogeneous penetration of active compounds compared to classical diffusion processes. The objective of this research was to investigate the impact of VI treatment with the addition of calcium lactate on qualitative parameters of minimally processed melon during storage. For this aim, the work was divided into two parts. Initially, an optimization of process parameters was carried out in order to choose the optimal VI conditions for improving texture characteristics of minimally processed melon; these conditions were then used to impregnate melons for a shelf-life study under real storage conditions. On the basis of a 2³ factorial design, the effects of calcium lactate (CaLac) concentration between 0% and 5% and of minimum pressure (P) between 20 and 60 MPa were evaluated on color and texture. Processing parameters corresponding to a 5% CaLac concentration and a minimum pressure of 60 MPa were chosen for the storage study, during which the modifications of the main qualitative parameters were evaluated. Despite the high variability of the raw material, results showed that VI allowed a better maintenance of texture during storage. Nevertheless, other quality traits were negatively affected by the application of vacuum. Impregnated products showed a darker and more translucent appearance on account of the alteration of the structural properties. Moreover, microbial shelf-life was reduced to 4 days, compared to the 7 days obtained for control and dipped samples. © 2016 Institute of Food Technologists®.
Further improvement of hydrostatic pressure sample injection for microchip electrophoresis.
Luo, Yong; Zhang, Qingquan; Qin, Jianhua; Lin, Bingcheng
2007-12-01
The hydrostatic pressure sample injection method is able to minimize the number of electrodes needed for a microchip electrophoresis process; however, it can neither be applied to electrophoretic DNA sizing nor be implemented on the widely used single-cross microchip. This paper presents an injector design that makes the hydrostatic pressure sample injection method suitable for DNA sizing. By introducing an assistant channel into the normal double-cross injector, a rugged DNA sample plug suitable for sizing can be successfully formed within the cross area during sample loading. This paper also demonstrates that hydrostatic pressure sample injection can be performed in a single-cross microchip by controlling the radial position of the detection point in the separation channel. Rhodamine 123 and its derivative as a model sample were successfully separated.
Droplet-Based Segregation and Extraction of Concentrated Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buie, C R; Buckley, P; Hamilton, J
2007-02-23
Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.
Montowska, Magdalena; Alexander, Morgan R; Tucker, Gregory A; Barrett, David A
2015-11-15
We present the application of a novel ambient LESA-MS method for the authentication of processed meat products. A set of 25 species- and protein-specific heat-stable peptide markers has been detected in processed samples manufactured from beef, pork, horse, chicken and turkey meat. We demonstrate that several peptides derived from myofibrillar and sarcoplasmic proteins are sufficiently resistant to processing to serve as specific markers of processed products. The LESA-MS technique required minimal sample preparation without fractionation and enabled the unambiguous and simultaneous identification of skeletal muscle proteins and peptides as well as other components of animal origin, including milk proteins such as casein alpha-S1, in whole meat product digests. We have identified, for the first time, six fast type II and five slow/cardiac type I MHC peptide markers in various processed meat products. The study demonstrates that complex mixtures of processed proteins/peptides can be examined effectively using this approach. Copyright © 2015 Elsevier Ltd. All rights reserved.
Michael, Claire W; Naik, Kalyani; McVicker, Michael
2013-05-01
We developed a value stream map (VSM) of the Papanicolaou test procedure to identify opportunities to reduce waste and errors, created a new VSM, and implemented a new process emphasizing Lean tools. Preimplementation data revealed the following: (1) processing time (PT) for 1,140 samples averaged 54 hours; (2) 27 accessioning errors were detected on review of 357 random requisitions (7.6%); (3) 5 of the 20,060 tests had labeling errors that had gone undetected in the processing stage. Four were detected later during specimen processing but 1 reached the reporting stage. Postimplementation data were as follows: (1) PT for 1,355 samples averaged 31 hours; (2) 17 accessioning errors were detected on review of 385 random requisitions (4.4%); and (3) no labeling errors were undetected. Our results demonstrate that implementation of Lean methods, such as first-in first-out processes and minimizing batch size by staff actively participating in the improvement process, allows for higher quality, greater patient safety, and improved efficiency.
Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R.; Fernandes, Daryl L.; Satsangi, Jack; Spencer, Daniel I. R.
2015-01-01
Introduction Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. Methods 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup was automated using a Hamilton Microlab STARlet Liquid Handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid(BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. Results There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. 
Conclusions: The three different serum sample tubes processed using the described methods cause minimal intra-individual variation in the whole serum N-glycan profile when processed using an automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures. PMID:25831126
Ventham, Nicholas T; Gardner, Richard A; Kennedy, Nicholas A; Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R; Fernandes, Daryl L; Satsangi, Jack; Spencer, Daniel I R
2015-01-01
Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup was automated using a Hamilton Microlab STARlet Liquid Handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid(BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. 
The three different serum sample tubes processed using the described methods cause minimal intra-individual variation in the whole serum N-glycan profile when processed using an automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures.
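The pairwise-correlation analysis described in these two records can be sketched with synthetic data: each sample is a vector of chromatogram peak areas, and replicate tubes from one individual should correlate near 1 while samples from different individuals should not. The profiles and 2% noise level below are invented, not the study's.

```python
import numpy as np

rng = np.random.default_rng(42)
n_peaks = 40
profile_a = rng.uniform(1, 100, n_peaks)   # individual A's "true" peak areas
profile_b = rng.uniform(1, 100, n_peaks)   # individual B's

def tube_replicate(profile):
    """Simulate one sample tube: small multiplicative processing noise."""
    return profile * rng.normal(1.0, 0.02, profile.size)

# Three tube types per individual, as in the study design.
samples = np.array([tube_replicate(profile_a) for _ in range(3)] +
                   [tube_replicate(profile_b) for _ in range(3)])
corr = np.corrcoef(samples)                # 6 x 6 pairwise correlation matrix

within_a = corr[0, 1]                      # two tubes from individual A
across = corr[0, 3]                        # tube from A vs tube from B
```

Feeding `1 - corr` to any hierarchical clustering routine would then group the rows by individual, which is the matching behavior the study reports.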
Predicting green: really radical (plant) predictive processing
Friston, Karl
2017-01-01
In this article we account for the way plants respond to salient features of their environment under the free-energy principle for biological systems. Biological self-organization amounts to the minimization of surprise over time. We posit that any self-organizing system must embody a generative model whose predictions ensure that (expected) free energy is minimized through action. Plants respond in a fast, and yet coordinated manner, to environmental contingencies. They pro-actively sample their local environment to elicit information with an adaptive value. Our main thesis is that plant behaviour takes place by way of a process (active inference) that predicts the environmental sources of sensory stimulation. This principle, we argue, endows plants with a form of perception that underwrites purposeful, anticipatory behaviour. The aim of the article is to assess the prospects of a radical predictive processing story that would follow naturally from the free-energy principle for biological systems; an approach that may ultimately bear upon our understanding of life and cognition more broadly. PMID:28637913
Molecular epidemiology biomarkers-Sample collection and processing considerations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Nina T.; Pfleger, Laura; Berger, Eileen
2005-08-07
Biomarker studies require processing and storage of numerous biological samples with the goals of obtaining a large amount of information and minimizing future research costs. An efficient study design includes provisions for processing of the original samples, such as cryopreservation, DNA isolation, and preparation of specimens for exposure assessment. Use of standard, two-dimensional and nanobarcodes and customized electronic databases assure efficient management of large sample collections and tracking results of data analyses. Standard operating procedures and quality control plans help to protect sample quality and to assure validity of the biomarker data. Specific state, federal and international regulations are in place regarding research with human samples, governing areas including custody, safety of handling, and transport of human samples. Appropriate informed consent must be obtained from the study subjects prior to sample collection and confidentiality of results maintained. Finally, examples of three biorepositories of different scale (European Cancer Study, National Cancer Institute and School of Public Health Biorepository, University of California, Berkeley) are used to illustrate challenges faced by investigators and the ways to overcome them. New software and biorepository technologies are being developed by many companies that will help to bring biological banking to a new level required by molecular epidemiology of the 21st century.
Anaerobic co-digestion of sewage sludge and molasses
NASA Astrophysics Data System (ADS)
Kalemba, Katarzyna; Barbusiński, Krzysztof
2017-11-01
The efficiency of simultaneous digestion of sewage sludge and a by-product of refining sugar beets (molasses) was investigated. The study was conducted for 28 days under mesophilic conditions. 0.5%, 1%, 1.5%, 2% and 3% (m/m) of molasses was added to the mixture of sludge. The results of the study showed that the addition of molasses had a positive effect on biogas production. The biggest biogas yield was achieved in the sample with 0.5% of molasses (95.69 mL/g VS). In this sample, biogas production increased by 21% in comparison with the reference sample (without molasses). The biggest methane content (73%) was also observed in the sample with 0.5% of molasses; for comparison, the reference sample produced biogas with a methane content of 70%. Doses above 0.5% of molasses caused inhibition of the fermentation process. The degree of degradation of organic matter was 38.53% in the reference sample and 39.71% in the sample with 0.5% of molasses, while in the other samples it was in the range of 35.61-36.76% (from 3% to 1% of molasses, respectively). The digestion process had an adverse effect on the dewatering properties of the sludge: before co-digestion, capillary suction time ranged from 31 s to 55 s, and after the process it increased to a range of 36 s to 556 s (from 0% to 3% of molasses, respectively).
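A back-of-envelope consistency check of the figures above: the reference yield itself is not stated in the abstract, so it is inferred here from the best yield and the reported 21% increase.

```python
# Reported figures for the 0.5% molasses sample.
best_yield = 95.69        # mL biogas per g VS
increase = 0.21           # relative increase vs the reference sample

# Implied reference yield: best = reference * (1 + increase).
reference_yield = best_yield / (1 + increase)   # about 79.1 mL/g VS

# Methane volume per g VS, using the reported CH4 fractions.
methane_best = best_yield * 0.73
methane_ref = reference_yield * 0.70
```

So the 0.5% dose improves not only total biogas but also the absolute methane yield, which is the quantity that matters for energy recovery.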
Mission Planning and Decision Support for Underwater Glider Networks: A Sampling on-Demand Approach
Ferri, Gabriele; Cococcioni, Marco; Alvarez, Alberto
2015-01-01
This paper describes an optimal sampling approach to support glider fleet operators and marine scientists during the complex task of planning the missions of fleets of underwater gliders. Optimal sampling, which has gained considerable attention in the last decade, consists in planning the paths of gliders to minimize a specific criterion pertinent to the phenomenon under investigation. Different criteria (e.g., A, G, or E optimality), used in geosciences to obtain an optimum design, lead to different sampling strategies. In particular, the A criterion produces paths for the gliders that minimize the overall level of uncertainty over the area of interest. However, there are commonly operative situations in which the marine scientists may prefer not to minimize the overall uncertainty of a certain area, but instead they may be interested in achieving an acceptable uncertainty sufficient for the scientific or operational needs of the mission. We propose and discuss here an approach named sampling on-demand that explicitly addresses this need. In our approach the user provides an objective map, setting both the amount and the geographic distribution of the uncertainty to be achieved after assimilating the information gathered by the fleet. A novel optimality criterion, called Aη, is proposed and the resulting minimization problem is solved by using a Simulated Annealing based optimizer that takes into account the constraints imposed by the glider navigation features, the desired geometry of the paths and the problems of reachability caused by ocean currents. This planning strategy has been implemented in a Matlab toolbox called SoDDS (Sampling on-Demand and Decision Support). The tool is able to automatically download the ocean fields data from MyOcean repository and also provides graphical user interfaces to ease the input process of mission parameters and targets. 
The results obtained by running SoDDS on three different scenarios are provided and show that SoDDS, which is currently used at NATO STO Centre for Maritime Research and Experimentation (CMRE), can represent a step forward towards a systematic mission planning of glider fleets, dramatically reducing the efforts of glider operators. PMID:26712763
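The simulated-annealing optimizer at the heart of the planner described above can be sketched generically. The real state (a set of glider paths) and the real cost (the Aη criterion) are not reproduced here; a one-dimensional toy cost stands in, and the temperature schedule and step size are invented.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995,
                        iters=4000, seed=1):
    """Generic simulated-annealing minimizer: accept improvements always,
    accept uphill moves with Boltzmann probability exp(-delta / T)."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best, best_c = x, c
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy < c or rng.random() < math.exp((c - cy) / max(t, 1e-12)):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
        t *= cooling                      # geometric cooling schedule
    return best, best_c

# Toy problem: a 1-D multimodal cost with several local minima.
cost = lambda x: (x - 2.0) ** 2 + math.sin(5 * x)
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_best, c_best = simulated_annealing(cost, step, x0=-3.0)
```

In the planner, `neighbor` would perturb glider waypoints subject to navigation and current-reachability constraints, and `cost` would evaluate the objective-map criterion; the acceptance loop is unchanged.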
Mission Planning and Decision Support for Underwater Glider Networks: A Sampling on-Demand Approach.
Ferri, Gabriele; Cococcioni, Marco; Alvarez, Alberto
2015-12-26
This paper describes an optimal sampling approach to support glider fleet operators and marine scientists during the complex task of planning the missions of fleets of underwater gliders. Optimal sampling, which has gained considerable attention in the last decade, consists in planning the paths of gliders to minimize a specific criterion pertinent to the phenomenon under investigation. Different criteria (e.g., A, G, or E optimality), used in geosciences to obtain an optimum design, lead to different sampling strategies. In particular, the A criterion produces paths for the gliders that minimize the overall level of uncertainty over the area of interest. However, there are commonly operative situations in which the marine scientists may prefer not to minimize the overall uncertainty of a certain area, but instead they may be interested in achieving an acceptable uncertainty sufficient for the scientific or operational needs of the mission. We propose and discuss here an approach named sampling on-demand that explicitly addresses this need. In our approach the user provides an objective map, setting both the amount and the geographic distribution of the uncertainty to be achieved after assimilating the information gathered by the fleet. A novel optimality criterion, called Aη, is proposed and the resulting minimization problem is solved by using a Simulated Annealing based optimizer that takes into account the constraints imposed by the glider navigation features, the desired geometry of the paths and the problems of reachability caused by ocean currents. This planning strategy has been implemented in a Matlab toolbox called SoDDS (Sampling on-Demand and Decision Support). The tool is able to automatically download the ocean fields data from MyOcean repository and also provides graphical user interfaces to ease the input process of mission parameters and targets. 
The results obtained by running SoDDS on three different scenarios are provided and show that SoDDS, which is currently used at NATO STO Centre for Maritime Research and Experimentation (CMRE), can represent a step forward towards a systematic mission planning of glider fleets, dramatically reducing the efforts of glider operators.
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those generated by the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
Sample processor for chemical analysis
NASA Technical Reports Server (NTRS)
Boettger, Heinz G. (Inventor)
1980-01-01
An apparatus is provided which can process numerous samples that must be chemically analyzed by the application of fluids such as liquid reagents, solvents and purge gases, as well as the application of dumps for receiving the applied fluid after they pass across the sample, in a manner that permits numerous samples to be processed in a relatively short time and with minimal manpower. The processor includes a rotor which can hold numerous cartridges containing inert or adsorbent material for holding samples, and a pair of stators on opposite sides of the rotor. The stators form stations spaced along the path of the cartridges which lie in the rotor, and each station can include an aperture in one stator through which a fluid can be applied to a cartridge resting at that station, and an aperture in the other stator which can receive the fluid which has passed through the cartridge. The stators are sealed to the ends of the cartridges lying on the rotor, to thereby isolate the stations from one another.
Obtaining Cross-Sections of Paint Layers in Cultural Artifacts Using Femtosecond Pulsed Lasers
Harada, Takaaki; Spence, Stephanie; Margiolakis, Athanasios; Deckoff-Jones, Skylar; Ploeger, Rebecca; Shugar, Aaron N.; Hamm, James F.; Dani, Keshav M.; Dani, Anya R.
2017-01-01
Recently, ultrafast lasers exhibiting high peak powers and extremely short pulse durations have created a new paradigm in materials processing. The precision and minimal thermal damage provided by ultrafast lasers in the machining of metals and dielectrics also suggest a novel application in obtaining precise cross-sections of fragile, combustible paint layers in artwork and cultural heritage property. Cross-sections of paint and other decorative layers on artwork provide critical insight into its history and authenticity. However, the current methodology, which uses a scalpel to obtain a cross-section, can cause further damage, including crumbling, delamination, and paint compression. Here, we demonstrate the ability to make controlled cross-sections of paint layers with a femtosecond pulsed laser, with minimal damage to the surrounding artwork. The femtosecond laser cutting overcomes challenges such as fragile paint disintegrating under scalpel pressure, or oxidation by a continuous-wave (CW) laser. Variations in laser power and translational speed of the laser while cutting exhibit different benefits for cross-section sampling. The use of femtosecond lasers in studying artwork also presents new possibilities in analyzing, sampling, and cleaning of artwork with minimal destructive effects. PMID:28772468
Obtaining Cross-Sections of Paint Layers in Cultural Artifacts Using Femtosecond Pulsed Lasers.
Harada, Takaaki; Spence, Stephanie; Margiolakis, Athanasios; Deckoff-Jones, Skylar; Ploeger, Rebecca; Shugar, Aaron N; Hamm, James F; Dani, Keshav M; Dani, Anya R
2017-01-26
Recently, ultrafast lasers exhibiting high peak powers and extremely short pulse durations have created a new paradigm in materials processing. The precision and minimal thermal damage provided by ultrafast lasers in the machining of metals and dielectrics also suggest a novel application in obtaining precise cross-sections of fragile, combustible paint layers in artwork and cultural heritage property. Cross-sections of paint and other decorative layers on artwork provide critical insight into its history and authenticity. However, the current methodology, which uses a scalpel to obtain a cross-section, can cause further damage, including crumbling, delamination, and paint compression. Here, we demonstrate the ability to make controlled cross-sections of paint layers with a femtosecond pulsed laser, with minimal damage to the surrounding artwork. The femtosecond laser cutting overcomes challenges such as fragile paint disintegrating under scalpel pressure, or oxidation by a continuous-wave (CW) laser. Variations in laser power and translational speed of the laser while cutting exhibit different benefits for cross-section sampling. The use of femtosecond lasers in studying artwork also presents new possibilities in analyzing, sampling, and cleaning of artwork with minimal destructive effects.
NASA Astrophysics Data System (ADS)
Wang, Yubin; Ismail, Marliya; Farid, Mohammed
2017-10-01
Currently, baby food is sterilized using retort processing, which gives an extended shelf life. However, this type of heat processing leads to a reduction in organoleptic and nutritional value. Alternatively, the combination of pressure and heat can be used to achieve sterilization at reduced temperatures. This study investigates the potential of pressure-assisted thermal sterilization (PATS) technology for baby food sterilization. Here, baby food (apple puree) inoculated with Bacillus subtilis spores was treated using PATS at different operating temperatures, pressures, and times, and was compared with thermal-only treatment. The results revealed that the decimal reduction time of B. subtilis under PATS treatment was lower than that under thermal-only treatment. At similar levels of spore inactivation, ascorbic acid retention in the PATS-treated samples was higher than in the thermally treated samples. The results indicate that PATS could be a viable technology for baby food processing that minimizes quality deterioration.
Oligosaccharide formation during commercial pear juice processing.
Willems, Jamie L; Low, Nicholas H
2016-08-01
The effect of enzyme treatment and processing on the oligosaccharide profile of commercial pear juice samples was examined by high performance anion exchange chromatography with pulsed amperometric detection and capillary gas chromatography with flame ionization detection. Industrial samples representing the major stages of processing, produced with various commercial enzyme preparations, were studied. Through the use of commercially available standards and laboratory-scale enzymatic hydrolysis of pectin, starch, and xyloglucan, galacturonic acid oligomers, glucose oligomers (e.g., maltose and cellotriose), and isoprimeverose were identified as being formed during pear juice production. It was found that the majority of polysaccharide hydrolysis and oligosaccharide formation occurred during enzymatic treatment at the pear mashing stage and that the remaining processing steps had minimal impact on the carbohydrate-based chromatographic profile of pear juice. In addition, all commercial enzyme preparations and conditions (time and temperature) studied produced similar carbohydrate-based chromatographic profiles.
Voller, L M; Dawson, P L; Han, I Y
1996-12-01
New aseptic processes are being used and refined to produce convenient, shelf-stable liquid products containing meat particles. These processes utilize high-temperature, short-time thermal treatments to minimize food quality change; however, little research has been conducted on the effects of this process on the texture of meat from mature hens traditionally used for canning. The objective of this study was to examine textural and structural changes in meat due to different high-temperature (HT) heat treatments and meat moisture contents, using electron microscopy and torsion analyses. Cooked gels of different moisture contents (71.2 to 74.8%) were formulated from spent fowl breast meat and exposed to processing temperatures of 120 or 124 C. The HT processing resulted in stronger (tougher) meat gels that were more deformable (chewier) than gels that were not HT processed. Water added prior to cooking was not retained in samples that were cooked and then processed at 124 C, but was retained in the samples processed at 120 C. Electron micrographs showed a more organized and open gel structure in the samples with higher moisture content and lower-temperature (120 C) processing compared to the lower-moisture and higher-temperature (124 C) treatments.
Microfluidic systems and methods of transport and lysis of cells and analysis of cell lysate
Culbertson, Christopher T.; Jacobson, Stephen C.; McClain, Maxine A.; Ramsey, J. Michael
2004-08-31
Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.
Microfluidic systems and methods for transport and lysis of cells and analysis of cell lysate
Culbertson, Christopher T [Oak Ridge, TN; Jacobson, Stephen C [Knoxville, TN; McClain, Maxine A [Knoxville, TN; Ramsey, J Michael [Knoxville, TN
2008-09-02
Microfluidic systems and methods are disclosed which are adapted to transport and lyse cellular components of a test sample for analysis. The disclosed microfluidic systems and methods, which employ an electric field to rupture the cell membrane, cause unusually rapid lysis, thereby minimizing continued cellular activity and resulting in greater accuracy of analysis of cell processes.
Rezende, Ana Carolina B; Igarashi, Maria Crystina; Destro, Maria Teresa; Franco, Bernadette D G M; Landgraf, Mariza
2014-10-01
This study evaluated the effects of irradiation on the reduction of Shiga toxin-producing Escherichia coli (STEC), Salmonella strains, and Listeria monocytogenes, as well as on the sensory characteristics of minimally processed spinach. Spinach samples were inoculated with a cocktail of three strains each of STEC, Salmonella strains, and L. monocytogenes, separately, and were exposed to gamma radiation doses of 0, 0.2, 0.4, 0.6, 0.8, and 1.0 kGy. Samples that were exposed to 0.0, 1.0, and 1.5 kGy and kept under refrigeration (4°C) for 12 days were submitted to sensory analysis. D10-values ranged from 0.19 to 0.20 kGy for Salmonella and from 0.20 to 0.21 kGy for L. monocytogenes; for STEC, the value was 0.17 kGy. Spinach showed good acceptability, even after exposure to 1.5 kGy. Because gamma radiation reduced the selected pathogens without causing significant changes in the quality of spinach leaves, it may be a useful method to improve safety in the fresh produce industry.
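Because one D10 dose gives, by definition, a tenfold (1-log10) reduction, the reported D10-values translate directly into expected log reductions at any dose. A small sketch using the numbers above:

```python
# Expected log10 reduction at a given gamma dose, from reported D10-values:
# log10 reduction = dose / D10; surviving fraction = 10 ** (-dose / D10).

def log_reduction(dose_kgy, d10_kgy):
    return dose_kgy / d10_kgy

def surviving_fraction(dose_kgy, d10_kgy):
    return 10 ** (-log_reduction(dose_kgy, d10_kgy))

# D10-values (kGy) reported in the study above (upper bounds where a range is given)
d10 = {"STEC": 0.17, "Salmonella": 0.20, "L. monocytogenes": 0.21}

for organism, d in d10.items():
    print(f"{organism}: {log_reduction(1.0, d):.2f} log10 reduction at 1.0 kGy")
```

For example, at 1.0 kGy the STEC value of 0.17 kGy implies roughly a 5.9-log reduction.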
Li, Chao; Yu, Jiaquan; Schehr, Jennifer; Berry, Scott M; Leal, Ticiana A; Lang, Joshua M; Beebe, David J
2018-05-23
The concept of high liquid repellency in multi-liquid-phase systems (e.g., aqueous droplets in an oil background) has been applied to areas of biomedical research to realize intrinsic advantages not available in single-liquid-phase systems. Such advantages have included minimizing analyte loss, facile manipulation of single-cell samples, elimination of biofouling, and ease of use regarding loading and retrieving of the sample. In this paper, we present generalized design rules for predicting the wettability of solid-liquid-liquid systems (especially for discrimination between exclusive liquid repellency (ELR) and finite liquid repellency) to extend the applications of ELR. We then apply ELR to two model systems with open microfluidic design in cell biology: (1) in situ underoil culture and combinatorial coculture of mammalian cells in order to demonstrate directed single-cell multiencapsulation with minimal waste of samples as compared to stochastic cell seeding and (2) isolation of a pure population of circulating tumor cells, which is required for certain downstream analyses including sequencing and gene expression profiling.
Characterization of Lithium Ion Battery Materials with Valence Electron Energy-Loss Spectroscopy.
Castro, Fernando C; Dravid, Vinayak P
2018-06-01
Cutting-edge research on materials for lithium ion batteries regularly focuses on nanoscale and atomic-scale phenomena. Electron energy-loss spectroscopy (EELS) is one of the most powerful ways of characterizing composition and aspects of the electronic structure of battery materials, particularly lithium and the transition metal mixed oxides found in the electrodes. However, the characteristic EELS signal from battery materials is challenging to analyze when there is strong overlap of spectral features, poor signal-to-background ratios, or thicker and uneven sample areas. A potential alternative or complementary approach comes from utilizing the valence EELS features (<20 eV loss) of battery materials. For example, the valence EELS features in LiCoO2 maintain higher jump ratios than the Li-K edge, most notably when spectra are collected with minimal acquisition times or from thick sample regions. EELS maps of these valence features give comparable results to the Li-K edge EELS maps of LiCoO2. With some spectral processing, the valence EELS maps more accurately highlight the morphology and distribution of LiCoO2 than the Li-K edge maps, especially in thicker sample regions. This approach is beneficial for cases where sample thickness or beam sensitivity limit EELS analysis, and could be used to minimize electron dosage and sample damage or contamination.
A decomposition approach to the design of a multiferroic memory bit
NASA Astrophysics Data System (ADS)
Acevedo, Ruben; Liang, Cheng-Yen; Carman, Gregory P.; Sepulveda, Abdon E.
2017-06-01
The objective of this paper is to present a methodology for the design of a memory bit that minimizes the energy required to write data at the bit level. Straining a ferromagnetic nickel nano-dot by means of a piezoelectric substrate rotates its magnetization vector between two stable states, defined as 1 and 0 for digital memory. The memory bit geometry, actuation mechanism, and voltage control law were used as design variables. The approach was to decompose the overall design process into simpler sub-problems whose structure can be exploited for a more efficient solution. This method minimizes the number of fully dynamic coupled finite element analyses required to converge to a near-optimal design, thus decreasing the computational time for the design process. An in-plane sample design problem is presented to illustrate the advantages and flexibility of the procedure.
On-line IR analyzer system to monitor cephamycin C loading on ion-exchange resin
NASA Astrophysics Data System (ADS)
Shank, Sheldon; Russ, Warren; Gravatt, Douglas; Lee, Wesley; Donahue, Steven M.
1992-08-01
An on-line infrared analyzer is being developed for monitoring cephamycin C loading on ion exchange resin. Accurate measurement of product loading offers productivity improvements with direct savings from product loss avoidance, minimized raw material cost, and reduced off-line laboratory testing. Ultrafiltered fermentation broth is fed onto ion exchange columns under conditions which adsorb the product, cephamycin C, to the resin while allowing impurities to pass unretained. Product loading is stopped when the on-line analyzer determines that resin capacity for adsorbing product is nearly exhausted. Infrared spectroscopy has been shown capable of quantifying cephamycin C in the process matrix at concentrations that support process control decisions. Process-to-analyzer interface challenges have been resolved, including sample conditioning requirements. Analyzer requirements have been defined. The sample conditioning station is under design.
Beattie, A J; Oliver, I
1994-12-01
Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring, and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms.
Perceptual Color Characterization of Cameras
Vazquez-Corral, Javier; Connah, David; Bertalmío, Marcelo
2014-01-01
Color camera characterization, mapping outputs from the camera sensors to an independent color space such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel-based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatially based cases, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error, and 13% for the CID error measures. PMID:25490586
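The baseline least-squares characterization that the paper improves upon can be sketched in a few lines. The 3 × 3 matrix, 24-patch chart, and noise level below are hypothetical illustration, not the paper's data; the point is only the shape of the fit, min_M ||RGB·M − XYZ||.

```python
import numpy as np

# Fit the 3x3 matrix M mapping camera RGB to XYZ by ordinary least squares,
# the baseline the perceptual (spherical-sampling) method starts from.

rng = np.random.default_rng(0)
M_true = np.array([[0.41, 0.36, 0.18],     # hypothetical ground-truth mapping
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
rgb = rng.random((24, 3))                  # e.g. responses to a 24-patch chart
xyz = rgb @ M_true + rng.normal(0, 1e-3, (24, 3))  # measured XYZ, slight noise

# Solve min_M ||rgb @ M - xyz||_F for the 3x3 matrix M
M_fit, residuals, rank, _ = np.linalg.lstsq(rgb, xyz, rcond=None)
print(np.round(M_fit, 2))
```

The perceptual variant in the paper replaces this closed-form Frobenius-norm objective with ΔE, S-CIELAB, or CID error, which requires a search (their spherical sampling) rather than `lstsq`.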
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in biomonitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569
Perumal, G; Ayyagari, A; Chakrabarti, A; Kannan, D; Pati, S; Grewal, H S; Mukherjee, S; Singh, S; Arora, H S
2017-10-25
Substrate-cell interactions for a bioimplant are driven by the substrate's surface characteristics. In addition, the performance of an implant and its resistance to degradation are primarily governed by its surface properties. A bioimplant typically degrades by wear and corrosion in the physiological environment, resulting in metallosis. Surface engineering strategies for limiting degradation of implants and enhancing their performance may reduce or eliminate the need for implant removal surgeries and the associated cost. In the current study, we tailored the surface properties of stainless steel using submerged friction stir processing (FSP), a severe plastic deformation technique. FSP resulted in significant microstructural refinement, from a 22 μm grain size for the as-received alloy to a 0.8 μm grain size for the processed sample, with an increase in hardness by nearly 1.5 times. The wear and corrosion behavior of the processed alloy was evaluated in simulated body fluid. The processed sample demonstrated remarkable improvement in both wear and corrosion resistance, which is explained by surface strengthening and the formation of a highly stable passive layer. The methylthiazol tetrazolium assay demonstrated that the processed sample better supports cell attachment and proliferation, with minimal toxicity and hemolysis. The athrombogenic characteristics of the as-received and processed samples were evaluated by fibrinogen adsorption and platelet adhesion via the enzyme-linked immunosorbent assay and lactate dehydrogenase assay, respectively. The processed sample showed less platelet and fibrinogen adhesion compared with the as-received alloy, signifying its high thromboresistance. The current study suggests that friction stir processing is a versatile toolbox for enhancing the performance and reliability of currently used bioimplant materials.
Wu, Dongrui; Lance, Brent J; Parsons, Thomas D
2013-01-01
Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
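The collaborative-filtering idea above can be sketched with a toy example. Everything here is an illustrative assumption rather than the paper's actual pipeline: two-dimensional features, a mean-squared-difference-of-means similarity heuristic, and plain k-NN; the point is that auxiliary samples from the most similar other subject augment a small user-specific training set.

```python
import numpy as np

# Transfer-learning sketch: pick the auxiliary subject whose data most resembles
# the target user's (smaller mean squared difference of per-feature means),
# pool that subject's samples with the user's few samples, then classify by k-NN.

def subject_similarity(a, b):
    # smaller value => more similar feature distributions (crude heuristic)
    return np.mean((a.mean(axis=0) - b.mean(axis=0)) ** 2)

def knn_predict(X_train, y_train, x, k=3):
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

rng = np.random.default_rng(1)
# Target user: only 4 labelled samples (2 per class)
X_user = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y_user = np.array([0, 0, 1, 1])
# Two auxiliary subjects; subject A's distribution resembles the user's
X_a = rng.normal([[0.1, 0.1]] * 10 + [[1.0, 1.0]] * 10, 0.1)
y_a = np.array([0] * 10 + [1] * 10)
X_b = rng.normal([[3.0, 3.0]] * 20, 0.1)   # dissimilar subject
y_b = np.array([0] * 20)

# Select the most similar auxiliary subject and pool its data with the user's
aux = min([(X_a, y_a), (X_b, y_b)], key=lambda s: subject_similarity(X_user, s[0]))
X_train = np.vstack([X_user, aux[0]])
y_train = np.concatenate([y_user, aux[1]])

pred = knn_predict(X_train, y_train, np.array([0.95, 1.0]))
print(pred)
```

The paper's full approach additionally uses active class selection to decide which classes the user should generate new training samples for, which this sketch omits.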
Wu, Dongrui; Lance, Brent J.; Parsons, Thomas D.
2013-01-01
Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing. PMID:23437188
Bierhals, Vânia S; Chiumarelli, Marcela; Hubinger, Miriam D
2011-01-01
This research studied the influence of dipping in ascorbic acid, citric acid, and calcium lactate solutions and of cassava starch edible coatings on quality parameters and shelf life of fresh-cut pineapple slices over 12 d at 5 °C. Based on preliminary tests, the treatments selected for this study were samples dipped into an antibrowning solution with 0.5% ascorbic acid and 1% citric acid, with and without 2% calcium lactate, and coated with 2% cassava starch suspensions. Changes in weight loss, juice leakage, mechanical properties (stress at failure), color parameters (L* and H*), ascorbic acid content, sensory acceptance, and microbial growth were evaluated. Samples treated only with antibrowning agents were used as control. Edible coatings with and without calcium lactate were efficient in reducing weight loss and juice leakage and in maintaining firmness during storage. However, these samples showed more browning, and their ascorbic acid content was reduced. All treatments presented good sensory acceptance (scores above 6). The determining factor for the shelf life of pineapple slices was microbial spoilage. A shelf life of 8 d was obtained for pineapple slices treated only with antibrowning agents. On the other hand, coated samples showed a reduced shelf life of 7 d and higher yeast and mold growth. Thus, although cassava starch coatings were efficient in reducing respiration rate, weight loss, and juice leakage and maintained mechanical properties, these treatments were not able to increase the shelf life of minimally processed pineapple. Practical Application: Pineapple fruit is highly appreciated for its aroma, flavor, and juiciness, but its immediate consumption is difficult. Therefore, pineapple is a potential fruit for minimal processing. However, the shelf life of fresh-cut pineapple is very limited by changes in color, texture, appearance, off-flavors, and microbial growth.
The use of edible coatings as gas and water vapor barrier and antibrowning agents can extend the storage time and maintain the quality of fresh-cut produce. Cassava starch and alginate coatings are alternative to preserve minimally processed pineapples without changing the quality parameters of fresh fruit. Thus, this study is useful for consumers and fresh-cut industry interested in knowing factors affecting shelf life and quality of fresh-cut pineapple.
ERIC Educational Resources Information Center
Gül, Gülnihal; Eren, Bilgehan
2018-01-01
Many reasons such as social exclusion, economic insufficiency, and prejudice make it difficult for Gypsy children to reach qualified education and cause their expectations for the future to be minimized. Yet, it is considered that the property of "inclination to music," linked especially with Gypsies, will positively affect the…
Integrated system for gathering, processing, and reporting data relating to site contamination
Long, D.D.; Goldberg, M.S.; Baker, L.A.
1997-11-11
An integrated screening system comprises an intrusive sampling subsystem, a field mobile laboratory subsystem, a computer assisted design/geographical information subsystem, and a telecommunication linkup subsystem, all integrated to provide synergistically improved data relating to the extent of site soil/groundwater contamination. According to the present invention, data samples related to the soil, groundwater, or other contamination of the subsurface material are gathered and analyzed to measure contaminants. Based on the location of origin of the samples in three-dimensional space, the analyzed data are transmitted to a location display. The data from analyzing the samples and the data from locating their origin are managed to project the next probable sample location. The next probable sample location is then forwarded for use as a guide in the placement of the ensuing sample, whereby the number of samples needed to accurately characterize the site is minimized. 10 figs.
Integrated system for gathering, processing, and reporting data relating to site contamination
Long, Delmar D.; Goldberg, Mitchell S.; Baker, Lorie A.
1997-01-01
An integrated screening system comprises an intrusive sampling subsystem, a field mobile laboratory subsystem, a computer assisted design/geographical information subsystem, and a telecommunication linkup subsystem, all integrated to provide synergistically improved data relating to the extent of site soil/groundwater contamination. According to the present invention, data samples related to the soil, groundwater, or other contamination of the subsurface material are gathered and analyzed to measure contaminants. Based on the location of origin of the samples in three-dimensional space, the analyzed data are transmitted to a location display. The data from analyzing the samples and the data from locating their origin are managed to project the next probable sample location. The next probable sample location is then forwarded for use as a guide in the placement of the ensuing sample, whereby the number of samples needed to accurately characterize the site is minimized.
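The patent abstract does not spell out how the next probable sample location is projected, so the following is only one plausible minimal sketch under stated assumptions: inverse-distance-weighted (IDW) interpolation of contaminant levels, and a greedy rule that places the next boring at the candidate grid point farthest from any existing sample (i.e., where the interpolation is least constrained).

```python
import numpy as np

# Hypothetical next-sample-location projection for site characterization:
# estimate contamination by IDW, and propose the candidate point farthest
# from all existing samples as the next boring location.

def idw_estimate(xy, samples, values, p=2, eps=1e-9):
    d = np.linalg.norm(samples - xy, axis=1) + eps
    w = 1.0 / d ** p                       # inverse-distance weights
    return float(np.sum(w * values) / np.sum(w))

def next_location(candidates, samples):
    # min distance from each candidate to its nearest existing sample
    d = np.min(np.linalg.norm(candidates[:, None, :] - samples[None, :, :],
                              axis=2), axis=1)
    return candidates[int(np.argmax(d))]   # least-constrained point

samples = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # sampled borings (x, y)
values = np.array([5.0, 2.0, 3.0])                        # measured contaminant levels
xs, ys = np.meshgrid(np.linspace(0, 1, 11), np.linspace(0, 1, 11))
candidates = np.column_stack([xs.ravel(), ys.ravel()])    # candidate grid

nxt = next_location(candidates, samples)
print(nxt, round(idw_estimate(nxt, samples, values), 2))
```

A production system would more likely use a geostatistical model (e.g., kriging variance) than this distance heuristic, but the greedy loop structure is the same.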
Experiment kits for processing biological samples inflight on SLS-2
NASA Technical Reports Server (NTRS)
Savage, P. D.; Hinds, W. E.; Jaquez, R.; Evans, J.; Dubrovin, L.
1995-01-01
This paper describes development of an innovative, modular approach to packaging the instruments used to obtain and preserve the inflight rodent tissue and blood samples associated with hematology experiments on the Spacelab Life Sciences-2 (SLS-2) mission. The design approach organized the multitude of instruments into twelve 5- x 6- x 1-in. kits which were each used for a particular experiment. Each kit contained the syringes, vials, microscope slides, etc., necessary for processing and storing blood and tissue samples for one rat on a particular day. A total of 1245 components, packaged into 128 kits and stowed in 17 Zero® boxes, were required. Crewmembers found the design easy to use and laid out in a logical, simple configuration which minimized chances for error during the complex procedures in flight. This paper also summarizes inflight performance of the kits on SLS-2.
Bogachev, Igor; Yudin, Artem; Grigoryev, Evgeniy; Chernov, Ivan; Staltsov, Maxim; Khasanov, Oleg; Olevsky, Eugene
2015-11-02
Refractory oxide dispersion strengthened 13Cr-2Mo steel powder was successfully consolidated to near theoretical density using high voltage electric discharge compaction. Cylindrical samples with relative density from 90% to 97% and dimensions of 10 mm in diameter and 10-15 mm in height were obtained. Consolidation conditions such as pressure and voltage were varied in some ranges to determine the optimal compaction regime. Three different concentrations of yttria were used to identify its effect on the properties of the samples. It is shown that the utilized ultra-rapid consolidation process in combination with high transmitted energy allows obtaining high density compacts, retaining the initial structure with minimal grain growth. The experimental results indicate some heterogeneity of the structure which may occur in the external layers of the tested samples due to various thermal and electromagnetic in-processing effects. The choice of the optimal parameters of the consolidation enables obtaining samples of acceptable quality.
Materials science research in microgravity
NASA Technical Reports Server (NTRS)
Perepezko, John H.
1992-01-01
There are several important attributes of an extended duration microgravity environment that offer a new dimension in the control of the microstructure, processing, and properties of materials. First, when gravitational effects are minimized, buoyancy driven convection flows are also minimized. The flows due to density differences, brought about either by composition or temperature gradients will then be reduced or eliminated to permit a more precise control of the temperature and the composition of a melt which is critical in achieving high quality crystal growth of electronic materials or alloy structures. Secondly, body force effects such as sedimentation, hydrostatic pressure, and deformation are similarly reduced. These effects may interfere with attempts to produce uniformly dispersed or aligned second phases during melt solidification. Thirdly, operating in a microgravity environment will facilitate the containerless processing of melts to eliminate the limitations of containment for reactive melts. The noncontacting forces such as those developed from electromagnet, electrostatic, or acoustic fields can be used to position samples. With this mode of operation, contamination can be minimized to enable the study of reactive melts and to eliminate extraneous crystal nucleation so that novel crystalline structures and new glass compositions may be produced. In order to take advantage of the microgravity environment for materials research, it has become clear that reliable processing models based on a sound ground based experimental experience and an established thermophysical property data base are essential.
Advanced overlay: sampling and modeling for optimized run-to-run control
NASA Astrophysics Data System (ADS)
Subramany, Lokesh; Chung, WoongJae; Samudrala, Pavan; Gao, Haiyong; Aung, Nyan; Gomez, Juan Manuel; Gutjahr, Karsten; Park, DongSuk; Snow, Patrick; Garcia-Medina, Miguel; Yap, Lipkong; Demirer, Onur Nihat; Pierson, Bill; Robinson, John C.
2016-03-01
In recent years overlay (OVL) control schemes have become more complicated in order to meet the ever shrinking margins of advanced technology nodes. As a result, this brings up new challenges to be addressed for effective run-to-run OVL control. This work addresses two of these challenges by new advanced analysis techniques: (1) sampling optimization for run-to-run control and (2) bias-variance tradeoff in modeling. The first challenge in a high order OVL control strategy is to optimize the number of measurements and the locations on the wafer, so that the "sample plan" of measurements provides high quality information about the OVL signature on the wafer with acceptable metrology throughput. We solve this tradeoff between accuracy and throughput by using a smart sampling scheme which utilizes various design-based and data-based metrics to increase model accuracy and reduce model uncertainty while avoiding wafer-to-wafer and within-wafer measurement noise caused by metrology, scanner or process. This sort of sampling scheme, combined with an advanced field-by-field extrapolated modeling algorithm, helps to maximize model stability and minimize on-product overlay (OPO). Second, the use of higher order overlay models means more degrees of freedom, which enables increased capability to correct for complicated overlay signatures, but also increases sensitivity to process or metrology induced noise. This is also known as the bias-variance tradeoff. A high order model that minimizes the bias between the modeled and raw overlay signature on a single wafer will also have a higher variation from wafer to wafer or lot to lot, unless an advanced modeling approach is used. In this paper, we characterize the bias-variance tradeoff to find the optimal scheme. The sampling and modeling solutions proposed in this study are validated by advanced process control (APC) simulations to estimate run-to-run performance, lot-to-lot and wafer-to-wafer model term monitoring to estimate stability, and ultimately high volume manufacturing tests to monitor OPO using densely measured OVL data.
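The bias-variance tradeoff described above can be illustrated with a small simulation. This is an illustrative sketch only: the one-dimensional "signature", noise level, and polynomial orders below are hypothetical stand-ins, not the authors' wafer data or modeling algorithm. A low-order model underfits the signature (high bias) but is stable from wafer to wafer, while a high-order model tracks the signature closely but amplifies metrology noise (high variance).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D stand-in for a wafer overlay signature:
# a smooth low-order "true" curve plus metrology noise.
x = np.linspace(-1.0, 1.0, 25)       # measurement locations across the wafer
true_sig = 0.8 * x - 0.5 * x**3      # underlying overlay signature (arb. units)

def fit_stats(order, n_wafers=200, noise=0.2):
    """Mean squared bias of the fitted signature and mean variance of the
    fit across simulated wafers, for a polynomial model of a given order."""
    fits = []
    for _ in range(n_wafers):
        y = true_sig + rng.normal(0.0, noise, x.size)  # one noisy wafer
        c = np.polyfit(x, y, order)
        fits.append(np.polyval(c, x))
    fits = np.array(fits)
    bias2 = np.mean((fits.mean(axis=0) - true_sig) ** 2)
    var = np.var(fits, axis=0).mean()
    return bias2, var

b1, v1 = fit_stats(order=1)   # low-order model: biased but stable
b9, v9 = fit_stats(order=9)   # high-order model: unbiased but noisy
print(b1 > b9, v9 > v1)       # bias falls, variance rises with model order
```

The advanced modeling approaches mentioned in the abstract aim to break this tradeoff, i.e., to keep the high-order model's low bias without paying the full variance penalty.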
Roche, Anne I; Kroska, Emily B; Miller, Michelle L; Kroska, Sydney K; O'Hara, Michael W
2018-03-22
Childhood trauma is associated with a variety of risky, unhealthy, or problem behaviors. The current study aimed to explore experiential avoidance and mindfulness processes as mechanisms through which childhood trauma and problem behavior are linked in a college sample. The sample consisted of college-aged young adults recruited November-December, 2016 (N = 414). Participants completed self-report measures of childhood trauma, current problem behavior, experiential avoidance, and mindfulness processes. Bootstrapped mediation analyses examined the mechanistic associations of interest. Mediation analyses indicated that experiential avoidance was a significant mediator of the association between childhood trauma and problem behavior. Additionally, multiple mediation analyses indicated that specific mindfulness facets-act with awareness and nonjudgment of inner experience-significantly mediated the same association. Interventions for college students who have experienced childhood trauma might profitably target mechanisms such as avoidance and mindfulness in order to minimize engagement in problem behavior.
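The bootstrapped mediation analysis referenced above follows the standard product-of-coefficients logic, which can be sketched as follows. The data below are simulated with assumed effect sizes, not the study's measurements: estimate the X→M path (a) and the M→Y path controlling for X (b), then bootstrap the indirect effect a·b and check whether its confidence interval excludes zero.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data in the spirit of the study (N = 414): childhood trauma (x),
# experiential avoidance (m), problem behavior (y). Effect sizes are assumed.
n = 414
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # a-path: X -> M
y = 0.4 * m + 0.1 * x + rng.normal(size=n)   # b-path plus a direct effect

def indirect(x, m, y):
    """Product-of-coefficients estimate a*b from two OLS regressions."""
    a = np.polyfit(x, m, 1)[0]                       # slope of M on X
    X = np.column_stack([np.ones(len(x)), x, m])     # Y on X and M jointly
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]      # coefficient of M
    return a * b

# Percentile bootstrap of the indirect effect
boots = []
for _ in range(2000):
    i = rng.integers(0, n, n)          # resample cases with replacement
    boots.append(indirect(x[i], m[i], y[i]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(0 < lo)  # CI excludes zero -> mediation supported in this simulation
```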
Comparative Testis Tissue Proteomics Using 2-Dye Versus 3-Dye DIGE Analysis.
Holland, Ashling
2018-01-01
Comparative tissue proteomics aims to analyze alterations of the proteome in response to a stimulus. Two-dimensional difference gel electrophoresis (2D-DIGE) is a modified and advanced form of 2D gel electrophoresis. DIGE is a powerful biochemical method that compares two or three protein samples on the same analytical gel, and can be used to establish differentially expressed protein levels between healthy normal and diseased pathological tissue sample groups. Minimal DIGE labeling can be used via a 2-dye system with Cy3 and Cy5 or a 3-dye system with Cy2, Cy3, and Cy5 to fluorescently label samples with CyDye fluors pre-electrophoresis. DIGE circumvents gel-to-gel variability by multiplexing samples to a single gel and through the use of a pooled internal standard for normalization. This form of quantitative high-resolution proteomics facilitates the comparative analysis and evaluation of tissue protein compositions. Comparing tissue groups under different conditions is crucially important for advancing the biomedical field by characterization of cellular processes, understanding pathophysiological development and tissue biomarker discovery. This chapter discusses 2D-DIGE as a comparative tissue proteomic technique and describes in detail the experimental steps required for comparative proteomic analysis employing both options of 2-dye and 3-dye DIGE minimal labeling.
Nonlinear vibrational microscopy
Holtom, Gary R.; Xie, Xiaoliang Sunney; Zumbusch, Andreas
2000-01-01
The present invention is a method and apparatus for microscopic vibrational imaging using coherent Anti-Stokes Raman Scattering or Sum Frequency Generation. Microscopic imaging with a vibrational spectroscopic contrast is achieved by generating signals in a nonlinear optical process and spatially resolved detection of the signals. The spatial resolution is attained by minimizing the spot size of the optical interrogation beams on the sample. Minimizing the spot size relies upon: (a) directing at least two substantially co-axial laser beams (interrogation beams) through a microscope objective, providing a focal spot on the sample; (b) collecting a signal beam together with a residual beam from the at least two co-axial laser beams after passing through the sample; (c) removing the residual beam; and (d) detecting the signal beam, thereby creating said pixel. The method has significantly higher spatial resolution than IR microscopy and higher sensitivity than spontaneous Raman microscopy with much lower average excitation powers. CARS and SFG microscopy do not rely on the presence of fluorophores, but retain the resolution and three-dimensional sectioning capability of confocal and two-photon fluorescence microscopy. Complementary to these techniques, CARS and SFG microscopy provide a contrast mechanism based on vibrational spectroscopy. This vibrational contrast mechanism, combined with an unprecedented high sensitivity at a tolerable laser power level, provides a new approach for microscopic investigations of chemical and biological samples.
Oxidation resistant coatings for ceramic matrix composite components
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaubert, V.M.; Stinton, D.P.; Hirschfeld, D.A.
Corrosion resistant Ca0.6Mg0.4Zr4(PO4)6 (CMZP) and Ca0.5Sr0.5Zr4(PO4)6 (CS-50) coatings for fiber-reinforced SiC-matrix composite heat exchanger tubes have been developed. Aqueous slurries of both oxides were prepared with high solids loading. One coating process consisted of dipping the samples in a slip. A tape casting process has also been created that produced relatively thin and dense coatings covering a large area. A processing technique was developed, utilizing a pre-sintering step, which produced coatings with minimal cracking.
Weber, Michael; Mickoleit, Michaela; Huisken, Jan
2014-01-01
This chapter introduces the concept of light sheet microscopy along with practical advice on how to design and build such an instrument. Selective plane illumination microscopy is presented as an alternative to confocal microscopy due to several superior features such as high-speed full-frame acquisition, minimal phototoxicity, and multiview sample rotation. Based on our experience over the last 10 years, we summarize the key concepts in light sheet microscopy, typical implementations, and successful applications. In particular, sample mounting for long time-lapse imaging and the resulting challenges in data processing are discussed in detail.
NASA Astrophysics Data System (ADS)
Martins, Cecília Geraldes; Aragon-Alegro, Lina Casale; Behrens, Jorge Herman; Oliveira Souza, Kátia Leani; Martins Vizeu, Dirceu; Hutzler, Beatriz Weltman; Teresa Destro, Maria; Landgraf, Mariza
2008-06-01
This study aimed at evaluating the acceptance of MP watermelon and pineapple exposed to 1.0 and 2.5 kGy compared to non-irradiated samples. No significant differences were observed in liking between irradiated and non-irradiated samples, and also between doses of 1.0 and 2.5 kGy. Significant differences in sourness (pineapple) or sweetness (watermelon) and between intention of purchase of irradiated and non-irradiated fruits were not observed as well. Results showed that MP watermelon and pineapple could be irradiated with doses up to 2.5 kGy without significant changes in acceptability.
Du, Bowen; Lofton, Jonathan M; Peter, Katherine T; Gipe, Alexander D; James, C Andrew; McIntyre, Jenifer K; Scholz, Nathaniel L; Baker, Joel E; Kolodziej, Edward P
2017-09-20
Untreated urban stormwater runoff contributes to poor water quality in receiving waters. The ability to identify toxicants and other bioactive molecules responsible for observed adverse effects in a complex mixture of contaminants is critical to effective protection of ecosystem and human health, yet this is a challenging analytical task. The objective of this study was to develop analytical methods using liquid chromatography coupled to high-resolution quadrupole time-of-flight mass spectrometry (LC-QTOF-MS) to detect organic contaminants in highway runoff and in runoff-exposed fish (adult coho salmon, Oncorhynchus kisutch). Processing of paired water and tissue samples facilitated contaminant prioritization and aided investigation of chemical bioavailability and uptake processes. Simple, minimal-effort solid-phase extraction (SPE) and elution procedures were optimized for water samples, and selective pressurized liquid extraction (SPLE) procedures were optimized for fish tissues. Extraction methods were compared by detection of non-target features and target compounds (e.g., quantity and peak area), while minimizing matrix interferences. Suspect screening techniques utilized in-house and commercial databases to prioritize high-risk detections for subsequent MS/MS characterization and identification efforts. Presumptive annotations were also screened with an in-house linear regression (log Kow vs. retention time) to exclude isobaric compounds. Examples of confirmed identifications (via reference standard comparison) in highway runoff include ethoprophos, prometon, DEET, caffeine, cotinine, 4(or 5)-methyl-1H-methylbenzotriazole, and acetanilide. Acetanilide was also detected in runoff-exposed fish gill and liver samples.
Further characterization of highway runoff and fish tissues (14 and 19 compounds, respectively with tentative identification by MS/MS data) suggests that many novel or poorly characterized organic contaminants exist in urban stormwater runoff and exposed biota.
Design criteria for developing low-resource magnetic bead assays using surface tension valves
Adams, Nicholas M.; Creecy, Amy E.; Majors, Catherine E.; Wariso, Bathsheba A.; Short, Philip A.; Wright, David W.; Haselton, Frederick R.
2013-01-01
Many assays for biological sample processing and diagnostics are not suitable for use in settings that lack laboratory resources. We have recently described a simple, self-contained format based on magnetic beads for extracting infectious disease biomarkers from complex biological samples, which significantly reduces the time, expertise, and infrastructure required. This self-contained format has the potential to facilitate the application of other laboratory-based sample processing assays in low-resource settings. The technology is enabled by immiscible fluid barriers, or surface tension valves, which stably separate adjacent processing solutions within millimeter-diameter tubing and simultaneously permit the transit of magnetic beads across the interfaces. In this report, we identify the physical parameters of the materials that maximize fluid stability and bead transport and minimize solution carryover. We found that fluid stability is maximized with ≤0.8 mm i.d. tubing, valve fluids of similar density to the adjacent solutions, and tubing with ≤20 dyn/cm surface energy. Maximizing bead transport was achieved using ≥2.4 mm i.d. tubing, mineral oil valve fluid, and a mass of 1-3 mg beads. The amount of solution carryover across a surface tension valve was minimized using ≤0.2 mg of beads, tubing with ≤20 dyn/cm surface energy, and air separators. The most favorable parameter space for valve stability and bead transport was identified by combining our experimental results into a single plot using two dimensionless numbers. A strategy is presented for developing additional self-contained assays based on magnetic beads and surface tension valves for low-resource diagnostic applications. PMID:24403996
Russo, Pasquale; Botticella, Giuseppe; Capozzi, Vittorio; Massa, Salvatore; Spano, Giuseppe; Beneduce, Luciano
2014-01-01
In the present work we developed a MPN quantitative real-time PCR (MPN-qPCR) method for a fast and reliable detection and quantification of Listeria monocytogenes and Escherichia coli O157:H7 in minimally processed vegetables. In order to validate the proposed technique, the results were compared with conventional MPN followed by phenotypic and biochemical assay methods. When L. monocytogenes and E. coli O157:H7 were artificially inoculated in fresh-cut vegetables, a concentration as low as 1 CFU g−1 could be detected in 48 hours for both pathogens. qPCR alone allowed a limit of detection of 10 CFU g−1 after 2 hours of enrichment for L. monocytogenes and E. coli O157:H7. Since minimally processed ready-to-eat vegetables are characterized by very short shelf life, our method can potentially address the consistent reduction of time for microbial analysis, allowing a better management of quality control. Moreover, the occurrences of both pathogenic bacteria in mixed salad samples and fresh-cut melons were monitored in two production plants from the receipt of the raw materials to the early stages of shelf life. No sample was found to be contaminated by L. monocytogenes. One sample of raw mixed salad was found positive for an H7 enterohemorrhagic serotype. PMID:24949460
Holmberg, Rebecca C; Gindlesperger, Alissa; Stokes, Tinsley; Brady, Dane; Thakore, Nitu; Belgrader, Philip; Cooney, Christopher G; Chandler, Darrell P
2013-06-11
TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols using an Eppendorf epMotion 5070, Hamilton STAR and STARplus liquid handling robots, including RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma (respectively).
Schulze, H Georg; Turner, Robin F B
2014-01-01
Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., the standard deviation of the spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes, resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
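The coincident-second-derivative idea at the core of the algorithm can be sketched as follows. This is a simplified illustration on synthetic data with an assumed fixed threshold; the published method derives its parameters automatically. A cosmic-ray spike produces a large second difference along both the spectral axis and the spatiotemporal axis simultaneously, whereas genuine Raman bands and noise rarely do at the same point.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 2-D Raman dataset: rows = spectra acquired over time,
# columns = wavenumber channels. A smooth band + noise + two injected spikes.
spectra = np.tile(np.exp(-0.5 * ((np.arange(200) - 100) / 10) ** 2) * 100,
                  (50, 1)) + rng.normal(0, 1, (50, 200))
clean = spectra.copy()
spectra[10, 40] += 300.0    # inject cosmic-ray spikes
spectra[33, 150] += 500.0

def despike(data, k=8.0, sigma=1.0):
    """Flag points whose second difference is strongly negative along BOTH
    the spectral and the spatiotemporal axis, then replace them with the
    median of their spectral neighbours. k is an illustrative threshold in
    units of the noise standard deviation sigma (assumed known here)."""
    d2_spec = np.zeros_like(data)
    d2_time = np.zeros_like(data)
    d2_spec[:, 1:-1] = data[:, :-2] - 2 * data[:, 1:-1] + data[:, 2:]
    d2_time[1:-1, :] = data[:-2, :] - 2 * data[1:-1, :] + data[2:, :]
    # a positive spike makes the second difference strongly negative at its centre
    mask = (d2_spec < -k * sigma) & (d2_time < -k * sigma)
    out = data.copy()
    for r, c in zip(*np.nonzero(mask)):
        lo, hi = max(c - 3, 0), min(c + 4, data.shape[1])
        out[r, c] = np.median(np.delete(data[r, lo:hi], c - lo))
    return out, mask

fixed, mask = despike(spectra)
print(mask[10, 40], mask[33, 150])   # both injected spikes are flagged
```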
Carlson, Jules C; Challis, Jonathan K; Hanson, Mark L; Wong, Charles S
2013-02-01
The stability of 24 chemicals, including pharmaceuticals and personal care products, and some agrochemicals on extraction media was evaluated by preloading them onto Oasis hydrophilic lipophilic balanced solid-phase extraction (SPE) cartridges and polar organic chemical integrative samplers (POCIS) followed by storage at -20°C over time. After 20 months, the average loss was 11% on POCIS, with only 2,4-dichlorophenoxyacetic acid, atrazine, chlorpyrifos, and gemfibrozil showing a statistically significant decline compared with initial concentrations. Losses on SPE cartridges were below 19%, with an average loss of 9%. In addition to laboratory spiked samples, multiple POCIS deployed in wastewater-impacted surface waters and SPE extracts of these waters were stored in their original coextracted matrix for nearly two years with minimal observed losses. Errors from typical sampling, handling, and concentration estimates from POCIS sampling rates were typically ± 15 to 30% relative standard deviation, so observed storage losses are minimal for most POCIS applications. While losses during storage on SPE cartridges for 20 months were small but statistically significant for many compounds, addition of labeled internal standards prior to freezing should correct for such losses. Thus, storage of processed water samples for analysis of polar organic pollutants is viable for archival purposes or studies for which samples cannot be analyzed in the short term.
An Analysis of U.S. Army Health Hazard Assessments During the Acquisition of Military Materiel
2010-06-03
protective equipment (PPE) (Milz, Conrad, & Soule, 2003). Engineering controls can eliminate hazards through system design, substitution of hazardous...Milz, Conrad, & Soule, 2003). Engineering control measures can serve to minimize hazards where they cannot be eliminated, with preference for...during the materiel acquisition process, and (c) will evaluate a sample of the database for accuracy by comparing the data entries to original reports
Minimal-post-processing 320-Gbps true random bit generation using physical white chaos.
Wang, Anbang; Wang, Longsheng; Li, Pu; Wang, Yuncai
2017-02-20
Chaotic external-cavity semiconductor lasers (ECLs) are a promising entropy source for generation of high-speed physical random bits or digital keys. The rate and randomness are unfortunately limited by laser relaxation oscillation and external-cavity resonance, and are usually improved by complicated post-processing. Here, we propose using a physical broadband white chaos, generated by optical heterodyning of two ECLs, as the entropy source to construct high-speed random bit generation (RBG) with minimal post-processing. The optical heterodyne chaos not only has a white spectrum without signatures of relaxation oscillation and external-cavity resonance but also has a symmetric amplitude distribution. Thus, after quantization with a multi-bit analog-to-digital converter (ADC), random bits can be obtained by extracting several least significant bits (LSBs) without any other processing. In experiments, a white chaos with a 3-dB bandwidth of 16.7 GHz is generated. Its entropy rate is estimated as 16 Gbps by single-bit quantization, which means a spectrum efficiency of 96%. With quantization using an 8-bit ADC, 320-Gbps physical RBG is achieved by directly extracting 4 LSBs at an 80-GHz sampling rate.
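The final quantization step, keeping 4 LSBs of each 8-bit sample so that an 80-GHz sampling rate yields 4 × 80 = 320 Gbps, can be sketched as follows. The input here is random stand-in data, not the experimental chaos waveform:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for the digitized chaos: each sample is an 8-bit ADC code.
# Keeping the 4 least significant bits of every sample gives 4 bits/sample,
# i.e. 320 Gbps at an 80-GHz sampling rate.
adc_codes = rng.integers(0, 256, size=100_000, dtype=np.uint8)

def extract_lsbs(codes, m=4):
    """Keep the m least significant bits of each ADC code and unpack them
    into a bit stream (LSB-first within each sample)."""
    kept = codes & ((1 << m) - 1)                # mask off the upper bits
    bits = (kept[:, None] >> np.arange(m)) & 1   # unpack m bits per sample
    return bits.ravel()

bits = extract_lsbs(adc_codes)
print(len(bits), bits.mean())  # 4 bits per sample; mean near 0.5 if unbiased
```

The LSBs are the bits most sensitive to the analog amplitude, which is why a symmetric, signature-free amplitude distribution lets them pass randomness tests without further whitening.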
NASA Astrophysics Data System (ADS)
Loch-Olszewska, Hanna; Szwabiński, Janusz
2018-05-01
The ergodicity breaking phenomenon has already been in the area of interest of many scientists, who tried to uncover its biological and chemical origins. Unfortunately, testing ergodicity in real-life data can be challenging, as sample paths are often too short for approximating their asymptotic behaviour. In this paper, the authors analyze the minimal lengths of empirical trajectories needed for claiming the ε-ergodicity based on two commonly used variants of an autoregressive fractionally integrated moving average model. The dependence of the dynamical functional on the parameters of the process is studied. The problem of choosing proper ε for ε-ergodicity testing is discussed with respect to especially the variation of the innovation process and the data sample length, with a presentation on two real-life examples.
Evaluation of low-dose irradiation on microbiological quality of white carrots and string beans
NASA Astrophysics Data System (ADS)
Koike, Amanda C. R.; Santillo, Amanda G.; Rodrigues, Flávio T.; Duarte, Renato C.; Villavicencio, Anna Lucia C. H.
2012-08-01
Minimally processed foods provide the consumer with quality, safety and practicality. However, minimal processing of food does not reduce pathogenic populations of microorganisms to safe levels. Ionizing radiation used in low doses is effective in maintaining the quality of food, reducing the microbiological load without compromising the nutritional value and sensory properties. The association of minimal processing with irradiation could improve the quality and safety of the product. The purpose of this study was to evaluate the effectiveness of low doses of ionizing radiation on the reduction of microorganisms in minimally processed foods. The results show that ionizing radiation could decontaminate minimally processed vegetables without severe changes in their properties.
Low-Power SOI CMOS Transceiver
NASA Technical Reports Server (NTRS)
Fujikawa, Gene (Technical Monitor); Cheruiyot, K.; Cothern, J.; Huang, D.; Singh, S.; Zencir, E.; Dogan, N.
2003-01-01
The work aims at developing a low-power Silicon on Insulator Complementary Metal Oxide Semiconductor (SOI CMOS) Transceiver for deep-space communications. The RF receiver must accomplish the following tasks: (a) select the desired radio channel and reject other radio signals, (b) amplify the desired radio signal and translate it back to baseband, and (c) detect and decode the information with low BER. In order to minimize cost and achieve a high level of integration, the receiver architecture should use the least number of external filters and passive components. It should also consume the least amount of power to minimize battery cost, size, and weight. One of the most stringent requirements for deep-space communication is low-power operation. Our study identified that the two candidate architectures listed in the following meet these requirements: (1) Low-IF receiver, (2) Sub-sampling receiver. The low-IF receiver uses a minimum number of external components. Compared to Zero-IF (direct conversion) architecture, it has less severe offset and flicker noise problems. The sub-sampling receiver amplifies the RF signal and samples it using a track-and-hold sub-sampling mixer. These architectures provide a low-power solution for the short-range communications missions on Mars. Accomplishments to date include: (1) System-level design and simulation of a Double-Differential PSK receiver, (2) Implementation of the Honeywell SOI CMOS process design kit (PDK) in Cadence design tools, (3) Design of test circuits to investigate relationships between layout techniques, geometry, and low-frequency noise in SOI CMOS, (4) Model development and verification of on-chip spiral inductors in the SOI CMOS process, (5) Design/implementation of a low-power low-noise amplifier (LNA) and mixer for the low-IF receiver, and (6) Design/implementation of a high-gain LNA for the sub-sampling receiver.
Our initial results show that substantial improvement in power consumption is achieved using SOI CMOS as compared to standard CMOS process. Potential advantages of SOI CMOS for deep-space communication electronics include: (1) Radiation hardness, (2) Low-power operation, and (3) System-on-Chip (SOC) solutions.
Raman spectroscopy as a PAT for pharmaceutical blending: Advantages and disadvantages.
Riolo, Daniela; Piazza, Alessandro; Cottini, Ciro; Serafini, Margherita; Lutero, Emilio; Cuoghi, Erika; Gasparini, Lorena; Botturi, Debora; Marino, Iari Gabriel; Aliatis, Irene; Bersani, Danilo; Lottici, Pier Paolo
2018-02-05
Raman spectroscopy has been positively evaluated as a tool for the in-line and real-time monitoring of powder blending processes and it has been proved to be effective in the determination of the endpoint of the mixing, showing its potential role as process analytical technology (PAT). The aim of this study is to show advantages and disadvantages of Raman spectroscopy with respect to the more traditional HPLC analysis. The spectroscopic results, obtained directly on raw powders sampled from a two-axis blender in real case conditions, were compared with the chromatographic data obtained on the same samples. The formulation blend used for the experiment consists of active pharmaceutical ingredient (API, concentrations 6.0% and 0.5%), lactose and magnesium stearate (as excipients). The first step of the monitoring process was selecting the appropriate wavenumber region where the Raman signal of the API is maximal and interference from the spectral features of the excipients is minimal. Blend profiles were created by plotting the area ratios of the Raman peak of the API (A_API) at 1598 cm−1 and the Raman bands of the excipients (A_EXC), in the spectral range between 1560 and 1630 cm−1, as a function of mixing time: the API content can be considered homogeneous when the time-dependent dispersion of the area ratio is minimized. In order to achieve a representative sampling with Raman spectroscopy, each sample was mapped in a motorized XY stage by a defocused laser beam of a micro-Raman apparatus. Good correlation between the two techniques has been found only for the composition at 6.0% (w/w). However, standard deviation analysis, applied to both HPLC and Raman data, showed that Raman results are more substantial than HPLC ones, since Raman spectroscopy enables generating data-rich blend profiles. In addition, the relative standard deviation calculated from a single map (30 points) turned out to be representative of the degree of homogeneity for that blend time.
Copyright © 2017 Elsevier B.V. All rights reserved.
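The homogeneity criterion in the abstract above (minimal dispersion of the A_API/A_EXC area ratio across a mapped sample) can be sketched numerically. This is an illustrative example with invented area ratios, not data from the study, and the function name is hypothetical.

```python
# Judging blend homogeneity from Raman area ratios A_API / A_EXC:
# the blend is considered homogeneous when the relative standard
# deviation (RSD) of the ratio across a 30-point map is small.
# All numbers below are invented for illustration.
from statistics import mean, stdev

def rsd(ratios):
    """Relative standard deviation (%) of the area ratios in one map."""
    return 100.0 * stdev(ratios) / mean(ratios)

# Hypothetical 30-point maps: early in blending (scattered) vs. near endpoint.
early_map = [0.9, 1.4, 0.6, 1.2, 1.1, 0.7] * 5
late_map = [1.00, 1.02, 0.98, 1.01, 0.99, 1.00] * 5

print(rsd(early_map), rsd(late_map))  # dispersion shrinks as mixing proceeds
```

A blend profile in this scheme is simply the RSD plotted against mixing time, with the endpoint declared once the curve flattens at a low value.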
NASA Astrophysics Data System (ADS)
Sydoff, Marie; Stenström, Kristina
2010-04-01
The Department of Physics at Lund University is participating in a European Union project called EUMAPP (European Union Microdose AMS Partnership Programme), in which sample preparation and accelerator mass spectrometry (AMS) measurements of biological samples from microdosing studies have been made. This paper describes a simplified method of converting biological samples to solid graphite for 14C analysis with AMS. The method is based on online combustion of the samples, and reduction of CO2 in septa-sealed vials. The septa-sealed vials and disposable materials are used to eliminate sample cross-contamination. Measurements of ANU and Ox I standards show deviations of 2% and 3%, respectively, relative to reference values. This level of accuracy is sufficient for biological samples from microdosing studies. Since the method has very few handling steps from sample to graphite, the risk of failure during the sample preparation process is minimized, making the method easy to use in routine preparation of samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
TERRI, FELLINGER
2004-12-21
The Defense Waste Processing Facility, DWPF, currently generates approximately 1.4 million gallons of recycle water per year during Sludge-Only operations. DWPF has minimized condensate generation to 1.4 million gallons by not operating the Steam Atomized Scrubbers, SASs, for the melter off-gas system. By not operating the SASs, DWPF has reduced the total volume by approximately 800,000 gallons of condensate per year. Currently, the recycle stream is sent back to the Tank Farm and processed through the 2H Evaporator system. To alleviate the load on the 2H Evaporator system, an acid evaporator design is being considered as an alternate processing and/or concentration method for the DWPF recycle stream. In order to support this alternate processing option, the DWPF has requested that the chemical and radionuclide compositions of the Off Gas Condensate Tank, OGCT, Slurry Mix Evaporator Condensate Tank, SMECT, Recycle Collection Tank, RCT, and the Decontamination Waste Treatment Tank, DWTT, be determined as part of the process development work for the acid evaporator design. Samples have been retrieved from the OGCT, RCT, and SMECT and have been sent to the Savannah River National Laboratory, SRNL, for this characterization. The DWTT samples have recently been shipped to SRNL. The results for the DWTT samples will be issued at a later date.
Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi
2016-12-15
Research studies on the effects of microlitter on marine biota have become more and more frequent in the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from air-transported fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps at high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices isolating the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology provide the benefit that it could be applied not only to laboratory but also to field or on-board work. Copyright © 2016 Elsevier Ltd. All rights reserved.
Preliminary Assessment/Site Inspection Work Plan for Granite Mountain Radio Relay System
1994-09-01
represent field conditions, and (3) sampling results are repeatable. 1.5.2 Sample Handling. Sample... procedures specified in Section 2.1.3. Samples collected from shallow depths will be obtained by submerging a stainless-steel, Teflon, or glass... submerged in a manner that minimizes agitation of sediment and the water sample. If a seep or spring has minimal discharge flow, gravel, boulders, and soil
Functionalization of SiO2 Surfaces for Si Monolayer Doping with Minimal Carbon Contamination.
van Druenen, Maart; Collins, Gillian; Glynn, Colm; O'Dwyer, Colm; Holmes, Justin D
2018-01-17
Monolayer doping (MLD) involves the functionalization of semiconductor surfaces followed by an annealing step to diffuse the dopant into the substrate. We report an alternative doping method, oxide-MLD, where ultrathin SiO2 overlayers are functionalized with phosphonic acids for doping Si. Similar peak carrier concentrations were achieved when compared with hydrosilylated surfaces (∼2 × 10^20 atoms/cm^3). Oxide-MLD offers several advantages over conventional MLD, such as ease of sample processing, superior ambient stability, and minimal carbon contamination. The incorporation of an oxide layer minimizes carbon contamination by facilitating attachment of carbon-free precursors or by impeding carbon diffusion. The oxide-MLD strategy allows selection of many inexpensive precursors and therefore allows application to both p- and n-type doping. The phosphonic acid-functionalized SiO2 surfaces were investigated using X-ray photoelectron spectroscopy and attenuated total reflectance Fourier transform infrared spectroscopy, whereas doping was assessed using electrochemical capacitance-voltage and Hall measurements.
Keaveney, Edel M; Price, Ruth K; Hamill, Lesley L; Wallace, Julie M W; McNulty, Helene; Ward, Mary; Strain, J J; Ueland, Per M; Molloy, Anne M; Piironen, Vieno; von Reding, Walter; Shewry, Peter R; Ward, Jane L; Welch, Robert W
2015-02-14
The bran and particularly the aleurone fraction of wheat are high in betaine and other physiological methyl donors, which may exert beneficial physiological effects. We conducted two randomised, controlled, cross-over postprandial studies to assess and compare plasma betaine and other methyl donor-related responses following the consumption of minimally processed bran and aleurone fractions (study A) and aleurone bread (study B). For both studies, standard pharmacokinetic parameters were derived for betaine, choline, folate, dimethylglycine (DMG), total homocysteine and methionine from plasma samples taken at 0, 0.5, 1, 2 and 3 h. In study A (n = 14), plasma betaine concentrations were significantly and substantially elevated from 0.5 to 3 h following the consumption of both bran and aleurone compared with the control; however, aleurone gave significantly higher responses than bran. Small, but significant, increases were also observed in DMG measures; however, no significant responses were observed in other analytes. In study B (n = 13), plasma betaine concentrations were significantly and substantially higher following consumption of the aleurone bread compared with the control bread; small, but significant, increases were also observed in DMG and folate measures in response to consumption of the aleurone bread; however, no significant responses were observed in other analytes. Peak plasma betaine concentrations, which were 1.7-1.8 times the baseline levels, were attained earlier following the consumption of minimally processed aleurone compared with the aleurone bread (time taken to reach peak concentration 1.2 vs. 2.1 h). These results showed that the consumption of minimally processed wheat bran, and particularly the aleurone fraction, yielded substantial postprandial increases in plasma betaine concentrations. Furthermore, these effects appear to be maintained when aleurone was incorporated into bread.
Opposed Jet Turbulent Diffusion Flames
1990-09-05
not well known, and the effect of turbulence on the mixing process in a stagnation flame is still an important issue. Our purpose is to address this... is obtained by the relationship L(c) = N(c)c. To minimize noise problems, statistics over 30 samples have been processed. Figs. 10a,b show a comparison...
Experimentally validated mathematical model of analyte uptake by permeation passive samplers.
Salim, F; Ioannidis, M; Górecki, T
2017-11-15
A mathematical model describing the sampling process in a permeation-based passive sampler was developed and evaluated numerically. The model was applied to the Waterloo Membrane Sampler (WMS), which employs a polydimethylsiloxane (PDMS) membrane as a permeation barrier, and an adsorbent as a receiving phase. Samplers of this kind are used for sampling volatile organic compounds (VOC) from air and soil gas. The model predicts the spatio-temporal variation of sorbed and free analyte concentrations within the sampler components (membrane, sorbent bed and dead volume), from which the uptake rate throughout the sampling process can be determined. A gradual decline in the uptake rate during the sampling process is predicted, which is more pronounced when sampling higher concentrations. Decline of the uptake rate can be attributed to diminishing analyte concentration gradient within the membrane, which results from resistance to mass transfer and the development of analyte concentration gradients within the sorbent bed. The effects of changing the sampler component dimensions on the rate of this decline in the uptake rate can be predicted from the model. Performance of the model was evaluated experimentally for sampling of toluene vapors under controlled conditions. The model predictions proved close to the experimental values. The model provides a valuable tool to predict changes in the uptake rate during sampling, to assign suitable exposure times at different analyte concentration levels, and to optimize the dimensions of the sampler in a manner that minimizes these changes during the sampling period.
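The qualitative behavior described above, an uptake rate that declines as analyte accumulates behind the membrane, and declines more sharply at higher concentrations, can be illustrated with a far simpler lumped model than the spatio-temporal one developed in the paper. The sketch below is not the authors' model; the parameters and the saturable sorption term are illustrative assumptions.

```python
# Lumped two-compartment caricature of a permeation passive sampler.
# Membrane flux is proportional to (c_air - c_surf); analyte sorbs onto a
# finite-capacity bed (Langmuir-type term), so c_surf rises and the
# concentration gradient across the membrane, hence the uptake rate, declines.
def simulate(c_air, k_mem=0.05, k_ads=0.1, q_max=5.0, dt=1.0, steps=60):
    """Forward-Euler integration; returns the uptake rate at each step."""
    c_surf = 0.0   # free analyte concentration just inside the membrane
    q = 0.0        # sorbed amount on the bed (capacity q_max)
    rates = []
    for _ in range(steps):
        uptake = k_mem * (c_air - c_surf)          # flux through the membrane
        ads = k_ads * c_surf * (1.0 - q / q_max)   # saturable sorption
        c_surf += dt * (uptake - ads)
        q += dt * ads
        rates.append(uptake)
    return rates

low = simulate(c_air=1.0)
high = simulate(c_air=10.0)
```

In this toy model the fractional decline high[-1]/high[0] is steeper than low[-1]/low[0] because the bed saturates sooner at the higher concentration, mirroring the concentration dependence of the uptake-rate decline reported in the abstract.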
NASA Astrophysics Data System (ADS)
Sabatini, Francesca; Lluveras-Tenorio, Anna; Degano, Ilaria; Kuckova, Stepanka; Krizova, Iva; Colombini, Maria Perla
2016-11-01
This study deals with the identification of anthraquinoid molecular markers in standard dyes, reference lakes, and paint model systems using a micro-invasive and nondestructive technique such as matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-ToF-MS). Red anthraquinoid lakes, such as madder lake, carmine lake, and Indian lac, have been the most widely used for painting purposes since ancient times. From an analytical point of view, identifying lakes in paint samples is challenging, and developing methods that maximize the information achievable while minimizing the amount of sample needed is of paramount importance. The employed method was tested on less than 0.5 mg of reference samples and required minimal sample preparation, entailing a hydrofluoric acid extraction. The method is fast and versatile because of the possibility to re-analyze the same sample (once it has been spotted on the steel plate), testing both positive and negative modes in a few minutes. The MALDI mass spectra collected in the two analysis modes were studied and compared with LDI and simulated mass spectra in order to highlight the peculiar behavior of the anthraquinones in the MALDI process. Both ionization modes were assessed for each species. The effect of the different paint binders on dye identification was also evaluated through the analyses of paint model systems. In the end, the method was successful in detecting madder lake in archeological samples from Greek wall paintings and on an Italian funerary clay vessel, demonstrating its capability to identify dyes in small amounts of highly degraded samples.
Aerospace Non Chrome Corrosion Inhibiting Primer Systems
2009-09-01
Meet all HSE specifications, TSCA, REACh, Akzo • Strippable • No weight increase over current system • Meet specification requirements for corrosion, not... Positive and negative controls • Primer-only and topcoated samples. OEM CF Optimization/Down Selects: • Usual issues... found to be true • Good NSS ≠ good filiform ≠ good cure ≠ good application properties • Down-select process is to minimize ≠ and move to a balance of
A Synopsis of Technical Issues of Concern for Monitoring Trace Elements in Highway and Urban Runoff
Breault, Robert F.; Granato, Gregory E.
2000-01-01
Trace elements, which are regulated for aquatic life protection, are a primary concern in highway- and urban-runoff studies because stormwater runoff may transport these constituents from the land surface to receiving waters. Many of these trace elements are essential for biological activity and become detrimental only when geologic or anthropogenic sources exceed concentrations beyond ranges typical of the natural environment. The Federal Highway Administration and State Transportation Agencies are concerned about the potential effects of highway runoff on the watershed scale and for the management and protection of watersheds. Transportation agencies need information that is documented as valid, current, and scientifically defensible to support planning and management decisions. There are many technical issues of concern for monitoring trace elements; therefore, trace-element data commonly are considered suspect, and the responsibility to provide data-quality information to support the validity of reported results rests with the data-collection agency. Paved surfaces are fundamentally different physically, hydraulically, and chemically from the natural surfaces typical of most freshwater systems that have been the focus of many trace-element monitoring studies. Existing scientific conceptions of the behavior of trace elements in the environment are based largely upon research on natural systems, rather than on systems typical of pavement runoff. Additionally, the logistics of stormwater sampling are difficult because of the great uncertainty in the occurrence and magnitude of storm events. Therefore, trace-element monitoring programs may be enhanced if monitoring and sampling programs are automated. Automation would standardize the process and provide a continuous record of the variations in flow and water-quality characteristics.
Great care is required to collect and process samples in a manner that will minimize potential contamination or attenuation of trace elements and other sources of bias and variability in the sampling process. Trace elements have both natural and anthropogenic sources that may affect the sampling process, including the sample-collection and handling materials used in many trace-element monitoring studies. Trace elements also react with these materials within the timescales typical for collection, processing and analysis of runoff samples. To study the characteristics and potential effects of trace elements in highway and urban runoff, investigators typically sample one or more operationally defined matrixes including: whole water, dissolved (filtered water), suspended sediment, bottom sediment, biological tissue, and contaminant sources. The sampling and analysis of each of these sample matrixes can provide specific information about the occurrence and distribution of trace elements in runoff and receiving waters. There are, however, technical concerns specific to each matrix that must be understood and addressed through use of proper collection and processing protocols. Valid protocols are designed to minimize inherent problems and to maximize the accuracy, precision, comparability, and representativeness of data collected. Documentation, including information about monitoring protocols, quality assurance and quality control efforts, and ancillary data also is necessary to establish data quality. This documentation is especially important for evaluation of historical trace-element monitoring data, because trace-element monitoring protocols and analysis methods have been constantly changing over the past 30 years.
Scalable and balanced dynamic hybrid data assimilation
NASA Astrophysics Data System (ADS)
Kauranne, Tuomo; Amour, Idrissa; Gunia, Martin; Kallio, Kari; Lepistö, Ahti; Koponen, Sampsa
2017-04-01
Scalability of complex weather forecasting suites is dependent on the technical tools available for implementing highly parallel computational kernels, but to an equally large extent also on the dependence patterns between various components of the suite, such as observation processing, data assimilation and the forecast model. Scalability is a particular challenge for 4D variational assimilation methods that necessarily couple the forecast model into the assimilation process and subject this combination to an inherently serial quasi-Newton minimization process. Ensemble based assimilation methods are naturally more parallel, but large models force ensemble sizes to be small and that results in poor assimilation accuracy, somewhat akin to shooting with a shotgun in a million-dimensional space. The Variational Ensemble Kalman Filter (VEnKF) is an ensemble method that can attain the accuracy of 4D variational data assimilation with a small ensemble size. It achieves this by processing a Gaussian approximation of the current error covariance distribution, instead of a set of ensemble members, analogously to the Extended Kalman Filter EKF. Ensemble members are re-sampled every time a new set of observations is processed from a new approximation of that Gaussian distribution which makes VEnKF a dynamic assimilation method. After this a smoothing step is applied that turns VEnKF into a dynamic Variational Ensemble Kalman Smoother VEnKS. In this smoothing step, the same process is iterated with frequent re-sampling of the ensemble but now using past iterations as surrogate observations until the end result is a smooth and balanced model trajectory. In principle, VEnKF could suffer from similar scalability issues as 4D-Var. 
However, this can be avoided by isolating the forecast model completely from the minimization process by implementing the latter as a wrapper code whose only link to the model is calling for many parallel and totally independent model runs, all of them implemented as parallel model runs themselves. The only bottleneck in the process is the gathering and scattering of initial and final model state snapshots before and after the parallel runs which requires a very efficient and low-latency communication network. However, the volume of data communicated is small and the intervening minimization steps are only 3D-Var, which means their computational load is negligible compared with the fully parallel model runs. We present example results of scalable VEnKF with the 4D lake and shallow sea model COHERENS, assimilating simultaneously continuous in situ measurements in a single point and infrequent satellite images that cover a whole lake, with the fully scalable VEnKF.
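The central VEnKF re-sampling step described above, drawing a fresh ensemble from the current Gaussian approximation N(mean, cov) of the error distribution rather than propagating fixed members, can be sketched as follows. This is an illustrative two-dimensional toy, not the COHERENS implementation; the sizes and covariance values are invented.

```python
# Re-sampling an ensemble from a Gaussian approximation N(mean, cov):
# draw z ~ N(0, I) and map it through the Cholesky factor L of cov,
# so that each member is mean + L z. Two dimensions for illustration.
import random

def cholesky2(c):
    """Cholesky factor of a 2x2 symmetric positive-definite covariance [[a,b],[b,d]]."""
    a, b, d = c[0][0], c[0][1], c[1][1]
    l11 = a ** 0.5
    l21 = b / l11
    l22 = (d - l21 * l21) ** 0.5
    return [[l11, 0.0], [l21, l22]]

def resample(mean, cov, n, rng):
    """Draw n ensemble members from N(mean, cov)."""
    L = cholesky2(cov)
    out = []
    for _ in range(n):
        z = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
        out.append([mean[i] + sum(L[i][j] * z[j] for j in range(2)) for i in range(2)])
    return out

rng = random.Random(0)
ens = resample([1.0, -2.0], [[2.0, 0.6], [0.6, 1.0]], 5000, rng)
m0 = sum(v[0] for v in ens) / len(ens)                  # sample mean, first component
v00 = sum((v[0] - m0) ** 2 for v in ens) / len(ens)     # sample variance, first component
```

In a full cycle this draw would follow each analysis step, so the ensemble always represents the latest Gaussian approximation; the sample statistics recover the prescribed mean and covariance up to Monte Carlo error.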
Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel
NASA Astrophysics Data System (ADS)
Xie, Yanmin
2011-08-01
Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes, and so on). Optimization methods have also been widely applied in sheet metal forming. Therefore, proper design methods to reduce time and costs have to be developed, mostly based on computer-aided procedures. At the same time, the existence of variations during manufacturing processes may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small-size design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software (LS-DYNA) is used to simulate the complex sheet metal stamping processes. A Kriging metamodel is adopted to map the relation between input process parameters and part quality. Robust design models for the sheet metal forming process integrate adaptive importance sampling with the Kriging model, in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate the direction in which additional training samples can be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. Final results indicate the application feasibility of the proposed method for multi-response robust design.
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Micro-balance sensor integrated with atomic layer deposition chamber
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinson, Alex B. F.; Libera, Joseph A.; Elam, Jeffrey W.
The invention is directed to QCM measurements for monitoring ALD processes. Previously, significant barriers remained to the accurate execution of such measurements in ALD processes. To turn this exclusively dedicated in situ technique into a routine characterization method, an integral QCM fixture was developed. This new design is easily implemented on a variety of ALD tools, allows rapid sample exchange, prevents backside deposition, and minimizes both the footprint and flow disturbance. Unlike previous QCM designs, the fast thermal equilibration enables tasks such as temperature-dependent studies and ex situ sample exchange, further highlighting the feasibility of this QCM design for day-to-day use. Finally, the in situ mapping of thin film growth rates across the ALD reactor was demonstrated in a popular commercial tool operating in both continuous and quasi-static ALD modes.
A minimum distance estimation approach to the two-sample location-scale problem.
Zhang, Zhiyi; Yu, Qiqing
2002-09-01
As reported by Kalbfleisch and Prentice (1980), the generalized Wilcoxon test fails to detect a difference between the lifetime distributions of male and female mice that died from thymic leukemia. This failure is a result of the test's inability to detect a distributional difference when a location shift and a scale change exist simultaneously. In this article, we propose an estimator based on the minimization of an average distance between two independent quantile processes under a location-scale model. Large-sample inference on the proposed estimator, with possible right-censorship, is discussed. The mouse leukemia data are used as an example for illustration purposes.
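The estimator's core idea, choosing a scale a and shift b that minimize an average distance between the two sample quantile processes, can be sketched with a crude grid search. This is only a schematic illustration of the location-scale fit on synthetic, uncensored data; the paper's estimator and its large-sample theory are more refined.

```python
# Minimum-distance fit under a location-scale model: find (a, b) minimizing
# the average squared distance between the sample quantile functions
# Q_x(p) and a*Q_y(p) + b over a grid of probabilities p.
import random

def quantile(sorted_x, p):
    """Simple nearest-rank sample quantile of pre-sorted data."""
    i = min(int(p * len(sorted_x)), len(sorted_x) - 1)
    return sorted_x[i]

def fit_location_scale(x, y, a_grid, b_grid):
    """Grid search for the (a, b) minimizing the average quantile distance."""
    xs, ys = sorted(x), sorted(y)
    ps = [(k + 0.5) / 50 for k in range(50)]
    best = None
    for a in a_grid:
        for b in b_grid:
            d = sum((quantile(xs, p) - (a * quantile(ys, p) + b)) ** 2 for p in ps) / len(ps)
            if best is None or d < best[0]:
                best = (d, a, b)
    return best[1], best[2]

rng = random.Random(1)
y = [rng.gauss(0.0, 1.0) for _ in range(4000)]
x = [2.0 * v + 3.0 for v in y]   # exact location-scale pair: a = 2, b = 3
a_hat, b_hat = fit_location_scale(x, y,
                                  [1.5 + 0.1 * i for i in range(11)],
                                  [2.5 + 0.1 * i for i in range(11)])
```

Because x here is an exact location-scale transform of y, the grid search recovers a = 2 and b = 3; with noisy or censored data the minimized distance would be positive and the estimate approximate.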
Gómez-López, Vicente M; Ragaert, Peter; Jeyachchandran, Visvalingam; Debevere, Johan; Devlieghere, Frank
2008-01-15
Gaseous ClO2 was evaluated for effectiveness in prolonging the shelf-life of minimally processed (MP) lettuce and MP cabbage, previously immersed in a cysteine solution in order to inhibit the browning that occurs during ClO2 treatment. Each vegetable was shredded, washed, and separated into two portions, one to be treated with ClO2 gas and the other to remain untreated as a reference sample. The batch to be treated with ClO2 gas was immersed for 1 min in a 0.5% solution of HCl.L-cysteine monohydrate. Then both batches were spun dry. MP vegetables were decontaminated in a cabinet at 90-91% relative humidity and 22-25 degrees C for up to 10 min, including 30 s of ClO2 injection into the cabinet. The ClO2 concentration rose to 1.74 mg/L (MP lettuce) and 1.29 mg/L (MP cabbage). Then samples were stored under modified atmosphere at 7 degrees C for shelf-life studies. Changes in O2 and CO2 headspace concentrations, microbiological quality (aerobic plate count (APC), psychrotrophs, lactic acid bacteria, and yeasts), sensory quality, and pH were followed during storage. The respiration rate of the minimally processed vegetables was significantly increased by the ClO2 gas treatment only in the case of MP cabbage (P<0.05). The gas treatment initially reduced the APC and psychrotroph count of MP lettuce and the APC, psychrotroph counts, yeast counts and pH of MP cabbage (P<0.05). The ClO2 treatment did not initially cause any significant (P<0.05) sensorial alteration, except for a weak off-odour in MP lettuce. Interestingly, no browning was observed after treatment, which can be attributed to the use of L-cysteine. Although an initial microbiological reduction was observed due to the ClO2 gas treatment, APC and psychrotroph counts reached higher levels in the ClO2-treated samples than in the untreated ones before the third day of the shelf-life study.
Untreated and treated samples of MP lettuce were sensorially unacceptable due to bad overall visual quality after 4 days, while treated and untreated MP cabbage remained sensorially acceptable during the 9 days of the study. L-cysteine reduced (P<0.05) the decontamination efficacy of ClO2 when applied to MP cabbage but not in the case of MP lettuce. Gaseous ClO2 failed to prolong the shelf-life of MP lettuce and MP cabbage; the reason for the enhanced growth of microorganisms in decontaminated samples should be investigated. Nonetheless, our results prove that it is possible to inhibit the browning caused by ClO2.
Zeugner, Silke; Mayr, Thomas; Zietz, Christian; Aust, Daniela E; Baretton, Gustavo B
2015-01-01
The term "pre-analytics" summarizes all procedures concerned with specimen collection or processing as well as logistical aspects like transport or storage of tissue specimens. All of these variables, as well as tissue-specific characteristics, affect sample quality. While certain parameters like warm ischemia or tissue-specific characteristics cannot be changed, other parameters can be assessed and optimized. The aim of this study was to determine RNA quality by assessing the RIN values of specimens from different organs and to assess the influence of vacuum preservation. Samples from the GI tract, in general, appear to have lower RNA quality when compared to samples from other organ sites. This may be due to digestive enzymes or bacterial colonization. Processing time in pathology does not significantly influence RNA quality. Tissue preservation with a vacuum sealer leads to preserved RNA quality over an extended period of time and offers a feasible alternative to minimize the influence of transport time to pathology.
A Minimal Optical Trapping and Imaging Microscopy System
Hernández Candia, Carmen Noemí; Tafoya Martínez, Sara; Gutiérrez-Medina, Braulio
2013-01-01
We report the construction and testing of a simple and versatile optical trapping apparatus, suitable for visualizing individual microtubules (∼25 nm in diameter) and performing single-molecule studies, using a minimal set of components. This design is based on a conventional, inverted microscope, operating under plain bright field illumination. A single laser beam enables standard optical trapping and the measurement of molecular displacements and forces, whereas digital image processing affords real-time sample visualization with reduced noise and enhanced contrast. We have tested our trapping and imaging instrument by measuring the persistence length of individual double-stranded DNA molecules, and by following the stepping of single kinesin motor proteins along clearly imaged microtubules. The approach presented here provides a straightforward alternative for studies of biomaterials and individual biomolecules. PMID:23451216
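As a numerical aside on one of the validation measurements above: the persistence length P of a polymer such as dsDNA is commonly extracted from the decay of tangent-tangent correlations, which for a chain confined to two dimensions obeys <cos Δθ(s)> = exp(-s/(2P)). The sketch below recovers a known P from a synthetic chain; it is a generic worm-like-chain illustration with invented parameters, not the authors' analysis pipeline.

```python
# 2D worm-like chain: the tangent angle performs Brownian motion along the
# contour with variance ds/P per step, so <cos(theta(s+l) - theta(s))>
# decays as exp(-l/(2P)). We simulate a chain with known P and invert the decay.
import math, random

def make_chain_2d(P, ds, n, rng):
    """Tangent angles of a 2D worm-like chain of n segments of length ds."""
    theta, angles = 0.0, [0.0]
    sd = math.sqrt(ds / P)
    for _ in range(n - 1):
        theta += rng.gauss(0.0, sd)
        angles.append(theta)
    return angles

def persistence_length_2d(angles, ds, lag):
    """Estimate P from the tangent correlation at contour separation lag*ds."""
    diffs = [angles[i + lag] - angles[i] for i in range(len(angles) - lag)]
    c = sum(math.cos(d) for d in diffs) / len(diffs)
    return -lag * ds / (2.0 * math.log(c))

rng = random.Random(7)
angles = make_chain_2d(P=50.0, ds=1.0, n=50000, rng=rng)
P_hat = persistence_length_2d(angles, ds=1.0, lag=20)
```

In practice one would fit the exponential decay over several separations rather than invert a single lag, and in 3D the decay is exp(-s/P) instead of exp(-s/(2P)).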
Roll, Isaac B.; Driver, Erin M.; Halden, Rolf U.
2016-01-01
Annual U.S. expenditures of $2B for site characterization invite the development of new technologies to improve data quality while reducing costs and minimizing uncertainty in groundwater monitoring. This work presents a new instrument for time-integrated sampling of environmental fluids using in situ solid-phase extraction (SPE). The In Situ Sampler (IS2) is an automated submersible device capable of extracting dissolved contaminants from water (100s – 1000s mL) over extended periods (hours to weeks), retaining the analytes, and rejecting the processed fluid. A field demonstration of the IS2 revealed 28-day average concentration of hexavalent chromium in a shallow aquifer affected by tidal stresses via sampling of groundwater as both liquid and sorbed composite samples, each obtained in triplicate. In situ SPE exhibited 75 ± 6% recovery and an 8-fold improvement in reporting limit. Relative to use of conventional methods (100%), beneficial characteristics of the device and method included minimal hazardous material generation (2%), transportation cost (10%), and associated carbon footprint (2%). The IS2 is compatible with commercial SPE resins and standard extraction methods, and has been certified for more general use (i.e., inorganics and organics) by the Environmental Security Technology Certification Program (ESTCP) of the U.S. Department of Defense. PMID:26971208
Survey methods for assessing land cover map accuracy
Nusser, S.M.; Klaas, E.E.
2003-01-01
The increasing availability of digital photographic materials has fueled efforts by agencies and organizations to generate land cover maps for states, regions, and the United States as a whole. Regardless of the information sources and classification methods used, land cover maps are subject to numerous sources of error. In order to understand the quality of the information contained in these maps, it is desirable to generate statistically valid estimates of accuracy rates describing misclassification errors. We explored a full sample survey framework for creating accuracy assessment study designs that balance statistical and operational considerations in relation to study objectives for a regional assessment of GAP land cover maps. We focused not only on appropriate sample designs and estimation approaches, but also on aspects of the data collection process, such as gaining the cooperation of landowners and using pixel clusters as an observation unit. The approach was tested in a pilot study to assess the accuracy of Iowa GAP land cover maps. A stratified two-stage cluster sampling design addressed sample size requirements for land covers and the need for geographic spread while minimizing operational effort. Recruitment methods used for private landowners yielded high response rates, minimizing a source of nonresponse error. Collecting data for a 9-pixel cluster centered on the sampled pixel was simple to implement, and provided better information on rarer vegetation classes as well as substantial gains in precision relative to observing data at a single pixel.
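The cluster-based accuracy estimate described above can be sketched in a few lines: each sampled 9-pixel cluster yields a proportion of correctly classified pixels, and the design-based accuracy estimate and its standard error are computed over cluster means rather than individual pixels. The agreement data below are synthetic, not the Iowa GAP results, and the sketch ignores stratification weights for simplicity.

```python
# Accuracy assessment from 9-pixel clusters: treat the cluster as the
# observation unit, average within clusters first, then across clusters.
# Entries: 1 = map label agreed with ground truth at that pixel, 0 = disagreed.
from statistics import mean, stdev

clusters = [
    [1, 1, 1, 1, 0, 1, 1, 1, 1],
    [1, 1, 0, 1, 1, 1, 0, 1, 1],
    [0, 0, 1, 0, 1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1, 0],
]
cluster_means = [mean(c) for c in clusters]
accuracy = mean(cluster_means)                      # overall accuracy estimate
se = stdev(cluster_means) / len(clusters) ** 0.5    # design-based SE over clusters
```

Computing the standard error over cluster means, rather than pooling all pixels, correctly reflects the within-cluster spatial correlation that a naive per-pixel estimate would ignore.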
Numerical study of the process parameters in spark plasma sintering (sps)
NASA Astrophysics Data System (ADS)
Chowdhury, Redwan Jahid
Spark plasma sintering (SPS) is one of the most widely used sintering techniques that utilizes pulsed direct current together with uniaxial pressure to consolidate a wide variety of materials. The unique mechanisms of SPS enable it to sinter powder compacts at a lower temperature and in a shorter time than conventional hot pressing, hot isostatic pressing, and vacuum sintering processes. One of the limitations of SPS is the presence of temperature gradients inside the sample, which could result in non-uniform physical and microstructural properties. Detailed study of the temperature and current distributions inside the sintered sample is necessary to minimize the temperature gradients and achieve desired properties. In the present study, a coupled thermal-electric model was developed using finite element codes in ABAQUS software to investigate the temperature and current distributions inside conductive and non-conductive samples. An integrated experimental-numerical methodology was implemented to determine the system contact resistances accurately. The developed sintering model was validated by a series of experiments, which showed good agreement with the simulation results. The temperature distribution inside the sample depends on process parameters such as sample and tool geometry, punch and die position, applied current, and thermal insulation around the die. The role of these parameters on sample temperature distribution was systematically analyzed. The findings of this research could prove very useful for the reliable production of large size sintered samples with controlled and tailored properties.
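The coupled thermal-electric behavior modeled in ABAQUS can be illustrated with a much cruder stand-in: a 1-D explicit finite-difference rod with volumetric Joule heating and fixed-temperature ends playing the role of the water-cooled rams. All material properties and the current density below are assumed round numbers, not values from the study.

```python
import numpy as np

def joule_heated_rod(n=50, steps=2000, dt=1e-3, dx=1e-3,
                     k=20.0, rho_cp=3e6, sigma=1e5, j=1e7):
    # Explicit 1-D thermal-electric sketch: volumetric Joule heating
    # q = J^2 / sigma, heat conduction by central differences, and
    # fixed 300 K ends standing in for the cooled punches.
    T = np.full(n, 300.0)
    q = j ** 2 / sigma          # W/m^3 from the applied current density
    alpha = k / rho_cp          # thermal diffusivity (stable: alpha*dt/dx^2 << 0.5)
    for _ in range(steps):
        lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx ** 2
        T[1:-1] += dt * (alpha * lap + q / rho_cp)
    return T

T = joule_heated_rod()
```

Even this toy model reproduces the qualitative point of the study: the interior runs hotter than the boundaries, so geometry, applied current, and insulation control the magnitude of the temperature gradient.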
Chi-squared and C statistic minimization for low count per bin data [sampling in X-ray astronomy]
NASA Technical Reports Server (NTRS)
Nousek, John A.; Shue, David R.
1989-01-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
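The two fit statistics being compared can be sketched directly (the minimizer itself — Marquardt's or Powell's method — is omitted; either can drive either statistic). The toy spectrum below is synthetic, not simulation data from the paper.

```python
import numpy as np

def chi2_stat(observed, model):
    # Pearson chi-squared; relies on a Gaussian error approximation
    # that breaks down when counts per bin are low
    return float(np.sum((observed - model) ** 2 / model))

def cash_c(observed, model):
    # Cash's C statistic: -2 * log Poisson likelihood (up to a constant),
    # valid for arbitrarily low counts per bin
    obs = np.asarray(observed, dtype=float)
    mod = np.asarray(model, dtype=float)
    term = np.where(obs > 0, obs * np.log(obs / mod), 0.0)
    return float(2.0 * np.sum(mod - obs + term))

# toy low-count "spectrum": model mean of 0.5 counts per bin
rng = np.random.default_rng(0)
model = np.full(100, 0.5)
observed = rng.poisson(model)
c2 = chi2_stat(observed, model)
c = cash_c(observed, model)
```

In the low-count regime the chi-squared statistic is biased because the Gaussian approximation to Poisson errors fails, which is why the likelihood-based C statistic yields the better fits reported above.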
Stephenson, Serena; Pollard, Maria; Boit, Kipchirchir
2013-09-01
The prevalence of optical spectroscopy techniques being applied to the online analysis of continuous processes has increased in the past couple of decades. The ability to continuously "watch" changing stream compositions as operating conditions change has proven invaluable to pilot and world-scale manufacturing in the chemical and petrochemical industries. Presented here is an application requiring continuous monitoring of parts per million (ppm) by weight levels of hydrogen chloride (HCl), water (H2O), and carbon dioxide (CO2) in two gas-phase streams, one nitrogen-rich and one ethylene-rich. Because ethylene has strong mid-infrared (IR) absorption, building an IR method capable of quantifying HCl, H2O, and CO2 posed some challenges. A long-path (5.11 m) Fourier transform infrared (FT-IR) spectrometer was used in the mid-infrared region between 1800 and 5000 cm⁻¹, with a 1 cm⁻¹ resolution and a 10 s spectral update time. Sample cell temperature and pressure were controlled and measured to minimize measurement variability. Models using a modified classical least squares method were developed and validated first in the laboratory and then using the process stream. Analytical models and process sampling conditions were adjusted to minimize interference of ethylene in the ethylene-rich stream. The predictive capabilities of the measurements were ±0.5 ppm for CO2 in either stream; ±1.1 and ±1.3 ppm for H2O in the nitrogen-rich and ethylene-rich streams, respectively; and ±1.0 and ±2.4 ppm for HCl in the nitrogen-rich and ethylene-rich streams, respectively. Continuous operation of the instrument in the process stream was demonstrated using an automated stream switching sample system set to 10 min intervals. Response time for all components of interest was sufficient to acquire representative stream composition data. This setup provides useful insight into the process for troubleshooting and optimizing plant operating conditions.
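The classical least squares (CLS) idea can be sketched as an ordinary least-squares fit of the measured spectrum against pure-component reference spectra. The synthetic "spectra" below are random stand-ins, not FT-IR data from the process streams, and the three components merely echo the HCl/H2O/CO2 setting.

```python
import numpy as np

def cls_concentrations(spectrum, pure_components):
    # Classical least squares: spectrum ≈ pure_components.T @ c,
    # where rows of pure_components are unit-concentration spectra.
    c, *_ = np.linalg.lstsq(pure_components.T, spectrum, rcond=None)
    return c

# synthetic 3-component mixture with small measurement noise
rng = np.random.default_rng(3)
K = rng.random((3, 200))                 # 3 components x 200 wavenumbers
true_c = np.array([2.0, 1.1, 0.5])       # ppm-scale concentrations (made up)
spectrum = K.T @ true_c + rng.normal(0.0, 1e-3, 200)
est = cls_concentrations(spectrum, K)
```

In practice the hard part is obtaining clean pure-component spectra and suppressing the strong ethylene background; the fit itself is just a linear solve.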
Diffusion in Single Supported Lipid Bilayers
NASA Astrophysics Data System (ADS)
Armstrong, C. L.; Trapp, M.; Rheinstädter, M. C.
2011-03-01
Despite their potential relevance for the development of functionalized surfaces and biosensors, the study of single supported membranes using neutron scattering has been limited by the challenge of obtaining relevant dynamic information from a sample with minimal material. Using state of the art neutron instrumentation we have, for the first time, modeled lipid diffusion in single supported lipid bilayers. While we find that the diffusion coefficient for the single bilayer system is comparable to a multi-lamellar lipid system, the molecular mechanism for lipid motion in the single bilayer is a continuous diffusion process with no sign of the flow-like ballistic motion reported in the stacked membrane system. In the future, these membranes will be used to hold and align proteins, mimicking physiological conditions enabling the study of protein structure, function and interactions in relevant and highly topical membrane/protein systems with minimal sample material. C.L. Armstrong, M.D. Kaye, M. Zamponi, E. Mamontov, M. Tyagi, T. Jenkins and M.C. Rheinstädter, Soft Matter Communication, 2010, Advance Article, DOI: 10.1039/C0SM00637H
Pastor, Antoni; Farré, Magí; Fitó, Montserrat; Fernandez-Aranda, Fernando; de la Torre, Rafael
2014-05-01
The analysis of peripheral endocannabinoids (ECs) is a good biomarker of the EC system. In clinical studies, their concentrations strongly depend on the sample collection and processing time conditions in clinical and laboratory settings. The analysis of 2-monoacylglycerols (MGs) (i.e., 2-arachidonoylglycerol or 2-oleoylglycerol) is a particularly challenging issue because of their ex vivo formation and chemical isomerization after blood sample collection. We provide evidence that their ex vivo formation can be minimized by adding Orlistat, an enzymatic lipase inhibitor, to plasma. Taking into consideration the low cost of Orlistat, we recommend its addition to plasma collecting tubes while maintaining the sample cold chain until storage. We have validated a method for the determination of the EC profile of a range of MGs and N-acylethanolamides in plasma that preserves the original isomer ratio of MGs. Nevertheless, the chemical isomerization of 2-MGs can only be avoided by immediate processing and analysis of samples, due to their instability during storage. We believe that this new methodology can aid in the harmonization of the measurement of ECs and related compounds in clinical samples.
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-08-01
The high-pressure processing conditions were optimized for pineapple puree within the domain of 400-600 MPa, 40-60 °C, and 10-20 min using the response surface methodology (RSM). The target was to maximize the inactivation of polyphenoloxidase (PPO) along with a minimal loss in beneficial bromelain (BRM) activity, ascorbic acid (AA) content, antioxidant capacity, and color in the sample. The optimum condition was 600 MPa, 50 °C, and 13 min, having the highest desirability of 0.604, which resulted in residual PPO and BRM activities of 44% and 47%, respectively. However, 93% antioxidant activity and 85% AA were retained in the optimized sample, with a total color change (∆E*) value less than 2.5. A 10-fold reduction in PPO activity was obtained at 600 MPa/70 °C/20 min; however, the thermal degradation of nutrients was severe at this condition. A fuzzy mathematical approach confirmed that sensory acceptance of the optimized sample was close to the fresh sample, whereas the thermally pasteurized sample (treated at 0.1 MPa, 95 °C for 12 min) had the lowest sensory score compared to the others. © 2015 Institute of Food Technologists®
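RSM optimizers of this kind typically combine competing responses through Derringer-type desirability functions and maximize their geometric mean; a minimal sketch follows. The response values and the simple 0-to-1 linear scaling below are hypothetical, not the study's fitted response-surface models.

```python
def desirability_larger_is_better(y, lo, hi, weight=1.0):
    # One-sided Derringer-Suich desirability: 0 at/below lo, 1 at/above hi
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** weight

def overall_desirability(ds):
    # geometric mean: any single zero desirability vetoes the condition
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# hypothetical responses (as fractions) at one candidate HPP condition
d_ppo = desirability_larger_is_better(0.56, 0.0, 1.0)  # PPO inactivation
d_brm = desirability_larger_is_better(0.47, 0.0, 1.0)  # BRM retention
d_aa = desirability_larger_is_better(0.85, 0.0, 1.0)   # AA retention
D = overall_desirability([d_ppo, d_brm, d_aa])
```

The optimizer then searches pressure, temperature, and time for the condition maximizing D, which is how a single "highest desirability" value such as 0.604 arises.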
Self-paced model learning for robust visual tracking
NASA Astrophysics Data System (ADS)
Huang, Wenhui; Gu, Jason; Ma, Xin; Li, Yibin
2017-01-01
In visual tracking, learning a robust and efficient appearance model is a challenging task. Model learning determines both the strategy and the frequency of model updating, which contains many details that could affect the tracking results. Self-paced learning (SPL) has recently been attracting considerable interest in the fields of machine learning and computer vision. SPL is inspired by the learning principle underlying the cognitive process of humans, whose learning process is generally from easier samples to more complex aspects of a task. We propose a tracking method that integrates the learning paradigm of SPL into visual tracking, so reliable samples can be automatically selected for model learning. In contrast to many existing model learning strategies in visual tracking, we discover the missing link between sample selection and model learning, which are combined into a single objective function in our approach. Sample weights and model parameters can be learned by minimizing this single objective function. Additionally, to solve the real-valued learning weight of samples, an error-tolerant self-paced function that considers the characteristics of visual tracking is proposed. We demonstrate the robustness and efficiency of our tracker on a recent tracking benchmark data set with 50 video sequences.
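The coupling of sample selection and model learning in one objective can be illustrated with the classic hard self-paced regularizer, where the minimization over sample weights v has a closed form: select exactly the samples whose loss falls below the age parameter λ. Note that the paper itself proposes an error-tolerant, real-valued self-paced function; this sketch shows only the standard binary case.

```python
import numpy as np

def spl_weights(losses, lam):
    # Hard SPL regularizer -lam * sum(v): minimizing
    # sum(v_i * L_i) - lam * sum(v_i) over v in {0,1}^n gives
    # v_i = 1 exactly when L_i < lam ("easy" samples first).
    return (np.asarray(losses) < lam).astype(float)

def spl_step(losses, lam):
    # One alternating-minimization step: fix the model, solve for v,
    # and report the joint objective value. (Refitting the appearance
    # model on the weighted samples would follow in a full tracker.)
    v = spl_weights(losses, lam)
    obj = float(np.sum(v * np.asarray(losses)) - lam * np.sum(v))
    return v, obj

losses = np.array([0.1, 0.5, 2.0, 0.3])
v, obj = spl_step(losses, lam=1.0)   # the hard sample (loss 2.0) is excluded
```

Growing λ over time admits progressively harder samples, mirroring the easy-to-complex curriculum described above.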
Riley, Paul W.; Gallea, Benoit; Valcour, Andre
2017-01-01
Background: Testing coagulation factor activities requires that multiple dilutions be assayed and analyzed to produce a single result. The slope of the line created by plotting measured factor concentration against sample dilution is evaluated to discern the presence of inhibitors giving rise to nonparallelism. Moreover, samples producing results on initial dilution falling outside the analytic measurement range of the assay must be tested at additional dilutions to produce reportable results. Methods: The complexity of this process has motivated a large clinical reference laboratory to develop advanced computer algorithms with automated reflex testing rules to complete coagulation factor analysis. A method was developed for autoverification of coagulation factor activity using expert rules developed on an off-the-shelf, commercially available data manager system integrated into an automated coagulation platform. Results: Here, we present an approach allowing for the autoverification and reporting of factor activity results with greatly diminished technologist effort. Conclusions: To the best of our knowledge, this is the first report of its kind providing a detailed procedure for implementation of autoverification expert rules as applied to coagulation factor activity testing. Advantages of this system include ease of training for new operators, minimization of technologist time spent, reduction of staff fatigue, minimization of unnecessary reflex tests, optimization of turnaround time, and assurance of the consistency of the testing and reporting process. PMID:28706751
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
Numerical simulation of residual stress in laser based additive manufacturing process
NASA Astrophysics Data System (ADS)
Kalyan Panda, Bibhu; Sahoo, Seshadev
2018-03-01
Minimizing the residual stress build-up in metal-based additive manufacturing plays a pivotal role in selecting a particular material and technique for making an industrial part. In beam-based additive manufacturing, although a great deal of effort has been made to minimize the residual stresses, it is still elusive how to do so by simply optimizing the processing parameters, such as beam size, beam power, and scan speed. Amid different types of additive manufacturing processes, Direct Metal Laser Sintering (DMLS) uses a high-power laser to melt and sinter layers of metal powder. The rapid solidification and heat transfer on the powder bed produce a high cooling rate, which leads to the build-up of residual stresses that affect the mechanical properties of the built parts. In the present work, the authors develop a numerical thermo-mechanical model for the measurement of residual stress in AlSi10Mg build samples using the finite element method. Transient temperature distribution in the powder bed was assessed using the coupled thermal to structural model. Subsequently, the residual stresses were estimated with varying laser power. From the simulation results, it was found that the melt pool dimensions increase with increasing laser power, and the magnitude of residual stresses in the built part increases as well.
An open-source and low-cost monitoring system for precision enology.
Di Gennaro, Salvatore Filippo; Matese, Alessandro; Mancin, Mirko; Primicerio, Jacopo; Palliotti, Alberto
2014-12-05
Winemaking is a dynamic process, where microbiological and chemical effects may strongly differentiate products from the same vineyard and even between wine vats. This high variability means an increase in work in terms of control and process management. The winemaking process therefore requires a site-specific approach in order to optimize cellar practices and quality management, suggesting a new concept of winemaking, identified as Precision Enology. The Institute of Biometeorology of the Italian National Research Council has developed a wireless monitoring system, consisting of a series of nodes integrated in barrel bungs with sensors for the measurement of wine physical and chemical parameters in the barrel. This paper describes an open-source evolution of the preliminary prototype, using Arduino-based technology. Results have shown good performance in terms of data transmission and accuracy, minimal size and power consumption. The system has been designed to create a low-cost product, which allows a remote and real-time control of wine evolution in each barrel, minimizing costs and time for sampling and laboratory analysis. The possibility of integrating any kind of sensors makes the system a flexible tool that can satisfy various monitoring needs.
Karunakaran, Chithra; Lahlali, Rachid; Zhu, Ning; Webb, Adam M.; Schmidt, Marina; Fransishyn, Kyle; Belev, George; Wysokinski, Tomasz; Olson, Jeremy; Cooper, David M. L.; Hallin, Emil
2015-01-01
Minimally invasive investigation of plant parts (root, stem, leaves, and flower) has good potential to elucidate the dynamics of plant growth, morphology, physiology, and root-rhizosphere interactions. Laboratory-based absorption X-ray imaging and computed tomography (CT) systems are extensively used for in situ feasibility studies of plants grown in natural and artificial soil. These techniques have challenges such as low contrast between soil pore space and roots, long X-ray imaging time, and low spatial resolution. In this study, the use of synchrotron (SR) based phase contrast X-ray imaging (PCI) has been demonstrated as a minimally invasive technique for imaging plants. Above ground plant parts and roots of 10-day-old canola and wheat seedlings grown in sandy clay loam soil were successfully scanned and reconstructed. Results confirmed that SR-PCI can deliver good quality images to study dynamic and real time processes such as cavitation and water-refilling in plants. The advantages of SR-PCI, effect of X-ray energy, and effective pixel size to study plant samples have been demonstrated. The use of contrast agents to monitor physiological processes in plants was also investigated and discussed. PMID:26183486
Ukuku, Dike O; Geveke, David J; Chau, Lee; Niemira, Brendan A
2016-08-16
Fresh-cut cantaloupes have been associated with outbreaks of salmonellosis. Minimally processed fresh-cut fruits have a limited shelf life because of deterioration caused by spoilage microflora and physiological processes. The objectives of this study were to use a wet steam process to 1) reduce indigenous spoilage microflora and inoculated populations of Salmonella, Escherichia coli O157:H7 and Listeria monocytogenes on the surface of cantaloupes, and 2) reduce the population counts in cantaloupe fresh-cut pieces after rind removal and cutting. The average inocula of Salmonella, E. coli O157:H7 and Listeria monocytogenes were 10^7 CFU/ml, and the populations recovered on the cantaloupe rind surfaces after inoculation averaged 4.5, 4.8 and 4.1 log CFU/cm^2, respectively. Whole cantaloupes were treated with a wet steam processing unit for 180 s, and the treated melons were stored at 5 °C for 29 days. Bacterial populations in fresh-cut pieces prepared from treated and control samples stored at 5 and 10 °C for up to 12 days were determined, and changes in color (CIE L*, a*, and b*) due to treatments were measured during storage. Presence and growth of aerobic mesophilic bacteria and Salmonella, E. coli O157:H7 and L. monocytogenes were determined in fresh-cut cantaloupe samples. There were no visual signs of physical damage on any treated cantaloupe surfaces immediately after treatment or during storage. All fresh-cut pieces from treated cantaloupe rind surfaces were negative for bacterial pathogens even after an enrichment process. Steam treatment significantly (p<0.05) changed the color of the fresh-cut pieces. Minimal wet steam treatment of cantaloupe rind surfaces designated for fresh-cut preparation will enhance the microbial safety of fresh-cut pieces by reducing total bacterial populations. This process holds the potential to significantly reduce the incidence of foodborne illness associated with fresh-cut fruits. Published by Elsevier B.V.
Bayesian estimation of Karhunen–Loève expansions; A random subspace approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhary, Kenny; Najm, Habib N.
2016-04-13
One of the most widely-used statistical procedures for dimensionality reduction of high dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., which minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite dimensional stochastic process inspired by Brownian motion.
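The point estimate that the Bayesian procedure wraps a posterior around — the KLE basis obtained from an SVD of the centered data matrix — can be sketched as follows, using a Brownian-motion-like process in the spirit of the paper's illustration. This is only the classical plug-in estimate, not the matrix-Bingham posterior.

```python
import numpy as np

def kle_basis(samples, n_modes):
    # samples: (n_samples, dim) realizations of the random field.
    # Returns the sample mean, the leading orthonormal KLE modes, and
    # their eigenvalues (mode variances), via SVD of the centered data
    # matrix -- avoiding explicit formation of the covariance matrix.
    mean = samples.mean(axis=0)
    centered = samples - mean
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    eigvals = s ** 2 / (samples.shape[0] - 1)
    return mean, vt[:n_modes], eigvals[:n_modes]

# Brownian-motion-like process on 50 grid points, 200 realizations
rng = np.random.default_rng(1)
paths = np.cumsum(rng.normal(size=(200, 50)), axis=1)
mean, modes, eigvals = kle_basis(paths, 3)
```

With far fewer samples than grid points, the trailing modes of such an estimate are dominated by sampling error, which is precisely the uncertainty the Bayesian procedure is designed to quantify.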
NASA Astrophysics Data System (ADS)
Aguilar-Arevalo, A. A.; Brown, B. C.; Bugel, L.; Cheng, G.; Church, E. D.; Conrad, J. M.; Dharmapalan, R.; Djurcic, Z.; Finley, D. A.; Ford, R.; Garcia, F. G.; Garvey, G. T.; Grange, J.; Huelsnitz, W.; Ignarra, C.; Imlay, R.; Johnson, R. A.; Karagiorgi, G.; Katori, T.; Kobilarcik, T.; Louis, W. C.; Mariani, C.; Marsh, W.; Mills, G. B.; Mirabal, J.; Moore, C. D.; Mousseau, J.; Nienaber, P.; Osmanov, B.; Pavlovic, Z.; Perevalov, D.; Polly, C. C.; Ray, H.; Roe, B. P.; Russell, A. D.; Shaevitz, M. H.; Spitz, J.; Stancu, I.; Tayloe, R.; Van de Water, R. G.; Wascko, M. O.; White, D. H.; Wickremasinghe, D. A.; Zeller, G. P.; Zimmerman, E. D.
2013-08-01
The largest sample ever recorded of ν̄_μ charged-current quasielastic (CCQE, ν̄_μ + p → μ⁺ + n) candidate events is used to produce the minimally model-dependent, flux-integrated double-differential cross section d²σ/(dT_μ d cos θ_μ) for ν̄_μ CCQE on a mineral oil target. This measurement exploits the large statistics of the MiniBooNE antineutrino mode sample and provides the most complete information on this process to date. In order to facilitate historical comparisons, the flux-unfolded total cross section σ(E_ν) and single-differential cross section dσ/dQ² on both mineral oil and on carbon are also reported. The observed cross section is somewhat higher than the cross section predicted by a model assuming independently acting nucleons in carbon with canonical form factor values. The shape of the data is also discrepant with this model. These results have implications for intranuclear processes and can help constrain signal and background processes for future neutrino oscillation measurements.
Efficiency of the Inertia Friction Welding Process and Its Dependence on Process Parameters
NASA Astrophysics Data System (ADS)
Senkov, O. N.; Mahaffey, D. W.; Tung, D. J.; Zhang, W.; Semiatin, S. L.
2017-07-01
It has been widely assumed, but never proven, that the efficiency of the inertia friction welding (IFW) process is independent of process parameters and is relatively high, i.e., 70 to 95 pct. In the present work, the effect of IFW parameters on process efficiency was established. For this purpose, a series of IFW trials was conducted for the solid-state joining of two dissimilar nickel-base superalloys (LSHR and Mar-M247) using various combinations of initial kinetic energy (i.e., the total weld energy, E_o), initial flywheel angular velocity (ω_o), flywheel moment of inertia (I), and axial compression force (P). The kinetics of the conversion of the welding energy to heating of the faying sample surfaces (i.e., the sample energy) vs parasitic losses to the welding machine itself were determined by measuring the friction torque on the sample surfaces (M_S) and in the machine bearings (M_M). It was found that the rotating parts of the welding machine can consume a significant fraction of the total energy. Specifically, the parasitic losses ranged from 28 to 80 pct of the total weld energy. The losses increased (and the corresponding IFW process efficiency decreased) as P increased (at constant I and E_o), I decreased (at constant P and E_o), and E_o (or ω_o) increased (at constant P and I). The results of this work thus provide guidelines for selecting process parameters which minimize energy losses and increase process efficiency during IFW.
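The energy bookkeeping described above can be sketched numerically: the total weld energy is the flywheel's initial kinetic energy, E_o = ½Iω_o², and the sample and machine shares come from integrating torque times angular velocity over the spin-down. The torque histories and spin-down profile below are synthetic, chosen only so that the energy balance closes exactly; they are not measurements from the trials.

```python
import numpy as np

def _trapz(y, x):
    # trapezoidal integration (spelled out to avoid version-dependent
    # numpy function names)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

def ifw_efficiency(inertia, omega0, t, omega, m_sample, m_machine):
    e_total = 0.5 * inertia * omega0 ** 2     # flywheel kinetic energy E_o
    e_sample = _trapz(m_sample * omega, t)    # heat into faying surfaces
    e_machine = _trapz(m_machine * omega, t)  # parasitic machine losses
    return e_sample / e_total, e_machine / e_total

# synthetic spin-down chosen so the balance closes:
# total retarding torque M = I * |d(omega)/dt| = 2.0 * 150 = 300 N*m
t = np.linspace(0.0, 2.0, 500)
omega0 = 300.0                                # rad/s
omega = omega0 * (1.0 - t / 2.0)              # linear spin-down to rest
m_sample = np.full_like(t, 240.0)             # N*m at the weld interface
m_machine = np.full_like(t, 60.0)             # N*m in the bearings
eff, loss = ifw_efficiency(2.0, omega0, t, omega, m_sample, m_machine)
```

With these made-up torques the sample receives 80 pct of E_o and the machine 20 pct; in the actual trials the measured split depended strongly on P, I, and E_o.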
Khan, Jenna; Lieberman, Joshua A; Lockwood, Christina M
2017-05-01
microRNAs (miRNAs) hold promise as biomarkers for a variety of disease processes and for determining cell differentiation. These short RNA species are robust, survive harsh treatment and storage conditions and may be extracted from blood and tissue. Pre-analytical variables are critical confounders in the analysis of miRNAs: we elucidate these and identify best practices for minimizing sample variation in blood and tissue specimens. Pre-analytical variables addressed include patient-intrinsic variation, time and temperature from sample collection to storage or processing, processing methods, contamination by cells and blood components, RNA extraction method, normalization, and storage time/conditions. For circulating miRNAs, hemolysis and blood cell contamination significantly affect profiles; samples should be processed within 2 h of collection; ethylene diamine tetraacetic acid (EDTA) is preferred while heparin should be avoided; samples should be "double spun" or filtered; room temperature or 4 °C storage for up to 24 h is preferred; miRNAs are stable for at least 1 year at -20 °C or -80 °C. For tissue-based analysis, warm ischemic time should be <1 h; cold ischemic time (4 °C) <24 h; common fixative used for all specimens; formalin fix up to 72 h prior to processing; enrich for cells of interest; validate candidate biomarkers with in situ visualization. Most importantly, all specimen types should have standard and common workflows with careful documentation of relevant pre-analytical variables.
NASA Astrophysics Data System (ADS)
Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad
2017-11-01
Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the application of modern technology and by the machining parameters. One application of modern technology is CNC machining, and one machining process that can be performed on a CNC machine is turning. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to choose machining parameters that minimize both the processing time and the environmental impact. This research developed a multi-objective optimization model to minimize processing time and environmental impact in the CNC turning process, yielding optimal values of the decision variables: cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of Eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
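For the processing-time objective itself, the standard machining-time relation for straight turning makes the role of the two decision variables explicit: time falls inversely with both cutting speed and feed rate (which is exactly what pushes them against the environmental-impact objective). The part dimensions and parameter values below are hypothetical, not the study's case data.

```python
import math

def turning_time_min(d_mm, l_mm, v_m_per_min, f_mm_per_rev):
    # Machining time for one straight turning pass:
    #   spindle speed n = 1000 * v / (pi * D)   [rev/min]
    #   time          t = L / (n * f)           [min]
    n = 1000.0 * v_m_per_min / (math.pi * d_mm)
    return l_mm / (n * f_mm_per_rev)

# hypothetical part: D = 50 mm diameter, L = 100 mm cut length
t_slow = turning_time_min(50, 100, 120, 0.10)  # conservative parameters
t_fast = turning_time_min(50, 100, 200, 0.25)  # aggressive parameters
```

A multi-objective solver then trades this time reduction against the higher energy and tool-wear burden that aggressive parameters incur.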
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2014 CFR
2014-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, because of possible failures in the devices and systems of an automatic rotor line, there is always a real probability that defective (incomplete) products enter the output process stream. Therefore, continuous sampling control of product completeness, based on statistical methods, remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A particular feature of continuous sampling control of multi-element product completeness during assembly is that the inspection is destructive, which excludes the possibility of returning component parts to the process stream after sampling control and reduces the actual productivity of the assembly equipment. Therefore, applying statistical procedures for continuous sampling control of multi-element product completeness during assembly on automatic rotor lines requires sampling plans that ensure a minimum control sample size. Comparing the limit of the average output defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 provides lower limit values for the average output defect level. The average sample size when using the ACSP-1 plan is also smaller than with the CSP-1 plan. Thus, applying statistical methods to the assembly quality management of multi-element products on automatic rotor lines, using the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and to ensure the required quality level of assembled products while minimizing sample size.
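A minimal simulation of a classical continuous sampling plan (Dodge's CSP-1, presumably the CSP-1 referenced above) illustrates how the clearance number and sampling fraction trade inspection effort against outgoing quality. All parameter values below are assumptions for illustration:

```python
import random

def csp1_simulate(p, i, f, n_units=200_000, seed=1):
    """Simulate Dodge's CSP-1 continuous sampling plan.

    p: incoming fraction defective; i: clearance number;
    f: sampling fraction during the sampling phase.
    Inspected defective units are removed (rectifying inspection).
    Returns (average fraction inspected, average outgoing quality)."""
    rng = random.Random(seed)
    screening, run = True, 0
    inspected = outgoing_defects = 0
    for _ in range(n_units):
        defective = rng.random() < p
        inspect = screening or rng.random() < f
        if inspect:
            inspected += 1
            if defective:
                screening, run = True, 0   # defect found: back to 100% inspection
            elif screening:
                run += 1
                if run >= i:
                    screening = False      # clearance reached: switch to sampling
        elif defective:
            outgoing_defects += 1          # uninspected defect passes through
    return inspected / n_units, outgoing_defects / n_units

afi, aoq = csp1_simulate(p=0.02, i=50, f=0.1)
print(f"average fraction inspected ~ {afi:.2f}, AOQ ~ {aoq:.4f}")
```

Sweeping `i` and `f` in such a simulation shows the trade-off the abstract discusses: tighter plans lower the average outgoing defect level at the cost of a larger average sample size.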
Freeze Technology for Nuclear Applications - 13590
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rostmark, Susanne C.; Knutsson, Sven; Lindberg, Maria
2013-07-01
Freezing of soil materials involves a number of coupled physical processes: freezing of pore water in a thermal gradient, cryogenic suction causing water migration, and ice formation expanding pores and inducing frost heave. Structural changes due to the increase of effective stress during freezing also take place. The overconsolidation gives a powerful dewatering/drying effect, and the freezing process causes separation of contaminants. Artificial ground freezing (AGF) is a well-established technique, first practiced in south Wales as early as 1862. AGF is mostly used to stabilize tunnels and excavations. During the last ten years, underwater applications of freeze technologies based on AGF have been explored in Sweden. The technology can be, and has been, used in many different steps of a remediation action: Freeze Sampling, where undisturbed samples are removed from both soft and hard sediment/sludge; Freeze Dredging, retrieval of sediment with good precision and minimal redistribution; and Freeze Drying, volume reduction of contaminated sludge/sediment. Applying these technologies in a nuclear or radioactive environment provides several advantages. Sampling by freezing offers, for example, an undisturbed sample taken at a specified depth; salvaging objects by freezing and removal of sludges are other applications of this technology, which is novel for the nuclear industry. (authors)
Luiten, Claire M; Steenhuis, Ingrid Hm; Eyles, Helen; Ni Mhurchu, Cliona; Waterlander, Wilma E
2016-02-01
To examine the availability of packaged food products in New Zealand supermarkets by level of industrial processing, nutrient profiling score (NPSC), price (energy, unit and serving costs) and brand variety. Secondary analysis of cross-sectional survey data on packaged supermarket food and non-alcoholic beverages. Products were classified according to level of industrial processing (minimally, culinary and ultra-processed) and their NPSC. Packaged foods available in four major supermarkets in Auckland, New Zealand. Packaged supermarket food products for the years 2011 and 2013. The majority (84% in 2011 and 83% in 2013) of packaged foods were classified as ultra-processed. A significant positive association was found between the level of industrial processing and NPSC, i.e., ultra-processed foods had a worse nutrient profile (NPSC=11.63) than culinary processed foods (NPSC=7.95), which in turn had a worse nutrient profile than minimally processed foods (NPSC=3.27), P<0.001. No clear associations were observed between the three price measures and level of processing. The study observed many variations of virtually the same product. The ten largest food manufacturers produced 35% of all packaged foods available. In New Zealand supermarkets, ultra-processed foods comprise the largest proportion of packaged foods and are less healthy than less processed foods. The lack of significant price difference between ultra- and less processed foods suggests ultra-processed foods might provide time-poor consumers with more value for money. These findings highlight the need to improve the supermarket food supply by reducing numbers of ultra-processed foods and by reformulating products to improve their nutritional profile.
Zhou, Shiyue; Tello, Nadia; Harvey, Alex; Boyes, Barry; Orlando, Ron; Mechref, Yehia
2016-06-01
Glycans have numerous functions in various biological processes and participate in the progression of diseases. Reliable quantitative glycomic profiling techniques could contribute to the understanding of the biological functions of glycans, and lead to the discovery of potential glycan biomarkers for diseases. Although LC-MS is a powerful analytical tool for quantitative glycomics, variations in ionization efficiency and MS intensity bias affect quantitation reliability. Internal standards can be utilized for glycomic quantitation by MS-based methods to reduce variability. In this study, we used a stable isotope labeled IgG2b monoclonal antibody, iGlycoMab, as an internal standard to reduce potential errors and variability due to sample digestion, derivatization, and fluctuation of nanoESI efficiency in the LC-MS analysis of permethylated N-glycans released from model glycoproteins, human blood serum, and a breast cancer cell line. We observed an unanticipated degradation of isotope labeled glycans, tracked the source of such degradation, and optimized a sample preparation protocol to minimize degradation of the internal standard glycans. All results indicated the effectiveness of using iGlycoMab to minimize errors originating from sample handling and instruments. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Reversal electron attachment ionizer for detection of trace species
NASA Technical Reports Server (NTRS)
Bernius, Mark T. (Inventor); Chutjian, Ara (Inventor)
1990-01-01
An in-line reversal electron, high-current ionizer capable of focusing a beam of electrons to a reversal region and executing a reversal of said electrons, such that the electrons possess zero kinetic energy at the point of reversal, may be used to produce both negative and positive ions. A sample gas is introduced at the point of electron reversal for low energy electron-(sample gas) molecule attachment with high efficiency. The attachment process produces negative ions from the sample gas, which includes species present in trace (minute) amounts. These ions are extracted efficiently and directed to a mass analyzer where they may be detected and identified. The generation and detection of positive ions is accomplished in a similar fashion with minimal adjustment to potentials applied to the apparatus.
Chen, Wei-Qiang; Obermayr, Philipp; Černigoj, Urh; Vidič, Jana; Panić-Janković, Tanta; Mitulović, Goran
2017-11-01
Classical proteomics approaches involve enzymatic hydrolysis of proteins (either separated by polyacrylamide gels or in solution) followed by peptide identification using LC-MS/MS analysis. This method normally requires more than 16 h to complete. In the case of clinical analysis, it is of the utmost importance to provide fast and reproducible analysis with minimal manual sample handling. Herein we report the method development for online protein digestion on immobilized monolithic enzymatic reactors (IMER) to accelerate protein digestion, reduce manual sample handling, and provide reproducibility to the digestion process in the clinical laboratory. An integrated online digestion and separation method using a monolithic immobilized enzymatic reactor was developed and applied to the digestion and separation of in-vitro-fertilization media. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sabatini, Francesca; Lluveras-Tenorio, Anna; Degano, Ilaria; Kuckova, Stepanka; Krizova, Iva; Colombini, Maria Perla
2016-11-01
This study deals with the identification of anthraquinoid molecular markers in standard dyes, reference lakes, and paint model systems using a micro-invasive and nondestructive technique such as matrix-assisted laser desorption/ionization time-of-flight-mass spectrometry (MALDI-ToF-MS). Red anthraquinoid lakes, such as madder lake, carmine lake, and Indian lac, have been the most widely used for painting purposes since ancient times. From an analytical point of view, identifying lakes in paint samples is challenging, and developing methods that maximize the information achievable while minimizing the amount of sample needed is of paramount importance. The employed method was tested on less than 0.5 mg of reference samples and required a minimal sample preparation, entailing a hydrofluoric acid extraction. The method is fast and versatile because of the possibility to re-analyze the same sample (once it has been spotted on the steel plate), testing both positive and negative modes in a few minutes. The MALDI mass spectra collected in the two analysis modes were studied and compared with LDI and simulated mass spectra in order to highlight the peculiar behavior of the anthraquinones in the MALDI process. Both ionization modes were assessed for each species. The effect of the different paint binders on dye identification was also evaluated through the analyses of paint model systems. In the end, the method was successful in detecting madder lake in archeological samples from Greek wall paintings and on an Italian funerary clay vessel, demonstrating its capability to identify dyes in small amounts of highly degraded samples.
Guo, Jiin-Huarng; Luh, Wei-Ming
2009-05-01
When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
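The cost/power trade-off above can be illustrated with a simple grid search over the two group sizes. This sketch uses a plain two-sample z-test under a normal approximation rather than Yuen's trimmed-means test, and the effect size, standard deviations, and per-subject costs are made-up illustrative values, not the paper's formulas:

```python
import math

def power_two_means(n1, n2, delta, sd1, sd2):
    """Approximate power of a two-sided two-sample z-test for a mean
    difference delta with heterogeneous variances (alpha = 0.05)."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    z_a = 1.959963984540054          # z_{1 - alpha/2} for alpha = 0.05
    z = delta / se
    Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return Phi(z - z_a) + Phi(-z - z_a)

def min_cost_allocation(delta, sd1, sd2, c1, c2, target_power=0.8):
    """Grid-search the (n1, n2) pair reaching target power at minimum total cost."""
    best = None
    for n1 in range(5, 500):
        for n2 in range(5, 500):
            if power_two_means(n1, n2, delta, sd1, sd2) >= target_power:
                cost = c1 * n1 + c2 * n2
                if best is None or cost < best[0]:
                    best = (cost, n1, n2)
                break   # for this n1, a larger n2 only adds cost
    return best

cost, n1, n2 = min_cost_allocation(delta=0.5, sd1=1.0, sd2=2.0, c1=1.0, c2=3.0)
print(f"n1={n1}, n2={n2}, total cost={cost:.0f}")
```

The same search, with the objective and constraint swapped, maximizes power for a given total cost, mirroring the two directions of the optimization discussed in the abstract.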
Accuracy in the estimation of quantitative minimal area from the diversity/area curve.
Vives, Sergi; Salicrú, Miquel
2005-05-01
The problem of representativity is fundamental in ecological studies. A qualitative minimal area that gives a good representation of the species pool [C.M. Bouderesque, Methodes d'etude qualitative et quantitative du benthos (en particulier du phytobenthos), Tethys 3(1) (1971) 79] can be discerned from a quantitative minimal area which reflects the structural complexity of the community [F.X. Niell, Sobre la biologia de Ascophyllum nosodum (L.) Le Jolis en Galicia, Invest. Pesq. 43 (1979) 501]. This suggests that the populational diversity can be considered as the value of the horizontal asymptote of the sample diversity/biomass curve [F.X. Niell, Les applications de l'index de Shannon a l'etude de la vegetation interdidale, Soc. Phycol. Fr. Bull. 19 (1974) 238]. In this study we develop an expression to determine minimal areas and use it to obtain certain information about the community structure based on diversity/area curve graphs. This expression is based on the functional relationship between the expected value of the diversity and the sample size used to estimate it. In order to establish the quality of the estimation process, we obtained the confidence intervals as a particularization of the (h,phi)-entropies proposed in [M. Salicru, M.L. Menendez, D. Morales, L. Pardo, Asymptotic distribution of (h,phi)-entropies, Commun. Stat. (Theory Methods) 22 (7) (1993) 2015]. As an example used to demonstrate the possibilities of this method, and only for illustrative purposes, data from a study on the rocky intertidal seaweed populations in the Ria of Vigo (N.W. Spain) are analyzed [F.X. Niell, Estudios sobre la estructura, dinamica y produccion del Fitobentos intermareal (Facies rocosa) de la Ria de Vigo. Ph.D. Mem. University of Barcelona, Barcelona, 1979].
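A rough numerical sketch of the idea: fit a saturating model to a diversity/area curve and read off the area at which diversity approaches its horizontal asymptote. The data, the hyperbolic model, and the 95%-of-asymptote threshold below are illustrative assumptions, not the expression derived in the paper:

```python
import numpy as np

# Illustrative diversity/area data (synthetic, not from the study):
area = np.array([1, 2, 4, 8, 16, 32, 64], float)
H = np.array([1.1, 1.6, 2.0, 2.3, 2.45, 2.52, 2.55])  # Shannon diversity

# Fit a saturating model H(A) = H_inf * A / (K + A) by coarse grid search.
H_inf_grid = np.linspace(2.0, 3.5, 151)
K_grid = np.linspace(0.1, 10.0, 200)
best = (np.inf, None, None)
for h in H_inf_grid:
    pred = h * area / (K_grid[:, None] + area)   # predictions for every K at once
    sse = ((pred - H) ** 2).sum(axis=1)
    k_idx = sse.argmin()
    if sse[k_idx] < best[0]:
        best = (sse[k_idx], h, K_grid[k_idx])
_, H_inf, K = best

# Quantitative minimal area: smallest area whose expected diversity reaches
# 95% of the asymptote (the 95% threshold is an assumption for illustration).
# Solving H_inf * A / (K + A) = 0.95 * H_inf gives A = 0.95 * K / 0.05.
A_min = 0.95 * K / (1 - 0.95)
print(f"H_inf ~ {H_inf:.2f}, minimal area ~ {A_min:.1f}")
```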
NASA Technical Reports Server (NTRS)
Choi, H. J.; Su, Y. T.
1986-01-01
The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.
Romero-Cortes, Teresa; Salgado-Cervantes, Marco Antonio; García-Alamilla, Pedro; García-Alvarado, Miguel Angel; Rodríguez-Jimenes, Guadalupe del C; Hidalgo-Morales, Madeleine; Robles-Olvera, Víctor
2013-08-15
During traditional cocoa processing, the end of fermentation is empirically determined by the workers; consequently, a high variability on the quality of fermented cocoa beans is observed. Some physicochemical properties (such as fermentation index) have been used to measure the degree of fermentation and changes in quality, but only after the fermentation process has concluded, using dried cocoa beans. This would suggest that it is necessary to establish a relationship between the chemical changes inside the cocoa bean and the fermentation conditions during the fermentation in order to standardize the process. Cocoa beans were traditionally fermented inside wooden boxes, sampled every 24 h and analyzed to evaluate fermentation changes in complete bean, cotyledon and dried beans. The value of the fermentation index suggested as the minimal adequate (≥1) was observed at 72 h in all bean parts analyzed. At this time, values of pH, spectral absorption, total protein hydrolysis and vicilin-class globulins of fermented beans suggested that they were well fermented. Since no difference was found between the types of samples, the pH value could be used as a first indicator of the end of the fermentation and confirmed by evaluation of the fermentation index using undried samples, during the process. © 2013 Society of Chemical Industry.
McBirney, Samantha E; Trinh, Kristy; Wong-Beringer, Annie; Armani, Andrea M
2016-10-01
Optical density (OD) measurements are the standard approach used in microbiology for characterizing bacteria concentrations in culture media. OD is based on measuring the optical absorbance of a sample at a single wavelength, and any error will propagate through all calculations, leading to reproducibility issues. Here, we use the conventional OD technique to measure the growth rates of two different species of bacteria, Pseudomonas aeruginosa and Staphylococcus aureus. The same samples are also analyzed over the entire UV-Vis wavelength spectrum, allowing a distinctly different strategy for data analysis to be performed. Specifically, instead of only analyzing a single wavelength, a multi-wavelength normalization process is implemented. When the OD method is used, the detected signal does not follow the log growth curve. In contrast, the multi-wavelength normalization process minimizes the impact of bacteria byproducts and environmental noise on the signal, thereby accurately quantifying growth rates with high fidelity at low concentrations.
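The idea of replacing a single-wavelength OD reading with a statistic computed over the whole spectrum can be sketched as follows. The toy spectra, the baseline region, and the averaging band are assumptions for illustration; the authors' exact normalization procedure is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.arange(350, 801, 5)           # UV-Vis range, nm

def fake_spectrum(conc, noise=0.02, baseline=0.05):
    """Toy absorbance spectrum: a scattering-like 1/lambda^4 shape scaled by
    concentration, plus a random flat baseline and per-point noise
    (purely illustrative, not a physical model)."""
    shape = (550.0 / wavelengths) ** 4
    return conc * shape + baseline * rng.random() + noise * rng.standard_normal(wavelengths.size)

spec_a = fake_spectrum(0.10)
spec_b = fake_spectrum(0.10)                   # same concentration, different baseline

# Single-wavelength OD at 600 nm: one noisy point, sensitive to baseline drift.
od_a = spec_a[wavelengths == 600][0]
od_b = spec_b[wavelengths == 600][0]

# Multi-wavelength approach: estimate the baseline from a region assumed to
# carry little signal, then average over a band, spreading noise across
# many wavelengths instead of relying on a single reading.
def band_signal(spec, lo=500, hi=700):
    baseline = spec[wavelengths > 750].mean()
    band = (wavelengths >= lo) & (wavelengths <= hi)
    return (spec - baseline)[band].mean()

print(abs(od_a - od_b), abs(band_signal(spec_a) - band_signal(spec_b)))
```

On average, the band statistic varies far less between replicate spectra than the single 600 nm reading does, which is the motivation for multi-wavelength normalization at low concentrations.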
Thermophysical Properties Measurements of Zr62Cu20Al10Ni8
NASA Technical Reports Server (NTRS)
Bradshaw, Richard C.; Waren, Mary; Rogers, Jan R.; Rathz, Thomas J.; Gangopadhyay, Anup K.; Kelton, Ken F.; Hyers, Robert W.
2006-01-01
Thermophysical property studies performed at high temperature can prove challenging because of reactivity problems brought on by the elevated temperatures. Contaminants from measuring devices and container walls can cause changes in properties. To prevent this, containerless processing techniques can be employed to isolate a sample during study. A common method used for this is levitation. Typical levitation methods used for containerless processing are aerodynamically, electromagnetically, and electrostatically based. All levitation methods reduce heterogeneous nucleation sites, which in turn provides access to metastable undercooled phases. In particular, electrostatic levitation is appealing because sample motion and stirring are minimized; by combining it with optically based non-contact measuring techniques, many thermophysical properties can be measured. Applying some of these techniques, surface tension, viscosity, and density have been measured for the glass-forming alloy Zr62Cu20Al10Ni8 and will be presented with a brief overview of the non-contact measuring method used.
Warth, Arne; Muley, Thomas; Meister, Michael; Weichert, Wilko
2015-01-01
Preanalytic sampling techniques and preparation of tissue specimens strongly influence analytical results in lung tissue diagnostics on both the morphological and the molecular level. However, in contrast to analytics, where tremendous achievements in the last decade have led to a whole new portfolio of test methods, developments in preanalytics have been minimal. This is particularly unfortunate in lung cancer, where usually only small amounts of tissue are at hand and optimization of all processing steps is mandatory in order to increase the diagnostic yield. In the following, we provide a comprehensive overview of some aspects of preanalytics in lung cancer, from the method of sampling through tissue processing to its impact on analytical test results. We specifically discuss the role of preanalytics in novel technologies like next-generation sequencing and in state-of-the-art cytology preparations. In addition, we point out specific problems in preanalytics which hamper further developments in the field of lung tissue diagnostics.
Rapid DNA analysis for automated processing and interpretation of low DNA content samples.
Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F
2016-01-01
Casework samples with low DNA content subjected to short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, and sophisticated equipment; it also requires transport time and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types.
The technology broadens the range of sample types that can be processed and minimizes the time between sample collection, sample processing and analysis, and generation of actionable intelligence. The fully integrated Expert System is capable of interpreting a wide range of sample types and input DNA quantities, allowing samples to be processed and interpreted without a technical operator.
Crossed hot-wire data acquisition and reduction system
NASA Technical Reports Server (NTRS)
Westphal, R. V.; Mehta, R. D.
1984-01-01
A system for rapid computerized calibration, acquisition, and processing of data from a crossed hot-wire anemometer is described. Advantages of the system are its speed, minimal use of analog electronics, and improved accuracy of the resulting data. The data reduction provides two components of mean velocity and turbulence statistics up to third order. Details of the hardware, calibration procedures, response equations, software, and sample results from measurements in a turbulent plane mixing layer are presented.
ERIC Educational Resources Information Center
National Academy of Education, Stanford, CA.
This report evaluates the conduct, validity, and uses of the National Assessment of Educational Progress (NAEP) Trial State Assessment (TSA). The report addresses such pressing problems as how participation in NAEP can be maintained and appropriate samples can be achieved; how errors can be minimized in the complex process of scaling and analyzing…
Martínez-Ceron, María C; Giudicessi, Silvana L; Marani, Mariela M; Albericio, Fernando; Cascone, Osvaldo; Erra-Balsells, Rosa; Camperi, Silvia A
2010-05-15
Optimization of bead analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) after the screening of one-bead-one-peptide combinatorial libraries was achieved, involving fine-tuning of the whole process. Guanidine was replaced by acetonitrile (MeCN)/acetic acid (AcOH)/water (H(2)O), improving matrix crystallization. Peptide-bead cleavage with NH(4)OH was cheaper and safer than, yet as efficient as, NH(3)/tetrahydrofuran (THF). Peptide elution in microtubes instead of placing the beads on the sample plate yielded more sample aliquots. Sample preparation by successive deposition of dry layers was better than the dried-droplet method. Among the matrices analyzed, alpha-cyano-4-hydroxycinnamic acid gave the best peptide ion yield. Cluster formation was minimized by the addition of additives to the matrix. Copyright 2010 Elsevier Inc. All rights reserved.
5 CFR 582.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2012 CFR
2012-01-01
... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...
Silva, Anderson Clayton da; Santos, Priscila Dayane de Freitas; Palazzi, Nicole Campezato; Leimann, Fernanda Vitória; Fuchs, Renata Hernandez Barros; Bracht, Lívia; Gonçalves, Odinei Hess
2017-05-24
Nontoxic preservatives are in demand by the food industry due to consumers' concern about synthetic preservatives, especially in minimally processed food. The antimicrobial activity of curcumin, a natural phenolic compound, has been extensively investigated, but hydrophobicity is an issue when applying curcumin to foodstuffs. The objective of this work was to evaluate curcumin microcrystals as an antimicrobial agent in minimally processed carrots. The antimicrobial activity of curcumin microcrystals was evaluated in vitro against Gram-positive (Bacillus cereus and Staphylococcus aureus) and Gram-negative (Escherichia coli and Pseudomonas aeruginosa) microorganisms, showing a statistically significant (p < 0.05) decrease in the minimum inhibitory concentration compared to in natura, pristine curcumin. Curcumin microcrystals were effective in inhibiting psychrotrophic and mesophile microorganisms in minimally processed carrots. Sensory analyses were carried out, showing no significant difference (p < 0.05) between curcumin microcrystal-treated carrots and non-treated carrots in triangular and tetrahedral discriminative tests. Sensory tests also showed that curcumin microcrystals could be added as a natural preservative in minimally processed carrots without causing noticeable differences that could be detected by the consumer. From a statistical point of view, the analyses of the minimally processed carrots demonstrated that curcumin microcrystals are a suitable natural compound for inhibiting the natural microbiota of carrots.
Psychoacoustic processing of test signals
NASA Astrophysics Data System (ADS)
Kadlec, Frantisek
2003-10-01
For the quantitative evaluation of electroacoustic system properties and for psychoacoustic testing it is possible to utilize harmonic signals with fixed frequency, sweeping signals, random signals, or their combination. This contribution deals with the design of various test signals with emphasis on audible perception. During the digital generation of signals, some additional undesirable frequency components and noise are produced, which are dependent on signal amplitude and sampling frequency. A mathematical analysis describes the origin of this distortion. By proper selection of signal frequency and amplitude it is possible to minimize those undesirable components. An additional step is to minimize the audible perception of this signal distortion by the application of additional noise (dither). For signals intended for listening tests, a dither with a triangular or Gaussian probability density function was found to be most effective. Signals modified this way may be further improved by the application of noise shaping, which transposes those undesirable products into frequency regions where they are perceived less, according to psychoacoustic principles. The efficiency of the individual processing steps was confirmed both by measurements and by listening tests. [Work supported by the Czech Science Foundation.]
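The triangular-PDF dither step mentioned above can be sketched as follows: add triangular-PDF noise of about one LSB peak-to-peak before rounding, which decorrelates the quantization error from the signal at the cost of a small amount of extra broadband noise. The test tone, bit depth, and amplitude are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 48_000                               # sampling rate, Hz
t = np.arange(fs) / fs
x = 0.5 * np.sin(2 * np.pi * 1000 * t)    # 1 kHz test tone, one second

bits = 8
q = 2.0 ** (1 - bits)                     # quantization step for signals in [-1, 1)

def quantize(sig, dither=None):
    d = 0.0
    if dither == "tpdf":
        # Triangular-PDF dither of +/- 1 LSB: sum of two uniform(+-q/2) samples.
        d = rng.uniform(-q / 2, q / 2, sig.size) + rng.uniform(-q / 2, q / 2, sig.size)
    return np.round((sig + d) / q) * q

plain = quantize(x)
dithered = quantize(x, "tpdf")

# Without dither, the quantization error is correlated with the signal
# (audible harmonic distortion); with TPDF dither it becomes benign
# signal-independent noise, at a slightly higher total error power.
err_plain = plain - x
err_dith = dithered - x
print(err_plain.std(), err_dith.std())
```

Noise shaping, the further refinement the abstract mentions, would feed this error back through a filter so the added noise power is pushed into less audible frequency regions.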
Hill, LaBarron K.; Williams, DeWayne P.; Thayer, Julian F.
2016-01-01
Human faces automatically attract visual attention and this process appears to be guided by social group memberships. In two experiments, we examined how social groups guide selective attention toward in-group and out-group faces. Black and White participants detected a target letter among letter strings superimposed on faces (Experiment 1). White participants were less accurate on trials with racial out-group (Black) compared to in-group (White) distractor faces. Likewise, Black participants were less accurate on trials with racial out-group (White) compared to in-group (Black) distractor faces. However, this pattern of out-group bias was only evident under high perceptual load—when the task was visually difficult. To examine the malleability of this pattern of racial bias, a separate sample of participants were assigned to mixed-race minimal groups (Experiment 2). Participants assigned to groups were less accurate on trials with their minimal in-group members compared to minimal out-group distractor faces, regardless of race. Again, this pattern of out-group bias was only evident under high perceptual load. Taken together, these results suggest that social identity guides selective attention toward motivationally relevant social groups—shifting from out-group bias in the domain of race to in-group bias in the domain of minimal groups—when perceptual resources are scarce. PMID:27556646
Gómez-López, Vicente M; Ragaert, Peter; Ryckeboer, Jaak; Jeyachchandran, Visvalingam; Debevere, Johan; Devlieghere, Frank
2007-06-10
Minimally processed vegetables (MPV) have a short shelf-life. Neutral electrolysed oxidising water (NEW) is a novel decontamination method. The objective of this study was to test the potential of NEW to extend the shelf-life of an MPV, namely shredded cabbage. Samples of shredded cabbage were immersed in NEW containing 40 mg/L of free chlorine or tap water (control) for up to 5 min, and then stored under equilibrium modified atmosphere at 4 °C and 7 °C. Proliferation of aerobic mesophilic bacteria, psychrotrophic bacteria, lactic acid bacteria and yeasts was studied during the shelf-life. The pH and sensory quality of the samples, as well as the O2 and CO2 composition of the bag headspace, were also evaluated. Of the microbial groups, only psychrotrophic counts decreased significantly (P<0.05) due to the effect of NEW, but the counts in treated samples and controls were similar after 3 days of storage at 4 °C and 7 °C. Packaging configurations kept the O2 concentration around 5% and prevented CO2 accumulation. pH increased from 6.1-6.2 to 6.4 during the shelf-life. No microbial parameter reached unacceptable counts after 14 days at 4 °C or 8 days of storage at 7 °C. The shelf-life of controls stored at 4 °C was limited to 9 days by overall visual quality (OVQ), while samples treated with NEW remained acceptable during the 14 days of the experiment. The shelf-life of controls stored at 7 °C was limited to 6 days by OVQ and browning, while that of samples treated with NEW was limited to 9 days by OVQ, browning and dryness. According to these results, a shelf-life extension of at least 5 days and 3 days in samples stored respectively at 4 °C and 7 °C can be achieved by treating shredded cabbage with NEW. NEW seems to be a promising method to prolong the shelf-life of MPV.
Minimal T-wave representation and its use in the assessment of drug arrhythmogenicity.
Shakibfar, Saeed; Graff, Claus; Kanters, Jørgen K; Nielsen, Jimmi; Schmidt, Samuel; Struijk, Johannes J
2017-05-01
Recently, numerous models and techniques have been developed for analyzing and extracting features from the T wave which could be used as biomarkers for drug-induced abnormalities. The majority of these techniques and algorithms use features that determine readily apparent characteristics of the T wave, such as duration, area, amplitude, and slopes. In the present work the T wave was down-sampled to a minimal rate, such that a good reconstruction was still possible. The entire T wave was then used as a feature vector to assess drug-induced repolarization effects. The ability of the samples or combinations of samples obtained from the minimal T-wave representation to correctly classify a group of subjects before and after receiving d,l-sotalol 160 mg and 320 mg was evaluated using a linear discriminant analysis (LDA). The results showed that a combination of eight samples from the minimal T-wave representation can be used to distinguish normal from abnormal repolarization significantly better than the heart rate-corrected QT interval (QTc). It was further indicated that the interval from the peak of the T wave to the end of the T wave (Tpe) becomes relatively shorter after IKr inhibition by d,l-sotalol and that the most pronounced repolarization changes were present in the ascending segment of the minimal T-wave representation. The minimal T-wave representation can potentially be used as a new tool to distinguish normal from abnormal repolarization in drug safety studies. © 2016 Wiley Periodicals, Inc.
The Gulliver sample return mission to Deimos
NASA Astrophysics Data System (ADS)
Britt, D. T.; Robinson, M.; Gulliver Team
The Martian moon Deimos presents a unique opportunity for a sample return mission. Deimos is spectrally analogous to type D asteroids, which are thought to be composed of highly primitive carbonaceous material that originated in the outer asteroid belt. It is also in orbit around Mars and has been accumulating material ejected from the Martian surface ever since the earliest periods of Martian history, over 4.4 Gyr ago. There are a number of factors that make sample return from Deimos extremely attractive. It is Better: Deimos is a repository for two kinds of extremely significant and scientifically exciting ancient samples: (1) Primitive spectral D-type material that may have accreted in the outer asteroid belt and Trojan swarm. This material samples the composition of the solar nebula well outside the zone of the terrestrial planets and provides a direct sample of primitive material so common past 3 AU but so uncommon in the meteorite collection. (2) Ancient Mars, which could include the full range of Martian crustal and upper mantle material from the early differentiation and crustal-forming epoch as well as samples from the era of high volatile flux, thick atmosphere, and possible surface water. The Martian material on Deimos would be dominated by ejecta from the ancient crust of Mars, delivered during the Noachian Period of basin-forming impacts and heavy bombardment. It is Closer: Compared to other primitive D-type asteroids, Deimos is by far the most accessible. Because of its orbit around Mars, Deimos is far closer than any other D asteroid. It is Safer: Deimos is also by far the safest small body for sample return yet imaged. It is an order of magnitude less rocky than Eros, and the NEAR-Shoemaker mission succeeded in landing on Eros with a spacecraft not designed for landing and proximity maneuvering. Because of Viking imagery we already know a great deal about the surface roughness of Deimos.
It is known to be very smooth and have moderate topography and gravitational slopes. It is Easier: Deimos is farther from Mars and smaller than Phobos. This location minimizes the delta-V penalties from entering the Martian gravity well; minimizes the energy requirements for sampling maneuvers; and minimizes Martian tidal effects on S/C operations. After initial processing these samples will be made available as soon as possible to the international cosmochemistry community for detailed analysis. The mission management team includes Lockheed Martin Astronautics (flight system, I&T) and JPL (payload, mission ops, and mission management).
Studies on the electrical transport properties of carbon nanotube composites
NASA Astrophysics Data System (ADS)
Tarlton, Taylor Warren
This work presents a probabilistic approach to model the electrical transport properties of carbon nanotube composite materials. A pseudo-random generation method is presented with the ability to generate 3-D samples with a variety of different configurations. Periodic boundary conditions are employed in the directions perpendicular to transport to minimize edge effects. Simulations produce values for drift velocity, carrier mobility, and conductivity in samples that account for geometrical features resembling those found in the lab. All results show an excellent agreement to the well-known power law characteristic of percolation processes, which is used to compare across simulations. The effect of sample morphology, such as nanotube waviness and aspect ratio, and of agglomeration on charge transport within CNT composites is evaluated within this model. This study determines the optimum simulation box sizes that minimize size effects without rendering the simulation unaffordable. In addition, physical parameters within the model are characterized using various density functional theory calculations within the Atomistix Toolkit. Finite element calculations have been performed to solve Maxwell's Equations for static fields in the COMSOL Multiphysics software package in order to better understand the behavior of the electric field within the composite material and to further improve the model within this work. The types of composites studied within this work are often studied for use in electromagnetic shielding, electrostatic reduction, or even monitoring structural changes due to compression, stretching, or damage through their effect on the conductivity. However, experimental work has shown that, depending on the processing technique, the electrical properties of specific composites can vary widely. Therefore, the goal of this work has been to form a model with the ability to accurately predict the conductive properties as a function of the physical characteristics of the composite material in order to aid in the design of these composites.
Sampling of tar from sewage sludge gasification using solid phase adsorption.
Ortiz González, Isabel; Pérez Pastor, Rosa Ma; Sánchez Hervás, José Ma
2012-06-01
Sewage sludge is a residue from wastewater treatment plants which is considered to be harmful to the environment and all living organisms. Gasification technology is a potential source of renewable energy that converts the sewage sludge into gases that can be used to generate energy or as raw material in chemical synthesis processes. However, tar produced during gasification is one of the main obstacles to implementing the technology. Tar can condense on pipes and filters and may cause blockage and corrosion in engines and turbines. Consequently, to minimize the tar content in syngas, the ability to quantify tar levels in process streams is essential. The aim of this work was to develop an accurate tar sampling and analysis methodology using solid phase adsorption (SPA) in order to apply it to tar sampling from sewage sludge gasification gases. Four types of commercial SPA cartridges were tested to determine the most suitable one for the sampling of individual tar compounds in such streams. Afterwards, the capacity, breakthrough volume and sample stability of the Supelclean™ ENVI-Carb/NH2, identified as the most suitable, were determined. Essentially no significant influence from water, H2S or NH3 was detected. The cartridge was then used on real samples, and comparable results were obtained with the present and traditional methods.
Collecting Quality Infrared Spectra from Microscopic Samples of Suspicious Powders in a Sealed Cell.
Kammrath, Brooke W; Leary, Pauline E; Reffner, John A
2017-03-01
The infrared (IR) microspectroscopical analysis of samples within a sealed cell containing barium fluoride is a critical need when identifying toxic agents or suspicious powders of unidentified composition. The dispersive nature of barium fluoride is well understood, and experimental conditions can be easily adjusted during reflection-absorption measurements to account for differences in focus between the visible and IR regions of the spectrum. In most instances, a viable spectrum can be collected using the sealed cell regardless of whether visible or IR focus is optimized. However, when IR focus is optimized, it is possible to collect useful data from even smaller samples. This matters when only a minimal sample is available for analysis or when minimizing the risk of sample exposure is a priority. While the use of barium fluoride introduces dispersion effects that are unavoidable, it is possible to adjust instrument settings when collecting IR spectra in the reflection-absorption mode to compensate for dispersion and minimize the impact on the quality of the sample spectrum.
Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun
2012-11-30
The information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, finding links between samples is more important there than the other uses of methamphetamine profiling. Many Asian countries including Japan and South Korea have been using the method developed by the National Research Institute of Police Science of Japan. The method used a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collected the results into an Excel file and then corrected the retention time shifts and response deviations generated during sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
The developed VBA modules could process raw data of GC-FID very quickly and easily. Also, they could assess the similarity between samples by peak pattern recognition using whole peaks without spectral identification of each peak that appeared in the chromatogram. The results collectively suggest that the modules would be useful tools to augment similarity assessment between seized methamphetamine samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
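The similarity step in the abstract above can be sketched in a few lines (an illustrative sketch, not the authors' VBA code; the peak vectors below are invented for the example): each sample is reduced to a retention-time-aligned vector of impurity-peak responses, and pairs are compared by Pearson correlation.

```python
import numpy as np

def profile_similarity(peaks_a, peaks_b):
    # Pearson correlation between two retention-time-aligned vectors of
    # impurity-peak responses; values near 1 suggest a common origin.
    a = np.asarray(peaks_a, dtype=float)
    b = np.asarray(peaks_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Illustrative peak-response vectors (made up for the example):
same_case = profile_similarity([120, 40, 5, 310], [118, 42, 6, 305])
different = profile_similarity([120, 40, 5, 310], [10, 250, 90, 15])
```

With this criterion, pairs sharing a common origin score near 1, consistent with the >0.99 coefficient reported above for samples from the same case.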
Cynicism about organizational change: an attribution process perspective.
Wanous, John P; Reichers, Arnon E; Austin, James T
2004-06-01
The underlying attribution process for cynicism about organizational change is examined with six samples from four different organizations. The samples include hourly (n=777) and salaried employees (n=155) from a manufacturing plant, faculty (n=293) and staff (n=302) from a large university, managers from a utility company (n=97), and young managers (n=65) from various organizations who were attending an evening MBA program. This form of cynicism is defined as the combination of Pessimism (about future change efforts) and a Dispositional attribution (why past efforts to change failed). Three analyses support this definition. First, an exploratory factor analysis (from the largest sample) produced two factors, one composed of Pessimism and the Dispositional attribution items and the second of the Situational attribution items. Second, the average correlation (across several samples) between Pessimism and Dispositional attribution is much higher (.59) than the average correlation between Pessimism and Situational attribution (.17). Third, scores on two different trait-based measures of cynicism correlate highest with the Dispositional attribution component of cynicism. A practical implication is that organizational leaders may minimize cynicism by managing both employees' pessimism about organizational change and employees' attributions about it. Specific suggestions for how this might be done are offered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, T.; Jones, H.; Wong, K.
The Marshall Islands Environmental Characterization and Dose Assessment Program has recently implemented waste minimization measures to reduce low level radioactive (LLW) and low level mixed (LLWMIXED) waste streams at the Lawrence Livermore National Laboratory (LLNL). Several thousand environmental samples are collected annually from former US nuclear test sites in the Marshall Islands, and returned to LLNL for processing and radiometric analysis. In the past, we analyzed coconut milk directly by gamma-spectrometry after adding formaldehyde (as preservative) and sealing the fluid in metal cans. This procedure was not only tedious and time consuming but generated storage and waste disposal problems. We have now reduced the number of coconut milk samples required for analysis from 1500 per year to approximately 250, and developed a new analytical procedure which essentially eliminates the associated mixed radioactive waste stream. Coconut milk samples are mixed with a few grams of ammonium molybdophosphate (AMP), which quantitatively scavenges the target radionuclide cesium-137 in an ion-exchange process. The AMP is then separated from the mixture and sealed in a plastic container. The bulk sample material can be disposed of as a non-radioactive, non-hazardous waste, and the relatively small amount of AMP conveniently counted by gamma-spectrometry, packaged and stored for future use.
Tankiewicz, Maciej; Fenik, Jolanta; Biziuk, Marek
2011-10-30
The intensification of agriculture means that increasing amounts of toxic organic and inorganic compounds are entering the environment. The pesticides generally applied nowadays are regarded as some of the most dangerous contaminants of the environment. Their presence in the environment, especially in water, is hazardous because it makes human beings more susceptible to disease. For these reasons, it is essential to monitor pesticide residues in the environment with the aid of all accessible analytical methods. The analysis of samples for the presence of pesticides is problematic because of the laborious and time-consuming operations involved in preparing samples for analysis, which themselves may be a source of additional contamination and errors. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solventless and solvent-minimized techniques are coming into use. This paper discusses the sample preparation techniques most commonly used over the last 15 years for monitoring organophosphorus and organonitrogen pesticide residues in water samples. Furthermore, a significant trend in sample preparation, in accordance with the principles of 'Green Chemistry', is the simplification, miniaturization and automation of analytical techniques. With this in mind, several novel techniques are being developed to shorten the analysis step, increase the sample throughput and improve the quality and sensitivity of analytical methods.
The paper describes extraction techniques requiring the use of solvents - liquid-liquid extraction (LLE) and its modifications, membrane extraction techniques, hollow fibre-protected two-phase solvent microextraction, liquid phase microextraction based on the solidification of a floating organic drop (LPME-SFO), solid-phase extraction (SPE) and single-drop microextraction (SDME) - as well as solvent-free techniques - solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The advantages and drawbacks of these techniques are also discussed, and some solutions to their limitations are proposed. Copyright © 2011 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirtley, John R., E-mail: jkirtley@stanford.edu; Rosenberg, Aaron J.; Palmstrom, Johanna C.
Superconducting QUantum Interference Device (SQUID) microscopy has excellent magnetic field sensitivity, but suffers from modest spatial resolution when compared with other scanning probes. This spatial resolution is determined by both the size of the field sensitive area and the spacing between this area and the sample surface. In this paper we describe scanning SQUID susceptometers that achieve sub-micron spatial resolution while retaining a white noise floor flux sensitivity of ≈2 μΦ0/Hz^(1/2). This high spatial resolution is accomplished by deep sub-micron feature sizes, well shielded pickup loops fabricated using a planarized process, and a deep etch step that minimizes the spacing between the sample surface and the SQUID pickup loop. We describe the design, modeling, fabrication, and testing of these sensors. Although sub-micron spatial resolution has been achieved previously in scanning SQUID sensors, our sensors not only achieve high spatial resolution but also have integrated modulation coils for flux feedback, integrated field coils for susceptibility measurements, and batch processing. They are therefore a generally applicable tool for imaging sample magnetization, currents, and susceptibilities with higher spatial resolution than previous susceptometers.
Bhattacharya, Dipanjan; Singh, Vijay Raj; Zhi, Chen; So, Peter T. C.; Matsudaira, Paul; Barbastathis, George
2012-01-01
Laser sheet based microscopy has become widely accepted as an effective active illumination method for real time three-dimensional (3D) imaging of biological tissue samples. The light sheet geometry, where the camera is oriented perpendicular to the sheet itself, provides an effective method of eliminating some of the scattered light and minimizing the sample exposure to radiation. However, residual background noise still remains, limiting the contrast and visibility of potentially interesting features in the samples. In this article, we investigate additional structuring of the illumination for improved background rejection, and propose a new technique, “3D HiLo” where we combine two HiLo images processed from orthogonal directions to improve the condition of the 3D reconstruction. We present a comparative study of conventional structured illumination based demodulation methods, namely 3Phase and HiLo with a newly implemented 3D HiLo approach and demonstrate that the latter yields superior signal-to-background ratio in both lateral and axial dimensions, while simultaneously suppressing image processing artifacts. PMID:23262684
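As a rough sketch of how two processed images can be fused (a simplified stand-in for the HiLo demodulation described above, with a box blur instead of the filters an actual implementation would use; all names are ours):

```python
import numpy as np

def box_blur(img, k=5):
    # Separable box blur, used here as a simple stand-in for a Gaussian low-pass.
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def hilo_fuse(uniform_img, sectioned_img, k=5):
    # Keep the low-pass content of the (background-rejected) sectioned estimate
    # and the high-pass content of the uniform-illumination image, then sum them.
    hi = uniform_img - box_blur(uniform_img, k)
    lo = box_blur(sectioned_img, k)
    return lo + hi
```

The low spatial frequencies come from the background-rejected estimate, while the high frequencies, which structured illumination cannot improve, come from the uniform image; summing the two bands yields a full-bandwidth result.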
A thermal emission spectral library of rock-forming minerals
NASA Astrophysics Data System (ADS)
Christensen, Philip R.; Bandfield, Joshua L.; Hamilton, Victoria E.; Howard, Douglas A.; Lane, Melissa D.; Piatek, Jennifer L.; Ruff, Steven W.; Stefanov, William L.
2000-04-01
A library of thermal infrared spectra of silicate, carbonate, sulfate, phosphate, halide, and oxide minerals has been prepared for comparison to spectra obtained from planetary and Earth-orbiting spacecraft, airborne instruments, and laboratory measurements. The emphasis in developing this library has been to obtain pure samples of specific minerals. All samples were hand processed and analyzed for composition and purity. The majority are 710-1000 μm particle size fractions, chosen to minimize particle size effects. Spectral acquisition follows a method described previously, and emissivity is determined to within 2% in most cases. Each mineral spectrum is accompanied by descriptive information in database form including compositional information, sample quality, and a comments field to describe special circumstances and unique conditions. More than 150 samples were selected to include the common rock-forming minerals with an emphasis on igneous and sedimentary minerals. This library is available in digital form and will be expanded as new, well-characterized samples are acquired.
Boson Sampling with Single-Photon Fock States from a Bright Solid-State Source.
Loredo, J C; Broome, M A; Hilaire, P; Gazzano, O; Sagnes, I; Lemaitre, A; Almeida, M P; Senellart, P; White, A G
2017-03-31
A boson-sampling device is a quantum machine expected to perform tasks intractable for a classical computer, yet requiring minimal nonclassical resources as compared to full-scale quantum computers. Photonic implementations to date employed sources based on inefficient processes that only simulate heralded single-photon statistics when strongly reducing emission probabilities. Boson sampling with only single-photon input has thus never been realized. Here, we report on a boson-sampling device operated with a bright solid-state source of single-photon Fock states with high photon-number purity: the emission from an efficient and deterministic quantum dot-micropillar system is demultiplexed into three partially indistinguishable single photons, with a single-photon purity 1-g^{(2)}(0) of 0.990±0.001, interfering in a linear optics network. Our demultiplexed source is between 1 and 2 orders of magnitude more efficient than current heralded multiphoton sources based on spontaneous parametric down-conversion, allowing us to complete the boson-sampling experiment faster than previous equivalent implementations.
Wu, Mixia; Shu, Yu; Li, Zhaohai; Liu, Aiyi
2016-01-01
A sequential design is proposed to test whether the accuracy of a binary diagnostic biomarker meets the minimal level of acceptance. The accuracy of a binary diagnostic biomarker is a linear combination of the marker’s sensitivity and specificity. The objective of the sequential method is to minimize the maximum expected sample size under the null hypothesis that the marker’s accuracy is below the minimal level of acceptance. The exact results of two-stage designs based on Youden’s index and efficiency indicate that the maximum expected sample sizes are smaller than the sample sizes of the fixed designs. Exact methods are also developed for estimation, confidence interval and p-value concerning the proposed accuracy index upon termination of the sequential testing. PMID:26947768
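The two quantities at the heart of the design can be made concrete with a short sketch (our illustration of the general definitions; the exact two-stage boundaries studied in the paper are more involved):

```python
def youden_index(sensitivity, specificity):
    # Youden's J = sensitivity + specificity - 1, one common linear
    # combination of sensitivity and specificity used as an accuracy index.
    return sensitivity + specificity - 1.0

def expected_sample_size(n1, n2, p_stop_stage1):
    # A two-stage design enrolls n1 subjects, then n2 more unless it
    # stops at stage 1 (which happens with probability p_stop_stage1).
    return n1 + (1.0 - p_stop_stage1) * n2
```

Choosing the stage sizes and stopping boundary to minimize the maximum of this expectation under the null hypothesis is the optimization the abstract refers to.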
NASA Technical Reports Server (NTRS)
Shiller, Alan M.
2003-01-01
It is well-established that sampling and sample processing can easily introduce contamination into dissolved trace element samples if precautions are not taken. However, work in remote locations sometimes precludes bringing bulky clean lab equipment into the field and likewise may make timely transport of samples to the lab for processing impossible. Straightforward syringe filtration methods are described here for collecting small quantities (15 mL) of 0.45- and 0.02-microm filtered river water in an uncontaminated manner. These filtration methods take advantage of recent advances in analytical capabilities that require only small amounts of water for analysis of a suite of dissolved trace elements. Filter clogging and solute rejection artifacts appear to be minimal, although some adsorption of metals and organics does affect the first approximately 10 mL of water passing through the filters. Overall the methods are clean, easy to use, and provide reproducible representations of the dissolved and colloidal fractions of trace elements in river waters. Furthermore, sample processing materials can be prepared well in advance in a clean lab and transported cleanly and compactly to the field. Application of these methods is illustrated with data from remote locations in the Rocky Mountains and along the Yukon River. Evidence from field flow fractionation suggests that the 0.02-microm filters may provide a practical cutoff to distinguish metals associated with small inorganic and organic complexes from those associated with silicate and oxide colloids.
Garty, Guy; Chen, Youhua; Turner, Helen C; Zhang, Jian; Lyulko, Oleksandra V; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Lawrence Yao, Y; Brenner, David J
2011-08-01
Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. The RABiT analyses fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cut-off dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. We have developed a new robotic system for lymphocyte processing, making use of an upgraded laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Parallel handling of multiple samples through the use of dedicated, purpose-built, robotics and high speed imaging allows analysis of up to 30,000 samples per day.
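The throughput figures above are consistent, as a quick back-of-the-envelope check shows (our arithmetic, not a calculation from the paper):

```python
SECONDS_PER_DAY = 24 * 60 * 60               # 86,400 s
ISOLATION_TIME = 2.0                         # s/sample for the upgraded lymphocyte isolation
ceiling = SECONDS_PER_DAY / ISOLATION_TIME   # samples/day if isolation were the only bottleneck
```

The ceiling is 43,200 samples/day, so the quoted 30,000 samples/day leaves headroom for imaging and the other robotic stages.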
Re-construction layer effect of LiNi0.8Co0.15Mn0.05O2 with solvent evaporation process
NASA Astrophysics Data System (ADS)
Park, Kwangjin; Park, Jun-Ho; Hong, Suk-Gi; Choi, Byungjin; Heo, Sung; Seo, Seung-Woo; Min, Kyoungmin; Park, Jin-Hwan
2017-03-01
The effect of a solvent evaporation method on the structural changes, surface chemistry, and electrochemical performance of Li1.0Ni0.8Co0.15Mn0.05O2 (NCM) has been investigated. After dissolving the Li residuals in a minimal amount of solvent, chosen to limit both the damage to the pristine material and the evaporation time, the solvent was evaporated without filtering and the remaining powder was re-heated at 700 °C in an oxygen environment. Two solvents were used: de-ionized water and diluted nitric acid. Almost 40% of the Li residuals were removed by the solvent evaporation method. The NCM sample after the solvent evaporation process exhibited a higher initial capacity (214.3 mAh/g) than the pristine sample (207.4 mAh/g) at 0.1C because of the enhanced electric conductivity resulting from the reduced Li residuals. The capacity retention of the NCM sample after the solvent evaporation process (96.0% at the 50th cycle) was also improved compared to that of the pristine NCM sample (90.6% at the 50th cycle). The uniform Li residual layer remaining after the solvent and heat treatments acted as a coating layer, enhancing the cycle performance. The NCM sample treated with diluted nitric acid showed better performance than the one treated with de-ionized water.
Garty, Guy; Chen, Youhua; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Brenner, David J.
2011-01-01
Purpose Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. Materials and methods The RABiT analyzes fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cutoff dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. Results We have developed a new robotic system for lymphocyte processing, making use of an upgraded laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Conclusions Parallel handling of multiple samples through the use of dedicated, purpose-built, robotics and high speed imaging allows analysis of up to 30,000 samples per day. PMID:21557703
Gupta, Vijayalaxmi; Holets-Bondar, Lesya; Roby, Katherine F; Enders, George; Tash, Joseph S
2015-01-01
Collection and processing of tissues to preserve space flight effects in animals after return to Earth is challenging. Specimens must be harvested with minimal time after landing to minimize postflight readaptation alterations in protein expression/translation and posttranslational modifications, as well as changes in gene expression, and to limit tissue histological degradation after euthanasia. We report the development of a widely applicable strategy for determining the window of optimal species-specific and tissue-specific posteuthanasia harvest times that can be integrated into multi-investigator Biospecimen Sharing Programs. We also determined methods for ISS-compatible long-term tissue storage (10 months at -80°C) that yield recovery of high quality mRNA and protein for western analysis after sample return. Our focus was reproductive tissues. The time following euthanasia during which tissues could be collected while histological integrity was maintained varied with tissue and species, ranging between 1 and 3 hours. RNA quality was preserved in key reproductive tissues fixed in RNAlater up to 40 min after euthanasia. Postfixation processing was also standardized for safe shipment back to our laboratory. Our strategy can be adapted for other tissues under NASA's Biospecimen Sharing Program or similar multi-investigator tissue sharing opportunities.
PELE web server: atomistic study of biomolecular systems at your fingertips.
Madadkar-Sobhani, Armin; Guallar, Victor
2013-07-01
PELE, Protein Energy Landscape Exploration, our novel technology based on protein structure prediction algorithms and Monte Carlo sampling, is capable of modelling all-atom protein-ligand dynamical interactions in an efficient and fast manner, with a computational cost two orders of magnitude lower than that of traditional molecular dynamics techniques. PELE's heuristic approach generates trial moves based on protein and ligand perturbations followed by side chain sampling and global/local minimization. The collection of accepted steps forms a stochastic trajectory. Furthermore, several processors may be run in parallel towards a collective goal or to define several independent trajectories; the whole procedure has been parallelized using the Message Passing Interface. Here, we introduce the PELE web server, designed to make the whole process of running simulations easier and more practical by minimizing input file demands, providing a user-friendly interface and producing abstract outputs (e.g. interactive graphs and tables). The web server has been implemented in C++ using Wt (http://www.webtoolkit.eu) and MySQL (http://www.mysql.com). The PELE web server, accessible at http://pele.bsc.es, is free and open to all users with no login requirement.
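The accept/reject logic behind such stochastic trajectories follows the standard Metropolis Monte Carlo scheme. A minimal one-dimensional sketch, using a toy quadratic energy surface in place of the all-atom force field (this illustrates the general sampling idea only, not PELE's actual move generator or minimization steps):

```python
import math
import random

random.seed(0)

def energy(x):
    # toy 1-D energy surface standing in for the all-atom energy function
    return (x - 1.0) ** 2

def metropolis(steps=2000, kT=0.5):
    x, traj = 4.0, []
    for _ in range(steps):
        x_new = x + random.uniform(-0.5, 0.5)   # trial perturbation
        dE = energy(x_new) - energy(x)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if dE <= 0 or random.random() < math.exp(-dE / kT):
            x = x_new                           # accepted step joins the trajectory
        traj.append(x)
    return traj

traj = metropolis()
```

After an initial relaxation, the trajectory fluctuates around the energy minimum at x = 1, the discrete analogue of the accepted-step trajectories described above.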
Minimized state complexity of quantum-encoded cryptic processes
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-05-01
The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
NASA Astrophysics Data System (ADS)
Talamonti, James J.; Kay, Richard B.; Krebs, Danny J.
1996-05-01
A numerical model was developed to emulate the capabilities of systems performing noncontact absolute distance measurements. The model incorporates known methods to minimize signal processing and digital sampling errors, and evaluates the accuracy limitations imposed by spectral peak isolation using Hanning, Blackman, and Gaussian windows in the fast Fourier transform technique. We applied this model to the specific case of measuring the relative lengths of a compound Michelson interferometer. By processing computer-simulated data through our model, we project the ultimate precision for ideal data and for data containing AM-FM noise. The precision is shown to be limited by nonlinearities in the laser scan. Keywords: absolute distance, interferometer.
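The windowed-FFT peak isolation at the heart of such a model can be sketched in a few lines. A minimal illustration with made-up sample rate and tone frequency, showing that each of the three windows still localizes the spectral peak to within one frequency bin (the window choice then trades sidelobe leakage against main-lobe width):

```python
import numpy as np

fs = 1000.0                       # sample rate in Hz (illustrative)
n = 1024
t = np.arange(n) / fs
f0 = 123.4                        # hypothetical tone frequency in Hz
x = np.sin(2 * np.pi * f0 * t)

windows = {
    "hanning": np.hanning(n),
    "blackman": np.blackman(n),
    # Gaussian window with sigma chosen as a fraction of the record length
    "gaussian": np.exp(-0.5 * (((np.arange(n) - (n - 1) / 2) / (0.15 * n)) ** 2)),
}

freqs = np.fft.rfftfreq(n, d=1 / fs)
peaks = {}
for name, w in windows.items():
    spec = np.abs(np.fft.rfft(x * w))
    peaks[name] = freqs[np.argmax(spec)]   # peak lands within one bin of f0
```

In a full model, sub-bin interpolation around the peak would then recover the tone frequency to far better than one bin, which is where the window's sidelobe behavior matters.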
Robotic sampling system for an unmanned Mars mission
NASA Technical Reports Server (NTRS)
Chun, Wendell
1989-01-01
A major robotics opportunity for NASA will be the Mars Rover/Sample Return Mission which could be launched as early as the 1990s. The exploratory portion of this mission will include two autonomous subsystems: the rover vehicle and a sample handling system. The sample handling system is the key to the process of collecting Martian soils. This system could include a core drill, a general-purpose manipulator, tools, containers, a return canister, certification hardware and a labeling system. Integrated into a functional package, the sample handling system is analogous to a complex robotic workcell. Discussed here are the different components of the system, their interfaces, foreseeable problem areas and many options based on the scientific goals of the mission. The various interfaces in the sample handling process (component to component and handling system to rover) will be a major engineering effort. Two critical evaluation criteria that will be imposed on the system are flexibility and reliability. It needs to be flexible enough to adapt to different scenarios and environments and acquire the most desirable specimens for return to Earth. Scientists may decide to change the distribution and ratio of core samples to rock samples in the canister. The long distance and duration of this planetary mission place a reliability burden on the hardware. The communication time delay between Earth and Mars minimizes operator interaction (teleoperation, supervisory modes) with the sample handler. An intelligent system will be required to plan the actions, make sample choices, interpret sensor inputs, and query unknown surroundings. A combination of autonomous functions and supervised movements will be integrated into the sample handling system.
Axel Robotic Platform for Crater and Extreme Terrain Exploration
NASA Technical Reports Server (NTRS)
Nesnas, Issa A.; Matthews, Jaret B.; Edlund, Jeffrey A.; Burdick, Joel W.; Abad-Manterola, Pablo
2012-01-01
To be able to conduct science investigations on highly sloped and challenging terrains, it is necessary to deploy science payloads to such locations and collect and process in situ samples. A tethered robotic platform has been developed that is capable of exploring very challenging terrain. The Axel rover is a symmetrical rover that is minimally actuated, can traverse arbitrary paths, and can operate upside-down or right-side up. It can be deployed from a larger platform (rover, lander, or aerobot) or from a dual Axel configuration. Axel carries and manages its own tether, reducing damage to the tether during operations. Fundamentally, Axel is a two-wheeled rover with a symmetric body and a trailing link. Because the primary goal is minimal complexity, this version of the Axel rover uses only four primary actuators to control its wheels, tether, and trailing link. A fifth actuator is used for level winding of the tether onto Axel's spool.
Direct PCR amplification of forensic touch and other challenging DNA samples: A review.
Cavanaugh, Sarah E; Bathrick, Abigail S
2018-01-01
DNA evidence sample processing typically involves DNA extraction, quantification, and STR amplification; however, DNA loss can occur at both the DNA extraction and quantification steps, which is not ideal for forensic evidence containing low levels of DNA. Direct PCR amplification of forensic unknown samples has been suggested as a means to circumvent extraction and quantification, thereby retaining the DNA typically lost during those procedures. Direct PCR amplification is a method in which a sample is added directly to an amplification reaction without being subjected to prior DNA extraction, purification, or quantification. It allows for maximum quantities of DNA to be targeted, minimizes opportunities for error and contamination, and reduces the time and monetary resources required to process samples, although data analysis may take longer as the increased DNA detection sensitivity of direct PCR may lead to more instances of complex mixtures. ISO 17025 accredited laboratories have successfully implemented direct PCR for limited purposes (e.g., high-throughput databanking analysis), and recent studies indicate that direct PCR can be an effective method for processing low-yield evidence samples. Despite its benefits, direct PCR has yet to be widely implemented across laboratories for the processing of evidentiary items. While forensic DNA laboratories are always interested in new methods that will maximize the quantity and quality of genetic information obtained from evidentiary items, there is often a lag between the advent of useful methodologies and their integration into laboratories. Delayed implementation of direct PCR of evidentiary items can be attributed to a variety of factors, including regulatory guidelines that prevent laboratories from omitting the quantification step when processing forensic unknown samples, as is the case in the United States, and, more broadly, a reluctance to validate a technique that is not widely used for evidence samples. 
The advantages of direct PCR of forensic evidentiary samples justify a re-examination of the factors that have delayed widespread implementation of this method and of the evidence supporting its use. In this review, the current and potential future uses of direct PCR in forensic DNA laboratories are summarized. Copyright © 2017 Elsevier B.V. All rights reserved.
Effect of Processing on Silk-Based Biomaterials: Reproducibility and Biocompatibility
Wray, Lindsay S.; Hu, Xiao; Gallego, Jabier; Georgakoudi, Irene; Omenetto, Fiorenzo G.; Schmidt, Daniel; Kaplan, David L.
2012-01-01
Silk fibroin has been successfully used as a biomaterial for tissue regeneration. In order to prepare silk fibroin biomaterials for human implantation a series of processing steps are required to purify the protein. Degumming to remove inflammatory sericin is a crucial step related to biocompatibility and variability in the material. Detailed characterization of silk fibroin degumming is reported. The degumming conditions significantly affected cell viability on the silk fibroin material and the ability to form three-dimensional porous scaffolds from the silk fibroin, but did not affect macrophage activation or β-sheet content in the materials formed. Methods are also provided to determine the content of residual sericin in silk fibroin solutions and to assess changes in silk fibroin molecular weight. Amino acid composition analysis was used to detect sericin residuals in silk solutions with a detection limit between 1.0% and 10% wt/wt, while fluorescence spectroscopy was used to reproducibly distinguish between silk samples with different molecular weights. Both methods are simple and require minimal sample volume, providing useful quality control tools for silk fibroin preparation processes. PMID:21695778
System for sensing droplet formation time delay in a flow cytometer
Van den Engh, Ger; Esposito, Richard J.
1997-01-01
A droplet flow cytometer system that optimizes the droplet formation time delay based on conditions actually experienced includes an automatic droplet sampler, which rapidly moves a plurality of containers stepwise through the droplet stream while simultaneously adjusting the droplet time delay. Through this system, sampling of the actual substance to be processed can be used to minimize the effect of the substance's variations on the determination of the optimal time delay. Analysis such as cell counting and the like may be conducted manually or automatically and fed into a time delay adjustment, which may then act with the analysis equipment to revise the time delay estimate actually applied during processing. The automatic sampler can be controlled through a microprocessor and appropriate programming to bracket an initial droplet formation time delay estimate. When a maximum, determined through volume, weight, or other types of analysis of the containers, is found, the increment may then be reduced for a more accurate ultimate setting. This may be accomplished while actually processing the sample without interruption.
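The bracketing procedure described above, evaluating candidate delays around an estimate and shrinking the increment once a maximum is found, can be sketched as follows. The count function here is a hypothetical stand-in for the actual container analysis (volume, weight, or cell counts), and the peak location and step sizes are invented for illustration:

```python
def count_at(delay):
    # hypothetical stand-in for measuring sorted-droplet yield at a given delay;
    # yield is assumed to peak at the (unknown) true optimum of 37.2 units
    return -(delay - 37.2) ** 2

def bracket_delay(estimate, step=4.0, min_step=0.25):
    """Bracket the optimal time delay around an initial estimate."""
    best = estimate
    while step >= min_step:
        # evaluate the current best delay and one step to either side
        candidates = [best - step, best, best + step]
        best = max(candidates, key=count_at)
        step /= 2                     # tighten the bracket each pass
    return best
```

Starting from an estimate of 30.0, the search converges to within the final increment of the assumed optimum, mirroring the coarse-to-fine adjustment the microprocessor performs while the sample runs.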
Liu, Long; Wang, Yan; Yao, Jinyuan; Yang, Cuijun; Ding, Guifu
2016-08-01
This study describes a novel micro sampler consisting of an ultrahigh-aspect-ratio microneedle and a PDMS actuator. The microneedle was fabricated by a new method which introduced reshaped photoresist technology to form a flow channel inside. The microneedle comprises two parts: a shaft and a pedestal. In this study, the shaft length is 1500 μm with a 45° taper angle at the tip, and the pedestal is 1000 μm long. The shaft and pedestal are connected by an arc connection structure with a length of 600 μm. Mechanical tests proved that the microneedles have sufficient mechanical strength to insert into skin with a wide safety margin. Moreover, a PDMS actuator with an internal chamber was designed and fabricated in this study. The chamber, acting as a reservoir in the sampling process as well as providing power, was optimized by finite element analysis (FEA) to decrease dead volume and improve sampling precision. The micro sampler needs only a finger press to activate the sampling process and can also be used, to some extent, for quantitative micro injection. A volume of 31.5 ± 0.8 μl of blood was successfully sampled from the ear artery of a rabbit. This micro sampler is suitable for micro sampling for diagnosis or therapy in the biomedical field.
Effect of freezing on the rheological, chemical and colour properties of Serpa cheese.
Alvarenga, Nuno; Canada, João; Sousa, Isabel
2011-02-01
The effect of freezing on the properties of a raw ewes'-milk semi-soft cheese (Serpa cheese) was studied using small amplitude oscillatory (SAOS) and texture measurements, colour and chemical parameters. Freezing was introduced at three different stages of the ripening process (28, 35 and 42 days), and the cheeses were kept frozen for 12 months. Cheeses were submitted to a slow or fast freezing method and to different storage temperatures, -10 and -20°C (three replicates for each set of conditions). Chemical data showed that only the proteolysis indicators exhibited differences between frozen and non-frozen samples; frozen samples showed higher values of NPN than the non-frozen samples, indicating that the freezing process did not prevent the secondary proteolysis of cheese. Frozen samples showed a significantly (P<0·05) stronger structure than the non-frozen ones, as indicated by hardness. However, the differences between the frozen and non-frozen samples were not significant for storage modulus (G' 1Hz) and loss tangent (tan δ 1Hz) (P>0·05). Freezing mainly affected colour parameters: frozen samples were more luminous and more yellow-green. The results allowed us to conclude that the damage caused by freezing to cheese properties could be minimized if this type of storage is introduced at the end of ripening (42 d) using a freezing temperature of -20°C.
Ha, Ji Won; Hahn, Jong Hoon
2017-02-01
Acupuncture sample injection is a simple method to deliver well-defined nanoliter-scale sample plugs in PDMS microfluidic channels. This acupuncture injection method in microchip CE has several advantages, including minimization of sample consumption, the capability of serial injections of different sample solutions into the same microchannel, and the capability of injecting sample plugs into any desired position of a microchannel. Herein, we demonstrate that the simple and cost-effective acupuncture sample injection method can be used for PDMS microchip-based field-amplified sample stacking in the most simplified straight channel by applying a single potential. We achieved an increase in electropherogram signals in the case of sample stacking. Furthermore, we show that microchip CGE of a ΦX174 DNA-HaeIII digest can be performed with the acupuncture injection method on a glass microchip while minimizing sample loss and voltage control hardware. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Krishnamurthy, Krish; Hari, Natarajan
2017-09-15
The recently published CRAFT (complete reduction to amplitude frequency table) technique converts the raw FID data (i.e., time-domain data) into a table of frequencies, amplitudes, decay rate constants, and phases. It offers an alternate approach to decimate time-domain data, with minimal preprocessing. It has been shown that application of the CRAFT technique to process the t1 dimension of 2D data significantly improved the detectable resolution through its ability to analyze without the ubiquitous apodization of extensively zero-filled data. It was noted earlier that CRAFT did not resolve sinusoids that were not already resolvable in the time domain (i.e., t1max-dependent resolution). We present a combined NUS-IST-CRAFT approach wherein the NUS acquisition technique (sparse sampling) increases the intrinsic resolution in the time domain (by increasing t1max), IST fills the gaps in the sparse sampling, and CRAFT processing extracts the information without loss due to severe apodization. NUS and CRAFT are thus complementary techniques to improve intrinsic and usable resolution. We show that significant improvement can be achieved with this combination over conventional NUS-IST processing. With reasonable sensitivity, the models can be extended to significantly higher t1max to generate an indirect-DEPT spectrum that rivals its direct-observe counterpart. Copyright © 2017 John Wiley & Sons, Ltd.
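The kind of time-domain reduction CRAFT performs, converting an FID into (frequency, amplitude, decay rate, phase) entries, can be illustrated for a single noiseless component with one-pole linear prediction. This is a toy sketch with invented parameters, not the CRAFT algorithm itself, which handles many overlapping, noisy sinusoids:

```python
import numpy as np

# synthetic single-component FID: A * exp(i*phi) * exp((i*2*pi*f - r) * t)
dt = 0.001
t = np.arange(512) * dt
f_true, r_true, a_true, ph_true = 80.0, 5.0, 2.0, 0.3
fid = a_true * np.exp(1j * ph_true) * np.exp((2j * np.pi * f_true - r_true) * t)

# for one component, the ratio of successive samples is a constant pole
z = np.mean(fid[1:] / fid[:-1])          # one-pole linear prediction
f_est = np.angle(z) / (2 * np.pi * dt)   # frequency from the pole's phase
r_est = -np.log(np.abs(z)) / dt          # decay rate from the pole's magnitude
a_est = np.abs(fid[0])                   # amplitude from the first point
ph_est = np.angle(fid[0])                # phase from the first point
```

For this noiseless single component the four table entries are recovered essentially exactly; the point of CRAFT-style methods is to do the analogous extraction robustly for crowded, noisy spectra.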
Knopman, Debra S.; Voss, Clifford I.
1989-01-01
Sampling design for site characterization studies of solute transport in porous media is formulated as a multiobjective problem. Optimal design of a sampling network is a sequential process in which the next phase of sampling is designed on the basis of all available physical knowledge of the system. Three objectives are considered: model discrimination, parameter estimation, and cost minimization. For the first two objectives, physically based measures of the value of information obtained from a set of observations are specified. In model discrimination, value of information of an observation point is measured in terms of the difference in solute concentration predicted by hypothesized models of transport. Points of greatest difference in predictions can contribute the most information to the discriminatory power of a sampling design. Sensitivity of solute concentration to a change in a parameter contributes information on the relative variance of a parameter estimate. Inclusion of points in a sampling design with high sensitivities to parameters tends to reduce variance in parameter estimates. Cost minimization accounts for both the capital cost of well installation and the operating costs of collection and analysis of field samples. Sensitivities, discrimination information, and well installation and sampling costs are used to form coefficients in the multiobjective problem in which the decision variables are binary (zero/one), each corresponding to the selection of an observation point in time and space. The solution to the multiobjective problem is a noninferior set of designs. To gain insight into effective design strategies, a one-dimensional solute transport problem is hypothesized. Then, an approximation of the noninferior set is found by enumerating 120 designs and evaluating objective functions for each of the designs. Trade-offs between pairs of objectives are demonstrated among the models. 
The value of an objective function for a given design is shown to correspond to the ability of a design to actually meet an objective.
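The enumeration-and-screening step, evaluating each candidate design against all objectives and keeping the noninferior (Pareto-optimal) set, can be sketched directly. The per-point scores below are random placeholders for the discrimination information, parameter sensitivities, and costs that the study derives from the transport models:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_points = 7                       # candidate observation points (illustrative)
info = rng.random(n_points)        # model-discrimination value per point
sens = rng.random(n_points)        # parameter-sensitivity value per point
cost = rng.random(n_points) + 0.5  # installation + sampling cost per point

# binary decision vector: 1 = include that observation point in the design
designs = [np.array(b) for b in itertools.product([0, 1], repeat=n_points)
           if any(b)]

def objectives(x):
    # cast all three as minimization: negate the two "maximize" objectives
    return (-(info @ x), -(sens @ x), cost @ x)

vals = [objectives(x) for x in designs]

def dominated(i):
    # design i is dominated if some design j is no worse in every objective
    # and strictly better in at least one
    return any(all(vj <= vi for vj, vi in zip(vals[j], vals[i]))
               and vals[j] != vals[i]
               for j in range(len(vals)))

noninferior = [d for i, d in enumerate(designs) if not dominated(i)]
```

The full design (all points selected) maximizes both information objectives at maximum cost, so it always survives the screen, as does the cheapest single-point design; the interesting trade-offs lie between these extremes.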
Minimally processed beetroot waste as an alternative source to obtain functional ingredients.
Costa, Anne Porto Dalla; Hermes, Vanessa Stahl; Rios, Alessandro de Oliveira; Flôres, Simone Hickmann
2017-06-01
Large amounts of waste are generated by the minimally processed vegetables industry, such as those from beetroot processing. The aim of this study was to determine the best method to obtain flour from minimally processed beetroot waste dried at different temperatures, besides producing a colorant from such waste and assessing its stability along 45 days. Beetroot waste dried at 70 °C originates flour with significant antioxidant activity and higher betalain content than flour produced from waste dried at 60 and 80 °C, while chlorination had no impact on the process since microbiological results were consistent for its application. The colorant obtained from beetroot waste showed color stability for 20 days and potential antioxidant activity over the analysis period, thus it can be used as a functional additive to improve nutritional characteristics and appearance of food products. These results are promising since minimally processed beetroot waste can be used as an alternative source of natural and functional ingredients with high antioxidant activity and betalain content.
Nano-plasmonic exosome diagnostics
Im, Hyungsoon; Shao, Huilin; Weissleder, Ralph; Castro, Cesar M.; Lee, Hakho
2015-01-01
Exosomes have emerged as a promising biomarker. These vesicles abound in biofluids and harbor molecular constituents from their parent cells, thereby offering a minimally-invasive avenue for molecular analyses. Despite such clinical potential, routine exosomal analysis, particularly the protein assay, remains challenging, due to requirements for large sample volumes and extensive processing. We have been developing miniaturized systems to facilitate clinical exosome studies. These systems can be categorized into two components: microfluidics for sample preparation and analytical tools for protein analyses. In this report, we review a new assay platform, nano-plasmonic exosome (nPLEX), in which sensing is based on surface plasmon resonance to achieve label-free exosome detection. Looking forward, we also discuss some potential challenges and improvements in exosome studies. PMID:25936957
Canadian macromolecular crystallography facility: a suite of fully automated beamlines.
Grochulski, Pawel; Fodje, Michel; Labiuk, Shaunivan; Gorin, James; Janzen, Kathryn; Berg, Russ
2012-06-01
The Canadian light source is a 2.9 GeV national synchrotron radiation facility located on the University of Saskatchewan campus in Saskatoon. The small-gap in-vacuum undulator illuminated beamline, 08ID-1, together with the bending magnet beamline, 08B1-1, constitute the Canadian Macromolecular Crystallography Facility (CMCF). The CMCF provides service to more than 50 Principal Investigators in Canada and the United States. Up to 25% of the beam time is devoted to commercial users and the general user program is guaranteed up to 55% of the useful beam time through a peer-review process. CMCF staff provides "Mail-In" crystallography service to users with the highest scored proposals. Both beamlines are equipped with very robust end-stations including on-axis visualization systems, Rayonix 300 CCD series detectors and Stanford-type robotic sample auto-mounters. MxDC, an in-house developed beamline control system, is integrated with a data processing module, AutoProcess, allowing full automation of data collection and data processing with minimal human intervention. Sample management and remote monitoring of experiments is enabled through interaction with a Laboratory Information Management System developed at the facility.
Kumar, Sunil; Kumar, Ramesh; Nambi, V E
2016-03-01
Pomegranate fruits are difficult to peel and, once peeled, the extracted arils have a very short shelf-life. Therefore, the present investigation was carried out to extend the shelf life of minimally processed pomegranate arils using pectin methyl esterase (PME) and CaCl2 treatment during refrigerated storage. The arils of freshly harvested pomegranate fruits (Punica granatum L.) were treated with different concentrations of food-grade PME (50-300 units) and calcium ions (0.5-2.0% CaCl2) for periods of 5-30 min using response surface methodology. Treated and untreated arils were then packed in low density polyethylene bags (25 μm) and maintained at low temperature (5°C; 90% RH) for evaluating the physical, biochemical and microbial quality of pomegranate arils at four-day intervals. Physiological loss in weight increased during storage, but no food-borne pathogens were found in treated arils during 28 days of cold storage. Colour and firmness of both treated and untreated arils decreased during storage but were better maintained in treated arils. The firmness was found to be 0.630 N in treated samples compared to untreated ones (0.511 N) after 20 d of storage. Total antioxidant capacity, ferric reducing antioxidant power, polyphenol oxidase and lipoxygenase activities increased during storage. Treatment with 249.33 units of PME and 1.70% CaCl2 for an immersion time of 24.93 min was found to be the most effective treatment for maintaining the quality of minimally processed arils for a longer period. Sensory scores were also higher in treated pomegranate arils, which were quite acceptable even after 20 days of refrigerated storage as against 12 days for untreated ones.
Note: Four-port microfluidic flow-cell with instant sample switching
NASA Astrophysics Data System (ADS)
MacGriff, Christopher A.; Wang, Shaopeng; Tao, Nongjian
2013-10-01
A simple device for high-speed microfluidic delivery of liquid samples to a surface plasmon resonance (SPR) sensor surface is presented. The delivery platform comprises a four-port microfluidic cell, in which two ports serve as inlets for buffer and sample solutions, respectively, and a high-speed selector valve that controls the alternate opening and closing of the two outlet ports. The time scale of buffer/sample switching (or sample injection rise and fall time) is on the order of milliseconds, thereby minimizing the opportunity for sample plug dispersion. The high rates of mass transport to and from the central microfluidic sensing region allow for SPR-based kinetic analysis of binding events with dissociation rate constants (kd) up to 130 s-1. The required sample volume is only 1 μL, allowing for minimal sample consumption during high-speed kinetic binding measurement.
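For a simple 1:1 interaction, the dissociation phase such a device enables can be analyzed with a log-linear fit; when kd approaches 100 s-1 the signal decays within tens of milliseconds, which is why millisecond switching matters. A minimal sketch on noiseless synthetic data (the rate constant and time base are invented for illustration):

```python
import numpy as np

kd_true = 50.0                    # hypothetical dissociation rate constant (1/s)
t = np.linspace(0.0, 0.05, 200)   # 50 ms window: fast kinetics need fast switching
r = np.exp(-kd_true * t)          # normalized SPR response during dissociation

# ln R(t) = -kd * t, so the slope of a linear fit recovers kd
slope = np.polyfit(t, np.log(r), 1)[0]
kd_est = -slope
```

With real, noisy sensorgrams a nonlinear fit with baseline and association terms would replace this log-linear shortcut, but the millisecond-scale sampling requirement is the same.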
Abadias, M; Usall, J; Anguera, M; Solsona, C; Viñas, I
2008-03-31
A survey of fresh and minimally-processed fruit and vegetables, and sprouts was conducted in several retail establishments in the Lleida area (Catalonia, Spain) during 2005-2006 to determine whether microbial contamination, and in particular potentially pathogenic bacteria, was present in these commodities. A total of 300 samples--including 21 ready-to-eat fruits, 28 whole fresh vegetables, 15 sprout samples and 237 ready-to-eat salads containing from one to six vegetables--were purchased from 4 supermarkets. They were tested for mesophilic and psychrotrophic aerobic counts, yeasts and moulds, lactic acid bacteria, Enterobacteriaceae, presumptive E. coli and Listeria monocytogenes counts as well as for the presence of Salmonella, E. coli O157:H7, Yersinia enterocolitica and thermotolerant Campylobacter. Results for the fresh-cut vegetables that we analyzed showed that, in general, the highest microorganism counts were associated with grated carrot, arugula and spinach (7.8, 7.5 and 7.4 log cfu g(-1) of aerobic mesophilic microorganisms; 6.1, 5.8 and 5.2 log cfu g(-1) of yeast and moulds; 5.9, 4.0 and 5.1 log cfu g(-1) lactic acid bacteria and 6.2, 5.3 and 6.0 log cfu g(-1) of Enterobacteriaceae). The lowest counts were generally associated with fresh-cut endive and lettuce (6.2 and 6.3 log cfu g(-1) of aerobic mesophilic microorganisms; 4.4 and 4.6 log cfu g(-1) of yeast and moulds; 2.7 and 3.8 log cfu g(-1) lactic acid bacteria and 4.8 and 4.4 log cfu g(-1) of Enterobacteriaceae). Counts of psychrotrophic microorganisms were as high as those of mesophilic microorganisms. Microbiological counts for fresh-cut fruit were very low. Sprouts were highly contaminated with mesophilic (7.9 log cfu g(-1)) and psychrotrophic microorganisms (7.3 log cfu g(-1)) and Enterobacteriaceae (7.2 log cfu g(-1)) and showed a high incidence of E. coli (40% of samples). Of the samples analyzed, four (1.3%) were Salmonella positive and two (0.7%) harboured L. monocytogenes.
None of the samples was positive for E. coli O157:H7, pathogenic Y. enterocolitica or thermotolerant Campylobacter.
Low-rank matrix decomposition and spatio-temporal sparse recovery for STAP radar
Sen, Satyabrata
2015-08-04
We develop space-time adaptive processing (STAP) methods by leveraging the advantages of sparse signal processing techniques in order to detect a slowly-moving target. We observe that the inherent sparse characteristics of a STAP problem can be formulated as the low-rankness of the clutter covariance matrix when compared to the total adaptive degrees-of-freedom, and also as the sparse interference spectrum on the spatio-temporal domain. By exploiting these sparse properties, we propose two approaches for estimating the interference covariance matrix. In the first approach, we consider a constrained matrix rank minimization problem (RMP) to decompose the sample covariance matrix into a low-rank positive semidefinite matrix and a diagonal matrix. The solution of the RMP is obtained by applying the trace minimization technique and the singular value decomposition with a matrix shrinkage operator. Our second approach deals with the atomic norm minimization problem to recover the clutter response vector that has a sparse support on the spatio-temporal plane. We use convex relaxation based standard sparse-recovery techniques to find the solutions. With extensive numerical examples, we demonstrate the performance of the proposed STAP approaches with respect to both ideal and practical scenarios, involving Doppler-ambiguous clutter ridges and spatial and temporal decorrelation effects. As a result, the low-rank matrix decomposition based solution requires secondary measurements as many as twice the clutter rank to attain near-ideal STAP performance, whereas the spatio-temporal sparsity based approach needs a considerably smaller number of secondary data.
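The first approach, splitting a covariance matrix into a low-rank clutter term plus a noise term via a shrinkage operator on its spectrum, can be sketched on an idealized covariance. The steering vectors, clutter rank, noise power, and threshold below are invented for illustration; the paper's RMP solution operates on actual secondary data rather than an exact covariance:

```python
import numpy as np

n, rank = 20, 3
# hypothetical clutter subspace spanned by three orthogonal DFT steering vectors
A = np.array([[np.exp(2j * np.pi * k * i / n) for k in (2, 5, 9)]
              for i in range(n)])
sigma2 = 0.1
S = A @ A.conj().T + sigma2 * np.eye(n)   # ideal covariance: low-rank + noise

w, V = np.linalg.eigh(S)                  # eigendecomposition of Hermitian S
tau = 1.5 * sigma2                        # shrinkage threshold (heuristic choice)
w_shrunk = np.maximum(w - tau, 0.0)       # eigenvalue shrinkage operator
R_clutter = (V * w_shrunk) @ V.conj().T   # recovered low-rank clutter term
est_rank = int(np.sum(w_shrunk > 0))      # estimated clutter rank
```

Because the clutter eigenvalues sit far above the noise floor, shrinkage zeroes exactly the noise-level eigenvalues, leaving a rank-3 clutter estimate and a remainder close to the scaled-identity noise term.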
Ohkouchi, Yumiko; Ly, Bich Thuy; Ishikawa, Suguru; Aoki, Yusuke; Echigo, Shinya; Itoh, Sadahiko
2011-10-01
In Japan, customers' concerns about chlorinous odour in drinking water have been increasing. One promising approach for reducing chlorinous odour is the minimization of residual chlorine in water distribution, which requires stricter control of organics to maintain biological stability in water supply systems. In this investigation, the levels and seasonal changes of assimilable organic carbon (AOC) and its precursors in drinking water were surveyed to accumulate information on organics in terms of biological stability. In tap water samples purified through rapid sand filtration processes, the average AOC concentration was 174 microgC/L in winter and 60 microgC/L in summer. This difference seemed to reflect the seasonal changes of AOC in the natural aquatic environment. On the other hand, very little or no AOC could be removed after use of an ozonation-biological activated carbon (BAC) process. Especially in winter, waterworks should pay attention to BAC operating conditions to improve AOC removal. The storage of BAC effluent with residual chlorine at 0.05-0.15 mgCl2/L increased AOC drastically. This result indicated the possibility that abundant AOC precursors remaining in the finished water could contribute to new AOC formation during water distribution with minimized residual chlorine. Combined amino acids, which remained at levels roughly equivalent to AOC in the finished water, were identified as major AOC precursors. Prior to minimization of residual chlorine, enhancement of the removal abilities for both AOC and its precursors would be necessary.
Near DC force measurement using PVDF sensors
NASA Astrophysics Data System (ADS)
Ramanathan, Arun Kumar; Headings, Leon M.; Dapino, Marcelo J.
2018-03-01
There is a need for high-performance force sensors capable of operating at frequencies near DC while producing a minimal mass penalty. Example application areas include steering wheel sensors, powertrain torque sensors, robotic arms, and minimally invasive surgery. Beta-phase polyvinylidene fluoride (PVDF) films are suitable for this purpose owing to their large piezoelectric constant. Unlike conventional capacitive sensors, beta-phase PVDF films exhibit a broad linear range and can potentially be designed to operate without complex electronics or signal processing. A fundamental challenge that prevents the implementation of PVDF in certain high-performance applications is the films' inability to measure static signals, which results from their first-order electrical impedance. Charge readout algorithms have been implemented which address this issue only partially, as they often require integration of the output signal to obtain the applied force profile, resulting in signal drift and signal processing complexities. In this paper, we propose a straightforward real-time drift compensation strategy that is applicable to high-output-impedance PVDF films. This strategy makes it possible to utilize long sample times with a minimal loss of accuracy; our measurements show that the static output remains within 5% of the original value during half-hour measurements. The sensitivity and full-scale range are shown to be determined by the feedback capacitance of the charge amplifier. A linear model of the PVDF sensor system is developed and validated against experimental measurements, along with benchmark tests against a commercial load cell.
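The abstract's point that the charge amplifier's feedback capacitance sets both sensitivity and full-scale range can be seen from the ideal charge-amplifier relation V = Q/Cf. The sketch below uses assumed typical values for the piezoelectric constant and feedback capacitance, not figures from the paper.

```python
# Illustrative first-order model of a PVDF film read out by an ideal
# charge amplifier. d33 and Cf are assumed typical values, not taken
# from the paper.
d33 = 33e-12      # PVDF piezoelectric charge constant, C/N (typical)
Cf = 1e-9         # charge-amplifier feedback capacitance, F

def amp_output(force_newtons):
    charge = d33 * force_newtons   # charge generated by the film
    return charge / Cf             # ideal charge amplifier: V = Q / Cf

# Halving Cf doubles the sensitivity (volts per newton) but, for a
# fixed output swing, halves the full-scale force range.
print(amp_output(10.0))
```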
Mechanisms and kinetics of granulated sewage sludge combustion.
Kijo-Kleczkowska, Agnieszka; Środa, Katarzyna; Kosowska-Golachowska, Monika; Musiał, Tomasz; Wolski, Krzysztof
2015-12-01
This paper investigates sewage sludge disposal methods with particular emphasis on combustion as the priority disposal method. Sewage sludge incineration is an attractive option because it minimizes odour, significantly reduces the volume of the starting material and thermally destroys organic and toxic components of the off pads. Additionally, it is possible that the ashes could be used. Currently, as many as 11 plants use sewage sludge as fuel in Poland; thus, this technology must be further developed in Poland while considering the benefits of co-combustion with other fuels. This paper presents the results of experimental studies aimed at determining the mechanisms (defining the fuel combustion region by studying the effects of process parameters, including the size of the fuel sample, the temperature in the combustion chamber and the air velocity, on combustion) and kinetics (measurement of fuel temperature and mass changes) of fuel combustion in an air stream under different thermal conditions and flow rates. The combustion of the sludge samples in an air flow at temperatures between 800 and 900°C is a kinetic-diffusion process, determined by the sample size, the temperature of its surroundings and the air velocity. The adopted process parameters had significant impacts on the ignition time and ignition temperature of the fuel by volatiles, the combustion time of the volatiles, the time to reach the maximum fuel-surface temperature, that maximum temperature, the char combustion time and the total process time. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wason, James M. S.; Mander, Adrian P.
2012-01-01
Two-stage designs are commonly used for Phase II trials. Optimal two-stage designs have the lowest expected sample size for a specific treatment effect, for example, the null value, but can perform poorly if the true treatment effect differs. Here we introduce a design for continuous treatment responses that minimizes the maximum expected sample size across all possible treatment effects. The proposed design performs well for a wider range of treatment effects and so is useful for Phase II trials. We compare the design to a previously used optimal design and show it has superior expected sample size properties. PMID:22651118
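The minimax criterion described above, choosing the design whose worst-case expected sample size across a range of treatment effects is smallest, can be sketched as a grid search. The normal-model futility rule, the candidate designs, and the effect grid below are illustrative assumptions, not the authors' optimization method.

```python
import math

def expected_n(n1, n2, c, delta, sd=1.0):
    """Expected sample size of a two-stage design on a continuous
    (normal) endpoint: stop for futility after n1 observations if the
    stage-1 z-statistic falls below c, otherwise continue to n1 + n2.
    The design parametrization here is a simplified illustration."""
    z_mean = delta * math.sqrt(n1) / sd                      # mean of the z-statistic
    p_continue = 1.0 - 0.5 * (1.0 + math.erf((c - z_mean) / math.sqrt(2.0)))
    return n1 + n2 * p_continue

def minimax_design(candidates, deltas):
    """Among candidate designs (n1, n2, c), return the one minimizing
    the maximum expected sample size across the treatment-effect grid."""
    return min(candidates,
               key=lambda d: max(expected_n(*d, delta) for delta in deltas))

# Illustrative candidates and a grid of plausible treatment effects
candidates = [(20, 20, 0.5), (15, 30, 1.0), (25, 10, 0.0)]
deltas = [0.0, 0.25, 0.5]
best = minimax_design(candidates, deltas)
print(best)
```

An "optimal" design would instead minimize expected sample size at a single assumed effect (e.g. the null); the minimax choice trades a little efficiency there for protection when the true effect differs.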
Minimizing target interference in PK immunoassays: new approaches for low-pH-sample treatment.
Partridge, Michael A; Pham, John; Dziadiv, Olena; Luong, Onson; Rafique, Ashique; Sumner, Giane; Torri, Albert
2013-08-01
Quantitating total levels of monoclonal antibody (mAb) biotherapeutics in serum using ELISA may be hindered by soluble targets. We developed two low-pH-sample-pretreatment techniques to minimize target interference. The first procedure involves sample pretreatment at pH <3.0 before neutralization and analysis in a target capture ELISA. Careful monitoring of acidification time is required to minimize potential impact on mAb detection. The second approach involves sample dilution into mild acid (pH ∼4.5) before transferring to an anti-human capture-antibody-coated plate without neutralization. Analysis of target-drug and drug-capture antibody interactions at pH 4.5 indicated that the capture antibody binds to the drug, while the drug and the target were dissociated. Using these procedures, total biotherapeutic levels were accurately measured when soluble target was >30-fold molar excess. These techniques provide alternatives for quantitating mAb biotherapeutics in the presence of a target when standard acid-dissociation procedures are ineffective.
NASA Astrophysics Data System (ADS)
Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.
2017-12-01
Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.
Food processing by high hydrostatic pressure.
Yamamoto, Kazutaka
2017-04-01
High hydrostatic pressure (HHP) processing, as a nonthermal process, can be used to inactivate microbes while minimizing chemical reactions in food. In this regard, an HHP level of 100 MPa (986.9 atm / 1019.7 kgf/cm²) or more is applied to food. Conventional thermal processing damages food components related to color, flavor, and nutrition via enhanced chemical reactions. HHP processing minimizes this damage while inactivating microbes, enabling the production of high-quality, safe foods. The first commercial HHP-processed foods were launched in 1990 as fruit products such as jams, and other products have since been commercialized: retort rice products (enhanced water impregnation), cooked hams and sausages (shelf-life extension), soy sauce with minimized salt (short-time fermentation owing to enhanced enzymatic reactions), and beverages (shelf-life extension). The characteristics of HHP food processing are reviewed from the viewpoints of nonthermal processing, history, research and development, physical and biochemical changes, and processing equipment.
Elliott, Paul; Peakman, Tim C
2008-04-01
UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. 
A standard panel of haematology assays is completed on whole blood from all participants, since such assays need to be conducted on fresh samples (whereas other assays can be done on stored samples). By the end of the recruitment phase, 15 million sample aliquots will be stored in two geographically separate archives: 9.5 million in a -80 degrees C automated archive and 5.5 million in a manual liquid nitrogen archive at -180 degrees C. Because of the size of the study and the numbers of samples obtained from participants, the protocol stipulates a highly automated approach for the processing and storage of samples. Implementation of the processes, technology, systems and facilities has followed best practices used in manufacturing industry to reduce project risk and to build in quality and robustness. The data produced from sample collection, processing and storage are highly complex and are managed by a commercially available LIMS system fully integrated with the entire process. The sample handling and storage protocol adopted by UK Biobank provides quality assured and validated methods that are feasible within the available funding and reflect the size and aims of the project. Experience from recruiting and processing the first 40,000 participants to the study demonstrates that the adopted methods and technologies are fit-for-purpose and robust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, T.
Ten chemical processing cell (CPC) experiments were performed using simulant to evaluate Sludge Batch 9 for sludge-only and coupled processing using the nitric-formic flowsheet in the Defense Waste Processing Facility (DWPF). Sludge Receipt and Adjustment Tank (SRAT) and Slurry Mix Evaporator (SME) cycles were performed on eight of the ten. The other two were SRAT cycles only. Samples of the condensate, sludge, and off gas were taken to monitor the chemistry of the CPC experiments. The Savannah River National Laboratory (SRNL) has previously shown antifoam decomposes to form flammable organic products (hexamethyldisiloxane (HMDSO), trimethylsilanol (TMS), and propanal) that are present in the vapor phase and condensate of the CPC vessels. To minimize antifoam degradation product formation, a new antifoam addition strategy was implemented at SRNL and DWPF to add antifoam undiluted.
Polley, Spencer D.; Mori, Yasuyoshi; Watson, Julie; Perkins, Mark D.; González, Iveth J.; Notomi, Tsugunori; Chiodini, Peter L.; Sutherland, Colin J.
2010-01-01
Loop-mediated isothermal amplification (LAMP) of DNA offers the ability to detect very small quantities of pathogen DNA following minimal tissue sample processing and is thus an attractive methodology for point-of-care diagnostics. Previous attempts to diagnose malaria by the use of blood samples and LAMP have targeted the parasite small-subunit rRNA gene, with a resultant sensitivity for Plasmodium falciparum of around 100 parasites per μl. Here we describe the use of mitochondrial targets for LAMP-based detection of any Plasmodium genus parasite and of P. falciparum specifically. These new targets allow routine amplification from samples containing as few as five parasites per μl of blood. Amplification is complete within 30 to 40 min and is assessed by real-time turbidimetry, thereby offering rapid diagnosis with greater sensitivity than is achieved by the most skilled microscopist or antigen detection using lateral flow immunoassays. PMID:20554824
Investigation on the structural characterization of pulsed p-type porous silicon
NASA Astrophysics Data System (ADS)
Wahab, N. H. Abd; Rahim, A. F. Abd; Mahmood, A.; Yusof, Y.
2017-08-01
P-type porous silicon (PS) was successfully formed by using electrochemical pulse etching (PC) and conventional direct current (DC) etching techniques. The PS was etched in a hydrofluoric acid (HF) based solution at a current density of J = 10 mA/cm² for 30 minutes from a crystalline silicon wafer with (100) orientation. For the PC process, the current was supplied through a pulse generator with a 14 ms cycle time (T), comprising a 10 ms on time (Ton) and a 4 ms pause time (Toff). FESEM, EDX, AFM, and XRD were used to characterize the morphological properties of the PS. FESEM images showed that the pulsed PS (PPC) sample produces more uniform circular structures, with an estimated average pore size of 42.14 nm, compared with an estimated average of 16.37 nm for the DC porous (PDC) sample. The EDX spectra for both samples showed high Si content with minimal presence of oxide.
Biowaste monitoring system for shuttle
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Sauer, R. L.
1975-01-01
The acquisition of crew biomedical data has been an important task on all manned space missions from Project Mercury through the recently completed Skylab missions. The monitoring of metabolic wastes from the crew is an important aspect of this activity. On early missions, emphasis was placed on the collection and return of biowaste samples for post-mission analysis. On later missions such as Skylab, equipment for inflight measurement was also added. Life science experiments are being proposed for Shuttle missions which will require the inflight measurement and sampling of metabolic wastes. In order to minimize the crew impact associated with these requirements, a high degree of automation of these processes will be required. This paper reviews the design and capabilities of urine biowaste monitoring equipment provided on past manned space programs and defines and describes the urine volume measurement and sampling equipment planned for the Shuttle Orbiter program.
Origin and Correction of Magnetic Field Inhomogeneity at the Interface in Biphasic NMR Samples
Martin, Bryan T.; Chingas, G. C.
2012-01-01
The use of susceptibility matching to minimize spectral distortion of biphasic samples layered in a standard 5 mm NMR tube is described. The approach uses magic angle spinning (MAS) to first extract chemical shift differences by suppressing bulk magnetization. Then, using biphasic coaxial samples, magnetic susceptibilities are matched by titration with a paramagnetic salt. The matched phases are then layered in a standard NMR tube where they can be shimmed and examined. Line widths of two distinct spectral lines, selected to characterize homogeneity in each phase, are simultaneously optimized. Two-dimensional distortion-free, slice-resolved spectra of an octanol/water system illustrate the method. These data are obtained using a 2D stepped-gradient pulse sequence devised for this application. Advantages of this sequence over slice-selective methods are that acquisition efficiency is increased and processing requires only conventional software. PMID:22459062
Shepard, Michele; Brenner, Sara
2014-01-01
Numerous studies are ongoing in the fields of nanotoxicology and exposure science; however, gaps remain in identifying and evaluating potential exposures from skin contact with engineered nanoparticles in occupational settings. The aim of this study was to identify potential cutaneous exposure scenarios at a workplace using engineered nanoparticles (alumina, ceria, amorphous silica) and evaluate the presence of these materials on workplace surfaces. Process review, workplace observations, and preliminary surface sampling were conducted using microvacuum and wipe sample collection methods and transmission electron microscopy with elemental analysis. Exposure scenarios were identified with potential for incidental contact. Nanoparticles of silica or silica and/or alumina agglomerates (or aggregates) were identified in surface samples from work areas where engineered nanoparticles were used or handled. Additional data are needed to evaluate occupational exposures from skin contact with engineered nanoparticles; precautionary measures should be used to minimize potential cutaneous exposures in the workplace.
Beiske, K; Burchill, S A; Cheung, I Y; Hiyama, E; Seeger, R C; Cohn, S L; Pearson, A D J; Matthay, K K
2009-01-01
Disseminating disease is a predictive and prognostic indicator of poor outcome in children with neuroblastoma. Its accurate and sensitive assessment can facilitate optimal treatment decisions. The International Neuroblastoma Risk Group (INRG) Task Force has defined standardised methods for the determination of minimal disease (MD) by immunocytology (IC) and quantitative reverse transcriptase-polymerase chain reaction (QRT-PCR) using disialoganglioside GD2 and tyrosine hydroxylase mRNA respectively. The INRG standard operating procedures (SOPs) define methods for collecting, processing and evaluating bone marrow (BM), peripheral blood (PB) and peripheral blood stem cell harvest by IC and QRT-PCR. Sampling PB and BM is recommended at diagnosis, before and after myeloablative therapy and at the end of treatment. Peripheral blood stem cell products should be analysed at the time of harvest. Performing MD detection according to INRG SOPs will enable laboratories throughout the world to compare their results and thus facilitate quality-controlled multi-centre prospective trials to assess the clinical significance of MD and minimal residual disease in heterogeneous patient groups. PMID:19401690
Reverberation time influences musical enjoyment with cochlear implants.
Certo, Michael V; Kohlberg, Gavriel D; Chari, Divya A; Mancuso, Dean M; Lalwani, Anil K
2015-02-01
To identify factors that enhance the enjoyment of music in cochlear implant (CI) recipients. Specifically, we assessed the hypothesis that variations in reverberation time (RT60) may be linked to variations in the level of musical enjoyment in CI users. Prospective analysis of music enjoyment in normal-hearing individuals. Single tertiary academic medical center. Normal-hearing adults (N = 20) were asked to rate a novel 20-second melody on three enjoyment modalities: musicality, pleasantness, and naturalness. Subjective rating of music excerpts. Participants listened to seven different instruments play the melody, each with five levels (0.2, 1.6, 3.0, 5.0, 10.0 s) of RT60, both with and without CI simulation processing. Linear regression analysis with analysis of variance was used to assess the impact of RT60 on music enjoyment. Without CI simulation, music samples with RT60 = 3.0 seconds were ranked most pleasant and most musical, whereas those with RT60 = 1.6 seconds and RT60 = 3.0 seconds were ranked equally most natural (all p < 0.05). With CI simulation, music samples with RT60 = 0.2 seconds were ranked most pleasant, most musical, and most natural (all p < 0.05). Samples without CI simulation show a preference for middle-range RT60, whereas samples with CI simulation show a negative linear relationship between RT60 and musical enjoyment, with preference for minimal reverberation. Minimization of RT60 may be a useful strategy for increasing musical enjoyment under CI conditions, both in altering existing music as well as in composition of new music.
Intelligent Controller for a Compact Wide-Band Compositional Infrared Fourier Transform Spectrometer
NASA Astrophysics Data System (ADS)
Yiu, P.; Keymeulen, D.; Berisford, D. F.; Hand, K. P.; Carlson, R. W.
2013-12-01
This paper presents the design and integration of an intelligent controller for CIRIS (Compositional InfraRed Interferometric Spectrometer) on a stand-alone field programmable gate array (FPGA) architecture. CIRIS is a novel take on traditional Fourier Transform Spectrometers (FTS) and replaces linearly moving mirrors (characteristic of Michelson interferometers) with a constant-velocity rotating refractor to variably phase shift and alter the path length of incoming light. This design eliminates the need for the periodically accelerating/decelerating mirrors inherent to canonical Michelson designs and allows for a compact and robust device that is intrinsically radiation-hard, making it ideal for spaceborne measurements in the near-IR to thermal-IR band (2-12 μm) on planetary exploration missions. A traditional Michelson FTS passes incident light from the sample through a system of refractors/mirrors followed by a mirror moving linearly in the plane of the incident light. This process selectively blocks certain wavelengths and permits measurement of the sample's absorption rates as a function of the wavelengths blocked to produce an 'interferogram.' This is subsequently processed using a Fourier transform to obtain the sample's spectrum and ascertain the sample's composition. With our prototype CIRIS instrument in development at Design and Prototype Inc. and NASA-JPL, we propose the use of a rotating refractor spinning at a constant velocity to variably phase shift incident light to the detector as an alternative to a linearly moving mirror. This design eliminates sensitivity to vibrations, minimizing path length and non-linear errors due to minor perturbations to the system, in addition to facilitating the compact design critical to meeting the strict volume requirements of spacecraft. Further, this is done without sacrificing spectral resolution or throughput when compared to Michelson or diffractive designs. 
While Michelson designs typically achieve very high wavelength resolution, the intended application of our instrument (spectroscopic investigation of Europa's surface) places higher emphasis on the greater wavelength band sensitivity in the 2-12 μm range provided by a rotating refractor design. The instrument's embedded microcontroller is implemented on a flight-qualified VIRTEX-5 FPGA with the aim of sampling the instrument's detector and optical rotary encoder in order to construct an interferogram. Subsequent signal processing, including a Fast Fourier Transform (FFT), noise reduction/averaging, and spectral calibration techniques are applied in real-time to compose the sample spectrum. Deployment of an FPGA eliminates the instrument's need to share computing resources with the main spacecraft computer and takes advantage of the low power consumption and high-throughput hardware parallelism intrinsic to FPGA applications. This parallelism facilitates the high speed, low latency sampling/signal processing critical to instrument precision with minimal power consumption to achieve highly sensitive spectra within the constraints of available spacecraft resources. The instrument is characterized in simulated space-flight conditions and we demonstrate that this technology is capable of meeting the strict volume, sensitivity, and power consumption requirements for implementation in scientific space systems.
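The core interferogram-to-spectrum step performed by the FPGA can be illustrated with a toy monochromatic source: the FFT of a cosine interferogram in optical path difference (OPD) peaks at the source wavenumber. The sample count, OPD range, and wavenumber below are invented for the sketch; the real pipeline adds noise reduction/averaging and spectral calibration as described above.

```python
import numpy as np

# Toy interferogram for a single spectral line: intensity varies as a
# cosine in optical path difference (values are illustrative only).
n = 1024
opd = np.linspace(0.0, 0.1, n, endpoint=False)    # OPD axis, cm
sigma = 2000.0                                    # source wavenumber, cm^-1
interferogram = 0.5 * (1.0 + np.cos(2.0 * np.pi * sigma * opd))

# Remove the DC offset, then Fourier-transform to recover the spectrum
spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
wavenumbers = np.fft.rfftfreq(n, d=opd[1] - opd[0])   # cm^-1 axis
peak = wavenumbers[np.argmax(spectrum)]
print(peak)
```

The spectral resolution of this toy transform is 1/(max OPD) = 10 cm^-1, which is why a rotating-refractor design's wider band can be traded against the very high resolution of long-travel Michelson mirrors.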
Chen, Chun-Nan; Lin, Che-Yi; Chi, Fan-Hsiang; Chou, Chen-Han; Hsu, Ya-Ching; Kuo, Yen-Lin; Lin, Chih-Feng; Chen, Tseng-Cheng; Wang, Cheng-Ping; Lou, Pei-Jen; Ko, Jenq-Yuh; Hsiao, Tzu-Yu; Yang, Tsung-Lin
2016-01-01
Tumors of the supraclavicular fossa (SC) are clinically challenging because of anatomical complexity and tumor pathological diversity. Because of the varied disease entities and treatment choices for SC tumors, making an accurate decision among numerous differential diagnoses is imperative. Sampling by open biopsy (OB) remains the standard procedure for pathological confirmation. However, the complicated anatomical structures of the SC always render surgical intervention difficult to perform. Ultrasound-guided core biopsy (USCB) is a minimally invasive, office-based procedure for tissue sampling widely applied in many diseases of the head and neck. This study aims to evaluate the clinical efficacy and utility of USCB as the sampling method for SC tumors. From 2009 to 2014, consecutive patients who presented clinical symptoms and signs of supraclavicular tumors and were scheduled to receive sampling procedures for diagnostic confirmation were recruited. The patients received USCB or OB in the initial tissue sampling. The accurate diagnostic rate based on pathological results was 90.2% for USCB and 93.6% for OB. No significant difference was noted between the USCB and OB groups in terms of diagnostic accuracy or the percentage of inadequate specimens. All cases in the USCB group had the sampling procedure completed within 10 minutes, but none in the OB group did. No scars larger than 1 cm were found with USCB. Only patients in the OB group required general anesthesia and hospitalization and had postoperative scars. Accordingly, USCB can serve as the first-line sampling tool for SC tumors, with high diagnostic accuracy, minimal invasiveness, and low medical cost. PMID:26825877
Optimal Inspection of Imports to Prevent Invasive Pest Introduction.
Chen, Cuicui; Epanchin-Niell, Rebecca S; Haight, Robert G
2018-03-01
The United States imports more than 1 billion live plants annually, an important and growing pathway for the introduction of damaging nonnative invertebrates and pathogens. Inspection of imports is one safeguard for reducing pest introductions, but capacity constraints limit inspection effort. We develop an optimal sampling strategy to minimize the costs of pest introductions from trade by posing inspection as an acceptance sampling problem that incorporates key features of the decision context, including (i) simultaneous inspection of many heterogeneous lots, (ii) a lot-specific sampling effort, (iii) a budget constraint that limits total inspection effort, (iv) inspection error, and (v) an objective of minimizing cost from accepted defective units. We derive a formula for the expected number of accepted infested units (expected slippage) given lot size, sample size, infestation rate, and detection rate, and we formulate and analyze the inspector's optimization problem of allocating a sampling budget among incoming lots to minimize the cost of slippage. We conduct an empirical analysis of live plant inspection, including estimation of plant infestation rates from historical data, and find that inspections optimally target the largest lots with the highest plant infestation rates, leaving some lots unsampled. We also consider that USDA-APHIS, which administers inspections, may want to continue inspecting all lots at a baseline level; we find that allocating any additional capacity, beyond a comprehensive baseline inspection, to the largest lots with the highest infestation rates allows inspectors to meet the dual goals of minimizing the costs of slippage and maintaining baseline sampling without substantial compromise. © 2017 Society for Risk Analysis.
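The slippage calculation and budget allocation can be sketched under a simple random-sampling model in which an infested unit is caught only if it is both sampled and detected. This is one plausible form of the expected-slippage formula, not necessarily the paper's exact expression; because it is linear in sample size, a greedy allocation by marginal benefit mirrors the finding that the largest, most-infested lots are targeted first while low-risk lots go unsampled.

```python
def expected_slippage(N, n, p, d):
    """Expected infested units accepted from a lot of size N when n units
    are sampled, with infestation rate p and per-unit detection rate d.
    Assumed simple form: an infested unit slips unless it is both
    sampled (probability n/N) and detected (probability d)."""
    return N * p * (1.0 - (n / N) * d)

def allocate(lots, budget):
    """Greedy budget allocation across heterogeneous lots: each extra
    sampled unit in lot i reduces expected slippage by p_i * d_i, so
    fill lots in decreasing order of that marginal benefit."""
    order = sorted(range(len(lots)), key=lambda i: -(lots[i][1] * lots[i][2]))
    alloc = [0] * len(lots)
    for i in order:
        N, p, d = lots[i]
        take = min(N, budget)
        alloc[i] = take
        budget -= take
        if budget == 0:
            break
    return alloc

# Three illustrative (lot size, infestation rate, detection rate) lots
# and a total sampling budget of 150 units
lots = [(100, 0.05, 0.9), (200, 0.01, 0.9), (50, 0.20, 0.9)]
print(allocate(lots, 150))
```

Under this model the high-infestation lots are inspected to capacity first, and the lowest-risk lot is left unsampled once the budget runs out, consistent with the optimal targeting result above.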
Smart Cup: A Minimally-Instrumented, Smartphone-Based Point-of-Care Molecular Diagnostic Device.
Liao, Shih-Chuan; Peng, Jing; Mauk, Michael G; Awasthi, Sita; Song, Jinzhao; Friedman, Harvey; Bau, Haim H; Liu, Changchun
2016-06-28
Nucleic acid amplification-based diagnostics offer rapid, sensitive, and specific means for detecting and monitoring the progression of infectious diseases. However, this method typically requires extensive sample preparation, expensive instruments, and trained personnel, all of which hinder its use in resource-limited settings, where many infectious diseases are endemic. Here, we report on a simple, inexpensive, minimally-instrumented, smart cup platform for rapid, quantitative molecular diagnostics of pathogens at the point of care. Our smart cup takes advantage of a water-triggered, exothermic chemical reaction to supply heat for the nucleic acid-based, isothermal amplification. The amplification temperature is regulated with a phase-change material (PCM). The PCM maintains the amplification reactor at a constant temperature, typically 60-65°C, when ambient temperatures range from 12 to 35°C. To eliminate the need for an optical detector and minimize cost, we use the smartphone's flashlight to excite the fluorescent dye and the phone camera to record real-time fluorescence emission during the amplification process. The smartphone can concurrently monitor multiple amplification reactors and analyze the recorded data. Our smart cup's utility was demonstrated by amplifying and quantifying herpes simplex virus type 2 (HSV-2) with a LAMP assay in our custom-made microfluidic diagnostic chip. We have consistently detected as few as 100 copies of HSV-2 viral DNA per sample. Our system does not require any lab facilities and is suitable for use at home, in the field, and in the clinic, as well as in resource-poor settings, where access to sophisticated laboratories is impractical, unaffordable, or nonexistent.
Bobik, Krzysztof; Dunlap, John R.; Burch-Smith, Tessa M.
2014-01-01
Since the 1940s transmission electron microscopy (TEM) has been providing biologists with ultra-high resolution images of biological materials. Yet, because of laborious and time-consuming protocols that also demand experience in preparation of artifact-free samples, TEM is not considered a user-friendly technique. Traditional sample preparation for TEM used chemical fixatives to preserve cellular structures. High-pressure freezing is the cryofixation of biological samples under high pressures to produce very fast cooling rates, thereby restricting ice formation, which is detrimental to the integrity of cellular ultrastructure. High-pressure freezing and freeze substitution are currently the methods of choice for producing the highest quality morphology in resin sections for TEM. These methods minimize the artifacts normally associated with conventional processing for TEM of thin sections. After cryofixation the frozen water in the sample is replaced with liquid organic solvent at low temperatures, a process called freeze substitution. Freeze substitution is typically carried out over several days in dedicated, costly equipment. A recent innovation allows the process to be completed in three hours, instead of the usual two days. This is typically followed by several more days of sample preparation that includes infiltration and embedding in epoxy resins before sectioning. Here we present a protocol combining high-pressure freezing and quick freeze substitution that enables plant sample fixation to be accomplished within hours. The protocol can readily be adapted for working with other tissues or organisms. Plant tissues are of special concern because of the presence of aerated spaces and water-filled vacuoles that impede ice-free freezing of water. In addition, the process of chemical fixation is especially long in plants due to cell walls impeding the penetration of the chemicals to deep within the tissues. 
Plant tissues are therefore particularly challenging, but this protocol is reliable and produces samples of the highest quality. PMID:25350384
Nutritional Value of Commercial Protein-Rich Plant Products.
Mattila, Pirjo; Mäkinen, Sari; Eurola, Merja; Jalava, Taina; Pihlava, Juha-Matti; Hellström, Jarkko; Pihlanto, Anne
2018-06-01
The goal of this work was to analyze the nutritional value of various minimally processed commercial products of plant protein sources such as faba bean (Vicia faba), lupin (Lupinus angustifolius), rapeseed press cake (Brassica rapa/napus subsp. Oleifera), flaxseed (Linum usitatissimum), oil hemp seed (Cannabis sativa), buckwheat (Fagopyrum esculentum), and quinoa (Chenopodium quinoa). Basic composition and various nutritional components such as amino acids, sugars, minerals, and dietary fiber were determined. Nearly all the samples studied could be considered good sources of essential amino acids, minerals, and dietary fiber. The highest content of crude protein (over 30 g/100 g DW) was found in faba bean, blue lupin, and rapeseed press cake. The total amount of essential amino acids (EAA) ranged from 25.8 g/16 g N in oil hemp hulls to 41.5 g/16 g N in pearled quinoa. All the samples studied have a nutritionally favorable composition with significant health-benefit potential. Processing (dehulling or pearling) greatly affected the contents of the analyzed nutrients.
NASA Technical Reports Server (NTRS)
1977-01-01
The 20x9 TDI array was developed to meet the LANDSAT Thematic Mapper requirements. This array is based upon a self-aligned, transparent gate, buried channel process. The process features: (1) buried channel, four phase, overlapping gate CCDs for high transfer efficiency without fat zero; (2) self-aligned transistors to minimize clock feedthrough and parasitic capacitance; and (3) a transparent tin oxide electrode for high quantum efficiency with front surface irradiation. The requirements placed on the array and the performance achieved are summarized. These data are the result of flat-field measurements only; no imaging or dynamic target measurements were made during this program. Measurements were performed with two different test stands. The bench test equipment fabricated for this program operated at the 8 μs line time and employed simple sampling of the gated MOSFET output video signal. The second stand employed correlated double sampling (CDS) and operated at a 79.2 μs line time.
Re-Os Systematics and HSE Distribution in Metal from Ochansk (H4) Chondrite
NASA Technical Reports Server (NTRS)
Smoliar, M. I.; Horan, M. F.; Alexander, C. M. OD.; Walker, R. J.
2003-01-01
Previous studies of the Re-Os systematics of chondrites have documented considerable variation in the Re/Os ratios of whole rock samples. For some whole rock chondrites, Re-Os systematics display deviations from the primitive isochron that are considerably larger than deviations in other isotope systems. A possible interpretation of these observations is that the Re-Os system in chondrites is particularly sensitive to post-formation alteration processes, thus providing a useful tool to examine such processes. Significant variations that have been detected in highly siderophile element (HSE) patterns for ordinary chondrites support this conclusion. We report Re-Os isotope data for metal separates from the Ochansk H4 chondrite, coupled with abundance data for Ru, Pd, Ir, and Pt determined in the same samples by isotope dilution. We chose this meteorite mainly because it is an observed fall with minimal signs of weathering and because of its low metamorphic grade (H4) and shock stage (S3).
Creamer, Jessica S; Mora, Maria F; Willis, Peter A
2017-01-17
Amino acids are fundamental building blocks of terrestrial life as well as ubiquitous byproducts of abiotic reactions. In order to distinguish between amino acids formed by abiotic versus biotic processes it is possible to use chemical distributions to identify patterns unique to life. This article describes two capillary electrophoresis methods capable of resolving 17 amino acids found in high abundance in both biotic and abiotic samples (seven enantiomer pairs d/l-Ala, -Asp, -Glu, -His, -Leu, -Ser, -Val and the three achiral amino acids Gly, β-Ala, and GABA). To resolve the 13 neutral amino acids one method utilizes a background electrolyte containing γ-cyclodextrin and sodium taurocholate micelles. The acidic amino acid enantiomers were resolved with γ-cyclodextrin alone. These methods allow detection limits down to 5 nM for the neutral amino acids and 500 nM for acidic amino acids and were used to analyze samples collected from Mono Lake with minimal sample preparation.
Paretti, Nicholas; Coes, Alissa L.; Kephart, Christopher M.; Mayo, Justine
2018-03-05
Tumacácori National Historical Park protects the culturally important Mission, San José de Tumacácori, while also managing a portion of the ecologically diverse riparian corridor of the Santa Cruz River. This report describes the methods and quality assurance procedures used in the collection of water samples for the analysis of Escherichia coli (E. coli), microbial source tracking markers, suspended sediment, water-quality parameters, turbidity, and the data collection for discharge and stage; the process for data review and approval is also described. Finally, this report provides a quantitative assessment of the quality of the E. coli, microbial source tracking, and suspended sediment data.The data-quality assessment revealed that bias attributed to field and laboratory contamination was minimal, with E. coli detections in only 3 out of 33 field blank samples analyzed. Concentrations in the field blanks were several orders of magnitude lower than environmental concentrations. The microbial source tracking (MST) field blank was below the detection limit for all MST markers analyzed. Laboratory blanks for E. coli at the USGS Arizona Water Science Center and laboratory blanks for MST markers at the USGS Ohio Water Microbiology Laboratory were all below the detection limit. Irreplicate data for E. coli and suspended sediment indicated that bias was not introduced to the data by combining samples collected using discrete sampling methods with samples collected using automatic sampling methods.The split and sequential E. coli replicate data showed consistent analytical variability and a single equation was developed to explain the variability of E. coli concentrations. An additional analysis of analytical variability for E. coli indicated analytical variability around 18 percent relative standard deviation and no trend was observed in the concentration during the processing and analysis of multiple split-replicates. 
Two replicate samples were collected for MST and individual markers were compared for a base flow and flood sample. For the markers found in common between the two types of samples, the relative standard deviation for the base flow sample was more than 3 times greater than that for the markers in the flood sample. Sequential suspended sediment replicates had a relative standard deviation of about 1.3 percent, indicating that environmental and analytical variability was minimal. A holding time review and laboratory study analysis supported the extended holding times required for this investigation. Most concentrations for flood and base-flow samples were within the theoretical variability specified in the most probable number approach, suggesting that the extended holding times did not overly influence the final concentrations reported.
Martínez Steele, Eurídice; Baraldi, Larissa Galastri; Louzada, Maria Laura da Costa; Moubarac, Jean-Claude; Mozaffarian, Dariush; Monteiro, Carlos Augusto
2016-01-01
Objectives To investigate the contribution of ultra-processed foods to the intake of added sugars in the USA. Ultra-processed foods were defined as industrial formulations which, besides salt, sugar, oils and fats, include substances not used in culinary preparations, in particular additives used to imitate sensorial qualities of minimally processed foods and their culinary preparations. Design Cross-sectional study. Setting National Health and Nutrition Examination Survey 2009–2010. Participants We evaluated 9317 participants aged 1+ years with at least one 24 h dietary recall. Main outcome measures Average dietary content of added sugars and proportion of individuals consuming more than 10% of total energy from added sugars. Data analysis Gaussian and Poisson regressions estimated the association between consumption of ultra-processed foods and intake of added sugars. All models incorporated survey sample weights and adjusted for age, sex, race/ethnicity, family income and educational attainment. Results Ultra-processed foods comprised 57.9% of energy intake, and contributed 89.7% of the energy intake from added sugars. The content of added sugars in ultra-processed foods (21.1% of calories) was eightfold higher than in processed foods (2.4%) and fivefold higher than in unprocessed or minimally processed foods and processed culinary ingredients grouped together (3.7%). Both in unadjusted and adjusted models, each increase of 5 percentage points in proportional energy intake from ultra-processed foods increased the proportional energy intake from added sugars by 1 percentage point. Consumption of added sugars increased linearly across quintiles of ultra-processed food consumption: from 7.5% of total energy in the lowest quintile to 19.5% in the highest. A total of 82.1% of Americans in the highest quintile exceeded the recommended limit of 10% energy from added sugars, compared with 26.4% in the lowest. 
Conclusions Decreasing the consumption of ultra-processed foods could be an effective way of reducing the excessive intake of added sugars in the USA. PMID:26962035
2015-01-01
Glioblastoma multiforme (GBM) is the most aggressive malignant primary brain tumor, with a dismal mean survival even with the current standard of care. Although in vitro cell systems can provide mechanistic insight into the regulatory networks governing GBM cell proliferation and migration, clinical samples provide a more physiologically relevant view of oncogenic signaling networks. However, clinical samples are not widely available and may be embedded for histopathologic analysis. With the goal of accurately identifying activated signaling networks in GBM tumor samples, we investigated the impact of embedding in optimal cutting temperature (OCT) compound followed by flash freezing in LN2 vs immediate flash freezing (iFF) in LN2 on protein expression and phosphorylation-mediated signaling networks. Quantitative proteomic and phosphoproteomic analysis of 8 pairs of tumor specimens revealed minimal impact of the different sample processing strategies and highlighted the large interpatient heterogeneity present in these tumors. Correlation analyses of the differentially processed tumor sections identified activated signaling networks present in selected tumors and revealed the differential expression of transcription, translation, and degradation associated proteins. This study demonstrates the capability of quantitative mass spectrometry for identification of in vivo oncogenic signaling networks from human tumor specimens that were either OCT-embedded or immediately flash-frozen. PMID:24927040
NASA Technical Reports Server (NTRS)
Choi, Michael K.
2017-01-01
A thermal design concept of using propylene loop heat pipes to minimize survival heater power for NASA's Evolutionary Xenon Thruster power processing units is presented. It reduces the survival heater power from 183 W to 35 W per power processing unit. The reduction is 81%.
Minimization of reflection cracks in flexible pavements.
DOT National Transportation Integrated Search
1977-01-01
This report describes the performance of fabrics used under overlays in an effort to minimize longitudinal and alligator cracking in flexible pavements. It is concluded, although the sample size is small, that the treatments will extend the pavement ...
NASA Astrophysics Data System (ADS)
Imada, Keita; Nakamura, Katsuhiko
This paper describes recent improvements to the Synapse system for incremental learning of general context-free grammars (CFGs) and definite clause grammars (DCGs) from positive and negative sample strings. An important feature of our approach is incremental learning, realized by a rule generation mechanism called "bridging," which is based on bottom-up parsing of positive samples and a search over rule sets. The sizes of the rule sets and the computation time depend on the search strategy. In addition to global search, which synthesizes minimal rule sets, and serial search, another method that synthesizes semi-optimum rule sets, we incorporate beam search into the system for synthesizing semi-minimal rule sets. The paper presents several experimental results on learning CFGs and DCGs, and we analyze the sizes of the rule sets and the computation time.
Development of a novel method for amniotic fluid stem cell storage.
Zavatti, Manuela; Beretti, Francesca; Casciaro, Francesca; Comitini, Giuseppina; Franchi, Fabrizia; Barbieri, Veronica; Bertoni, Laura; De Pol, Anto; La Sala, Giovanni B; Maraldi, Tullia
2017-08-01
Current procedures for collection of human amniotic fluid stem cells (hAFSCs) indicate that cells cultured in a flask for 2 weeks can then be used for research. However, hAFSCs can be retrieved directly from a small amount of amniotic fluid that can be obtained at the time of diagnostic amniocentesis. The aim of this study was to determine whether direct freezing of amniotic fluid cells is able to maintain or improve the potential of a sub-population of stem cells. We compared the potential of hAFSCs with respect to the timing of freezing: cells obtained directly from amniotic fluid aspiration (D samples) versus cells cultured in a flask before freezing (C samples). Colony-forming-unit ability, proliferation, morphology, stemness-related marker expression, senescence, apoptosis and differentiation potential of C and D samples were compared. hAFSCs isolated from D samples expressed mesenchymal stem cell markers until later passages, had a good proliferation rate and exhibited differentiation capacity similar to that of hAFSCs from C samples. Interestingly, direct freezing induced a higher concentration of cells positive for pluripotency stem cell markers, without teratoma formation in vivo. This study suggests that minimal processing may be adequate for the banking of amniotic fluid cells, avoiding in vitro passages before storage and exposure to high oxygen concentrations, which affect stem cell properties. This technique might be a cost-effective and reasonable approach to the process of Good Manufacturing Practice accreditation for stem-cell banks. Copyright © 2017 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Energy Efficient GNSS Signal Acquisition Using Singular Value Decomposition (SVD).
Bermúdez Ordoñez, Juan Carlos; Arnaldo Valdés, Rosa María; Gómez Comendador, Fernando
2018-05-16
A significant challenge in global navigation satellite system (GNSS) signal processing is the requirement for a very high sampling rate. The recently emerging compressed sensing (CS) theory makes it possible to process GNSS signals at a low sampling rate if the signal has a sparse representation in a certain space. Based on CS and SVD theories, this research proposes an algorithm for sampling GNSS signals at a rate much lower than the Nyquist rate and reconstructing the compressed signal; the reconstructed output is validated by performing signal detection with the standard fast Fourier transform (FFT) parallel frequency-space search acquisition. A sparse representation of the GNSS signal is the most important precondition for CS; it is achieved by constructing a rectangular Toeplitz matrix (TZ) of the transmitted signal and calculating its left singular vectors using the SVD. Next, M-dimensional observation vectors are obtained from the left singular vectors of the SVD, which are equivalent to the sampler operator in standard compressive sensing theory; the signal can thus be sampled below the Nyquist rate and still be reconstructed accurately via ℓ1 minimization using convex optimization. As an added benefit, there is a GNSS signal acquisition enhancement effect: projecting the signal onto the most significant proper orthogonal modes (PODs), which are the optimal distributions of signal power, retains the useful signal and filters out noise. The algorithm is validated with real recorded signals, and the results show that the proposed method is effective for sampling and reconstructing intermediate-frequency (IF) GNSS signals in the discrete-time domain.
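The pipeline described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: a ±1 pseudo-random code stands in for the transmitted GNSS signal, random Gaussian projections stand in for the measurement operator, and greedy orthogonal matching pursuit replaces the convex ℓ1 solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toeplitz matrix of a +/-1 pseudo-random code (a stand-in for the
# transmitted GNSS code); its left singular vectors give the sparsifying basis.
N = 64
code = rng.choice([-1.0, 1.0], size=N)
T = np.array([[code[(i - j) % N] for j in range(N)] for i in range(N)])
U, _, _ = np.linalg.svd(T)            # U is an orthonormal N x N basis

# A test signal that is k-sparse in the U basis.
k = 3
c = np.zeros(N)
idx = rng.choice(N, size=k, replace=False)
c[idx] = np.array([1.5, -2.0, 1.0])   # arbitrary nonzero coefficients
x = U @ c

# Sub-Nyquist measurement: M << N random projections (the "sampler").
M = 20
Phi = rng.normal(size=(M, N)) / np.sqrt(M)
y = Phi @ x

# Sparse recovery by orthogonal matching pursuit, a greedy stand-in for
# the l1-minimization step: repeatedly pick the dictionary atom most
# correlated with the residual, then refit by least squares.
A = Phi @ U
support, r = [], y.copy()
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ r)))
    support.append(j)
    cs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ cs
x_hat = U[:, support] @ cs            # reconstructed signal
```

With M = 20 measurements of an N = 64 sample signal, the k = 3 sparse coefficients are recovered from far fewer samples than the signal length, which is the essence of the sub-Nyquist claim.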
Vogeser, Michael; Spöhrer, Ute
2006-01-01
Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.
Energy Efficient GNSS Signal Acquisition Using Singular Value Decomposition (SVD)
Arnaldo Valdés, Rosa María; Gómez Comendador, Fernando
2018-01-01
A significant challenge in global navigation satellite system (GNSS) signal processing is the requirement for a very high sampling rate. The recently emerging compressed sensing (CS) theory makes it possible to process GNSS signals at a low sampling rate if the signal has a sparse representation in a certain space. Based on CS and SVD theories, this research proposes an algorithm for sampling GNSS signals at a rate much lower than the Nyquist rate and reconstructing the compressed signal; the reconstructed output is validated by performing signal detection with the standard fast Fourier transform (FFT) parallel frequency-space search acquisition. A sparse representation of the GNSS signal is the most important precondition for CS; it is achieved by constructing a rectangular Toeplitz matrix (TZ) of the transmitted signal and calculating its left singular vectors using the SVD. Next, M-dimensional observation vectors are obtained from the left singular vectors of the SVD, which are equivalent to the sampler operator in standard compressive sensing theory; the signal can thus be sampled below the Nyquist rate and still be reconstructed accurately via ℓ1 minimization using convex optimization. As an added benefit, there is a GNSS signal acquisition enhancement effect: projecting the signal onto the most significant proper orthogonal modes (PODs), which are the optimal distributions of signal power, retains the useful signal and filters out noise. The algorithm is validated with real recorded signals, and the results show that the proposed method is effective for sampling and reconstructing intermediate-frequency (IF) GNSS signals in the discrete-time domain. PMID:29772731
MIXI: Mobile Intelligent X-Ray Inspection System
NASA Astrophysics Data System (ADS)
Arodzero, Anatoli; Boucher, Salime; Kutsaev, Sergey V.; Ziskin, Vitaliy
2017-07-01
A novel, low-dose Mobile Intelligent X-ray Inspection (MIXI) concept is being developed at RadiaBeam Technologies. The MIXI concept relies on a linac-based, adaptive, ramped energy source of short X-ray packets of pulses, a new type of fast X-ray detector, rapid processing of detector signals for intelligent control of the linac, and advanced radiography image processing. The key parameters for this system include: better than 3 mm line pair resolution; penetration greater than 320 mm of steel equivalent; scan speed with 100% image sampling rate of up to 15 km/h; and material discrimination over a range of thicknesses up to 200 mm of steel equivalent. Its minimal radiation dose, size and weight allow MIXI to be placed on a lightweight truck chassis.
Analysis of problems and failures in the measurement of soil-gas radon concentration.
Neznal, Martin; Neznal, Matěj
2014-07-01
Long-term experience in the field of soil-gas radon concentration measurements makes it possible to describe and explain the most frequent causes of failures that can appear in practice when various types of measurement methods and soil-gas sampling techniques are used. The concept of minimal sampling depth, which depends on the volume of the soil-gas sample and on the soil properties, is described in detail. Considering the minimal sampling depth when planning measurements helps avoid the most common mistakes. Ways to identify influencing parameters, to avoid dilution of soil-gas samples by atmospheric air, and to recognize inappropriate sampling methods are discussed. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Silva, Catarina; Cavaco, Carina; Perestrelo, Rosa; Pereira, Jorge; Câmara, José S.
2014-01-01
For a long time, sample preparation was unrecognized as a critical issue in the analytical methodology, thus limiting the performance that could be achieved. However, the improvement of microextraction techniques, particularly microextraction by packed sorbent (MEPS) and solid-phase microextraction (SPME), completely modified this scenario by introducing unprecedented control over this process. Urine is a biological fluid that is very interesting for metabolomics studies, allowing human health and disease characterization in a minimally invasive form. In this manuscript, we will critically review the most relevant and promising works in this field, highlighting how the metabolomic profiling of urine can be an extremely valuable tool for the early diagnosis of highly prevalent diseases, such as cardiovascular, oncologic and neurodegenerative ones. PMID:24958388
Ju, Guangxu; Highland, Matthew J; Yanguas-Gil, Angel; Thompson, Carol; Eastman, Jeffrey A; Zhou, Hua; Brennan, Sean M; Stephenson, G Brian; Fuoss, Paul H
2017-03-01
We describe an instrument that exploits the ongoing revolution in synchrotron sources, optics, and detectors to enable in situ studies of metal-organic vapor phase epitaxy (MOVPE) growth of III-nitride materials using coherent x-ray methods. The system includes high-resolution positioning of the sample and detector including full rotations, an x-ray transparent chamber wall for incident and diffracted beam access over a wide angular range, and minimal thermal sample motion, giving the sub-micron positional stability and reproducibility needed for coherent x-ray studies. The instrument enables surface x-ray photon correlation spectroscopy, microbeam diffraction, and coherent diffraction imaging of atomic-scale surface and film structure and dynamics during growth, to provide fundamental understanding of MOVPE processes.
NASA Astrophysics Data System (ADS)
Ju, Guangxu; Highland, Matthew J.; Yanguas-Gil, Angel; Thompson, Carol; Eastman, Jeffrey A.; Zhou, Hua; Brennan, Sean M.; Stephenson, G. Brian; Fuoss, Paul H.
2017-03-01
We describe an instrument that exploits the ongoing revolution in synchrotron sources, optics, and detectors to enable in situ studies of metal-organic vapor phase epitaxy (MOVPE) growth of III-nitride materials using coherent x-ray methods. The system includes high-resolution positioning of the sample and detector including full rotations, an x-ray transparent chamber wall for incident and diffracted beam access over a wide angular range, and minimal thermal sample motion, giving the sub-micron positional stability and reproducibility needed for coherent x-ray studies. The instrument enables surface x-ray photon correlation spectroscopy, microbeam diffraction, and coherent diffraction imaging of atomic-scale surface and film structure and dynamics during growth, to provide fundamental understanding of MOVPE processes.
Sorzano, Carlos Oscars S; Pérez-De-La-Cruz Moreno, Maria Angeles; Burguet-Castell, Jordi; Montejo, Consuelo; Ros, Antonio Aguilar
2015-06-01
Pharmacokinetics (PK) applications can be seen as a special case of nonlinear, causal systems with memory. There are cases in which prior knowledge exists about the distribution of the system parameters in a population. However, for a specific patient in a clinical setting, we need to determine her system parameters so that the therapy can be personalized. This system identification is often performed by measuring drug concentrations in plasma. The objective of this work is to provide an irregular sampling strategy that minimizes the uncertainty about the system parameters with a fixed number of samples (cost constrained). We use Monte Carlo simulations to estimate the average Fisher information matrix associated with the PK problem, and then estimate the sampling points that minimize the maximum uncertainty associated with the system parameters (a minimax criterion). The minimization is performed using a genetic algorithm. We show that such a sampling scheme can be designed in a way that is adapted to a particular patient, can accommodate any dosing regimen, and allows flexible therapeutic strategies. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
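The design problem described above can be illustrated with a deliberately simplified sketch: a one-compartment IV bolus model with analytic sensitivities, a D-optimality (log-det of the Fisher information matrix) criterion in place of the paper's Monte Carlo minimax objective, and exhaustive search over a small time grid in place of the genetic algorithm. All numeric values here are hypothetical.

```python
import itertools
import math

# One-compartment IV bolus model: C(t) = (dose/V) * exp(-k*t),
# with analytic sensitivities of C with respect to the parameters (V, k).
def sensitivities(t, dose, V, k):
    c = (dose / V) * math.exp(-k * t)
    dV = -c / V          # dC/dV
    dk = -c * t          # dC/dk
    return dV, dk

def fim_logdet(times, dose, V, k):
    """log det of the 2x2 Fisher information matrix (unit noise variance)."""
    a = b = d = 0.0
    for t in times:
        s1, s2 = sensitivities(t, dose, V, k)
        a += s1 * s1
        b += s1 * s2
        d += s2 * s2
    det = a * d - b * b
    return math.log(det) if det > 0 else float("-inf")

def best_design(grid, n_samples, dose, V, k):
    """Exhaustively pick the n_samples time points maximizing log det FIM."""
    return max(itertools.combinations(grid, n_samples),
               key=lambda ts: fim_logdet(ts, dose, V, k))

grid = [0.25, 0.5, 1, 2, 4, 8, 12, 24]   # candidate sampling times (h)
design = best_design(grid, 3, dose=100.0, V=10.0, k=0.2)
```

For larger grids or richer models the exhaustive search becomes infeasible, which is exactly where a stochastic optimizer such as the genetic algorithm mentioned in the abstract becomes attractive.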
Oden, Timothy D.
2011-01-01
The Gulf Coast aquifer system is the primary water supply for Montgomery County in southeastern Texas, including part of the Houston metropolitan area and the cities of Magnolia, Conroe, and The Woodlands Township, Texas. The U.S. Geological Survey, in cooperation with the Lone Star Groundwater Conservation District, collected environmental tracer data in the Gulf Coast aquifer system, primarily in Montgomery County. Forty existing groundwater wells screened in the Gulf Coast aquifer system were selected for sampling in Montgomery County (38 wells), Waller County (1 well), and Walker County (1 well). Groundwater-quality samples, physicochemical properties, and water-level data were collected once from each of the 40 wells during March-September 2008. Groundwater-quality samples were analyzed for dissolved gases and the environmental tracers sulfur hexafluoride, chlorofluorocarbons, tritium, helium-4, and helium-3/tritium. Water samples were collected and processed onsite using methods designed to minimize changes to the water-sample chemistry or contamination from the atmosphere. Replicate samples for quality assurance and quality control were collected with each environmental sample. Well-construction information and environmental tracer data for March-September 2008 are presented.
A combined method for DNA analysis and radiocarbon dating from a single sample.
Korlević, Petra; Talamo, Sahra; Meyer, Matthias
2018-03-07
Current protocols for ancient DNA and radiocarbon analysis of ancient bones and teeth call for multiple destructive samplings of a given specimen, thereby increasing the extent of undesirable damage to precious archaeological material. Here we present a method that makes it possible to obtain both ancient DNA sequences and radiocarbon dates from the same sample material. This is achieved by releasing DNA from the bone matrix through incubation with either EDTA or phosphate buffer prior to complete demineralization and collagen extraction utilizing the acid-base-acid-gelatinization and ultrafiltration procedure established in most radiocarbon dating laboratories. Using a set of 12 bones of different ages and preservation conditions, we demonstrate that, on average, 89% of the DNA can be released from sample powder with minimal detectable collagen loss, or 38% without any. We also detect no skews in radiocarbon dates compared to untreated samples. Given the different material demands of radiocarbon dating (500 mg of bone/dentine) and DNA analysis (10-100 mg), combined DNA and collagen extraction not only streamlines the sampling process but also drastically increases the amount of DNA that can be recovered from limited sample material.
Laser Induced Breakdown Spectroscopy of Glass and Crystal Samples
NASA Astrophysics Data System (ADS)
Sharma, Prakash; Sandoval, Alejandra; Carter, Michael; Kumar, Akshaya
2015-03-01
Different types of quartz crystals and rare-earth-ion-doped glasses have been identified using the laser-induced breakdown spectroscopy (LIBS) technique. LIBS is a real-time technique that can be used to identify samples in the solid, liquid, and gas phases. The advantage of the LIBS technique is that no sample preparation is required and the laser causes extremely minimal damage to the sample surface. The LIBS spectra of silicate glasses, prepared by the sol-gel method and doped with different concentrations of rare earth ions, have been recorded. The limit of detection of rare earth ions in glass samples has been calculated. A total of 10 spectra of each sample were recorded and then averaged to obtain a final spectrum. An Ocean Optics LIBS2500plus spectrometer along with a Q-switched Nd:YAG laser (Quantel, Big Sky) was used to record the LIBS spectra. This spectrometer can analyze samples in the spectral range of 200 nm to 980 nm. The spectra were processed with OOILIBS-plus (v1.0) software. This study has applications in industry, where different crystals can be easily identified before they go for shaping and polishing. Also, the concentration of rare earth ions in glass can be monitored in real time for quality control.
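A limit-of-detection calculation like the one mentioned above is commonly done with the 3σ criterion: fit a calibration line of emission intensity versus dopant concentration, then divide three times the standard deviation of repeated blank measurements by the calibration slope. The sketch below uses made-up numbers, not the paper's data.

```python
import numpy as np

# Hypothetical calibration data: rare-earth dopant concentration (wt%)
# versus background-corrected emission line intensity (arbitrary units).
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
intensity = np.array([2.0, 52.0, 101.0, 203.0, 399.0])

# Linear calibration: intensity = slope * concentration + intercept.
slope, intercept = np.polyfit(conc, intensity, 1)

# Scatter of repeated measurements of an undoped (blank) sample.
blank = np.array([1.6, 2.1, 2.4, 1.9, 2.0])
sigma_blank = blank.std(ddof=1)

# 3-sigma limit of detection, in the same units as conc (wt%).
lod = 3.0 * sigma_blank / slope
```

With these invented numbers the LOD comes out around 0.01 wt%; with real spectra the blank scatter and calibration slope would come from replicate LIBS measurements of the doped glass series.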
Huff, Jacquelyn K; Bresnahan, James F; Davies, Malonne I
2003-06-06
This study evaluated the suitability of some disinfection and sterilization methods for use with microdialysis probes. Disinfection or sterilization should minimize the tissue inflammatory reaction, improve the long-term health of rats on study, and ensure the quality of data obtained by microdialysis sampling. Furthermore, the treatment should not negatively impact probe integrity or sampling performance. The techniques chosen for evaluation included two disinfection methods (70% ethanol and a commercial contact lens solution) and two sterilization methods (hydrogen peroxide plasma and e-beam radiation). Linear microdialysis probes treated by these processes were compared to untreated probes removed from the manufacturer's packaging as if sterile (the control group). The probes were aseptically implanted in the livers of rats and monitored for 72 hours. The parameters chosen to evaluate probe performance were relative sample mass recovery and the relative in vivo extraction efficiency of the probe for caffeine. Post mortem bacterial counts and histopathology examination of liver tissue were also conducted. The probes remained intact and functional for the entire study period. The methods tested did not acutely alter the probes, although the hydrogen peroxide plasma and contact lens solution groups showed reduced extraction efficiencies. Minimal tissue damage was observed surrounding the probes, and the acute inflammatory reaction was mild to moderate. The low numbers of bacterial colonies from the implantation sites indicate that the health of the animals in this study was not impaired. This was also true for the control group (untreated probe).
Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Victor
2012-07-15
A novel lab-on-valve system has been developed for strontium determination in environmental samples. The miniaturized lab-on-valve system potentially offers facilities for any kind of chemical and physical process, including fluidic and microcarrier bead control, homogeneous reaction, and liquid-solid interaction. A rapid, inexpensive and fully automated method for the separation and preconcentration of total and radioactive strontium, using a solid phase extraction material (Sr-Resin), has been developed. Total strontium concentrations are determined by ICP-OES and (90)Sr activities by a low-background proportional counter. The method has been successfully applied to different water samples of environmental interest. The proposed system offers minimization of sample handling, drastic reduction of reagent volume, improvement of reproducibility and sample throughput, and a significant decrease in both time and cost per analysis. The LLD for total Sr is 1.8 ng and the minimum detectable activity for (90)Sr is 0.008 Bq. The repeatability of the separation procedure is 1.2% (n=10). Copyright © 2011 Elsevier B.V. All rights reserved.
Leck, Kira
2006-10-01
Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.
ERIC Educational Resources Information Center
Bacharach, Samuel B.; And Others
1990-01-01
Study of five sets of work process variables and their relationship to role conflict and overload among public sector nurses and engineers found managerial strategies appropriate for minimizing role conflict not necessarily appropriate for minimizing role overload. Some work process predictors may be similar across professions, and managerial…
Liu, Hongtao; Johnson, Jeffrey L.; Koval, Greg; Malnassy, Greg; Sher, Dorie; Damon, Lloyd E.; Hsi, Eric D.; Bucci, Donna Marie; Linker, Charles A.; Cheson, Bruce D.; Stock, Wendy
2012-01-01
Background In the present study, the prognostic impact of minimal residual disease during treatment on time to progression and overall survival was analyzed prospectively in patients with mantle cell lymphoma treated on the Cancer and Leukemia Group B 59909 clinical trial. Design and Methods Peripheral blood and bone marrow samples were collected during different phases of the Cancer and Leukemia Group B 59909 study for minimal residual disease analysis. Minimal residual disease status was determined by quantitative polymerase chain reaction of IgH and/or BCL-1/JH gene rearrangement. Correlation of minimal residual disease status with time to progression and overall survival was determined. In multivariable analysis, minimal residual disease and other risk factors were correlated with time to progression. Results Thirty-nine patients had evaluable, sequential peripheral blood and bone marrow samples for minimal residual disease analysis. Using peripheral blood monitoring, 18 of 39 (46%) achieved molecular remission following induction therapy. The molecular remission rate increased from 46 to 74% after one course of intensification therapy. Twelve of 21 minimal residual disease-positive patients (57%) progressed within three years of follow-up compared to 4 of 18 (22%) molecular remission patients (P=0.049). Detection of minimal residual disease following induction therapy predicted disease progression with a hazard ratio of 3.7 (P=0.016). The 3-year probability of time to progression among those who were in molecular remission after induction chemotherapy was 82% compared to 48% in patients with detectable minimal residual disease. The prediction of time to progression by post-induction minimal residual disease was independent of other prognostic factors in multivariable analysis. 
Conclusions Detection of minimal residual disease following induction immunochemotherapy was an independent predictor of time to progression following immunochemotherapy and autologous stem cell transplantation for mantle cell lymphoma. The clinical trial was registered at ClinicalTrials.gov: NCT00020943. PMID:22102709
Bolon, Brad; Krinke, Georg; Butt, Mark T; Rao, Deepa B; Pardo, Ingrid D; Jortner, Bernard S; Garman, Robert H; Jensen, Karl; Andrews-Jones, Lydia; Morrison, James P; Sharma, Alok K; Thibodeau, Michael S
2018-01-01
Peripheral nervous system (PNS) toxicity is surveyed inconsistently in nonclinical general toxicity studies. These Society of Toxicologic Pathology "best practice" recommendations are designed to ensure consistent, efficient, and effective sampling, processing, and evaluation of PNS tissues for four different situations encountered during nonclinical general toxicity (screening) and dedicated neurotoxicity studies. For toxicity studies where neurotoxicity is unknown or not anticipated (situation 1), PNS evaluation may be limited to one sensorimotor spinal nerve. If somatic PNS neurotoxicity is suspected (situation 2), analysis minimally should include three spinal nerves, multiple dorsal root ganglia, and a trigeminal ganglion. If autonomic PNS neuropathy is suspected (situation 3), parasympathetic and sympathetic ganglia should be assessed. For dedicated neurotoxicity studies where a neurotoxic effect is expected (situation 4), PNS sampling follows the strategy for situations 2 and/or 3, as dictated by functional or other compound/target-specific data. For all situations, bilateral sampling with unilateral processing is acceptable. For situations 1-3, PNS is processed conventionally (immersion in buffered formalin, paraffin embedding, and hematoxylin and eosin staining). For situation 4 (and situations 2 and 3 if resources and timing permit), perfusion fixation with methanol-free fixative is recommended. Where PNS neurotoxicity is suspected or likely, at least one (situations 2 and 3) or two (situation 4) nerve cross sections should be postfixed with glutaraldehyde and osmium before hard plastic resin embedding; soft plastic embedding is not a suitable substitute for hard plastic. Special methods may be used if warranted to further characterize PNS findings. Initial PNS analysis should be informed, not masked ("blinded"). 
Institutions may adapt these recommendations to fit their specific programmatic requirements but may need to explain in project documentation the rationale for their chosen PNS sampling, processing, and evaluation strategy.
Prediction and generation of binary Markov processes: Can a finite-state fox catch a Markov mouse?
NASA Astrophysics Data System (ADS)
Ruebeck, Joshua B.; James, Ryan G.; Mahoney, John R.; Crutchfield, James P.
2018-01-01
Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independently, identically distributed.
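The object of study here, a binary Markov process, can be made concrete with a small simulation. The sketch below is not the paper's minimal-generator construction; it simply generates a two-state Markov process from two transition probabilities (the values 0.3 and 0.6 are illustrative) and recovers them empirically from the output sequence.

```python
import random

def generate_binary_markov(p01, p10, n, seed=0):
    """Generate n symbols from a two-state (0/1) Markov process.

    p01: probability the next symbol is 1 given the current symbol is 0.
    p10: probability the next symbol is 0 given the current symbol is 1.
    """
    rng = random.Random(seed)
    x = 0  # arbitrary start state
    out = []
    for _ in range(n):
        out.append(x)
        if x == 0:
            x = 1 if rng.random() < p01 else 0
        else:
            x = 0 if rng.random() < p10 else 1
    return out

def estimate_transitions(seq):
    """Empirical transition probabilities (p01_hat, p10_hat)."""
    c0 = c01 = c1 = c10 = 0
    for a, b in zip(seq, seq[1:]):
        if a == 0:
            c0 += 1
            c01 += (b == 1)
        else:
            c1 += 1
            c10 += (b == 0)
    return c01 / c0, c10 / c1

seq = generate_binary_markov(0.3, 0.6, 100_000)
p01_hat, p10_hat = estimate_transitions(seq)
```

With 100,000 symbols the empirical estimates land within a fraction of a percent of the true parameters; the paper's point is that a machine *generating* such a process can require a different (and sometimes larger) state topology than the machine that optimally *predicts* it.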
Soil sampling kit and a method of sampling therewith
Thompson, Cyril V.
1991-01-01
A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds.
Soil sampling kit and a method of sampling therewith
Thompson, C.V.
1991-02-05
A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds. 11 figures.
Wu, Johnny; Witkiewitz, Katie; McMahon, Robert J; Dodge, Kenneth A
2010-10-01
Conduct problems, substance use, and risky sexual behavior have been shown to coexist among adolescents, which may lead to significant health problems. The current study was designed to examine relations among these problem behaviors in a community sample of children at high risk for conduct disorder. A latent growth model of childhood conduct problems showed a decreasing trend from grades K to 5. During adolescence, four concurrent conduct problem and substance use trajectory classes were identified (high conduct problems and high substance use, increasing conduct problems and increasing substance use, minimal conduct problems and increasing substance use, and minimal conduct problems and minimal substance use) using a parallel process growth mixture model. Across all substances (tobacco, binge drinking, and marijuana use), higher levels of childhood conduct problems during kindergarten predicted a greater probability of classification into more problematic adolescent trajectory classes relative to less problematic classes. For tobacco and binge drinking models, increases in childhood conduct problems over time also predicted a greater probability of classification into more problematic classes. For all models, individuals classified into more problematic classes showed higher proportions of early sexual intercourse, infrequent condom use, receiving money for sexual services, and ever contracting an STD. Specifically, tobacco use and binge drinking during early adolescence predicted higher levels of sexual risk taking into late adolescence. Results highlight the importance of studying the conjoint relations among conduct problems, substance use, and risky sexual behavior in a unified model. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Widhiarso, Wahyu; Rosyidi, Cucuk Nur
2018-02-01
Minimizing production cost in a manufacturing company will increase the profit of the company. The cutting parameters affect the total processing time, which in turn affects the production cost of the machining process. Besides affecting production cost and processing time, the cutting parameters also affect the environment. An optimization model is needed to determine the optimum cutting parameters. In this paper, we develop a multi-objective optimization model to minimize the production cost and the environmental impact of a CNC turning process. Cutting speed and feed rate serve as the decision variables. The constraints considered are cutting speed, feed rate, cutting force, output power, and surface roughness. The environmental impact is converted from the environmental burden using Eco-indicator 99. A numerical example is given to show the implementation of the model, which is solved using the OptQuest tool of the Oracle Crystal Ball software. The optimization results indicate that the model can be used to optimize the cutting parameters so as to minimize both the production cost and the environmental impact.
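The weighted-sum treatment of such a bi-objective problem can be sketched with a coarse grid search over the two decision variables under a surface-roughness constraint. The cost, eco-indicator, and roughness models below, and all of their coefficients, are made-up placeholders for illustration only; they are not the paper's models, which were solved with OptQuest rather than by enumeration.

```python
import itertools

def production_cost(v, f):
    # Hypothetical model: time-driven cost falls as the material
    # removal rate (v * f) rises; tool-wear cost grows with speed v.
    return 50.0 / (v * f) + 0.002 * v

def environmental_impact(v, f):
    # Hypothetical Eco-indicator-99-style score (placeholder numbers).
    return 30.0 / (v * f) + 0.004 * v

def optimize(w_cost=0.5, w_env=0.5, ra_max=1.6):
    """Grid search for the weighted sum of the two objectives.

    Decision variables: cutting speed v (m/min), feed rate f (mm/rev).
    """
    best, best_score = None, float("inf")
    for v in range(60, 301, 10):
        for f10 in range(5, 41):           # f = 0.05 .. 0.40 mm/rev
            f = f10 / 100.0
            ra = 1000.0 * f ** 2 / v       # hypothetical roughness model
            if ra > ra_max:                # surface-roughness constraint
                continue
            score = (w_cost * production_cost(v, f)
                     + w_env * environmental_impact(v, f))
            if score < best_score:
                best, best_score = (v, f), score
    return best, best_score

(best_v, best_f), score = optimize()
```

Both placeholder objectives trade a term that shrinks with the material removal rate against a term that grows with cutting speed, so the weighted optimum lands at an interior point of the feasible grid rather than at a corner, which is the qualitative behavior the abstract describes.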
Sampling of temporal networks: Methods and biases
NASA Astrophysics Data System (ADS)
Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter
2017-11-01
Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.
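The simplest of the sampling strategies discussed, uniform sampling of nodes, can be sketched on a toy contact sequence. The contact list and sampling fraction below are illustrative inventions, not the paper's data; the sketch keeps only contacts whose endpoints are both sampled, and shows that statistics such as the number of retained contacts are not preserved in proportion to the node fraction.

```python
import random

# A temporal network as a list of time-stamped contacts (u, v, t).
events = [
    ("a", "b", 1), ("b", "c", 2), ("a", "c", 2),
    ("c", "d", 3), ("b", "d", 4), ("a", "d", 5),
]

def uniform_node_sample(events, fraction, seed=0):
    """Uniformly sample a fraction of nodes and keep only the contacts
    with both endpoints sampled (the induced temporal subnetwork)."""
    rng = random.Random(seed)
    nodes = sorted({n for u, v, _ in events for n in (u, v)})
    k = max(1, round(fraction * len(nodes)))
    kept = set(rng.sample(nodes, k))
    return [(u, v, t) for u, v, t in events if u in kept and v in kept]

sub = uniform_node_sample(events, 0.75)
retained = len(sub) / len(events)  # fraction of contacts surviving
```

On this toy network, sampling 3 of the 4 nodes always retains exactly 3 of the 6 contacts: 75% of the nodes but only 50% of the contact events, a simple illustration of why sampled link-activity statistics must be interpreted with care.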
Scaling up the 454 Titanium Library Construction and Pooling of Barcoded Libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phung, Wilson; Hack, Christopher; Shapiro, Harris
2009-03-23
We have been developing a high throughput 454 library construction process at the Joint Genome Institute to meet the needs of de novo sequencing a large number of microbial and eukaryote genomes, EST, and metagenome projects. We have been focusing efforts in three areas: (1) modifying the current process to allow the construction of 454 standard libraries on a 96-well format; (2) developing a robotic platform to perform the 454 library construction; and (3) designing molecular barcodes to allow pooling and sorting of many different samples. In the development of a high throughput process to scale up the number of libraries by adapting the process to a 96-well plate format, the key process change involves the replacement of gel electrophoresis for size selection with Solid Phase Reversible Immobilization (SPRI) beads. Although the standard deviation of the insert sizes increases, the overall quality sequence and distribution of the reads in the genome has not changed. The manual process of constructing 454 shotgun libraries on 96-well plates is a time-consuming, labor-intensive, and ergonomically hazardous process; we have been experimenting to program a BioMek robot to perform the library construction. This will not only enable library construction to be completed in a single day, but will also minimize any ergonomic risk. In addition, we have implemented a set of molecular barcodes (AKA Multiple Identifiers or MID) and a pooling process that allows us to sequence many targets simultaneously. Here we will present the testing of pooling a set of selected fosmids derived from the endomycorrhizal fungus Glomus intraradices. By combining the robotic library construction process and the use of molecular barcodes, it is now possible to sequence hundreds of fosmids that represent a minimal tiling path of this genome. Here we present the progress and the challenges of developing these scaled-up processes.
NASA Astrophysics Data System (ADS)
Abdullah, N. O.; Pandebesie, E. S.
2018-03-01
Based on Indonesian Government Regulation No. 18 of 2008, solid waste management should be conducted from the source to minimize the amount of waste. The process includes waste from domestic, commercial, and institutional sources, and is also part of the 3R program (reduce, reuse, and recycle). Vegetable waste from markets is a potential material for producing biogas due to its chemical composition (hemicellulose, cellulose, and lignin), which makes the biomass suitable as a raw material for biogas. The acidity of vegetables is an obstacle in the biogas production process, so a buffer material is needed to improve the performance of the process. Cow manure is a buffer material that can be easily obtained. This research used 24 biogas reactors with a volume of 6 L each, operated by the batch method. Biogas volume was measured by reading the pressure change on the manometer. Methane measurement was conducted using Gas Chromatography (GC, Hewlett Packard HP-series 6890) on days 15 and 30. The research began with sample characterization and sample testing for total solids, volatile solids, lignin, C/N ratio, ammonium, and ash. Analysis of pH, temperature, and biogas volume was conducted every day.
Hu, Jian Zhi; Hu, Mary Y.; Townsend, Mark R.; Lercher, Johannes A.; Peden, Charles H. F.
2015-10-06
Re-usable ceramic magic angle spinning (MAS) NMR rotors constructed of high-mechanical-strength ceramics are detailed that include a sample compartment that maintains high pressures up to at least about 200 atmospheres (atm) and high temperatures up to at least about 300 °C during operation. The rotor designs minimize pressure losses stemming from penetration over an extended period of time. The present invention makes possible a variety of in-situ high-pressure, high-temperature MAS NMR experiments not previously achieved in the prior art.
Micromotors to capture and destroy anthrax simulant spores.
Orozco, Jahir; Pan, Guoqing; Sattayasamitsathit, Sirilak; Galarnyk, Michael; Wang, Joseph
2015-03-07
Towards addressing the need to detect and eliminate biothreats, we describe a micromotor-based approach for screening, capturing, isolating and destroying anthrax simulant spores in a simple and rapid manner with minimal sample processing. The B. globigii antibody-functionalized micromotors can recognize, capture and transport B. globigii spores in environmental matrices, while showing no interaction with an excess of non-target bacteria. Efficient destruction of the anthrax simulant spores is demonstrated via the micromotor-induced mixing of a mild oxidizing solution. The new micromotor-based approach paves the way to dynamic multifunctional systems that rapidly recognize, isolate, capture and destroy biological threats.
DOT National Transportation Integrated Search
2013-07-01
This report documents the performance of two low traffic volume experimental chip seals constructed using : locally available, minimally processed sand and gravel aggregates after four winters of service. The projects : were constructed by CDOT maint...
INTELLIGENT DECISION SUPPORT FOR WASTE MINIMIZATION IN ELECTROPLATING PLANTS. (R824732)
Wastewater, spent solvent, spent process solutions, and sludge are the major waste streams generated in large volumes daily in electroplating plants. These waste streams can be significantly minimized through process modification and operational improvement. I...
Ziajahromi, Shima; Neale, Peta A; Rintoul, Llew; Leusch, Frederic D L
2017-04-01
Wastewater effluent is expected to be a pathway for microplastics to enter the aquatic environment, with microbeads from cosmetic products and polymer fibres from clothes likely to enter wastewater treatment plants (WWTP). To date, few studies have quantified microplastics in wastewater. Moreover, the lack of a standardized and applicable method to identify microplastics in complex samples, such as wastewater, has limited the accurate assessment of microplastics and may lead to an incorrect estimation. This study aimed to develop a validated method to sample and process microplastics from wastewater effluent and to apply the developed method to quantify and characterise wastewater-based microplastics in effluent from three WWTPs that use primary, secondary and tertiary treatment processes. We applied a high-volume sampling device that fractionated microplastics in situ and an efficient sample processing procedure to improve the sampling of microplastics in wastewater and to minimize the false detection of non-plastic particles. The sampling device captured between 92% and 99% of polystyrene microplastics using 25 μm-500 μm mesh screens in laboratory tests. Microplastic type, size and suspected origin in all studied WWTPs, along with the removal efficiency during the secondary and tertiary treatment stages, was investigated. Suspected microplastics were characterised using Fourier Transform Infrared spectroscopy, with between 22 and 90% of the suspected microplastics found to be non-plastic particles. An average of 0.28, 0.48 and 1.54 microplastics per litre of final effluent was found in tertiary, secondary and primary treated effluent, respectively. This study suggests that although low concentrations of microplastics are detected in wastewater effluent, WWTPs still have the potential to act as a pathway to release microplastics given the large volumes of effluent discharged to the aquatic environment. 
This study focused on a single sampling campaign, with long-term monitoring recommended to further characterise microplastics in wastewater. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Procháska, F.; Vít, T.; Matoušek, O.; Melich, R.
2016-11-01
High demands on the final surface micro-roughness as well as great shape accuracy have to be met during the manufacturing process of the precise mirrors for the Metis orbital coronagraph. This is a challenging engineering task because of the lightweight design of the mirrors and the resulting poor stability of the optical surface shape. Manufacturing of such optical elements is usually affected by a number of effects, most of which are caused by instability of the temperature field. It is necessary to explore, understand, and consequently minimize all thermo-mechanical processes that take place during mirror cementing, grinding, and polishing in order to minimize the optical surface deformation. Application of FEM simulation proved to be a useful tool for this task. FEM simulations were used to develop and virtually compare different mirror holders to minimize the residual stress generated by temperature changes and to suppress the shape deformation of the optical surface below the critical limit of about 100 nm.
Minimal camera networks for 3D image based modeling of cultural heritage objects.
Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma
2014-03-25
3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue "Lamassu". Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883-859 BC). Close-range photogrammetry is used for the 3D modeling task, in which a dense, ordered imaging network of 45 high-resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling while preserving a pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for external validation and scaling purposes. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm.
Minimal Camera Networks for 3D Image Based Modeling of Cultural Heritage Objects
Alsadik, Bashar; Gerke, Markus; Vosselman, George; Daham, Afrah; Jasim, Luma
2014-01-01
3D modeling of cultural heritage objects like artifacts, statues and buildings is nowadays an important tool for virtual museums, preservation and restoration. In this paper, we introduce a method to automatically design a minimal imaging network for the 3D modeling of cultural heritage objects. This becomes important for reducing the image capture time and processing when documenting large and complex sites. Moreover, such a minimal camera network design is desirable for imaging non-digitally documented artifacts in museums and other archeological sites to avoid disturbing the visitors for a long time and/or moving delicate precious objects to complete the documentation task. The developed method is tested on the famous Iraqi statue “Lamassu”. Lamassu is a human-headed winged bull of over 4.25 m in height from the era of Ashurnasirpal II (883–859 BC). Close-range photogrammetry is used for the 3D modeling task, in which a dense, ordered imaging network of 45 high-resolution images was captured around Lamassu with an object sample distance of 1 mm. These images constitute a dense network, and the aim of our study was to apply our method to reduce the number of images for the 3D modeling while preserving a pre-defined point accuracy. Temporary control points were fixed evenly on the body of Lamassu and measured by using a total station for external validation and scaling purposes. Two network filtering methods are implemented and three different software packages are used to investigate the efficiency of the image orientation and modeling of the statue in the filtered (reduced) image networks. Internal and external validation results prove that minimal image networks can provide highly accurate records and efficiency in terms of visualization, completeness, processing time (>60% reduction) and the final accuracy of 1 mm. PMID:24670718
GROUND WATER ISSUE: LOW-FLOW (MINIMAL DRAWDOWN) GROUND-WATER SAMPLING PROCEDURES
This paper is intended to provide background information on the development of low-flow sampling procedures and its application under a variety of hydrogeologic settings. The sampling methodology described in this paper assumes that the monitoring goal is to sample monitoring wel...
An interface for the direct coupling of small liquid samples to AMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ognibene, T. J.; Thomas, A. T.; Daley, P. F.
We describe the moving wire interface attached to the 1-MV AMS system at LLNL’s Center for Accelerator Mass Spectrometry for the analysis of nonvolatile liquid samples, either as discrete drops or from the direct output of biochemical separatory instrumentation such as high-performance liquid chromatography (HPLC). Discrete samples containing at least a few tens of nanograms of carbon and as little as 50 zmol of 14C can be measured with 3–5% precision in a few minutes. The dynamic range of our system spans approximately 3 orders of magnitude. Sample-to-sample memory is minimized by the use of fresh targets for each discrete sample or by minimizing the amount of carbon present in an HPLC-generated peak containing a significant amount of 14C. As a result, liquid sample AMS provides a new technology to expand our biomedical AMS program by enabling the measurement of low-level biochemicals in extremely small samples that would otherwise be inaccessible.
An interface for the direct coupling of small liquid samples to AMS
Ognibene, T. J.; Thomas, A. T.; Daley, P. F.; ...
2015-05-28
We describe the moving wire interface attached to the 1-MV AMS system at LLNL’s Center for Accelerator Mass Spectrometry for the analysis of nonvolatile liquid samples as either discrete drops or from the direct output of biochemical separatory instrumentation, such as high-performance liquid chromatography (HPLC). Discrete samples containing at least a few 10 s of nanograms of carbon and as little as 50 zmol 14C can be measured with a 3–5% precision in a few minutes. The dynamic range of our system spans approximately 3 orders in magnitude. Sample to sample memory is minimized by the use of fresh targetsmore » for each discrete sample or by minimizing the amount of carbon present in a peak generated by an HPLC containing a significant amount of 14C. As a result, liquid sample AMS provides a new technology to expand our biomedical AMS program by enabling the capability to measure low-level biochemicals in extremely small samples that would otherwise be inaccessible.« less
Singh, Satwinder Kaur; Meyering, Maaike; Ramwadhdoebe, Tamara H; Stynenbosch, Linda F M; Redeker, Anke; Kuppen, Peter J K; Melief, Cornelis J M; Welters, Marij J P; van der Burg, Sjoerd H
2012-11-01
The ability to measure antigen-specific T cells at the single-cell level by intracellular cytokine staining (ICS) is a promising immunomonitoring tool and is extensively applied in the evaluation of immunotherapy of cancer. The protocols used to detect antigen-specific CD8+ T-cell responses generally work for the detection of antigen-specific T cells in samples that have undergone at least one round of in vitro pre-stimulation. Applying a common protocol with long peptides as antigens, however, was not suitable for simultaneously detecting antigen-specific CD8+ and CD4+ T cells directly ex vivo in cryopreserved samples. CD8 T-cell reactivity to monocytes pulsed with long peptides as antigens ranged between 5 and 25% of that observed against monocytes pulsed with a minimal CTL peptide epitope directly fitting HLA class I. Therefore, we adapted our ICS protocol and show that using a tenfold higher concentration of long peptides to load APCs, together with IFN-α and poly(I:C) to promote antigen processing and improve T-cell stimulation, allows the ex vivo detection of low-frequency antigen-specific CD8+ and CD4+ T cells in an HLA-independent setting. While most of the improvements were related to increasing the ability to measure CD8+ T-cell reactivity following stimulation with long peptides to at least 50% of the response detected when using a minimal peptide epitope, the final analysis of blood samples from vaccinated patients successfully showed that the adapted ICS protocol also increases the ability to detect ex vivo low-frequency p53-specific CD4+ T-cell responses in cryopreserved PBMC samples.
Stein, Eric D.; White, Bryan P.; Mazor, Raphael D.; Miller, Peter E.; Pilgrim, Erik M.
2013-01-01
Molecular methods, such as DNA barcoding, have the potential to enhance biomonitoring programs worldwide. Altering routinely used sample preservation methods to protect DNA from degradation may pose a potential impediment to application of DNA barcoding and metagenomics for biomonitoring using benthic macroinvertebrates. Using higher volumes or concentrations of ethanol, requirements for shorter holding times, or the need to include additional filtering may increase cost and logistical constraints to existing biomonitoring programs. To address this issue we evaluated the efficacy of various ethanol-based sample preservation methods at maintaining DNA integrity. We evaluated a series of methods that were minimally modified from typical field protocols in order to identify an approach that can be readily incorporated into existing monitoring programs. Benthic macroinvertebrates were collected from a minimally disturbed stream in southern California, USA and subjected to one of six preservation treatments. Ten individuals from five taxa were selected from each treatment and processed to produce DNA barcodes from the mitochondrial gene cytochrome c oxidase I (COI). On average, we obtained successful COI sequences (i.e. either full or partial barcodes) for 93–99% of all specimens across all six treatments. As long as samples were initially preserved in 95% ethanol, successful sequencing of COI barcodes was not affected by a low dilution ratio of 2∶1, transfer to 70% ethanol, presence of abundant organic matter, or holding times of up to six months. Barcoding success varied by taxa, with Leptohyphidae (Ephemeroptera) producing the lowest barcode success rate, most likely due to poor PCR primer efficiency. Differential barcoding success rates have the potential to introduce spurious results. However, routine preservation methods can largely be used without adverse effects on DNA integrity. PMID:23308097
Semantic-gap-oriented active learning for multilabel image annotation.
Tang, Jinhui; Zha, Zheng-Jun; Tao, Dacheng; Chua, Tat-Seng
2012-04-01
User interaction is an effective way to handle the semantic gap problem in image annotation. To minimize user effort in the interactions, many active learning methods have been proposed. These methods treat the semantic concepts individually or correlatively. However, they still neglect the key motivation of user feedback: to tackle the semantic gap. The size of the semantic gap of each concept is an important factor that affects the performance of user feedback. Users should pay more effort to the concepts with large semantic gaps, and vice versa. In this paper, we propose a semantic-gap-oriented active learning method, which incorporates the semantic gap measure into the information-minimization-based sample selection strategy. The basic learning model used in the active learning framework is an extended multilabel version of the sparse-graph-based semisupervised learning method that incorporates the semantic correlation. Extensive experiments conducted on two benchmark image data sets demonstrated the importance of bringing the semantic gap measure into the active learning process.
Ballari, Rajashekhar V; Martin, Asha
2013-12-01
DNA quality is an important parameter for the detection and quantification of genetically modified organisms (GMOs) using the polymerase chain reaction (PCR). Food processing leads to degradation of DNA, which may impair GMO detection and quantification. This study evaluated the effect of various processing treatments such as heating, baking, microwaving, autoclaving and ultraviolet (UV) irradiation on the relative transgenic content of MON 810 maize using pRSETMON-02, a dual target plasmid, as a model system. Amongst all the processing treatments examined, autoclaving and UV irradiation resulted in the least recovery of the transgenic (CaMV 35S promoter) and taxon-specific (zein) target DNA sequences. Although a profound impact on DNA degradation was seen during the processing, DNA could still be reliably quantified by real-time PCR. The measured mean DNA copy number ratios of the processed samples were in agreement with the expected values. Our study confirms the premise that the final analytical value assigned to a particular sample is independent of the degree of DNA degradation since the transgenic and the taxon-specific target sequences possessing approximately similar lengths degrade in parallel. The results of our study demonstrate that food processing does not alter the relative quantification of the transgenic content provided the quantitative assays target shorter amplicons and the difference in the amplicon size between the transgenic and taxon-specific genes is minimal. Copyright © 2013 Elsevier Ltd. All rights reserved.
Designing single- and multiple-shell sampling schemes for diffusion MRI using spherical code.
Cheng, Jian; Shen, Dinggang; Yap, Pew-Thian
2014-01-01
In diffusion MRI (dMRI), determining an appropriate sampling scheme is crucial for acquiring the maximal amount of information for data reconstruction and analysis using the minimal amount of time. For single-shell acquisition, uniform sampling without directional preference is usually favored. To achieve this, a commonly used approach is the Electrostatic Energy Minimization (EEM) method introduced in dMRI by Jones et al. However, the electrostatic energy formulation in EEM is not directly related to the goal of optimal sampling-scheme design, i.e., achieving large angular separation between sampling points. A mathematically more natural approach is to consider the Spherical Code (SC) formulation, which aims to achieve uniform sampling by maximizing the minimal angular difference between sampling points on the unit sphere. Although SC is well studied in the mathematical literature, its current formulation is limited to a single shell and is not applicable to multiple shells. Moreover, SC, or more precisely continuous SC (CSC), currently can only be applied on the continuous unit sphere and hence cannot be used in situations where one or several subsets of sampling points need to be determined from an existing sampling scheme. In this case, discrete SC (DSC) is required. In this paper, we propose novel DSC and CSC methods for designing uniform single-/multi-shell sampling schemes. The DSC and CSC formulations are solved respectively by Mixed Integer Linear Programming (MILP) and a gradient descent approach. A fast greedy incremental solution is also provided for both DSC and CSC. To our knowledge, this is the first work to use SC formulation for designing sampling schemes in dMRI. Experimental results indicate that our methods obtain larger angular separation and better rotational invariance than the generalized EEM (gEEM) method currently used in the Human Connectome Project (HCP).
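The abstract above turns on one objective: maximize the minimal angular separation between sampling directions. As a rough illustration of that objective only (not the paper's MILP or gradient-descent formulations), the sketch below greedily builds a small single-shell scheme from a random candidate pool, folding antipodal directions together as is standard in dMRI; the pool size and number of directions are arbitrary choices for the example.

```python
import numpy as np

def angular_sep(u, v):
    # Angle between two directions, folding antipodes together (|cos| trick),
    # since in dMRI a direction and its negation carry the same information.
    return np.arccos(np.clip(abs(np.dot(u, v)), 0.0, 1.0))

def greedy_spherical_code(candidates, n_points):
    # Greedy incremental selection: repeatedly add the candidate whose minimal
    # angle to the already-chosen points is largest (a cheap, suboptimal
    # stand-in for solving the discrete spherical-code problem exactly).
    chosen = [candidates[0]]
    while len(chosen) < n_points:
        best = max(candidates,
                   key=lambda c: min(angular_sep(c, p) for p in chosen))
        chosen.append(best)
    return np.array(chosen)

rng = np.random.default_rng(0)
pool = rng.normal(size=(500, 3))
pool /= np.linalg.norm(pool, axis=1, keepdims=True)  # unit vectors

scheme = greedy_spherical_code(list(pool), 6)
min_angle = min(angular_sep(scheme[i], scheme[j])
                for i in range(6) for j in range(i + 1, 6))
print(f"minimal angular separation: {np.degrees(min_angle):.1f} deg")
```

Each added direction locks in once chosen, so the result is suboptimal compared with an exact discrete solution, but the cost is only a linear scan of the pool per added point.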
Forcino, Frank L; Leighton, Lindsey R; Twerdy, Pamela; Cahill, James F
2015-01-01
Community ecologists commonly perform multivariate techniques (e.g., ordination, cluster analysis) to assess patterns and gradients of taxonomic variation. A critical requirement for a meaningful statistical analysis is accurate information on the taxa found within an ecological sample. However, oversampling (too many individuals counted per sample) also comes at a cost, particularly for ecological systems in which identification and quantification is substantially more resource consuming than the field expedition itself. In such systems, an increasingly larger sample size will eventually result in diminishing returns in improving any pattern or gradient revealed by the data, but will also lead to continually increasing costs. Here, we examine 396 datasets: 44 previously published and 352 created datasets. Using meta-analytic and simulation-based approaches, the research within the present paper seeks (1) to determine minimal sample sizes required to produce robust multivariate statistical results when conducting abundance-based, community ecology research. Furthermore, we seek (2) to determine the dataset parameters (i.e., evenness, number of taxa, number of samples) that require larger sample sizes, regardless of resource availability. We found that in the 44 previously published and the 220 created datasets with randomly chosen abundances, a conservative estimate of a sample size of 58 produced the same multivariate results as all larger sample sizes. However, this minimal number varies as a function of evenness, where increased evenness resulted in increased minimal sample sizes. Sample sizes as small as 58 individuals are sufficient for a broad range of multivariate abundance-based research. In cases when resource availability is the limiting factor for conducting a project (e.g., small university, time to conduct the research project), statistically viable results can still be obtained with less of an investment.
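The diminishing-returns argument above has a simple statistical core: the error of per-sample relative abundances shrinks roughly like 1/√n, so most of the gain comes from the first few dozen individuals counted. The sketch below illustrates only that intuition with a made-up seven-taxon community; it is not the authors' multivariate procedure.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical "true" community composition (seven taxa, uneven abundances).
true_p = np.array([0.4, 0.25, 0.15, 0.1, 0.05, 0.03, 0.02])

def mean_abs_error(n, reps=200):
    # Draw `reps` samples of n individuals and measure how far the observed
    # relative abundances sit, on average, from the true proportions.
    counts = rng.multinomial(n, true_p, size=reps)
    return np.mean(np.abs(counts / n - true_p))

errs = {n: mean_abs_error(n) for n in (10, 60, 500)}
for n, e in errs.items():
    print(f"n={n:4d}  mean abs error={e:.4f}")
```

The error drops steeply between 10 and 60 individuals and only modestly beyond, mirroring the paper's finding that sample sizes near 58 already reproduce the results of much larger counts.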
7 CFR 3430.36 - Procedures to minimize or eliminate duplication of effort.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE COMPETITIVE AND... may implement appropriate business processes to minimize or eliminate the awarding of CSREES Federal... awards made by other Federal agencies. Business processes may include the review of the Current and...
MPLP and the Catalog Record as a Finding Aid
ERIC Educational Resources Information Center
Bowen Maier, Shannon
2011-01-01
The cataloging of otherwise unprocessed collections is an innovative minimal processing technique with important implications for reference service. This article mines the existing literature for how institutions engaged in minimal processing view reference, the strengths and weaknesses of catalog records as finding aids, and information about…
Siroli, Lorenzo; Patrignani, Francesca; Serrazanetti, Diana I; Tabanelli, Giulia; Montanari, Chiara; Gardini, Fausto; Lanciotti, Rosalba
2015-05-01
Outbreaks of food-borne disease associated with the consumption of fresh and minimally processed fruits and vegetables have increased dramatically over the last few years. Traditional chemical sanitizers are unable to completely eradicate or kill the microorganisms on fresh produce. These conditions have stimulated research to alternative methods for increasing food safety. The use of protective cultures, particularly lactic acid bacteria (LAB), has been proposed for minimally processed products. However, the application of bioprotective cultures has been limited at the industrial level. From this perspective, the main aims of this study were to select LAB from minimally processed fruits and vegetables to be used as biocontrol agents and then to evaluate the effects of the selected strains, alone or in combination with natural antimicrobials (2-(E)-hexenal/hexanal, 2-(E)-hexenal/citral for apples and thyme for lamb's lettuce), on the shelf-life and safety characteristics of minimally processed apples and lamb's lettuce. The results indicated that applying the Lactobacillus plantarum strains CIT3 and V7B3 to apples and lettuce, respectively, increased both the safety and shelf-life. Moreover, combining the selected strains with natural antimicrobials produced a further increase in the shelf-life of these products without detrimental effects on the organoleptic qualities. Copyright © 2014 Elsevier Ltd. All rights reserved.
Principle of minimal work fluctuations.
Xiao, Gaoyang; Gong, Jiangbin
2015-08-01
Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e^(-βW)〉 = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
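For readers who want to see the Jarzynski equality in action, the snippet below checks it numerically for a Gaussian work distribution. The Gaussian assumption is chosen purely for illustration, because it admits the closed form ΔF = μ − βσ²/2; it is not a model of the adiabatic protocols studied in the paper.

```python
import numpy as np

# Toy numerical check of <e^(-beta*W)> = e^(-beta*dF) for Gaussian work.
# Parameters are arbitrary illustrative values.
rng = np.random.default_rng(1)
beta, mu, s2 = 1.0, 2.0, 0.5
W = rng.normal(mu, np.sqrt(s2), size=2_000_000)  # sampled work values

# Estimate dF from the exponential average, then compare with the
# closed-form result dF = mu - beta*s2/2 that holds for Gaussian W.
dF_est = -np.log(np.mean(np.exp(-beta * W))) / beta
dF_exact = mu - beta * s2 / 2
print(f"estimated dF = {dF_est:.4f}, exact dF = {dF_exact:.4f}")
```

Note how the exponential average is dominated by rare small-W samples; that convergence issue is exactly why the size of the fluctuations of e^(-βW) matters, which is the quantity the paper's principle minimizes.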
NASA Technical Reports Server (NTRS)
Seidel, A.; Soellner, W.; Stenzel, C.
2012-01-01
Electromagnetic levitation under microgravity provides unique opportunities for the investigation of liquid metals, alloys and semiconductors, both above and below their melting temperatures, with minimized disturbances of the sample under investigation. The opportunity to perform such experiments will soon be available on the ISS with the EML payload, which is currently being integrated. With its high-performance diagnostics systems, EML allows the measurement of various physical properties such as heat capacity, enthalpy of fusion, viscosity, surface tension, thermal expansion coefficient, and electrical conductivity. In studies of nucleation and solidification phenomena, the nucleation kinetics, phase selection, and solidification velocity can be determined. Advanced measurement capabilities currently being studied include the measurement and control of the residual oxygen content of the process atmosphere and a complementary inductive technique to measure thermophysical properties.
Project management techniques for highly integrated programs
NASA Technical Reports Server (NTRS)
Stewart, J. F.; Bauer, C. A.
1983-01-01
The management and control of a representative, highly integrated high-technology project, the X-29A aircraft flight test project, is addressed. The X-29A research aircraft required the development and integration of eight distinct technologies in one aircraft. The project management system developed for the X-29A flight test program focuses on the dynamic interactions and the intercommunication among components of the system. The insights gained from the new conceptual framework permitted subordination of departments to more functional units of decisionmaking, information processing, and communication networks. These processes were used to develop a project management system for the X-29A around the information flows that minimized the effects inherent in sampled-data systems and exploited the closed-loop multivariable nature of highly integrated projects.
Fast converging minimum probability of error neural network receivers for DS-CDMA communications.
Matyjas, John D; Psaromiligkos, Ioannis N; Batalama, Stella N; Medley, Michael J
2004-03-01
We consider a multilayer perceptron neural network (NN) receiver architecture for the recovery of the information bits of a direct-sequence code-division-multiple-access (DS-CDMA) user. We develop a fast converging adaptive training algorithm that minimizes the bit-error rate (BER) at the output of the receiver. The adaptive algorithm has three key features: i) it incorporates the BER, i.e., the ultimate performance evaluation measure, directly into the learning process, ii) it utilizes constraints that are derived from the properties of the optimum single-user decision boundary for additive white Gaussian noise (AWGN) multiple-access channels, and iii) it embeds importance sampling (IS) principles directly into the receiver optimization process. Simulation studies illustrate the BER performance of the proposed scheme.
Biohazards Assessment in Large-Scale Zonal Centrifugation
Baldwin, C. L.; Lemp, J. F.; Barbeito, M. S.
1975-01-01
A study was conducted to determine the biohazards associated with use of the large-scale zonal centrifuge for purification of moderate risk oncogenic viruses. To safely and conveniently assess the hazard, coliphage T3 was substituted for the virus in a typical processing procedure performed in a National Cancer Institute contract laboratory. Risk of personnel exposure was found to be minimal during optimal operation, but a definite potential for virus release from a number of centrifuge components during mechanical malfunction was shown by assay of surface, liquid, and air samples collected during the processing. High concentrations of phage were detected in the turbine air exhaust and the seal coolant system when faulty seals were employed. The simulant virus was also found on both the centrifuge chamber interior and rotor surfaces. PMID:1124921
Dayre McNally, J; Matheson, Loren A; Sankaran, Koravangattu; Rosenberg, Alan M
2008-11-01
This study compared 25-hydroxyvitamin D [25(OH)D] measurements in capillary and venous blood samples collected, respectively, by fingerprick and venipuncture. Capillary blood for measuring 25(OH)D has potential advantages by reducing the blood volume required (2 mL for venipuncture versus 0.3 mL for capillary sampling), facilitating blood collection in populations in whom venipuncture is difficult (e.g. infants and children), improving patient convenience and reducing costs associated with phlebotomy. The results demonstrated a highly significant relationship between 25(OH)D levels in serum derived from venous and capillary blood samples (r² = 0.901). Despite statistically higher 25(OH)D levels in fingerprick samples (108 ± 9 nmol/L) compared with venipuncture samples (90 ± 7 nmol/L), the correlation between venous and capillary samples supports this approach as a practical alternative to venipuncture for vitamin D determination. However, clinical application may require the incorporation of a correction factor for the assessment of insufficiency, and research studies should avoid using the two methods interchangeably. Studying vitamin D's role in health and disease requires collection techniques and measurement methods that are reliable, reproducible, easily accessible, inexpensive and minimally burdensome to the patient. The option to collect patient samples by fingerprick may facilitate the collection process.
Enzyme catalysis with small ionic liquid quantities.
Fischer, Fabian; Mutschler, Julien; Zufferey, Daniel
2011-04-01
Enzyme catalysis with minimal ionic liquid quantities improves reaction rates and stereoselectivity, and enables solvent-free processing. In particular, the widely used lipases combine well with many ionic liquids. Demonstrated applications are racemate separation, esterification and glycerolysis. Minimal-solvent processing is also an alternative to sluggish solvent-free catalysis. The method allows simplified downstream processing, as only traces of ionic liquids have to be removed.
Siracusa, Valentina; Blanco, Ignazio; Romani, Santina; Tylewicz, Urszula; Dalla Rosa, Marco
2012-10-01
This work reports an experimental study on the permeability and thermal behavior of a commercial polypropylene (PP) film used for fresh-cut potato packaging. The permeability was tested using oxygen, carbon dioxide, nitrogen, a mix of these 3 gases normally used for modified atmosphere packaging (MAP), and air, to understand whether it would be possible to extend the shelf life of this food product, designed for the catering field, with respect to the packaging behavior. The influence of temperature on the permeability data, from 5 to 40 °C, was analyzed before and after 4, 8, 12, 15, and 20 d of food contact, pointing out the dependence between temperature and the gas transmission rate (GTR), solubility (S), diffusion coefficient (D), and time lag (t(L)) parameters. The activation energies (E) of the permeation process were determined for the different gases used in the experiments. The thermal behavior of the PP film was studied by differential scanning calorimetry (DSC) and thermogravimetric analysis (TG) to characterize its thermal stability. Fourier transform infrared spectroscopy with attenuated total reflectance (FT-IR/ATR) was also performed in order to study the influence of food contact on the chemical characteristics of the polymer film. The results obtained were discussed and compared with each other. For all investigated gases, the studied samples showed an increase of gas permeability and S values at higher temperature. A heat-resistance classification was made between the film as received and after storage in modified atmospheres. Finally, all performed experiments showed good polymer stability over the shelf-life storage of the potatoes under study. The packaging material was studied over a range of temperatures simulating service conditions, to assess the suitability of a commercial polymer film for modified atmosphere packaging of minimally processed fresh-cut potatoes designed for catering purposes. © 2012 Institute of Food Technologists®
POLLUTION BALANCE: A NEW METHODOLOGY FOR MINIMIZING WASTE PRODUCTION IN MANUFACTURING PROCESSES.
A new methodolgy based on a generic pollution balance equation, has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative meas...
40 CFR 63.543 - What are my standards for process vents?
Code of Federal Regulations, 2014 CFR
2014-07-01
... develop and follow standard operating procedures designed to minimize emissions of total hydrocarbon for... manufacturer's recommended procedures, if available, and the standard operating procedures designed to minimize... 40 Protection of Environment 10 2014-07-01 2014-07-01 false What are my standards for process...
Technique minimizes the effects of dropouts on telemetry records
NASA Technical Reports Server (NTRS)
Anderson, T. O.; Hurd, W. J.
1972-01-01
Recorder deficiencies are minimized by using a two-channel system to prepare two tapes, each having the noise, wow and flutter, and dropout characteristics of the channel on which it was made. Processing the tapes by computer and combining the signals from the two channels produce a single tape free of dropouts caused by the recording process.
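The combining step described above can be caricatured in a few lines: model dropouts as zeroed samples on two independently recorded, time-aligned channels, and take each output sample from whichever channel survived. The signal, dropout rates, and perfect alignment are all invented assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
signal = np.sin(np.linspace(0, 2 * np.pi, 100))  # the "telemetry" waveform

# Each channel independently loses ~10% of its samples (dropouts -> zeros).
drop_a = rng.random(100) < 0.1
drop_b = rng.random(100) < 0.1
a = np.where(drop_a, 0.0, signal)
b = np.where(drop_b, 0.0, signal)

# Combine: wherever channel A dropped out, fall back to channel B.
combined = np.where(drop_a, b, a)

# The output is dropout-free except where both channels failed simultaneously.
n_unrecoverable = int((drop_a & drop_b).sum())
print(f"samples lost on both channels: {n_unrecoverable}")
```

With independent 10% dropout rates, only about 1% of samples are lost on both channels at once, which is the intuition behind recording the same data twice rather than improving a single channel.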
Asakawa, Naoya; Uchida, Keisuke; Sakakibara, Mamoru; Omote, Kazunori; Noguchi, Keiji; Tokuda, Yusuke; Kamiya, Kiwamu; Hatanaka, Kanako C; Matsuno, Yoshihiro; Yamada, Shiro; Asakawa, Kyoko; Fukasawa, Yuichiro; Nagai, Toshiyuki; Anzai, Toshihisa; Ikeda, Yoshihiko; Ishibashi-Ueda, Hatsue; Hirota, Masanori; Orii, Makoto; Akasaka, Takashi; Uto, Kenta; Shingu, Yasushige; Matsui, Yoshiro; Morimoto, Shin-Ichiro; Tsutsui, Hiroyuki; Eishi, Yoshinobu
2017-01-01
Although rare, cardiac sarcoidosis (CS) is potentially fatal. Early diagnosis and intervention are essential, but histopathologic diagnosis is limited. We aimed to detect Propionibacterium acnes, a commonly implicated etiologic agent of sarcoidosis, in myocardial tissues obtained from CS patients. We examined formalin-fixed paraffin-embedded myocardial tissues obtained by surgery or autopsy and endomyocardial biopsy from patients with CS (n = 26; CS-group), myocarditis (n = 15; M-group), or other cardiomyopathies (n = 39; CM-group) using immunohistochemistry (IHC) with a P. acnes-specific monoclonal antibody. We found granulomas in 16 (62%) CS-group samples. Massive (≥14 inflammatory cells) and minimal (<14 inflammatory cells) inflammatory foci, respectively, were detected in 16 (62%) and 11 (42%) of the CS-group samples, 10 (67%) and 10 (67%) of the M-group samples, and 1 (3%) and 18 (46%) of the CM-group samples. P. acnes-positive reactivity in granulomas, massive inflammatory foci, and minimal inflammatory foci was detected in 10 (63%), 10 (63%), and 8 (73%) of the CS-group samples, respectively, and in none of the M-group or CM-group samples. Frequent identification of P. acnes in sarcoid granulomas of originally aseptic myocardial tissues suggests that this indigenous bacterium causes granuloma in many CS patients. IHC detection of P. acnes in massive or minimal inflammatory foci of myocardial biopsy samples without granulomas may be useful for differentiating sarcoidosis from myocarditis or other cardiomyopathies.
Ju, Guangxu; Highland, Matthew J.; Yanguas-Gil, Angel; ...
2017-03-21
Here, we describe an instrument that exploits the ongoing revolution in synchrotron sources, optics, and detectors to enable in situ studies of metal-organic vapor phase epitaxy (MOVPE) growth of III-nitride materials using coherent x-ray methods. The system includes high-resolution positioning of the sample and detector including full rotations, an x-ray transparent chamber wall for incident and diffracted beam access over a wide angular range, and minimal thermal sample motion, giving the sub-micron positional stability and reproducibility needed for coherent x-ray studies. The instrument enables surface x-ray photon correlation spectroscopy, microbeam diffraction, and coherent diffraction imaging of atomic-scale surface and film structure and dynamics during growth, to provide fundamental understanding of MOVPE processes.
Optimization of integrated impeller mixer via radiotracer experiments.
Othman, N; Kamarudin, S K; Takriff, M S; Rosli, M I; Engku Chik, E M F; Adnan, M A K
2014-01-01
Radiotracer experiments are carried out in order to determine the mean residence time (MRT) as well as the percentage of dead zone, V dead (%), in an integrated mixer consisting of Rushton and pitched blade turbines (PBT). Conventionally, optimization is performed by varying one factor at a time (OFAT) while holding the others constant, which leads to an enormous number of experiments. Thus, in this study, a 4-factor 3-level Taguchi L9 orthogonal array was introduced to obtain an accurate optimization of mixing efficiency with a minimal number of experiments. This paper describes the optimal conditions of four process parameters, namely, impeller speed, impeller clearance, type of impeller, and sampling time, in obtaining MRT and V dead (%) using radiotracer experiments. The optimum conditions for the experiments were 100 rpm impeller speed, 50 mm impeller clearance, Type A mixer, and 900 s sampling time.
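To make the L9 design concrete: the sketch below lists the standard L9(3^4) array, verifies its orthogonality, and runs a Taguchi-style main-effects pick of the best level per factor. The factor names mirror the study (speed, clearance, impeller type, sampling time), but the dead-volume responses are invented for illustration; only the array itself is standard.

```python
import numpy as np
from itertools import combinations

# Standard L9(3^4) orthogonal array: 9 runs cover 4 factors at 3 levels
# (coded 0, 1, 2), versus 3^4 = 81 runs for a full factorial or many more
# for one-factor-at-a-time experimentation.
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

# Orthogonality check: every pair of columns contains each of the 9 possible
# level pairs exactly once.
for a, b in combinations(range(4), 2):
    pairs = {(int(x), int(y)) for x, y in zip(L9[:, a], L9[:, b])}
    assert len(pairs) == 9

# Hypothetical dead-zone percentages measured in the 9 runs (made-up data).
response = np.array([12.0, 9.5, 11.0, 8.0, 10.5, 7.5, 9.0, 6.0, 8.5])

# Main-effects analysis: average the response at each level of each factor;
# the best level is the one minimizing mean dead volume.
best_levels = [int(np.argmin([response[L9[:, f] == lv].mean()
                              for lv in range(3)]))
               for f in range(4)]
print("best level per factor:", best_levels)
```

Because the array is balanced, each per-level mean averages over all levels of the other factors, which is what lets 9 runs stand in for a much larger sweep.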
Garg, Ramandeep; Brennan, Lorraine; Price, Ruth K; Wallace, Julie M W; Strain, J J; Gibney, Mike J; Shewry, Peter R; Ward, Jane L; Garg, Lalit; Welch, Robert W
2016-02-17
Wheat bran, and especially wheat aleurone fraction, are concentrated sources of a wide range of components which may contribute to the health benefits associated with higher consumption of whole-grain foods. This study used NMR metabolomics to evaluate urine samples from baseline at one and two hours postprandially, following the consumption of minimally processed bran, aleurone or control by 14 participants (7 Females; 7 Males) in a randomized crossover trial. The methodology discriminated between the urinary responses of control, and bran and aleurone, but not between the two fractions. Compared to control, consumption of aleurone or bran led to significantly and substantially higher urinary concentrations of lactate, alanine, N-acetylaspartate acid and N-acetylaspartylglutamate and significantly and substantially lower urinary betaine concentrations at one and two hours postprandially. There were sex related differences in urinary metabolite profiles with generally higher hippurate and citrate and lower betaine in females compared to males. Overall, this postprandial study suggests that acute consumption of bran or aleurone is associated with a number of physiological effects that may impact on energy metabolism and which are consistent with longer term human and animal metabolomic studies that used whole-grain wheat diets or wheat fractions.
Garg, Ramandeep; Brennan, Lorraine; Price, Ruth K.; Wallace, Julie M. W.; Strain, J. J.; Gibney, Mike J.; Shewry, Peter R.; Ward, Jane L.; Garg, Lalit; Welch, Robert W.
2016-01-01
Wheat bran, and especially the wheat aleurone fraction, are concentrated sources of a wide range of components which may contribute to the health benefits associated with higher consumption of whole-grain foods. This study used NMR metabolomics to evaluate urine samples at baseline and at one and two hours postprandially, following the consumption of minimally processed bran, aleurone, or control by 14 participants (7 females; 7 males) in a randomized crossover trial. The methodology discriminated between the urinary responses to control and to bran and aleurone, but not between the two fractions. Compared to control, consumption of aleurone or bran led to significantly and substantially higher urinary concentrations of lactate, alanine, N-acetylaspartate and N-acetylaspartylglutamate, and significantly and substantially lower urinary betaine concentrations, at one and two hours postprandially. There were sex-related differences in urinary metabolite profiles, with generally higher hippurate and citrate and lower betaine in females compared to males. Overall, this postprandial study suggests that acute consumption of bran or aleurone is associated with a number of physiological effects that may impact on energy metabolism, and which are consistent with longer-term human and animal metabolomic studies that used whole-grain wheat diets or wheat fractions. PMID:26901221
Manasse, N J; Hux, K; Rankin-Erickson, J L
2000-11-01
Impairments in motor functioning, language processing, and cognitive status may impact the written language performance of traumatic brain injury (TBI) survivors. One strategy to minimize the impact of these impairments is to use a speech recognition system. The purpose of this study was to explore the effect of mild dysarthria and mild cognitive-communication deficits secondary to TBI on a 19-year-old survivor's mastery and use of such a system, specifically Dragon NaturallySpeaking. Data included the percentage of the participant's words accurately perceived by the system over time, the participant's accuracy over time in using commands for navigation and error correction, and quantitative and qualitative changes in the participant's written texts generated with and without the use of the speech recognition system. Results showed that Dragon NaturallySpeaking was approximately 80% accurate in perceiving words spoken by the participant, and the participant quickly and easily mastered all navigation and error correction commands presented. Quantitatively, the participant produced a greater amount of text using traditional word processing and a standard keyboard than using the speech recognition system. Minimal qualitative differences appeared between writing samples. Discussion of factors that may have contributed to the obtained results and that may affect the generalization of the findings to other TBI survivors is provided.
Optical fiber Raman-based spectroscopy for oral lesions characterization: a pilot study
NASA Astrophysics Data System (ADS)
Carvalho, Luis Felipe C. S.; Neto, Lázaro P. M.; Oliveira, Inajara P.; Rangel, João. Lucas; Ferreira, Isabelle; Kitakawa, Dárcio; Martin, Airton A.
2016-03-01
In daily clinical practice, various lesions of the oral cavity present differing aspects, often generating an inconclusive or doubtful diagnosis. In general, oral lesions are diagnosed by histopathological analysis of a biopsy, which is an invasive procedure and does not give immediate results. On the other hand, Raman spectroscopy is a real-time, minimally invasive analytical tool with notable diagnostic capability. This study aims to characterize normal mucosa and inflammatory, potentially malignant, benign, and malignant oral lesions by optical fiber Raman-based spectroscopy (OFRS). Raman data were collected with a HoloSpec f/1.8 spectrograph (Kaiser Optical Systems) coupled to an optical fiber, with a 785 nm laser source and a CCD detector. The data were pre-processed and vector normalized. Mean and standard deviation analysis was performed in association with cluster analysis and compared to the histopathological results. Samples representing the oral pathological processes described above were used in the study. OFRS was able to characterize oral lesions and normal mucosa, yielding biochemical information related to vibrational modes of proteins, lipids, nucleic acids, and carbohydrates. The technique (OFRS) can provide biochemical information concerning different types of oral lesions, showing that Raman spectroscopy could be useful for early and minimally invasive diagnosis.
Mann, J E; Brashears, M M
2006-08-01
In order to provide beef processors with valuable data to validate critical temperature limits during grinding, a study was conducted to determine Escherichia coli O157:H7 growth at various temperatures in raw ground beef. Fresh ground beef samples were inoculated with a cocktail mixture of streptomycin-resistant E. coli O157:H7 to facilitate recovery in the presence of background flora. Samples were held at 4.4, 7.2, and 10 degrees C, and at room temperature (22.2 to 23.3 degrees C), to mimic typical processing and holding temperatures observed in meat processing environments. E. coli O157:H7 counts were determined by direct plating onto tryptic soy agar with streptomycin (1,000 microg/ml) at 2-h intervals over 12 h for samples held at room temperature. Samples held under refrigeration temperatures were sampled at 4, 8, 12, 24, 48, and 72 h. Less than one log of E. coli O157:H7 growth was observed at 48 h for samples held at 10 degrees C. Samples held at 4.4 and 7.2 degrees C showed less than one log of E. coli O157:H7 growth at 72 h. Samples held at room temperature showed no significant increase in E. coli O157:H7 counts for the first 6 h, but counts increased significantly afterwards. These results illustrate that meat processors can utilize a variety of time and temperature combinations as critical limits in their hazard analysis critical control point plans to minimize E. coli O157:H7 growth during the production and storage of ground beef.
A weighted ℓ1-minimization approach for sparse polynomial chaos expansions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, Ji; Hampton, Jerrad; Doostan, Alireza, E-mail: alireza.doostan@colorado.edu
2014-06-15
This work proposes a method for sparse polynomial chaos (PC) approximation of high-dimensional stochastic functions based on non-adapted random sampling. We modify the standard ℓ1-minimization algorithm, originally proposed in the context of compressive sampling, using a priori information about the decay of the PC coefficients, when available, and refer to the resulting algorithm as weighted ℓ1-minimization. We provide conditions under which we may guarantee recovery using this weighted scheme. Numerical tests are used to compare the weighted and non-weighted methods for the recovery of solutions to two differential equations with high-dimensional random inputs: a boundary value problem with a random elliptic operator and a 2-D thermally driven cavity flow with random boundary condition.
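The weighted scheme can be caricatured with a toy solver. The sketch below minimizes 0.5*||Ax - b||^2 + lam * sum_i w_i*|x_i| by ISTA (proximal gradient), where larger weights w_i penalize coefficients expected a priori to be small; this is an illustrative stand-in under assumed problem sizes, not the authors' implementation.

```python
import numpy as np

def weighted_l1_ista(A, b, weights, lam=0.01, n_iter=2000):
    """Minimize 0.5*||Ax - b||^2 + lam * sum_i weights[i]*|x_i| via ISTA."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth part
    t = 1.0 / L                        # step size guaranteeing descent
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - t * (A.T @ (A @ x - b))        # gradient step
        thresh = t * lam * weights             # per-coefficient soft threshold
        x = np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)
    return x

def objective(A, b, x, weights, lam):
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(weights * np.abs(x))

rng = np.random.default_rng(0)
m, n = 40, 80                                  # hypothetical sizes
A = rng.standard_normal((m, n)) / np.sqrt(m)   # stand-in measurement matrix
x_true = np.zeros(n)
x_true[:5] = rng.standard_normal(5) + 2.0      # sparse signal at low indices
b = A @ x_true
# A priori decay: later coefficients expected small -> penalize them more.
weights = 1.0 + np.arange(n) / 20.0
x_hat = weighted_l1_ista(A, b, weights)
```

With the step size 1/L, each ISTA iteration is guaranteed not to increase the weighted objective, so the recovered coefficients strictly improve on the zero initialization.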
Ex-vivo holographic microscopy and spectroscopic analysis of head and neck cancer
NASA Astrophysics Data System (ADS)
Holler, Stephen; Wurtz, Robert; Auyeung, Kelsey; Auyeung, Kris; Paspaley-Grbavac, Milan; Mulroe, Brigid; Sobrero, Maximiliano; Miles, Brett
2015-03-01
Optical probes to identify tumor margins in vivo would greatly reduce the time, effort and complexity in the surgical removal of malignant tissue in head and neck cancers. Current approaches involve visual microscopy of stained tissue samples to determine cancer margins, which results in the excision of excess tissue to assure complete removal of the cancer. Such surgical procedures and follow-on chemotherapy can adversely affect the patient's recovery and subsequent quality of life. In order to reduce the complexity of the process and minimize adverse effects on the patient, we investigate ex vivo tissue samples (stained and unstained) using digital holographic microscopy in conjunction with spectroscopic analyses (reflectance and transmission spectroscopy) in order to determine label-free, optically identifiable characteristic features that may ultimately be used for in vivo processing of cancerous tissues. The tissue samples studied were squamous cell carcinomas and associated controls from patients of varying age, gender and race. Holographic microscopic imaging scans across both cancerous and non-cancerous tissue samples yielded amplitude and phase reconstructions that were correlated with spectral signatures. Though the holographic reconstructions and measured spectra indicate variations even among the same class of tissue, preliminary results indicate the existence of some discriminating features. Further analyses are underway to extend this work and extract additional information from the imaging and spectral data that may prove useful for in vivo surgical identification.
Climate and Lightning: An updated TRMM-LIS Analysis
NASA Technical Reports Server (NTRS)
Petersen, Walter A.; Buechler, D. E.
2009-01-01
The TRMM Lightning Imaging Sensor (LIS) has sampled global tropical and sub-tropical lightning flash densities for approximately 11 years. These data were originally processed and the results presented by the authors at the 3rd AMS MALD Conference in 2007, using both pre- and post-boost TRMM lightning data. The data were normalized for the orbit boost by scaling the pre-boost data by a fixed constant based on the different swath areas of the pre- and post-boost years (post-boost after 2001). Inevitably, one must question this simple approach to accounting for the orbit boost when sampling such a noisy quantity. Hence we are reprocessing the entire 11-year TRMM LIS dataset, reducing the orbit swath of the post-boost era to that of the pre-boost era, in order to eliminate sampling bias in the dataset. Study of the diurnal/seasonal/annual sampling suggests that these biases are already minimal and should not contribute to error in the examination of annual trends. We will present a new analysis of the 11-year annual trends in total lightning flash density for all latitudinal belts and select regions/regimes of the tropics, as related to conventional climate signals and precipitation contents over the same period. The results should enable us to address, in some fashion, the sensitivity of lightning flash density to subtle changes in climate.
Wu, Qihua; Shi, Honglan; Ma, Yinfa; Adams, Craig; Eichholz, Todd; Timmons, Terry; Jiang, Hua
2015-01-01
N-Nitrosamines are potent mutagenic and carcinogenic emerging water disinfection by-products (DBPs). The most effective strategy to control the formation of these DBPs is minimizing their precursors in source water. Secondary and tertiary amines are the dominant precursors of N-nitrosamine formation during the drinking water disinfection process. Therefore, screening for and removing these amines in source water is essential for preventing the formation of N-nitrosamines. A rapid, simple, and sensitive ultrafast liquid chromatography-tandem mass spectrometry (UFLC-MS/MS) method was developed in this study to determine seven amines, including dimethylamine, ethylmethylamine, diethylamine, dipropylamine, trimethylamine, 3-(dimethylaminomethyl)indole, and 4-dimethylaminoantipyrine, as major precursors of N-nitrosamines in drinking water systems. No sample preparation is needed except a simple filtration. Separation and detection can be achieved in 11 min per sample. The method detection limits of the selected amines range from 0.02 μg/L to 1 μg/L, except for EMA (5 μg/L), and good calibration linearity was achieved. The developed method was applied to determine the selected precursors in source water and drinking water samples collected from the Midwest region of the United States. In most water samples, the concentrations of the selected N-nitrosamine precursors were below their method detection limits. Dimethylamine was detected in some water samples at concentrations up to 25.4 μg/L. Copyright © 2014 Elsevier B.V. All rights reserved.
System automatically supplies precise analytical samples of high-pressure gases
NASA Technical Reports Server (NTRS)
Langdon, W. M.
1967-01-01
High-pressure-reducing and flow-stabilization system delivers analytical gas samples from a gas supply. The system employs parallel capillary restrictors for pressure reduction and downstream throttling valves for flow control. It is used in conjunction with a sampling valve and minimizes alterations of the sampled gas.
40 CFR 1065.1107 - Sample media and sample system preparation; sample system assembly.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) For capturing PM, we recommend using pure quartz filters with no binder. Select the filter diameter to minimize filter change intervals, accounting for the expected PM emission rate, sample flow rate, and... filter without replacing the sorbent or otherwise disassembling the batch sampler. In those cases...
Sampling Error in a Particulate Mixture: An Analytical Chemistry Experiment.
ERIC Educational Resources Information Center
Kratochvil, Byron
1980-01-01
Presents an undergraduate experiment demonstrating sampling error. The sampling system selected is a mixture of potassium hydrogen phthalate and sucrose; a self-zeroing, automatically refillable buret is used to minimize the titration time of multiple samples, and a dilute back-titrant is employed to obtain high end-point precision. (CS)
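The statistics behind such an experiment can be illustrated by simulation: drawing n particles from a well-mixed two-component mixture makes the sampled composition binomial, so the relative sampling error falls roughly as 1/sqrt(n). The particle counts and mole fraction below are illustrative assumptions, not the experiment's values.

```python
import numpy as np

def sampling_std(p_khp, n_particles, n_reps, rng):
    """Std of the KHP particle fraction across repeated samples of n particles."""
    counts = rng.binomial(n_particles, p_khp, size=n_reps)
    return (counts / n_particles).std()

rng = np.random.default_rng(42)
p = 0.5                                      # assumed KHP particle fraction
s_small = sampling_std(p, 10, 5000, rng)     # tiny sample: large scatter
s_large = sampling_std(p, 1000, 5000, rng)   # 100x more particles: ~10x less scatter
```

The expected standard deviation is sqrt(p*(1-p)/n), i.e. about 0.16 for 10 particles and 0.016 for 1000, which is why grinding a particulate sample finer (more particles per aliquot) reduces sampling error.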
Acute hydrodynamic damage induced by SPLITT fractionation and centrifugation in red blood cells.
Urbina, Adriana; Godoy-Silva, Ruben; Hoyos, Mauricio; Camacho, Marcela
2016-05-01
Though blood bank processing traditionally employs centrifugation, new separation techniques may be appealing for large-scale processes. Split-flow fractionation (SPLITT) is a family of techniques that separates without labelling and uses very low flow rates and force fields, and is therefore expected to minimize cell damage. However, the hydrodynamic stress and possible consequent damaging effects of SPLITT fractionation have not yet been examined. The aim of this study was to investigate the hydrodynamic damage of SPLITT fractionation to human red blood cells, and to compare these effects with those induced by centrifugation. Peripheral whole blood samples were collected from healthy volunteers. Samples were diluted in a buffered saline solution and exposed to SPLITT fractionation (flow rates 1-10 ml/min) or centrifugation (100-1500 g) for 10 min. Cell viability, shape, diameter, mean corpuscular hemoglobin, and membrane potential were measured. Under the operating conditions employed, both SPLITT and centrifugation maintained cell viability above 98%, but resulted in significant sublethal damage, including echinocyte formation, decreased cell diameter, decreased mean corpuscular hemoglobin, and membrane hyperpolarization, which was inhibited by EGTA. Wall shear stress and maximum energy dissipation rate showed significant correlation with lethal and sublethal damage. Our data do not support the assumption that SPLITT fractionation induces very low shear stress and is innocuous to cell function. Some changes in SPLITT channel design are suggested to minimize cell damage. Measurement of membrane potential and cell diameter could provide a new, reliable and convenient basis for evaluation of hydrodynamic effects on different cell models, allowing identification of optimal operating conditions at different scales. Copyright © 2016 Elsevier B.V. All rights reserved.
Improving image quality in laboratory x-ray phase-contrast imaging
NASA Astrophysics Data System (ADS)
De Marco, F.; Marschner, M.; Birnbacher, L.; Viermetz, M.; Noël, P.; Herzen, J.; Pfeiffer, F.
2017-03-01
Grating-based X-ray phase-contrast (gbPC) imaging is known to provide significant benefits for biomedical imaging. To investigate these benefits, a high-sensitivity gbPC micro-CT setup for small (≈ 5 cm) biological samples has been constructed. Unfortunately, high differential-phase sensitivity leads to an increased magnitude of data-processing artifacts, limiting the quality of tomographic reconstructions. Most importantly, processing of phase-stepping data with incorrect stepping positions can introduce artifacts resembling Moiré fringes into the projections. Additionally, the focal spot size of the X-ray source limits the resolution of tomograms. Here we present a set of algorithms to minimize artifacts, increase resolution, and improve the visual impression of projections and tomograms from the examined setup. We assessed two algorithms for artifact reduction. Firstly, a correction algorithm exploiting correlations between the artifacts and the differential-phase data was developed and tested; artifacts were reliably removed without compromising image data. Secondly, we implemented a new algorithm for flat-field selection, which was shown to exclude flat-fields with strong artifacts. Both procedures successfully improved the image quality of projections and tomograms. Deconvolution of all projections of a CT scan can minimize the blurring introduced by the finite size of the X-ray source focal spot. Application of the Richardson-Lucy deconvolution algorithm to gbPC-CT projections resulted in an improved resolution of phase-contrast tomograms. Additionally, we found that nearest-neighbor interpolation of projections can improve the visual impression of very small features in phase-contrast tomograms. In conclusion, we achieved an increase in image resolution and quality for the investigated setup, which may lead to improved detection of very small sample features, thereby maximizing the setup's utility.
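The Richardson-Lucy step mentioned above iterates est <- est * (K_mirror * (observed / (K * est))), where K is the focal-spot blur kernel. A minimal NumPy/SciPy sketch on a synthetic point feature (the Gaussian kernel width and iteration count are arbitrary choices, not the setup's parameters):

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
    """Plain Richardson-Lucy deconvolution for a known, normalized blur kernel."""
    est = np.full_like(observed, observed.mean())   # flat nonnegative start
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = fftconvolve(est, psf, mode="same")
        ratio = observed / (blurred + eps)          # data / model re-blur
        est = est * fftconvolve(ratio, psf_mirror, mode="same")
    return est

# Toy example: a point-like feature blurred by a Gaussian "focal spot".
x = np.arange(-7, 8)
g = np.exp(-(x ** 2) / (2 * 2.0 ** 2))
psf = np.outer(g, g)
psf /= psf.sum()

truth = np.zeros((64, 64))
truth[32, 32] = 1.0
blurred = fftconvolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```

On noiseless data with the correct kernel, the iterations progressively re-concentrate the blurred spot, which is the mechanism behind the resolution gain reported for the projections.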
Hermetic edge sealing of photovoltaic modules
NASA Astrophysics Data System (ADS)
Nowlan, M. J.
1983-07-01
The feasibility of using an electrostatic bonding (ESB) and ultrasonic welding process to produce hermetic edge seals on terrestrial solar cell modules was investigated. The fabrication sequence is to attach an aluminum foil "gasket" to the perimeter of a glass sheet. A cell circuit is next encapsulated inside the gasket, and its aluminum foil back cover is seam welded ultrasonically to the gasket. An ESB process for sealing aluminum to glass was developed in an ambient air atmosphere, which eliminates the requirement for a vacuum or pressure vessel. An ultrasonic seam welding process was also developed which did not degrade the quality of the ESB seal. Good quality welds with minimal deformation were produced. The effectiveness of the above described sealing techniques was tested by constructing 400 sq cm (8 x 8 = 64 sq in) sample modules, and then subjecting them to nondestructive fine and gross leak tests. The gross leak tests identified several different causes of leaks which were then eliminated by modifying the assembly process.
Hermetic edge sealing of photovoltaic modules
NASA Technical Reports Server (NTRS)
Nowlan, M. J.
1983-01-01
The feasibility of using an electrostatic bonding (ESB) and ultrasonic welding process to produce hermetic edge seals on terrestrial solar cell modules was investigated. The fabrication sequence is to attach an aluminum foil "gasket" to the perimeter of a glass sheet. A cell circuit is next encapsulated inside the gasket, and its aluminum foil back cover is seam welded ultrasonically to the gasket. An ESB process for sealing aluminum to glass was developed in an ambient air atmosphere, which eliminates the requirement for a vacuum or pressure vessel. An ultrasonic seam welding process was also developed which did not degrade the quality of the ESB seal. Good quality welds with minimal deformation were produced. The effectiveness of the above described sealing techniques was tested by constructing 400 sq cm (8 x 8 = 64 sq in) sample modules, and then subjecting them to nondestructive fine and gross leak tests. The gross leak tests identified several different causes of leaks which were then eliminated by modifying the assembly process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Mark A.
The development of our Integrated Actinide Sample Preparation Laboratory (IASPL) commenced in 1998, driven by the need to perform transmission electron microscopy studies on naturally aged plutonium and its alloys, looking for the microstructural effects of the radiological decay process (1). Remodeling and construction of a laboratory within the Chemistry and Materials Science Directorate facilities at LLNL was required to turn a standard radiological laboratory into a Radiological Materials Area (RMA) and Radiological Buffer Area (RBA) containing type I, II and III workplaces. Two inert-atmosphere dry-train glove boxes with antechambers and entry/exit fumehoods (Figure 1), having a baseline atmosphere of 1 ppm oxygen and 1 ppm water vapor, a utility fumehood, and a third, portable double-walled enclosure have been installed and commissioned. These capabilities, along with highly trained technical staff, facilitate the safe operation of sample preparation processes and instrumentation, and sample handling, while minimizing oxidation or corrosion of the plutonium. In addition, we are currently developing the capability to safely transfer small metallographically prepared samples to a mini-SEM for microstructural imaging and chemical analysis. The gloveboxes continue to be the most crucial element of the laboratory, allowing nearly oxide-free sample preparation for a wide variety of LLNL-based characterization experiments, which include transmission electron microscopy, electron energy loss spectroscopy, optical microscopy, electrical resistivity, ion implantation, X-ray diffraction and absorption, magnetometry, metrological surface measurements, high-pressure diamond anvil cell equation-of-state, phonon dispersion measurements, X-ray absorption and emission spectroscopy, and differential scanning calorimetry.
The sample preparation and materials processing capabilities in the IASPL have also facilitated experimentation at world-class facilities such as the Advanced Photon Source at Argonne National Laboratory, the European Synchrotron Radiation Facility in Grenoble, France, the Stanford Synchrotron Radiation Facility, the National Synchrotron Light Source at Brookhaven National Laboratory, the Advanced Light Source at Lawrence Berkeley National Laboratory, and the TRIUMF accelerator in Canada.
NASA Astrophysics Data System (ADS)
Khan, Faisal; Enzmann, Frieder; Kersten, Michael
2016-03-01
Image processing of X-ray-computed polychromatic cone-beam micro-tomography (μXCT) data of geological samples mainly involves artefact reduction and phase segmentation. For the former, the main beam-hardening (BH) artefact is removed by applying a best-fit quadratic surface algorithm to a given image data set (reconstructed slice), which minimizes the BH offsets of the attenuation data points from that surface. A Matlab code for this approach is provided in the Appendix. The final BH-corrected image is extracted from the residual data, or from the difference between the surface elevation values and the original grey-scale values. For the segmentation, we propose a novel least-squares support vector machine (LS-SVM, an algorithm for pixel-based multi-phase classification) approach. A receiver operating characteristic (ROC) analysis was performed on BH-corrected and uncorrected samples to show that BH correction is in fact an important prerequisite for accurate multi-phase classification. The combination of the two approaches was then used to successfully classify three multi-phase rock core samples of varying complexity.
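The best-fit quadratic surface step can be sketched in a few lines. The abstract's reference code is Matlab; this NumPy version is an illustrative reconstruction, and adding the fitted surface's mean back to preserve the grey-level scale is our choice, not necessarily the paper's exact convention.

```python
import numpy as np

def remove_beam_hardening(img):
    """Fit z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 to a reconstructed slice
    by least squares and subtract it, keeping the mean grey level."""
    ny, nx = img.shape
    # Normalized coordinates in [-1, 1] keep the design matrix well conditioned.
    y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
    X = np.column_stack([np.ones(img.size), x.ravel(), y.ravel(),
                         x.ravel() ** 2, (x * y).ravel(), y.ravel() ** 2])
    coef, *_ = np.linalg.lstsq(X, img.ravel(), rcond=None)
    surface = (X @ coef).reshape(img.shape)
    return img - surface + surface.mean()

# Synthetic slice: uniform attenuation plus a hypothetical BH "cupping" bowl.
ny, nx = 64, 64
y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
slice_bh = 100.0 + 15.0 * (x ** 2 + y ** 2)
corrected = remove_beam_hardening(slice_bh)
```

Because the synthetic cupping lies exactly in the quadratic model's span, the fit absorbs it completely and the corrected slice is flat; on real data the residual retains the phase contrast while the smooth BH trend is removed.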
SU-G-IeP1-13: Sub-Nyquist Dynamic MRI Via Prior Rank, Intensity and Sparsity Model (PRISM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, B; Gao, H
Purpose: Accelerated dynamic MRI is important for MRI-guided radiotherapy. Inspired by compressive sensing (CS), sub-Nyquist dynamic MRI, i.e., sparse sampling in k-t space for accelerated dynamic MRI, has been an active research area. This work investigates sub-Nyquist dynamic MRI via a previously developed CS model, namely the Prior Rank, Intensity and Sparsity Model (PRISM). Methods: The proposed method utilizes PRISM with rank minimization and incoherent sampling patterns for sub-Nyquist reconstruction. In PRISM, the low-rank background image, which is automatically calculated by rank minimization, is excluded from the L1-minimization step of the CS reconstruction to further sparsify the residual image, thus allowing for higher acceleration rates. Furthermore, the sampling pattern in k-t space is made more incoherent by sampling a different set of k-space points at different temporal frames. Results: Reconstruction results from the L1-sparsity method and the PRISM method with 30% and 15% undersampled data are compared to demonstrate the power of PRISM for dynamic MRI. Conclusion: A sub-Nyquist MRI reconstruction method based on PRISM is developed, with improved image quality relative to the L1-sparsity method.
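The low-rank-plus-sparse idea behind PRISM can be caricatured in a few lines: a truncated SVD supplies the low-rank background across temporal frames, and soft-thresholding the residual yields the sparse dynamic component that would enter the L1 step. This is a toy, fully sampled image-domain illustration, not the authors' k-t-space reconstruction.

```python
import numpy as np

def low_rank_plus_sparse(X, rank=1, thresh=1.0):
    """Split X (frames x pixels) into a rank-`rank` background and a
    soft-thresholded (sparse) residual."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    background = (U[:, :rank] * s[:rank]) @ Vt[:rank]     # rank-minimized part
    R = X - background
    sparse = np.sign(R) * np.maximum(np.abs(R) - thresh, 0.0)  # L1 prox
    return background, sparse

rng = np.random.default_rng(7)
frames, pixels = 8, 100
b = rng.standard_normal(pixels)              # static "anatomy": rank-1 over time
X = np.tile(b, (frames, 1)).astype(float)
X[3, 10] += 5.0                              # one transient dynamic feature
background, sparse = low_rank_plus_sparse(X, rank=1, thresh=1.0)
```

Excluding the background from the sparsity penalty is what makes the residual genuinely sparse here: only the transient feature survives the thresholding.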
Saini, Ramesh Kumar; Shang, Xiao Min; Ko, Eun Young; Choi, Jeong Hee; Keum, Young-Soo
2015-08-01
Minimally processed ready-to-eat baby-leaf vegetables (BLVs) are a convenient source of health-beneficial bioactives in the daily diet. In the present study, the visual quality and the storage stability of carotenoids and tocopherols were investigated in lettuce (green and red romaine) and salad rocket BLVs. Commercially packed samples of BLVs were stored at 0 °C and 4 °C in the dark and analyzed after 0, 2, 4, 8 and 12 days of storage. All the studied samples retained good visual quality for up to eight days of storage at both temperatures. In most cases, quality was correlated with chlorophyll content. The highest significant (p < 0.05) positive changes in total carotenoids and tocopherols were observed in samples stored at 4 °C. Carotenoids and tocopherols were most stable in green and red romaine lettuce, respectively.
WAIS-III index score profiles in the Canadian standardization sample.
Lange, Rael T
2007-01-01
Representative index score profiles were examined in the Canadian standardization sample of the Wechsler Adult Intelligence Scale-Third Edition (WAIS-III). The identification of profile patterns was based on the methodology proposed by Lange, Iverson, Senior, and Chelune (2002) that aims to maximize the influence of profile shape and minimize the influence of profile magnitude on the cluster solution. A two-step cluster analysis procedure was used (i.e., hierarchical and k-means analyses). Cluster analysis of the four index scores (i.e., Verbal Comprehension [VCI], Perceptual Organization [POI], Working Memory [WMI], Processing Speed [PSI]) identified six profiles in this sample. Profiles were differentiated by pattern of performance and were primarily characterized as (a) high VCI/POI, low WMI/PSI, (b) low VCI/POI, high WMI/PSI, (c) high PSI, (d) low PSI, (e) high VCI/WMI, low POI/PSI, and (f) low VCI, high POI. These profiles are potentially useful for determining whether a patient's WAIS-III performance is unusual in a normal population.
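The shape-versus-magnitude idea in the two-step procedure above can be sketched: subtracting each profile's own mean removes magnitude before clustering, so two profiles with the same shape at different overall levels land in the same cluster. The toy index scores below are invented for illustration, not WAIS-III data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.cluster.vq import kmeans2

def cluster_profiles(scores, k):
    """Two-step (hierarchical then k-means) clustering on mean-centered,
    shape-only index profiles."""
    shape = scores - scores.mean(axis=1, keepdims=True)   # drop profile magnitude
    # Step 1: Ward hierarchical clustering supplies stable initial centroids.
    labels_h = fcluster(linkage(shape, method="ward"), k, criterion="maxclust")
    seeds = np.array([shape[labels_h == c].mean(axis=0) for c in range(1, k + 1)])
    # Step 2: k-means refinement starting from the hierarchical seeds.
    _, labels = kmeans2(shape, seeds, minit="matrix")
    return labels

rng = np.random.default_rng(1)
# Invented 4-score profiles (VCI, POI, WMI, PSI): two shapes, varied magnitude.
shape_a = np.array([8.0, 8.0, -8.0, -8.0])    # high VCI/POI, low WMI/PSI
shape_b = -shape_a                             # the mirrored pattern
profiles = np.vstack([100 + rng.normal(0, 15) + s + rng.normal(0, 1, 4)
                      for s in [shape_a] * 10 + [shape_b] * 10])
labels = cluster_profiles(profiles, k=2)
```

Because each row is centered before clustering, the large random offsets (overall ability level) cannot drive the solution; only the high/low pattern across the four indexes does.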
Fluidic hydrogen detector production prototype development
NASA Technical Reports Server (NTRS)
Roe, G. W.; Wright, R. E.
1976-01-01
A hydrogen gas sensor that can replace catalytic combustion sensors used to detect leaks in the liquid hydrogen transfer systems at Kennedy Space Center was developed. A fluidic sensor concept, based on the principle that the frequency of a fluidic oscillator is proportional to the square root of the molecular weight of its operating fluid, was utilized. To minimize sensitivity to pressure and temperature fluctuations, and to make the sensor specific for hydrogen, two oscillators are used. One oscillator operates on sample gas containing hydrogen, while the other operates on sample gas with the hydrogen converted to steam. The conversion is accomplished with a small catalytic converter. The frequency difference is taken, and the hydrogen concentration computed with a simple digital processing circuit. The output from the sensor is an analog signal proportional to hydrogen content. The sensor is shown to be accurate and insensitive to severe environmental disturbances. It is also specific for hydrogen, even with large helium concentrations in the sample gas.
Shepard, Michele; Brenner, Sara
2014-01-01
Background: Numerous studies are ongoing in the fields of nanotoxicology and exposure science; however, gaps remain in identifying and evaluating potential exposures from skin contact with engineered nanoparticles in occupational settings. Objectives: The aim of this study was to identify potential cutaneous exposure scenarios at a workplace using engineered nanoparticles (alumina, ceria, amorphous silica) and evaluate the presence of these materials on workplace surfaces. Methods: Process review, workplace observations, and preliminary surface sampling were conducted using microvacuum and wipe sample collection methods and transmission electron microscopy with elemental analysis. Results: Exposure scenarios were identified with potential for incidental contact. Nanoparticles of silica or silica and/or alumina agglomerates (or aggregates) were identified in surface samples from work areas where engineered nanoparticles were used or handled. Conclusions: Additional data are needed to evaluate occupational exposures from skin contact with engineered nanoparticles; precautionary measures should be used to minimize potential cutaneous exposures in the workplace. PMID:25000112
Han, Changhee; Burn-Nunes, Laurie J; Lee, Khanghyun; Chang, Chaewon; Kang, Jung-Ho; Han, Yeongcheol; Hur, Soon Do; Hong, Sungmin
2015-08-01
An improved decontamination method and ultraclean analytical procedures have been developed to minimize Pb contamination of processed glacial ice cores and to achieve reliable determination of Pb isotopes in North Greenland Eemian Ice Drilling (NEEM) deep ice core sections with concentrations at the sub-picogram per gram level. A PL-7 (Fuso Chemical) silica-gel activator has replaced the previously used colloidal silica activator produced by Merck and has been shown to provide sufficiently enhanced ion beam intensity for Pb isotope analysis for a few tens of picograms of Pb. Considering the quantities of Pb contained in the NEEM Greenland ice core and a sample weight of 10 g used for the analysis, the blank contribution from the sample treatment was observed to be negligible. The decontamination and analysis of the artificial ice cores and selected NEEM Greenland ice core sections confirmed the cleanliness and effectiveness of the overall analytical process. Copyright © 2015 Elsevier B.V. All rights reserved.
Svobodová, Kateřina; Semerád, Jaroslav; Petráčková, Denisa; Novotný, Čeněk
2018-05-30
Quantitative changes in antibiotic resistance genes (ARGs) were investigated in six urban wastewater treatment plants (WWTPs) treating municipal and industrial wastewaters. In a selected WWTP, the fate of ARGs was studied over a 1-year interval and in two phases of the wastewater treatment process. Nine ARGs (tetW, tetO, tetA, tetB, tetM, blaTEM, ermB, sul1, and intl1) were quantified in total, and their relative abundance was assessed as ARG copies/16S rRNA copies. Of the tetracycline resistance genes, tetW was the only one detected in all sampled WWTPs. Its relative abundance in the nitrification tank of WWTP5 was stable during the 1-year period, but was lowered by secondary sedimentation processes in the wastewater treatment, down to 24% of the level in the nitrification tank. Bacterial isolates showing high tetracycline resistance (minimum inhibitory concentrations >100 μg/mL) were identified as members of the Acinetobacter, Klebsiella, Citrobacter, Bacillus, and Enterobacter genera. Dynamic shifts in the relative abundance of ermB and sul1 were also demonstrated in wastewater samples from WWTP5.
Scale up of NiTi shape memory alloy production by EBM
NASA Astrophysics Data System (ADS)
Otubo, J.; Rigo, O. D.; Moura Neto, C.; Kaufman, M. J.; Mei, P. R.
2003-10-01
The usual process for producing NiTi shape memory alloy is vacuum induction melting (VIM) using a graphite crucible, which contaminates the melt with carbon; contamination with oxygen originates from the residual oxygen inside the melting chamber. An alternative process for producing NiTi alloys is electron beam melting (EBM) using a water-cooled copper crucible, which eliminates carbon contamination, while oxygen contamination is minimal owing to operation in a vacuum better than 10^{-2} Pa. Previous work demonstrated that the technique is feasible for button-shaped samples weighing around 30 g. The present work presents results from the scale-up program, which enables the production of larger samples/ingots. The results are very promising in terms of chemical composition homogeneity and carbon contamination, the latter being four to ten times lower than in commercially produced VIM products, and in terms of final oxygen content, which is shown to depend primarily on the starting raw materials.
Raster-scanning serial protein crystallography using micro- and nano-focused synchrotron beams
Coquelle, Nicolas; Brewster, Aaron S.; Kapp, Ulrike; Shilova, Anastasya; Weinhausen, Britta; Burghammer, Manfred; Colletier, Jacques-Philippe
2015-01-01
High-resolution structural information was obtained from lysozyme microcrystals (20 µm in the largest dimension) using raster-scanning serial protein crystallography on micro- and nano-focused beamlines at the ESRF. Data were collected at room temperature (RT) from crystals sandwiched between two silicon nitride wafers, thereby preventing their drying, while limiting background scattering and sample consumption. In order to identify crystal hits, new multi-processing and GUI-driven Python-based pre-analysis software was developed, named NanoPeakCell, that was able to read data from a variety of crystallographic image formats. Further data processing was carried out using CrystFEL, and the resultant structures were refined to 1.7 Å resolution. The data demonstrate the feasibility of RT raster-scanning serial micro- and nano-protein crystallography at synchrotrons and validate it as an alternative approach for the collection of high-resolution structural data from micro-sized crystals. Advantages of the proposed approach are its thriftiness, its handling-free nature, the reduced amount of sample required, the adjustable hit rate, the high indexing rate and the minimization of background scattering. PMID:25945583
Referent control and motor equivalence of reaching from standing
Tomita, Yosuke; Feldman, Anatol G.
2016-01-01
Motor actions may result from central changes in the referent body configuration, defined as the body posture at which muscles begin to be activated or deactivated. The actual body configuration deviates from the referent configuration, particularly because of body inertia and environmental forces. Within these constraints, the system tends to minimize the difference between these configurations. For pointing movement, this strategy can be expressed as the tendency to minimize the difference between the referent trajectory (RT) and actual trajectory (QT) of the effector (hand). This process may underlie motor equivalent behavior that maintains the pointing trajectory regardless of the number of body segments involved. We tested the hypothesis that the minimization process is used to produce pointing in standing subjects. With eyes closed, 10 subjects reached from a standing position to a remembered target located beyond arm length. In randomly chosen trials, hip flexion was unexpectedly prevented, forcing subjects to take a step during pointing to prevent falling. The task was repeated when subjects were instructed to intentionally take a step during pointing. In most cases, reaching accuracy and trajectory curvature were preserved due to adaptive condition-specific changes in interjoint coordination. Results suggest that referent control and the minimization process associated with it may underlie motor equivalence in pointing. NEW & NOTEWORTHY Motor actions may result from minimization of the deflection of the actual body configuration from the centrally specified referent body configuration, in the limits of neuromuscular and environmental constraints. The minimization process may maintain reaching trajectory and accuracy regardless of the number of body segments involved (motor equivalence), as confirmed in this study of reaching from standing in young healthy individuals. Results suggest that the referent control process may underlie motor equivalence in reaching. PMID:27784802
(I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.
van Rijnsoever, Frank J
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario.
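The "random chance" scenario described above can be illustrated with a small simulation (a hypothetical sketch, not the author's simulation code; the number of codes, observation probability, and seed are made-up parameters):

```python
import random

def sample_size_to_saturation(code_probs, rng, max_steps=100_000):
    """Sample information sources until every code has been observed
    at least once; each source reveals each code independently with
    that code's probability ("random chance" scenario)."""
    unseen = set(range(len(code_probs)))
    for step in range(1, max_steps + 1):
        for code in list(unseen):
            if rng.random() < code_probs[code]:
                unseen.discard(code)
        if not unseen:
            return step  # theoretical saturation reached at this sample size
    return None  # saturation not reached within max_steps

rng = random.Random(42)
# hypothetical population: 20 codes, each observed with probability 0.3
sizes = [sample_size_to_saturation([0.3] * 20, rng) for _ in range(200)]
print(sum(sizes) / len(sizes))  # mean sample size needed for saturation
```

Raising the mean observation probability shrinks the required sample far more than reducing the number of codes, consistent with the abstract's main finding.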
Distributed query plan generation using multiobjective genetic algorithm.
Panicker, Shina; Kumar, T V Vijay
2014-01-01
A distributed query processing strategy, which is a key performance determinant in accessing distributed databases, aims to minimize the total query processing cost. One way to achieve this is by generating efficient distributed query plans that involve fewer sites for processing a query. In the case of distributed relational databases, the number of possible query plans increases exponentially with respect to the number of relations accessed by the query and the number of sites where these relations reside. Consequently, computing optimal distributed query plans becomes a complex problem. This distributed query plan generation (DQPG) problem has already been addressed using single objective genetic algorithm, where the objective is to minimize the total query processing cost comprising the local processing cost (LPC) and the site-to-site communication cost (CC). In this paper, this DQPG problem is formulated and solved as a biobjective optimization problem with the two objectives being minimize total LPC and minimize total CC. These objectives are simultaneously optimized using a multiobjective genetic algorithm NSGA-II. Experimental comparison of the proposed NSGA-II based DQPG algorithm with the single objective genetic algorithm shows that the former performs comparatively better and converges quickly towards optimal solutions for an observed crossover and mutation probability.
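The biobjective formulation rests on Pareto dominance between candidate plans. Below is a minimal sketch of the dominance test and the first non-dominated front that NSGA-II builds on (not the paper's implementation; the (LPC, CC) cost pairs are hypothetical):

```python
def dominates(a, b):
    """Plan a dominates plan b if it is no worse on both objectives
    (total LPC, total CC) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    """Return the non-dominated plans (the first front in NSGA-II's
    non-dominated sorting)."""
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

# hypothetical (LPC, CC) costs for four candidate distributed query plans
plans = [(10, 50), (12, 40), (15, 35), (13, 55)]
print(pareto_front(plans))  # [(10, 50), (12, 40), (15, 35)]
```

Plan (13, 55) is dominated by (10, 50) and is excluded; NSGA-II ranks the remaining fronts and applies crowding distance to preserve diversity along the LPC/CC trade-off.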
Tsaur, G A; Riger, T O; Popov, A M; Nasedkina, T V; Kustanovich, A M; Solodovnikov, A G; Streneva, O V; Shorikov, E V; Tsvirenko, S V; Saveliev, L I; Fechina, L G
2015-04-01
The presence of minimal residual disease is an important prognostic factor in acute lymphoblastic leukemia in children and adults. In the overwhelming majority of studies, bone marrow is used to detect minimal residual disease. Here, detection of minimal residual disease in peripheral blood and bone marrow was compared, and the prognostic role of minimal residual disease in peripheral blood and bone marrow under therapy according to the MLL-Baby protocol was evaluated. The analysis covered 142 paired samples from 53 patients younger than 365 days with acute lymphoblastic leukemia and various MLL gene rearrangements. Minimal residual disease was detected by identification of chimeric transcripts using real-time polymerase chain reaction at 7 sequential observation points established by the therapy protocol. The concordance of qualitative detection of minimal residual disease in bone marrow and peripheral blood was 84.5%; in all 22 (15.5%) discordant samples, minimal residual disease was detected only in bone marrow. Despite this high concordance, the presence of minimal residual disease in peripheral blood at various stages of therapy showed no independent prognostic significance. The observed differences were unrelated to the sensitivity of the method, as determined by the absolute expression of the ABL gene, and most likely reflect the real distribution of tumor cells. The results demonstrate that substituting peripheral blood for bone marrow when monitoring minimal residual disease in infant acute lymphoblastic leukemia is inappropriate. At the same time, persistence of minimal residual disease at TH4 in bone marrow was an independent unfavorable prognostic factor under MLL-Baby therapy of acute lymphoblastic leukemia in children in the first year of life (OR = 7.326, confidence interval 2.378-22.565).
Uyttendaele, M; Neyts, K; Vanderswalmen, H; Notebaert, E; Debevere, J
2004-02-01
Aeromonas is an opportunistic pathogen that may be present, although in low numbers, on minimally processed vegetables. Although the intrinsic and extrinsic factors of minimally processed prepacked vegetable mixes do not inhibit the growth of Aeromonas species, multiplication to high numbers during processing and storage of naturally contaminated grated carrots, mixed lettuce, and chopped bell peppers was not observed. Aeromonas was shown to be resistant to chlorination of water but susceptible to 1% and 2% lactic acid and to 0.5% and 1.0% thyme essential oil treatment, although the latter produced adverse sensory properties when applied to decontaminate chopped bell peppers. Integrating a decontamination step with 2% lactic acid into the processing line for grated carrots was shown to have the potential to control the overall microbial quality of the grated carrots and was particularly effective against Aeromonas.
NASA Technical Reports Server (NTRS)
Wirick, S.; Flynn, G. J.; Frank, D.; Sandford, S. A.; Zolensky, M. E.; Tsou, P.; Peltzer, C.; Jacobsen, C.
2009-01-01
Great care and a large effort were made to minimize the amount of organic matter contained within the flight aerogel used to collect Comet 81P/Wild 2 samples. Even so, given the very nature of the production process and silica aerogel's affinity for volatile organics, keeping silica aerogel free from organics is a monumental task. Silica aerogel from three production batches was flown on the Stardust sample return mission. All 3 types had layered densities varying from 5 mg/ml to 50 mg/ml, with the densest aerogel farthest from the collection area. A 2-step gelation process was used to make the flight aerogel; organics used in this process were tetraethylorthosilicate, ethanol, and acetonitrile. Both ammonium hydroxide and nitric acid were also used in the aerogel production process. The flight aerogel was baked at JPL at 300 C for 72 hours; most of the baking was done at atmosphere, but twice a day the oven was pumped to 10 torr for hour [1]. After the aerogel was baked it was stored in a nitrogen-purged cabinet until flight time. One aerogel cell was located in the SRC away from any sample collection area as a witness to possible contamination from outgassing of the spacecraft, re-entry gases, and any other organic encounter. This was the same aerogel used in the interstellar collection sample tray and is the least dense of the 3 batches flown. Organics found in the witness tile include organics containing Si-CH3 bonds, amines, and PAHs. Besides organic contamination, hot spots of calcium were reported in the flight aerogel. Carbonates have been detected in Comet 81P/Wild 2 samples. During preflight analyses, no technique was used to analyze for carbonates in aerogel. To determine if the carbonates found in 81P/Wild 2 samples were from the comet, it is necessary to analyze the flight aerogel for carbonate as well as for organics.
The ethical use of existing samples for genome research.
Bathe, Oliver F; McGuire, Amy L
2009-10-01
Modern biobanking efforts consist of prospective collections of tissues linked to clinical data for patients who have given informed consent for the research use of their specimens and data, including their DNA. In such efforts, patient autonomy and privacy are well respected because of the prospective nature of the informed consent process. However, one of the richest sources of tissue for research continues to be the millions of archived samples collected by pathology departments during normal clinical care or for research purposes without specific consent for future research or genetic analysis. Because specific consent was not obtained a priori, issues related to individual privacy and autonomy are much more complicated. A framework for accessing these existing samples and related clinical data for research is presented. Archival tissues may be accessed only when there is a reasonable likelihood of generating beneficial and scientifically valid information. To minimize risks, databases containing information related to the tissue and to clinical data should be coded, no personally identifying phenotypic information should be included, and access should be restricted to bona fide researchers for legitimate research purposes. These precautions, if implemented appropriately, should ensure that the research use of archival tissue and data are no more than minimal risk. A waiver of the requirement for informed consent would then be justified if reconsent is shown to be impracticable. A waiver of consent should not be granted, however, if there is a significant risk to privacy, if the proposed research use is inconsistent with the original consent (where there is one), or if the potential harm from a privacy breach is considerable.
Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H.M.; Rogan, Peter K.
2017-01-01
Accurate digital image analysis of abnormal microscopic structures relies on high quality images and on minimizing the rates of false positive (FP) and negative objects in images. Cytogenetic biodosimetry detects dicentric chromosomes (DCs) that arise from exposure to ionizing radiation, and determines radiation dose received based on DC frequency. Improvements in automated DC recognition increase the accuracy of dose estimates by reclassifying FP DCs as monocentric chromosomes or chromosome fragments. We also present image segmentation methods to rank high quality digital metaphase images and eliminate suboptimal metaphase cells. A set of chromosome morphology segmentation methods selectively filtered out FP DCs arising primarily from sister chromatid separation, chromosome fragmentation, and cellular debris. This reduced FPs by an average of 55% and was highly specific to these abnormal structures (≥97.7%) in three samples. Additional filters selectively removed images with incomplete, highly overlapped, or missing metaphase cells, or with poor overall chromosome morphologies that increased FP rates. Image selection is optimized and FP DCs are minimized by combining multiple feature based segmentation filters and a novel image sorting procedure based on the known distribution of chromosome lengths. Applying the same image segmentation filtering procedures to both calibration and test samples reduced the average dose estimation error from 0.4 Gy to <0.2 Gy, obviating the need to first manually review these images. This reliable and scalable solution enables batch processing for multiple samples of unknown dose, and meets current requirements for triage radiation biodosimetry of high quality metaphase cell preparations. PMID:29026522
Report #10-P-0218, September 8, 2010. With minimal exceptions, our independent sampling results at the Wheeler Pit Superfund Site were consistent with the sampling results that EPA Region 5 has obtained historically.
ERIC Educational Resources Information Center
Hodge, Megan M.; Gotzke, Carrie L.
2011-01-01
Listeners' identification of young children's productions of minimally contrastive words and predictive relationships between accurately identified words and intelligibility scores obtained from a 100-word spontaneous speech sample were determined for 36 children with typically developing speech (TDS) and 36 children with speech sound disorders…
Understanding and Minimizing Staff Burnout. An Introductory Packet.
ERIC Educational Resources Information Center
California Univ., Los Angeles. Center for Mental Health Schools.
Staff who bring a mental health perspective to the schools can deal with problems of staff burnout. This packet is designed to help in beginning the process of minimizing burnout, a process that requires reducing environmental stressors, increasing personal capabilities, and enhancing job supports. The packet opens with brief discussions of "What…
USDA-ARS's Scientific Manuscript database
Fresh-cut cantaloupes have been associated with outbreaks of salmonellosis, and minimally processed fresh-cut fruits have a limited shelf life because of deterioration caused by spoilage microflora and physiological processes. In this study, we evaluated the effect of minimal wet steam t...
Marchetti, Bárbara V; Candotti, Cláudia T; Raupp, Eduardo G; Oliveira, Eduardo B C; Furlanetto, Tássia S; Loss, Jefferson F
The purpose of this study was to assess a radiographic method for spinal curvature evaluation in children, based on spinous processes, and identify its normality limits. The sample consisted of 90 radiographic examinations of the spines of children in the sagittal plane. Thoracic and lumbar curvatures were evaluated using angular (apex angle [AA]) and linear (sagittal arrow [SA]) measurements based on the spinous processes. The same curvatures were also evaluated using the Cobb angle (CA) method, which is considered the gold standard. For concurrent validity (AA vs CA), Pearson's product-moment correlation coefficient, root-mean-square error, Pitman-Morgan test, and Bland-Altman analysis were used. For reproducibility (AA, SA, and CA), the intraclass correlation coefficient, standard error of measurement, and minimal detectable change measurements were used. A significant correlation was found between CA and AA measurements, as was a low root-mean-square error. The mean difference between the measurements was 0° for thoracic and lumbar curvatures, and the mean standard deviations of the differences were ±5.9° and 6.9°, respectively. The intraclass correlation coefficients of AA and SA were similar to or higher than the gold standard (CA). The standard error of measurement and minimal detectable change of the AA were always lower than the CA. This study determined the concurrent validity, as well as intra- and interrater reproducibility, of the radiographic measurements of kyphosis and lordosis in children. Copyright © 2017. Published by Elsevier Inc.
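The standard error of measurement and minimal detectable change mentioned above follow standard reliability formulas, SEM = SD·√(1−ICC) and MDC95 = 1.96·√2·SEM. A sketch with hypothetical values (not the study's data):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the sample SD and the
    reliability coefficient (ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd, icc):
    """Minimal detectable change at 95% confidence: the smallest change
    exceeding measurement error in repeated assessments."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# hypothetical values: SD of 6 degrees across raters, ICC of 0.90
print(round(sem(6.0, 0.90), 2), round(mdc95(6.0, 0.90), 2))  # 1.9 5.26
```

A higher ICC shrinks both quantities, which is why the AA's SEM and MDC can fall below the CA's despite similar raw variability.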
Responses of fresh-cut products of four mango cultivars under two different storage conditions.
Sharma, Sonu; Rao, Tadapaneni Venkata Ramana
2017-05-01
Due to availability of minimally processed products, the consumption of fresh produce has increased over recent years. The present study has been undertaken with the objective of screening of four mango cultivars ('Kesar', 'Rajapuri', 'Totapuri' and 'Ladvo') for evaluating the consequences of minimal processing on their quality attributes under storage at two different temperatures (5 ± 1 °C, 95% RH and 10 ± 1 °C, 87% RH) up to 12 days. The result of the study revealed significant impacts of low temperature storage on the quality parameters of fresh-cut mango cultivars. The evaluated bioactive compounds such as total phenolics, vitamin C and carotenoids were better retained in the samples stored at 5 °C as compared with that of 10 °C. Moreover, the storage of fresh-cut mango cultivars at 5 °C showed lower water loss and microbial contamination. Sensory analyses revealed that the storage of fresh-cut mango cultivars at 10 °C influenced overall acceptability due to changes in their visual perception, though taste, odor and firmness were less affected. This study revealed a significant variation in the storability of fresh-cut mango cultivars with respect to the storage temperature. Among currently studied four cultivars of mango, slices of 'Totapuri' showed comparatively the least change in color, firmness and sensory properties during storage at 5 and 10 °C and it can be a potential cultivar for fresh-cut processing.
Durand, Axel; Chase, Zanna; Remenyi, Tomas; Quéroué, Fabien
2012-01-01
We have developed a method for the determination of copper in natural waters at nanomolar levels. The use of a microplate-reader minimizes sample processing time (~25 s per sample), reagent consumption (~120 μL per sample), and sample volume (~700 μL). Copper is detected by chemiluminescence. This technique is based on the formation of a complex between copper and 1,10-phenanthroline and the subsequent emission of light during the oxidation of the complex by hydrogen peroxide. Samples are acidified to pH 1.7 and then introduced directly into a 24-well plate. Reagents are added during data acquisition via two reagent injectors. When trace metal clean protocols are employed, the reproducibility is generally less than 7% on blanks and the detection limit is 0.7 nM for seawater and 0.4 nM for freshwater. More than 100 samples per hour can be analyzed with this technique, which is simple, robust, and amenable to at-sea analysis. Seawater samples from Storm Bay in Tasmania illustrate the utility of the method for environmental science. Indeed other trace metals for which optical detection methods exist (e.g., chemiluminescence, fluorescence, and absorbance) could be adapted to the microplate-reader.
Analysis of munitions constituents in groundwater using a field-portable GC-MS.
Bednar, A J; Russell, A L; Hayes, C A; Jones, W T; Tackett, P; Splichal, D E; Georgian, T; Parker, L V; Kirgan, R A; MacMillan, D K
2012-05-01
The use of munitions constituents (MCs) at military installations can produce soil and groundwater contamination that requires periodic monitoring even after training or manufacturing activities have ceased. Traditional groundwater monitoring methods require large volumes of aqueous samples (e.g., 2-4 L) to be shipped under chain of custody, to fixed laboratories for analysis. The samples must also be packed on ice and shielded from light to minimize degradation that may occur during transport and storage. The laboratory's turn-around time for sample analysis and reporting can be as long as 45 d. This process hinders the reporting of data to customers in a timely manner; yields data that are not necessarily representative of current site conditions owing to the lag time between sample collection and reporting; and incurs significant shipping costs for samples. The current work compares a field portable Gas Chromatograph-Mass Spectrometer (GC-MS) for analysis of MCs on-site with traditional laboratory-based analysis using High Performance Liquid Chromatography with UV absorption detection. The field method provides near real-time (within ~1 h of sampling) concentrations of MCs in groundwater samples. Mass spectrometry provides reliable confirmation of MCs and a means to identify unknown compounds that are potential false positives for methods with UV and other non-selective detectors. Published by Elsevier Ltd.
Price, A.; Peterson, James T.
2010-01-01
Stream fish managers often use fish sample data to inform management decisions affecting fish populations. Fish sample data, however, can be biased by the same factors affecting fish populations. To minimize the effect of sample biases on decision making, biologists need information on the effectiveness of fish sampling methods. We evaluated single-pass backpack electrofishing and seining combined with electrofishing by following a dual-gear, mark–recapture approach in 61 blocknetted sample units within first- to third-order streams. We also estimated fish movement out of unblocked units during sampling. Capture efficiency and fish abundances were modeled for 50 fish species by use of conditional multinomial capture–recapture models. The best-approximating models indicated that capture efficiencies were generally low and differed among species groups based on family or genus. Efficiencies of single-pass electrofishing and seining combined with electrofishing were greatest for Catostomidae and lowest for Ictaluridae. Fish body length and stream habitat characteristics (mean cross-sectional area, wood density, mean current velocity, and turbidity) also were related to capture efficiency of both methods, but the effects differed among species groups. We estimated that, on average, 23% of fish left the unblocked sample units, but net movement varied among species. Our results suggest that (1) common warmwater stream fish sampling methods have low capture efficiency and (2) failure to adjust for incomplete capture may bias estimates of fish abundance. We suggest that managers minimize bias from incomplete capture by adjusting data for site- and species-specific capture efficiency and by choosing sampling gear that provide estimates with minimal bias and variance. Furthermore, if block nets are not used, we recommend that managers adjust the data based on unconditional capture efficiency.
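The study fits conditional multinomial capture-recapture models; as a simpler illustration of the same idea of correcting counts for incomplete capture, the classical Chapman-corrected Lincoln-Petersen estimator can be sketched as follows (generic textbook formula, not the authors' models; all counts are hypothetical):

```python
def chapman_estimate(marked, captured, recaptured):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate
    from a two-pass mark-recapture sample."""
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

def capture_efficiency(marked, abundance):
    """Fraction of the estimated population caught on the first pass."""
    return marked / abundance

# hypothetical unit: 40 fish marked, 50 caught on the second pass, 16 recaptured
n_hat = chapman_estimate(40, 50, 16)
print(round(n_hat), round(capture_efficiency(40, n_hat), 2))  # 122 0.33
```

Dividing a raw single-pass count by such an efficiency estimate is the simplest form of the adjustment for incomplete capture that the authors recommend.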
Quality and loudness judgments for music subjected to compression limiting.
Croghan, Naomi B H; Arehart, Kathryn H; Kates, James M
2012-08-01
Dynamic-range compression (DRC) is used in the music industry to maximize loudness. The amount of compression applied to commercial recordings has increased over time due to a motivating perspective that louder music is always preferred. In contrast to this viewpoint, artists and consumers have argued that using large amounts of DRC negatively affects the quality of music. However, little research evidence has supported the claims of either position. The present study investigated how DRC affects the perceived loudness and sound quality of recorded music. Rock and classical music samples were peak-normalized and then processed using different amounts of DRC. Normal-hearing listeners rated the processed and unprocessed samples on overall loudness, dynamic range, pleasantness, and preference, using a scaled paired-comparison procedure in two conditions: un-equalized, in which the loudness of the music samples varied, and loudness-equalized, in which loudness differences were minimized. Results indicated that a small amount of compression was preferred in the un-equalized condition, but the highest levels of compression were generally detrimental to quality, whether loudness was equalized or varied. These findings are contrary to the "louder is better" mentality in the music industry and suggest that more conservative use of DRC may be preferred for commercial music.
Accurate Rapid Lifetime Determination on Time-Gated FLIM Microscopy with Optical Sectioning
Silva, Susana F.; Domingues, José Paulo
2018-01-01
Time-gated fluorescence lifetime imaging microscopy (FLIM) is a powerful technique to assess the biochemistry of cells and tissues. When applied to living thick samples, it is hampered by the lack of optical sectioning and the need to acquire many images for an accurate measurement of fluorescence lifetimes. Here, we report on the use of processing techniques to overcome these limitations, minimizing the acquisition time while providing optical sectioning. We evaluated the application of the HiLo and the rapid lifetime determination (RLD) techniques for accurate measurement of fluorescence lifetimes with optical sectioning. HiLo provides optical sectioning by combining the high-frequency content from a standard image, obtained with uniform illumination, with the low-frequency content of a second image, acquired using structured illumination. Our results show that HiLo produces optical sectioning on thick samples without degrading the accuracy of the measured lifetimes. We also show that instrument response function (IRF) deconvolution can be applied with the RLD technique on HiLo images, greatly improving the accuracy of the measured lifetimes. These results open the possibility of using the RLD technique with pulsed diode laser sources to accurately determine fluorescence lifetimes in the subnanosecond range on thick multilayer samples, provided that offline processing is allowed. PMID:29599938
Accurate Rapid Lifetime Determination on Time-Gated FLIM Microscopy with Optical Sectioning.
Silva, Susana F; Domingues, José Paulo; Morgado, António Miguel
2018-01-01
Time-gated fluorescence lifetime imaging microscopy (FLIM) is a powerful technique to assess the biochemistry of cells and tissues. When applied to living thick samples, it is hampered by the lack of optical sectioning and the need to acquire many images for an accurate measurement of fluorescence lifetimes. Here, we report on the use of processing techniques to overcome these limitations, minimizing the acquisition time while providing optical sectioning. We evaluated the application of the HiLo and the rapid lifetime determination (RLD) techniques for accurate measurement of fluorescence lifetimes with optical sectioning. HiLo provides optical sectioning by combining the high-frequency content from a standard image, obtained with uniform illumination, with the low-frequency content of a second image, acquired using structured illumination. Our results show that HiLo produces optical sectioning on thick samples without degrading the accuracy of the measured lifetimes. We also show that instrument response function (IRF) deconvolution can be applied with the RLD technique on HiLo images, greatly improving the accuracy of the measured lifetimes. These results open the possibility of using the RLD technique with pulsed diode laser sources to accurately determine fluorescence lifetimes in the subnanosecond range on thick multilayer samples, provided that offline processing is allowed.
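For two equal-width gates and a mono-exponential decay, the RLD estimate has a closed form, tau = dt / ln(D0/D1), where D0 and D1 are the integrated gate intensities and dt their separation. A minimal sketch with synthetic gate counts; the numbers are illustrative, not from the paper:

```python
import math

def rld_lifetime(d0, d1, dt):
    """Two-gate rapid lifetime determination (RLD): for a mono-exponential
    decay sampled by two equal-width gates separated by dt, the lifetime is
    tau = dt / ln(D0 / D1)."""
    return dt / math.log(d0 / d1)

# Synthetic example: a 2.0 ns lifetime with gates 1.0 ns apart.
tau_true, dt = 2.0, 1.0
d0 = 1000.0                          # integrated counts in the first gate
d1 = d0 * math.exp(-dt / tau_true)   # counts expected in the second gate
print(rld_lifetime(d0, d1, dt))      # recovers ~2.0 ns
```

Only two images per lifetime estimate are needed, which is why RLD minimizes acquisition time relative to fitting many gate positions.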
Predictive control of hollow-fiber bioreactors for the production of monoclonal antibodies.
Dowd, J E; Weber, I; Rodriguez, B; Piret, J M; Kwok, K E
1999-05-20
The selection of medium feed rates for perfusion bioreactors represents a challenge for process optimization, particularly in bioreactors that are sampled infrequently. When the present and immediate future of a bioprocess can be adequately described, predictive control can minimize deviations from set points in a manner that can maximize process consistency. Predictive control of perfusion hollow-fiber bioreactors was investigated in a series of hybridoma cell cultures that compared operator control to computer estimation of feed rates. Adaptive software routines were developed to estimate the current and predict the future glucose uptake and lactate production of the bioprocess at each sampling interval. The current and future glucose uptake rates were used to select the perfusion feed rate in a designed response to deviations from the set point values. The routines presented a graphical user interface through which the operator was able to view the up-to-date culture performance and assess the model description of the immediate future culture performance. In addition, fewer samples were taken in the computer-estimated cultures, reducing labor and analytical expense. The use of these predictive controller routines and the graphical user interface decreased the glucose and lactate concentration variances up to sevenfold, and antibody yields increased by 10% to 43%. Copyright 1999 John Wiley & Sons, Inc.
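The abstract does not give the controller equations, but a one-step-ahead predictive feed-rate choice for a simple perfusion glucose balance dG/dt = D(G_feed - G) - q can be sketched as follows; the model, variable names, and numbers are assumptions for illustration, not the paper's actual algorithm:

```python
def predictive_feed_rate(g_now, g_set, g_feed, q_pred, dt):
    """One-step-ahead predictive choice of the perfusion rate D for a
    simple glucose balance dG/dt = D*(g_feed - G) - q: pick D so the
    model predicts G == g_set at the next sampling time."""
    return (q_pred + (g_set - g_now) / dt) / (g_feed - g_now)

# Hypothetical numbers: 2.0 g/L glucose now, set point 2.5 g/L, feed medium
# at 4.5 g/L, predicted uptake 1.2 g/(L*day), 0.5 day between samples.
d = predictive_feed_rate(2.0, 2.5, 4.5, 1.2, 0.5)

# Forward-simulate one model step to confirm the set point is met.
g_next = 2.0 + 0.5 * (d * (4.5 - 2.0) - 1.2)
print(round(g_next, 2))  # 2.5
```

The paper's adaptive routines additionally re-estimate the uptake rate q at each sampling interval; here q_pred is simply given.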
NASA Astrophysics Data System (ADS)
Bitner, Rex M.; Koller, Susan C.
2004-06-01
Three different methods of automated high-throughput purification of genomic DNA from plant materials processed in 96-well plates are described. One method uses MagneSil paramagnetic particles to purify DNA present in single leaf punch samples or small seed samples, using 320 µl capacity 96-well plates, which minimizes reagent and plate costs. A second method uses 2.2 ml and 1.2 ml capacity plates and allows the purification of larger amounts of DNA from 5-6 punches of materials or larger amounts of seeds. The third method uses the MagneSil ONE purification system to purify a fixed amount of DNA, thus simplifying the processing of downstream applications by normalizing the amounts of DNA so they do not require quantitation. Protocols for the purification of a fixed yield of DNA, e.g., 1 µg, from plant leaf or seed samples using MagneSil paramagnetic particles and a Beckman-Coulter BioMek FX robot are described. DNA from all three methods is suitable for applications such as PCR, RAPD, STR, READIT SNP analysis, and multiplexed PCR systems. The MagneSil ONE system is also suitable for use with SNP detection systems such as Third Wave Technology's Invader methods.
Martínez Steele, Eurídice; Baraldi, Larissa Galastri; Louzada, Maria Laura da Costa; Moubarac, Jean-Claude; Mozaffarian, Dariush; Monteiro, Carlos Augusto
2016-03-09
To investigate the contribution of ultra-processed foods to the intake of added sugars in the USA. Ultra-processed foods were defined as industrial formulations which, besides salt, sugar, oils and fats, include substances not used in culinary preparations, in particular additives used to imitate sensorial qualities of minimally processed foods and their culinary preparations. Cross-sectional study. National Health and Nutrition Examination Survey 2009-2010. We evaluated 9317 participants aged 1+ years with at least one 24 h dietary recall. Average dietary content of added sugars and proportion of individuals consuming more than 10% of total energy from added sugars. Gaussian and Poisson regressions estimated the association between consumption of ultra-processed foods and intake of added sugars. All models incorporated survey sample weights and adjusted for age, sex, race/ethnicity, family income and educational attainment. Ultra-processed foods comprised 57.9% of energy intake, and contributed 89.7% of the energy intake from added sugars. The content of added sugars in ultra-processed foods (21.1% of calories) was eightfold higher than in processed foods (2.4%) and fivefold higher than in unprocessed or minimally processed foods and processed culinary ingredients grouped together (3.7%). Both in unadjusted and adjusted models, each increase of 5 percentage points in proportional energy intake from ultra-processed foods increased the proportional energy intake from added sugars by 1 percentage point. Consumption of added sugars increased linearly across quintiles of ultra-processed food consumption: from 7.5% of total energy in the lowest quintile to 19.5% in the highest. A total of 82.1% of Americans in the highest quintile exceeded the recommended limit of 10% energy from added sugars, compared with 26.4% in the lowest. Decreasing the consumption of ultra-processed foods could be an effective way of reducing the excessive intake of added sugars in the USA. 
Minimization In Digital Design As A Meta-Planning Problem
NASA Astrophysics Data System (ADS)
Ho, William P. C.; Wu, Jung-Gen
1987-05-01
In our model-based expert system for automatic digital system design, we formalize the design process into three subprocesses: compiling high-level behavioral specifications into primitive behavioral operations, grouping primitive operations into behavioral functions, and grouping functions into modules. Consideration of design minimization explicitly controls decision-making in the last two subprocesses. Design minimization, a key task in the automatic design of digital systems, is complicated by the high degree of interaction among the time sequence and content of design decisions. In this paper, we present an AI approach which directly addresses these interactions and their consequences by modeling the minimization problem as a planning problem, and the management of design decision-making as a meta-planning problem.
You, David J; Geshell, Kenneth J; Yoon, Jeong-Yeol
2011-10-15
Direct and sensitive detection of foodborne pathogens from fresh produce samples was accomplished using a handheld lab-on-a-chip device, requiring little to no sample processing or enrichment and enabling near-real-time detection with a truly field-deployable device. The detection of Escherichia coli K12 and O157:H7 in iceberg lettuce was achieved utilizing optimized Mie light scatter parameters with a latex particle immunoagglutination assay. The system exhibited good sensitivity, with a limit of detection of 10 CFU mL(-1) and an assay time of <6 min. Minimal pretreatment with no detrimental effects on assay sensitivity and reproducibility was accomplished with a simple and cost-effective KimWipes filter and disposable syringe. Mie simulations were used to determine the optimal parameters (particle size d, wavelength λ, and scatter angle θ) for the assay that maximize light scatter intensity of agglutinated latex microparticles and minimize light scatter intensity of the tissue fragments of iceberg lettuce, and these parameters were experimentally validated. This introduces a powerful method for detecting foodborne pathogens in fresh produce and other potential sample matrices. The integration of a multi-channel microfluidic chip allowed for differential detection of the agglutinated particles in the presence of the antigen, yielding a truly field-deployable detection system with decreased assay time and improved robustness over comparable benchtop systems. Additionally, two sample preparation methods were evaluated through simulated field studies based on overall sensitivity, protocol complexity, and assay time. Preparation of the plant tissue sample by grinding resulted in a two-fold improvement in scatter intensity over washing, accompanied by a significant increase in assay time: ∼5 min (grinding) versus ∼1 min (washing). Specificity studies demonstrated binding of E. coli O157:H7 EDL933 to only O157:H7 antibody-conjugated particles, with no cross-reactivity to K12. 
This suggests the adaptability of the system for use with a wide variety of pathogens, and the potential to detect in a variety of biological matrices with little to no sample pretreatment. Copyright © 2011 Elsevier B.V. All rights reserved.
Song, Lili; Chen, Hangjun; Gao, Haiyan; Fang, Xiangjun; Mu, Honglei; Yuan, Ya; Yang, Qian; Jiang, Yueming
2013-09-04
Minimally processed water bamboo shoot (WBS) lignifies and deteriorates rapidly at room temperature, which greatly limits its marketability. This study investigated the effect of modified atmosphere packaging (MAP) on the sensory quality index, lignin formation, production of reactive oxygen species (ROS) and activities of scavenging enzymes, membrane integrity and energy status of minimally processed WBS when packaged with or without sealed low-density polyethylene (LDPE) bags, and then stored at 20°C for 9 days or 2°C for 60 days. The sensory quality of minimally processed WBS decreased quickly after 6 days of storage at 20°C. Low temperature storage maintained a higher sensory quality index within the first 30 days, but exhibited higher contents of lignin and hydrogen peroxide (H2O2) as compared with non-MAP shoots at 20°C. Combined MAP and low temperature storage not only maintained good sensory quality after 30 days, but also significantly reduced the increases in lignin content, superoxide anion (O2.-) production rate, H2O2 content and membrane permeability, maintained high activities of superoxide dismutase (SOD), catalase (CAT) and ascorbate peroxidase (APX), and reduced the increase in activities of lipase, phospholipase D (PLD) and lipoxygenase (LOX). Furthermore, the minimally processed WBS under MAP conditions exhibited higher energy charge (EC) and lower adenosine monophosphate (AMP) content by the end of storage (60 days) at 2°C than those without MAP or stored for 9 days at 20°C. These results indicated that MAP in combination with low temperature storage reduced lignification of minimally processed WBS, which was closely associated with maintenance of energy status and enhanced activities of antioxidant enzymes, as well as alleviation of membrane damage caused by ROS.
2013-01-01
Background Minimally processed water bamboo shoot (WBS) lignifies and deteriorates rapidly at room temperature, which greatly limits its marketability. This study investigated the effect of modified atmosphere packaging (MAP) on the sensory quality index, lignin formation, production of reactive oxygen species (ROS) and activities of scavenging enzymes, membrane integrity and energy status of minimally processed WBS when packaged with or without sealed low-density polyethylene (LDPE) bags, and then stored at 20°C for 9 days or 2°C for 60 days. Results The sensory quality of minimally processed WBS decreased quickly after 6 days of storage at 20°C. Low temperature storage maintained a higher sensory quality index within the first 30 days, but exhibited higher contents of lignin and hydrogen peroxide (H2O2) as compared with non-MAP shoots at 20°C. Combined MAP and low temperature storage not only maintained good sensory quality after 30 days, but also significantly reduced the increases in lignin content, superoxide anion (O2.-) production rate, H2O2 content and membrane permeability, maintained high activities of superoxide dismutase (SOD), catalase (CAT) and ascorbate peroxidase (APX), and reduced the increase in activities of lipase, phospholipase D (PLD) and lipoxygenase (LOX). Furthermore, the minimally processed WBS under MAP conditions exhibited higher energy charge (EC) and lower adenosine monophosphate (AMP) content by the end of storage (60 days) at 2°C than those without MAP or stored for 9 days at 20°C. Conclusion These results indicated that MAP in combination with low temperature storage reduced lignification of minimally processed WBS, which was closely associated with maintenance of energy status and enhanced activities of antioxidant enzymes, as well as alleviation of membrane damage caused by ROS. PMID:24006941
Design and testing of coring bits on drilling lunar rock simulant
NASA Astrophysics Data System (ADS)
Li, Peng; Jiang, Shengyuan; Tang, Dewei; Xu, Bo; Ma, Chao; Zhang, Hui; Qin, Hongwei; Deng, Zongquan
2017-02-01
Coring bits are widely utilized in the sampling of celestial bodies, and their drilling behavior directly affects the sampling results and drilling security. This paper introduces a lunar regolith coring bit (LRCB), a key component of the sampling tool for breaking lunar rock during the lunar soil sampling process. We establish the interaction model between the drill bit and rock at a small cutting depth, and determine the two main parameters of the LRCB that influence drilling loads (the forward and outward rake angles). We screen the LRCB parameters with the aim of minimizing the weight on bit (WOB). We verify the drilling load performance of the LRCB after optimization; drilling loads increase with penetration per revolution (PPR). In addition, we perform lunar soil drilling simulations to estimate the chip-conveying and sample-coring efficiency of the LRCB. The simulation and test results are basically consistent on coring efficiency, and in simulation the chip removal efficiency of the LRCB is slightly lower than that of the HIT-H bit. This work proposes a method for the design of coring bits for subsequent extraterrestrial exploration.
Sputum color: potential implications for clinical practice.
Johnson, Allen L; Hampson, David F; Hampson, Neil B
2008-04-01
Respiratory infections with sputum production are a major reason for physician visits, diagnostic testing, and antibiotic prescription in the United States. We sought to determine whether the simple characteristic of sputum color provides information that impacts resource utilization such as laboratory testing and prescription of antibiotics. Out-patient sputum samples submitted to the microbiology laboratory for routine analysis were assigned to one of 8 color categories (green, yellow-green, rust, yellow, red, cream, white, and clear), based on a key made from paint chip color samples. Subsequent Gram stain and culture results were compared to sputum color. Of 289 consecutive samples, 144 (50%) met standard Gram-stain criteria for being acceptable lower-respiratory-tract specimens. In the acceptable Gram-stain group, 60 samples had a predominant organism on Gram stain, and the culture yielded a consistent result in 42 samples (15% of the 289 total specimens). Yield at each level of analysis differed greatly by color. The yield from sputum colors green, yellow-green, yellow, and rust was much higher than the yield from cream, white, or clear. If out-patient sputum is cream, white, or clear, the yield from bacteriologic analysis is extremely low. This information can reduce laboratory processing costs and help minimize unnecessary antibiotic prescription.
Specialized minimal PDFs for optimized LHC calculations.
Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan
2016-01-01
We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically as regards their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top-quark pair, and electroweak gauge boson physics, and we determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4, and 11 Hessian eigenvectors, respectively, are enough to fully describe the corresponding processes.
Torczynski, John R.
2000-01-01
A spin coating apparatus requires less cleanroom air flow than prior spin coating apparatus to minimize cleanroom contamination. A shaped exhaust duct from the spin coater maintains process quality while requiring reduced cleanroom air flow. The exhaust duct can decrease in cross section as it extends from the wafer, minimizing eddy formation. The exhaust duct can conform to entrainment streamlines to minimize eddy formation and reduce interprocess contamination at minimal cleanroom air flow rates.
Factor Structure and Correlates of the Dissociative Experiences Scale in a Large Offender Sample
ERIC Educational Resources Information Center
Ruiz, Mark A.; Poythress, Norman G.; Lilienfeld, Scott O.; Douglas, Kevin S.
2008-01-01
The authors examined the psychometric properties, factor structure, and construct validity of the Dissociative Experiences Scale (DES) in a large offender sample (N = 1,515). Although the DES is widely used with community and clinical samples, minimal work has examined offender samples. Participants were administered self-report and interview…
Hydrothermal diamond-anvil cell: Application to studies of geologic fluids
Chou, I.-Ming
2003-01-01
The hydrothermal diamond-anvil cell (HDAC) was designed to simulate the geologic conditions of crustal processes in the presence of water or other fluids. The HDAC has been used to apply external pressure to both synthetic and natural fluid inclusions in quartz to minimize problems caused by stretching or decrepitation of inclusions during microthermometric analysis. When the HDAC is loaded with a fluid sample, it can be considered as a large synthetic fluid inclusion and, therefore, can be used to study the PVTX properties as well as phase relations of the sample fluid. Because the HDAC has a wide measurement pressure-temperature range and also allows in-situ optical observations, it has been used to study critical phenomena of various chemical systems, such as the geologically important hydrous silicate melts. When the HDAC is combined with synchrotron X-ray sources, it is possible to obtain basic information on the speciation and structure of metal complexes, including rare-earth element (REE) complexes, in hydrothermal solutions, as revealed by X-ray absorption fine structure (XAFS) spectra. Recent modifications of the HDAC minimize the loss of intensity of X-rays due to scattering and absorption by the diamonds. These modifications are especially important for studying elements with absorption edges below 10 keV and therefore particularly valuable for our understanding of transport and deposition of first-row transition elements and REE in hydrothermal environments.
Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Kabir, Abuzar; Furton, Kenneth G; Santana-Rodríguez, José Juan
2015-10-01
A fast and sensitive sample preparation strategy using fabric phase sorptive extraction followed by ultra-high-performance liquid chromatography and tandem mass spectrometry detection has been developed to analyse benzotriazole UV stabilizer compounds in aqueous samples. Benzotriazole UV stabilizer compounds are a group of compounds added to sunscreens and other personal care products which may present detrimental effects to aquatic ecosystems. Fabric phase sorptive extraction is a novel solvent-minimized sample preparation approach that integrates the advantages of sol-gel derived hybrid inorganic-organic nanocomposite sorbents and the flexible, permeable and hydrophobic surface chemistry of polyester fabric. It is a highly sensitive, fast, efficient and inexpensive device that can be reused and does not suffer from coating damage, unlike SPME fibres or stir bars. In this paper, we optimized the extraction of seven benzotriazole UV filters, evaluating the majority of the parameters involved in the extraction process, such as sorbent chemistry selection, extraction time, back-extraction solvent, back-extraction time and the impact of ionic strength. Under the optimized conditions, fabric phase sorptive extraction allows 10-fold enrichment, with detection limits ranging from 6.01 to 60.7 ng L(-1) and intra- and inter-day RSDs lower than 11% and 30%, respectively, for all compounds. The optimized sample preparation technique followed by ultra-high-performance liquid chromatography and tandem mass spectrometry detection was applied to determine the target analytes in sewage samples from wastewater treatment plants with different purification processes on Gran Canaria Island (Spain). Two UV stabilizer compounds were measured, at 17.0-60.5 ng mL(-1) (UV 328) and 69.3-99.2 ng mL(-1) (UV 360), in the three sewage water samples analysed.
Combined effects of microwaves, electron beams and polyfunctional monomers on rubber vulcanization.
Manaila, Elena; Martin, Diana; Stelescu, Daniela Zuga; Craciun, Gabriela; Ighigeanu, Daniel; Matei, Constantin
2009-01-01
This paper presents comparative results obtained by conventional vulcanization with benzoyl peroxide (CV-BP), separate electron beam vulcanization (EB-V) and simultaneous electron beam and microwave vulcanization (EB+MW-V) applied to two kinds of rubber samples: EVA (ethylene vinyl acetate) rubber samples (EVA-samples) and EPDM (ethylene-propylene terpolymer) rubber samples (EPDM-samples). The EVA-samples contain 61.54% EVA Elvax 260, 30.77% carbon black, 1.85% TAC (triallylcyanurate) polyfunctional monomer and 5.84% filler (zinc oxide, stearic acid, polyethylene glycol and antioxidant). The EPDM-samples contain 61.54% EPDM Nordel 4760, 30.77% carbon black, 1.85% TMPT (trimethylopropane trimethacrylate) polyfunctional monomer and 5.84% filler (zinc oxide, stearic acid, polyethylene glycol and antioxidant). The rubber samples designed for the different vulcanization methods were obtained from raw rubber mixtures as 2 mm compressed sheets in polyethylene foils to minimize oxidation. For EB and EB+MW treatments the sheets were cut into rectangles of 0.15 x 0.15 m2. The physical properties of samples obtained by the CV-BP, EB-V and EB+MW-V methods were evaluated by measuring tearing strength, residual elongation, elongation at break, tensile strength, 300% modulus, 100% modulus, elasticity and hardness. The obtained results demonstrate an improvement of several rubber properties by EB and EB+MW processing as compared to the classical procedure using benzoyl peroxide.
Skvortsova, Vasilisa; Degos, Bertrand; Welter, Marie-Laure; Vidailhet, Marie; Pessiglione, Mathias
2017-06-21
Instrumental learning is a fundamental process through which agents optimize their choices, taking into account various dimensions of available options such as the possible reward or punishment outcomes and the costs associated with potential actions. Although the implication of dopamine in learning from choice outcomes is well established, less is known about its role in learning the action costs such as effort. Here, we tested the ability of patients with Parkinson's disease (PD) to maximize monetary rewards and minimize physical efforts in a probabilistic instrumental learning task. The implication of dopamine was assessed by comparing performance ON and OFF prodopaminergic medication. In a first sample of PD patients (n = 15), we observed that reward learning, but not effort learning, was selectively impaired in the absence of treatment, with a significant interaction between learning condition (reward vs effort) and medication status (OFF vs ON). These results were replicated in a second, independent sample of PD patients (n = 20) using a simplified version of the task. According to Bayesian model selection, the best account for medication effects in both studies was a specific amplification of reward magnitude in a Q-learning algorithm. These results suggest that learning to avoid physical effort is independent from dopaminergic circuits and strengthen the general idea that dopaminergic signaling amplifies the effects of reward expectation or obtainment on instrumental behavior. SIGNIFICANCE STATEMENT Theoretically, maximizing reward and minimizing effort could involve the same computations and therefore rely on the same brain circuits. Here, we tested whether dopamine, a key component of reward-related circuitry, is also implicated in effort learning. We found that patients suffering from dopamine depletion due to Parkinson's disease were selectively impaired in reward learning, but not effort learning. 
Moreover, anti-parkinsonian medication restored the ability to maximize reward, but had no effect on effort minimization. This dissociation suggests that the brain has evolved separate, domain-specific systems for instrumental learning. These results help to disambiguate the motivational role of prodopaminergic medications: they amplify the impact of reward without affecting the integration of effort cost. Copyright © 2017 the authors.
Chemical fractionation-enhanced structural characterization of marine dissolved organic matter
NASA Astrophysics Data System (ADS)
Arakawa, N.; Aluwihare, L.
2016-02-01
Describing the molecular fingerprint of dissolved organic matter (DOM) requires sample processing methods and separation techniques that can adequately minimize its complexity. We have employed acid hydrolysis as a way to make the subcomponents of marine solid-phase-extracted (PPL) DOM more accessible to analytical techniques. Using a combination of NMR and chemical derivatization or reduction analyzed by comprehensive two-dimensional gas chromatography (GC×GC), we observed chemical features strikingly similar to terrestrial DOM. In particular, we observed reduced alicyclic hydrocarbons believed to be the backbone of previously identified carboxyl-rich alicyclic material (CRAM). Additionally, we found carbohydrates, amino acids, and small lipids and acids.
Characterization of the Environmentally Induced Chemical Transformations of Uranium Tetrafluoride
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wellons, M.
A key challenge with nuclear safeguards environmental sampling is identification of the materials post-release, owing to subsequent chemical reactions with ambient water and oxygen. Uranium tetrafluoride (UF4) is of interest as an intermediate in both the upstream and downstream portions of the uranium feedstock and metal production processes used in nuclear fuel production; however, minimal published research exists on UF4 hydrolysis. FY16 efforts were dedicated to in-situ Raman spectroscopy and X-ray diffraction characterization of UF4 during exposure to various relative humidity conditions. This effort mapped several hydrolysis reaction pathways and identified both intermediate and terminal progeny species.
Miyata, Tomohiro; Mizoguchi, Teruyasu
2018-03-01
Understanding structures and spatial distributions of molecules in liquid phases is crucial for the control of liquid properties and to develop efficient liquid-phase processes. Here, real-space mapping of molecular distributions in a liquid was performed. Specifically, the ionic liquid 1-Ethyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide (C2mimTFSI) was imaged using atomic-resolution scanning transmission electron microscopy. Simulations revealed network-like bright regions in the images that were attributed to the TFSI- anion, with minimal contributions from the C2mim+ cation. Simple visualization of the TFSI- distribution in the liquid sample was achieved by binarizing the experimental image.
Evaluation of radiometric and geometric characteristics of LANDSAT-D imaging system
NASA Technical Reports Server (NTRS)
Salisbury, J. W.; Podwysocki, M. H.; Bender, L. U.; Rowan, L. C. (Principal Investigator)
1983-01-01
With vegetation masked and noise sources eliminated or minimized, different carbonate facies could be discriminated in a south Florida scene. Laboratory spectra of grab samples indicate that a 20% change in depth of the carbonate absorption band was detected despite the effects of atmospheric absorption. Both bright and dark hydrothermally altered volcanic rocks can be discriminated from their unaltered equivalents. A previously unrecognized altered area was identified on the basis of the TM images. The ability to map desert varnish in semi-arid terrains has economic significance, as it defines areas that are less susceptible to desert erosional processes and more suitable for construction development.
Chi-squared and C statistic minimization for low count per bin data
NASA Astrophysics Data System (ADS)
Nousek, John A.; Shue, David R.
1989-07-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
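The two fit statistics compared in this abstract can be made concrete. Below is a minimal Python sketch (not the authors' simulation code; the example counts are invented) of Pearson's chi-squared and Cash's C statistic for Poisson-distributed binned counts:

```python
import numpy as np

def cash_c(observed, model):
    """Cash's C statistic: 2 * sum(m - n + n*ln(n/m)).

    Derived from the Poisson likelihood, so it remains valid in the
    low-count-per-bin regime where chi-squared breaks down.
    """
    observed = np.asarray(observed, dtype=float)
    model = np.asarray(model, dtype=float)
    term = np.zeros_like(model)
    mask = observed > 0  # the n*ln(n/m) term is taken as 0 when n == 0
    term[mask] = observed[mask] * np.log(observed[mask] / model[mask])
    return 2.0 * np.sum(model - observed + term)

def chi_squared(observed, model):
    """Pearson chi-squared, which assumes roughly Gaussian bin errors."""
    observed = np.asarray(observed, dtype=float)
    model = np.asarray(model, dtype=float)
    return np.sum((observed - model) ** 2 / model)

obs = [0, 1, 2, 1, 0, 3]             # low counts per bin (invented data)
mod = [0.8, 1.1, 1.5, 1.2, 0.9, 2.0]  # corresponding model prediction
print(cash_c(obs, mod), chi_squared(obs, mod))
```

Either statistic can be handed to a generic minimizer (Marquardt, Powell, etc.); the abstract's point is that minimizing C rather than chi-squared yields better fits when counts per bin are small.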
Lohmann, Rainer; Jaward, Foday M; Durham, Louise; Barber, Jonathan L; Ockenden, Wendy; Jones, Kevin C; Bruhn, Regina; Lakaschus, Soenke; Dachs, Jordi; Booij, Kees
2004-07-15
Air samples were taken onboard the RRS Bransfield on an Atlantic cruise from the United Kingdom to Halley, Antarctica, from October to December 1998, with the aim of establishing PCB oceanic background air concentrations and assessing their latitudinal distribution. Great care was taken to minimize pre- and post-collection contamination of the samples, which was validated through stringent QA/QC procedures. However, there is evidence that onboard contamination of the air samples occurred, following insidious, diffusive emissions on the ship. Other data (for PCBs and other persistent organic pollutants (POPs)) and examples of shipboard contamination are presented. The implications of these findings for past and future studies of global POPs distribution are discussed. Recommendations are made to help critically appraise and minimize the problems of insidious/diffusive shipboard contamination.
Characterization, adaptive traffic shaping, and multiplexing of real-time MPEG II video
NASA Astrophysics Data System (ADS)
Agrawal, Sanjay; Barry, Charles F.; Binnai, Vinay; Kazovsky, Leonid G.
1997-01-01
We obtain a network traffic model for real-time MPEG-II encoded digital video by analyzing video stream samples from real-time encoders from NUKO Information Systems. The MPEG-II sample streams include a resolution intensive movie, City of Joy; an action intensive movie, Aliens; a luminance intensive (black and white) movie, Road To Utopia; and a chrominance intensive (color) movie, Dick Tracy. From our analysis we obtain a heuristic model for the encoded video traffic which uses a 15-stage Markov process to model the I, B, P frame sequences within a group of pictures (GOP). A jointly-correlated Gaussian process is used to model the individual frame sizes. Scene change arrivals are modeled according to a gamma process. Simulations show that our MPEG-II traffic model generates I, B, P frame sequences and frame sizes that closely match the sample MPEG-II stream traffic characteristics as they relate to latency and buffer occupancy in network queues. To achieve high multiplexing efficiency we propose a traffic shaping scheme which sets preferred I-frame generation times among a group of encoders so as to minimize the overall variation in total offered traffic while still allowing the individual encoders to react to scene changes. Simulations show that our scheme results in multiplexing gains of up to 10%, enabling us to multiplex twenty 6 Mbps MPEG-II video streams instead of 18 over an ATM/SONET OC3 link without latency or cell loss penalty. A patent for this scheme is pending.
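As a toy illustration of the modeling idea, the Python sketch below generates I/B/P frame-type sequences from a small Markov chain. The transition probabilities here are invented for illustration; the paper's actual model is a 15-stage Markov process fitted to the sample streams.

```python
import random

# Hypothetical transition probabilities between frame types.
# These numbers are illustrative only, not fitted to any stream.
TRANSITIONS = {
    "I": [("B", 0.9), ("P", 0.1)],
    "B": [("B", 0.5), ("P", 0.5)],
    "P": [("B", 0.8), ("I", 0.2)],
}

def next_frame(state, rng):
    """Sample the next frame type given the current one."""
    r = rng.random()
    acc = 0.0
    for nxt, p in TRANSITIONS[state]:
        acc += p
        if r < acc:
            return nxt
    return TRANSITIONS[state][-1][0]  # guard against float round-off

def generate_gop(length=15, seed=0):
    """Generate a frame-type sequence starting from an I frame."""
    rng = random.Random(seed)
    seq = ["I"]
    while len(seq) < length:
        seq.append(next_frame(seq[-1], rng))
    return "".join(seq)

print(generate_gop())
```

Attaching a per-type size distribution (the paper uses a jointly correlated Gaussian process) to each generated frame would then yield a synthetic traffic trace for queueing simulations.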
Radiation induced corrosion of copper for spent nuclear fuel storage
NASA Astrophysics Data System (ADS)
Björkbacka, Åsa; Hosseinpour, Saman; Johnson, Magnus; Leygraf, Christofer; Jonsson, Mats
2013-11-01
The long term safety of repositories for radioactive waste is one of the main concerns for countries utilizing nuclear power. The integrity of engineered and natural barriers in such repositories must be carefully evaluated in order to minimize the release of radionuclides to the biosphere. One of the most developed concepts of long term storage of spent nuclear fuel is the Swedish KBS-3 method. According to this method, the spent fuel will be sealed inside copper canisters surrounded by bentonite clay and placed 500 m down in stable bedrock. Despite the importance of the process of radiation induced corrosion of copper, relatively few studies have been reported. In this work the effect of the total gamma dose on radiation induced corrosion of copper in anoxic pure water has been studied experimentally. Copper samples submerged in water were exposed to a series of total doses using three different dose rates. Unirradiated samples were used as reference samples throughout. The copper surfaces were examined qualitatively using IRAS and XPS and quantitatively using cathodic reduction. The concentration of copper in solution after irradiation was measured using ICP-AES. The influence of aqueous radiation chemistry on the corrosion process was evaluated based on numerical simulations. The experiments show that the dissolution as well as the oxide layer thickness increase upon irradiation. Interestingly, the evaluation using numerical simulations indicates that aqueous radiation chemistry is not the only process driving the corrosion of copper in these systems.
Large-scale evidence of dependency length minimization in 37 languages
Futrell, Richard; Mahowald, Kyle; Gibson, Edward
2015-01-01
Explaining the variation between human languages and the constraints on that variation is a core goal of linguistics. In the last 20 y, it has been claimed that many striking universals of cross-linguistic variation follow from a hypothetical principle that dependency length—the distance between syntactically related words in a sentence—is minimized. Various models of human sentence production and comprehension predict that long dependencies are difficult or inefficient to process; minimizing dependency length thus enables effective communication without incurring processing difficulty. However, despite widespread application of this idea in theoretical, empirical, and practical work, there is not yet large-scale evidence that dependency length is actually minimized in real utterances across many languages; previous work has focused either on a small number of languages or on limited kinds of data about each language. Here, using parsed corpora of 37 diverse languages, we show that overall dependency lengths for all languages are shorter than conservative random baselines. The results strongly suggest that dependency length minimization is a universal quantitative property of human languages and support explanations of linguistic variation in terms of general properties of human information processing. PMID:26240370
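The quantity being minimized can be made concrete with a short sketch. The Python below (illustrative only, not the authors' corpus pipeline) computes total dependency length for one hand-built parse and compares it against a random-reordering baseline of the kind the study uses.

```python
import random

def total_dependency_length(heads):
    """Sum of |dependent - head| distances over a sentence.

    `heads` maps each 1-based word position to the position of its
    syntactic head; 0 marks the root, which adds no dependency.
    """
    return sum(abs(dep - head) for dep, head in heads.items() if head != 0)

def random_baseline(heads, trials=1000, seed=1):
    """Mean dependency length when the same tree is linearized
    in random word orders (a simple random baseline)."""
    n = len(heads)
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        order = list(range(1, n + 1))
        rng.shuffle(order)
        pos = {word: i for i, word in enumerate(order)}
        total += sum(abs(pos[d] - pos[h])
                     for d, h in heads.items() if h != 0)
    return total / trials

# "John threw out the trash": threw (word 2) is the root;
# John->threw, out->threw, the->trash, trash->threw
heads = {1: 2, 2: 0, 3: 2, 4: 5, 5: 2}
print(total_dependency_length(heads))   # attested order: length 6
print(random_baseline(heads))           # random-order baseline
```

In line with the paper's finding, the attested order (length 6) is shorter than the random baseline (about 8 for this tree).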
Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián
2009-01-01
This paper is an experimental study of the utilization of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time, the hybrid, and the event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each one of the hybrid and the pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are independently analyzed. The experimental study provides new proof concerning the ability of event-based control strategies to minimize the data exchange among the control agents (sensors, controllers, actuators) when an error-free control of the process is not a hard requirement. PMID:22399975
Towards rapid prototyped convective microfluidic DNA amplification platform
NASA Astrophysics Data System (ADS)
Ajit, Smrithi; Praveen, Hemanth Mithun; Puneeth, S. B.; Dave, Abhishek; Sesham, Bharat; Mohan, K. N.; Goel, Sanket
2017-02-01
Today, Polymerase Chain Reaction (PCR) based DNA amplification plays an indispensable role in the field of biomedical research. Its inherent ability to exponentially amplify sample DNA has proven useful for the identification of virulent pathogens like those causing Multiple Drug-Resistant Tuberculosis (MDR-TB). The intervention of microfluidics technology has transformed PCR from a laborious and time consuming process into one that is faster, easily portable, and capable of being multifunctional. Microfluidics based PCR outweighs its traditional counterpart in terms of flexibility in varying the reaction rate, operational simplicity, need for only a fraction of the sample volume, and capability of being integrated with other functional elements. The scope of the present work involves the development of a real-time continuous flow microfluidic device, fabricated by a 3D printing-governed rapid prototyping method, eventually leading to an automated and robust platform to process multiple DNA samples for detection of MDR-TB-associated mutations. The thermal gradient characteristic of the PCR process is produced using Peltier units appropriate to the microfluidic environment, fully monitored and controlled by a low cost controller driven by a data acquisition system. The process efficiency achieved in the microfluidic environment, in terms of output per cycle, is expected to be on par with traditional PCR, with the additional advantages of being faster and minimizing sample handling.
Absolute Paleointensity Techniques: Developments in the Last 10 Years (Invited)
NASA Astrophysics Data System (ADS)
Bowles, J. A.; Brown, M. C.
2009-12-01
The ability to determine variations in absolute intensity of the Earth’s paleomagnetic field has greatly enhanced our understanding of geodynamo processes, including secular variation and field reversals. Igneous rocks and baked clay artifacts that carry a thermal remanence (TRM) have allowed us to study field variations over timescales ranging from decades to billions of years. All absolute paleointensity techniques are fundamentally based on repeating the natural process by which the sample acquired its magnetization, i.e. a laboratory TRM is acquired in a controlled field, and the ratio of the natural TRM to that acquired in the laboratory is directly proportional to the ancient field. Techniques for recovering paleointensity have evolved since the 1930s from relatively unsophisticated (but revolutionary for their time) single step remagnetizations to the various complicated, multi-step procedures in use today. These procedures can be broadly grouped into two categories: 1) “Thellier-type” experiments that step-wise heat samples at a series of temperatures up to the maximum unblocking temperature of the sample, progressively removing the natural remanence (NRM) and acquiring a laboratory-induced TRM; and 2) “Shaw-type” experiments that combine alternating field demagnetization of the NRM and laboratory TRM with a single heating to a temperature above the sample’s Curie temperature, acquiring a total TRM in one step. Many modifications to these techniques have been developed over the years with the goal of identifying and/or accommodating non-ideal behavior, such as alteration and multi-domain (MD) remanence, which may lead to inaccurate paleofield estimates. From a technological standpoint, perhaps the most significant development in the last decade is the use of microwave (de)magnetization in both Thellier-type and Shaw-type experiments. 
By using microwaves to directly generate spin waves within the magnetic grains (rather than using phonons generated by heating, which then exchange energy with the magnetic system), a TRM can be acquired with minimal heating of the bulk sample, thus potentially minimizing sample alteration. The theory of TRM acquisition is best developed for single-domain (SD) grains, and most paleointensity techniques are predicated on the assumption that the remanence is carried predominantly by SD material. Because the vast majority of geological materials are characterized by a larger magnetic grain size, efforts to expand paleointensity studies over the past decade have focused on developing TRM theories and paleointensity methods for pseudo-single-domain (PSD) and MD samples. Other workers have been exploring the potential of SD materials that were not traditionally used in paleointensity studies, such as ash flow tuffs, submarine basaltic glass, and single silicate crystals with magnetite inclusions. The latter has the potential to shed light on early Earth processes, given that the fine-grained inclusions may be resistant to alteration over long time scales. We will review the major paleointensity techniques in use today, with special attention paid to the advantages and disadvantages of each. Techniques will be illustrated with examples highlighting new paleointensity applications to geologic processes at a variety of timescales.
Caplan, David; Michaud, Jennifer; Hufford, Rebecca
2013-01-01
Sixty-one persons with aphasia (PWA) were tested on syntactic comprehension in three tasks: sentence-picture matching, sentence-picture matching with auditory moving-window presentation, and object manipulation. There were significant correlations of performances on sentences across tasks. First factors in unrotated factor analyses accounted for most of the variance in each task, with all sentence types loading on them. Dissociations in performance between sentence types that differed minimally in their syntactic structures were not consistent across tasks. These results replicate previous results with smaller samples and provide important validation of basic aspects of aphasic performance in this area of language processing. They point to the role of a reduction in processing resources and of the interaction of task demands with parsing and interpretive abilities in the genesis of patient performance. PMID:24061104
RCT: Module 2.03, Counting Errors and Statistics, Course 8768
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hillmer, Kurt T.
2017-04-01
Radiological sample analysis involves the observation of a random process that may or may not occur and an estimation of the amount of radioactive material present based on that observation. Across the country, radiological control personnel are using activity measurements to make decisions that may affect the health and safety of workers at those facilities and their surrounding environments. This course will present an overview of measurement processes, a statistical evaluation of both measurements and equipment performance, and some actions to take to minimize the sources of error in count room operations. This course will prepare the student with the skills necessary for radiological control technician (RCT) qualification by passing quizzes, tests, and the RCT Comprehensive Phase 1, Unit 2 Examination (TEST 27566) and by providing in-the-field skills.
Finite Element Analysis in Concurrent Processing: Computational Issues
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw; Watson, Brian; Vanderplaats, Garrett
2004-01-01
The purpose of this research is to investigate the potential application of new methods for solving large-scale static structural problems on concurrent computers. It is well known that traditional single-processor computational speed will be limited by inherent physical limits. The only path to achieve higher computational speeds lies through concurrent processing. Traditional factorization solution methods for sparse matrices are ill suited for concurrent processing because the null entries get filled, leading to high communication and memory requirements. The research reported herein investigates alternatives to factorization that promise a greater potential to achieve high concurrent computing efficiency. Two methods, and their variants, based on direct energy minimization are studied: a) minimization of the strain energy using the displacement method formulation; b) constrained minimization of the complementary strain energy using the force method formulation. Initial results indicated that in the context of the direct energy minimization the displacement formulation experienced convergence and accuracy difficulties while the force formulation showed promising potential.
Investigation of Flow Conditioners for Compact Jet Engine Simulator Rig Noise Reduction
NASA Technical Reports Server (NTRS)
Doty, Michael J.; Haskin, Henry H.
2011-01-01
The design requirements for two new Compact Jet Engine Simulator (CJES) units for upcoming wind tunnel testing lead to the distinct possibility of rig noise contamination. The acoustic and aerodynamic properties of several flow conditioner devices are investigated over a range of operating conditions relevant to the CJES units to mitigate the risk of rig noise. An impinging jet broadband noise source is placed in the upstream plenum of the test facility permitting measurements of not only flow conditioner self-noise, but also noise attenuation characteristics. Several perforated plate and honeycomb samples of high porosity show minimal self-noise but also minimal attenuation capability. Conversely, low porosity perforated plate and sintered wire mesh conditioners exhibit noticeable attenuation but also unacceptable self-noise. One fine wire mesh sample (DP450661) shows minimal self-noise and reasonable attenuation, particularly when combined in series with a 15.6 percent open area (POA) perforated plate upstream. This configuration is the preferred flow conditioner system for the CJES, providing up to 20 dB of broadband attenuation capability with minimal self-noise.
Improved ultrasonic TV images achieved by use of Lamb-wave orientation technique
NASA Technical Reports Server (NTRS)
Berger, H.
1967-01-01
Lamb-wave sample orientation technique minimizes the interference from standing waves in continuous wave ultrasonic television imaging techniques used with thin metallic samples. The sample under investigation is oriented such that the wave incident upon it is not normal, but slightly angled.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanjilal, Oindrila, E-mail: oindrila@civil.iisc.ernet.in; Manohar, C.S., E-mail: manohar@civil.iisc.ernet.in
The study considers the problem of simulation based time variant reliability analysis of nonlinear randomly excited dynamical systems. Attention is focused on importance sampling strategies based on the application of Girsanov's transformation method. Controls which minimize the distance function, as in the first order reliability method (FORM), are shown to minimize a bound on the sampling variance of the estimator for the probability of failure. Two schemes based on the application of calculus of variations for selecting control signals are proposed: the first obtains the control force as the solution of a two-point nonlinear boundary value problem, and the second explores the application of the Volterra series in characterizing the controls. The relative merits of these schemes, vis-à-vis the method based on ideas from the FORM, are discussed. Illustrative examples, involving archetypal single degree of freedom (dof) nonlinear oscillators, and a multi-degree of freedom nonlinear dynamical system, are presented. The credentials of the proposed procedures are established by comparing the solutions with pertinent results from direct Monte Carlo simulations. Highlights: • The distance minimizing control forces minimize a bound on the sampling variance. • Establishing Girsanov controls via solution of a two-point boundary value problem. • Girsanov controls via Volterra's series representation for the transfer functions.
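The variance-reduction idea behind the Girsanov controls can be illustrated with a static analogue. The Python sketch below (a toy under stated assumptions, not the paper's method) estimates a small Gaussian exceedance probability by shifting the sampling density to the FORM-style design point and reweighting by the likelihood ratio, which is what the distance-minimizing control accomplishes for the dynamical problem.

```python
import math
import random

def failure_prob_is(threshold, n=100000, shift=None, seed=0):
    """Estimate P(X > threshold) for X ~ N(0, 1) by importance sampling
    from the mean-shifted proposal N(shift, 1).

    For a scalar Gaussian, the FORM design point is simply the
    threshold itself, so that is the default shift.
    """
    if shift is None:
        shift = threshold
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if x > threshold:
            # likelihood ratio phi(x) / phi(x - shift)
            total += math.exp(-shift * x + 0.5 * shift * shift)
    return total / n

print(failure_prob_is(3.0))  # close to the true value 1 - Phi(3) ~ 1.35e-3
```

With the same 10^5 samples, crude Monte Carlo would see only about 135 failures; the shifted proposal makes failures common and corrects for the bias through the weights, which is the essence of the Girsanov-based schemes described above.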
Alternative techniques for high-resolution spectral estimation of spectrally encoded endoscopy
NASA Astrophysics Data System (ADS)
Mousavi, Mahta; Duan, Lian; Javidi, Tara; Ellerbee, Audrey K.
2015-09-01
Spectrally encoded endoscopy (SEE) is a minimally invasive optical imaging modality capable of fast confocal imaging of internal tissue structures. Modern SEE systems use coherent sources to image deep within the tissue and data are processed similar to optical coherence tomography (OCT); however, standard processing of SEE data via the Fast Fourier Transform (FFT) leads to degradation of the axial resolution as the bandwidth of the source shrinks, resulting in a well-known trade-off between speed and axial resolution. Recognizing the limitation of FFT as a general spectral estimation algorithm to only take into account samples collected by the detector, in this work we investigate alternative high-resolution spectral estimation algorithms that exploit information such as sparsity and the general region position of the bulk sample to improve the axial resolution of processed SEE data. We validate the performance of these algorithms using both MATLAB simulations and analysis of experimental results generated from a home-built OCT system to simulate an SEE system with variable scan rates. Our results open a new door towards using non-FFT algorithms to generate higher quality (i.e., higher resolution) SEE images at correspondingly fast scan rates, resulting in systems that are more accurate and more comfortable for patients due to the reduced image time.
Huang, J; Loeffler, M; Muehle, U; Moeller, W; Mulders, J J L; Kwakman, L F Tz; Van Dorp, W F; Zschech, E
2018-01-01
A Ga focused ion beam (FIB) is often used in transmission electron microscopy (TEM) analysis sample preparation. In the case of a crystalline Si sample, an amorphous near-surface layer is formed by the FIB process. In order to optimize the FIB recipe by minimizing the amorphization, it is important to predict the amorphous layer thickness from simulation. Molecular Dynamics (MD) simulation has been used to describe the amorphization; however, it is limited by computational power for a realistic FIB process simulation. On the other hand, Binary Collision Approximation (BCA) simulation can treat the ion-solid interaction process at a realistic scale and has been used to do so. In this study, a Point Defect Density approach is introduced into a dynamic BCA simulation, considering dynamic ion-solid interactions. We used this method to predict the c-Si amorphization caused by FIB milling on Si. To validate the method, dedicated TEM studies were performed. They show that the amorphous layer thickness predicted by the numerical simulation is consistent with the experimental data. In summary, the thickness of the near-surface Si amorphization layer caused by FIB milling can be well predicted using the Point Defect Density approach within the dynamic BCA model. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ramamoorthy, Sripriya; Zhang, Yuan; Petrie, Tracy; Fridberger, Anders; Ren, Tianying; Wang, Ruikang; Jacques, Steven L.; Nuttall, Alfred L.
2016-02-01
Sound processing in the inner ear involves separation of the constituent frequencies along the length of the cochlea. Frequencies relevant to human speech (100 to 500 Hz) are processed in the apex region. Among mammals, the guinea pig cochlear apex processes similar frequencies and is thus relevant for the study of speech processing in the cochlea. However, the requirement for extensive surgery has challenged the optical accessibility of this area to investigate cochlear processing of signals without significant intrusion. A simple method is developed to provide optical access to the guinea pig cochlear apex in two directions with minimal surgery. Furthermore, all prior vibration measurements in the guinea pig apex involved opening an observation hole in the otic capsule, which has been questioned on the basis of the resulting changes to cochlear hydrodynamics. Here, this limitation is overcome by measuring the vibrations through the unopened otic capsule using phase-sensitive Fourier domain optical coherence tomography. The optically and surgically advanced method described here lays the foundation to perform minimally invasive investigation of speech-related signal processing in the cochlea.
Thermal Diffusivity for III-VI Semiconductor Melts at Different Temperatures
NASA Technical Reports Server (NTRS)
Ban, H.; Li, C.; Lin, B.; Emoto, K.; Scripa, R. N.; Su, C.-H.; Lehoczky, S. L.
2004-01-01
The change of the thermal properties of semiconductor melts reflects the structural changes inside the melts, and a fundamental understanding of this structural transformation is essential for high quality semiconductor crystal growth process. This paper focused on the technical development and the measurement of thermal properties of III-VI semiconductor melts at high temperatures. Our previous work has improved the laser flash method for the specialized quartz sample cell. In this paper, we reported the results of our recent progress in further improvements of the measurement system by minimizing the free convection of the melt, adding a front IR detector, and placing the sample cell in a vacuum environment. The results for tellurium and selenium based compounds, some of which have never been reported in the literature, were obtained at different temperatures as a function of time. The data were compared with other measured thermophysical properties to shed light on the structural transformations of the melt.
Mode switching in volcanic seismicity: El Hierro 2011-2013
NASA Astrophysics Data System (ADS)
Roberts, Nick S.; Bell, Andrew F.; Main, Ian G.
2016-05-01
The Gutenberg-Richter b value is commonly used in volcanic eruption forecasting to infer material or mechanical properties from earthquake distributions. Such studies typically analyze discrete time windows or phases, but the choice of such windows is subjective and can introduce significant bias. Here we minimize this sample bias by iteratively sampling catalogs with randomly chosen windows and then stack the resulting probability density functions for the estimated b̃ value to determine a net probability density function. We examine data from the El Hierro seismic catalog during a period of unrest in 2011-2013 and demonstrate clear multimodal behavior. Individual modes are relatively stable in time, but the most probable b̃ value intermittently switches between modes, one of which is similar to that of tectonic seismicity. Multimodality is primarily associated with intermittent activation and cessation of activity in different parts of the volcanic system rather than with any systematic inferred underlying process.
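The stacking procedure can be sketched in a few lines. The Python below (illustrative only; the catalog and parameter choices are invented) uses the standard Aki maximum-likelihood b-value estimator and collects estimates from randomly placed catalog windows, whose distribution would then be stacked into a net probability density.

```python
import math
import random

def b_value(mags, m_min):
    """Aki maximum-likelihood estimate: b = log10(e) / (mean(M) - m_min).

    This is the continuous-magnitude form, ignoring the binning
    correction to m_min.
    """
    mags = [m for m in mags if m >= m_min]
    mean_excess = sum(mags) / len(mags) - m_min
    return math.log10(math.e) / mean_excess

def windowed_b_values(mags, m_min, n_windows=200, window=100, seed=0):
    """b estimates from randomly positioned windows of the catalog,
    ready to be stacked into a net probability density."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_windows):
        start = rng.randrange(0, len(mags) - window + 1)
        estimates.append(b_value(mags[start:start + window], m_min))
    return estimates

# Synthetic unimodal catalog with a known b value of 1.0
rng = random.Random(42)
b_true, m_min = 1.0, 2.0
mags = [m_min - math.log(rng.random()) / (b_true * math.log(10))
        for _ in range(5000)]
print(round(b_value(mags, m_min), 2))  # close to 1.0
```

On a multimodal catalog like El Hierro's, the windowed estimates would cluster around several distinct values rather than one, which is the behavior the study detects by stacking.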
NASA Astrophysics Data System (ADS)
Hubert, Maxime; Pacureanu, Alexandra; Guilloud, Cyril; Yang, Yang; da Silva, Julio C.; Laurencin, Jerome; Lefebvre-Joud, Florence; Cloetens, Peter
2018-05-01
In X-ray tomography, ring-shaped artifacts present in the reconstructed slices are an inherent problem degrading the global image quality and hindering the extraction of quantitative information. To overcome this issue, we propose a strategy for suppression of ring artifacts originating from the coherent mixing of the incident wave and the object. We discuss the limits of validity of the empty beam correction in the framework of a simple formalism. We then deduce a correction method based on two-dimensional random sample displacement, with minimal cost in terms of spatial resolution, acquisition, and processing time. The method is demonstrated on bone tissue and on a hydrogen electrode of a ceramic-metallic solid oxide cell. Compared to the standard empty beam correction, we obtain high quality nanotomography images revealing detailed object features. The resulting absence of artifacts allows straightforward segmentation and posterior quantification of the data.
Chemical anesthesia of Northern sea otters (Enhydra lutris): Results of past field studies
Monson, Daniel H.; McCormick, C.; Ballachey, Brenda E.
2001-01-01
Between 1987 and 1997, we chemically immobilized 597 wild sea otters (Enhydra lutris) in Alaska for the collection of biological samples or for surgical instrumentation. One drug-related sea otter fatality occurred during this time. Fentanyl in combination with diazepam produced consistent, smooth inductions with minimal need for supplemental anesthetics during procedures lasting 30-40 min. Antagonism with naltrexone or naloxone was rapid and complete, although we observed narcotic recycling in sea otters treated with naloxone. For surgical procedures, we recommend a fentanyl target dose of 0.33 mg/kg of body mass and diazepam at 0.11 mg/kg. For nonsurgical biological sample collection procedures, we recommend fentanyl at 0.22 mg/kg and diazepam at 0.07 mg/kg. We advise the use of the opioid antagonist naltrexone at a ratio of 2:1 to the total fentanyl administered during processing.
Microsystems for the Capture of Low-Abundance Cells
NASA Astrophysics Data System (ADS)
Dharmasiri, Udara; Witek, Małgorzata A.; Adams, Andre A.; Soper, Steven A.
2010-07-01
Efficient selection and enumeration of low-abundance biological cells are highly important in a variety of applications. For example, the clinical utility of circulating tumor cells (CTCs) in peripheral blood is recognized as a viable biomarker for the management of various cancers, in which the clinically relevant number of CTCs per 7.5 ml of blood is two to five. Although there are several methods for isolating rare cells from a variety of heterogeneous samples, such as immunomagnetic-assisted cell sorting and fluorescence-activated cell sorting, they are fraught with challenges. Microsystem-based technologies are providing new opportunities for selecting and isolating rare cells from complex, heterogeneous samples. Such approaches involve reductions in target-cell loss, process automation, and minimization of contamination issues. In this review, we introduce different application areas requiring rare cell analysis, conventional techniques for their selection, and finally microsystem approaches for low-abundance-cell isolation and enumeration.
An efficient method for purifying high quality RNA from wheat pistils.
Manickavelu, A; Kambara, Kumiko; Mishina, Kohei; Koba, Takato
2007-02-15
Many methods are available for total RNA extraction from plants, but few suit floral organs like wheat pistils, which contain high levels of polysaccharides that bind to or co-precipitate with RNA. In this protocol, a simple and effective method for extracting total RNA from small and feathery wheat pistils has been developed. Lithium chloride (LiCl) and phenol:chloroform:isoamylalcohol (PCI) were employed, and the samples were ground in a microcentrifuge tube using a plastic pestle. A jacket of liquid nitrogen and simplified procedures were applied to ensure thorough grinding of the pistils and to minimize sample loss. These measures substantially increased the recovery of total RNA (approximately 50%) in the extraction process. Reliable differential display by cDNA-AFLP was successfully achieved with the total RNA after DNase treatment and reverse transcription. This method is also practicable for gene expression and gene regulation studies in floral parts of other plants.
A new method for stable lead isotope extraction from seawater.
Zurbrick, Cheryl M; Gallon, Céline; Flegal, A Russell
2013-10-24
A new technique for stable lead (Pb) isotope extraction from seawater is established using Toyopearl AF-Chelate 650M® resin (Tosoh Bioscience LLC). This new method is advantageous because it is semi-automated and relatively fast; in addition, it introduces a relatively low blank by minimizing the volume of chemicals used in the extraction. Subsequent analyses by HR ICP-MS have a good relative external precision (2σ) of 3.5‰ for ²⁰⁶Pb/²⁰⁷Pb, while analyses by MC-ICP-MS have a better relative external precision of 0.6‰. However, Pb sample concentrations limit MC-ICP-MS analyses to ²⁰⁶Pb, ²⁰⁷Pb, and ²⁰⁸Pb. The method was validated by processing the common Pb isotope reference material NIST SRM-981 and several GEOTRACES intercalibration samples, followed by analyses by HR ICP-MS, all of which showed good agreement with previously reported values.
Label-free optical imaging of membrane patches for atomic force microscopy
Churnside, Allison B.; King, Gavin M.; Perkins, Thomas T.
2010-01-01
In atomic force microscopy (AFM), finding sparsely distributed regions of interest can be difficult and time-consuming. Typically, the tip is scanned until the desired object is located. This process can mechanically or chemically degrade the tip, as well as damage fragile biological samples. Protein assemblies can be detected using the back-scattered light from a focused laser beam. We previously used back-scattered light from a pair of laser foci to stabilize an AFM. In the present work, we integrate these techniques to optically image patches of purple membranes prior to AFM investigation. These rapidly acquired optical images were aligned to the subsequent AFM images to ~40 nm, since the tip position was aligned to the optical axis of the imaging laser. Thus, this label-free imaging efficiently locates sparsely distributed protein assemblies for subsequent AFM study while simultaneously minimizing degradation of the tip and the sample. PMID:21164738
Spectral evidence for amorphous silicates in least-processed CO meteorites and their parent bodies
NASA Astrophysics Data System (ADS)
McAdam, Margaret M.; Sunshine, Jessica M.; Howard, Kieren T.; Alexander, Conel M.; McCoy, Timothy J.; Bus, Schelte J.
2018-05-01
Least-processed carbonaceous chondrites (carbonaceous chondrites that have experienced minimal aqueous alteration and thermal metamorphism) are characterized by their predominately amorphous iron-rich silicate interchondrule matrices and chondrule rims. This material is highly susceptible to destruction by the parent body processes of thermal metamorphism or aqueous alteration. The presence of abundant amorphous material in a meteorite indicates that the parent body, or at least a region of the parent body, experienced minimal processing since the time of accretion. The CO chemical group of carbonaceous chondrites has a significant number of these least-processed samples. We present visible/near-infrared and mid-infrared spectra of eight least-processed CO meteorites (petrologic type 3.0-3.1). In the visible/near-infrared, these COs are characterized by a broad weak feature that was first observed by Cloutis et al. (2012) to be at 1.3 μm and attributed to iron-rich amorphous silicate matrix materials. This feature is observed to be centered at 1.4 μm for terrestrially unweathered, least-processed CO meteorites. At mid-infrared wavelengths, a 21-μm feature, consistent with Si-O vibrations of amorphous materials and glasses, is also present. The spectral features of iron-rich amorphous silicate matrix are absent in both the near- and mid-infrared spectra of higher metamorphic grade COs because this material has recrystallized as crystalline olivine. Furthermore, spectra of least-processed primitive meteorites from other chemical groups (CRs, MET 00426 and QUE 99177, and C2-ungrouped Acfer 094) also exhibit a 21-μm feature. Thus, we conclude that the 1.4- and 21-μm features are characteristic of primitive least-processed meteorites from all chemical groups of carbonaceous chondrites. Finally, we present an IRTF + SPeX observation of asteroid (93) Minerva that has spectral similarities in the visible/near-infrared to the least-processed CO carbonaceous chondrites.
While Minerva is not the only CO-like asteroid (e.g., Burbine et al., 2001), Minerva is likely the least-processed CO-like asteroid observed to date.
NASA Astrophysics Data System (ADS)
Hengl, Tomislav
2015-04-01
Efficiency of spatial sampling largely determines success of model building. This is especially important for geostatistical mapping, where an initial sampling plan should provide a good representation or coverage of both geographical space (defined by the study area mask map) and feature space (defined by the multi-dimensional covariates). Otherwise the model will need to extrapolate and, hence, the overall uncertainty of the predictions will be high. In many cases, geostatisticians use point data sets which are produced using unknown or inconsistent sampling algorithms. Many point data sets in environmental sciences suffer from spatial clustering and systematic omission of feature space. But how can these 'representation' problems be quantified, and how can this knowledge be incorporated into model building? The author has developed a generic function called 'spsample.prob' (in the Global Soil Information Facilities package for R) which simultaneously determines (effective) inclusion probabilities as an average between a kernel density estimation (geographical spreading of points; analysed using the spatstat package in R) and a MaxEnt analysis (feature-space spreading of points; analysed using the MaxEnt software used primarily for species distribution modelling). The output 'iprob' map indicates whether the sampling plan has systematically missed some important locations and/or features, and can also be used as an input for geostatistical modelling, e.g. as a weight map for geostatistical model fitting. The spsample.prob function can also be used in combination with accessibility analysis (the cost of a field survey is usually a function of distance from the road network, slope, and land cover) to allow for simultaneous maximization of average inclusion probabilities and minimization of total survey costs.
The author postulates that, by estimating effective inclusion probabilities using combined geographical and feature space analysis, and by comparing survey costs to representation efficiency, an optimal initial sampling plan can be produced which satisfies both criteria: (a) good representation (i.e. within a tolerance threshold), and (b) minimized survey costs. This sampling analysis framework could become especially interesting for generating sampling plans in new areas e.g. for which no previous spatial prediction model exists. The presentation includes data processing demos with standard soil sampling data sets Ebergotzen (Germany) and Edgeroi (Australia), also available via the GSIF package.
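The core idea of the abstract's 'spsample.prob' function can be sketched outside R. The following Python snippet is an illustrative approximation only: it averages a geographic kernel density with a feature-space density (a plain KDE standing in for the MaxEnt analysis used in the actual function), and the data, variable names, and the 0.2 mixing weight of one-half are assumptions, not from the GSIF package.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# 50 hypothetical sample points: (x, y) coordinates plus one covariate
xy = rng.uniform(0, 100, size=(2, 50))           # geographic locations
covariate = 0.05 * xy[0] + rng.normal(0, 1, 50)  # feature-space values

geo_density = gaussian_kde(xy)(xy)                  # geographic spreading
feat_density = gaussian_kde(covariate)(covariate)   # feature-space spreading

def norm(d):
    """Rescale a density vector to [0, 1]."""
    return (d - d.min()) / (d.max() - d.min())

# Effective inclusion probability as the average of both spreadings,
# mimicking the 'iprob' output described in the abstract
iprob = 0.5 * (norm(geo_density) + norm(feat_density))
print(iprob.min(), iprob.max())  # low values flag under-represented points
```

Points with low `iprob` sit in sparsely sampled regions of geographic or feature space, which is exactly the diagnostic the abstract describes.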
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hilaly, A.K.; Sikdar, S.K.
In this study, the authors introduced several modifications to the WAR (waste reduction) algorithm developed earlier. These modifications were made for systematically handling sensitivity analysis and various tasks of waste minimization. A design hierarchy was formulated to promote appropriate waste reduction tasks at designated levels of the hierarchy. A sensitivity coefficient was used to measure the relative impacts of process variables on the pollution index of a process. The use of the WAR algorithm was demonstrated by a fermentation process for making penicillin.
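The sensitivity coefficient described above can be illustrated with a finite-difference sketch. This is not the published WAR formulation; the relative form S_i = (Δφ/φ)/(Δx_i/x_i), the toy pollution index, and all names here are assumptions for illustration.

```python
def sensitivity(pollution_index, x, i, rel_step=0.01):
    """Relative sensitivity of a pollution index to variable x[i]:
    fractional change in the index per fractional change in x[i]."""
    base = pollution_index(x)
    perturbed = list(x)
    perturbed[i] *= (1.0 + rel_step)
    return ((pollution_index(perturbed) - base) / base) / rel_step

# Toy pollution index for a hypothetical fermentation process:
# waste scales with feed rate x[0] and falls with conversion x[1].
def phi(x):
    feed, conversion = x
    return feed * (1.0 - conversion)

x = [100.0, 0.8]
print(sensitivity(phi, x, 0))  # positive: index rises with feed rate
print(sensitivity(phi, x, 1))  # negative: higher conversion cuts waste
```

Ranking variables by the magnitude of such coefficients is one way to direct waste-minimization effort toward the process variables with the largest impact on the index.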
Synthesis and polymorphic control for visible light active titania nanoparticles
NASA Astrophysics Data System (ADS)
Kaewgun, Sujaree
Titania (TiO2) is useful for many applications in photocatalysis, antimicrobials, pigments, deodorization, and decomposition of harmful organics and undesirable compounds in air and wastewater under UV irradiation. Among the three phases of TiO2, rutile, anatase, and brookite, studies have focused more on the anatase and rutile phases. Pure brookite is the most difficult phase to prepare, even under hydrothermal conditions. Predominantly brookite phase TiO2 nanoparticles were prepared by the Water-based Ambient Condition Sol (WACS) process in our laboratory. The objectives of this research were to enhance the visible light active (VLA) photocatalytic properties of polymorphic brookite TiO2 by minimizing the lattice defects and narrowing the band gap of titania with nitrogen and/or carbon chromophores, and to investigate the deactivation, reusability, and regeneration of the VLA titania in order to design better titania catalysts for organic compound degradation applications. In order to study the influence of hydroxyl content on the photocatalytic activities (PCAs) of polymorphic titania nanoparticles, the WACS samples were post-treated by a Solvent-based Ambient Condition Sol (SACS) process in sec-butanol (sec-BuOH). All samples were characterized for phase composition, surface area, hydroxyl contamination, and particle morphology by x-ray diffraction, N2 physisorption, FT-IR, solid-state ¹H NMR and scanning electron microscopy, and then compared to a commercial titania, Degussa P25. Evaluation of methyl orange (MO) degradation under UV irradiation showed that the lower lattice hydroxyl content in SACS titania enhanced the PCA. As-prepared titania and SACS samples, which have similar surface areas and crystallinity, were compared in order to prove that the superior PCA came from the reduction in the lattice hydroxyl content.
To enhance PCA and VLA properties of WACS, an alternative high boiling point polar solvent, N-methylpyrrolidone (NMP), was utilized in the SACS process at a higher treatment temperature to modify polymorphic titania nanoparticles. This SACS sample was called "SACS-NMP". SACS, using NMP as the solvent, could also extract lattice hydroxyls, and decorate nitrogen on the titania surface. The PCA of SACS-NMP was superior to that of SACS-sec-BuOH. Nitrogen incorporation of SACS-NMP titania was investigated by CHN analysis and x-ray photoelectron spectroscopy (XPS). VL absorbance for all samples was characterized by UV-Vis absorption spectrophotometry. PCA of MO degradation under UV and VL showed that SACS-NMP is a powerful treatment to enhance PCA by minimizing lattice hydroxyls and doping the titania surface with nitrogen. The effect of calcination conditions on SACS-NMP samples was also studied. The calcination conditions, especially the temperature and calcination atmosphere, have an influence on the BET surface area, crystallite size, titania phase content, and PCA under VL irradiation. SACS-NMP samples calcined in air at 200°C for 2 hours showed the best VL activated photocatalytic performance in this research. Additionally, the SACS-NMP sample exhibited superior VL properties to several available reference anatase titania samples. This could be explained as the effective charge separation by the intercrystalline electron transport from brookite to anatase grains complemented by strong VL absorption by the nitrogen species in NMP. The deactivation and regeneration of the VLA titania were investigated and compared to a commercial titania, Kronos VLP7000. PCA of the titania under VL for MO decolorization gradually decreased with increasing testing time and the number of runs. The cause of the deactivation was identified as the deposition of the decomposed MO or the carbonaceous deposit. 
Among the possible regeneration procedures for used SACS-NMP samples, methanol washing was shown to be the most effective, recovering up to ~80% of the PCA. Accordingly, the SACS-NMP samples could not be completely recovered, since a regeneration process would possibly remove some of the nitrogen species responsible for the VL properties.
NASA Astrophysics Data System (ADS)
Zhang, Xingong; Yin, Yunqiang; Wu, Chin-Chia
2017-01-01
There is a situation found in many manufacturing systems, such as steel rolling mills, fire fighting, or single-server cycle-queues, where a job that is processed later consumes more time than the same job processed earlier. Machine maintenance can counteract this worsening of processing conditions: after a maintenance activity, the machine is restored. The maintenance duration is a positive, non-decreasing, differentiable convex function of the total processing time of the jobs between maintenance activities. Motivated by this observation, the makespan and total completion time minimization problems in the scheduling of jobs with non-decreasing rates of job processing time on a single machine are considered in this article. It is shown that both the makespan and the total completion time minimization problems are NP-hard in the strong sense when the number of maintenance activities is arbitrary, while the makespan minimization problem is NP-hard in the ordinary sense when the number of maintenance activities is fixed. If the deterioration rates of the jobs are identical and the maintenance duration is a linear function of the total processing time of the jobs between maintenance activities, then this article shows that the group balance principle is satisfied for the makespan minimization problem. Furthermore, two polynomial-time algorithms are presented for solving the makespan problem and the total completion time problem under identical deterioration rates, respectively.
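The interaction between deterioration and maintenance described above can be sketched with a small simulator. This is only an illustration under assumed forms: a linear deterioration model p_j(t) = p_j + b·t measured from the last maintenance, and a quadratic (convex) maintenance duration; the paper's exact functions are not given in the abstract.

```python
def makespan(sequence, base_times, rate, maintain_after=None):
    """Makespan when a job's actual time grows with the processing
    accumulated since the last maintenance (assumed linear model)."""
    maintain_after = maintain_after or set()
    t = 0.0          # global clock
    since_fix = 0.0  # processing time accumulated since last maintenance
    for pos, j in enumerate(sequence):
        actual = base_times[j] + rate * since_fix  # deteriorated time
        t += actual
        since_fix += actual
        if pos in maintain_after:
            # assumed convex maintenance duration in accumulated load
            t += 0.01 * since_fix ** 2
            since_fix = 0.0  # machine restored
    return t

jobs = [3.0, 5.0, 2.0]
no_maint = makespan([0, 1, 2], jobs, rate=0.2)
with_maint = makespan([0, 1, 2], jobs, rate=0.2, maintain_after={0})
print(no_maint, with_maint)
```

In this toy instance the maintenance pays for itself: resetting the deterioration after the first job shortens the remaining jobs by more than the maintenance costs, which is the trade-off the scheduling problems in the article optimize.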
Drechsel, Lisa; Schulz, Martin; von Stetten, Felix; Moldovan, Carmen; Zengerle, Roland; Paust, Nils
2015-02-07
Lab-on-a-chip devices hold promise for automation of complex workflows from sample to answer with minimal consumption of reagents in portable devices. However, complex, inhomogeneous samples as they occur in environmental or food analysis may block microchannels and thus often cause malfunction of the system. Here we present the novel AutoDip platform which is based on the movement of a solid phase through the reagents and sample instead of transporting a sequence of reagents through a fixed solid phase. A ball-pen mechanism operated by an external actuator automates unit operations such as incubation and washing by consecutively dipping the solid phase into the corresponding liquids. The platform is applied to electrochemical detection of organophosphorus pesticides in real food samples using an acetylcholinesterase (AChE) biosensor. Minimal sample preparation and an integrated reagent pre-storage module hold promise for easy handling of the assay. Detection of the pesticide chlorpyrifos-oxon (CPO) spiked into apple samples at concentrations of 10⁻⁷ M has been demonstrated. This concentration is below the maximum residue level for chlorpyrifos in apples defined by the European Commission.
(I Can’t Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: “random chance,” which is based on probability sampling, “minimal information,” which yields at least one new code per sampling step, and “maximum information,” which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximum information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimal information scenario. PMID:28746358
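The simulation logic behind this abstract can be sketched as a small Monte Carlo experiment. The sketch below is illustrative only: the population size, the number of codes, and the 0.2 code probability are assumptions, and it contrasts just two of the three scenarios (random chance versus a greedy maximum-information picker).

```python
import random

random.seed(42)

N_CODES = 30
# Each hypothetical information source holds a random subset of codes.
sources = [{c for c in range(N_CODES) if random.random() < 0.2}
           for _ in range(500)]

def random_chance(sources):
    """Sample sources uniformly at random until every code is seen."""
    seen, steps = set(), 0
    pool = list(sources)
    random.shuffle(pool)
    for s in pool:
        steps += 1
        seen |= s
        if len(seen) == N_CODES:
            return steps  # saturation reached
    return None           # saturation not reached

def maximum_information(sources):
    """Greedily pick the source adding the most new codes each step."""
    seen, steps = set(), 0
    remaining = list(sources)
    while len(seen) < N_CODES and remaining:
        best = max(remaining, key=lambda s: len(s - seen))
        if not (best - seen):
            return None   # no source adds anything new
        remaining.remove(best)
        seen |= best
        steps += 1
    return steps

print(random_chance(sources), maximum_information(sources))
```

Running variants of this experiment over varying code probabilities reproduces the abstract's qualitative finding: purposive scenarios saturate in far fewer steps than random sampling, at the cost of fewer repeated observations per code.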