Almutairy, Meznah; Torng, Eric
2018-01-01
Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require large amounts of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets: the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling typically requires half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than for fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989
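To make the contrast between the two schemes concrete, they can be sketched as follows (an illustrative sketch, not the authors' implementation; the function names and toy sequence are hypothetical):

```python
# Fixed sampling keeps every w-th k-mer start position; minimizer sampling
# keeps the position of the lexicographically smallest k-mer in each window
# of w consecutive k-mers. Illustrative only.

def fixed_sample(seq, k, w):
    """Sample k-mer start positions at a fixed stride w."""
    return list(range(0, len(seq) - k + 1, w))

def minimizer_sample(seq, k, w):
    """Sample the position of the minimal k-mer in each window of w k-mers."""
    positions = set()
    n_kmers = len(seq) - k + 1
    for start in range(n_kmers - w + 1):
        window = range(start, start + w)
        positions.add(min(window, key=lambda i: seq[i:i + k]))
    return sorted(positions)

seq = "ACGTACGTGGCATTACG"
print(fixed_sample(seq, 4, 3))  # [0, 3, 6, 9, 12]
print(minimizer_sample(seq, 4, 3))
```

Fixed sampling keeps roughly one of every w database k-mers regardless of content, while the minimizer positions depend only on sequence content, which is what allows query k-mers to be sampled the same way.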
Intelligent Sampling of Hazardous Particle Populations in Resource-Constrained Environments
NASA Astrophysics Data System (ADS)
McCollough, J. P.; Quinn, J. M.; Starks, M. J.; Johnston, W. R.
2017-10-01
Sampling of anomaly-causing space environment drivers is necessary for both real-time operations and satellite design efforts, and optimizing measurement sampling helps minimize resource demands. Relating these measurements to spacecraft anomalies requires the ability to resolve spatial and temporal variability in the energetic charged particle hazard of interest. Here we describe a method for sampling particle fluxes informed by magnetospheric phenomenology so that, along a given trajectory, the variations from both temporal dynamics and spatial structure are adequately captured while minimizing oversampling. We describe the coordinates, sampling method, and specific regions and parameters employed. We compare resulting sampling cadences with data from spacecraft spanning the regions of interest during a geomagnetically active period, showing that the algorithm retains the gross features necessary to characterize environmental impacts on space systems in diverse orbital regimes while greatly reducing the amount of sampling required. This enables sufficient environmental specification within a resource-constrained context, such as limited telemetry bandwidth, processing requirements, and timeliness.
Forcino, Frank L; Leighton, Lindsey R; Twerdy, Pamela; Cahill, James F
2015-01-01
Community ecologists commonly perform multivariate techniques (e.g., ordination, cluster analysis) to assess patterns and gradients of taxonomic variation. A critical requirement for a meaningful statistical analysis is accurate information on the taxa found within an ecological sample. However, oversampling (too many individuals counted per sample) also comes at a cost, particularly for ecological systems in which identification and quantification are substantially more resource consuming than the field expedition itself. In such systems, an increasingly larger sample size will eventually yield diminishing returns in improving any pattern or gradient revealed by the data, but will also lead to continually increasing costs. Here, we examine 396 datasets: 44 previously published and 352 created datasets. Using meta-analytic and simulation-based approaches, we seek (1) to determine the minimal sample sizes required to produce robust multivariate statistical results when conducting abundance-based community ecology research, and (2) to determine the dataset parameters (i.e., evenness, number of taxa, number of samples) that require larger sample sizes, regardless of resource availability. We found that in the 44 previously published and the 220 created datasets with randomly chosen abundances, a conservative estimate of a sample size of 58 produced the same multivariate results as all larger sample sizes. However, this minimal number varies as a function of evenness, with increased evenness resulting in increased minimal sample sizes. Sample sizes as small as 58 individuals are sufficient for a broad range of multivariate abundance-based research. In cases when resource availability is the limiting factor for conducting a project (e.g., a small university, time to conduct the research project), statistically viable results can still be obtained with less of an investment.
Shashidhar, Ravindranath; Dhokane, Varsha S; Hajare, Sachin N; Sharma, Arun; Bandekar, Jayant R
2007-04-01
The microbiological quality of market samples of minimally processed (MP) pineapple was examined. The effectiveness of radiation treatment in eliminating Salmonella Typhimurium from laboratory-inoculated ready-to-eat pineapple slices was also studied. The microbiological quality of minimally processed pineapple samples from the Mumbai market was poor; 8.8% of the samples were positive for Salmonella. The D10 value (the radiation dose required to reduce a bacterial population by 90%) for S. Typhimurium inoculated in pineapple was 0.242 kGy. Inoculated pack studies in minimally processed pineapple showed that treatment with a 2-kGy dose of gamma radiation could eliminate 5 log CFU/g of S. Typhimurium. The pathogen was not detected in radiation-processed samples for up to 12 d of storage at 4 and 10 °C. Processing of market samples with 1 and 2 kGy was effective in improving the microbiological quality of these products.
Application of higher harmonic blade feathering for helicopter vibration reduction
NASA Technical Reports Server (NTRS)
Powers, R. W.
1978-01-01
Higher harmonic blade feathering for helicopter vibration reduction is considered. Recent wind tunnel tests confirmed the effectiveness of higher harmonic control in reducing articulated rotor vibratory hub loads. Several predictive analyses developed in support of the NASA program were shown to be capable of calculating single harmonic control inputs required to minimize a single 4P hub response. In addition, a multiple-input, multiple-output harmonic control predictive analysis was developed. All techniques developed thus far obtain a solution by extracting empirical transfer functions from sampled data. Algorithm data sampling and processing requirements are minimal to encourage adaptive control system application of such techniques in a flight environment.
Manufacturing Methods and Technology Project Summary Reports
1984-12-01
are used. The instrument chosen provides a convenient method of artificially aging a propellant sample while automatically analyzing for evolved oxides... and aging. Shortly after the engineering sample run, a change in REMBASS requirements eliminated the crystal high-shock requirements. This resulted... material with minimum outgassing in a precision vacuum QXFF. Minimal outgassing reduces aging in the finished unit. A fixture was also developed to
Real-time combustion monitoring of PCDD/F indicators by REMPI-TOFMS
Analyses for polychlorinated dibenzodioxin and dibenzofuran (PCDD/F) emissions typically require a 4 h extractive sample taken on an annual or less frequent basis. This results in a potentially minimally representative monitoring scheme. More recently, methods for continual sampl...
Note: Four-port microfluidic flow-cell with instant sample switching
NASA Astrophysics Data System (ADS)
MacGriff, Christopher A.; Wang, Shaopeng; Tao, Nongjian
2013-10-01
A simple device for high-speed microfluidic delivery of liquid samples to a surface plasmon resonance (SPR) sensor surface is presented. The delivery platform comprises a four-port microfluidic cell, in which two ports serve as inlets for buffer and sample solutions, and a high-speed selector valve that controls the alternate opening and closing of the two outlet ports. The time scale of buffer/sample switching (or sample injection rise and fall time) is on the order of milliseconds, thereby minimizing the opportunity for sample plug dispersion. The high rates of mass transport to and from the central microfluidic sensing region allow for SPR-based kinetic analysis of binding events with dissociation rate constants (k_d) up to 130 s^-1. The required sample volume is only 1 μL, allowing for minimal sample consumption during high-speed kinetic binding measurements.
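As a rough sanity check on the quoted kinetic limit (hedged arithmetic, not from the paper): a dissociation rate constant of 130 s^-1 corresponds to a complex half-life of ln(2)/k_d, which is why millisecond-scale sample switching is required to resolve the decay.

```python
import math

# Half-life of a complex decaying with first-order rate constant kd:
# t_half = ln(2) / kd. For the abstract's upper limit of 130 s^-1 this
# is on the order of a few milliseconds (illustrative arithmetic only).
kd = 130.0  # s^-1, upper limit quoted in the abstract
half_life = math.log(2) / kd
print(f"half-life ≈ {half_life * 1e3:.1f} ms")  # half-life ≈ 5.3 ms
```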
NASA Technical Reports Server (NTRS)
Boynton, W. V.; Drake; Hildebrand; Jones; Lewis; Treiman; Wark
1987-01-01
The genesis of igneous rocks on terrestrial planets can only be understood through experiments at pressures corresponding to those in planetary mantles (10 to 50 kbar). Such experiments typically require a piston-cylinder apparatus, which offers controllable pressure and temperature, adequate sample volume, rapid sample quench, and minimal danger of catastrophic failure. It is proposed to perform high-pressure and high-temperature piston-cylinder experiments aboard the Space Station. The microgravity environment of the Space Station will minimize settling due to density contrasts and may thus allow experiments of moderate duration to be performed without a platinoid capsule and without the sample having to touch the container walls. The ideal pressure medium would have the same temperatures. It is emphasized, however, that this proposed experimental capability requires technological advances and innovations not currently available.
Effect of Common Cryoprotectants on Critical Warming Rates and Ice Formation in Aqueous Solutions
Hopkins, Jesse B.; Badeau, Ryan; Warkentin, Matthew; Thorne, Robert E.
2012-01-01
Ice formation on warming is of comparable or greater importance to ice formation on cooling in determining survival of cryopreserved samples. Critical warming rates required for ice-free warming of vitrified aqueous solutions of glycerol, dimethyl sulfoxide, ethylene glycol, polyethylene glycol 200 and sucrose have been measured for warming rates of order 10 to 10^4 K/s. Critical warming rates are typically one to three orders of magnitude larger than critical cooling rates. Warming rates vary strongly with cooling rates, perhaps due to the presence of small ice fractions in nominally vitrified samples. Critical warming and cooling rate data spanning orders of magnitude in rates provide rigorous tests of ice nucleation and growth models and their assumed input parameters. Current models with current best estimates for input parameters provide a reasonable account of critical warming rates for glycerol solutions at high concentrations/low rates, but overestimate both critical warming and cooling rates by orders of magnitude at lower concentrations and larger rates. In vitrification protocols, minimizing concentrations of potentially damaging cryoprotectants while minimizing ice formation will require ultrafast warming rates, as well as fast cooling rates to minimize the required warming rates. PMID:22728046
Minimizing target interference in PK immunoassays: new approaches for low-pH-sample treatment.
Partridge, Michael A; Pham, John; Dziadiv, Olena; Luong, Onson; Rafique, Ashique; Sumner, Giane; Torri, Albert
2013-08-01
Quantitating total levels of monoclonal antibody (mAb) biotherapeutics in serum using ELISA may be hindered by soluble targets. We developed two low-pH-sample-pretreatment techniques to minimize target interference. The first procedure involves sample pretreatment at pH <3.0 before neutralization and analysis in a target capture ELISA. Careful monitoring of acidification time is required to minimize potential impact on mAb detection. The second approach involves sample dilution into mild acid (pH ∼4.5) before transferring to an anti-human capture-antibody-coated plate without neutralization. Analysis of target-drug and drug-capture antibody interactions at pH 4.5 indicated that the capture antibody binds to the drug, while the drug and the target were dissociated. Using these procedures, total biotherapeutic levels were accurately measured when soluble target was >30-fold molar excess. These techniques provide alternatives for quantitating mAb biotherapeutics in the presence of a target when standard acid-dissociation procedures are ineffective.
NASA Astrophysics Data System (ADS)
Martins, C. G.; Behrens, J. H.; Destro, M. T.; Franco, B. D. G. M.; Vizeu, D. M.; Hutzler, B.; Landgraf, M.
2004-09-01
Consumer attitudes towards foods have changed over the last two decades, increasing the demand for fresh-like products. Consequently, less extreme treatments or additives are required. Minimally processed foods have fresh-like characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7, and Shigella spp. Minimal processing does not reduce the levels of pathogenic microorganisms to safe levels. Therefore, this study was carried out to improve the microbiological safety and shelf-life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2 and 1.5 kGy. Irradiated samples were diluted 1:10 in saline peptone water and plated onto tryptic soy agar, which was incubated at 37 °C for 24 h. D10 values for Salmonella spp. inoculated in watercress varied from 0.29 to 0.43 kGy; therefore, a dose of 1.7 kGy will reduce the Salmonella population in watercress by 4 log10. The shelf-life was increased by 1 1/2 days when the product was exposed to 1 kGy.
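The 1.7 kGy figure follows directly from the worst-case D10 value, since the dose for an n-log reduction is n × D10 (illustrative arithmetic only; the function name is hypothetical):

```python
# Dose needed for an n-decade (n-log) reduction of a population whose
# decimal reduction dose is d10_kgy. Illustrative arithmetic only.
def dose_for_log_reduction(d10_kgy, n_logs):
    """Radiation dose (kGy) to reduce a population by n_logs decades."""
    return d10_kgy * n_logs

worst_case_d10 = 0.43  # kGy, upper end of the range reported for watercress
dose = dose_for_log_reduction(worst_case_d10, 4)
print(f"{dose:.2f} kGy")  # 1.72 kGy, consistent with the quoted ~1.7 kGy
```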
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR THE BENCH STEAM REFORMER TEST
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2010-08-03
This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Fluid Bed Steam Reformer testing. The type, quantity and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluid bed steam reformer (FBSR). A determination of the adequacy of the FBSR process to treat Hanford tank waste is required. The initial step in determining the adequacy of the FBSR process is to select archived waste samples from the 222-S Laboratory that will be used to test the FBSR process. Analyses of the selected samples will be required to confirm the samples meet the testing criteria.
Multidimensional Normalization to Minimize Plate Effects of Suspension Bead Array Data.
Hong, Mun-Gwan; Lee, Woojoo; Nilsson, Peter; Pawitan, Yudi; Schwenk, Jochen M
2016-10-07
Enhanced by the growing number of biobanks, biomarker studies can now be performed with reasonable statistical power by using large sets of samples. Antibody-based proteomics by means of suspension bead arrays offers one attractive approach to analyze serum, plasma, or CSF samples for such studies in microtiter plates. To expand measurements beyond single batches, with either 96 or 384 samples per plate, suitable normalization methods are required to minimize the variation between plates. Here we propose two normalization approaches utilizing MA coordinates. The multidimensional MA (multi-MA) and MA-loess methods both consider all samples of a microtiter plate per suspension bead array assay and thus do not require any external reference samples. We demonstrate the performance of the two MA normalization methods with data obtained from the analysis of 384 samples including both serum and plasma. Samples were randomized across 96-well sample plates, then processed and analyzed in assay plates. Using principal component analysis (PCA), we could show that plate-wise clusters found in the first two components were eliminated by multi-MA normalization as compared with other normalization methods. Furthermore, we studied the correlation profiles between random pairs of antibodies and found that both MA normalization methods substantially reduced the inflated correlation introduced by plate effects. Normalization approaches using multi-MA and MA-loess minimized batch effects arising from the analysis of several assay plates with antibody suspension bead arrays. In a simulated biomarker study, multi-MA restored associations lost due to plate effects. Our normalization approaches, which are available as the R package MDimNormn, could also be useful in studies using other types of high-throughput assay data.
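For readers unfamiliar with MA coordinates, the classic two-sample form that these methods build on can be sketched as follows (an assumed simplification; the paper's multi-MA generalizes this to all samples on a plate, with the authors' R package MDimNormn as the reference implementation):

```python
import math

# Classic two-sample MA coordinates: M is the log2 ratio between paired
# measurements, A is their mean log2 intensity. Plate-effect normalization
# then removes systematic trends in M as a function of A. Illustrative only.
def ma_transform(x, y):
    m = [math.log2(a) - math.log2(b) for a, b in zip(x, y)]
    a_coord = [(math.log2(a) + math.log2(b)) / 2 for a, b in zip(x, y)]
    return m, a_coord

plate1 = [100.0, 200.0, 400.0]  # hypothetical intensities, plate 1
plate2 = [100.0, 100.0, 100.0]  # hypothetical intensities, plate 2
m, a = ma_transform(plate1, plate2)
print(m)  # ≈ [0.0, 1.0, 2.0]
print(a)
```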
DATA QUALITY OBJECTIVES FOR SELECTING WASTE SAMPLES FOR BENCH-SCALE REFORMER TREATABILITY STUDIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
BANNING DL
2011-02-11
This document describes the data quality objectives to select archived samples located at the 222-S Laboratory for Bench-Scale Reforming testing. The type, quantity, and quality of the data required to select the samples for Fluid Bed Steam Reformer testing are discussed. In order to maximize the efficiency and minimize the time to treat Hanford tank waste in the Waste Treatment and Immobilization Plant, additional treatment processes may be required. One of the potential treatment processes is the fluidized bed steam reformer. A determination of the adequacy of the fluidized bed steam reformer process to treat Hanford tank waste is required. The initial step in determining the adequacy of the fluidized bed steam reformer process is to select archived waste samples from the 222-S Laboratory that will be used in bench-scale tests. Analyses of the selected samples will be required to confirm the samples meet the shipping requirements and for comparison to the bench-scale reformer (BSR) test sample selection requirements.
50 CFR 679.84 - Rockfish Program recordkeeping, permits, monitoring, and catch accounting.
Code of Federal Regulations, 2011 CFR
2011-10-01
... sampling baskets. This space must be within or adjacent to the observer sample station. (7) Pre-cruise..., NMFS may contact the vessel to arrange for a pre-cruise meeting. The pre-cruise meeting must minimally... monitoring requirements for shoreside and stationary floating processors—(1) Catch monitoring and control...
50 CFR 679.84 - Rockfish Program recordkeeping, permits, monitoring, and catch accounting.
Code of Federal Regulations, 2010 CFR
2010-10-01
... sampling baskets. This space must be within or adjacent to the observer sample station. (7) Pre-cruise..., NMFS may contact the vessel to arrange for a pre-cruise meeting. The pre-cruise meeting must minimally... monitoring requirements for shoreside and stationary floating processors—(1) Catch monitoring and control...
Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation
NASA Technical Reports Server (NTRS)
Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.
2004-01-01
Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.
Method for Hot Real-Time Sampling of Gasification Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomeroy, Marc D
The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot-scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as catalyst poisons for tar-reforming and synthesis catalysts. Real-time hot online sampling techniques, such as Molecular Beam Mass Spectrometry (MBMS), and gas chromatographs with sulfur- and nitrogen-specific detectors can provide real-time analysis, supplying operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistries. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has shown several key factors to consider when designing and installing an analytical sampling system for biomass gasification products: minimizing sampling distance, effective filtering as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.
An extension of command shaping methods for controlling residual vibration using frequency sampling
NASA Technical Reports Server (NTRS)
Singer, Neil C.; Seering, Warren P.
1992-01-01
The authors present an extension to the impulse shaping technique for commanding machines to move with reduced residual vibration. The extension, called frequency sampling, is a method for generating constraints that are used to obtain shaping sequences which minimize residual vibration in systems, such as robots, whose resonant frequencies change during motion. The authors present a review of impulse shaping methods, a development of the proposed extension, and a comparison of results of tests conducted on a simple model of the space shuttle robot arm. Frequency sampling provides a method for minimizing the impulse sequence duration required to give the desired insensitivity.
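For context, the baseline two-impulse zero-vibration (ZV) shaper that such extensions build on can be sketched as follows (a standard textbook construction, not the frequency-sampling method itself; the parameter values are hypothetical):

```python
import math

# Two-impulse zero-vibration (ZV) input shaper for a single vibratory mode
# with natural frequency wn (rad/s) and damping ratio zeta. Convolving a
# command with these impulses cancels residual vibration at that mode.
def zv_shaper(wn, zeta):
    k = math.exp(-zeta * math.pi / math.sqrt(1 - zeta ** 2))
    wd = wn * math.sqrt(1 - zeta ** 2)  # damped natural frequency
    amplitudes = [1 / (1 + k), k / (1 + k)]  # sum to 1 (unity gain)
    times = [0.0, math.pi / wd]  # second impulse half a damped period later
    return amplitudes, times

amps, times = zv_shaper(wn=2 * math.pi * 1.0, zeta=0.05)  # a 1 Hz mode
print(amps, times)
```

The shaper is exact only at the design frequency; the abstract's frequency-sampling extension addresses systems whose resonant frequencies change during motion.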
Droplet-Based Segregation and Extraction of Concentrated Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buie, C R; Buckley, P; Hamilton, J
2007-02-23
Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.
Thermal Design, Analysis, and Testing of the Quench Module Insert Bread Board
NASA Technical Reports Server (NTRS)
Breeding, Shawn; Khodabandeh, Julia; Turner, Larry D. (Technical Monitor)
2001-01-01
The science requirement for materials processing is to provide the thermal gradient and solid/liquid interface front velocity desired by the PI at a given processing temperature. Processing is performed by translating the furnace while the sample remains stationary, to minimize disturbances to the solid/liquid interface front during steady-state processing. Typical sample materials for this metals-and-alloys furnace are lead-tin alloys, lead-antimony alloys, and aluminum alloys. Samples must be safe to process and therefore are typically contained within hermetically sealed cartridge tubes (gas tight) with inner ceramic liners (liquid tight) to prevent contamination and/or reaction of the sample material with the cartridge tube.
40 CFR 63.694 - Testing methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... determine treatment process required HAP biodegradation efficiency (Rbio) for compliance with standards... procedures to minimize the loss of compounds due to volatilization, biodegradation, reaction, or sorption... compounds due to volatilization, biodegradation, reaction, or sorption during the sample collection, storage...
40 CFR 63.694 - Testing methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... determine treatment process required HAP biodegradation efficiency (Rbio) for compliance with standards... procedures to minimize the loss of compounds due to volatilization, biodegradation, reaction, or sorption... compounds due to volatilization, biodegradation, reaction, or sorption during the sample collection, storage...
40 CFR 63.694 - Testing methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... determine treatment process required HAP biodegradation efficiency (Rbio) for compliance with standards... procedures to minimize the loss of compounds due to volatilization, biodegradation, reaction, or sorption... compounds due to volatilization, biodegradation, reaction, or sorption during the sample collection, storage...
Ultrasonic-based membrane aided sample preparation of urine proteomes.
Jesus, Jemmyson Romário; Santos, Hugo M; López-Fernández, H; Lodeiro, Carlos; Arruda, Marco Aurélio Zezzi; Capelo, J L
2018-02-01
A new ultrafast ultrasonic-based method for shotgun proteomics, as well as label-free protein quantification, in urine samples is developed. The method first separates the urine proteins using nitrocellulose-based membranes, and the proteins are then digested in-membrane using trypsin. The enzymatic digestion process is accelerated from overnight to four minutes using a sonoreactor ultrasonic device. Overall, the sample treatment pipeline comprising protein separation, digestion and identification is done in just 3 h. The process is assessed using urine of healthy volunteers. The method shows that males can be differentiated from females using the protein content of urine in a fast, easy and straightforward way. 232 and 226 proteins are identified in the urine of males and females, respectively. Of these, 162 are common to both genders, whilst 70 are unique to males and 64 to females. Of the 162 common proteins, 13 are present at levels that are statistically different (p < 0.05). The method matches the analytical minimalism concept as outlined by Halls, as each stage of the analysis is evaluated to minimize time, cost, sample requirement, reagent consumption, energy requirements and production of waste products. Copyright © 2017 Elsevier B.V. All rights reserved.
2015-06-07
Field-Portable Gas Chromatograph-Mass Spectrometer." Forensic Toxicol, 2006, 24, 17-22. Smith, P. "Person-Portable Gas Chromatography: Rapid Temperature... bench-top Gas Chromatograph-Mass Spectrometer (GC-MS) system (ISQ). Nine sites were sampled and analyzed for compounds using Environmental Protection... extraction methods for Liquid Chromatography-MS (LC-MS). Additionally, TD is approximately 1000X more sensitive and requires minimal sample preparation
Kraaij, Gert; Tuijthof, Gabrielle J M; Dankelman, Jenny; Nelissen, Rob G H H; Valstar, Edward R
2015-02-01
Waterjet cutting technology is considered a promising technology for minimally invasive removal of the interface tissue surrounding aseptically loose hip prostheses. The goal of this study was to investigate the feasibility of waterjet cutting of interface tissue membrane. Waterjets with 0.2 mm and 0.6 mm diameters, a stand-off distance of 5 mm, and a traverse speed of 0.5 mm/s were used to cut interface tissue samples in half. The water flow through the nozzle was controlled by means of a valve; by changing the flow, the resulting waterjet pressure was regulated. Tissue sample thickness and the required waterjet pressures were measured. Mean sample thickness was 2.3 mm (SD 0.7 mm) in the 0.2 mm nozzle group and 2.6 mm (SD 0.9 mm) in the 0.6 mm nozzle group. The waterjet pressure required to cut samples was between 10 and 12 MPa for the 0.2 mm nozzle and between 5 and 10 MPa for the 0.6 mm nozzle. Cutting bone or bone cement requires about 3 times higher waterjet pressure (30-50 MPa, depending on the nozzle diameter used), and therefore we consider waterjet cutting a safe technique for minimally invasive interface tissue removal. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
(I Can't Get No) Saturation: A simulation and guidelines for sample sizes in qualitative research.
van Rijnsoever, Frank J
2017-01-01
I explore the sample size in qualitative research that is required to reach theoretical saturation. I conceptualize a population as consisting of sub-populations that contain different types of information sources that hold a number of codes. Theoretical saturation is reached after all the codes in the population have been observed once in the sample. I delineate three different scenarios to sample information sources: "random chance," which is based on probability sampling, "minimal information," which yields at least one new code per sampling step, and "maximum information," which yields the largest number of new codes per sampling step. Next, I use simulations to assess the minimum sample size for each scenario for systematically varying hypothetical populations. I show that theoretical saturation is more dependent on the mean probability of observing codes than on the number of codes in a population. Moreover, the minimal and maximal information scenarios are significantly more efficient than random chance, but yield fewer repetitions per code to validate the findings. I formulate guidelines for purposive sampling and recommend that researchers follow a minimum information scenario.
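The saturation logic described above lends itself to a short simulation. The sketch below, under illustrative assumptions (a tiny hypothetical population of information sources, each modeled as a set of codes), implements the "random chance" and "maximum information" scenarios; the function names and population are invented for demonstration, not taken from the paper.

```python
import random

def steps_to_saturation(sources, rng, scenario="random"):
    """Sample information sources (each a set of codes) without replacement
    until every code in the population has been observed at least once."""
    all_codes = set().union(*sources)
    observed = set()
    remaining = list(range(len(sources)))
    steps = 0
    while observed != all_codes:
        if scenario == "maximum":
            # "maximum information": take the source contributing the most new codes
            idx = max(remaining, key=lambda i: len(sources[i] - observed))
        else:
            # "random chance": probability sampling over the remaining sources
            idx = rng.choice(remaining)
        remaining.remove(idx)
        observed |= sources[idx]
        steps += 1
    return steps

# hypothetical population: four sources holding overlapping codes
population = [{"a"}, {"a", "b"}, {"c"}, {"a", "b", "c", "d"}]
print(steps_to_saturation(population, random.Random(0), "maximum"))  # prints 1
```

As in the paper's findings, the maximum-information scenario saturates in fewer steps than random chance on average, at the cost of fewer repeated observations per code.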
Beattie, A J; Oliver, I
1994-12-01
Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. Copyright © 1994. Published by Elsevier Ltd.
The development of a Martian atmospheric sample collection canister
NASA Astrophysics Data System (ADS)
Kulczycki, E.; Galey, C.; Kennedy, B.; Budney, C.; Bame, D.; Van Schilfgaarde, R.; Aisen, N.; Townsend, J.; Younse, P.; Piacentine, J.
The collection of an atmospheric sample from Mars would provide significant insight into the elemental composition and sub-surface out-gassing rates of noble gases. A team of engineers at the Jet Propulsion Laboratory (JPL), California Institute of Technology, has developed an atmospheric sample collection canister for Martian application. The engineering strategy has two basic elements: first, to collect two separately sealed 50 cubic centimeter unpressurized atmospheric samples with minimal sensing and actuation in a self-contained pressure vessel; and second, to package this atmospheric sample canister in such a way that it can be easily integrated into the orbiting sample capsule for collection and return to Earth. Sample collection and integrity are demonstrated by emulating the atmospheric collection portion of the Mars Sample Return mission on a compressed timeline. The test results were achieved by varying the pressure inside a thermal vacuum chamber while opening and closing the valve on the sample canister at Mars ambient pressure. A commercial off-the-shelf medical-grade micro-valve is utilized in the first iteration of this design to enable rapid testing of the system. The valve has been independently leak tested at JPL to quantify and separate the leak rates associated with the canister. The results are factored into an overall system design that quantifies mass, power, and sensing requirements for a Martian Atmospheric Sample Collection (MASC) canister as outlined in the Mars Sample Return mission profile. Qualitative results include the selection of materials to minimize sample contamination, preliminary science requirements, priorities in sample composition, flight valve selection criteria, a storyboard from sample collection to loading in the orbiting sample capsule, and contributions to maintaining "Earth-clean" exterior surfaces on the orbiting sample capsule.
Chen, Chun-Nan; Lin, Che-Yi; Chi, Fan-Hsiang; Chou, Chen-Han; Hsu, Ya-Ching; Kuo, Yen-Lin; Lin, Chih-Feng; Chen, Tseng-Cheng; Wang, Cheng-Ping; Lou, Pei-Jen; Ko, Jenq-Yuh; Hsiao, Tzu-Yu; Yang, Tsung-Lin
2016-01-01
Tumors of the supraclavicular fossa (SC) are clinically challenging because of anatomical complexity and tumor pathological diversity. Because of the varied disease entities and treatment choices for SC tumors, making an accurate decision among numerous differential diagnoses is imperative. Sampling by open biopsy (OB) remains the standard procedure for pathological confirmation. However, the complicated anatomical structures of the SC always render surgical intervention difficult to perform. Ultrasound-guided core biopsy (USCB) is a minimally invasive, office-based procedure for tissue sampling widely applied in many diseases of the head and neck. This study aims to evaluate the clinical efficacy and utility of USCB as the sampling method for SC tumors. From 2009 to 2014, consecutive patients who presented with clinical symptoms and signs of supraclavicular tumors and were scheduled to receive sampling procedures for diagnostic confirmation were recruited. The patients received either USCB or OB for the initial tissue sampling. The accurate diagnostic rate based on pathological results was 90.2% for USCB and 93.6% for OB. No significant difference was noted between the USCB and OB groups in terms of diagnostic accuracy or the percentage of inadequate specimens. All cases in the USCB group had the sampling procedure completed within 10 minutes, but not in the OB group. No scars larger than 1 cm were found with USCB. Only patients in the OB group required general anesthesia and hospitalization and had postoperative scars. Accordingly, USCB can serve as the first-line sampling tool for SC tumors, with high diagnostic accuracy, minimal invasiveness, and low medical cost. PMID:26825877
Rugged, Portable, Real-Time Optical Gaseous Analyzer for Hydrogen Fluoride
NASA Technical Reports Server (NTRS)
Pilgrim, Jeffrey; Gonzales, Paula
2012-01-01
Hydrogen fluoride (HF) is a primary evolved combustion product of fluorinated and perfluorinated hydrocarbons. HF is produced during combustion by the presence of impurities and hydrogen-containing polymers including polyimides. This effect is especially dangerous in closed occupied volumes like spacecraft and submarines. In these systems, combinations of perfluorinated hydrocarbons and polyimides are used for insulating wiring. HF is both highly toxic and short-lived in closed environments due to its reactivity. The high reactivity also makes HF sampling problematic. An infrared optical sensor can detect promptly evolving HF with minimal sampling requirements, while providing both high sensitivity and high specificity. A rugged optical path length enhancement architecture enables both high HF sensitivity and rapid environmental sampling with minimal gaseous contact with the low-reactivity sensor surfaces. The inert optical sample cell, combined with infrared semiconductor lasers, is joined with an analog and digital electronic control architecture that allows for ruggedness and compactness. The combination provides both portability and battery operation on a simple camcorder battery for up to eight hours. Optical detection of gaseous HF is confounded by the need for rapid sampling with minimal contact between the sensor and the environmental sample. A sensor is required that must simultaneously provide the required sub-parts-per-million detection limits, but with the high specificity and selectivity expected of optical absorption techniques. It should also be rugged and compact for compatibility with operation onboard spacecraft and submarines. A new optical cell has been developed for which environmental sampling is accomplished by simply traversing the few mm-thick cell walls into an open volume where the measurement is made.
A small, low-power fan or vacuum pump may be used to push or pull the gaseous sample into the sample volume for a response time of a few seconds. The optical cell simultaneously provides for an enhanced optical interaction path length between the environmental sample and the infrared laser. Further, the optical cell itself is comprised of inert materials that render it immune to attack by HF. In some cases, the sensor may be configured so that the optoelectronic devices themselves are protected and isolated from HF by the optical cell. The optical sample cell is combined with custom-developed analog and digital control electronics that provide rugged, compact operation on a platform that can run on a camcorder battery. The sensor is inert with respect to acidic gases like HF, while providing the required sensitivity, selectivity, and response time. Certain types of combustion events evolve copious amounts of HF, very little of other gases typically associated with combustion (e.g., carbon monoxide), and very low levels of aerosols and particulates (which confound traditional smoke detectors). The new sensor platform could warn occupants early enough to take the necessary countermeasures.
Investigation of Flow Conditioners for Compact Jet Engine Simulator Rig Noise Reduction
NASA Technical Reports Server (NTRS)
Doty, Michael J.; Haskin, Henry H.
2011-01-01
The design requirements for two new Compact Jet Engine Simulator (CJES) units for upcoming wind tunnel testing lead to the distinct possibility of rig noise contamination. The acoustic and aerodynamic properties of several flow conditioner devices are investigated over a range of operating conditions relevant to the CJES units to mitigate the risk of rig noise. An impinging jet broadband noise source is placed in the upstream plenum of the test facility, permitting measurements of not only flow conditioner self-noise but also noise attenuation characteristics. Several perforated plate and honeycomb samples of high porosity show minimal self-noise but also minimal attenuation capability. Conversely, low porosity perforated plate and sintered wire mesh conditioners exhibit noticeable attenuation but also unacceptable self-noise. One fine wire mesh sample (DP450661) shows minimal self-noise and reasonable attenuation, particularly when combined in series with a 15.6 percent open area (POA) perforated plate upstream. This configuration is the preferred flow conditioner system for the CJES, providing up to 20 dB of broadband attenuation capability with minimal self-noise.
Sample Manipulation System for Sample Analysis at Mars
NASA Technical Reports Server (NTRS)
Mumm, Erik; Kennedy, Tom; Carlson, Lee; Roberts, Dustyn
2008-01-01
The Sample Analysis at Mars (SAM) instrument will analyze Martian samples collected by the Mars Science Laboratory Rover with a suite of spectrometers. This paper discusses the driving requirements, design, and lessons learned in the development of the Sample Manipulation System (SMS) within SAM. The SMS stores and manipulates 74 sample cups to be used for solid sample pyrolysis experiments. Focus is given to the unique mechanism architecture developed to deliver a high packing density of sample cups in a reliable, fault tolerant manner while minimizing system mass and control complexity. Lessons learned are presented on contamination control, launch restraint mechanisms for fragile sample cups, and mechanism test data.
NASA Technical Reports Server (NTRS)
Allton, J. H.; Zeigler, R. A.; Calaway, M. J.
2016-01-01
The Lunar Receiving Laboratory (LRL) was planned and constructed in the 1960s to support the Apollo program in the context of landing on the Moon and safely returning humans. The enduring science return from that effort is a result of careful curation of planetary materials. Technical decisions for the first facility included sample handling environment (vacuum vs inert gas), and instruments for making basic sample assessment, but the most difficult decision, and most visible, was stringent biosafety vs ultra-clean sample handling. Biosafety required handling of samples in negative pressure gloveboxes and rooms for containment and use of sterilizing protocols and animal/plant models for hazard assessment. Ultra-clean sample handling worked best in positive pressure nitrogen environment gloveboxes in positive pressure rooms, using cleanable tools of tightly controlled composition. The requirements for these two objectives were so different, that the solution was to design and build a new facility for specific purpose of preserving the scientific integrity of the samples. The resulting Lunar Curatorial Facility was designed and constructed, from 1972-1979, with advice and oversight by a very active committee comprised of lunar sample scientists. The high precision analyses required for planetary science are enabled by stringent contamination control of trace elements in the materials and protocols of construction (e.g., trace element screening for paint and flooring materials) and the equipment used in sample handling and storage. As other astromaterials, especially small particles and atoms, were added to the collections curated, the technical tension between particulate cleanliness and organic cleanliness was addressed in more detail. Techniques for minimizing particulate contamination in sample handling environments use high efficiency air filtering techniques typically requiring organic sealants which offgas. 
Protocols for reducing adventitious carbon on sample handling surfaces often generate particles. Further work is needed to achieve both minimal particulate and adventitious carbon contamination. This paper will discuss these facility topics and others in the historical context of nearly 50 years' curation experience for lunar rocks and regolith, meteorites, cosmic dust, comet particles, solar wind atoms, and asteroid particles at Johnson Space Center.
Trace Gas Analyzer (TGA) program
NASA Technical Reports Server (NTRS)
1977-01-01
The design, fabrication, and test of a breadboard trace gas analyzer (TGA) is documented. The TGA is a gas chromatograph/mass spectrometer system. The gas chromatograph subsystem employs a recirculating hydrogen carrier gas. The recirculation feature minimizes the requirement for transport and storage of large volumes of carrier gas during a mission. The silver-palladium hydrogen separator which permits the removal of the carrier gas and its reuse also decreases vacuum requirements for the mass spectrometer since the mass spectrometer vacuum system need handle only the very low sample pressure, not sample plus carrier. System performance was evaluated with a representative group of compounds.
Rapid Pneumatic Transport of Radioactive Samples - RaPToRS
NASA Astrophysics Data System (ADS)
Padalino, S.; Barrios, M.; Sangster, C.
2005-10-01
Some ICF neutron activation diagnostics require quick retrieval of the activated sample. Minimizing retrieval times is particularly important when the half-life of the activated material is on the order of the transport time or the degree of radioactivity is close to the background counting level. These restrictions exist in current experiments performed at the Laboratory for Laser Energetics, thus motivating the development of the RaPToRS system. The system has been designed to minimize transportation time while requiring no human intervention during transport or counting. These factors will be important if the system is to be used at the NIF, where radiological hazards will be present after activation. The sample carrier is pneumatically transported via a 4 inch ID PVC pipe to a remote location in excess of 100 meters from the activation site at a speed of approximately 7 m/s. It arrives at an end station where it is dismounted robotically from the carrier and removed from its hermetic package. The sample is then placed by the robot in a counting station. This system is currently being developed to measure back-to-back gamma rays, produced by positron annihilation, emitted from activated graphite. Funded in part by the U.S. DOE under subcontract with LLE at the University of Rochester.
Designing a multiple dependent state sampling plan based on the coefficient of variation.
Yan, Aijun; Liu, Sanyang; Dong, Xiaojuan
2016-01-01
A multiple dependent state (MDS) sampling plan is developed based on the coefficient of variation of the quality characteristic which follows a normal distribution with unknown mean and variance. The optimal plan parameters of the proposed plan are solved by a nonlinear optimization model, which satisfies the given producer's risk and consumer's risk at the same time and minimizes the sample size required for inspection. The advantages of the proposed MDS sampling plan over the existing single sampling plan are discussed. Finally an example is given to illustrate the proposed plan.
The Variation Theorem Applied to H-2+: A Simple Quantum Chemistry Computer Project
ERIC Educational Resources Information Center
Robiette, Alan G.
1975-01-01
Describes a student project which requires limited knowledge of Fortran and only minimal computing resources. The results illustrate such important principles of quantum mechanics as the variation theorem and the virial theorem. Presents sample calculations and the subprogram for energy calculations. (GS)
Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development
Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei
2014-01-01
Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements:
- compatibility with LC/MS (free of detergents, etc.)
- high protein integrity (minimal level of protein degradation and non-biological PTMs)
- compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling
- lot-to-lot reproducibility
Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed:
- intact protein extracts, with primary use for sample preparation method development and optimization
- pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided the flexibility in adapting to various LBA formats and the significant time saving in script writing and scientist training. Data generated by the automated process were comparable to those by manual process while the bioanalytical productivity was significantly improved using the modular robotic scripts.
Grizzle, R E; Ward, L G; Fredriksson, D W; Irish, J D; Langan, R; Heinig, C S; Greene, J K; Abeels, H A; Peter, C R; Eberhardt, A L
2014-11-15
The seafloor at an open ocean finfish aquaculture facility in the western Gulf of Maine, USA was monitored from 1999 to 2008 by sampling sites inside a predicted impact area modeled by oceanographic conditions and fecal and food settling characteristics, and nearby reference sites. Univariate and multivariate analyses of benthic community measures from box core samples indicated minimal or no significant differences between impact and reference areas. These findings resulted in development of an adaptive monitoring protocol involving initial low-cost methods that required more intensive and costly efforts only when negative impacts were initially indicated. The continued growth of marine aquaculture is dependent on further development of farming methods that minimize negative environmental impacts, as well as effective monitoring protocols. Adaptive monitoring protocols, such as the one described herein, coupled with mathematical modeling approaches, have the potential to provide effective protection of the environment while minimizing monitoring effort and costs. Copyright © 2014 Elsevier Ltd. All rights reserved.
A minimally invasive method for extraction of sturgeon oocytes
Candrl, James S.; Papoulias, Diana M.; Tillitt, Donald E.
2010-01-01
Fishery biologists, hatchery personnel, and caviar fishers routinely extract oocytes from sturgeon (Acipenseridae) to determine the stage of maturation by checking egg quality. Typically, oocytes are removed either by inserting a catheter into the oviduct or by making an incision in the body cavity. Both methods can be time-consuming and stressful to the fish. We describe a device to collect mature oocytes from sturgeons quickly and effectively with minimal stress on the fish. The device is made by creating a needle from stainless steel tubing and connecting it to a syringe with polyvinyl chloride tubing. The device is filled with saline solution or water, the needle is inserted into the abdominal wall, and eggs are extracted from the fish. Using this device, an oocyte sample can be collected in less than 30 s. Such sampling leaves a minute wound that heals quickly and does not require suturing. The extractor device can easily be used in the field or hatchery, reduces fish handling time, and minimizes stress.
Designing single- and multiple-shell sampling schemes for diffusion MRI using spherical code.
Cheng, Jian; Shen, Dinggang; Yap, Pew-Thian
2014-01-01
In diffusion MRI (dMRI), determining an appropriate sampling scheme is crucial for acquiring the maximal amount of information for data reconstruction and analysis using the minimal amount of time. For single-shell acquisition, uniform sampling without directional preference is usually favored. To achieve this, a commonly used approach is the Electrostatic Energy Minimization (EEM) method introduced in dMRI by Jones et al. However, the electrostatic energy formulation in EEM is not directly related to the goal of optimal sampling-scheme design, i.e., achieving large angular separation between sampling points. A mathematically more natural approach is to consider the Spherical Code (SC) formulation, which aims to achieve uniform sampling by maximizing the minimal angular difference between sampling points on the unit sphere. Although SC is well studied in the mathematical literature, its current formulation is limited to a single shell and is not applicable to multiple shells. Moreover, SC, or more precisely continuous SC (CSC), currently can only be applied on the continuous unit sphere and hence cannot be used in situations where one or several subsets of sampling points need to be determined from an existing sampling scheme. In this case, discrete SC (DSC) is required. In this paper, we propose novel DSC and CSC methods for designing uniform single-/multi-shell sampling schemes. The DSC and CSC formulations are solved respectively by Mixed Integer Linear Programming (MILP) and a gradient descent approach. A fast greedy incremental solution is also provided for both DSC and CSC. To our knowledge, this is the first work to use SC formulation for designing sampling schemes in dMRI. Experimental results indicate that our methods obtain larger angular separation and better rotational invariance than the generalized EEM (gEEM) method currently used in the Human Connectome Project (HCP).
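The maximin criterion behind the discrete spherical code can be illustrated with a small greedy sketch. This is the generic greedy incremental idea under illustrative assumptions (a precomputed candidate set of unit vectors, and the dMRI convention that antipodal directions are equivalent); it is not the authors' MILP formulation.

```python
import math

def angular_sep(u, v):
    # angle between directions; |dot| treats antipodal vectors as the same direction
    dot = abs(sum(a * b for a, b in zip(u, v)))
    return math.acos(min(1.0, dot))

def greedy_spherical_code(candidates, k):
    """Pick k directions from `candidates`, each time adding the point whose
    minimal angle to the already-chosen set is largest (maximin criterion)."""
    chosen = [candidates[0]]  # seed with an arbitrary first direction
    while len(chosen) < k:
        best = max((c for c in candidates if c not in chosen),
                   key=lambda c: min(angular_sep(c, s) for s in chosen))
        chosen.append(best)
    return chosen

r = 1 / math.sqrt(2)
cands = [(1, 0, 0), (r, r, 0), (0, 1, 0), (0, 0, 1)]
# greedy selection skips the 45-degree neighbor and keeps the orthogonal axes
print(greedy_spherical_code(cands, 3))
```

The gradient-descent and MILP solutions in the paper refine this idea; the greedy version is the "fast incremental" variant mentioned in the abstract.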
ERIC Educational Resources Information Center
Fenk, Christopher J.; Hickman, Nicole M.; Fincke, Melissa A.; Motry, Douglas H.; Lavine, Barry
2010-01-01
An undergraduate LC-MS experiment is described for the identification and quantitative determination of acetaminophen, acetylsalicylic acid, and caffeine in commercial analgesic tablets. This inquiry-based experimental procedure requires minimal sample preparation and provides good analytical results. Students are provided sufficient background…
Optimum runway orientation relative to crosswinds
NASA Technical Reports Server (NTRS)
Falls, L. W.; Brown, S. C.
1972-01-01
Specific magnitudes of crosswinds may exist that could be constraints to the success of an aircraft mission such as the landing of the proposed space shuttle. A method is required to determine the orientation or azimuth of the proposed runway which will minimize the probability of certain critical crosswinds. Two procedures for obtaining the optimum runway orientation relative to minimizing a specified crosswind speed are described and illustrated with examples. The empirical procedure requires only hand calculations on an ordinary wind rose. The theoretical method utilizes wind statistics computed after the bivariate normal elliptical distribution is applied to a data sample of component winds. This method requires only the assumption that the wind components are bivariate normally distributed. This assumption seems to be reasonable. Studies are currently in progress for testing wind components for bivariate normality for various stations. The close agreement between the theoretical and empirical results for the example chosen substantiates the bivariate normal assumption.
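The theoretical method above reduces to a one-dimensional computation: if the wind components are bivariate normal, the crosswind for any runway azimuth is a linear combination of them and hence univariate normal, so the exceedance probability can be scanned over azimuths. The sketch below uses hypothetical climatology numbers, not the paper's data.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def crosswind_exceedance(theta, mu, cov, limit):
    """P(|crosswind| > limit) for a runway at azimuth theta (radians from north),
    with wind components (u east, v north) ~ bivariate normal (mu, cov).
    The crosswind u*cos(theta) - v*sin(theta) is itself normal."""
    s, c = math.sin(theta), math.cos(theta)
    mean = mu[0] * c - mu[1] * s
    sd = math.sqrt(c * c * cov[0][0] - 2 * s * c * cov[0][1] + s * s * cov[1][1])
    return 1.0 - norm_cdf((limit - mean) / sd) + norm_cdf((-limit - mean) / sd)

def best_azimuth(mu, cov, limit, steps=360):
    """Grid search over [0, pi): runway orientations repeat every 180 degrees."""
    return min((i * math.pi / steps for i in range(steps)),
               key=lambda t: crosswind_exceedance(t, mu, cov, limit))

# hypothetical climatology: mean 10 m/s wind from the west, unit variances
theta = best_azimuth((10.0, 0.0), [[1.0, 0.0], [0.0, 1.0]], limit=7.5)
print(round(math.degrees(theta)))  # prints 90: align the runway east-west
```

This mirrors the paper's theoretical procedure in spirit; the empirical wind-rose procedure replaces the normal model with observed direction frequencies.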
A comparison between DART-MS and DSA-MS in the forensic analysis of writing inks.
Drury, Nicholas; Ramotowski, Robert; Moini, Mehdi
2018-05-23
Ambient ionization mass spectrometry is gaining momentum in forensic science laboratories because of its high speed of analysis, minimal sample preparation, and information-rich results. One such application of ambient ionization methodology is the analysis of writing inks from questioned documents, where colorants of interest may not be soluble in common solvents, rendering thin layer chromatography (TLC) and separation-mass spectrometry methods such as LC/MS(-MS) impractical. Ambient ionization mass spectrometry uses a variety of ionization techniques, such as Penning ionization in Direct Analysis in Real Time (DART), atmospheric pressure chemical ionization in Direct Sample Analysis (DSA), and electrospray ionization in Desorption Electrospray Ionization (DESI). In this manuscript, two of the commonly used ambient ionization techniques are compared: the Perkin Elmer DSA-MS and the IonSense DART in conjunction with a JEOL AccuTOF MS. Both technologies were equally successful in analyzing writing inks and produced similar spectra. DSA-MS produced less background signal, likely because of its closed source configuration; however, the open source configuration of DART-MS provided more flexibility in sample positioning for optimum sensitivity, thereby allowing a smaller piece of paper containing writing ink to be analyzed. Under these conditions, the minimum sample required for DART-MS was 1 mm strokes of ink on paper, whereas DSA-MS required a minimum of 3 mm. Moreover, both techniques showed comparable repeatability. Evaluation of the analytical figures of merit, including sensitivity, linear dynamic range, and repeatability, for DSA-MS and DART-MS analysis is provided. In a forensic context, DART-MS was applied to the analysis of United States Secret Service ink samples directly on a sampling mesh, and the results were compared with DSA-MS of the same inks on paper.
Unlike analysis using separation mass spectrometry, which requires sample preparation, both DART-MS and DSA-MS successfully analyzed writing inks with minimal sample preparation. Copyright © 2018 Elsevier B.V. All rights reserved.
The feasibility of recharge rate determinations using the steady-state centrifuge method
Nimmo, J.R.; Stonestrom, David A.; Akstin, K.C.
1994-01-01
The establishment of steady unsaturated flow in a centrifuge permits accurate measurement of small values of hydraulic conductivity (K). This method can provide a recharge determination if it is applied to an unsaturated core sample from a depth at which gravity alone drives the flow. A K value determined at the in situ water content indicates the long-term average recharge rate at a point. Tests of this approach have been made at two sites. For sandy core samples a better knowledge of the matric pressure profiles is required before a recharge rate can be determined. Fine-textured cores required new developments of apparatus and procedures, especially for making centrifuge measurements with minimal compaction of the samples. -from Authors
Sample size allocation for food item radiation monitoring and safety inspection.
Seto, Mayumi; Uriu, Koichiro
2015-03-01
The objective of this study is to identify a procedure for determining sample size allocation for food radiation inspections of more than one food item to minimize the potential risk to consumers of internal radiation exposure. We consider a simplified case of food radiation monitoring and safety inspection in which a risk manager is required to monitor two food items, milk and spinach, in a contaminated area. Three protocols for food radiation monitoring with different sample size allocations were assessed by simulating random sampling and inspections of milk and spinach in a conceptual monitoring site. Distributions of (131)I and radiocesium concentrations were determined in reference to (131)I and radiocesium concentrations detected in Fukushima prefecture, Japan, for March and April 2011. The results of the simulations suggested that a protocol that allocates sample size to milk and spinach based on the estimation of (131)I and radiocesium concentrations using the apparent decay rate constants sequentially calculated from past monitoring data can most effectively minimize the potential risks of internal radiation exposure. © 2014 Society for Risk Analysis.
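The core of the preferred protocol, projecting concentrations with apparent decay constants estimated from past monitoring data and splitting the inspection budget accordingly, can be sketched in a few lines. The numbers and function names below are illustrative assumptions, not the study's Fukushima data or its actual protocol code.

```python
import math

def apparent_decay_constant(c_prev, c_curr, dt_days):
    """Apparent decay constant from two successive monitoring values;
    it lumps physical decay together with environmental losses."""
    return math.log(c_prev / c_curr) / dt_days

def project(c_curr, lam, dt_days):
    """Projected concentration dt_days after the latest measurement."""
    return c_curr * math.exp(-lam * dt_days)

def allocate(projected, total):
    """Split a fixed inspection budget across food items in proportion to
    their projected concentrations (a proxy for internal-exposure risk)."""
    s = sum(projected)
    alloc = [round(total * p / s) for p in projected]
    alloc[alloc.index(max(alloc))] += total - sum(alloc)  # absorb rounding drift
    return alloc

# illustrative numbers: a measured halving of (131)I in milk over 8 days
lam_milk = apparent_decay_constant(100.0, 50.0, 8.0)
milk_next = project(50.0, lam_milk, 8.0)      # ~25 Bq/kg at the next inspection
print(allocate([milk_next, 5.0], total=30))   # -> [25, 5]
```

As milk concentrations decay faster than spinach in this toy example, later inspection rounds would automatically shift samples toward the slower-decaying item.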
International Space Station Urine Monitoring System Functional Integration and Science Testing
NASA Technical Reports Server (NTRS)
Rodriguez, Branelle R.; Broyan, James Lee, Jr.
2008-01-01
The effects of exposure to microgravity during human spaceflight must be defined and understood as the human exploration of space moves to longer-duration missions. It is known that long-term exposure to microgravity causes bone loss. Urine voids can be analyzed to measure calcium and other metabolic byproducts in a constituent's urine. The International Space Station (ISS) Urine Monitoring System (UMS) is an automated urine collection device designed to collect urine, separate the urine and air, measure the void volume, and allow for syringe sampling. Accurate measurement and minimal cross contamination are essential to determine bone loss and the effectiveness of countermeasures. The ISS UMS provides minimal cross contamination (<0.7 ml urine) and has volume accuracy of +/-2% for 100 to 1000 ml urine voids.
International Space Station Urine Monitoring System Functional Integration and Science Testing
NASA Technical Reports Server (NTRS)
Cibuzar, Branelle R.; Broyan, James Lee, Jr.
2009-01-01
The effects of exposure to microgravity during human spaceflight must be defined and understood as the human exploration of space moves to longer-duration missions. It is known that long-term exposure to microgravity causes bone loss. Urine voids can be analyzed to measure calcium and other metabolic byproducts in a constituent's urine. The International Space Station (ISS) Urine Monitoring System (UMS) is an automated urine collection device designed to collect urine, separate the urine and air, measure the void volume, and allow for syringe sampling. Accurate measurement and minimal cross contamination are essential to determine bone loss and the effectiveness of countermeasures. The ISS UMS provides minimal cross contamination (<0.7 ml urine) and has volume accuracy of +/-2% for 100 to 1000 ml urine voids.
MacAllister, Rhonda Pung; Lester McCully, Cynthia M; Bacher, John; Thomas, Marvin L; Cruz, Rafael; Wangari, Solomon; Warren, Katherine E
2016-01-01
Biomedical translational research frequently incorporates collection of CSF from NHP, because CSF drug levels are used as a surrogate for CNS tissue penetration in pharmacokinetic and dynamic studies. Surgical placement of a CNS ventricular catheter reservoir for CSF collection is an intensive model to create and maintain and thus may not be feasible or practical for short-term studies. Furthermore, previous NHP lumbar port models require laminectomy for catheter placement. The new model uses a minimally invasive technique for percutaneous placement of a lumbar catheter to create a closed, subcutaneous system for effective, repeated CSF sample collection. None of the rhesus macaques (Macaca mulatta; n = 10) implanted with our minimally invasive lumbar port (MILP) system experienced neurologic deficits, postoperative infection of the surgical site, or skin erosion around the port throughout the 21.7-mo study. Functional MILP systems were maintained in 70% of the macaques, with multiple, high-quality, 0.5- to 1.0-mL samples of CSF collected for an average of 3 mo by using aspiration or gravitational flow. Among these macaques, 57% had continuous functionality for a mean of 19.2 mo; 50% of the cohort required surgical repair for port repositioning and replacement during the study. The MILP was unsuccessful in 2 macaques, at an average of 9.5 d after surgery. Nonpatency in these animals was attributed to the position of the lumbar catheter. The MILP system is an appropriate replacement for temporary catheterization and previous models requiring laminectomy and is a short-term alternative for ventricular CSF collection systems in NHP. PMID:27538866
Biowaste monitoring system for shuttle
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Sauer, R. L.
1975-01-01
The acquisition of crew biomedical data has been an important task on all manned space missions from Project Mercury through the recently completed Skylab Missions. The monitoring of metabolic wastes from the crew is an important aspect of this activity. On early missions, emphasis was placed on the collection and return of biowaste samples for post-mission analysis. On later missions such as Skylab, equipment for inflight measurement was also added. Life Science experiments are being proposed for Shuttle missions which will require the inflight measurement and sampling of metabolic wastes. In order to minimize the crew impact associated with these requirements, a high degree of automation of these processes will be required. This paper reviews the design and capabilities of urine biowaste monitoring equipment provided on past manned space programs and defines and describes the urine volume measurement and sampling equipment planned for the Shuttle Orbiter program.
Pinart, Mariona; Nimptsch, Katharina; Bouwman, Jildau; Dragsted, Lars O; Yang, Chen; De Cock, Nathalie; Lachat, Carl; Perozzi, Giuditta; Canali, Raffaella; Lombardo, Rosario; D'Archivio, Massimo; Guillaume, Michèle; Donneau, Anne-Françoise; Jeran, Stephanie; Linseisen, Jakob; Kleiser, Christina; Nöthlings, Ute; Barbaresko, Janett; Boeing, Heiner; Stelmach-Mardas, Marta; Heuer, Thorsten; Laird, Eamon; Walton, Janette; Gasparini, Paolo; Robino, Antonietta; Castaño, Luis; Rojo-Martínez, Gemma; Merino, Jordi; Masana, Luis; Standl, Marie; Schulz, Holger; Biagi, Elena; Nurk, Eha; Matthys, Christophe; Gobbetti, Marco; de Angelis, Maria; Windler, Eberhard; Zyriax, Birgit-Christiane; Tafforeau, Jean; Pischon, Tobias
2018-02-01
Joint data analysis from multiple nutrition studies may improve the ability to answer complex questions regarding the role of nutritional status and diet in health and disease. The objective was to identify nutritional observational studies from partners participating in the European Nutritional Phenotype Assessment and Data Sharing Initiative (ENPADASI) Consortium, as well as minimal requirements for joint data analysis. A predefined template containing information on study design, exposure measurements (dietary intake, alcohol and tobacco consumption, physical activity, sedentary behavior, anthropometric measures, and sociodemographic and health status), main health-related outcomes, and laboratory measurements (traditional and omics biomarkers) was developed and circulated to those European research groups participating in the ENPADASI under the strategic research area of "diet-related chronic diseases." Information about raw data disposition and metadata sharing was requested. A set of minimal requirements was abstracted from the gathered information. Studies (12 cohort, 12 cross-sectional, and 2 case-control) were identified. Two studies recruited children only and the rest recruited adults. All studies included dietary intake data. Twenty studies collected blood samples. Data on traditional biomarkers were available for 20 studies, of which 17 measured lipoproteins, glucose, and insulin and 13 measured inflammatory biomarkers. Metabolomics, proteomics, and genomics or transcriptomics data were available in 5, 3, and 12 studies, respectively. Although the study authors were willing to share metadata, most refused, were hesitant, or had legal or ethical issues related to sharing raw data. Forty-one descriptors of minimal requirements for the study data were identified to facilitate data integration. 
Combining study data sets will enable sufficiently powered, refined investigations to increase the knowledge and understanding of the relation between food, nutrition, and human health. Furthermore, the minimal requirements for study data may encourage more efficient secondary usage of existing data and provide sufficient information for researchers to draft future multicenter research proposals in nutrition.
ERIC Educational Resources Information Center
Kagel, R. A.; Farwell, S. O.
1983-01-01
Background information, procedures, and results, are provided for an undergraduate experiment in which analgesic tablets are analyzed using liquid chromatography. The experiment, an improved, modified version of the Waters Associates Inc. experiment, is simple to prepare, requiring little glassware and minimal sample manipulation by students. (JN)
A technique for thermal desorption analyses suitable for thermally-labile, volatile compounds
USDA-ARS?s Scientific Manuscript database
Our group has for some time studied below ground plant produced volatile signals affecting nematode and insect behavior. The research requires repeated sampling of intact plant/soil systems in the lab as well as the field with the help of probes to minimize unwanted effects on the systems we are stu...
Toward high-resolution NMR spectroscopy of microscopic liquid samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, Mark C.; Mehta, Hardeep S.; Chen, Ying
A longstanding limitation of high-resolution NMR spectroscopy is the requirement for samples to have macroscopic dimensions. Commercial probes, for example, are designed for volumes of at least 5 mL, in spite of decades of work directed toward the goal of miniaturization. Progress in miniaturizing inductive detectors has been limited by a perceived need to meet two technical requirements: (1) minimal separation between the sample and the detector, which is essential for sensitivity, and (2) near-perfect magnetic-field homogeneity at the sample, which is typically needed for spectral resolution. The first of these requirements is real, but the second can be relaxed, as we demonstrate here. By using pulse sequences that yield high-resolution spectra in an inhomogeneous field, we eliminate the need for near-perfect field homogeneity and the accompanying requirement for susceptibility matching of microfabricated detector components. With this requirement removed, typical imperfections in microfabricated components can be tolerated, and detector dimensions can be matched to those of the sample, even for samples of volume much less than 5 µL. Pulse sequences that are robust to field inhomogeneity thus enable small-volume detection with optimal sensitivity. We illustrate the potential of this approach to miniaturization by presenting spectra acquired with a flat-wire detector that can easily be scaled to subnanoliter volumes. In particular, we report high-resolution NMR spectroscopy of an alanine sample of volume 500 pL.
Sampling Mars: Analytical requirements and work to do in advance
NASA Technical Reports Server (NTRS)
Koeberl, Christian
1988-01-01
Sending a mission to Mars to collect samples and return them to the Earth for analysis is without doubt one of the most exciting and important tasks for planetary science in the near future. Many scientifically important questions are associated with knowledge of the composition and structure of Martian samples. Among the most exciting is clarification of the SNC problem: proving or disproving a possible Martian origin of these meteorites. Since SNC meteorites have been used to infer the chemistry of the planet Mars and its evolution (including its accretion history), it would be important to know whether the whole story is true. But before addressing possible scientific results, we have to deal with the analytical requirements and with possible pre-return work. It is unrealistic to expect that a Mars sample return mission will bring back anything close to the amount returned by the Apollo missions. It will be more like the amount returned by the Luna missions, or at least of that order of magnitude. This requires very careful sample selection and very precise analytical techniques, which should use minimal sample sizes while optimizing the scientific output. The ability to work with extremely small samples should not obscure another problem: possible sampling errors. As we know from terrestrial geochemical studies, sampling procedures must be complicated and elaborate to avoid sampling errors. The significance of analyzing a milligram- or submilligram-sized sample and relating it to the genesis of whole planetary crusts has to be viewed with care. This leaves a dilemma: on one hand, minimize the sample size as far as possible in order to return as many different samples as possible; on the other hand, take samples large enough to be representative.
Whole-rock samples are very useful but should not exceed the 20 to 50 g range, except in cases of extreme inhomogeneity, because for larger samples the information tends to become redundant. Soil samples should be in the 2 to 10 g range, permitting the splitting of the returned samples for studies in different laboratories with a variety of techniques.
Radioisotope dilution analyses of geological samples using 236U and 229Th
Rosholt, J.N.
1984-01-01
The use of 236U and 229Th in alpha spectrometric measurements has some advantages over the use of other tracers and measurement techniques in isotope dilution analyses of most geological samples. The advantages are: (1) these isotopes do not occur in terrestrial rocks, (2) they have negligible decay losses because of their long half lives, (3) they cause minimal recoil contamination to surface-barrier detectors, (4) they allow for simultaneous determination of the concentration and isotopic composition of uranium and thorium in a variety of sample types, and (5) they allow for simple and constant corrections for spectral interferences: 0.5% of the 238U activity is subtracted for the contribution of 235U in the 236U peak, and 1% of the 229Th activity is subtracted from the 230Th activity. Disadvantages in using 236U and 229Th are: (1) individual separates of uranium and thorium must be prepared as very thin sources for alpha spectrometry, (2) good resolution in the spectrometer system is required for thorium isotopic measurements, where measurement times may extend to 300 h, and (3) separate calibrations of the 236U and 229Th spike solution with both uranium and thorium standards are required. The use of these tracers in applications of uranium-series disequilibrium studies has simplified the measurements required for the determination of the isotopic composition of uranium and thorium because of the minimal corrections needed for alpha spectral interferences. ?? 1984.
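The constant spectral-interference corrections quoted in point (5) are simple arithmetic; a sketch, with arbitrary example activities (units and values are illustrative only):

```python
def corrected_activities(a238, a236_peak, a229, a230_peak):
    """Apply the constant corrections described in the abstract:
    subtract 0.5% of the 238U activity from the 236U tracer peak
    (235U overlap), and 1% of the 229Th tracer activity from the
    230Th peak."""
    a236 = a236_peak - 0.005 * a238
    a230 = a230_peak - 0.010 * a229
    return a236, a230

# Example with arbitrary activity values:
a236, a230 = corrected_activities(a238=100.0, a236_peak=50.0,
                                  a229=40.0, a230_peak=30.0)
# 50.0 - 0.5 = 49.5 for 236U; 30.0 - 0.4 = 29.6 for 230Th
print(a236, a230)
```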
Sampling strategies for estimating acute and chronic exposures of pesticides in streams
Crawford, Charles G.
2004-01-01
The Food Quality Protection Act of 1996 requires that human exposure to pesticides through drinking water be considered when establishing pesticide tolerances in food. Several systematic and seasonally weighted systematic sampling strategies for estimating pesticide concentrations in surface water were evaluated through Monte Carlo simulation, using intensive datasets from four sites in northwestern Ohio. The number of samples for the strategies ranged from 4 to 120 per year. Sampling strategies with a minimal sampling frequency outside the growing season can be used for estimating time-weighted mean and percentile concentrations of pesticides with little loss of accuracy and precision, compared to strategies with the same sampling frequency year round. Less frequent sampling strategies can be used at large sites. A sampling frequency of 10 times monthly during the pesticide runoff period at a 90 km² basin and four times monthly at a 16,400 km² basin provided estimates of the time-weighted mean, 90th, 95th, and 99th percentile concentrations that fell within 50 percent of the true value virtually all of the time. By taking into account basin size and the periodic nature of pesticide runoff, the costs of obtaining estimates of time-weighted mean and percentile pesticide concentrations can be minimized.
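A minimal Monte Carlo sketch of this kind of evaluation, assuming a synthetic daily concentration record with a seasonal runoff pulse (the actual Ohio datasets and the paper's strategy details are not reproduced here):

```python
import math
import random
import statistics

def simulate_max_rel_error(daily, every_n_days, trials=500, seed=1):
    """Systematic sampling: take every n-th daily value from a random start,
    estimate the mean, and record the worst relative error against the true
    mean over many random starts."""
    rng = random.Random(seed)
    true_mean = statistics.mean(daily)
    worst = 0.0
    for _ in range(trials):
        start = rng.randrange(every_n_days)
        est = statistics.mean(daily[start::every_n_days])
        worst = max(worst, abs(est - true_mean) / true_mean)
    return worst

# Synthetic year: low baseline with a 60-day runoff pulse.
daily = [0.05 + (2.0 * abs(math.sin(d / 10.0)) if 120 <= d < 180 else 0.0)
         for d in range(365)]

# Compare worst-case error of weekly versus monthly systematic sampling.
print(round(simulate_max_rel_error(daily, 7), 3),
      round(simulate_max_rel_error(daily, 30), 3))
```

Extending the error metric from the mean to upper percentiles (90th, 95th, 99th) follows the same pattern with a percentile estimator in place of `statistics.mean`.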
Hess, Allison E.; Potter, Kelsey A.; Tyler, Dustin J.; Zorman, Christian A.; Capadona, Jeffrey R.
2013-01-01
Implantable microdevices are gaining significant attention for several biomedical applications [1-4]. Such devices have been made from a range of materials, each offering its own advantages and shortcomings [5,6]. Most prominently, due to the microscale device dimensions, a high modulus is required to facilitate implantation into living tissue. Conversely, the stiffness of the device should match the surrounding tissue to minimize induced local strain [7-9]. Therefore, we recently developed a new class of bio-inspired materials to meet these requirements by responding to environmental stimuli with a change in mechanical properties [10-14]. Specifically, our poly(vinyl acetate)-based nanocomposite (PVAc-NC) displays a reduction in stiffness when exposed to water and elevated temperatures (e.g. body temperature). Unfortunately, few methods exist to quantify the stiffness of materials in vivo [15], and mechanical testing outside of the physiological environment often requires large samples inappropriate for implantation. Further, stimuli-responsive materials may quickly recover their initial stiffness after explantation. Therefore, we have developed a method by which the mechanical properties of implanted microsamples can be measured ex vivo, with simulated physiological conditions maintained using moisture and temperature control [13,16,17]. To this end, a custom microtensile tester was designed to accommodate microscale samples [13,17] with widely-varying Young's moduli (range of 10 MPa to 5 GPa). As our interests are in the application of PVAc-NC as a biologically-adaptable neural probe substrate, a tool capable of mechanical characterization of samples at the microscale was necessary. This tool was adapted to provide humidity and temperature control, which minimized sample drying and cooling [17]. As a result, the mechanical characteristics of the explanted sample closely reflect those of the sample just prior to explantation.
The overall goal of this method is to quantitatively assess the in vivo mechanical properties, specifically the Young's modulus, of stimuli-responsive, mechanically-adaptive polymer-based materials. This is accomplished by first establishing the environmental conditions that will minimize a change in sample mechanical properties after explantation without contributing to a reduction in stiffness independent of that resulting from implantation. Samples are then prepared for implantation, handling, and testing (Figure 1A). Each sample is implanted into the cerebral cortex of rats, which is represented here as an explanted rat brain, for a specified duration (Figure 1B). At this point, the sample is explanted and immediately loaded into the microtensile tester, and then subjected to tensile testing (Figure 1C). Subsequent data analysis provides insight into the mechanical behavior of these innovative materials in the environment of the cerebral cortex. PMID:23995288
NASA Astrophysics Data System (ADS)
Sun, Chengjun; Jiang, Fenghua; Gao, Wei; Li, Xiaoyun; Yu, Yanzhen; Yin, Xiaofei; Wang, Yong; Ding, Haibing
2017-01-01
Detection of sulfur-oxidizing bacteria has largely been dependent on targeted gene sequencing technology or traditional cell cultivation, which usually takes from days to months to carry out. This clearly does not meet the requirements of analysis for time-sensitive samples and/or complicated environmental samples. Since energy-dispersive X-ray spectrometry (EDS) can be used to simultaneously detect multiple elements in a sample, including sulfur, with minimal sample treatment, this technology was applied to detect sulfur-oxidizing bacteria using their high sulfur content within the cell. This article describes the application of scanning electron microscopy imaging coupled with EDS mapping for quick detection of sulfur oxidizers in contaminated environmental water samples, with minimal sample handling. Scanning electron microscopy imaging revealed the existence of dense granules within the bacterial cells, while EDS identified large amounts of sulfur within them. EDS mapping localized the sulfur to these granules. Subsequent 16S rRNA gene sequencing showed that the bacteria detected in our samples belonged to the genus Chromatium, which are sulfur oxidizers. Thus, EDS mapping made it possible to identify sulfur oxidizers in environmental samples based on localized sulfur within their cells, within a short time (within 24 h of sampling). This technique has wide ranging applications for detection of sulfur bacteria in environmental water samples.
Theory of sampling: four critical success factors before analysis.
Wagner, Claas; Esbensen, Kim H
2015-01-01
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
Behavioural cues of reproductive status in seahorses Hippocampus abdominalis.
Whittington, C M; Musolf, K; Sommer, S; Wilson, A B
2013-07-01
A method is described to assess the reproductive status of male Hippocampus abdominalis on the basis of behavioural traits. The non-invasive nature of this technique minimizes handling stress and reduces sampling requirements for experimental work. It represents a useful tool to assist researchers in sample collection for studies of reproduction and development in viviparous syngnathids, which are emerging as important model species. © 2013 The Authors. Journal of Fish Biology © 2013 The Fisheries Society of the British Isles.
Neuromuscular dose-response studies: determining sample size.
Kopman, A F; Lien, C A; Naguib, M
2011-02-01
Investigators planning dose-response studies of neuromuscular blockers have rarely used a priori power analysis to determine the minimal sample size their protocols require. Institutional Review Boards and peer-reviewed journals now generally ask for this information. This study outlines a proposed method for meeting these requirements. The slopes of the dose-response relationships of eight neuromuscular blocking agents were determined using regression analysis. These values were substituted for γ in the Hill equation. When this is done, the coefficient of variation (COV) around the mean value of the ED₅₀ for each drug is easily calculated. Using these values, we performed an a priori one-sample two-tailed t-test of the means to determine the required sample size when the allowable error in the ED₅₀ was varied from ±10% to ±20%. The COV averaged 22% (range 15-27%). We used a COV value of 25% in determining the sample size. If the allowable error in finding the mean ED₅₀ is ±15%, a sample size of 24 is needed to achieve a power of 80%. Increasing 'accuracy' beyond this point requires increasingly greater sample sizes (e.g. an 'n' of 37 for a ±12% error). On the basis of the results of this retrospective analysis, a total sample size of not less than 24 subjects should be adequate for determining a neuromuscular blocking drug's clinical potency with a reasonable degree of assurance.
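The power calculation can be sketched with a normal approximation. This is not the study's exact t-based procedure: the small-sample t correction raises the result slightly (the 22 printed below rises to roughly the reported 24).

```python
from math import ceil
from statistics import NormalDist

def sample_size(cov, rel_error, alpha=0.05, power=0.80):
    """Normal-approximation n for estimating a mean ED50 to within a given
    relative error, for a two-tailed test at level alpha with the stated
    power, when the coefficient of variation is cov:
        n = ((z_{1-alpha/2} + z_{power}) * cov / rel_error) ** 2
    """
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)  # two-tailed critical value
    zb = z.inv_cdf(power)          # power term
    return ceil(((za + zb) * cov / rel_error) ** 2)

print(sample_size(0.25, 0.15))  # -> 22 (normal approximation; t correction gives ~24)
```

Tightening the allowable error to ±12% drives the normal-approximation n to 35, consistent with the steep growth the abstract describes.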
Bartram, Jack; Mountjoy, Edward; Brooks, Tony; Hancock, Jeremy; Williamson, Helen; Wright, Gary; Moppett, John; Goulden, Nick; Hubank, Mike
2016-07-01
High-throughput sequencing (HTS) (next-generation sequencing) of the rearranged Ig and T-cell receptor genes promises to be less expensive and more sensitive than current methods of monitoring minimal residual disease (MRD) in patients with acute lymphoblastic leukemia. However, the adoption of new approaches by clinical laboratories requires careful evaluation of all potential sources of error and the development of strategies to ensure the highest accuracy. Timely and efficient clinical use of HTS platforms will depend on combining multiple samples (multiplexing) in each sequencing run. Here we examine HTS of the Ig heavy-chain gene on the Illumina MiSeq platform for MRD detection. We identify errors associated with multiplexing that could potentially impact the accuracy of MRD analysis. We optimize a strategy that combines high-purity, sequence-optimized oligonucleotides, dual indexing, and an error-aware demultiplexing approach to minimize errors and maximize sensitivity. We present a probability-based demultiplexing pipeline, Error-Aware Demultiplexer, that is suitable for all MiSeq strategies and accurately assigns samples to the correct identifier without excessive loss of data. Finally, using controls quantified by digital PCR, we show that HTS-MRD can accurately detect as few as 1 in 10⁶ copies of specific leukemic MRD. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
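The core idea of error-aware dual-index assignment can be illustrated with a toy demultiplexer. This is a simplified sketch, not the published Error-Aware Demultiplexer pipeline; the sample names and index sequences are made up.

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def demultiplex(read_index_pair, samples, max_mismatch=1):
    """Assign a read to a sample only if BOTH its i7 and i5 indexes are
    within max_mismatch of exactly one sample's expected pair; otherwise
    reject the read rather than risk a cross-sample assignment (the kind
    of error that would corrupt MRD quantification)."""
    i7, i5 = read_index_pair
    hits = [name for name, (e7, e5) in samples.items()
            if hamming(i7, e7) <= max_mismatch and hamming(i5, e5) <= max_mismatch]
    return hits[0] if len(hits) == 1 else None

samples = {"pt1": ("ACGTACGT", "TTGCAACC"),
           "pt2": ("GGTTCCAA", "CAGTCAGT")}
print(demultiplex(("ACGTACGA", "TTGCAACC"), samples))  # -> pt1 (one mismatch tolerated)
```

A probability-based pipeline replaces the hard mismatch cutoff with per-base quality-weighted likelihoods, but the reject-when-ambiguous behavior is the same.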
Analysis of the Special Studies Program Based on the Interviews of Its Students.
ERIC Educational Resources Information Center
Esp, Barbarann; Torelli, Alexis
The special studies program at Hofstra University is designed for high school graduates applying to the university whose educational backgrounds require a more personalized approach to introductory college work. An attempt is made to minimize the risk of poor academic performance during the first year in college. A random sample of 24 students in…
A compact, fast ozone UV photometer and sampling inlet for research aircraft
NASA Astrophysics Data System (ADS)
Gao, R. S.; Ballard, J.; Watts, L. A.; Thornberry, T. D.; Ciciora, S. J.; McLaughlin, R. J.; Fahey, D. W.
2012-05-01
In situ measurements of atmospheric ozone (O3) are performed routinely from many research aircraft platforms. The most common technique depends on the strong absorption of ultraviolet (UV) light by ozone. As atmospheric science advances to the widespread use of unmanned aircraft systems (UASs), there is an increasing requirement for minimizing instrument space, weight, and power while maintaining instrument accuracy, precision and time response. The design and use of a new, dual-beam, polarized, UV photometer instrument for in situ O3 measurements is described. The instrument has a fast sampling rate (2 Hz), high accuracy (3%), and precision (1.1 × 10¹⁰ O3 molecules cm⁻³). The size (36 L), weight (18 kg), and power (50-200 W) make the instrument suitable for many UAS and other airborne platforms. Inlet and exhaust configurations are also described for ambient sampling in the troposphere and lower stratosphere (1000-50 mb) that optimize the sample flow rate to increase time response while minimizing loss of precision due to induced turbulence in the sample cell. In-flight and laboratory intercomparisons with existing O3 instruments show that measurement accuracy is maintained in flight.
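The dual-beam UV absorption measurement rests on the Beer-Lambert law; a sketch of the retrieval, where the cross-section value is approximate and the 30 cm cell length is hypothetical (not this instrument's actual geometry):

```python
import math

# Approximate O3 absorption cross section at 253.7 nm, cm^2 per molecule.
SIGMA_O3 = 1.147e-17

def o3_number_density(i_sample, i_reference, path_cm):
    """Beer-Lambert retrieval: I_sample = I_reference * exp(-sigma * n * L),
    so n = ln(I_reference / I_sample) / (sigma * L), in molecules cm^-3.
    i_reference is the ozone-scrubbed beam; i_sample passes through ambient air."""
    return math.log(i_reference / i_sample) / (SIGMA_O3 * path_cm)

# Round-trip check with a hypothetical 30 cm cell:
n_true = 2.5e12
i_s = math.exp(-SIGMA_O3 * n_true * 30.0)
print(abs(o3_number_density(i_s, 1.0, 30.0) - n_true) / n_true < 1e-9)  # -> True
```

In a dual-beam photometer, measuring both beams simultaneously cancels lamp-intensity drift, which is what enables a fast sampling rate without sacrificing precision.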
Dayre McNally, J; Matheson, Loren A; Sankaran, Koravangattu; Rosenberg, Alan M
2008-11-01
This study compared 25-hydroxyvitamin D [25(OH)D] measurements in capillary and venous blood samples collected by fingerprick and venipuncture, respectively. Capillary blood for measuring 25(OH)D has potential advantages: it reduces the blood volume required (2 mL versus 0.3 mL for venipuncture and capillary sampling, respectively), facilitates blood collection in populations in whom venipuncture is difficult (e.g. infants and children), improves patient convenience, and reduces costs associated with phlebotomy. The results demonstrated a highly significant relationship between 25(OH)D levels in serum derived from venous and capillary blood samples (r² = 0.901). Despite statistically higher 25(OH)D levels in fingerprick samples (108 ± 9 nmol/L) compared with venipuncture samples (90 ± 7 nmol/L), the correlation between venous and capillary samples supports this approach as a practical alternative to venipuncture for vitamin D determination. However, clinical application may require the incorporation of a correction factor for the assessment of insufficiency, and research studies should avoid using the two methods interchangeably. Studying vitamin D's role in health and disease requires collection techniques and measurement methods that are reliable, reproducible, easily accessible, inexpensive and minimally burdensome to the patient. The option to collect patient samples by fingerprick may facilitate the collection process.
Cleaning Genesis Sample Return Canister for Flight: Lessons for Planetary Sample Return
NASA Technical Reports Server (NTRS)
Allton, J. H.; Hittle, J. D.; Mickelson, E. T.; Stansbery, Eileen K.
2016-01-01
Sample return missions require chemical contamination to be minimized and potential sources of contamination to be documented and preserved for future use. Genesis focused on and successfully accomplished the following: - Early involvement provided input to mission design: a) cleanable materials and cleanable design; b) mission operation parameters to minimize contamination during flight. - Established contamination control authority at a high level and developed knowledge of and respect for contamination control across all institutions at the working level. - Provided state-of-the-art spacecraft assembly cleanroom facilities for science canister assembly and function testing. Both particulate and airborne molecular contamination were minimized. - Using ultrapure water, cleaned spacecraft components to a very high level. Stainless steel components were cleaned to carbon monolayer levels (10¹⁵ carbon atoms per square centimeter). - Established a long-term curation facility. Lessons learned and areas for improvement include: - Bare aluminum is not a cleanable surface and should not be used for components requiring extreme levels of cleanliness. The problem is formation of oxides during rigorous cleaning. - Representative coupons of relevant spacecraft components (cut from the same block at the same time with identical surface finish and cleaning history) should be acquired, documented and preserved. Genesis experience suggests that creation of these coupons would be facilitated by specification on the engineering component drawings. - Component handling history is critical for interpretation of analytical results on returned samples. This set of relevant documents is not the same as typical documentation for one-way missions and does include data from several institutions, which need to be unified. Dedicated resources need to be provided for acquiring and archiving appropriate documents in one location with easy access for decades.
- Dedicated, knowledgeable contamination control oversight should be provided at sites of fabrication and integration. Numerous excellent Genesis chemists and analytical facilities participated in the contamination oversight; however, additional oversight at fabrication sites would have been helpful.
da Cunha Santos, G; Saieg, M A; Troncone, G; Zeppa, P
2018-04-01
Minimally invasive procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) must yield not only good quality and quantity of material for morphological assessment, but also an adequate sample for analysis of molecular markers to guide patients to appropriate targeted therapies. In this context, cytopathologists worldwide should be familiar with minimum requirements for referring cytological samples for testing. The present manuscript is a review with a comprehensive description of the content of the workshop entitled Cytological preparations for molecular analysis: pre-analytical issues for EBUS TBNA, presented at the 40th European Congress of Cytopathology in Liverpool, UK. The present review emphasises the advantages and limitations of different types of cytology substrates used for molecular analysis, such as archival smears, liquid-based preparations, archival cytospin preparations and FTA (Flinders Technology Associates) cards, as well as their technical requirements/features. These various types of cytological specimens can be successfully used for an extensive array of molecular studies, but the quality and quantity of extracted nucleic acids rely directly on adequate pre-analytical assessment of those samples. In this setting, cytopathologists must not only be familiar with the different types of specimens and associated technical procedures, but also correctly handle the material provided by minimally invasive procedures, ensuring that there is a sufficient amount of material for a precise diagnosis and correct management of the patient through personalised care. © 2018 John Wiley & Sons Ltd.
Keskin, Uğur; Karasahin, Kazim Emre; Ulubay, Mustafa; Fidan, Ulaş; Gungor, Sadettin; Ergun, Ali
2015-11-01
Intrauterine fetal transfusion requires extensive experience, excellent eye-hand coordination, good equipment, and an experienced team to achieve success. While the needle is in the umbilical vein, an assistant withdraws and/or transfuses blood. The needle point should be kept still to prevent lacerations and dislodging. We propose a simple set for intrauterine fetal blood transfusion, constructed from materials readily available in every clinic, that minimizes needle tip movement during syringe attachments and withdrawals over the course of the transfusion. This makes it possible to withdraw a fetal blood sample and to transfuse blood with minimal intervention.
Progress in Developing Transfer Functions for Surface Scanning Eddy Current Inspections
NASA Astrophysics Data System (ADS)
Shearer, J.; Heebl, J.; Brausch, J.; Lindgren, E.
2009-03-01
As US Air Force (USAF) aircraft continue to age, additional inspections are required for structural components. The validation of new inspections typically requires a capability demonstration of the method using representative structure with representative damage. To minimize the time and cost required to prepare such samples, electric discharge machined (EDM) notches are commonly used to represent fatigue cracks in validation studies. However, the sensitivity to damage typically changes as a function of damage type. This requires a mathematical relationship to be developed between the responses from the two different flaw types to enable the use of EDM-notched samples to validate new inspections. This paper reviews progress in developing transfer functions for surface scanning eddy current inspections of aluminum and titanium alloys found in structural aircraft components. Multiple samples with well-characterized grown fatigue cracks and master gages with EDM notches, both with a range of flaw sizes, were used to collect flaw signals with USAF field inspection equipment. Analysis of this empirical data was used to develop a transfer function between the responses from the EDM notches and grown fatigue cracks.
Variability of Photodynamic Killing in Escherichia coli and Avoidance of Variability with Agar
O'Bryan, Corliss; Harrison, Arthur P.
1971-01-01
Photodynamic killing of Escherichia coli in acridine orange is influenced by the composition of the containing vessel, and after high kill the variance between replicate suspensions is greater than attributable solely to sampling and plating. Addition of agar minimizes both phenomena, but a higher illumination dose is required to produce the same degree of killing. PMID:4934057
Intact preservation of environmental samples by freezing under an alternating magnetic field.
Morono, Yuki; Terada, Takeshi; Yamamoto, Yuhji; Xiao, Nan; Hirose, Takehiro; Sugeno, Masaya; Ohwada, Norio; Inagaki, Fumio
2015-04-01
The study of environmental samples requires a preservation system that stabilizes the sample structure, including cells and biomolecules. To address this fundamental issue, we tested the cell alive system (CAS)-freezing technique for subseafloor sediment core samples. In the CAS-freezing technique, an alternating magnetic field is applied during the freezing process to produce vibration of water molecules and achieve a stable, super-cooled liquid phase. Upon further cooling, the temperature decreases further, achieving uniform freezing of the sample with minimal ice crystal formation. In this study, samples were preserved using the CAS and conventional freezing techniques at 4, -20, -80 and -196 (liquid nitrogen) °C. After 6 months of storage, microbial cell counts in conventionally frozen samples decreased significantly (down to 10.7% of the initial count), whereas the decrease with CAS-freezing was minimal. When Escherichia coli cells were tested under the same freezing conditions and stored for 2.5 months, CAS-frozen E. coli cells showed higher viability than those frozen under the other conditions. In addition, the alternating magnetic field does not affect the direction of remanent magnetization in sediment core samples, although slight partial demagnetization in intensity due to freezing was observed. Consequently, our data indicate that the CAS technique is highly useful for the preservation of environmental samples. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.
McDade, Thomas W; Williams, Sharon; Snodgrass, J Josh
2007-11-01
Logistical constraints associated with the collection and analysis of biological samples in community-based settings have been a significant impediment to integrative, multilevel bio-demographic and biobehavioral research. However, recent methodological developments have overcome many of these constraints and have also expanded the options for incorporating biomarkers into population-based health research in international as well as domestic contexts. In particular, using dried blood spot (DBS) samples (drops of whole blood collected on filter paper from a simple finger prick) provides a minimally invasive method for collecting blood samples in nonclinical settings. After a brief discussion of biomarkers more generally, we review procedures for collecting, handling, and analyzing DBS samples. Advantages of using DBS samples compared with venipuncture include the relative ease and low cost of sample collection, transport, and storage. Disadvantages include requirements for assay development and validation, as well as the relatively small volumes of sample. We present the results of a comprehensive literature review of published protocols for analysis of DBS samples, and we provide more detailed analysis of protocols for 45 analytes likely to be of particular relevance to population-level health research. Our objective is to provide investigators with the information they need to make informed decisions regarding the appropriateness of blood spot methods for their research interests.
A decomposition approach to the design of a multiferroic memory bit
NASA Astrophysics Data System (ADS)
Acevedo, Ruben; Liang, Cheng-Yen; Carman, Gregory P.; Sepulveda, Abdon E.
2017-06-01
The objective of this paper is to present a methodology for the design of a memory bit that minimizes the energy required to write data at the bit level. When a ferromagnetic nickel nano-dot is strained by means of a piezoelectric substrate, its magnetization vector rotates between two stable states, defined as 1 and 0 for digital memory. The memory bit geometry, actuation mechanism and voltage control law were used as design variables. The approach used was to decompose the overall design process into simpler sub-problems whose structure can be exploited for a more efficient solution. This method minimizes the number of fully dynamic coupled finite element analyses required to converge to a near-optimal design, thus decreasing the computational time for the design process. An in-plane sample design problem is presented to illustrate the advantages and flexibility of the procedure.
[Diagnosis of primary hyperlipoproteinemia in umbilical cord blood (author's transl)].
Parwaresch, M R; Radzun, H J; Mäder, C
1977-10-01
The aim of the present investigation was to assay the frequency of primary dyslipoproteinemia in a random sample of one hundred newborns and to describe the minimal methodological requirements for sound diagnosis. After comparison of different methods, total lipids were determined by gravimetry, cholesterol and triglycerides by enzymatic methods, and nonesterified fatty acids by direct colorimetry; phospholipids were estimated indirectly. All measurements were applied to umbilical cord sera and to lipoprotein fractions separated by selective precipitation. The diagnosis of hyperlipoproteinemia type IV, the most frequent type in adults, is beset with pitfalls in the postnatal period. A primary hyper-alpha-lipoproteinemia occurred in one case and type II hyperlipoproteinemia in two cases, one of the parents being involved in each case. For mass screening, triglycerides should be assayed in serum and cholesterol in the precipitated and resolubilized LDL fraction, for which the minimal requirements are described.
NASA Astrophysics Data System (ADS)
Carter, Jeffrey R.; Simon, Wayne E.
1990-08-01
Neural networks are trained using Recursive Error Minimization (REM) equations to perform statistical classification. Using REM equations with continuous input variables reduces the required number of training experiences by one to two orders of magnitude compared with standard back propagation. Replacing the continuous input variables with discrete binary representations reduces the number of connections by a factor proportional to the number of variables, reducing the required number of experiences by another order of magnitude. Undesirable effects of using recurrent experience to train neural networks for statistical classification problems are demonstrated, and nonrecurrent experience is used to avoid them. 1. THE 1-4I PROBLEM The statistical classification problem which we address is that of assigning points in d-dimensional space to one of two classes. The first class has a covariance matrix of I (the identity matrix); the covariance matrix of the second class is 4I. For this reason the problem is known as the 1-4I problem. Both classes have equal probability of occurrence, and samples from both classes may appear anywhere throughout the d-dimensional space. Most samples near the origin of the coordinate system will be from the first class, while most samples away from the origin will be from the second class. Since the two classes completely overlap, it is impossible to have a classifier with zero error. The minimum possible error is known as the Bayes error.
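For the 1-4I setup above, the optimal rule can be written in closed form: with equal priors, comparing the two Gaussian densities reduces to thresholding the squared distance from the origin at r² = (8/3)·d·ln 2. A minimal Monte Carlo sketch of the resulting Bayes error (illustrative only; the function name and sample counts are ours, not the paper's REM network):

```python
import numpy as np

def bayes_error_1_4I(d, n=200_000, seed=0):
    """Monte Carlo estimate of the Bayes error for the 1-4I problem:
    class 1 ~ N(0, I), class 2 ~ N(0, 4I), equal priors. Comparing the two
    densities reduces the optimal rule to a squared-radius threshold."""
    rng = np.random.default_rng(seed)
    thresh = (8.0 / 3.0) * d * np.log(2.0)   # r^2 cutoff from the likelihood ratio
    x1 = rng.standard_normal((n, d))         # class 1: covariance I
    x2 = 2.0 * rng.standard_normal((n, d))   # class 2: std 2 per axis -> covariance 4I
    r1 = np.sum(x1 ** 2, axis=1)
    r2 = np.sum(x2 ** 2, axis=1)
    # class 1 errs outside the cutoff, class 2 errs inside it
    return 0.5 * np.mean(r1 > thresh) + 0.5 * np.mean(r2 <= thresh)
```

For d = 2 the estimate comes out near 0.26, and it shrinks as d grows, consistent with the overlapping classes becoming easier to separate in higher dimensions.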
Sensitive microplate assay for the detection of proteolytic enzymes using radiolabeled gelatin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, B.D.; Kwan-Lim, G.E.; Maizels, R.M.
1988-07-01
A sensitive microplate assay is described for the detection of a wide range of proteolytic enzymes, using radio-iodine-labeled gelatin as substrate. The technique uses the Bolton-Hunter reagent to label the substrate, which is then coated onto the wells of polyvinyl chloride microtiter plates. By measuring the radioactivity released, the assay is able to detect elastase, trypsin, and collagenase at concentrations of 1 ng/ml or less, while the microtiter format permits multiple-sample handling and minimizes the sample volumes required for analysis.
Magic Angle Spinning NMR Metabolomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhi Hu, Jian
Nuclear Magnetic Resonance (NMR) spectroscopy is a non-destructive, quantitative, reproducible, untargeted and unbiased method that requires no or minimal sample preparation, and is one of the leading analytical tools for metabonomics research [1-3]. The easy quantification and the absence of any requirement for prior knowledge about the compounds present in a sample make NMR advantageous over other techniques [1,4]. 1H NMR is especially attractive because protons are present in virtually all metabolites and its NMR sensitivity is high, enabling the simultaneous identification and monitoring of a wide range of low molecular weight metabolites.
Passive ultrasonics using sub-Nyquist sampling of high-frequency thermal-mechanical noise.
Sabra, Karim G; Romberg, Justin; Lani, Shane; Degertekin, F Levent
2014-06-01
Monolithic integration of capacitive micromachined ultrasonic transducer arrays with low-noise complementary metal oxide semiconductor electronics minimizes interconnect parasitics, thus allowing the measurement of thermal-mechanical (TM) noise. This enables passive ultrasonics based on cross-correlations of diffuse TM noise to extract coherent ultrasonic waves propagating between receivers. However, synchronous recording of high-frequency TM noise puts stringent requirements on the analog-to-digital converter's sampling rate. To alleviate this restriction, high-frequency TM noise cross-correlations (12-25 MHz) were instead estimated using compressed measurements of TM noise, which could be digitized at a sampling frequency lower than the Nyquist frequency.
NASA Astrophysics Data System (ADS)
Lüpke, Felix; Cuma, David; Korte, Stefan; Cherepanov, Vasily; Voigtländer, Bert
2018-02-01
We present a four-point probe resistance measurement technique which uses four equivalent current-measuring units, resulting in minimal hardware requirements and correspondingly few sources of noise. Local sample potentials are measured by a software feedback loop which adjusts the corresponding tip voltage such that no current flows to the sample. The resulting tip voltage is then equivalent to the sample potential at the tip position. We implement this measurement method in a multi-tip scanning tunneling microscope setup such that potentials can also be measured in tunneling contact, allowing, in principle, truly non-invasive four-probe measurements. The resulting measurement capabilities are demonstrated.
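The software feedback loop described above, which nulls the tip current and reads off the sample potential, can be sketched as a simple root-finding loop. The function and the linear junction model below are illustrative assumptions, not the authors' implementation:

```python
def null_current_potential(measure_current, v_lo, v_hi, tol=1e-12, max_iter=60):
    """Software feedback loop (bisection sketch): step the tip voltage until
    essentially no current flows to the sample; the final voltage then equals
    the local sample potential at the tip position. `measure_current` stands
    in for the hardware current readout and must change sign on [v_lo, v_hi]."""
    i_lo = measure_current(v_lo)
    v_mid = 0.5 * (v_lo + v_hi)
    for _ in range(max_iter):
        v_mid = 0.5 * (v_lo + v_hi)
        i_mid = measure_current(v_mid)
        if abs(i_mid) < tol:
            break
        if (i_lo < 0) == (i_mid < 0):  # same sign: the null lies above v_mid
            v_lo, i_lo = v_mid, i_mid
        else:
            v_hi = v_mid
    return v_mid

# toy junction model: current proportional to (V_tip - V_sample)
v_sample = 0.137  # hypothetical local potential, in volts
current = lambda v: 2.5e-9 * (v - v_sample)
```

Calling `null_current_potential(current, -1.0, 1.0)` converges to within a fraction of a millivolt of the assumed 0.137 V potential; a real instrument would replace the toy model with the measured tip current.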
Temperature dependent BRDF facility
NASA Astrophysics Data System (ADS)
Airola, Marc B.; Brown, Andrea M.; Hahn, Daniel V.; Thomas, Michael E.; Congdon, Elizabeth A.; Mehoke, Douglas S.
2014-09-01
Applications involving space-based instrumentation and aerodynamically heated surfaces often require knowledge of the bi-directional reflectance distribution function (BRDF) of an exposed surface at high temperature. Addressing this need, the Johns Hopkins University Applied Physics Laboratory (JHU/APL) developed a BRDF facility that features a multiple-port vacuum chamber, multiple laser sources covering the spectral range from the longwave infrared to the ultraviolet, imaging pyrometry and laser-heated samples. Laser heating eliminates stray light that would otherwise be seen from a furnace and requires minimal sample support structure, allowing low thermal conduction loss to be obtained, which is especially important at high temperatures. The goal is to measure the BRDF of ceramic-coated surfaces at temperatures in excess of 1000°C in a low background environment. Most ceramic samples are near-blackbody emitters in the longwave infrared; thus, pyrometry using a LWIR camera can be very effective and accurate.
Determination of Aromatic Ring Number Using Multi-Channel Deep UV Native Fluorescence
NASA Technical Reports Server (NTRS)
Bhartia, R.; McDonald, G. D.; Salas, E.; Conrad, P.
2004-01-01
The in situ detection of organic material on an extraterrestrial surface requires both an effective means of searching a relatively large surface area or volume for possible organic carbon, and a more specific means of identifying and quantifying compounds in indicated samples. Fluorescence spectroscopy fits the first requirement well, as it can be carried out rapidly, with minimal or no physical contact with the sample, and with sensitivity unmatched by any other organic analytical technique. Aromatic organic compounds with known fluorescence signatures have been identified in several extraterrestrial samples, including carbonaceous chondrites, interplanetary dust particles, and Martian meteorites. The compound distributions vary among these sources, however, with clear differences in relative abundances by number of aromatic rings and by degree of alkylation. This relative abundance information, therefore, can be used to infer the source of organic material detected on a planetary surface.
Li, Chao; Yu, Jiaquan; Schehr, Jennifer; Berry, Scott M; Leal, Ticiana A; Lang, Joshua M; Beebe, David J
2018-05-23
The concept of high liquid repellency in multi-liquid-phase systems (e.g., aqueous droplets in an oil background) has been applied to areas of biomedical research to realize intrinsic advantages not available in single-liquid-phase systems. Such advantages have included minimizing analyte loss, facile manipulation of single-cell samples, elimination of biofouling, and ease of use regarding loading and retrieving of the sample. In this paper, we present generalized design rules for predicting the wettability of solid-liquid-liquid systems (especially for discrimination between exclusive liquid repellency (ELR) and finite liquid repellency) to extend the applications of ELR. We then apply ELR to two model systems with open microfluidic design in cell biology: (1) in situ underoil culture and combinatorial coculture of mammalian cells in order to demonstrate directed single-cell multiencapsulation with minimal waste of samples as compared to stochastic cell seeding and (2) isolation of a pure population of circulating tumor cells, which is required for certain downstream analyses including sequencing and gene expression profiling.
A system architecture for online data interpretation and reduction in fluorescence microscopy
NASA Astrophysics Data System (ADS)
Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer
2010-01-01
In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large number of samples during the experiment, thus greatly reducing the post-analysis phase that is common practice today. By utilizing data reduction algorithms, relevant information about the target cells is extracted from the data stream collected online and then used to adjust the experiment parameters in real time, allowing the system to react dynamically to changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution that meets the required real-time constraints, efficiently executes the underlying computer vision algorithms, and closes the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluous collected data is minimized, while at the same time the information entropy increases.
Extrapolation of rotating sound fields.
Carley, Michael
2018-03-01
A method is presented for the computation of the acoustic field around a tonal circular source, such as a rotor or propeller, based on an exact formulation which is valid in the near and far fields. The only input data required are the pressure field sampled on a cylindrical surface surrounding the source, with no requirement for acoustic velocity or pressure gradient information. The formulation is approximated with exponentially small errors and appears to require input data at a theoretically minimal number of points. The approach is tested numerically, with and without added noise, and demonstrates excellent performance, especially when compared to extrapolation using a far-field assumption.
Three Dimensional Projection Environment for Molecular Design and Surgical Simulation
2011-08-01
bypasses the cumbersome meshing process. The deformation model is comprised only of mass nodes, which are generated by sampling the object volume before... force should minimize the penetration volume, the haptic feedback force is derived directly. Additionally, a post-processing technique is developed to... render distinct physical tissue properties across different interaction areas. The proposed approach does not require any pre-processing and is
Preparation of positive blood cultures for direct MALDI-ToF MS identification.
Robinson, Andrew M; Ussher, James E
2016-08-01
MALDI-ToF MS can be used to identify microorganisms directly from blood cultures. This study compared two methods of sample preparation. Similar levels of genus- (91% vs 90%) and species-level identifications (79% vs 74%) were obtained with differential centrifugation and SDS methods. The SDS method is faster and requires minimal handling. Copyright © 2016 Elsevier B.V. All rights reserved.
Michelle A. Jusino; Daniel Lindner; John K. Cianchetti; Adam T. Grisé; Nicholas J. Brazee; Jeffrey R. Walters
2014-01-01
Relationships among cavity-nesting birds, trees, and wood decay fungi pose interesting management challenges and research questions in many systems. Ornithologists need to understand the relationships between cavity-nesting birds and fungi in order to understand the habitat requirements of these birds. Typically, researchers rely on fruiting body surveys to identify...
Minimally-invasive biomarker studies in eosinophilic esophagitis: a systematic review.
Hines, Brittany T; Rank, Matthew A; Wright, Benjamin L; Marks, Lisa A; Hagan, John B; Straumann, Alex; Greenhawt, Matthew; Dellon, Evan S
2018-05-10
Eosinophilic esophagitis (EoE) is a chronic, inflammatory disease of the esophagus which currently requires repeated endoscopic biopsies for diagnosis and monitoring as no reliable non-invasive markers have been identified. To identify promising minimally-invasive EoE biomarkers and remaining gaps in biomarker validation. We performed a systematic review of EMBASE, Ovid Medline, PubMed, and Web of Science from inception to June 6, 2017. Studies were included if subjects met the 2007 consensus criteria for EoE diagnosis, a minimally-invasive biomarker was assessed, and the study included at least 1 control for comparison. The search identified 2094 studies, with 234 reviewed at full text level, and 49 included in the analysis (20 adult, 19 pediatric, 7 pediatric and adult, and 3 not stated). The majority (26 of 49) were published after 2014. Thirty-five studies included normal controls, 9 analyzed atopic controls, and 29 compared samples from subjects with active and inactive EoE. Minimally-invasive biomarkers were obtained from peripheral blood (n=41 studies), sponge/string samples (3), oral/throat swab secretions (2), breath condensate (2), stool (2), and urine (2). The most commonly reported biomarkers were peripheral blood eosinophils (16), blood and string eosinophil granule proteins (14), and eosinophil surface or intracellular markers (12). EoE biomarkers distinguished active EoE from normal controls in 23 studies, atopic controls in 2 studies, and inactive EoE controls in 20 studies. Several promising minimally-invasive biomarkers for EoE have emerged; however, few are able to differentiate EoE from other atopic diseases. Copyright © 2018. Published by Elsevier Inc.
Smart Cup: A Minimally-Instrumented, Smartphone-Based Point-of-Care Molecular Diagnostic Device.
Liao, Shih-Chuan; Peng, Jing; Mauk, Michael G; Awasthi, Sita; Song, Jinzhao; Friedman, Harvey; Bau, Haim H; Liu, Changchun
2016-06-28
Nucleic acid amplification-based diagnostics offer rapid, sensitive, and specific means for detecting and monitoring the progression of infectious diseases. However, this method typically requires extensive sample preparation, expensive instruments, and trained personnel, all of which hinder its use in resource-limited settings, where many infectious diseases are endemic. Here, we report on a simple, inexpensive, minimally-instrumented, smart cup platform for rapid, quantitative molecular diagnostics of pathogens at the point of care. Our smart cup takes advantage of a water-triggered, exothermic chemical reaction to supply heat for the nucleic acid-based, isothermal amplification. The amplification temperature is regulated with a phase-change material (PCM). The PCM maintains the amplification reactor at a constant temperature, typically 60-65°C, when ambient temperatures range from 12 to 35°C. To eliminate the need for an optical detector and minimize cost, we use the smartphone's flashlight to excite the fluorescent dye and the phone camera to record real-time fluorescence emission during the amplification process. The smartphone can concurrently monitor multiple amplification reactors and analyze the recorded data. Our smart cup's utility was demonstrated by amplifying and quantifying herpes simplex virus type 2 (HSV-2) with a LAMP assay in our custom-made microfluidic diagnostic chip. We have consistently detected as few as 100 copies of HSV-2 viral DNA per sample. Our system does not require any lab facilities and is suitable for use at home, in the field, and in the clinic, as well as in resource-poor settings, where access to sophisticated laboratories is impractical, unaffordable, or nonexistent.
Schulze, H Georg; Turner, Robin F B
2014-01-01
Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., the standard deviation of the spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes, resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
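The core idea, flagging only points whose second derivatives are coincidentally extreme along both the spectral axis and the spatiotemporal (spectrum-to-spectrum) axis, can be sketched as follows. This is an illustrative reconstruction, not the published algorithm: the threshold scaling `k` and the neighbour-interpolation repair are our assumptions:

```python
import numpy as np

def despike(spectra, noise_std, k=6.0):
    """Flag points whose second differences along BOTH the spectral axis
    (columns) and the spatiotemporal axis (rows = successive spectra) are
    coincidentally large, then repair them by interpolating the spectral
    neighbours. A positive-going spike drives both second differences
    strongly negative at its centre."""
    d2_spec = np.zeros_like(spectra)
    d2_time = np.zeros_like(spectra)
    d2_spec[:, 1:-1] = spectra[:, :-2] - 2 * spectra[:, 1:-1] + spectra[:, 2:]
    d2_time[1:-1, :] = spectra[:-2, :] - 2 * spectra[1:-1, :] + spectra[2:, :]
    spikes = (d2_spec < -k * noise_std) & (d2_time < -k * noise_std)
    cleaned = spectra.copy()
    # flagged points are always interior (border second differences stay 0),
    # so the spectral neighbours used for the repair always exist
    for r, c in zip(*np.nonzero(spikes)):
        cleaned[r, c] = 0.5 * (spectra[r, c - 1] + spectra[r, c + 1])
    return cleaned, spikes
```

Requiring coincidence along both axes is what keeps genuine narrow Raman bands, which recur from spectrum to spectrum, from being mistaken for one-off cosmic ray hits.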
NASA Astrophysics Data System (ADS)
Kirkham, R.; Olsen, K.; Hayes, J. C.; Emer, D. F.
2013-12-01
Underground nuclear tests may be first detected by seismic or air samplers operated by the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization). After initial detection of a suspicious event, member nations may call for an On-Site Inspection (OSI) that, in part, will sample for localized releases of radioactive noble gases and particles. Although much of the commercially available equipment and methods used for surface and subsurface environmental sampling of gases can be used in an OSI scenario, on-site sampling conditions, required sampling volumes, and the establishment of background concentrations of noble gases require development of specialized methodologies. To facilitate development of sampling equipment and methodologies that address OSI sampling volume and detection objectives, and to collect information required for model development, a field test site was created at a former underground nuclear explosion (UNE) site located in welded volcanic tuff. A mixture of SF6, 127Xe and 37Ar was metered into 4400 m3 of air as it was injected into the top region of the UNE cavity. These tracers were expected to move towards the surface primarily in response to barometric pumping or through delayed cavity pressurization (accelerated transport to minimize source decay time). Sampling approaches compared during the field exercise included sampling at the soil surface, inside surface fractures, and at soil vapor extraction points at depths down to 2 m. The effectiveness of the various sampling approaches and the results of the tracer gas measurements will be presented.
Baranec, Christoph; Dekany, Richard
2008-10-01
We introduce a Shack-Hartmann wavefront sensor for adaptive optics that enables dynamic control of the spatial sampling of an incoming wavefront using a segmented-mirror microelectromechanical systems (MEMS) device. Unlike a conventional lenslet array, subapertures are defined by either segments or groups of segments of a mirror array, with the ability to change spatial pupil sampling arbitrarily by redefining the segment grouping. Control over the spatial sampling of the wavefront allows for the minimization of wavefront reconstruction error for different guide source intensities and different atmospheric conditions, which in turn maximizes an adaptive optics system's delivered Strehl ratio. Requirements for the MEMS devices needed in this Shack-Hartmann wavefront sensor are also presented.
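The key operation, redefining the segment grouping to change pupil sampling, amounts to pooling per-segment slope measurements into coarser subapertures. A toy sketch under the assumption of block-averaged slopes on a square segment grid (the function name and pooling rule are ours, not the sensor's actual reconstruction):

```python
import numpy as np

def regroup_subapertures(segment_slopes, group):
    """Pool per-segment wavefront slopes on an N x N segment grid into
    (N/group) x (N/group) subapertures by block averaging, mimicking a
    redefinition of the MEMS segment grouping. Coarser grouping trades
    spatial resolution for lower noise on faint guide sources.
    `group` must divide the grid size."""
    n = segment_slopes.shape[0]
    assert n % group == 0, "group size must divide the segment grid"
    view = segment_slopes.reshape(n // group, group, n // group, group)
    return view.mean(axis=(1, 3))
```

With `group=1` the full segment resolution is kept; increasing `group` averages more segments per subaperture, which is the trade-off the abstract describes between reconstruction error and guide source intensity.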
Characterizing the Mental Health Care of U.S. Cambodian Refugees.
Wong, Eunice C; Marshall, Grant N; Schell, Terry L; Berthold, S Megan; Hambarsoomians, Katrin
2015-09-01
This study examined U.S. Cambodian refugees' utilization of mental health services across provider types, levels of minimally adequate care, and mode of communication with providers. Face-to-face household interviews about mental health service use in the past 12 months were conducted as part of a study of a probability sample of Cambodian refugees. The analytic sample was restricted to the 227 respondents who met past 12-month criteria for posttraumatic stress disorder (PTSD), major depressive disorder, or both. Analyses were weighted to account for complex sampling design effects and for attrition. Fifty-two percent of Cambodian refugees who met diagnostic criteria obtained mental health services in the past 12 months. Of those who obtained care, 75% visited a psychiatrist and 56% a general medical provider. Only 7% had obtained care from other mental health specialty providers. Virtually all respondents who had seen a psychiatrist (100%) or a general medical doctor (97%) had been prescribed a psychotropic medication. Forty-five percent had received minimally adequate care. Most relied on interpreters to communicate with providers. Cambodian refugees' rates of mental health service utilization and minimally adequate care were comparable to those of individuals in the general U.S. population. Cambodian refugees obtained care almost entirely from psychiatrists and general medical doctors, and nearly all were receiving pharmacotherapy; these findings differ from rates seen in a nationally representative sample. Given this pattern of utilization, and the persistently high levels of PTSD and depression found among Cambodian refugees, treatment improvements may require identification of creative approaches to delivering more evidence-based psychotherapy.
ERIC Educational Resources Information Center
Anderson, Barry D.
Little is known about the costs of setting up and implementing legislated minimal competency testing (MCT). To estimate the financial obstacles which lie between the idea and its implementation, MCT requirements are viewed from two perspectives. The first, government regulation, views legislated minimal competency requirements as an attempt by the…
Automated blood-sample handling in the clinical laboratory.
Godolphin, W; Bodtker, K; Uyeno, D; Goh, L O
1990-09-01
The only significant advances in blood-taking in 25 years have been the disposable needle and evacuated blood-drawing tube. With the exception of a few isolated barcode experiments, most sample-tracking is performed through handwritten or computer-printed labels. Attempts to reduce the hazards of centrifugation have resulted in air-tight lids or chambers, the use of which is time-consuming and cumbersome. Most commonly used clinical analyzers require serum or plasma, distributed into specialized containers, unique to that analyzer. Aliquots for different tests are prepared by handpouring or pipetting. Moderate to large clinical laboratories perform so many different tests that even multi-analyzers performing multiple analyses on a single sample may account for only a portion of all tests ordered for a patient. Thus several aliquots of each specimen are usually required. We have developed a proprietary serial centrifuge and blood-collection tube suitable for incorporation into an automated or robotic sample-handling system. The system we propose is (a) safe--avoids or prevents biological danger to the many "handlers" of blood; (b) small--minimizes the amount of sample taken and space required to adapt to the needs of satellite and mobile testing, and direct interfacing with analyzers; (c) serial--permits each sample to be treated according to its own "merits," optimizes throughput, and facilitates flexible automation; and (d) smart--ensures quality results through monitoring and intelligent control of patient identification, sample characteristics, and separation process.
Øbro, Nina F; Ryder, Lars P; Madsen, Hans O; Andersen, Mette K; Lausen, Birgitte; Hasle, Henrik; Schmiegelow, Kjeld; Marquart, Hanne V
2012-01-01
Reduction in minimal residual disease, measured by real-time quantitative PCR or flow cytometry, predicts prognosis in childhood B-cell precursor acute lymphoblastic leukemia. We explored whether cells reported as minimal residual disease by flow cytometry represent the malignant clone harboring clone-specific genomic markers (53 follow-up bone marrow samples from 28 children with B-cell precursor acute lymphoblastic leukemia). Cell populations (presumed leukemic and non-leukemic) were flow-sorted during standard flow cytometry-based minimal residual disease monitoring and explored by PCR and/or fluorescence in situ hybridization. We found good concordance between flow cytometry and genomic analyses in the individual flow-sorted leukemic (93% true positive) and normal (93% true negative) cell populations. Four cases with discrepant results had plausible explanations (e.g. partly informative immunophenotype and antigen modulation) that highlight important methodological pitfalls. These findings demonstrate that with sufficient experience, flow cytometry is reliable for minimal residual disease monitoring in B-cell precursor acute lymphoblastic leukemia, although rare cases require supplementary PCR-based monitoring.
NASA Technical Reports Server (NTRS)
Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William
2015-01-01
Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits-of-detection (ng g⁻¹); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).
Dialysis Extraction for Chromatography
NASA Technical Reports Server (NTRS)
Jahnsen, V. J.
1985-01-01
Chromatographic-sample pretreatment by dialysis detects traces of organic contaminants in water samples analyzed in field with minimal analysis equipment and minimal quantities of solvent. Technique also of value wherever aqueous sample and solvent must not make direct contact.
Kellenberger, Colleen A; Sales-Lee, Jade; Pan, Yuchen; Gassaway, Madalee M; Herr, Amy E; Hammond, Ming C
2015-01-01
Cyclic di-GMP (c-di-GMP) is a second messenger that is important in regulating bacterial physiology and behavior, including motility and virulence. Many questions remain about the role and regulation of this signaling molecule, but current methods of detection are limited by either modest sensitivity or requirements for extensive sample purification. We have taken advantage of a natural, high affinity receptor of c-di-GMP, the Vc2 riboswitch aptamer, to develop a sensitive and rapid electrophoretic mobility shift assay (EMSA) for c-di-GMP quantitation that required minimal engineering of the RNA.
Burtis, C.A.; Johnson, W.F.; Walker, W.A.
1985-08-05
A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.
Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.
1988-01-01
A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements of volume, weight, and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of GC detectors that provide sample identification independent of GC retention-time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).
Comparative Testis Tissue Proteomics Using 2-Dye Versus 3-Dye DIGE Analysis.
Holland, Ashling
2018-01-01
Comparative tissue proteomics aims to analyze alterations of the proteome in response to a stimulus. Two-dimensional difference gel electrophoresis (2D-DIGE) is a modified and advanced form of 2D gel electrophoresis. DIGE is a powerful biochemical method that compares two or three protein samples on the same analytical gel, and can be used to establish differentially expressed protein levels between healthy normal and diseased pathological tissue sample groups. Minimal DIGE labeling can be used via a 2-dye system with Cy3 and Cy5 or a 3-dye system with Cy2, Cy3, and Cy5 to fluorescently label samples with CyDye fluors pre-electrophoresis. DIGE circumvents gel-to-gel variability by multiplexing samples to a single gel and through the use of a pooled internal standard for normalization. This form of quantitative high-resolution proteomics facilitates the comparative analysis and evaluation of tissue protein compositions. Comparing tissue groups under different conditions is crucially important for advancing the biomedical field by characterization of cellular processes, understanding pathophysiological development and tissue biomarker discovery. This chapter discusses 2D-DIGE as a comparative tissue proteomic technique and describes in detail the experimental steps required for comparative proteomic analysis employing both options of 2-dye and 3-dye DIGE minimal labeling.
Fabregat-Cabello, Neus; Castillo, Ángel; Sancho, Juan V; González, Florenci V; Roig-Navarro, Antoni Francesc
2013-08-02
In this work we have developed and validated an accurate and fast methodology for the determination of 4-nonylphenol (technical mixture) in complex matrix water samples by UHPLC-ESI-MS/MS. The procedure is based on isotope dilution mass spectrometry (IDMS) in combination with isotope pattern deconvolution (IPD), which provides the concentration of the analyte directly from the spiked sample without requiring any methodological calibration graph. To avoid any possible isotopic effect during the analytical procedure the in-house synthesized (13)C1-4-(3,6-dimethyl-3-heptyl)phenol was used as labeled compound. This proposed surrogate was able to compensate the matrix effect even from wastewater samples. A SPE pre-concentration step together with exhaustive efforts to avoid contamination were included to reach the signal-to-noise ratio necessary to detect the endogenous concentrations present in environmental samples. Calculations were performed acquiring only three transitions, achieving limits of detection lower than 100 ng/g for all water matrices assayed. Recoveries within 83-108% and coefficients of variation ranging from 1.5% to 9% were obtained. In contrast, a considerable overestimation was obtained with the most common classical calibration procedure using 4-n-nonylphenol as internal standard, demonstrating the suitability of the minimal labeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
A Technique for Thermal Desorption Analyses Suitable for Thermally-Labile, Volatile Compounds.
Alborn, Hans T
2018-02-01
Many plant and insect interactions are governed by odors released by the plants or insects and there exists a continual need for new or improved methods to collect and identify these odors. Our group has for some time studied below-ground, plant-produced volatile signals affecting nematode and insect behavior. The research requires repeated sampling of volatiles of intact plant/soil systems in the laboratory as well as the field with the help of probes to minimize unwanted effects on the systems we are studying. After evaluating solid adsorbent filters with solvent extraction or solid-phase microextraction fiber sample collection, we found dynamic sampling of small air volumes on Tenax TA filters followed by thermal desorption sample introduction to be the most suitable analytical technique for our applications. Here we present the development and evaluation of a low-cost and relatively simple thermal desorption technique where a cold trap cooled with liquid carbon dioxide is added as an integral part of a splitless injector. Temperature gradient-based focusing and low thermal mass minimizes aerosol formation and eliminates the need for flash heating, resulting in low sample degradation comparable to solvent-based on-column injections. Additionally, since the presence of the cold trap does not affect normal splitless injections, on-the-fly switching between splitless and thermal desorption modes can be used for external standard quantification.
Molecular epidemiology biomarkers-Sample collection and processing considerations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Nina T.; Pfleger, Laura; Berger, Eileen
2005-08-07
Biomarker studies require processing and storage of numerous biological samples with the goals of obtaining a large amount of information and minimizing future research costs. An efficient study design includes provisions for processing of the original samples, such as cryopreservation, DNA isolation, and preparation of specimens for exposure assessment. Use of standard, two-dimensional, and nanobarcodes, together with customized electronic databases, assures efficient management of large sample collections and tracking of data analysis results. Standard operating procedures and quality control plans help to protect sample quality and to assure validity of the biomarker data. Specific state, federal and international regulations are in place regarding research with human samples, governing areas including custody, safety of handling, and transport of human samples. Appropriate informed consent must be obtained from the study subjects prior to sample collection and confidentiality of results maintained. Finally, examples of three biorepositories of different scale (European Cancer Study, National Cancer Institute and School of Public Health Biorepository, University of California, Berkeley) are used to illustrate challenges faced by investigators and the ways to overcome them. New software and biorepository technologies are being developed by many companies that will help to bring biological banking to a new level required by molecular epidemiology of the 21st century.
On-line IR analyzer system to monitor cephamycin C loading on ion-exchange resin
NASA Astrophysics Data System (ADS)
Shank, Sheldon; Russ, Warren; Gravatt, Douglas; Lee, Wesley; Donahue, Steven M.
1992-08-01
An on-line infrared analyzer is being developed for monitoring cephamycin C loading on ion exchange resin. Accurate measurement of product loading offers productivity improvements with direct savings from product loss avoidance, minimized raw material cost, and reduced off-line laboratory testing. Ultrafiltered fermentation broth is fed onto ion exchange columns under conditions which adsorb the product, cephamycin C, to the resin while allowing impurities to pass unretained. Product loading is stopped when the on-line analyzer determines that resin capacity for adsorbing product is nearly exhausted. Infrared spectroscopy has been shown capable of quantifying cephamycin C in the process matrix at concentrations that support process control decisions. Process-to-analyzer interface challenges have been resolved, including sample conditioning requirements. Analyzer requirements have been defined. The sample conditioning station is under design.
A minimax technique for time-domain design of preset digital equalizers using linear programming
NASA Technical Reports Server (NTRS)
Vaughn, G. L.; Houts, R. C.
1975-01-01
A linear programming technique is presented for the design of a preset finite-impulse response (FIR) digital filter to equalize the intersymbol interference (ISI) present in a baseband channel with known impulse response. A minimax technique is used which minimizes the maximum absolute error between the actual received waveform and a specified raised-cosine waveform. Transversal and frequency-sampling FIR digital filters are compared as to the accuracy of the approximation, the resultant ISI and the transmitted energy required. The transversal designs typically have slightly better waveform accuracy for a given distortion; however, the frequency-sampling equalizer uses fewer multipliers and requires less transmitted energy. A restricted transversal design is shown to use the least number of multipliers at the cost of a significant increase in energy and loss of waveform accuracy at the receiver.
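The minimax criterion above maps directly onto a linear program: introduce a bound t on the absolute waveform error and minimize t subject to |(h*c)_k − d_k| ≤ t for every sample instant k. The sketch below is an illustrative reconstruction of that formulation, not the paper's code; the function name and the toy two-tap channel are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def minimax_equalizer(h, d, n_taps):
    """Design n_taps FIR equalizer taps c minimizing max_k |(h*c)[k] - d[k]|."""
    h = np.asarray(h, dtype=float)
    d = np.asarray(d, dtype=float)
    m = len(h) + n_taps - 1                    # length of the convolution h*c
    A = np.zeros((m, n_taps))                  # A @ c == np.convolve(h, c)
    for j in range(n_taps):
        A[j:j + len(h), j] = h
    # Variables x = [c_0 .. c_{n-1}, t]; objective: minimize t.
    obj = np.zeros(n_taps + 1)
    obj[-1] = 1.0
    ones = np.ones((m, 1))
    A_ub = np.vstack([np.hstack([A, -ones]),   #  A c - t <= d
                      np.hstack([-A, -ones])]) # -A c - t <= -d
    b_ub = np.concatenate([d, -d])
    bounds = [(None, None)] * n_taps + [(0, None)]
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n_taps], res.x[-1]           # taps, worst-case error

# Toy ISI channel, two-tap equalizer, ideal-impulse target waveform.
taps, err = minimax_equalizer([1.0, 0.5], [1.0, 0.0, 0.0], 2)
```

For this toy channel the solution shows the Chebyshev signature of a minimax fit: the error equioscillates, reaching its maximum magnitude with alternating sign at three sample instants.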
NMR methods for metabolomics of mammalian cell culture bioreactors.
Aranibar, Nelly; Reily, Michael D
2014-01-01
Metabolomics has become an important tool for measuring pools of small molecules in mammalian cell cultures expressing therapeutic proteins. NMR spectroscopy has played an important role, largely because it requires minimal sample preparation, does not require chromatographic separation, and is quantitative. The concentrations of large numbers of small molecules in the extracellular media or within the cells themselves can be measured directly on the culture supernatant and on the supernatant of the lysed cells, respectively, and correlated with endpoints such as titer, cell viability, or glycosylation patterns. The observed changes can be used to generate hypotheses by which these parameters can be optimized. This chapter focuses on the sample preparation, data acquisition, and analysis to get the most out of NMR metabolomics data from CHO cell cultures but could easily be extended to other in vitro culture systems.
A Rational Approach to Determine Minimum Strength Thresholds in Novel Structural Materials
NASA Technical Reports Server (NTRS)
Schur, Willi W.; Bilen, Canan; Sterling, Jerry
2003-01-01
Design of safe and survivable structures requires the availability of guaranteed minimum strength thresholds for structural materials to enable a meaningful comparison of strength requirement and available strength. This paper develops a procedure for determining such a threshold with a desired degree of confidence, for structural materials with little or no industrial experience. The problem arose in attempting to use a new, highly weight-efficient structural load tendon material to achieve a lightweight super-pressure balloon. The developed procedure applies to lineal (one-dimensional) structural elements. One important aspect of the formulation is that it extrapolates to expected probability distributions for long-length specimen samples from a hypothesized probability distribution obtained from a shorter-length specimen sample. The use of the developed procedure is illustrated using both real and simulated data.
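The short-to-long length extrapolation described in this abstract is typically a weakest-link argument: if a short specimen of length l fails below stress x with probability F(x), a specimen of length r·l fails below x with probability 1 − (1 − F(x))^r, and under a Weibull fit the long-length threshold has a closed form. A hedged sketch under those assumptions (the Weibull form, function names, and numbers are illustrative, not the paper's exact formulation):

```python
import math

def long_length_cdf(short_cdf, length_ratio):
    """Weakest-link extrapolation: failure probability below a given stress for a
    specimen length_ratio times longer, assuming independent 'links'."""
    return 1.0 - (1.0 - short_cdf) ** length_ratio

def weibull_strength_threshold(scale, shape, length_ratio, confidence=0.99):
    """Strength below which the long specimen fails with probability 1-confidence,
    given a Weibull(scale, shape) fit to short-specimen breaking loads.
    A Weibull weakest-link chain stays Weibull with scale * r**(-1/shape)."""
    p_fail = 1.0 - confidence
    scale_long = scale * length_ratio ** (-1.0 / shape)
    return scale_long * (-math.log(1.0 - p_fail)) ** (1.0 / shape)

# Example: Weibull fit to short specimens (scale 100 strength units, shape 10),
# tendon 10x longer, 99%-confidence minimum strength threshold.
threshold = weibull_strength_threshold(100.0, 10.0, 10.0, 0.99)
```

Note how the threshold drops as the length ratio grows: longer tendons expose more potential weak links, which is exactly why short-specimen test data cannot be used directly.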
Aerospace Non Chrome Corrosion Inhibiting Primer Systems
2009-09-01
Meet all HSE specifications (TSCA, REACh, Akzo) • Strippable • No weight increase over current system • Meet specification requirements for corrosion • Positive and negative controls • Primer-only and topcoated samples • OEM CF optimization/down-selects • Usual issues found to be true • Good NSS ≠ good filiform ≠ good cure ≠ good application properties • Down-select process is to minimize ≠ and move to a balance of
Overlay improvement methods with diffraction based overlay and integrated metrology
NASA Astrophysics Data System (ADS)
Nam, Young-Sun; Kim, Sunny; Shin, Ju Hee; Choi, Young Sin; Yun, Sang Ho; Kim, Young Hoon; Shin, Si Woo; Kong, Jeong Heung; Kang, Young Seog; Ha, Hun Hwan
2015-03-01
To meet the new requirement of securing more overlay margin, optical overlay measurement not only faces technical limitations in representing cell-pattern behavior, but larger measurement samples are also inevitable for minimizing statistical errors and better estimating conditions within a lot. For these reasons, diffraction based overlay (DBO) and integrated metrology (IM) are proposed in this paper as new approaches for overlay enhancement.
Efficient electromagnetic source imaging with adaptive standardized LORETA/FOCUSS.
Schimpf, Paul H; Liu, Hesheng; Ramon, Ceon; Haueisen, Jens
2005-05-01
Functional brain imaging and source localization based on the scalp's potential field require a solution to an ill-posed inverse problem with many solutions. This makes it necessary to incorporate a priori knowledge in order to select a particular solution. A computational challenge for some subject-specific head models is that many inverse algorithms require a comprehensive sampling of the candidate source space at the desired resolution. In this study, we present an algorithm that can accurately reconstruct details of localized source activity from a sparse sampling of the candidate source space. Forward computations are minimized through an adaptive procedure that increases source resolution as the spatial extent is reduced. With this algorithm, we were able to compute inverses using only 6% to 11% of the full resolution lead-field, with a localization accuracy that was not significantly different than an exhaustive search through a fully-sampled source space. The technique is, therefore, applicable for use with anatomically-realistic, subject-specific forward models for applications with spatially concentrated source activity.
NASA Astrophysics Data System (ADS)
Ranamukhaarachchi, Sahan A.; Padeste, Celestino; Dübner, Matthias; Häfeli, Urs O.; Stoeber, Boris; Cadarso, Victor J.
2016-07-01
Therapeutic drug monitoring (TDM) typically requires painful blood draws from patients. We propose a painless and minimally-invasive alternative for TDM using hollow microneedles suitable to extract extremely small volumes (<1 nL) of interstitial fluid to measure drug concentrations. The inner lumen of a microneedle is functionalized to serve as a micro-reactor during sample collection, trapping and binding target drug candidates during extraction without requiring sample transfer. An optofluidic device is integrated with this microneedle to rapidly quantify drug analytes with high sensitivity using a straightforward absorbance scheme. Vancomycin is currently detected using sample volumes of 50-100 μL with a limit of detection (LoD) of 1.35 μM. The proposed microneedle-optofluidic biosensor can detect vancomycin with a sample volume of 0.6 nL and a LoD of <100 nM, validating this painless point-of-care system with significant potential to reduce healthcare costs and patient suffering.
Dried Blood Spot Collection of Health Biomarkers to Maximize Participation in Population Studies
Ostler, Michael W.; Porter, James H.; Buxton, Orfeu M.
2014-01-01
Biomarkers are directly-measured biological indicators of disease, health, exposures, or other biological information. In population and social sciences, biomarkers need to be easy to obtain, transport, and analyze. Dried Blood Spots meet this need, and can be collected in the field with high response rates. These elements are particularly important in longitudinal study designs including interventions where attrition is critical to avoid, and high response rates improve the interpretation of results. Dried Blood Spot sample collection is simple, quick, relatively painless, less invasive than venipuncture, and has minimal field storage requirements (i.e., samples do not need to be immediately frozen and can be stored for a long period in a stable freezer environment before assay). The samples can be analyzed for a variety of different analytes, including cholesterol, C-reactive protein, glycosylated hemoglobin, numerous cytokines, and other analytes, as well as provide genetic material. DBS collection is depicted as employed in several recent studies. PMID:24513728
Ranamukhaarachchi, Sahan A.; Padeste, Celestino; Dübner, Matthias; Häfeli, Urs O.; Stoeber, Boris; Cadarso, Victor J.
2016-01-01
Therapeutic drug monitoring (TDM) typically requires painful blood draws from patients. We propose a painless and minimally-invasive alternative for TDM using hollow microneedles suitable to extract extremely small volumes (<1 nL) of interstitial fluid to measure drug concentrations. The inner lumen of a microneedle is functionalized to serve as a micro-reactor during sample collection, trapping and binding target drug candidates during extraction without requiring sample transfer. An optofluidic device is integrated with this microneedle to rapidly quantify drug analytes with high sensitivity using a straightforward absorbance scheme. Vancomycin is currently detected using sample volumes of 50–100 μL with a limit of detection (LoD) of 1.35 μM. The proposed microneedle-optofluidic biosensor can detect vancomycin with a sample volume of 0.6 nL and a LoD of <100 nM, validating this painless point-of-care system with significant potential to reduce healthcare costs and patient suffering. PMID:27380889
Zafra-Gómez, Alberto; Garballo, Antonio; Morales, Juan C; García-Ayuso, Luis E
2006-06-28
A fast, simple, and reliable method for the isolation and determination of the vitamins thiamin, riboflavin, niacin, pantothenic acid, pyridoxine, folic acid, cyanocobalamin, and ascorbic acid in food samples is proposed. The most relevant advantages of the proposed method are the simultaneous determination of the eight more common vitamins in enriched food products and a reduction of the time required for quantitative extraction, because the method consists merely of the addition of a precipitation solution and centrifugation of the sample. Furthermore, this method saves a substantial amount of reagents as compared with official methods, and minimal sample manipulation is achieved due to the few steps required. The chromatographic separation is carried out on a reverse phase C18 column, and the vitamins are detected at different wavelengths by either fluorescence or UV-visible detection. The proposed method was applied to the determination of water-soluble vitamins in supplemented milk, infant nutrition products, and milk powder certified reference material (CRM 421, BCR) with recoveries ranging from 90 to 100%.
Adequacy of depression treatment among college students in the United States.
Eisenberg, Daniel; Chung, Henry
2012-01-01
There is no published evidence on the adequacy of depression care among college students and how this varies by subpopulations and provider types. We estimated the prevalence of minimally adequate treatment among students with significant past-year depressive symptoms. Data were collected via a confidential online survey of a random sample of 8488 students from 15 colleges and universities in the 2009 Healthy Minds Study. Depressive symptoms were assessed by the Patient Health Questionnaire-2, adapted to a past-year time frame. Students with probable depression were coded as having received minimally adequate depression care based on the criteria from Wang and colleagues (2005). Minimally adequate treatment was received by only 22% of depressed students. The likelihood of minimally adequate treatment was similarly low for both psychiatric medication and psychotherapy. Minimally adequate care was lower for students prescribed medication by a primary care provider as compared to a psychiatrist (P<.01). Racial/ethnic minority students were less likely to receive depression care (P<.01). Adequacy of depression care is a significant problem in the college population. Solutions will likely require greater availability of psychiatry care, better coordination between specialty and primary care using collaborative care models, and increased efforts to retain students in psychotherapy. Copyright © 2012 Elsevier Inc. All rights reserved.
Feline mitochondrial DNA sampling for forensic analysis: when enough is enough!
Grahn, Robert A; Alhaddad, Hasan; Alves, Paulo C; Randi, Ettore; Waly, Nashwa E; Lyons, Leslie A
2015-05-01
Pet hair has a demonstrated value in resolving legal issues. Cat hair is chronically shed and it is difficult to leave a home with cats without some level of secondary transfer. The power of cat hair as an evidentiary resource may be underused because representative genetic databases are not available for exclusionary purposes. Mitochondrial control region databases are highly valuable for hair analyses and have been developed for the cat. In a representative worldwide data set, 83% of domestic cat mitotypes belong to one of twelve major types. Of the remaining 17%, 7.5% are unique within the published 1394 sample database. The current research evaluates the sample size necessary to establish a representative population for forensic comparison of the mitochondrial control region for the domestic cat. For most worldwide populations, randomly sampling 50 unrelated local individuals will achieve saturation at 95%. The 99% saturation is achieved by randomly sampling 60-170 cats, depending on the numbers of mitotypes available in the population at large. Likely due to the recent domestication of the cat and minimal localized population substructure, fewer cats are needed to meet mitochondria DNA control region database practical saturation than for humans or dogs. Coupled with the available worldwide feline control region database of nearly 1400 cats, minimal local sampling will be required to establish an appropriate comparative representative database and achieve significant exclusionary power. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
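The saturation calculation behind this abstract has a simple closed form: with mitotype frequencies p_i, a random sample of n cats covers, in expectation, Σ p_i·(1 − (1 − p_i)^n) of the population. A sketch with toy frequencies loosely mirroring the abstract's "12 major types carrying 83%" (the numbers are illustrative, not the paper's database):

```python
import numpy as np

def expected_coverage(freqs, n):
    """Expected fraction of the population whose mitotype appears at least
    once in a random sample of n unrelated individuals."""
    p = np.asarray(freqs, dtype=float)
    return float(np.sum(p * (1.0 - (1.0 - p) ** n)))

def min_sample_for_coverage(freqs, target=0.95):
    """Smallest sample size whose expected coverage reaches the target."""
    n = 1
    while expected_coverage(freqs, n) < target:
        n += 1
    return n

# Toy population: 12 major mitotypes sharing 83%, 17 rare types sharing 17%.
freqs = [0.83 / 12] * 12 + [0.17 / 17] * 17
n95 = min_sample_for_coverage(freqs, 0.95)
```

The rare tail dominates the answer: the 12 major types are captured almost immediately, so the sample size needed for a target saturation level is driven by how much probability mass sits in low-frequency mitotypes.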
Sampling requirements for forage quality characterization of rectangular hay bales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheaffer, C.C.; Martin, N.P.; Jewett, J.G.
2000-02-01
Commercial lots of alfalfa (Medicago sativa L.) hay are often bought and sold on the basis of forage quality. Proper sampling is essential to obtain accurate forage quality results for pricing of alfalfa hay, but information about sampling is limited to small, 20- to 40-kg rectangular bales. The objectives were to determine the within-bale variation in 400-kg rectangular bales and to determine the number and distribution of core samples required to represent the crude protein (CP), acid detergent fiber (ADF), neutral detergent fiber (NDF), and dry matter (DM) concentration in commercial lots of alfalfa hay. Four bales were selected from each of three hay lots and core sampled nine times per side for a total of 54 cores per bale. There was no consistent pattern of forage quality variation within bales. Averaged across lots, any portion of a bale was highly correlated with bale grand means for CP, ADF, NDF, and DM. Three lots of hay were probed six times per bale, one core per bale side from 55, 14, and 14 bales per lot. For determination of CP, ADF, NDF, and DM concentration, total core numbers required to achieve an acceptable standard error (SE) were minimized by sampling once per bale. Bootstrap analysis of data from the most variable hay lot suggested that forage quality of any lot of 400-kg alfalfa hay bales should be adequately represented by 12 bales sampled once per bale.
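The bootstrap step reported above can be sketched as follows: resample the per-core measurements with replacement to estimate the standard error of the lot mean, then use the between-core standard deviation to size future sampling at one core per bale. Illustrative code, not the study's own (the 1/√n scaling of the standard error is the standard assumption):

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_se(core_values, n_boot=2000):
    """Bootstrap standard error of the lot mean from per-core forage-quality
    measurements (e.g., CP or NDF concentration, one value per core)."""
    x = np.asarray(core_values, dtype=float)
    means = np.empty(n_boot)
    for b in range(n_boot):
        means[b] = rng.choice(x, size=x.size, replace=True).mean()
    return means.std(ddof=1)

def cores_needed(core_sd, target_se):
    """Cores (one per bale) required to hit a target standard error of the mean."""
    return math.ceil((core_sd / target_se) ** 2)
```

For example, if cores within a lot have a standard deviation of 1.2 percentage points of CP and the buyer wants the lot mean pinned down to an SE of 0.3, sixteen single-core bales suffice.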
Sampling design optimization for spatial functions
Olea, R.A.
1984-01-01
A new procedure is presented for minimizing the sampling requirements necessary to estimate a mappable spatial function at a specified level of accuracy. The technique is based on universal kriging, an estimation method within the theory of regionalized variables. Neither actual implementation of the sampling nor universal kriging estimations are necessary to make an optimal design. The average standard error and maximum standard error of estimation over the sampling domain are used as global indices of sampling efficiency. The procedure optimally selects those parameters controlling the magnitude of the indices, including the density and spatial pattern of the sample elements and the number of nearest sample elements used in the estimation. As an illustration, the network of observation wells used to monitor the water table in the Equus Beds of Kansas is analyzed and an improved sampling pattern suggested. This example demonstrates the practical utility of the procedure, which can be applied equally well to other spatial sampling problems, as the procedure is not limited by the nature of the spatial function. © 1984 Plenum Publishing Corporation.
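The property that makes design-before-sampling possible is that the kriging standard error depends only on the sample geometry and the variogram, never on the measured values, so candidate well networks can be scored before any data exist. A minimal ordinary-kriging sketch of that idea (the paper uses universal kriging; the exponential covariance with its sill and range here are illustrative assumptions):

```python
import numpy as np

def exp_cov(d, sill=1.0, corr_range=10.0):
    """Exponential covariance model, an assumed stand-in for a fitted variogram."""
    return sill * np.exp(-d / corr_range)

def kriging_variance(sample_xy, target_xy, sill=1.0, corr_range=10.0):
    """Ordinary-kriging estimation variance at target_xy from sample locations
    alone; no data values are needed to score a candidate network."""
    S = np.asarray(sample_xy, dtype=float)
    t = np.asarray(target_xy, dtype=float)
    n = len(S)
    D = np.linalg.norm(S[:, None, :] - S[None, :, :], axis=-1)
    K = np.ones((n + 1, n + 1))              # kriging system with unbiasedness row
    K[:n, :n] = exp_cov(D, sill, corr_range)
    K[-1, -1] = 0.0
    k = np.ones(n + 1)
    k[:n] = exp_cov(np.linalg.norm(S - t, axis=-1), sill, corr_range)
    w = np.linalg.solve(K, k)                # weights + Lagrange multiplier
    return sill - w @ k                      # estimation variance (lower = better)

# Score a 3-well pattern: zero variance at a well, positive variance between wells.
wells = [[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]]
v_at_well = kriging_variance(wells, [0.0, 0.0])
v_between = kriging_variance(wells, [2.5, 2.5])
```

Averaging or maximizing this variance over a grid of target points yields exactly the global efficiency indices the abstract describes, and a design search then moves sample locations to drive those indices down.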
NASA Technical Reports Server (NTRS)
Erickson, E. F.; Young, E. T.; Wolf, J.; Asbrock, J. F.; Lum, N.; DeVincenzi, D. (Technical Monitor)
2002-01-01
Arrays of far-infrared photoconductor detectors operate at a few degrees Kelvin and require electronic amplifiers in close proximity. For the electronics, a cryogenic multiplexer is ideal to avoid the large number of wires associated with individual amplifiers for each pixel, and to avoid adverse effects of thermal and radiative heat loads from the circuitry. For low background applications, the 32 channel CRC 696 CMOS device was previously developed for SIRTF, the cryogenic Space Infrared Telescope Facility. For higher background applications, we have developed a similar circuit, featuring several modifications: (a) an AC coupled, capacitive feedback transimpedance unit cell, to minimize input offset effects, thereby enabling low detector biases, (b) selectable feedback capacitors to enable operation over a wide range of backgrounds, and (c) clamp and sample & hold output circuits to improve sampling efficiency, which is a concern at the high readout rates required. We describe the requirements for and design of the new device.
Analysis of munitions constituents in groundwater using a field-portable GC-MS.
Bednar, A J; Russell, A L; Hayes, C A; Jones, W T; Tackett, P; Splichal, D E; Georgian, T; Parker, L V; Kirgan, R A; MacMillan, D K
2012-05-01
The use of munitions constituents (MCs) at military installations can produce soil and groundwater contamination that requires periodic monitoring even after training or manufacturing activities have ceased. Traditional groundwater monitoring methods require large volumes of aqueous samples (e.g., 2-4 L) to be shipped under chain of custody, to fixed laboratories for analysis. The samples must also be packed on ice and shielded from light to minimize degradation that may occur during transport and storage. The laboratory's turn-around time for sample analysis and reporting can be as long as 45 d. This process hinders the reporting of data to customers in a timely manner; yields data that are not necessarily representative of current site conditions owing to the lag time between sample collection and reporting; and incurs significant shipping costs for samples. The current work compares a field portable Gas Chromatograph-Mass Spectrometer (GC-MS) for analysis of MCs on-site with traditional laboratory-based analysis using High Performance Liquid Chromatography with UV absorption detection. The field method provides near real-time (within ~1 h of sampling) concentrations of MCs in groundwater samples. Mass spectrometry provides reliable confirmation of MCs and a means to identify unknown compounds that are potential false positives for methods with UV and other non-selective detectors. Published by Elsevier Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Khawaja, M. Sami; Rushton, Josh
Evaluating an energy efficiency program requires assessing the total energy and demand saved through all of the energy efficiency measures provided by the program. For large programs, the direct assessment of savings for each participant would be cost-prohibitive. Even if a program is small enough that a full census could be managed, such an undertaking would almost always be an inefficient use of evaluation resources. The bulk of this chapter describes methods for minimizing and quantifying sampling error. Measurement error and regression error are discussed in various contexts in other chapters.
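The sampling-error arithmetic this chapter describes can be sketched in a few lines (a minimal illustration, not code from the protocol; the coefficient of variation cv = 0.5 is the conventional planning default in evaluation practice, not a value taken from the chapter):

```python
import math
from statistics import NormalDist

def required_sample_size(cv: float, rel_precision: float, confidence: float) -> int:
    """Simple-random-sample size so the estimated mean savings falls within
    rel_precision of the true mean at the given confidence level
    (normal approximation, infinite population)."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided z value
    return math.ceil((z * cv / rel_precision) ** 2)

# The common "90/10" criterion with the planning default cv = 0.5
# yields the familiar evaluation sample size of 68 participants.
n = required_sample_size(cv=0.5, rel_precision=0.10, confidence=0.90)
```

For a small program a finite-population correction, n / (1 + n/N), would shrink this requirement, which is why a full census is rarely the efficient choice even when it is feasible.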
Thermoluminescence: Potential Applications in Forensic Science
NASA Technical Reports Server (NTRS)
Ingham, J. D.; Lawson, D. D.
1973-01-01
In crime laboratories one of the most difficult operations is to determine unequivocally whether or not two samples of evidence of the same type were originally part of the same thing or were from the same source. It has been found that high temperature thermoluminescence (room temperature to 723 K) can be used for comparisons of this type, although work to date indicates that there is generally a finite probability for coincidental matching of glass or soil samples. Further work is required to determine and attempt to minimize these probabilities for different types of materials, and to define more clearly the scope of applicability of thermoluminescence to actual forensic situations.
2016-01-01
The function of bioenergetic membranes is strongly influenced by the spatial arrangement of their constituent membrane proteins. Atomic force microscopy (AFM) can be used to probe protein organization at high resolution, allowing individual proteins to be identified. However, previous AFM studies of biological membranes have typically required that curved membranes are ruptured and flattened during sample preparation, with the possibility of disruption of the native protein arrangement or loss of proteins. Imaging native, curved membranes requires minimal tip–sample interaction in both lateral and vertical directions. Here, long-range tip–sample interactions are reduced by optimizing the imaging buffer. Tapping mode AFM with high-resonance-frequency small and soft cantilevers, in combination with a high-speed AFM, reduces the forces due to feedback error and enables application of an average imaging force of tens of piconewtons. Using this approach, we have imaged the membrane organization of intact vesicular bacterial photosynthetic “organelles”, chromatophores. Despite the highly curved nature of the chromatophore membrane and lack of direct support, the resolution was sufficient to identify the photosystem complexes and quantify their arrangement in the native state. Successive imaging showed the proteins remain surprisingly static, with minimal rotation or translation over several-minute time scales. High-order assemblies of RC-LH1-PufX complexes are observed, and intact ATPases are successfully imaged. The methods developed here are likely to be applicable to a broad range of protein-rich vesicles or curved membrane systems, which are an almost ubiquitous feature of native organelles. PMID:28114766
Coles, Andrew H.; Osborn, Maire F.; Alterman, Julia F.; Turanov, Anton A.; Godinho, Bruno M.D.C.; Kennington, Lori; Chase, Kathryn; Aronin, Neil
2016-01-01
Preclinical development of RNA interference (RNAi)-based therapeutics requires a rapid, accurate, and robust method of simultaneously quantifying mRNA knockdown in hundreds of samples. The most well-established method to achieve this is quantitative real-time polymerase chain reaction (qRT-PCR), a labor-intensive methodology that requires sample purification, which increases the potential to introduce additional bias. Here, we describe that the QuantiGene® branched DNA (bDNA) assay linked to a 96-well Qiagen TissueLyser II is a quick and reproducible alternative to qRT-PCR for quantitative analysis of mRNA expression in vivo directly from tissue biopsies. The bDNA assay is a high-throughput, plate-based, luminescence technique, capable of directly measuring mRNA levels from tissue lysates derived from various biological samples. We have performed a systematic evaluation of this technique for in vivo detection of RNAi-based silencing. We show that similar quality data is obtained from purified RNA and tissue lysates. In general, we observe low intra- and inter-animal variability (around 10% for control samples), and high intermediate precision. This allows minimization of sample size for evaluation of oligonucleotide efficacy in vivo. PMID:26595721
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the devices and systems of an automatic rotor line, there is always a real probability that defective (incomplete) products enter the output process stream. Therefore, continuous sampling control of product completeness, based on statistical methods, remains an important element in managing the quality of assembly of multi-element mass products on automatic rotor lines. A distinctive feature of continuous sampling control of multi-element product completeness during assembly is its destructive nature, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, statistical procedures for continuous sampling control of multi-element product completeness during assembly on automatic rotor lines require sampling plans that ensure a minimum size of control samples. Comparison of the limit values of the average output defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 provides lower limit values for the average output defect level. The average sample size under the ACSP-1 plan is also smaller than under the CSP-1 plan. Thus, applying statistical methods to the assembly quality management of multi-element products on automatic rotor lines, using the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and maintain the required quality level of assembled products while minimizing sample size.
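The quantities being compared here, the limit of the average output defect level and the average amount of inspection of a continuous sampling plan, can be illustrated with Dodge's classical CSP-1 (a hedged sketch using the standard textbook formulas; the ACSP-1 plan itself is not reproduced here):

```python
def csp1_metrics(p: float, f: float, i: int) -> tuple[float, float]:
    """CSP-1 with sampling fraction f and clearance number i, applied to a
    stream with incoming fraction defective p (found defectives removed).
    Returns (average fraction inspected, average outgoing quality)."""
    q = 1.0 - p
    denom = f + (1.0 - f) * q**i
    afi = f / denom                       # long-run share of units inspected
    aoq = p * (1.0 - f) * q**i / denom    # defect rate among passed units
    return afi, aoq

def aoql(f: float, i: int, steps: int = 2000) -> float:
    """Average outgoing quality limit: worst-case AOQ over incoming quality."""
    return max(csp1_metrics(k / steps, f, i)[1] for k in range(1, steps))
```

A tighter plan (larger clearance number i or sampling fraction f) lowers the AOQL at the price of more inspection, which is precisely the trade-off an improved plan such as ACSP-1 is claimed to resolve with smaller samples.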
Li, Liang; Mustafi, Debarshi; Fu, Qiang; Tereshko, Valentina; Chen, Delai L.; Tice, Joshua D.; Ismagilov, Rustem F.
2006-01-01
High-throughput screening and optimization experiments are critical to a number of fields, including chemistry and structural and molecular biology. The separation of these two steps may introduce false negatives and a time delay between initial screening and subsequent optimization. Although a hybrid method combining both steps may address these problems, miniaturization is required to minimize sample consumption. This article reports a “hybrid” droplet-based microfluidic approach that combines the steps of screening and optimization into one simple experiment and uses nanoliter-sized plugs to minimize sample consumption. Many distinct reagents were sequentially introduced as ≈140-nl plugs into a microfluidic device and combined with a substrate and a diluting buffer. Tests were conducted in ≈10-nl plugs containing different concentrations of a reagent. Methods were developed to form plugs of controlled concentrations, index concentrations, and incubate thousands of plugs inexpensively and without evaporation. To validate the hybrid method and demonstrate its applicability to challenging problems, crystallization of model membrane proteins and handling of solutions of detergents and viscous precipitants were demonstrated. By using 10 μl of protein solution, ≈1,300 crystallization trials were set up within 20 min by one researcher. This method was compatible with growth, manipulation, and extraction of high-quality crystals of membrane proteins, demonstrated by obtaining high-resolution diffraction images and solving a crystal structure. This robust method requires inexpensive equipment and supplies, should be especially suitable for use in individual laboratories, and could find applications in a number of areas that require chemical, biochemical, and biological screening and optimization. PMID:17159147
Torczynski, John R.
2000-01-01
A spin coating apparatus requires less cleanroom air flow than prior spin coating apparatus to minimize cleanroom contamination. A shaped exhaust duct from the spin coater maintains process quality while requiring reduced cleanroom air flow. The exhaust duct can decrease in cross section as it extends from the wafer, minimizing eddy formation. The exhaust duct can conform to entrainment streamlines to minimize eddy formation and reduce interprocess contamination at minimal cleanroom air flow rates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.
2014-04-15
Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same precision and confidence. PMID:24694150
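The qualitative conclusion above, that a lower anticipated ED demands more scans for the same precision and confidence, can be reproduced with a simplified sketch (a plain normal-approximation calculation, not the authors' constrained Lagrange-multiplier scheme, which additionally propagates MOSFET calibration error; the between-scan standard deviation used below is an assumed value):

```python
import math
from statistics import NormalDist

def scans_needed(mean_ed: float, sd_ed: float,
                 rel_precision: float, confidence: float) -> int:
    """Repeated-scan count so the mean ED estimate lies within
    rel_precision of the true ED at the given confidence.
    sd_ed is an assumed between-scan standard deviation in mSv."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    return math.ceil((z * sd_ed / (rel_precision * mean_ed)) ** 2)

# With a fixed absolute scan-to-scan spread, a 4 mSv protocol needs far
# more scans than a 10 mSv protocol for the same 95%/5% target.
low_dose = scans_needed(4.0, 0.5, 0.05, 0.95)    # -> 25
high_dose = scans_needed(10.0, 0.5, 0.05, 0.95)  # -> 4
```

The relative-precision denominator shrinks with the anticipated ED while the absolute measurement spread does not, which is the mechanism behind the dose-dependence reported in the Results.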
Wavefront optimized nonlinear microscopy of ex vivo human retinas
NASA Astrophysics Data System (ADS)
Gualda, Emilio J.; Bueno, Juan M.; Artal, Pablo
2010-03-01
A multiphoton microscope incorporating a Hartmann-Shack (HS) wavefront sensor to control the ultrafast laser beam's wavefront aberrations has been developed. This instrument allowed us to investigate the impact of the laser beam aberrations on two-photon autofluorescence imaging of human retinal tissues. We demonstrated that nonlinear microscopy images are improved when laser beam aberrations are minimized by realigning the laser system cavity while wavefront controlling. Nonlinear signals from several human retinal anatomical features have been detected for the first time, without the need of fixation or staining procedures. Beyond the improved image quality, this approach reduces the required excitation power levels, minimizing the side effects of phototoxicity within the imaged sample. In particular, this may be important to study the physiology and function of the healthy and diseased retina.
NASA Astrophysics Data System (ADS)
Sabatini, Francesca; Lluveras-Tenorio, Anna; Degano, Ilaria; Kuckova, Stepanka; Krizova, Iva; Colombini, Maria Perla
2016-11-01
This study deals with the identification of anthraquinoid molecular markers in standard dyes, reference lakes, and paint model systems using a micro-invasive and nondestructive technique such as matrix-assisted laser desorption/ionization time-of-flight-mass spectrometry (MALDI-ToF-MS). Red anthraquinoid lakes, such as madder lake, carmine lake, and Indian lac, have been the most widely used for painting purposes since ancient times. From an analytical point of view, identifying lakes in paint samples is challenging, and developing methods that maximize the information achievable while minimizing the amount of sample needed is of paramount importance. The employed method was tested on less than 0.5 mg of reference samples and required a minimal sample preparation, entailing a hydrofluoric acid extraction. The method is fast and versatile because of the possibility to re-analyze the same sample (once it has been spotted on the steel plate), testing both positive and negative modes in a few minutes. The MALDI mass spectra collected in the two analysis modes were studied and compared with LDI and simulated mass spectra in order to highlight the peculiar behavior of the anthraquinones in the MALDI process. Both ionization modes were assessed for each species. The effect of the different paint binders on dye identification was also evaluated through the analyses of paint model systems. In the end, the method was successful in detecting madder lake in archeological samples from Greek wall paintings and on an Italian funerary clay vessel, demonstrating its capability to identify dyes in small amounts of highly degraded samples.
Six degree of freedom fine motion positioning stage based on magnetic levitation
NASA Technical Reports Server (NTRS)
Arling, R. W.; Kohler, S. M.
1994-01-01
The design of a magnetically suspended six degree of freedom positioning system capable of nanometer positioning is presented. The sample holder is controlled in six degrees of freedom (DOF) over 300 micrometers of travel in the X, Y, and Z directions. A design and control summary and test results indicating stability and power dissipation are included in the paper. The system is vacuum compatible, uses commercially available materials, and requires minimal assembly and setup.
Hyperpolarization of Nitrogen-15 Schiff Bases by Reversible Exchange Catalysis with para-Hydrogen.
Logan, Angus W J; Theis, Thomas; Colell, Johannes F P; Warren, Warren S; Malcolmson, Steven J
2016-07-25
NMR with thermal polarization requires relatively concentrated samples, particularly for nuclei with low abundance and low gyromagnetic ratios, such as (15)N. We expand the substrate scope of SABRE, a recently introduced hyperpolarization method, to allow access to (15)N-enriched Schiff bases. These substrates show fractional (15)N polarization levels of up to 2 % while having only minimal (1)H enhancements. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
International Space Station Urine Monitoring System Functional Integration and Science Testing
NASA Technical Reports Server (NTRS)
Rodriquez, Branelle R.; Broyan, James Lee, Jr.
2011-01-01
Exposure to microgravity during human spaceflight needs to be better understood as the human exploration of space requires longer duration missions. It is known that long term exposure to microgravity causes bone loss. Measuring the calcium and other metabolic byproducts in a crew member's urine can evaluate the effectiveness of bone loss countermeasures. The International Space Station (ISS) Urine Monitoring System (UMS) is an automated urine collection device designed to collect urine, separate the urine and air, measure the void volume, and allow for syringe sampling. Accurate measuring and minimal cross-contamination is essential to determine bone loss and the effectiveness of countermeasures. The ISS UMS provides minimal cross-contamination (<0.7 mL urine) and has volume accuracy of 2% between 100 to 1000 mL urine voids. Designed to provide a non-invasive means to collect urine samples from crew members, the ISS UMS operates in-line with the Node 3 Waste and Hygiene Compartment (WHC). The ISS UMS has undergone modifications required to interface with the WHC, including material changes, science algorithm improvements, and software platform revisions. Integrated functional testing was performed to determine the pressure drop, air flow rate, and the maximum amount of fluid capable of being discharged from the UMS to the WHC. This paper will detail the results of the science and the functional integration tests.
NASA Astrophysics Data System (ADS)
Zawadowicz, M. A.; Del Negro, L. A.
2010-12-01
Hazardous air pollutants (HAPs) are usually present in the atmosphere at pptv-level, requiring measurements with high sensitivity and minimal contamination. Commonly used evacuated canister methods require an overhead in space, money and time that often is prohibitive to primarily-undergraduate institutions. This study optimized an analytical method based on solid-phase microextraction (SPME) of ambient gaseous matrix, which is a cost-effective technique of selective VOC extraction, accessible to an unskilled undergraduate. Several approaches to SPME extraction and sample analysis were characterized and several extraction parameters optimized. Extraction time, temperature and laminar air flow velocity around the fiber were optimized to give highest signal and efficiency. Direct, dynamic extraction of benzene from a moving air stream produced better precision (±10%) than sampling of stagnant air collected in a polymeric bag (±24%). Using a low-polarity chromatographic column in place of a standard (5%-Phenyl)-methylpolysiloxane phase decreased the benzene detection limit from 2 ppbv to 100 pptv. The developed method is simple and fast, requiring 15-20 minutes per extraction and analysis. It will be field-validated and used as a field laboratory component of various undergraduate Chemistry and Environmental Studies courses.
TEMPUS: A facility for containerless electromagnetic processing onboard spacelab
NASA Technical Reports Server (NTRS)
Lenski, H.; Willnecker, R.
1990-01-01
The electromagnetic containerless processing facility TEMPUS was recently assigned for a flight on the IML-2 mission. In comparison to the TEMPUS facility already flown on a sounding rocket, several improvements had to be implemented. These are in particular related to: safety; resource management; and the possibility to process different samples with different requirements in one mission. The basic design of this facility as well as the expected processing capabilities are presented. Two operational aspects turned out to strongly influence the facility design: control of the sample motion (first experimental results indicate that crew or ground interaction will be necessary to minimize residual sample motions during processing); and exchange of RF-coils (during processing in vacuum, evaporated sample materials will condense at the cold surface and may force a coil exchange, when a critical thickness is exceeded).
A high-throughput core sampling device for the evaluation of maize stalk composition
2012-01-01
Background A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen maintaining the composition of the biomass sample for relatively long periods of time. Conclusions The CSD has been demonstrated to successfully produce homogeneous stalk core samples in a repeatable manner with a throughput substantially superior to the currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameter evaluated, it is expected that the CSD will have utility for other bioenergy crops as well. 
PMID:22548834
Minimally processed vegetable salads: microbial quality evaluation.
Fröder, Hans; Martins, Cecília Geraldes; De Souza, Katia Leani Oliveira; Landgraf, Mariza; Franco, Bernadette D G M; Destro, Maria Teresa
2007-05-01
The increasing demand for fresh fruits and vegetables and for convenience foods is causing an expansion of the market share for minimally processed vegetables. Among the more common pathogenic microorganisms that can be transmitted to humans by these products are Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella. The aim of this study was to evaluate the microbial quality of a selection of minimally processed vegetables. A total of 181 samples of minimally processed leafy salads were collected from retailers in the city of Sao Paulo, Brazil. Counts of total coliforms, fecal coliforms, Enterobacteriaceae, psychrotrophic microorganisms, and Salmonella were conducted for 133 samples. L. monocytogenes was assessed in 181 samples using the BAX System and by plating the enrichment broth onto Palcam and Oxford agars. Suspected Listeria colonies were submitted to classical biochemical tests. Populations of psychrotrophic microorganisms >10(6) CFU/g were found in 51% of the 133 samples, and Enterobacteriaceae populations between 10(5) and 10(6) CFU/g were found in 42% of the samples. Fecal coliform concentrations higher than 10(2) CFU/g (Brazilian standard) were found in 97 (73%) of the samples, and Salmonella was detected in 4 (3%) of the samples. Two of the Salmonella-positive samples had <10(2) CFU/g concentrations of fecal coliforms. L. monocytogenes was detected in only 1 (0.6%) of the 181 samples examined. This positive sample was simultaneously detected by both methods. The other Listeria species identified by plating were L. welshimeri (one sample of curly lettuce) and L. innocua (2 samples of watercress). The results indicate that minimally processed vegetables had poor microbiological quality, and these products could be a vehicle for pathogens such as Salmonella and L. monocytogenes.
Deb, Rajib; Sengar, Gyanendra Singh; Singh, Umesh; Kumar, Sushil; Alyethodi, R R; Alex, Rani; Raja, T V; Das, A K; Prakash, B
2016-12-01
Loop-mediated isothermal amplification (LAMP) is a diagnostic method for DNA amplification that is rapid and requires minimal equipment. In the present study, we applied the LAMP assay for rapid detection of adulteration of buffalo milk/meat samples with cow components. The test can be completed within about 1 h 40 min, starting from DNA extraction, and can be performed in a water bath without a thermocycler. Cow DNA in buffalo samples was identified in the developed LAMP assay either by visualizing with SYBR Green I/HNB dyes or by observing the typical ladder pattern on gel electrophoresis. The test can detect cow milk/meat mixed into the buffalo counterpart at the 5 % level. Owing to its simplicity and specificity, the developed LAMP test can be easily adapted in any laboratory for rapid identification of cow components in livestock by-products.
Kovačević, Mira; Burazin, Jelena; Pavlović, Hrvoje; Kopjar, Mirela; Piližota, Vlasta
2013-04-01
Minimally processed and refrigerated vegetables can be contaminated with Listeria species bacteria including Listeria monocytogenes due to extensive handling during processing or by cross contamination from the processing environment. The objective of this study was to examine the microbiological quality of ready-to-eat minimally processed and refrigerated vegetables from supermarkets in Osijek, Croatia. 100 samples of ready-to-eat vegetables collected from different supermarkets in Osijek, Croatia, were analyzed for presence of Listeria species and Listeria monocytogenes. The collected samples were cut iceberg lettuces (24 samples), other leafy vegetables (11 samples), delicatessen salads (23 samples), cabbage salads (19 samples), salads from mixed (17 samples) and root vegetables (6 samples). Listeria species was found in 20 samples (20 %) and Listeria monocytogenes was detected in only 1 sample (1 %) of cut red cabbage (less than 100 CFU/g). According to Croatian and EU microbiological criteria these results are satisfactory. However, the presence of Listeria species and Listeria monocytogenes indicates poor hygiene quality. The study showed that these products are often improperly labeled, since 24 % of analyzed samples lacked information about shelf life, and 60 % of samples lacked information about storage conditions. With regard to these facts, cold chain interruption with extended use after expiration date is a probable scenario. Therefore, the microbiological risk for consumers of ready-to-eat minimally processed and refrigerated vegetables is not completely eliminated.
Tips and traps in the 14C bio-AMS preparation laboratory
NASA Astrophysics Data System (ADS)
Buchholz, Bruce A.; Freeman, Stewart P. H. T.; Haack, Kurt W.; Vogel, John S.
2000-10-01
Maintaining a contamination free sample preparation lab for biological 14C AMS requires the same or more diligence as a radiocarbon dating prep lab. Isotope ratios of materials routinely range over 4-8 orders of magnitude in a single experiment, dosing solutions contain thousands of DPM and gels used to separate proteins possess 14C ratios of 1 amol 14C/mg C. Radiocarbon contamination is a legacy of earlier tracer work in most biological laboratories, even if they were never hot labs. Removable surface contamination can be found and monitored using swipes. Contamination can be found on any surface routinely touched: door knobs, light switches, drawer handles, water faucets. In general, all surfaces routinely touched need to be covered with paper, foil or plastic that can be changed frequently. Shared air supplies can also present problems by distributing hot aerosols throughout a building. Aerosols can be monitored for 14C content using graphitized coal or fullerene soot mixed with metal powder as an absorber. The monitors can be set out in work spaces for 1-2 weeks and measured by AMS with regular samples. Frequent air changes help minimize aerosol contamination in many cases. Cross-contamination of samples can be minimized by using disposable plastic or glassware in the prep lab, isolating samples from the air when possible and using positive displacement pipettors.
An Overview on Prenatal Screening for Chromosomal Aberrations.
Hixson, Lucas; Goel, Srishti; Schuber, Paul; Faltas, Vanessa; Lee, Jessica; Narayakkadan, Anjali; Leung, Ho; Osborne, Jim
2015-10-01
This article is a review of current and emerging methods used for prenatal detection of chromosomal aneuploidies. Chromosomal anomalies in the developing fetus can occur in any pregnancy and lead to death prior to or shortly after birth or to costly lifelong disabilities. Early detection of fetal chromosomal aneuploidies, an atypical number of certain chromosomes, can help parents evaluate their pregnancy options. Current diagnostic methods include maternal serum sampling or nuchal translucency testing, which are minimally invasive diagnostics, but lack sensitivity and specificity. The gold standard, karyotyping, requires amniocentesis or chorionic villus sampling, which are highly invasive and can cause abortions. In addition, many of these methods have long turnaround times, which can cause anxiety in mothers. Next-generation sequencing of fetal DNA in maternal blood enables minimally invasive, sensitive, and reasonably rapid analysis of fetal chromosomal anomalies and can be of clinical utility to parents. This review covers traditional methods and next-generation sequencing techniques for diagnosing aneuploidies in terms of clinical utility, technological characteristics, and market potential. © 2015 Society for Laboratory Automation and Screening.
Optimal Time-Resource Allocation for Energy-Efficient Physical Activity Detection
Thatte, Gautam; Li, Ming; Lee, Sangwon; Emken, B. Adar; Annavaram, Murali; Narayanan, Shrikanth; Spruijt-Metz, Donna; Mitra, Urbashi
2011-01-01
The optimal allocation of samples for physical activity detection in a wireless body area network for health monitoring is considered. The number of biometric samples collected at the mobile device fusion center, from both device-internal and external Bluetooth heterogeneous sensors, is optimized to minimize the transmission power for a fixed number of samples and to meet a performance requirement defined using the probability of misclassification between multiple hypotheses. A filter-based feature selection method determines an optimal feature set for classification, and a correlated Gaussian model is considered. Using experimental data from overweight adolescent subjects, it is found that allocating a greater proportion of samples to sensors that better discriminate between certain activity levels can result in either a lower probability of error or energy savings ranging from 18% to 22%, in comparison to equal allocation of samples. The current activity of the subjects and the performance requirements do not significantly affect the optimal allocation, but employing personalized models results in improved energy efficiency. Because the number of samples is an integer, determining the optimal allocation typically requires an exhaustive search, which is computationally expensive. To this end, an alternate, continuous-valued vector optimization is derived which yields approximately optimal allocations and can be implemented on the mobile fusion center due to its significantly lower complexity. PMID:21796237
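The integer-allocation difficulty mentioned in the abstract can be illustrated with a toy sketch. This is not the authors' algorithm: the proportional rule, the discriminability scores, and the largest-remainder rounding are all illustrative assumptions standing in for the continuous-valued relaxation they describe.

```python
def allocate_samples(discriminability, total):
    """Continuous-relaxation heuristic for integer sample allocation.

    Illustrative sketch only: allocate a fixed sample budget across
    sensors in proportion to a per-sensor discriminability score, then
    round back to integers while preserving the total budget
    (largest-remainder rounding), avoiding an exhaustive integer search.
    """
    s = sum(discriminability)
    ideal = [total * d / s for d in discriminability]  # continuous optimum proxy
    alloc = [int(x) for x in ideal]                    # round down
    # Hand the leftover samples to the largest fractional parts.
    order = sorted(range(len(ideal)),
                   key=lambda i: ideal[i] - alloc[i], reverse=True)
    for i in order[: total - sum(alloc)]:
        alloc[i] += 1
    return alloc
```

In this toy version a sensor that discriminates three times better than another simply receives three times the samples; the real optimization in the paper instead minimizes transmission power subject to a misclassification-probability constraint.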
NASA Technical Reports Server (NTRS)
Calaway, Michael J.; Allen, Carlton C.; Allton, Judith H.
2014-01-01
Future robotic and human spaceflight missions to the Moon, Mars, asteroids, and comets will require curating astromaterial samples with minimal inorganic and organic contamination to preserve the scientific integrity of each sample. 21st century sample return missions will focus on strict protocols for reducing organic contamination that have not been seen since the Apollo manned lunar landing program. To properly curate these materials, the Astromaterials Acquisition and Curation Office under the Astromaterial Research and Exploration Science Directorate at NASA Johnson Space Center houses and protects all extraterrestrial materials brought back to Earth that are controlled by the United States government. During fiscal year 2012, we conducted a year-long project to compile historical documentation and laboratory tests involving organic investigations at these facilities. In addition, we developed a plan to determine the current state of organic cleanliness in curation laboratories housing astromaterials. This was accomplished by focusing on current procedures and protocols for cleaning, sample handling, and storage. While the intention of this report is to give a comprehensive overview of the current state of organic cleanliness in JSC curation laboratories, it also provides a baseline for determining whether our cleaning procedures and sample handling protocols need to be adapted and/or augmented to meet the new requirements for future human spaceflight and robotic sample return missions.
Survey methods for assessing land cover map accuracy
Nusser, S.M.; Klaas, E.E.
2003-01-01
The increasing availability of digital photographic materials has fueled efforts by agencies and organizations to generate land cover maps for states, regions, and the United States as a whole. Regardless of the information sources and classification methods used, land cover maps are subject to numerous sources of error. In order to understand the quality of the information contained in these maps, it is desirable to generate statistically valid estimates of accuracy rates describing misclassification errors. We explored a full sample survey framework for creating accuracy assessment study designs that balance statistical and operational considerations in relation to study objectives for a regional assessment of GAP land cover maps. We focused not only on appropriate sample designs and estimation approaches, but on aspects of the data collection process, such as gaining cooperation of land owners and using pixel clusters as an observation unit. The approach was tested in a pilot study to assess the accuracy of Iowa GAP land cover maps. A stratified two-stage cluster sampling design addressed sample size requirements for land covers and the need for geographic spread while minimizing operational effort. Recruitment methods used for private land owners yielded high response rates, minimizing a source of nonresponse error. Collecting data for a 9-pixel cluster centered on the sampled pixel was simple to implement, and provided better information on rarer vegetation classes as well as substantial gains in precision relative to observing data at a single pixel.
Automated high-throughput protein purification using an ÄKTApurifier and a CETAC autosampler.
Yoo, Daniel; Provchy, Justin; Park, Cynthia; Schulz, Craig; Walker, Kenneth
2014-05-30
As the pace of drug discovery accelerates there is an increased focus on screening larger numbers of protein therapeutic candidates to identify those that are functionally superior and to assess manufacturability earlier in the process. Although there have been advances toward high throughput (HT) cloning and expression, protein purification is still an area where improvements can be made to conventional techniques. Current methodologies for purification often involve a tradeoff between HT automation or capacity and quality. We present an ÄKTA combined with an autosampler, the ÄKTA-AS, which has the capability of purifying up to 240 samples in two chromatographic dimensions without the need for user intervention. The ÄKTA-AS has been shown to be reliable with sample volumes between 0.5 mL and 100 mL, and the innovative use of a uniquely configured loading valve ensures reliability by efficiently removing air from the system as well as preventing sample cross contamination. Incorporation of a sample pump flush minimizes sample loss and enables recoveries ranging from the low tens of micrograms to milligram quantities of protein. In addition, when used in an affinity capture-buffer exchange format the final samples are formulated in a buffer compatible with most assays without requirement of additional downstream processing. The system is designed to capture samples in 96-well microplate format allowing for seamless integration of downstream HT analytic processes such as microfluidic or HPLC analysis. Most notably, there is minimal operator intervention to operate this system, thereby increasing efficiency, sample consistency and reducing the risk of human error. Copyright © 2014 Elsevier B.V. All rights reserved.
Chen, Wei-Qiang; Obermayr, Philipp; Černigoj, Urh; Vidič, Jana; Panić-Janković, Tanta; Mitulović, Goran
2017-11-01
Classical proteomics approaches involve enzymatic hydrolysis of proteins (either separated by polyacrylamide gels or in solution) followed by peptide identification using LC-MS/MS analysis. This method normally requires more than 16 h to complete. In the case of clinical analysis, it is of the utmost importance to provide fast and reproducible analysis with minimal manual sample handling. Herein we report the method development for online protein digestion on immobilized monolithic enzymatic reactors (IMER) to accelerate protein digestion, reduce manual sample handling, and provide reproducibility to the digestion process in the clinical laboratory. An integrated online digestion and separation method using a monolithic immobilized enzymatic reactor was developed and applied to the digestion and separation of in-vitro-fertilization media. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Jenkins, Jill A.
2011-01-01
Investigations into cellular and molecular characteristics of male gametes obtained from fish in natural ecosystems require careful sample handling and shipping in order to minimize artifacts. Maintaining sample integrity engenders confident assessments of ecosystem health, whereby animal condition is often reflected by gamete biomarkers - indicators that respond in measurable ways to changes. A number of our investigations have addressed the hypothesis that biomarkers from fish along a pollution gradient are reflective of site location. Species biology and the selected biological endpoints direct choice of parameters such as: temperature, buffer osmolality, time in transit, fixation, cryoprotectants, protease inhibition, and antibiotic inclusion in extender. This paper will highlight case studies, and outline parameters and thoughts on approaches for use by field and laboratory researchers.
Computer modelling of grain microstructure in three dimensions
NASA Astrophysics Data System (ADS)
Narayan, K. Lakshmi
We present a program that generates two-dimensional micrographs of a three-dimensional grain microstructure. The code utilizes a novel scanning, pixel-mapping technique to obtain statistical distributions of surface areas, grain sizes, aspect ratios, perimeters, numbers of nearest neighbors, and volumes of the randomly nucleated particles. The program can be used for comparing existing theories of grain growth and for interpreting the two-dimensional microstructure of three-dimensional samples. Special features have been included to minimize computation time and resource requirements.
Martini, Valeria; Bernardi, Serena; Marelli, Priscilla; Cozzi, Marzia; Comazzi, Stefano
2018-06-01
Objectives Flow cytometry (FC) is becoming increasingly popular among veterinary oncologists for the diagnosis of lymphoma or leukaemia. It is accurate, fast and minimally invasive. Several studies of FC have been carried out in canine oncology and applied with great results, whereas there is limited knowledge and use of this technique in feline patients. This is mainly owing to the high prevalence of intra-abdominal lymphomas in this species and the difficulty associated with the diagnostic procedures needed to collect the sample. The purpose of the present study is to investigate whether any pre-analytical factor might affect the quality of suspected feline lymphoma samples for FC analysis. Methods Ninety-seven consecutive samples of suspected feline lymphoma were retrospectively selected from the authors' institution's FC database. The referring veterinarians were contacted and interviewed about several different variables, including signalment, appearance of the lesion, features of the sampling procedure and the experience of veterinarians performing the sampling. Statistical analyses were performed to assess the possible influence of these variables on the cellularity of the samples and the likelihood of it being finally processed for FC. Results Sample cellularity is a major factor in the likelihood of the sample being processed. Moreover, sample cellularity was significantly influenced by the needle size, with 21 G needles providing the highest cellularity. Notably, the sample cellularity and the likelihood of being processed did not vary between peripheral and intra-abdominal lesions. Approximately half of the cats required pharmacological restraint. Side effects were reported in one case only (transient swelling after peripheral lymph node sampling). Conclusions and relevance FC can be safely applied to cases of suspected feline lymphomas, including intra-abdominal lesions. A 21 G needle should be preferred for sampling. This study provides the basis for the increased use of this minimally invasive, fast and cost-effective technique in feline medicine.
FPGA design for constrained energy minimization
NASA Astrophysics Data System (ADS)
Wang, Jianwei; Chang, Chein-I.; Cao, Mang
2004-02-01
The Constrained Energy Minimization (CEM) approach has been widely used for hyperspectral detection and classification. The feasibility of implementing the CEM as a real-time processing algorithm in systolic arrays has also been demonstrated. The main challenge of realizing the CEM in hardware architecture is the computation of the inverse of the data correlation matrix performed in the CEM, which requires a complete set of data samples. In order to cope with this problem, the data correlation matrix must be calculated in a causal manner, using only the data samples up to the sample at the time it is processed. This paper presents a Field Programmable Gate Array (FPGA) design of such a causal CEM. The main feature of the proposed FPGA design is the use of the COordinate Rotation DIgital Computer (CORDIC) algorithm, which can convert a Givens rotation of a vector into a set of shift-add operations. As a result, the CORDIC algorithm can be easily implemented in hardware architecture, and therefore in FPGA. Since the computation of the inverse of the data correlation matrix involves a series of Givens rotations, the use of the CORDIC algorithm allows the causal CEM to perform real-time processing in FPGA. In this paper, an FPGA implementation of the causal CEM is studied and its detailed architecture is described.
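To illustrate the shift-add idea behind the hardware design, here is a minimal software model of vectoring-mode CORDIC (a sketch of the general algorithm, not the paper's FPGA architecture): each iteration applies a micro-rotation by ±atan(2⁻ⁱ) using only shifts and adds, driving the y component of the vector to zero, which is exactly the effect of a Givens rotation.

```python
import math

def cordic_vectoring(x, y, iters=32):
    """Rotate (x, y) onto the x-axis using only shift-and-add micro-rotations.

    Vectoring-mode CORDIC (assumes x > 0): after correcting for the CORDIC
    gain, the returned magnitude equals sqrt(x^2 + y^2) and the accumulated
    angle equals atan2(y, x) -- the parameters of the Givens rotation that
    zeroes y.
    """
    # Pre-computed gain K = prod(1/sqrt(1 + 2^(-2i))) for this iteration count.
    K = 1.0
    for i in range(iters):
        K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
    angle = 0.0
    for i in range(iters):
        # Choose the micro-rotation direction that drives y toward zero.
        d = -1.0 if y > 0 else 1.0
        # In fixed-point hardware these products are pure bit-shifts.
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        angle -= d * math.atan(2.0 ** -i)
    return x * K, angle
```

In hardware, the multiplications by 2⁻ⁱ become wire shifts and the atan(2⁻ⁱ) values come from a small lookup table, which is why this formulation maps so cheaply onto an FPGA.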
Optimized Geometry for Superconducting Sensing Coils
NASA Technical Reports Server (NTRS)
Eom, Byeong Ho; Pananen, Konstantin; Hahn, Inseob
2008-01-01
An optimized geometry has been proposed for superconducting sensing coils that are used in conjunction with superconducting quantum interference devices (SQUIDs) in magnetic resonance imaging (MRI), magnetoencephalography (MEG), and related applications in which magnetic fields of small dipoles are detected. In designing a coil of this type, as in designing other sensing coils, one seeks to maximize the sensitivity of the detector of which the coil is a part, subject to geometric constraints arising from the proximity of other required equipment. In MRI or MEG, the main benefit of maximizing the sensitivity would be to enable minimization of measurement time. In general, to maximize the sensitivity of a detector based on a sensing coil coupled with a SQUID sensor, it is necessary to maximize the magnetic flux enclosed by the sensing coil while minimizing the self-inductance of this coil. Simply making the coil larger may increase its self-inductance and does not necessarily increase sensitivity because it also effectively increases the distance from the sample that contains the source of the signal that one seeks to detect. Additional constraints on the size and shape of the coil and on the distance from the sample arise from the fact that the sample is at room temperature but the coil and the SQUID sensor must be enclosed within a cryogenic shield to maintain superconductivity.
Pereira, Polyana F; Marra, Mariana C; Munoz, Rodrigo A A; Richter, Eduardo M
2012-02-15
A simple, accurate and fast (180 injections h⁻¹) batch injection analysis (BIA) system with multiple-pulse amperometric detection has been developed for the selective determination of ethanol in gasohol and fuel ethanol. A sample aliquot (100 μL) was directly injected onto a gold electrode immersed in 0.5 mol L⁻¹ NaOH solution (the only reagent required). The proposed BIA method requires minimal sample manipulation and can be easily used for on-site analysis. The results obtained with the BIA method were compared to those obtained by gas chromatography, and similar results were obtained (at the 95% confidence level). Published by Elsevier B.V.
Tracer-monitored flow titrations.
Sasaki, Milton K; Rocha, Diogo L; Rocha, Fábio R P; Zagatto, Elias A G
2016-01-01
The feasibility of implementing tracer-monitored titrations in a flow system is demonstrated. A dye tracer is used to estimate the instantaneous sample and titrant volumetric fractions without the need for volume, mass, or peak-width measurements. The approach was applied to spectrophotometric flow titrations involving variations of sample and titrant flow rates (i.e., the triangle-programmed technique) or concentration gradients established along the sample zone (i.e., a flow injection system). Both strategies required simultaneous monitoring of two absorbing species, namely the titration indicator and the dye tracer. Mixing conditions were improved by placing a chamber with mechanical stirring in the analytical path, aiming to minimize diffusional effects. Unlike most flow-based titrations, the proposed approach is a true titration: it does not require a calibration curve and thus complies with the IUPAC definition. As an application, the evaluation of acidity in vinegars by titration with sodium hydroxide was selected. Phenolphthalein and brilliant blue FCF were used as indicator and dye tracer, respectively. The effects of sample volume, titrand/titrant concentrations, and flow rates were investigated, aiming at improved accuracy and precision. Results were reliable and in agreement with those obtained by a reference titration procedure. Copyright © 2015 Elsevier B.V. All rights reserved.
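The tracer-based fraction estimate can be sketched as follows, in hypothetical notation rather than the authors' exact expressions: if the dye tracer is present only in the titrant stream and $A_{\max}$ is the tracer absorbance of undiluted titrant, then for a 1:1 stoichiometry

```latex
% x_T: instantaneous titrant volume fraction; "ep" marks its end-point value.
x_T = \frac{A}{A_{\max}},
\qquad
\frac{C_{\mathrm{sample}}}{C_{\mathrm{titrant}}}
  = \frac{x_T^{\mathrm{ep}}}{1 - x_T^{\mathrm{ep}}}
```

where $A$ is the measured tracer absorbance and the second relation follows from equating moles of sample and titrant at the end point signaled by the indicator transition. This is why no calibration curve is needed: the result depends only on the measured fraction and the known titrant concentration.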
NASA Technical Reports Server (NTRS)
Hecht, Michael; Carsey, Frank
2005-01-01
The subsurface ice probe (SIPR) is a proposed apparatus that would bore into ice to depths as great as hundreds of meters by melting the ice and pumping the samples of meltwater to the surface. Originally intended for use in exploration of subsurface ice on Mars and other remote planets, the SIPR could also be used on Earth as an alternative to coring, drilling, and melting apparatuses heretofore used to sample Arctic and Antarctic ice sheets. The SIPR would include an assembly of instrumentation and electronic control equipment at the surface, connected via a tether to a compact assembly of boring, sampling, and sensor equipment in the borehole (see figure). Placing as much equipment as possible at the surface would help to attain primary objectives of minimizing power consumption, sampling with high depth resolution, and unobstructed imaging of the borehole wall. To the degree to which these requirements would be satisfied, the SIPR would offer advantages over the aforementioned ice-probing systems.
Fernández, Beatriz; Rodríguez-González, Pablo; García Alonso, J Ignacio; Malherbe, Julien; García-Fonseca, Sergio; Pereiro, Rosario; Sanz-Medel, Alfredo
2014-12-03
We report on the determination of trace elements in solid samples by the combination of on-line double isotope dilution and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). The proposed method requires the sequential analysis of the sample and a certified natural abundance standard by on-line IDMS using the same isotopically-enriched spike solution. In this way, the mass fraction of the analyte in the sample can be directly referred to the certified standard, so prior characterization of the spike solution is not required. To validate the procedure, Sr, Rb and Pb were determined in certified reference materials with different matrices, including silicate glasses (SRM 610, 612 and 614) and powdered samples (PACS-2, SRM 2710a, SRM 1944, SRM 2702 and SRM 2780). The analysis of powdered samples was carried out both by the preparation of pressed pellets and by lithium borate fusion. Experimental results for the analysis of powdered samples were in agreement with the certified values for all materials. Relative standard deviations in the range of 6-21% for pressed pellets and 3-21% for fused solids were obtained from n=3 independent measurements. Minimal sample preparation, simple data treatment, and low consumption of the isotopically-enriched spike are the main advantages of the method over previously reported approaches. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wood, W. T.; Runyan, T. E.; Palmsten, M.; Dale, J.; Crawford, C.
2016-12-01
Natural gas (primarily methane) and gas hydrate accumulations require certain bio-geochemical as well as physical conditions, some of which are poorly sampled and/or poorly understood. We exploit recent advances in the prediction of seafloor porosity and heat flux via machine learning techniques (e.g., random forests and Bayesian networks) to predict the occurrence of gas, and subsequently gas hydrate, in marine sediments. The prediction (more precisely, guided interpolation) of the key parameters used in this study is based on a k-nearest-neighbor (KNN) technique. KNN requires only minimal pre-processing of the data and predictors, and minimal run-time input, so the results are almost entirely data-driven. Specifically, we use new estimates of sedimentation rate and sediment type, along with recently derived compaction modeling, to estimate profiles of porosity and age. We combine the compaction estimates with seafloor heat flux to estimate temperature as a function of depth and geologic age, which, with estimates of organic carbon and models of methanogenesis, yields limits on the production of methane. Results include geospatial predictions of gas (and gas hydrate) accumulations, with quantitative estimates of uncertainty. The Generic Earth Modeling System (GEMS) we have developed to derive the machine learning estimates is modular and easily updated with new algorithms or data.
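A minimal sketch of KNN-based guided interpolation follows. It is illustrative only: the function names, the Euclidean metric, and the inverse-distance weighting are assumptions, not details taken from the abstract.

```python
import math

def knn_predict(train_pts, train_vals, query, k=3):
    """k-nearest-neighbor interpolation of a geospatial parameter.

    Predict a value (e.g., sedimentation rate) at an unsampled location
    from the k closest observations, weighting each neighbor by inverse
    distance. Purely data-driven: no model fitting or pre-processing.
    """
    # Rank observations by Euclidean distance to the query location.
    ranked = sorted(
        (math.dist(p, query), v) for p, v in zip(train_pts, train_vals)
    )
    nearest = ranked[:k]
    # An exact hit wins outright (and avoids dividing by zero).
    for d, v in nearest:
        if d == 0.0:
            return v
    weights = [1.0 / d for d, _ in nearest]
    return sum(w * v for w, (_, v) in zip(weights, nearest)) / sum(weights)
```

Because each prediction depends only on the stored observations and a distance metric, swapping in new data or predictors requires no retraining, which matches the "easily updated" modularity claimed for GEMS.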
NASA Astrophysics Data System (ADS)
Levin, Barnaby
The transmission electron microscope (TEM) is a powerful tool for characterizing the nanoscale and atomic structure of materials, offering insights into their fundamental physical properties. However, TEM characterization requires very thin samples of material to be placed in a high vacuum environment and exposed to electron radiation. The high vacuum will induce some materials to evaporate or sublimate, preventing them from being accurately characterized; radiation may damage the sample, causing mass loss or altering its structure; and structurally delicate samples may collapse and break apart when they are thinned for TEM imaging. This dissertation discusses three projects in which each of these three difficulties poses challenges to TEM characterization of samples. Firstly, we outline strategies for minimizing radiation damage when characterizing materials in TEM at atomic resolution. We consider types of radiation damage, such as vacancy enhanced displacement, that are not included in some previous discussions of beam damage, and we consider how to minimize damage when using new imaging techniques such as annular bright-field scanning TEM. Our methodology emphasizes the general principle that variation of both signal strength and damage cross section must be considered when choosing an experimental electron beam voltage to minimize damage. Secondly, we consider samples containing sulfur, which is prone to sublimation in high vacuum. TEM is routinely used to attempt to characterize the sulfur distribution in lithium-sulfur battery electrodes, but sublimation artifacts can give misleading results. We demonstrate that sulfur sublimation can be suppressed by using cryogenic TEM to characterize sulfur at very low temperatures, or by using the recently developed airSEM to characterize sulfur without exposing it to vacuum. Finally, we discuss the characterization of aging cadmium yellow paint from early 20th century art masterpieces. The binding medium holding paint particles together bends and curls as sample thickness is reduced to 100 nm, making high resolution characterization challenging. We acquire lattice resolution images of the pigment particles through the binder using high voltage zero-loss energy filtered TEM, allowing us to measure the pigment particle size and determine the pigment crystal structure, providing insight into why the paint is aging and how it was synthesized.
Sabatini, Francesca; Lluveras-Tenorio, Anna; Degano, Ilaria; Kuckova, Stepanka; Krizova, Iva; Colombini, Maria Perla
2016-11-01
This study deals with the identification of anthraquinoid molecular markers in standard dyes, reference lakes, and paint model systems using a micro-invasive and nondestructive technique, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-ToF-MS). Red anthraquinoid lakes, such as madder lake, carmine lake, and Indian lac, have been the most widely used for painting purposes since ancient times. From an analytical point of view, identifying lakes in paint samples is challenging, and developing methods that maximize the information obtained while minimizing the amount of sample needed is of paramount importance. The employed method was tested on less than 0.5 mg of reference samples and required minimal sample preparation, entailing a hydrofluoric acid extraction. The method is fast and versatile because of the possibility to re-analyze the same sample (once it has been spotted on the steel plate), testing both positive and negative modes in a few minutes. The MALDI mass spectra collected in the two analysis modes were studied and compared with LDI and simulated mass spectra in order to highlight the peculiar behavior of the anthraquinones in the MALDI process. Both ionization modes were assessed for each species. The effect of the different paint binders on dye identification was also evaluated through the analyses of paint model systems. In the end, the method was successful in detecting madder lake in archeological samples from Greek wall paintings and on an Italian funerary clay vessel, demonstrating its capability to identify dyes in small amounts of highly degraded samples.
3D hyperpolarized C-13 EPI with calibrationless parallel imaging
NASA Astrophysics Data System (ADS)
Gordon, Jeremy W.; Hansen, Rie B.; Shin, Peter J.; Feng, Yesu; Vigneron, Daniel B.; Larson, Peder E. Z.
2018-04-01
With the translation of metabolic MRI with hyperpolarized 13C agents into the clinic, imaging approaches will require large volumetric FOVs to support clinical applications. Parallel imaging techniques will be crucial to increasing volumetric scan coverage while minimizing RF requirements and temporal resolution. Calibrationless parallel imaging approaches are well-suited for this application because they eliminate the need to acquire coil profile maps or auto-calibration data. In this work, we explored the utility of a calibrationless parallel imaging method (SAKE) and corresponding sampling strategies to accelerate and undersample hyperpolarized 13C data using 3D blipped EPI acquisitions and multichannel receive coils, and demonstrated its application in a human study of [1-13C]pyruvate metabolism.
Révész, Kinga M; Landwehr, Jurate M
2002-01-01
A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H3PO4/CaCO3) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous-flow isotope ratio mass spectrometer. Conditions for which the H3PO4/CaCO3 reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 °C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was ≤0.1 and ≤0.2 per mill (‰), respectively, although later analysis showed that materials from one specific standard required a reaction time between 34 and 54 h for δ18O to achieve this level of precision. Aliquot screening methods were shown to further minimize the total error. The accuracy and precision of the new method were analyzed and confirmed by statistical analysis. The utility of the method was verified by analyzing calcite from Devils Hole, Nevada, for which isotope-ratio values had previously been obtained by the classical method. Devils Hole core DH-11 recently had been re-cut and re-sampled, and isotope-ratio values were obtained using the new method. The results were comparable with those obtained by the classical method, with correlation = +0.96 for both isotope ratios. The consistency of the isotopic results is such that an alignment offset could be identified in the re-sampled core material, and two cutting errors that occurred during re-sampling were then confirmed independently. This result indicates that the new method is a viable alternative to the classical reaction method. In particular, the new method requires less sample material, permitting finer resolution, and allows automation of some processes, resulting in considerable time savings.
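For reference, the stable carbon and oxygen isotope ratios discussed above are conventionally reported in delta notation, in per mill relative to a standard:

```latex
% R is the heavy-to-light isotope ratio, e.g. ^{13}C/^{12}C or ^{18}O/^{16}O.
\delta = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right)
         \times 1000\ \text{\textperthousand}
```

so the quoted precisions of ≤0.1‰ and ≤0.2‰ correspond to relative uncertainties in the measured isotope ratios of about 1 and 2 parts in 10,000, respectively.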
Lab-on-chip systems for integrated bioanalyses
Madaboosi, Narayanan; Soares, Ruben R.G.; Fernandes, João Tiago S.; Novo, Pedro; Moulas, Geraud; Chu, Virginia
2016-01-01
Biomolecular detection systems based on microfluidics are often called lab-on-chip systems. To fully benefit from the miniaturization resulting from microfluidics, one aims to develop ‘from sample-to-answer’ analytical systems, in which the input is a raw or minimally processed biological, food/feed or environmental sample and the output is a quantitative or qualitative assessment of one or more analytes of interest. In general, such systems will require the integration of several steps or operations to perform their function. This review will discuss these stages of operation, including fluidic handling, which assures that the desired fluid arrives at a specific location at the right time and under the appropriate flow conditions; molecular recognition, which allows the capture of specific analytes at precise locations on the chip; transduction of the molecular recognition event into a measurable signal; sample preparation upstream from analyte capture; and signal amplification procedures to increase sensitivity. Seamless integration of the different stages is required to achieve a point-of-care/point-of-use lab-on-chip device that allows analyte detection at the relevant sensitivity ranges, with a competitive analysis time and cost. PMID:27365042
Predehydration and Ice Seeding in the Presence of Trehalose Enable Cell Cryopreservation
2017-01-01
Conventional approaches for cell cryopreservation require the use of toxic membrane-penetrating cryoprotective agents (pCPA), which limits the clinical application of cryopreserved cells. Here, we show intentionally induced ice formation at a high subzero temperature (> −10 °C) during cryopreservation, which is often referred to as ice seeding, could result in significant cell injury in the absence of any pCPA. This issue can be mitigated by predehydrating cells using extracellular trehalose to their minimal volume with minimized osmotically active water before ice seeding. We further observe that ice seeding can minimize the interfacial free energy that drives the devastating ice recrystallization-induced cell injury during warming cryopreserved samples. Indeed, by combining predehydration using extracellular trehalose with ice seeding at high subzero temperatures, high cell viability or recovery is achieved for fibroblasts, adult stem cells, and red blood cells after cryopreservation without using any pCPA. The pCPA-free technology developed in this study may greatly facilitate the long-term storage and ready availability of living cells, tissues, and organs that are of high demand by modern cell-based medicine. PMID:28824959
Deniz, Cem M; Vaidya, Manushka V; Sodickson, Daniel K; Lattanzi, Riccardo
2016-01-01
We investigated global specific absorption rate (SAR) and radiofrequency (RF) power requirements in parallel transmission as the distance between the transmit coils and the sample was increased. We calculated ultimate intrinsic SAR (UISAR), which depends on object geometry and electrical properties but not on coil design, and we used it as the reference to compare the performance of various transmit arrays. We investigated the case of fixing coil size and increasing the number of coils while moving the array away from the sample, as well as the case of fixing coil number and scaling coil dimensions. We also investigated RF power requirements as a function of lift-off, and tracked local SAR distributions associated with global SAR optima. In all cases, the target excitation profile was achieved and global SAR (as well as associated maximum local SAR) decreased with lift-off, approaching UISAR, which was constant for all lift-offs. We observed a lift-off value that optimizes the balance between global SAR and power losses in coil conductors. We showed that, using parallel transmission, global SAR can decrease at ultra high fields for finite arrays with a sufficient number of transmit elements. For parallel transmission, the distance between coils and object can be optimized to reduce SAR and minimize RF power requirements associated with homogeneous excitation. © 2015 Wiley Periodicals, Inc.
Preliminary Assessment/Site Inspection Work Plan for Granite Mountain Radio Relay System
1994-09-01
represent field conditions, and (3) sampling results are repeatable. 1.5.2 Sample Handling Sample...procedures specified in Section 2.1.3. Samples collected from shallow depths will be obtained by submerging a stainless-steel, Teflon, or glass... submerged in a manner that minimizes agitation of sediment and the water sample. If a seep or spring has minimal discharge flow, gravel, boulders, and soil
NASA Astrophysics Data System (ADS)
Siegert, Martin J.; Clarke, Rachel J.; Mowlem, Matt; Ross, Neil; Hill, Christopher S.; Tait, Andrew; Hodgson, Dominic; Parnell, John; Tranter, Martyn; Pearce, David; Bentley, Michael J.; Cockell, Charles; Tsaloglou, Maria-Nefeli; Smith, Andy; Woodward, John; Brito, Mario P.; Waugh, Ed
2012-01-01
Antarctic subglacial lakes are thought to be extreme habitats for microbial life and may contain important records of ice sheet history and climate change within their lake floor sediments. To find whether or not this is true, and to answer the science questions that would follow, direct measurement and sampling of these environments are required. Ever since the water depth of Vostok Subglacial Lake was shown to be >500 m, attention has been given to how these unique, ancient, and pristine environments may be entered without contamination and adverse disturbance. Several organizations have offered guidelines on the desirable cleanliness and sterility requirements for direct sampling experiments, including the U.S. National Academy of Sciences and the Scientific Committee on Antarctic Research. Here we summarize the scientific protocols and methods being developed for the exploration of Ellsworth Subglacial Lake in West Antarctica, planned for 2012-2013, which we offer as a guide to future subglacial environment research missions. The proposed exploration involves accessing the lake using a hot-water drill and deploying a sampling probe and sediment corer to allow sample collection. We focus here on how this can be undertaken with minimal environmental impact while maximizing scientific return without compromising the environment for future experiments.
Dhir, Ashish; Rogawski, Michael A
2018-05-01
Diazepam, administered by the intravenous, oral, or rectal routes, is widely used for the management of acute seizures. Dosage forms for delivery of diazepam by other routes of administration, including intranasal, intramuscular, and transbuccal, are under investigation. In predicting what dosages are necessary to terminate seizures, the minimal exposure required to confer seizure protection must be known. Here we administered diazepam by continuous intravenous infusion to obtain near-steady-state levels, which allowed an assessment of the minimal levels that elevate seizure threshold. The thresholds for various behavioral seizure signs (myoclonic jerk, clonus, and tonus) were determined with the timed intravenous pentylenetetrazol seizure threshold test in rats. Diazepam was administered to freely moving animals by continuous intravenous infusion via an indwelling jugular vein cannula. Blood samples for assay of plasma levels of diazepam and metabolites were recovered via an indwelling cannula in the contralateral jugular vein. The pharmacokinetic parameters of diazepam following a single 80-μg/kg intravenous bolus injection were determined using a noncompartmental pharmacokinetic approach. The derived parameters Vd, CL, t1/2α (distribution half-life), and t1/2β (terminal half-life) for diazepam were 608 mL, 22.1 mL/min, 13.7 minutes, and 76.8 minutes, respectively. Various doses of diazepam were continuously infused without or with an initial loading dose. At the end of the infusions, the thresholds for various behavioral seizure signs were determined. The minimal plasma diazepam concentration associated with threshold elevations was estimated at approximately 70 ng/mL. The active metabolites nordiazepam, oxazepam, and temazepam achieved levels that are expected to make only minor contributions to the threshold elevations. Diazepam elevates seizure threshold at steady-state plasma concentrations lower than previously recognized.
The minimally effective plasma concentration provides a reference that may be considered when estimating the diazepam exposure required for acute seizure treatment. Wiley Periodicals, Inc. © 2018 International League Against Epilepsy.
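The reported distribution and terminal half-lives map directly onto exponential rate constants via t½ = ln 2/λ, which can be used to sketch the shape of a biexponential disposition curve. A minimal sketch: only the two rate constants come from the abstract; the coefficients A and B are hypothetical placeholders chosen purely to illustrate the curve shape.

```python
import math

# Rate constants from the reported half-lives: t_half = ln(2) / lambda
alpha = math.log(2) / 13.7   # distribution phase, 1/min (t1/2-alpha = 13.7 min)
beta = math.log(2) / 76.8    # terminal phase, 1/min (t1/2-beta = 76.8 min)

def conc(t_min, a=250.0, b=100.0):
    """Biexponential plasma concentration C(t) = A*e^(-alpha*t) + B*e^(-beta*t).

    A and B (ng/mL) are hypothetical illustrative values; only alpha and
    beta are derived from the parameters reported in the abstract.
    """
    return a * math.exp(-alpha * t_min) + b * math.exp(-beta * t_min)
```

Because the distribution rate constant is roughly five times the terminal one, the curve falls quickly at first and then decays slowly, which is why a loading dose plus infusion was needed to approximate steady state.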
NASA Astrophysics Data System (ADS)
Sheikholeslami, R.; Hosseini, N.; Razavi, S.
2016-12-01
Modern earth and environmental models are usually characterized by a large parameter space and high computational cost. These two features prevent effective implementation of sampling-based analyses such as sensitivity and uncertainty analysis, which require running these computationally expensive models many times to adequately explore the parameter/problem space. Therefore, developing efficient sampling techniques that scale with the size of the problem, computational budget, and users' needs is essential. In this presentation, we propose an efficient sequential sampling strategy, called Progressive Latin Hypercube Sampling (PLHS), which provides increasingly improved coverage of the parameter space while satisfying pre-defined requirements. The original Latin hypercube sampling (LHS) approach generates the entire sample set in one stage; in contrast, PLHS generates a series of smaller sub-sets (also called 'slices') such that: (1) each sub-set is a Latin hypercube and achieves maximum stratification in any one-dimensional projection; (2) the progressive addition of sub-sets remains a Latin hypercube; and thus (3) the entire sample set is a Latin hypercube. Therefore, it has the capability to preserve the intended sampling properties throughout the sampling procedure. PLHS is deemed advantageous over existing methods, particularly because it nearly avoids over- or under-sampling. Through different case studies, we show that PLHS has multiple advantages over one-stage sampling approaches, including improved convergence and stability of the analysis results with fewer model runs. In addition, PLHS can help minimize the total simulation time by only running the simulations necessary to achieve the desired level of quality (e.g., accuracy and convergence rate).
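For reference, the one-stage LHS baseline that PLHS generalizes can be sketched in a few lines: one point per equal-width stratum in every dimension, with dimensions decoupled by independent column permutations. This is a minimal sketch of standard LHS, not the PLHS slicing algorithm itself; the function name and interface are illustrative.

```python
import numpy as np

def latin_hypercube(n, d, seed=None):
    """One-stage Latin hypercube sample of n points in [0, 1)^d.

    Each of the n equal-width strata in every dimension receives exactly
    one point -- the property PLHS maintains as successive slices are added.
    """
    rng = np.random.default_rng(seed)
    # Place point i in stratum i of every dimension...
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    # ...then decouple the dimensions by permuting each column independently.
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u
```

The property PLHS preserves is exactly the stratification above: after each added slice, every one-dimensional projection of the accumulated sample still has one point per stratum.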
Drummond, A; Rodrigo, A G
2000-12-01
Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
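Since sUPGMA is a variant of UPGMA, its core step is the familiar size-weighted average-linkage agglomeration. The following is a minimal pure-Python sketch of standard UPGMA on a hypothetical distance matrix; sUPGMA additionally constrains terminal lineages to end at heights consistent with their sampling times, which is omitted here.

```python
def upgma(dist, names):
    """Standard UPGMA from a dict mapping frozenset({a, b}) -> distance.

    Returns the merge list [(cluster_a, cluster_b, height), ...].  sUPGMA
    modifies this scheme so serially sampled tips end at different levels.
    """
    sizes = {n: 1 for n in names}
    clusters = list(names)
    merges = []
    while len(clusters) > 1:
        # Pick the closest pair of current clusters.
        a, b = min(
            ((x, y) for i, x in enumerate(clusters) for y in clusters[i + 1:]),
            key=lambda p: dist[frozenset(p)],
        )
        height = dist[frozenset((a, b))]
        new = (a, b)
        sizes[new] = sizes[a] + sizes[b]
        # Size-weighted average-linkage distance update.
        for c in clusters:
            if c not in (a, b):
                dist[frozenset((new, c))] = (
                    sizes[a] * dist[frozenset((a, c))]
                    + sizes[b] * dist[frozenset((b, c))]
                ) / sizes[new]
        clusters = [c for c in clusters if c not in (a, b)] + [new]
        merges.append((a, b, height))
    return merges

# Hypothetical pairwise distances for four sequences
d = {frozenset(p): v for p, v in [
    (("A", "B"), 2.0), (("C", "D"), 4.0),
    (("A", "C"), 6.0), (("A", "D"), 6.0),
    (("B", "C"), 6.0), (("B", "D"), 6.0),
]}
merges = upgma(d, ["A", "B", "C", "D"])
```

With this matrix the merges occur at heights 2, 4, and 6; under a molecular clock those heights are proportional to divergence times, which is the ultrametric assumption sUPGMA relaxes only at the tips.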
Low-Power SOI CMOS Transceiver
NASA Technical Reports Server (NTRS)
Fujikawa, Gene (Technical Monitor); Cheruiyot, K.; Cothern, J.; Huang, D.; Singh, S.; Zencir, E.; Dogan, N.
2003-01-01
The work aims at developing a low-power Silicon-on-Insulator Complementary Metal Oxide Semiconductor (SOI CMOS) transceiver for deep-space communications. The RF receiver must accomplish the following tasks: (a) select the desired radio channel and reject other radio signals, (b) amplify the desired radio signal and translate it to baseband, and (c) detect and decode the information with low BER. In order to minimize cost and achieve a high level of integration, the receiver architecture should use the fewest external filters and passive components. It should also consume the least power to minimize battery cost, size, and weight. One of the most stringent requirements for deep-space communication is low-power operation. Our study identified two candidate architectures that meet these requirements: (1) a low-IF receiver, and (2) a sub-sampling receiver. The low-IF receiver uses the minimum number of external components. Compared to the zero-IF (direct conversion) architecture, it has less severe offset and flicker noise problems. The sub-sampling receiver amplifies the RF signal and samples it using a track-and-hold sub-sampling mixer. These architectures provide a low-power solution for short-range communications missions on Mars. Accomplishments to date include: (1) system-level design and simulation of a double-differential PSK receiver, (2) implementation of the Honeywell SOI CMOS process design kit (PDK) in Cadence design tools, (3) design of test circuits to investigate relationships between layout techniques, geometry, and low-frequency noise in SOI CMOS, (4) model development and verification of on-chip spiral inductors in the SOI CMOS process, (5) design/implementation of a low-power low-noise amplifier (LNA) and mixer for the low-IF receiver, and (6) design/implementation of a high-gain LNA for the sub-sampling receiver.
Our initial results show that substantial improvement in power consumption is achieved using SOI CMOS as compared to standard CMOS process. Potential advantages of SOI CMOS for deep-space communication electronics include: (1) Radiation hardness, (2) Low-power operation, and (3) System-on-Chip (SOC) solutions.
Advanced Curation Protocols for Mars Returned Sample Handling
NASA Astrophysics Data System (ADS)
Bell, M.; Mickelson, E.; Lindstrom, D.; Allton, J.
Introduction: Johnson Space Center has over 30 years' experience handling precious samples, including Lunar rocks and Antarctic meteorites. However, we recognize that future curation of samples from such missions as Genesis, Stardust, and Mars Sample Return will require a high degree of biosafety combined with extremely low levels of inorganic, organic, and biological contamination. To satisfy these requirements, research in the JSC Advanced Curation Lab is currently focused toward two major areas: preliminary examination techniques and cleaning and verification techniques. Preliminary Examination Techniques: In order to minimize the number of paths for contamination, we are exploring the synergy between human & robotic sample handling in a controlled environment to help determine the limits of clean curation. Within the Advanced Curation Laboratory is a prototype, next-generation glovebox, which contains a robotic micromanipulator. The remotely operated manipulator has six degrees-of-freedom and can be programmed to perform repetitive sample handling tasks. Protocols are being tested and developed to perform curation tasks such as rock splitting, weighing, imaging, and storing. Techniques for sample transfer enabling more detailed remote examination without compromising the integrity of sample science are also being developed. The glovebox is equipped with a rapid transfer port through which samples can be passed without exposure. The transfer is accomplished by using a unique seal and engagement system which allows passage between containers while maintaining a first seal to the outside environment and a second seal to prevent the outside of the container cover and port door from becoming contaminated by the material being transferred. Cleaning and Verification Techniques: As part of the contamination control effort, innovative cleaning techniques are being identified and evaluated in conjunction with sensitive cleanliness verification methods.
Towards this end, cleaning techniques such as ultrasonication in ultra-pure water (UPW), oxygen (O2) plasma, and carbon dioxide (CO2) "snow" are being used to clean a variety of different contaminants on a variety of different surfaces. Additionally, once cleaned, techniques to directly verify the surface cleanliness are being developed. These include X-ray photoelectron spectroscopy (XPS) quantification and screening with contact angle measurements, which can be correlated with XPS standards. Methods developed in the Advanced Curation Laboratory will determine the extent to which inorganic and biological contamination can be controlled and minimized.
Sohn, Martin Y; Barnes, Bryan M; Silver, Richard M
2018-03-01
Accurate optics-based dimensional measurements of features sized well below the diffraction limit require a thorough understanding of the illumination within the optical column and of the three-dimensional scattered fields that contain the information required for quantitative metrology. Scatterfield microscopy can pair simulations with angle-resolved tool characterization to improve agreement between the experiment and calculated libraries, yielding sub-nanometer parametric uncertainties. Optimized angle-resolved illumination requires bi-telecentric optics, in which a telecentric sample plane is defined by a Köhler illumination configuration and the conjugate back focal plane (CBFP) of the objective lens is also telecentric; scanning an aperture or an aperture source at the CBFP allows control of the illumination beam angle at the sample plane with minimal distortion. Bi-telecentric illumination optics have been designed enabling angle-resolved illumination for both aperture and source scanning modes while yielding low distortion and chief ray parallelism. The optimized design features a maximum chief ray angle at the CBFP of 0.002° and maximum wavefront deviations of less than 0.06 λ for angle-resolved illumination beams at the sample plane, holding promise for high-quality angle-resolved illumination for improved measurements of deep-subwavelength structures using deep-ultraviolet light.
Behbehani, Gregory K.; Thom, Colin; Zunder, Eli R.; Finck, Rachel; Gaudilliere, Brice; Fragiadakis, Gabriela K.; Fantl, Wendy J.; Nolan, Garry P.
2015-01-01
Fluorescent cellular barcoding and mass-tag cellular barcoding are cytometric methods that enable high sample throughput, minimize inter-sample variation, and reduce reagent consumption. Previously employed barcoding protocols require that barcoding be performed after surface marker staining, complicating combining the technique with measurement of alcohol-sensitive surface epitopes. This report describes a method of barcoding fixed cells after a transient partial permeabilization with 0.02% saponin that results in efficient and consistent barcode staining with fluorescent or mass-tagged reagents while preserving surface marker staining. This approach simplifies barcoding protocols and allows direct comparison of surface marker staining of multiple samples without concern for variations in the antibody cocktail volume, antigen-antibody ratio, or machine sensitivity. Using this protocol, cellular barcoding can be used to reliably detect subtle differences in surface marker expression. PMID:25274027
NASA Astrophysics Data System (ADS)
Nelson, Johanna; Yang, Yuan; Misra, Sumohan; Andrews, Joy C.; Cui, Yi; Toney, Michael F.
2013-09-01
Radiation damage is a topic typically sidestepped in formal discussions of characterization techniques utilizing ionizing radiation. Nevertheless, such damage is critical to consider when planning and performing experiments requiring large radiation doses or radiation sensitive samples. High resolution, in situ transmission X-ray microscopy of Li-ion batteries involves both large X-ray doses and radiation sensitive samples. To successfully identify changes over time solely due to an applied current, the effects of radiation damage must be identified and avoided. Although radiation damage is often significantly sample and instrument dependent, the general procedure to identify and minimize damage is transferable. Here we outline our method of determining and managing the radiation damage observed in lithium sulfur batteries during in situ X-ray imaging on the transmission X-ray microscope at Stanford Synchrotron Radiation Lightsource.
Rappoport, Louis H; Luna, Ingrid Y; Joshua, Gita
2017-05-01
Proper diagnosis and treatment of sacroiliac joint (SIJ) pain remains a clinical challenge. Dysfunction of the SIJ can produce pain in the lower back, buttocks, and extremities. Triangular titanium implants for minimally invasive surgical arthrodesis have been available for several years, with reputed high levels of success and patient satisfaction. This study reports on a novel hydroxyapatite-coated screw for surgical treatment of SIJ pain. Data were prospectively collected on 32 consecutive patients who underwent minimally invasive SIJ fusion with a novel hydroxyapatite-coated screw. Clinical assessments and radiographs were collected and evaluated at 3, 6, and 12 months postoperatively. Mean (standard deviation) patient age was 55.2 ± 10.7 years, and 62.5% were female. More patients (53.1%) underwent left versus right SIJ treatment, mean operative time was 42.6 ± 20.4 minutes, and estimated blood loss did not exceed 50 mL. Overnight hospital stay was required for 84% of patients, and the remaining patients needed a 2-day stay (16%). Mean preoperative visual analog scale back and leg pain scores decreased significantly by 12 months postoperatively (P < 0.01). Mechanical stability was achieved in 93.3% (28/30) of patients, and all patients who were employed preoperatively returned to work within 3 months. Two patients who required revision surgery reported symptom improvement within 3 weeks and did not require subsequent surgery. Positive clinical outcomes are reported 1 year postoperatively after implantation of a novel implant to treat sacroiliac joint pain. Future clinical studies with larger samples are warranted to assess long-term patient outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.
Children's views on microneedle use as an alternative to blood sampling for patient monitoring.
Mooney, Karen; McElnay, James C; Donnelly, Ryan F
2014-10-01
To explore children's views on microneedle use for this population, particularly as an alternative approach to blood sampling, in monitoring applications, and so, examine the acceptability of this approach to children. Focus groups were conducted with children (aged 10-14 years) in a range of schools across Northern Ireland. Convenience sampling was employed, i.e. children involved in a university-directed community-outreach project (Pharmacists in Schools) were recruited. A total of 86 children participated in 13 focus groups across seven schools in Northern Ireland. A widespread disapproval for blood sampling was evident, with pain, blood and traditional needle visualisation particularly unpopular aspects. In general, microneedles had greater visual acceptability and caused less fear. A patch-based design enabled minimal patient awareness of the monitoring procedure, with personalised designs, e.g. cartoon themes, favoured. Children's concerns included possible allergy and potential inaccuracies with this novel approach; however, many had confidence in the judgement of healthcare professionals if deeming this technique appropriate. They considered paediatric patient education critical for acceptance of this new approach and called for an alternative name, without any reference to 'needles'. The findings presented here support the development of blood-free, minimally invasive techniques and provide an initial indication of microneedle acceptability in children, particularly for monitoring purposes. A proactive response to these unique insights should enable microneedle array design to better meet the needs of this end-user group. Further work in this area is recommended to ascertain the perspectives of a purposive sample of children with chronic conditions who require regular monitoring. © 2013 Royal Pharmaceutical Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadler, D.A.; Sun, F.; Littlejohn, D.
1995-12-31
ICP-OES is a useful technique for multi-element analysis of soils. However, as a number of elements are present in relatively high concentrations, matrix interferences can occur, and examples have been widely reported. The availability of CCD detectors has increased the opportunities for rapid multi-element, multi-wavelength determination of elemental concentrations in soils and other environmental samples. As the composition of soils from industrial sites can vary considerably, especially when taken from different pit horizons, procedures are required to assess the extent of interferences and correct the effects on a simultaneous multi-element basis. In single-element analysis, plasma operating conditions can sometimes be varied to minimize or even remove multiplicative interferences. In simultaneous multi-element analysis, the scope for this approach may be limited, depending on the spectrochemical characteristics of the emitting analyte species. Matrix matching, by addition of major sample components to the analyte calibrant solutions, can be used to minimize inaccuracies. However, there are also limitations to this procedure when the sample composition varies significantly. Multiplicative interference effects can also be assessed by a "single standard addition" of each analyte to the sample solution, and the information obtained may be used to correct the analyte concentrations determined directly. Each of these approaches has been evaluated to ascertain the best procedure for multi-element analysis of industrial soils by ICP-OES with CCD detection at multiple wavelengths. Standard reference materials and field samples have been analyzed to illustrate the efficacy of each procedure.
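The single standard addition correction has a simple closed form when the response is linear and the spike volume is small enough that sample dilution can be neglected. A minimal sketch with illustrative numbers (the intensities and spike concentration below are hypothetical, not from the study):

```python
def single_standard_addition(signal_sample, signal_spiked, added_conc):
    """Analyte concentration from one standard addition.

    Assumes a linear response and a spike volume small enough that
    dilution of the sample can be neglected:
        C_x = C_added * S_x / (S_spiked - S_x)
    """
    return added_conc * signal_sample / (signal_spiked - signal_sample)

# Illustrative ICP-OES emission intensities (arbitrary units) and the
# analyte concentration added by the spike (hypothetical values)
c = single_standard_addition(signal_sample=1200.0,
                             signal_spiked=1800.0,
                             added_conc=5.0)   # mg/L added by the spike
```

Because the spiked and unspiked measurements share the same matrix, a multiplicative interference cancels in the ratio, which is what makes this a per-sample correction when matrix matching is impractical.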
A multi-staining chip using hydrophobic valves for exfoliative cytology in cancer
NASA Astrophysics Data System (ADS)
Lee, Tae Hee; Bu, Jiyoon; Moon, Jung Eun; Kim, Young Jun; Kang, Yoon-Tae; Cho, Young-Ho; Kim, In Sik
2017-07-01
Exfoliative cytology is a highly established technique for the diagnosis of tumors. Various microfluidic devices have been developed to minimize the sample numbers by conjugating multiple antibodies in a single sample. However, the previous multi-staining devices require complex control lines and valves operated by external power sources, to deliver multiple antibodies separately for a single sample. In addition, most of these devices are composed of hydrophobic materials, causing unreliable results due to the non-specific binding of antibodies. Here, we present a multi-staining chip using hydrophobic valves, which is formed by the partial treatment of 2-hydroxyethyl methacrylate (HEMA). Our chip consists of a circular chamber, divided into six equal fan-shaped regions. Switchable injection ports are located at the center of the chamber and at the middle of the arc of each fan-shaped zone. Thus, our device is beneficial for minimizing the control lines, since pre-treatment solutions flow from the center to outer ports, while six different antibodies are introduced oppositely from the outer ports. Furthermore, hydrophobic narrow channels, connecting the central region and each of the six fan-shaped zones, are closed by capillary effect, thus preventing the fluidic mixing without external power sources. Meanwhile, HEMA treatment on the exterior region results in hydrophobic-to-hydrophilic transition and prevents the non-specific binding of antibodies. For the application, we measured the expression of six different antibodies in a single sample using our device. The expression levels of each antibody highly matched the conventional immunocytochemistry results. Our device enables cancer screening with a small number of antibodies for a single sample.
Comet nucleus sample return mission
NASA Technical Reports Server (NTRS)
1983-01-01
A comet nucleus sample return mission in terms of its relevant science objectives, candidate mission concepts, key design/technology requirements, and programmatic issues is discussed. The primary objective was to collect a sample of undisturbed comet material from beneath the surface of an active comet and to preserve its chemical and, if possible, its physical integrity and return it to Earth in a minimally altered state. The secondary objectives are to: (1) characterize the comet to a level consistent with a rendezvous mission; (2) monitor the comet dynamics through perihelion and aphelion with a long lived lander; and (3) determine the subsurface properties of the nucleus in an area local to the sampled core. A set of candidate comets is discussed. The hazards which the spacecraft would encounter in the vicinity of the comet are also discussed. The encounter strategy, the sampling hardware, the thermal control of the pristine comet material during the return to Earth, and the flight performance of various spacecraft systems and the cost estimates of such a mission are presented.
Boson Sampling with Single-Photon Fock States from a Bright Solid-State Source.
Loredo, J C; Broome, M A; Hilaire, P; Gazzano, O; Sagnes, I; Lemaitre, A; Almeida, M P; Senellart, P; White, A G
2017-03-31
A boson-sampling device is a quantum machine expected to perform tasks intractable for a classical computer, yet requiring minimal nonclassical resources as compared to full-scale quantum computers. Photonic implementations to date employed sources based on inefficient processes that only simulate heralded single-photon statistics when strongly reducing emission probabilities. Boson sampling with only single-photon input has thus never been realized. Here, we report on a boson-sampling device operated with a bright solid-state source of single-photon Fock states with high photon-number purity: the emission from an efficient and deterministic quantum dot-micropillar system is demultiplexed into three partially indistinguishable single photons, with a single-photon purity 1-g^{(2)}(0) of 0.990±0.001, interfering in a linear optics network. Our demultiplexed source is between 1 and 2 orders of magnitude more efficient than current heralded multiphoton sources based on spontaneous parametric down-conversion, allowing us to complete the boson-sampling experiment faster than previous equivalent implementations.
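The classical intractability referenced above stems from the fact that boson-sampling output probabilities are proportional to |Perm(A)|² for submatrices A of the interferometer unitary, and no efficient algorithm for the permanent is known. A minimal sketch of Ryser's inclusion-exclusion formula, the standard exact approach (exponential in the matrix size):

```python
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's inclusion-exclusion formula.

    Runs in O(2^n * n^2) time; boson-sampling detection probabilities
    scale as |Perm(A)|^2, which is why the task is classically hard.
    """
    n = len(a)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total
```

The same code works for the complex amplitudes of an interferometer submatrix; for the three-photon experiment described above the relevant permanents are of 3×3 submatrices of the linear-optics network's unitary.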
Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K
2011-12-01
Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.
The Gulliver sample return mission to Deimos
NASA Astrophysics Data System (ADS)
Britt, D. T.; Robinson, M.; Gulliver Team
The Martian moon Deimos presents a unique opportunity for a sample return mission. Deimos is spectrally analogous to type D asteroids, which are thought to be composed of highly primitive carbonaceous material that originated in the outer asteroid belt. It also is in orbit around Mars and has been accumulating material ejected from the Martian surface ever since the earliest periods of Martian history, over 4.4 Gyrs ago. There are a number of factors that make sample return from Deimos extremely attractive. It is Better: Deimos is a repository for two kinds of extremely significant and scientifically exciting ancient samples: (1) Primitive spectral D-type material that may have accreted in the outer asteroid belt and Trojan swarm. This material samples the composition of solar nebula well outside the zone of terrestrial planets and provides a direct sample of primitive material so common past 3 AU but so uncommon in the meteorite collection. (2) Ancient Mars, which could include the full range of Martian crustal and upper mantle material from the early differentiation and crustal-forming epoch as well as samples from the era of high volatile flux, thick atmosphere, and possible surface water. The Martian material on Deimos would be dominated by ejecta from the ancient crust of Mars, delivered during the Noachian Period of basin-forming impacts and heavy bombardment. It is Closer: Compared to other primitive D-type asteroids, Deimos is by far the most accessible. Because of its orbit around Mars, Deimos is far closer than any other D asteroid. It is Safer: Deimos is also by far the safest small body for sample return yet imaged. It is an order of magnitude less rocky than Eros and the NEAR-Shoemaker mission succeeded in landing on Eros with a spacecraft not designed for landing and proximity maneuvering. Because of Viking imagery we already know a great deal about the surface roughness of Deimos. 
It is known to be very smooth and have moderate topography and gravitational slopes. It is Easier: Deimos is farther from Mars and smaller than Phobos. This location minimizes the delta-V penalties from entering the Martian gravity well; minimizes the energy requirements for sampling maneuvers; and minimizes Martian tidal effects on S/C operations. After initial processing these samples will be made available as soon as possible to the international cosmochemistry community for detailed analysis. The mission management team includes Lockheed Martin Astronautics (flight system, I&T) and JPL (payload, mission ops, and mission management).
NASA Astrophysics Data System (ADS)
Hodgson, Lorna; Thompson, Andrew
2012-03-01
This paper presents the results of a non-HMDS (non-silane) adhesion promoter that was used to reduce the zeta potential for a very thin (proprietary) polymer on silicon. By reducing the zeta potential, as measured by the minimum sample required to fully coat a wafer, the amount of polymer required to coat silicon substrates was significantly reduced in the manufacture of X-ray windows used for high transmission of low-energy X-rays. This approach used an aqueous-based adhesion promoter, described as a cationic surface-active agent, that has been shown to improve the adhesion of photoresists (positive, negative, epoxy [SU8], e-beam, and dry film). As well as reducing the amount of polymer required to coat substrates, this aqueous adhesion promoter is non-hazardous and contains non-volatile solvents.
Chi-squared and C statistic minimization for low count per bin data [sampling in X-ray astronomy]
NASA Technical Reports Server (NTRS)
Nousek, John A.; Shue, David R.
1989-01-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
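As a concrete illustration of the two statistics being minimized, the sketch below computes Pearson's chi-squared and Cash's C statistic for a toy low-count spectrum. The binning and model are invented for illustration and are not taken from the simulation in the paper.

```python
import numpy as np

def chi_squared(counts, model):
    # Pearson chi-squared; its Gaussian assumption breaks down
    # when there are only a few counts per bin
    return np.sum((counts - model) ** 2 / model)

def cash_c(counts, model):
    # Cash (1979) C statistic, derived from the Poisson likelihood,
    # so it stays well behaved in the low-count regime
    return 2.0 * np.sum(model - counts * np.log(model))

# toy low-count spectrum drawn from a known flat model
rng = np.random.default_rng(0)
model = np.full(50, 2.0)        # 2 expected counts per bin
counts = rng.poisson(model)

print(chi_squared(counts, model))
print(cash_c(counts, model))
```

Either statistic can be handed to a minimizer such as Powell's method; the point of the comparison above is that only the Poisson-based C statistic remains unbiased as counts per bin shrink.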
Silicon microneedle array for minimally invasive human health monitoring
NASA Astrophysics Data System (ADS)
Smith, Rosemary L.; Collins, Scott D.; Duy, Janice; Minogue, Timothy D.
2018-02-01
A silicon microneedle array with integrated microfluidic channels is presented, which is designed to extract dermal interstitial fluid (ISF) for biochemical analysis. ISF is a cell-free biofluid that is known to contain many of the same constituents as blood plasma, but the scope and dynamics of biomarker similarities are known for only a few components, most notably glucose. Dermal ISF is accessible just below the outer skin layer (epidermis), which can be reached and extracted with minimal sensation and tissue trauma by using a microneedle array. The microneedle arrays presented here are being developed to extract dermal ISF for off-chip profiling of nucleic acid constituents in order to identify potential biomarkers of disease. In order to assess sample volume requirements, preliminary RNA profiling was performed with suction blister ISF. The microneedles are batch fabricated using established silicon technology (low cost), are small in size, and can be integrated with sensors for on-chip analysis. This approach portends a more rapid, less expensive, self-administered assessment of human health than is currently achievable with blood sampling, especially in non-clinical and austere settings. Ultimately, a wearable device for monitoring a person's health in any setting is envisioned.
PELE web server: atomistic study of biomolecular systems at your fingertips.
Madadkar-Sobhani, Armin; Guallar, Victor
2013-07-01
PELE, Protein Energy Landscape Exploration, our novel technology based on protein structure prediction algorithms and Monte Carlo sampling, is capable of modelling all-atom protein-ligand dynamical interactions in an efficient and fast manner, at a computational cost two orders of magnitude lower than traditional molecular dynamics techniques. PELE's heuristic approach generates trial moves based on protein and ligand perturbations followed by side chain sampling and global/local minimization. The collection of accepted steps forms a stochastic trajectory. Furthermore, several processors may be run in parallel towards a collective goal or to define several independent trajectories; the whole procedure has been parallelized using the Message Passing Interface. Here, we introduce the PELE web server, designed to make the whole process of running simulations easier and more practical by minimizing input file demands, providing a user-friendly interface and producing abstract outputs (e.g. interactive graphs and tables). The web server has been implemented in C++ using Wt (http://www.webtoolkit.eu) and MySQL (http://www.mysql.com). The PELE web server, accessible at http://pele.bsc.es, is free and open to all users with no login requirement.
NASA Astrophysics Data System (ADS)
Liu, Yongliang; Chen, Yud-Ren; Nou, Xiangwu; Chao, Kaunglin
2007-09-01
Rapid and routine identification of foodborne bacteria is critically important because of bio-/agro-terrorism threats, public health concerns, and economic losses. Conventional culture, PCR, and immunoassay methods for the detection of bacteria are generally time-consuming, reagent-intensive, multi-step procedures. Fast microbial detection requires minimal sample preparation, permits the routine analysis of large numbers of samples with negligible reagent costs, and is easy to operate. Therefore, we have developed silver colloidal nanoparticle-based surface-enhanced Raman scattering (SERS) spectroscopy as a potential tool for the rapid and routine detection of E. coli and L. monocytogenes. This study presents further results from our examination of S. Typhimurium, one of the bacteria most commonly implicated in outbreaks, to identify its characteristic bands and enable subsequent identification.
Woo, A H; Lindsay, R C
1980-07-01
A rapid quantitative method was developed for routine analysis of the major, even-carbon-numbered free fatty acids in butter and cream. Free fatty acids were isolated directly from intact samples on a modified silicic acid-potassium hydroxide arrestant column and were separated by gas chromatography on a 1.8 m x 2 mm inner diameter glass column packed with 10% neopentyl glycol adipate on 80/100 Chromosorb W. Purified, formic acid-saturated carrier gas was required for minimal peak tailing and extended column life. The accuracy and reproducibility of the method were established through quantitative recovery studies of free fatty acid mixtures, free fatty acids added to butter, and replicate analyses of butter and cream samples.
Nano-plasmonic exosome diagnostics
Im, Hyungsoon; Shao, Huilin; Weissleder, Ralph; Castro, Cesar M.; Lee, Hakho
2015-01-01
Exosomes have emerged as a promising biomarker. These vesicles abound in biofluids and harbor molecular constituents from their parent cells, thereby offering a minimally-invasive avenue for molecular analyses. Despite such clinical potential, routine exosomal analysis, particularly the protein assay, remains challenging, due to requirements for large sample volumes and extensive processing. We have been developing miniaturized systems to facilitate clinical exosome studies. These systems can be categorized into two components: microfluidics for sample preparation and analytical tools for protein analyses. In this report, we review a new assay platform, nano-plasmonic exosome (nPLEX), in which sensing is based on surface plasmon resonance to achieve label-free exosome detection. Looking forward, we also discuss some potential challenges and improvements in exosome studies. PMID:25936957
Cross-Sectional HIV Incidence Estimation in HIV Prevention Research
Brookmeyer, Ron; Laeyendecker, Oliver; Donnell, Deborah; Eshleman, Susan H.
2013-01-01
Accurate methods for estimating HIV incidence from cross-sectional samples would have great utility in prevention research. This report describes recent improvements in cross-sectional methods that significantly improve their accuracy. These improvements are based on the use of multiple biomarkers to identify recent HIV infections. These multi-assay algorithms (MAAs) use assays in a hierarchical approach for testing that minimizes the effort and cost of incidence estimation. These MAAs do not require mathematical adjustments for accurate estimation of the incidence rates in study populations in the year prior to sample collection. MAAs provide a practical, accurate, and cost-effective approach for cross-sectional HIV incidence estimation that can be used for HIV prevention research and global epidemic monitoring. PMID:23764641
DOE Office of Scientific and Technical Information (OSTI.GOV)
DOUGLAS, J.G.
2006-07-06
This document presents the technical justification for choosing and using propane as a calibration standard for estimating total flammable volatile organic compounds (VOCs) in an air matrix. A propane-in-nitrogen standard was selected based on a number of criteria: (1) it has an analytical response similar to the VOCs of interest, (2) it can be made with known accuracy and traceability, (3) it is available with good purity, (4) it has a matrix similar to the sample matrix, (5) it is stable during storage and use, (6) it is relatively non-hazardous, and (7) it is a recognized standard for similar analytical applications. The Waste Retrieval Project (WRP) desires a fast, reliable, and inexpensive method for screening the flammable VOC content in the vapor-phase headspace of waste containers. Table 1 lists the flammable VOCs of interest to the WRP. The current method used to determine the VOC content of a container is to sample the container's headspace and submit the sample for gas chromatography-mass spectrometry (GC-MS) analysis. The driver for the VOC measurement requirement is safety: potentially flammable atmospheres in the waste containers must be allowed to diffuse prior to processing the container. The proposed flammable VOC screening method is to inject an aliquot of the headspace sample into an argon-doped pulsed-discharge helium ionization detector (Ar-PDHID) contained within a gas chromatograph. No actual chromatography is performed; the sample is transferred directly from a sample loop to the detector through a short, inert transfer line. The peak area resulting from the injected sample is proportional to the flammable VOC content of the sample. However, because the Ar-PDHID has different response factors for different flammable VOCs, a fundamental assumption must be made that the agent used to calibrate the detector is representative of the flammable VOCs of interest that may be in the headspace samples.
At worst, we desire that calibration with the selected calibrating agent overestimate the VOC content of a sample; by overestimating, we minimize false negatives. A false negative is defined as incorrectly estimating the VOC content of the sample to be below programmatic action limits when, in fact, the sample exceeds the action limits. The disadvantage of overestimating the flammable VOC content of a sample is that additional cost may be incurred, because additional sampling and GC-MS analysis may be required to confirm results over programmatic action limits. Therefore, choosing an appropriate calibration standard for the Ar-PDHID is critical to avoid false negatives and to minimize additional analytical costs.
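The screening arithmetic described above reduces to dividing the detector peak area by the calibrant's response factor and comparing the result against an action limit. A minimal sketch follows; the response factor and action limit are invented placeholders, not measured Ar-PDHID values or actual WRP limits.

```python
# Hypothetical sketch of the flammable-VOC screening arithmetic.
PROPANE_RF = 1.0          # detector area units per ppm (assumed)
ACTION_LIMIT_PPM = 500.0  # hypothetical programmatic action limit

def estimate_voc_ppm(peak_area, calib_rf=PROPANE_RF):
    # Peak area is proportional to flammable VOC content; dividing by
    # the calibrant's response factor yields a propane-equivalent
    # concentration.
    return peak_area / calib_rf

def screen(peak_area):
    # Conservative screen: it overestimates (avoiding false negatives)
    # whenever the analyte's response factor is >= the calibrant's.
    return estimate_voc_ppm(peak_area) > ACTION_LIMIT_PPM

print(screen(600.0))  # True: flag for confirmatory GC-MS analysis
print(screen(100.0))  # False: below the hypothetical action limit
```

The design choice matches the document's rationale: a flagged result triggers confirmatory GC-MS analysis, so errors on the high side cost money but never safety.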
Tankiewicz, Maciej; Fenik, Jolanta; Biziuk, Marek
2011-10-30
The intensification of agriculture means that increasing amounts of toxic organic and inorganic compounds are entering the environment. The pesticides generally applied nowadays are regarded as some of the most dangerous contaminants of the environment. Their presence in the environment, especially in water, is hazardous because they make human beings more susceptible to disease. For these reasons, it is essential to monitor pesticide residues in the environment with the aid of all accessible analytical methods. The analysis of samples for the presence of pesticides is problematic because of the laborious and time-consuming operations involved in preparing samples for analysis, which may themselves be a source of additional contamination and error. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solventless and solvent-minimized techniques are coming into use. This paper discusses the sample preparation techniques most commonly used over the last 15 years for monitoring organophosphorus and organonitrogen pesticide residues in water samples. Furthermore, a significant trend in sample preparation, in accordance with the principles of 'Green Chemistry', is the simplification, miniaturization and automation of analytical techniques. In view of this, several novel techniques are being developed in order to shorten the analysis, increase the sample throughput and improve the quality and sensitivity of analytical methods.
The paper describes extraction techniques requiring the use of solvents - liquid-liquid extraction (LLE) and its modifications, membrane extraction techniques, hollow fibre-protected two-phase solvent microextraction, liquid phase microextraction based on the solidification of a floating organic drop (LPME-SFO), solid-phase extraction (SPE) and single-drop microextraction (SDME) - as well as solvent-free techniques - solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The advantages and drawbacks of these techniques are also discussed, and some solutions to their limitations are proposed. Copyright © 2011 Elsevier B.V. All rights reserved.
Real-time implementation of second generation of audio multilevel information coding
NASA Astrophysics Data System (ADS)
Ali, Murtaza; Tewfik, Ahmed H.; Viswanathan, V.
1994-03-01
This paper describes a real-time implementation of a novel wavelet-based audio compression method. This method is based on the discrete wavelet transform (DWT) representation of signals. A bit allocation procedure is used to allocate bits to the transform coefficients in an adaptive fashion. The bit allocation procedure has been designed to take advantage of the masking effect in human hearing. The procedure minimizes the number of bits required to represent each frame of audio signals at a fixed distortion level. The real-time implementation provides almost transparent compression of monophonic CD-quality audio signals (sampled at 44.1 kHz and quantized using 16 bits/sample) at bit rates of 64-78 kbits/s. Our implementation uses two ASPI Elf boards, each of which is built around a TI TMS320C31 DSP chip. The time required for encoding a mono CD signal is about 92 percent of real time, and that for decoding about 61 percent.
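A greatly simplified sketch of masking-based bit allocation: each subband receives just enough bits that its quantization noise falls below the masking threshold, using the usual ~6.02 dB-per-bit rule of thumb. The function and numbers are illustrative, not the authors' actual procedure.

```python
import math

def allocate_bits(signal_power_db, mask_threshold_db, max_bits=16):
    """Assign each subband just enough bits that its quantization
    noise drops below the masking threshold (~6.02 dB SNR per bit)."""
    bits = []
    for sig, mask in zip(signal_power_db, mask_threshold_db):
        needed_snr = max(0.0, sig - mask)   # noise must sit below the mask
        bits.append(min(max_bits, math.ceil(needed_snr / 6.02)))
    return bits

# bands far above their mask need many bits; fully masked bands need none
print(allocate_bits([60, 40, 20], [20, 35, 25]))  # -> [7, 1, 0]
```

Bands whose energy falls below the masking threshold are dropped entirely, which is what keeps the frame bit count minimal at a fixed perceived distortion.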
Martin, Brigitte E.; Jia, Kun; Sun, Hailiang; Ye, Jianqiang; Hall, Crystal; Ware, Daphne; Wan, Xiu-Feng
2016-01-01
Identification of antigenic variants is the key to a successful influenza vaccination program. The empirical serological methods used to determine influenza antigenic properties require viral propagation. Here, a novel quantitative PCR-based antigenic characterization method using polyclonal antibodies and proximity ligation assays, or so-called polyPLA, was developed and validated. This method can detect viral titers of less than 1000 TCID50/mL. Not only can this method differentiate between different HA subtypes of influenza viruses, but it can also effectively identify antigenic drift events within the same HA subtype. Application to H3N2 seasonal influenza data showed that the results from this novel method are consistent with those from conventional serological assays. This method is not limited to the detection of antigenic variants in influenza but can also be applied to other pathogens. It has the potential to be applied through a large-scale platform in disease surveillance, requiring minimal biosafety precautions and directly using clinical samples. PMID:25546251
NASA Technical Reports Server (NTRS)
1977-01-01
The 20x9 TDI array was developed to meet the LANDSAT Thematic Mapper requirements. This array is based upon a self-aligned, transparent-gate, buried-channel process. The process features: (1) buried-channel, four-phase, overlapping-gate CCDs for high transfer efficiency without fat zero; (2) self-aligned transistors to minimize clock feedthrough and parasitic capacitance; and (3) a transparent tin oxide electrode for high quantum efficiency with front-surface irradiation. The requirements placed on the array and the performance achieved are summarized. These data are the result of flat-field measurements only; no imaging or dynamic target measurements were made during this program. Measurements were performed with two different test stands. The bench test equipment fabricated for this program operated at the 8 microsecond line time and employed simple sampling of the gated MOSFET output video signal. The second stand employed Correlated Double Sampling (CDS) and operated at a 79.2 microsecond line time.
Giangarra, Jenna E; Barry, Sabrina L; Dahlgren, Linda A; Lanz, Otto I; Benitez, Marian E; Werre, Stephen R
2018-04-25
To determine whether synovial fluid prostaglandin E2 (PGE2) increases in response to a single intra-articular dose of bupivacaine in the normal canine stifle. There were no significant differences in synovial fluid PGE2 concentrations between treatment groups or over time within the bupivacaine or saline groups. Samples requiring ≥3 arthrocentesis attempts had significantly higher PGE2 concentrations than samples requiring 1 or 2 attempts. Following correction for the number of arthrocentesis attempts, PGE2 concentrations were significantly higher than baseline at 24 and 48 h in the bupivacaine group; however, there were no significant differences between the bupivacaine and saline groups. In normal dogs, a single bupivacaine injection did not cause significant synovial inflammation, as measured by PGE2 concentrations, compared to saline controls. Future research should minimize aspiration attempts and include evaluation of the synovial response to bupivacaine in clinical cases with joint disease.
Determining minimal display element requirements for surface map displays
DOT National Transportation Integrated Search
2003-04-14
There is a great deal of interest in developing electronic surface map displays to enhance safety and reduce incidents and incursions on or near the airport surface. There is a lack of research, however, detailing the minimal display elements require...
Callaham, Michael; John, Leslie K
2018-01-05
We define a minimally important difference for the Likert-type scores frequently used in scientific peer review (similar to existing minimally important differences for scores in clinical medicine). The magnitude of score change required to change editorial decisions has not been studied, to our knowledge. Experienced editors at a journal in the top 6% by impact factor were asked how large a change of rating in "overall desirability for publication" was required to trigger a change in their initial decision on an article. Minimally important differences were assessed twice for each editor: once assessing the rating change required to shift the editor away from an initial decision to accept, and the other assessing the magnitude required to shift away from an initial rejection decision. Forty-one editors completed the survey (89% response rate). In the acceptance frame, the median minimally important difference was 0.4 points on a scale of 1 to 5. Editors required a greater rating change to shift from an initial rejection decision; in the rejection frame, the median minimally important difference was 1.2 points. Within each frame, there was considerable heterogeneity: in the acceptance frame, 38% of editors did not change their decision within the maximum available range; in the rejection frame, 51% did not. To our knowledge, this is the first study to determine the minimally important difference for Likert-type ratings of research article quality, or in fact any nonclinical scientific assessment variable. Our findings may be useful for future research assessing whether changes to the peer review process produce clinically meaningful differences in editorial decisionmaking. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
redMaGiC: Selecting luminous red galaxies from the DES Science Verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozo, E.; Rykoff, E. S.; Abate, A.
Here, we introduce redMaGiC, an automated algorithm for selecting luminous red galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the colour cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalogue sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10^-3 (h^-1 Mpc)^-3, and a median photo-z bias (zspec - zphoto) and scatter (σz/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4 per cent. We also test our algorithm with Sloan Digital Sky Survey Data Release 8 and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1 per cent level.
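The quoted quality metrics (median bias, normalized scatter, 5σ outlier fraction) can be computed along the following lines; the exact estimators used in the paper may differ, so treat this as an illustrative sketch.

```python
import numpy as np

def photoz_metrics(z_spec, z_photo):
    """Median bias, robust normalized scatter, and 5-sigma outlier
    fraction for a photo-z sample (estimator choices are illustrative)."""
    dz = (z_photo - z_spec) / (1.0 + z_spec)
    bias = np.median(z_spec - z_photo)
    # 1.4826 * MAD approximates the Gaussian sigma robustly
    scatter = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    outlier_frac = np.mean(np.abs(dz) > 5.0 * scatter)
    return bias, scatter, outlier_frac

# synthetic sample spanning the catalogue's redshift range
rng = np.random.default_rng(1)
z_spec = rng.uniform(0.2, 0.8, 5000)
z_photo = z_spec + 0.017 * (1.0 + z_spec) * rng.standard_normal(5000)
print(photoz_metrics(z_spec, z_photo))
```

A robust (median-based) scatter estimate is used here because photo-z error distributions typically have non-Gaussian tails, which is also why the outlier fraction is reported separately.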
redMaGiC: Selecting luminous red galaxies from the DES Science Verification data
Rozo, E.; Rykoff, E. S.; Abate, A.; ...
2016-05-30
Here, we introduce redMaGiC, an automated algorithm for selecting luminous red galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the colour cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine learning-based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalogue sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10^-3 (h^-1 Mpc)^-3, and a median photo-z bias (zspec - zphoto) and scatter (σz/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4 per cent. We also test our algorithm with Sloan Digital Sky Survey Data Release 8 and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1 per cent level.
Recent Updates in the Endoscopic Diagnosis of Barrett's Oesophagus.
Sharma, Neel; Ho, Khek Yu
2016-10-01
Barrett's oesophagus (BO) is a premalignant condition associated with the development of oesophageal adenocarcinoma (OAC). Despite the low risk of progression per annum, OAC is associated with significant morbidity and mortality, with an estimated 5-year survival of 10%. Furthermore, the incidence of OAC continues to rise globally. Therefore, it is imperative to detect the premalignant phase of BO and follow up such patients accordingly. The mainstay of BO diagnosis is endoscopy and biopsy sampling. However, limitations with white light endoscopy (WLE) and undertaking biopsies have shifted the current focus towards real-time image analysis. Additional tools such as chromoendoscopy, narrow-band imaging (NBI), confocal laser endomicroscopy (CLE), and optical coherence tomography (OCT) are proving beneficial. Furthermore, it is also becoming apparent that these tools are often utilized by experts in the field; therefore, for the non-expert, training in these systems is key. As yet, the methodologies used for training optimization require further inquiry. (1) Real-time imaging can serve to minimize excess biopsies. (2) Tools such as chromoendoscopy, NBI, CLE, and OCT can help to complement WLE. WLE is associated with limited sensitivity. Biopsy sampling is cost-ineffective and associated with sampling error. Hence, from a practical perspective, endoscopists should aim to utilize additional tools to help in real-time image interpretation and minimize an overreliance on histology.
Recent Updates in the Endoscopic Diagnosis of Barrett's Oesophagus
Sharma, Neel; Ho, Khek Yu
2016-01-01
Background Barrett's oesophagus (BO) is a premalignant condition associated with the development of oesophageal adenocarcinoma (OAC). Despite the low risk of progression per annum, OAC is associated with significant morbidity and mortality, with an estimated 5-year survival of 10%. Furthermore, the incidence of OAC continues to rise globally. Therefore, it is imperative to detect the premalignant phase of BO and follow up such patients accordingly. Summary The mainstay of BO diagnosis is endoscopy and biopsy sampling. However, limitations with white light endoscopy (WLE) and undertaking biopsies have shifted the current focus towards real-time image analysis. Additional tools such as chromoendoscopy, narrow-band imaging (NBI), confocal laser endomicroscopy (CLE), and optical coherence tomography (OCT) are proving beneficial. Furthermore, it is also becoming apparent that these tools are often utilized by experts in the field; therefore, for the non-expert, training in these systems is key. As yet, the methodologies used for training optimization require further inquiry. Key Message (1) Real-time imaging can serve to minimize excess biopsies. (2) Tools such as chromoendoscopy, NBI, CLE, and OCT can help to complement WLE. Practical Implications WLE is associated with limited sensitivity. Biopsy sampling is cost-ineffective and associated with sampling error. Hence, from a practical perspective, endoscopists should aim to utilize additional tools to help in real-time image interpretation and minimize an overreliance on histology. PMID:27904863
Nonparametric Methods in Astronomy: Think, Regress, Observe—Pick Any Three
NASA Astrophysics Data System (ADS)
Steinhardt, Charles L.; Jermyn, Adam S.
2018-02-01
Telescopes are much more expensive than astronomers, so it is essential to minimize required sample sizes by using the most data-efficient statistical methods possible. However, the most commonly used model-independent techniques for finding the relationship between two variables in astronomy are flawed. In the worst case they can lead without warning to subtly yet catastrophically wrong results, and even in the best case they require more data than necessary. Unfortunately, there is no single best technique for nonparametric regression. Instead, we provide a guide for how astronomers can choose the best method for their specific problem and provide a python library with both wrappers for the most useful existing algorithms and implementations of two new algorithms developed here.
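As one example of the kind of nonparametric regression such a guide might cover (not necessarily one implemented in the authors' library), a Nadaraya-Watson kernel smoother estimates the relationship between two variables as a locally weighted average:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.5):
    """Gaussian-kernel Nadaraya-Watson regression: each prediction is
    a locally weighted average of the training responses."""
    d = (x_query[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)           # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)

# noiseless toy relation: the smoother should recover sin(x) closely
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x)
xq = np.array([np.pi / 2])
print(nadaraya_watson(x, y, xq, bandwidth=0.3))  # close to sin(pi/2) = 1
```

The bandwidth choice illustrates the section's warning: too large a bandwidth biases the fit, too small a bandwidth wastes data, and no single setting is best for every problem.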
Development of Experiment Kits for Processing Biological Samples In-Flight on SLS-2
NASA Technical Reports Server (NTRS)
Jaquez, R.; Savage, P. D.; Hinds, W. E.; Evans, J.; Dubrovin, L.
1994-01-01
The design of the hematology experiment kits for SLS-2 has resulted in a modular, flexible configuration which maximizes crew efficiency and minimizes error and confusion when dealing with over 1200 different components over the course of the mission. The kit layouts proved to be very easy to use and their packaging design provided for positive, secure containment of the many small components. The secondary Zero(Tm) box enclosure also provided an effective means for transport of the kits within the Spacelab and for grouping individual kits by flight day usage. The kits are readily adaptable to use on future flights by simply replacing the inner components as required and changing the labelling scheme to match new mission requirements.
Optimal power distribution for minimizing pupil walk in a 7.5X afocal zoom lens
NASA Astrophysics Data System (ADS)
Song, Wanyue; Zhao, Yang; Berman, Rebecca; Bodell, S. Yvonne; Fennig, Eryn; Ni, Yunhui; Papa, Jonathan C.; Yang, Tianyi; Yee, Anthony J.; Moore, Duncan T.; Bentley, Julie L.
2017-11-01
An extensive design study was conducted to find the optimal power distribution and stop location for a 7.5x afocal zoom lens that controls pupil walk and pupil location through zoom. This afocal zoom lens is one of three components in a VIS-SWIR high-resolution microscope for inspection of photonic chips. The microscope consists of an afocal zoom, a nine-element objective, and a tube lens, and has diffraction-limited performance with zero vignetting. In this case, the required change in object (sample) size and resolution is achieved by the magnification change of the afocal component. This creates strict requirements for both the entrance and exit pupil locations of the afocal zoom to couple the two sides successfully. The first phase of the design study looked at conventional four-group zoom lenses with positive groups in the front and back and the stop at a fixed location outside the lens, but these resulted in significant pupil walk. The second phase of the design study focused on several promising unconventional four-group power distribution designs with moving stops that minimized pupil walk and had an acceptable pupil location (as determined by the objective and tube lens).
Fraisier, V; Clouvel, G; Jasaitis, A; Dimitrov, A; Piolot, T; Salamero, J
2015-09-01
Multiconfocal microscopy gives a good compromise between fast imaging and reasonable resolution. However, the low intensity of live fluorescent emitters is a major limitation to this technique. Aberrations induced by the optical setup, especially the mismatch of the refractive index and the biological sample itself, distort the point spread function and further reduce the amount of detected photons. Altogether, this leads to impaired image quality, preventing accurate analysis of molecular processes in biological samples and imaging deep in the sample. The amount of detected fluorescence can be improved with adaptive optics. Here, we used a compact adaptive optics module (adaptive optics box for sectioning optical microscopy), which was specifically designed for spinning disk confocal microscopy. The module overcomes undesired anomalies by correcting for most of the aberrations in confocal imaging. Existing aberration detection methods require prior illumination, which bleaches the sample. To avoid multiple exposures of the sample, we established an experimental model describing the depth dependence of major aberrations. This model allows us to correct for those aberrations when performing a z-stack, gradually increasing the amplitude of the correction with depth. It does not require illumination of the sample for aberration detection, thus minimizing photobleaching and phototoxicity. With this model, we improved both signal-to-background ratio and image contrast. Here, we present comparative studies on a variety of biological samples. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
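The depth-dependent correction strategy can be caricatured as applying a precomputed, depth-indexed correction amplitude at each plane of the z-stack, so no extra illumination is needed for wavefront sensing. The linear model and its coefficients below are hypothetical placeholders, not the authors' experimental calibration.

```python
# Minimal sketch of a depth-indexed aberration-correction schedule.
# Slope and offset are hypothetical calibration constants.

def correction_amplitude(depth_um, slope=0.02, offset=0.1):
    """Amplitude of the aberration correction applied by the adaptive
    optics element, grown gradually with imaging depth (e.g. for the
    spherical aberration induced by refractive-index mismatch)."""
    return offset + slope * depth_um

# during a z-stack, look up the correction for each plane instead of
# re-measuring the wavefront (which would bleach the sample)
z_planes_um = range(0, 50, 10)
corrections = [correction_amplitude(z) for z in z_planes_um]
print(corrections)
```

The design point mirrors the paper's argument: trading a one-time experimental model for per-plane wavefront sensing minimizes photobleaching and phototoxicity.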
5 CFR 582.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2012 CFR
2012-01-01
... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...
Chen, Ming-Kai; Menard, David H; Cheng, David W
2016-03-01
In pursuit of as-low-as-reasonably-achievable (ALARA) doses, this study investigated the minimal required radioactivity and corresponding imaging time for reliable semiquantification in PET/CT imaging. Using a phantom containing spheres of various diameters (3.4, 2.1, 1.5, 1.2, and 1.0 cm) filled with a fixed (18)F-FDG concentration of 165 kBq/mL and a background concentration of 23.3 kBq/mL, we performed PET/CT at multiple time points over 20 h of radioactive decay. The images were acquired for 10 min at a single bed position for each of 10 half-lives of decay using 3-dimensional list mode and were reconstructed into 1-, 2-, 3-, 4-, 5-, and 10-min acquisitions per bed position using an ordered-subsets expectation maximization algorithm with 24 subsets and 2 iterations and a Gaussian 2-mm filter. SUVmax and SUVavg were measured for each sphere. The minimal required activity (±10%) for precise SUVmax semiquantification in the spheres was 1.8 kBq/mL for an acquisition of 10 min, 3.7 kBq/mL for 3-5 min, 7.9 kBq/mL for 2 min, and 17.4 kBq/mL for 1 min. The minimal required activity concentration-acquisition time product per bed position was 10-15 kBq/mL⋅min for reproducible SUV measurements within the spheres without overestimation. Using the total radioactivity and counting rate from the entire phantom, we found that the minimal required total activity-time product was 17 MBq⋅min and the minimal required counting rate-time product was 100 kcps⋅min. Our phantom study determined a threshold for minimal radioactivity and acquisition time for precise semiquantification in (18)F-FDG PET imaging that can serve as a guide in pursuit of achieving ALARA doses. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
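The reported thresholds lend themselves to a quick feasibility check. The sketch below (Python; the function and constant names are our own illustrative choices, not from the study) decay-corrects an 18F activity concentration and derives the shortest acquisition time that reaches a given activity-time product such as the ~10 kBq/mL⋅min quoted above.

```python
# Illustrative sketch: ALARA-style acquisition planning from the abstract's
# activity-time product threshold. Names and defaults are assumptions.
F18_HALF_LIFE_MIN = 109.77  # physical half-life of 18F, minutes

def activity_at(initial_kbq_per_ml: float, elapsed_min: float) -> float:
    """Decay-correct an 18F activity concentration after elapsed_min minutes."""
    return initial_kbq_per_ml * 2.0 ** (-elapsed_min / F18_HALF_LIFE_MIN)

def min_acquisition_minutes(activity_kbq_per_ml: float,
                            threshold: float = 10.0) -> float:
    """Shortest acquisition (min) whose activity-time product meets the
    threshold (default 10 kBq/mL*min, the low end quoted in the study)."""
    return threshold / activity_kbq_per_ml
```

For example, a sphere that has decayed to 5 kBq/mL would need roughly a 2-min acquisition to reach the 10 kBq/mL⋅min product.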
Gusmaroli, Lucia; Insa, Sara; Petrovic, Mira
2018-04-24
During the last decades, the quality of aquatic ecosystems has been threatened by increasing levels of pollution caused by the discharge of man-made chemicals, both via accidental release of pollutants and via the constant outflow of inadequately treated wastewater effluents. For this reason, the European Union is updating its legislation with the aim of limiting the release of emerging contaminants. The Commission Implementing Decision (EU) 2015/495, published in March 2015, drafts a "Watch list" of compounds to be monitored Europe-wide. In this study, a methodology based on online solid-phase extraction (SPE) ultra-high-performance liquid chromatography coupled to a triple-quadrupole mass spectrometer (UHPLC-MS/MS) was developed for the simultaneous determination of the 17 compounds listed therein. The proposed method offers advantages over already available methods, such as versatility (all 17 compounds can be analyzed simultaneously), shorter analysis time, robustness, and sensitivity. The employment of online sample preparation minimized sample manipulation and dramatically reduced the sample volume and time required, thus making the analysis fast and reliable. The method was successfully validated in surface water and influent and effluent wastewater. Limits of detection ranged from sub- to low-nanogram per liter levels, in compliance with the EU limits, with the only exception of EE2. Graphical abstract: Schematic of the workflow for the analysis of the Watch list compounds.
Krõlov, Katrin; Frolova, Jekaterina; Tudoran, Oana; Suhorutsenko, Julia; Lehto, Taavi; Sibul, Hiljar; Mäger, Imre; Laanpere, Made; Tulp, Indrek; Langel, Ülo
2014-01-01
Chlamydia trachomatis is the most common sexually transmitted human pathogen. Infection results in minimal to no symptoms in approximately two-thirds of women and therefore often goes undiagnosed. C. trachomatis infections are a major public health concern because of the potential severe long-term consequences, including an increased risk of ectopic pregnancy, chronic pelvic pain, and infertility. To date, several point-of-care tests have been developed for C. trachomatis diagnostics. Although many of them are fast and specific, they lack the required sensitivity for large-scale application. We describe a rapid and sensitive form of detection directly from urine samples. The assay uses recombinase polymerase amplification and has a minimum detection limit of 5 to 12 pathogens per test. Furthermore, it enables detection within 20 minutes directly from urine samples without DNA purification before the amplification reaction. Initial analysis of the assay from clinical patient samples had a specificity of 100% (95% CI, 92%-100%) and a sensitivity of 83% (95% CI, 51%-97%). The whole procedure is fairly simple and does not require specific machinery, making it potentially applicable in point-of-care settings. Copyright © 2014 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
Barcoding of live human PBMC for multiplexed mass cytometry*
Mei, Henrik E.; Leipold, Michael D.; Schulz, Axel Ronald; Chester, Cariad; Maecker, Holden T.
2014-01-01
Mass cytometry is developing as a means of multiparametric single cell analysis. Here, we present an approach to barcoding separate live human PBMC samples for combined preparation and acquisition on a CyTOF® instrument. Using six different anti-CD45 antibody (Ab) conjugates labeled with Pd104, Pd106, Pd108, Pd110, In113, and In115, respectively, we barcoded up to 20 samples with unique combinations of exactly three different CD45 Ab tags. Cell events carrying more than or less than three different tags were excluded from analyses during Boolean data deconvolution, allowing for precise sample assignment and the electronic removal of cell aggregates. Data from barcoded samples matched data from corresponding individually stained and acquired samples, at cell event recoveries similar to individual sample analyses. The approach greatly reduces technical noise, minimizes unwanted cell doublet events in mass cytometry data, and reduces wet work and antibody consumption. It also eliminates sample-to-sample carryover and the requirement of instrument cleaning between samples, thereby effectively reducing overall instrument runtime. Hence, CD45-barcoding facilitates accuracy of mass cytometric immunophenotyping studies, thus supporting biomarker discovery efforts, and should be applicable to fluorescence flow cytometry as well. PMID:25609839
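The barcoding scheme is purely combinatorial: choosing exactly three of six tags yields C(6,3) = 20 unique codes, and any event carrying more or fewer than three tags is rejected during Boolean deconvolution. A minimal sketch of that logic (Python; the function names are illustrative, not from any CyTOF software):

```python
from itertools import combinations

TAGS = ["Pd104", "Pd106", "Pd108", "Pd110", "In113", "In115"]

# All 20 unique barcodes made of exactly three of the six CD45 tags.
BARCODES = {frozenset(c): i for i, c in enumerate(combinations(TAGS, 3))}

def deconvolve(event_tags):
    """Assign an event to a sample index, or None if it does not carry
    exactly three known tags (e.g. doublets carrying more than three)."""
    tags = frozenset(event_tags)
    if len(tags) != 3:
        return None
    return BARCODES.get(tags)
```

Rejecting events with more than three tags is what electronically removes cell aggregates: a doublet of two differently barcoded cells carries up to six tags and matches no code.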
MicroRaman measurements for nuclear fuel reprocessing applications
Casella, Amanda; Lines, Amanda; Nelson, Gilbert; ...
2016-12-01
Treatment and reuse of used nuclear fuel is a key component in closing the nuclear fuel cycle. Solvent extraction reprocessing methods that have been developed contain various steps tailored to the separation of specific radionuclides, which are highly dependent upon solution properties. The instrumentation used to monitor these processes must be robust, require little or no maintenance, and be able to withstand harsh environments such as high radiation fields and aggressive chemical matrices. Our group has been investigating the use of optical spectroscopy for the on-line monitoring of actinides, lanthanides, and acid strength within fuel reprocessing streams. This paper will focus on the development and application of a new MicroRaman probe for on-line, real-time monitoring of U(VI), nitrate ion, and nitric acid in solutions relevant to used nuclear fuel reprocessing. Previous research has successfully demonstrated the applicability on the macroscopic scale, using sample probes requiring larger solution volumes. In an effort to minimize waste and reduce dose to personnel, we have modified this technique to allow measurement at the microfluidic scale using a Raman microprobe. With the current sampling approach, Raman measurements typically require sample volumes upwards of 10 mL. Using the new sampling system, we can sample volumes of 10 μL or less, a reduction of over 1,000 fold in sample size. Finally, this paper will summarize our current work in this area, including comparisons between the macroscopic and microscopic probes for detection limits, optimized channel focusing, and application in a flow cell with varying levels of HNO3 and UO2(NO3)2.
Calcium kinetics with microgram stable isotope doses and saliva sampling
NASA Technical Reports Server (NTRS)
Smith, S. M.; Wastney, M. E.; Nyquist, L. E.; Shih, C. Y.; Wiesmann, H.; Nillen, J. L.; Lane, H. W.
1996-01-01
Studies of calcium kinetics require administration of tracer doses of calcium and subsequent repeated sampling of biological fluids. This study was designed to develop techniques that would allow estimation of calcium kinetics by using small (microgram) doses of isotopes instead of the more common large (milligram) doses, to minimize tracer perturbation of the system and reduce cost, and to explore the use of saliva sampling as an alternative to blood sampling. Subjects received an oral dose (133 micrograms) of 43Ca and an i.v. dose (7.7 micrograms) of 46Ca. Isotopic enrichment in blood, urine, saliva and feces was well above thermal ionization mass spectrometry measurement precision up to 170 h after dosing. Fractional calcium absorptions determined from isotopic ratios in blood, urine and saliva were similar. Compartmental modeling revealed that kinetic parameters determined from serum or saliva data were similar, decreasing the necessity for blood samples. It is concluded from these results that calcium kinetics can be assessed with microgram doses of stable isotopes, thereby reducing tracer costs, and with saliva samples, thereby reducing the amount of blood needed.
Guo, Jiin-Huarng; Luh, Wei-Ming
2009-05-01
When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
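The cost/power trade-off described above can be illustrated with a simplified normal-theory analogue (not Yuen's trimmed-mean formulas, which account for trimming and non-normality): for a fixed budget, search over allocations of the two group sizes and keep the one that maximizes approximate power. All names and defaults below are our own.

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(delta, sd1, sd2, n1, n2, alpha=0.05):
    """Approximate power of a two-sided two-sample z-test with unequal
    variances (normal approximation; a stand-in for Yuen's test)."""
    se = sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    return 1 - NormalDist().cdf(z_crit - delta / se)

def best_allocation(total_cost, c1, c2, delta, sd1, sd2):
    """Grid-search the (n1, n2) allocation that maximizes power for a fixed
    budget, where sampling one subject costs c1 in group 1 and c2 in group 2."""
    best = (0.0, 0, 0)  # (power, n1, n2)
    for n1 in range(2, int(total_cost // c1)):
        n2 = int((total_cost - c1 * n1) // c2)
        if n2 < 2:
            continue
        p = power_two_sample(delta, sd1, sd2, n1, n2)
        if p > best[0]:
            best = (p, n1, n2)
    return best
```

With equal costs and equal standard deviations, the search recovers the familiar balanced design; unequal costs or variances shift the optimum toward the cheaper or noisier group.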
Tudela, Rebeca; Ribas-Agustí, Albert; Buxaderas, Susana; Riu-Aumatell, Montserrat; Castellari, Massimo; López-Tamames, Elvira
2016-06-15
An ultrahigh-performance liquid chromatography (UHPLC)-tandem mass spectrometry (MS/MS) method was developed for the simultaneous determination of nine target indoles in sparkling wines. The proposed method requires minimal sample pretreatment, and its performance parameters (accuracy, repeatability, LOD, and matrix effect) indicate that it is suitable for routine analysis. Four indoles were found at detectable levels in commercial Cava samples: 5-methoxytryptophol (5MTL), tryptophan (TRP), tryptophan ethyl ester (TEE), and N-acetylserotonin (NSER). Two of them, NSER and 5MTL, are reported here for the first time in sparkling wines, with values of 0.3-2 and 0.29-29.2 μg/L, respectively. In the same samples, the contents of melatonin (MEL), serotonin (SER), 5-hydroxytryptophan (5-OHTRP), 5-hydroxyindole-3-acetic acid (5OHIA), and 5-methoxy-3-indoleacetic acid (5MIA) were all below the corresponding limits of detection.
Importance sampling variance reduction for the Fokker–Planck rarefied gas particle method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collyer, B.S., E-mail: benjamin.collyer@gmail.com; London Mathematical Laboratory, 14 Buckingham Street, London WC2N 6DF; Connaughton, C.
The Fokker–Planck approximation to the Boltzmann equation, solved numerically by stochastic particle schemes, is used to provide estimates for rarefied gas flows. This paper presents a variance reduction technique for a stochastic particle method that is able to greatly reduce the uncertainty of the estimated flow fields when the characteristic speed of the flow is small in comparison to the thermal velocity of the gas. The method relies on importance sampling, requiring minimal changes to the basic stochastic particle scheme. We test the importance sampling scheme on a homogeneous relaxation, planar Couette flow and a lid-driven-cavity flow, and find that our method is able to greatly reduce the noise of estimated quantities. Significantly, we find that as the characteristic speed of the flow decreases, the variance of the noisy estimators becomes independent of the characteristic speed.
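The core idea, importance sampling, can be demonstrated outside the Fokker–Planck setting: sample from a proposal distribution concentrated where the integrand matters and reweight by the likelihood ratio. The toy sketch below estimates a small Gaussian tail probability this way (the flow-solver application in the paper is far more involved; this only illustrates the variance-reduction mechanism).

```python
import random
from math import exp

def naive_tail_estimate(n, threshold=3.0, rng=None):
    """Plain Monte Carlo estimate of P(Z > threshold), Z ~ N(0, 1).
    Noisy for rare events: most samples contribute nothing."""
    rng = rng or random.Random(0)
    hits = sum(1 for _ in range(n) if rng.gauss(0.0, 1.0) > threshold)
    return hits / n

def importance_tail_estimate(n, threshold=3.0, rng=None):
    """Importance sampling: draw from N(threshold, 1) so the tail is hit
    often, and reweight by the likelihood ratio
    phi(x) / phi(x - threshold) = exp(threshold**2 / 2 - threshold * x)."""
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(threshold, 1.0)
        if x > threshold:
            total += exp(threshold * threshold / 2.0 - threshold * x)
    return total / n
```

For P(Z > 3) ≈ 1.35e-3, the importance-sampled estimator attains a standard error orders of magnitude smaller than the naive one at the same sample count, mirroring the paper's gains in the low-speed regime.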
Experiment kits for processing biological samples inflight on SLS-2
NASA Technical Reports Server (NTRS)
Savage, P. D.; Hinds, W. E.; Jaquez, R.; Evans, J.; Dubrovin, L.
1995-01-01
This paper describes development of an innovative, modular approach to packaging the instruments used to obtain and preserve the inflight rodent tissue and blood samples associated with hematology experiments on the Spacelab Life Sciences-2 (SLS-2) mission. The design approach organized the multitude of instruments into twelve 5- x 6- x 1-in. kits which were each used for a particular experiment. Each kit contained the syringes, vials, microscope slides, etc., necessary for processing and storing blood and tissue samples for one rat on a particular day. A total of 1245 components, packaged into 128 kits and stowed in 17 Zero(registered trademark) boxes, were required. Crewmembers found the design easy to use and laid out in a logical, simple configuration which minimized chances for error during the complex procedures in flight. This paper also summarizes inflight performance of the kits on SLS-2.
Mars rover sample return: An exobiology science scenario
NASA Technical Reports Server (NTRS)
Rosenthal, D. A.; Sims, M. H.; Schwartz, Deborah E.; Nedell, S. S.; Mckay, Christopher P.; Mancinelli, Rocco L.
1988-01-01
A mission designed to collect and return samples from Mars will provide information regarding its composition, history, and evolution. At the same time, a sample return mission generates a technical challenge. Sophisticated, semi-autonomous, robotic spacecraft systems must be developed in order to carry out complex operations at the surface of a very distant planet. An interdisciplinary effort was conducted to consider how such a Mars mission can be realistically structured to maximize the planetary science return. The focus was to concentrate on a particular set of scientific objectives (exobiology), to determine the instrumentation and analyses required to search for biological signatures, and to evaluate what analyses and decision making can be effectively performed by the rover in order to minimize the overhead of constant communication between Mars and the Earth. Investigations were also begun in the area of machine vision to determine whether layered sedimentary structures can be recognized autonomously, and preliminary results are encouraging.
Chirollo, Claudia; Radovnikovic, Anita; Veneziano, Vincenzo; Marrone, Raffaele; Pepe, Tiziana; Danaher, Martin; Anastasio, Aniello
2014-01-01
The aim of this study was to measure the persistence of residues of the pyrethroid insecticide α-cypermethrin (ACYP) in the milk of lactating donkeys following pour-on treatment. Milk was collected from animals (n = 7) before the treatment and at 12, 24, 36, 48, 60, 72 and 84 h post-treatment. The last sampling was taken 7 days post-treatment (168 h). Milk samples were analysed by ultra-high-performance liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). The analytical method was validated following requirements of Commission Decision 2002/657/EC. All samples showed levels of ACYP below the maximum residue limit (MRL) of 20 μg kg(-1) established for bovine milk (Commission Regulation (EU) No. 37/2010). The results demonstrate that there is minimal partitioning of ACYP into milk in lactating donkeys from pour-on treatment.
Radiometry in medicine and biology
NASA Astrophysics Data System (ADS)
Nahm, Kie-Bong; Choi, Eui Y.
2012-10-01
Diagnostics in medicine plays a critical role in helping medical professionals deliver proper diagnostic decisions. Most samples in this field are of human origin, and a great portion of the methodologies practiced in biology labs is shared by clinical diagnostic laboratories as well. Most clinical tests are quantitative in nature, and the recent increase in interest in preventive medicine requires the determination of minimal concentrations of target analytes, which exist in small quantities at the early stages of various diseases. Radiometry, or the use of optical radiation, is the most trusted and reliable means of converting biological concentrations into quantitative physical quantities. Since optical energy is readily available at varying energies (or wavelengths), an appropriate combination of light and the sample's absorption properties provides reliable information about the sample concentration through the Beer-Lambert law to a decent precision. In this article, the commonly practiced techniques in clinical and biology labs are reviewed from the standpoint of radiometry.
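The Beer-Lambert relation mentioned above maps transmitted intensity to concentration: A = log10(I0/I) = εlc, so c = A/(εl). A minimal sketch (Python; the ε and path-length values used in examples are illustrative, not tied to any particular assay):

```python
from math import log10

def absorbance(incident, transmitted):
    """Beer-Lambert absorbance from incident (I0) and transmitted (I)
    intensities: A = log10(I0 / I)."""
    return log10(incident / transmitted)

def concentration(absorb, molar_absorptivity, path_cm=1.0):
    """Analyte concentration c = A / (epsilon * l); units follow epsilon
    (e.g. mol/L when epsilon is in L/(mol*cm) and the path is in cm)."""
    return absorb / (molar_absorptivity * path_cm)
```

For instance, a sample transmitting 10% of the incident light has A = 1.0; with an assumed ε of 2000 L/(mol·cm) and a 1 cm cuvette, that corresponds to 5 × 10⁻⁴ mol/L.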
Imaging samples larger than the field of view: the SLS experience
NASA Astrophysics Data System (ADS)
Vogiatzis Oikonomidis, Ioannis; Lovric, Goran; Cremona, Tiziana P.; Arcadu, Filippo; Patera, Alessandra; Schittny, Johannes C.; Stampanoni, Marco
2017-06-01
Volumetric datasets with micrometer spatial and sub-second temporal resolutions are nowadays routinely acquired using synchrotron X-ray tomographic microscopy (SRXTM). Although SRXTM technology allows the examination of multiple samples with short scan times, many specimens are larger than the field-of-view (FOV) provided by the detector. The extension of the FOV in the direction perpendicular to the rotation axis remains non-trivial. We present a method that can efficiently increase the FOV merging volumetric datasets obtained by region-of-interest tomographies in different 3D positions of the sample with a minimal amount of artefacts and with the ability to handle large amounts of data. The method has been successfully applied for the three-dimensional imaging of a small number of mouse lung acini of intact animals, where pixel sizes down to the micrometer range and short exposure times are required.
SRRF: Universal live-cell super-resolution microscopy.
Culley, Siân; Tosheva, Kalina L; Matos Pereira, Pedro; Henriques, Ricardo
2018-08-01
Super-resolution microscopy techniques break the diffraction limit of conventional optical microscopy to achieve resolutions approaching tens of nanometres. The major advantage of such techniques is that they provide resolutions close to those obtainable with electron microscopy while maintaining the benefits of light microscopy such as a wide palette of high specificity molecular labels, straightforward sample preparation and live-cell compatibility. Despite this, the application of super-resolution microscopy to dynamic, living samples has thus far been limited and often requires specialised, complex hardware. Here we demonstrate how a novel analytical approach, Super-Resolution Radial Fluctuations (SRRF), is able to make live-cell super-resolution microscopy accessible to a wider range of researchers. We show its applicability to live samples expressing GFP using commercial confocal as well as laser- and LED-based widefield microscopes, with the latter achieving long-term timelapse imaging with minimal photobleaching. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.
Origin and Correction of Magnetic Field Inhomogeneity at the Interface in Biphasic NMR Samples
Martin, Bryan T.; Chingas, G. C.
2012-01-01
The use of susceptibility matching to minimize spectral distortion of biphasic samples layered in a standard 5 mm NMR tube is described. The approach uses magic angle spinning (MAS) to first extract chemical shift differences by suppressing bulk magnetization. Then, using biphasic coaxial samples, magnetic susceptibilities are matched by titration with a paramagnetic salt. The matched phases are then layered in a standard NMR tube where they can be shimmed and examined. Line widths of two distinct spectral lines, selected to characterize homogeneity in each phase, are simultaneously optimized. Two-dimensional distortion-free, slice-resolved spectra of an octanol/water system illustrate the method. These data are obtained using a 2D stepped-gradient pulse sequence devised for this application. Advantages of this sequence over slice-selective methods are that acquisition efficiency is increased and processing requires only conventional software. PMID:22459062
Microsample analyses via DBS: challenges and opportunities.
Henion, Jack; Oliveira, Regina V; Chace, Donald H
2013-10-01
The use of DBS is an appealing approach to employing microsampling techniques for the bioanalysis of samples, as has been demonstrated for the past 50 years in the metabolic screening of metabolites and diseases. In addition to its minimally invasive sample collection procedures and its economical merits, DBS microsampling benefits from the very high sensitivity, selectivity and multianalyte capabilities of LC-MS, which has been especially well demonstrated in newborn screening applications. Only a few microliters of a biological fluid are required for analysis, which also translates to significantly reduced demands on clinical samples from patients or from animals. Recently, the pharmaceutical industry and other arenas have begun to explore the utility and practicality of DBS microsampling. This review discusses the basis for why DBS techniques are likely to be part of the future, as well as offering insights into where these benefits may be realized.
NASA Astrophysics Data System (ADS)
Vogt, Carla; Contradi, S.; Rohde, E.
1997-09-01
Capillary electrophoresis is a modern separation technique; its extremely high efficiencies and minimal requirements with regard to buffers, samples and solvents have led to a dramatic increase in applications in the last few years. This paper offers an introduction to micellar electrokinetic chromatography, a special kind of capillary electrophoresis. Caffeine and other purine compounds have been determined in foodstuffs (tea, coffee, cocoa) as well as in pharmaceutical formulations. Different sample preparation procedures, developed with regard to the special properties of the sample matrices, are discussed in the paper. This preparation facilitates the separation in many cases, so students have to solve a relatively simple separation problem by varying buffer pH, buffer components and separation parameters. By performing a calibration for the analyzed purine compounds, they will learn about reproducibility in capillary electrophoresis.
Improving small-angle X-ray scattering data for structural analyses of the RNA world
Rambo, Robert P.; Tainer, John A.
2010-01-01
Defining the shape, conformation, or assembly state of an RNA in solution often requires multiple investigative tools ranging from nucleotide analog interference mapping to X-ray crystallography. A key addition to this toolbox is small-angle X-ray scattering (SAXS). SAXS provides direct structural information regarding the size, shape, and flexibility of the particle in solution and has proven powerful for analyses of RNA structures with minimal requirements for sample concentration and volumes. In principle, SAXS can provide reliable data on small and large RNA molecules. In practice, SAXS investigations of RNA samples can show inconsistencies that suggest limitations in the SAXS experimental analyses or problems with the samples. Here, we show through investigations on the SAM-I riboswitch, the Group I intron P4-P6 domain, 30S ribosomal subunit from Sulfolobus solfataricus (30S), brome mosaic virus tRNA-like structure (BMV TLS), Thermotoga maritima asd lysine riboswitch, the recombinant tRNAval, and yeast tRNAphe that many problems with SAXS experiments on RNA samples derive from heterogeneity of the folded RNA. Furthermore, we propose and test a general approach to reducing these sample limitations for accurate SAXS analyses of RNA. Together our method and results show that SAXS with synchrotron radiation has great potential to provide accurate RNA shapes, conformations, and assembly states in solution that inform RNA biological functions in fundamental ways. PMID:20106957
Butler, Owen; Musgrove, Darren; Stacey, Peter
2014-01-01
Workers can be exposed to fume, arising from welding activities, which contain toxic metals and metalloids. Occupational hygienists need to assess and ultimately minimize such exposure risks. The monitoring of the concentration of particles in workplace air is one assessment approach whereby fume, from representative welding activities, is sampled onto a filter and returned to a laboratory for analysis. Inductively coupled plasma-atomic emission spectrometry and inductively coupled plasma-mass spectrometry are generally employed as instrumental techniques of choice for the analysis of such filter samples. An inherent difficulty, however, with inductively coupled plasma-based analytical techniques is that they typically require a sample to be presented for analysis in the form of a solution. The efficiency of the required dissolution step relies heavily upon the skill and experience of the analyst involved. A useful tool in assessing the efficacy of this dissolution step would be the availability and subsequent analysis of welding fume reference materials with stated elemental concentrations and matrices that match as closely as possible the matrix composition of welding fume samples submitted to laboratories for analysis. This article describes work undertaken at the Health and Safety Laboratory to prepare and certify two new bulk welding fume reference materials that can be routinely used by analysts to assess the performance of the digestion procedures they employ in their laboratories. PMID:24499055
Where do the Field Plots Belong? A Multiple-Constraint Sampling Design for the BigFoot Project
NASA Astrophysics Data System (ADS)
Kennedy, R. E.; Cohen, W. B.; Kirschbaum, A. A.; Gower, S. T.
2002-12-01
A key component of a MODIS validation project is effective characterization of biophysical measures on the ground. Fine-grain ecological field measurements must be placed strategically to capture variability at the scale of the MODIS imagery. Here we describe the BigFoot project's revised sampling scheme, designed to simultaneously meet three important goals: capture landscape variability, avoid spatial autocorrelation between field plots, and minimize time and expense of field sampling. A stochastic process places plots in clumped constellations to reduce field sampling costs, while minimizing spatial autocorrelation. This stochastic process is repeated, creating several hundred realizations of plot constellations. Each constellation is scored and ranked according to its ability to match landscape variability in several Landsat-based spectral indices, and its ability to minimize field sampling costs. We show how this approach has recently been used to place sample plots at the BigFoot project's two newest study areas, one in a desert system and one in a tundra system. We also contrast this sampling approach to that already used at the four prior BigFoot project sites.
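The generate-and-score loop described above can be caricatured in a few lines: generate many clumped plot realizations, penalize plots closer than a separation threshold (spatial autocorrelation) plus total inter-plot distance (a crude proxy for field sampling cost), and keep the best realization. This sketch omits the Landsat spectral-variability scoring used in the actual BigFoot design; all parameter values and names below are illustrative.

```python
import random
from math import dist

def make_constellation(rng, n_clumps=5, plots_per_clump=4,
                       extent=5000.0, clump_radius=150.0):
    """One stochastic realization: clumped plot locations (meters) within
    a square landscape of side `extent`."""
    plots = []
    for _ in range(n_clumps):
        cx, cy = rng.uniform(0, extent), rng.uniform(0, extent)
        for _ in range(plots_per_clump):
            plots.append((cx + rng.uniform(-clump_radius, clump_radius),
                          cy + rng.uniform(-clump_radius, clump_radius)))
    return plots

def score(plots, min_separation=50.0):
    """Lower is better: heavily penalize pairs closer than min_separation
    (autocorrelation risk), then add total pairwise distance (travel cost)."""
    pairs = [(a, b) for i, a in enumerate(plots) for b in plots[i + 1:]]
    too_close = sum(1 for a, b in pairs if dist(a, b) < min_separation)
    travel = sum(dist(a, b) for a, b in pairs)
    return too_close * 1e6 + travel

def best_of(n_realizations=200, seed=0):
    """Repeat the stochastic placement and keep the best-scoring layout."""
    rng = random.Random(seed)
    return min((make_constellation(rng) for _ in range(n_realizations)),
               key=score)
```

In the real design the ranking additionally rewards constellations whose plots match the landscape's variability in Landsat-based spectral indices; that term would simply be added to `score`.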
Cybersonics: Tapping into Technology
NASA Technical Reports Server (NTRS)
2001-01-01
With the assistance of Small Business Innovation Research (SBIR) funding from NASA's Jet Propulsion Laboratory, Cybersonics, Inc., developed an ultrasonic drill with applications ranging from the medical industry to space exploration. The drill, which has the ability to take a core sample of the hardest granite or perform the most delicate diagnostic medical procedure, is a lightweight, ultrasonic device made to fit in the palm of the hand. Piezoelectric actuators, which have only two moving parts and no gears or motors, drive the components of the device, enabling it to operate in a wide range of temperatures. The most remarkable aspect of the drill is its ability to penetrate even the hardest rock with minimal force application. The ultrasonic device requires 20 to 30 times less force than standard rotating drills, allowing it to be safely guided by hand during operation. Also, the drill is operable at a level as low as three watts of power, where conventional drills require more than three times this level. Potential future applications for the ultrasonic drill include rock and soil sampling, medical procedures that involve core sampling or probing, landmine detection, building and construction, and space exploration.
Wang, Zhuoyu; Dendukuri, Nandini; Pai, Madhukar; Joseph, Lawrence
2017-11-01
When planning a study to estimate disease prevalence to a pre-specified precision, it is of interest to minimize total testing cost. This is particularly challenging in the absence of a perfect reference test for the disease because different combinations of imperfect tests need to be considered. We illustrate the problem and a solution by designing a study to estimate the prevalence of childhood tuberculosis in a hospital setting. All possible combinations of 3 commonly used tuberculosis tests, including chest X-ray, tuberculin skin test, and a sputum-based test, either culture or Xpert, are considered. For each of the 11 possible test combinations, 3 Bayesian sample size criteria, including average coverage criterion, average length criterion and modified worst outcome criterion, are used to determine the required sample size and total testing cost, taking into consideration prior knowledge about the accuracy of the tests. In some cases, the required sample sizes and total testing costs were both reduced when more tests were used, whereas, in other examples, lower costs are achieved with fewer tests. Total testing cost should be formally considered when designing a prevalence study.
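The trade-off described above — more accurate tests cost more per subject but may need fewer subjects for the same precision — can be illustrated with a much simpler frequentist stand-in for the authors' Bayesian criteria: a normal-approximation sample size for the Rogan-Gladen prevalence estimator under a single imperfect test. The formula is standard; the test characteristics, costs, and function names below are illustrative assumptions, not values from the study.

```python
import math

def required_n(prev, sens, spec, half_width, z=1.96):
    """Approximate sample size to estimate true prevalence to +/- half_width
    (95% confidence) with one imperfect test, via the Rogan-Gladen
    correction: var(p_hat) = p_app * (1 - p_app) / (n * J**2), J = sens + spec - 1."""
    youden = sens + spec - 1.0                          # test informativeness; must be > 0
    p_app = prev * sens + (1.0 - prev) * (1.0 - spec)   # apparent (test-positive) prevalence
    per_subject_var = p_app * (1.0 - p_app) / youden ** 2
    return math.ceil(z ** 2 * per_subject_var / half_width ** 2)

def study_cost(prev, sens, spec, half_width, cost_per_subject):
    """Total testing cost for the required sample size."""
    n = required_n(prev, sens, spec, half_width)
    return n, n * cost_per_subject
```

For a hypothetical 10% prevalence and a ±5% target precision, a cheap but less accurate test (sens 0.70, spec 0.95) needs roughly twice as many subjects as an accurate one (sens 0.95, spec 0.98), so the cheaper per-subject test does not automatically minimize total cost — the same qualitative conclusion the abstract reaches for test combinations.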
NASA Astrophysics Data System (ADS)
Sanz, Miguel; Ramos, Gonzalo; Moral, Andoni; Pérez, Carlos; Belenguer, Tomás; del Rosario Canchal, María; Zuluaga, Pablo; Rodriguez, Jose Antonio; Santiago, Amaia; Rull, Fernando; Instituto Nacional de Técnica Aeroespacial (INTA); Ingeniería de Sistemas para la Defesa de España S.A. (ISDEFE)
2016-10-01
Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments of the ExoMars mission, within ESA's Aurora Exploration Programme, and will perform Raman spectroscopy on a planetary exploration mission for the first time. RLS is composed of the SPU (Spectrometer Unit), the iOH (Internal Optical Head), and the ICEU (Instrument Control and Excitation Unit). The iOH focuses the excitation laser on the samples (excitation path) and collects the Raman emission from the sample (collection path, consisting of a collimation system and a filtering system). The original design let a high level of laser trace reach the detector; although some laser trace was required for calibration purposes, this high level degraded the signal-to-noise ratio and obscured some Raman peaks. We report on the investigation that revealed the laser trace was not properly filtered, and on the resulting opto-mechanical redesign of the iOH. After studying the optical density (OD) of the long-pass filters as a function of the distance between the filtering stage and the detector, a new set of filters (notch filters) was selected for evaluation. Finally, to minimize the laser trace, a new collection-path design was required, in which the collimation and filtering stages are separated into two barrels and a different kind of filter is used. Increasing the distance between the filters and the first lens of the collimation stage increased the OD. With this new design, using two notch filters, the laser trace was reduced to acceptable values, as shown by the functional test comparison also reported in this paper.
Environmental scanning electron microscopy in cell biology.
McGregor, J E; Staniewicz, L T L; Guthrie Neé Kirk, S E; Donald, A M
2013-01-01
Environmental scanning electron microscopy (ESEM) (1) is an imaging technique which allows hydrated, insulating samples to be imaged under an electron beam. The resolution afforded by this technique is higher than conventional optical microscopy but lower than conventional scanning electron microscopy (CSEM). The major advantage of the technique is the minimal sample preparation needed, making ESEM quick to use and the images less susceptible to the artifacts that the extensive sample preparation usually required for CSEM may introduce. Careful manipulation of both the humidity in the microscope chamber and the beam energy is nevertheless essential to prevent dehydration and beam damage artifacts. In some circumstances it is possible to image live cells in the ESEM (2). In the following sections we introduce the fundamental principles of ESEM imaging before presenting imaging protocols for plant epidermis, mammalian cells, and bacteria. In the first two cases samples are imaged using the secondary electron (topographic) signal, whereas a transmission technique is employed to image bacteria.
Field, Christopher R.; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C.; Rose-Pehrsson, Susan L.
2014-01-01
The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples. PMID:25145416
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2014 CFR
2014-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2013 CFR
2013-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2012 CFR
2012-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
5 CFR 581.203 - Information minimally required to accompany legal process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...
Sauerbeck, Andrew; Pandya, Jignesh; Singh, Indrapal; Bittman, Kevin; Readnower, Ryan; Bing, Guoying; Sullivan, Patrick
2012-01-01
The analysis of mitochondrial bioenergetic function typically has required 50–100 μg of protein per sample and at least 15 min per run when utilizing a Clark-type oxygen electrode. In the present work we describe a method utilizing the Seahorse Biosciences XF24 Flux Analyzer for measuring mitochondrial oxygen consumption simultaneously from multiple samples and utilizing only 5 μg of protein per sample. Utilizing this method we have investigated whether regionally based differences exist in mitochondria isolated from the cortex, striatum, hippocampus, and cerebellum. Analysis of basal mitochondrial bioenergetics revealed that minimal differences exist between the cortex, striatum, and hippocampus. However, the cerebellum exhibited significantly slower basal rates of Complex I and Complex II dependent oxygen consumption (p < 0.05). Mitochondrial inhibitors affected enzyme activity proportionally across all samples tested and only small differences existed in the effect of inhibitors on oxygen consumption. Investigation of the effect of rotenone administration on Complex I dependent oxygen consumption revealed that exposure to 10 pM rotenone led to a clear time dependent decrease in oxygen consumption beginning 12 min after administration (p < 0.05). These studies show that the utilization of this microplate based method for analysis of mitochondrial bioenergetics is effective at quantifying oxygen consumption simultaneously from multiple samples. Additionally, these studies indicate that minimal regional differences exist in mitochondria isolated from the cortex, striatum, or hippocampus. Furthermore, utilization of the mitochondrial inhibitors suggests that previous work indicating regionally specific deficits following systemic mitochondrial toxin exposure may not be the result of differences in the individual mitochondria from the affected regions. PMID:21402103
Wu, Dongrui; Lance, Brent J; Parsons, Thomas D
2013-01-01
Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
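A toy version of the collaborative-filtering idea described above — weight auxiliary training samples from other subjects by a mean-squared-difference similarity heuristic, then classify with weighted k nearest neighbors — might look like the following sketch. The weighting scheme, feature profiles, and names are assumptions for illustration, not the paper's implementation.

```python
def msd_similarity(profile_a, profile_b):
    """Mean-squared-difference similarity heuristic: identical subject
    profiles give 1.0; very dissimilar profiles approach 0.0."""
    msd = sum((a - b) ** 2 for a, b in zip(profile_a, profile_b)) / len(profile_a)
    return 1.0 / (1.0 + msd)

def weighted_knn(query, train, k=3):
    """train: list of (features, label, weight); the k nearest training
    samples vote with their weight."""
    sq_dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda t: sq_dist(query, t[0]))[:k]
    votes = {}
    for _, label, weight in nearest:
        votes[label] = votes.get(label, 0.0) + weight
    return max(votes, key=votes.get)

def build_training_set(user_samples, aux_subjects, user_profile):
    """Combine the few user-specific samples (full weight 1.0) with auxiliary
    samples from other subjects, down-weighted by subject similarity."""
    train = [(x, y, 1.0) for x, y in user_samples]
    for profile, samples in aux_subjects:
        w = msd_similarity(user_profile, profile)
        train += [(x, y, w) for x, y in samples]
    return train
```

With only one user-specific sample, the classifier can still separate "easy" from "hard" task-difficulty classes by borrowing down-weighted samples from a similar subject, which is the point of the transfer-learning step.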
Gebler, J.B.
2004-01-01
The related topics of spatial variability of aquatic invertebrate community metrics, implications of spatial patterns of metric values to distributions of aquatic invertebrate communities, and ramifications of natural variability to the detection of human perturbations were investigated. Four metrics commonly used for stream assessment were computed for 9 stream reaches within a fairly homogeneous, minimally impaired stream segment of the San Pedro River, Arizona. Metric variability was assessed for differing sampling scenarios using simple permutation procedures. Spatial patterns of metric values suggest that aquatic invertebrate communities are patchily distributed on subsegment and segment scales, which causes metric variability. Wide ranges of metric values resulted in wide ranges of metric coefficients of variation (CVs) and minimum detectable differences (MDDs), and both CVs and MDDs often increased as sample size (number of reaches) increased, suggesting that any particular set of sampling reaches could yield misleading estimates of population parameters and effects that can be detected. Mean metric variabilities were substantial, with the result that only fairly large differences in metrics would be declared significant at α = 0.05 and β = 0.20. The number of reaches required to obtain MDDs of 10% and 20% varied with significance level and power, and differed for different metrics, but were generally large, ranging into tens and hundreds of reaches. Study results suggest that metric values from one or a small number of stream reach(es) may not be adequate to represent a stream segment, depending on effect sizes of interest, and that larger sample sizes are necessary to obtain reasonable estimates of metrics and sample statistics. For bioassessment to progress, spatial variability may need to be investigated in many systems and should be considered when designing studies and interpreting data.
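The resampling exercise described above — compute a metric over every possible choice of n reaches, then examine the spread of coefficients of variation and the minimum detectable difference (MDD) as n grows — can be sketched as follows. The MDD formula is the standard two-sample normal approximation (two-sided α = 0.05, power = 0.80); the metric values in the usage example are invented, not the San Pedro River data.

```python
import itertools
import math
import statistics

def cv(values):
    """Coefficient of variation of a set of reach-level metric values."""
    return statistics.stdev(values) / statistics.fmean(values)

def mdd(values, n, z_alpha=1.96, z_beta=0.84):
    """Minimum detectable difference between two groups of n reaches each,
    normal approximation: (z_alpha + z_beta) * SD * sqrt(2 / n)."""
    return (z_alpha + z_beta) * statistics.stdev(values) * math.sqrt(2.0 / n)

def cv_range_over_subsets(values, n):
    """Spread of CV estimates over every possible sample of n reaches,
    a simple stand-in for the permutation procedures in the abstract."""
    cvs = [cv(subset) for subset in itertools.combinations(values, n)]
    return min(cvs), max(cvs)
```

With nine hypothetical reach metrics, the CV estimated from three reaches can land anywhere in a wide interval, and the MDD shrinks only with the square root of the number of reaches, which is why tens to hundreds of reaches can be needed for small detectable differences.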
Metabolomic profiling in perinatal asphyxia: a promising new field.
Denihan, Niamh M; Boylan, Geraldine B; Murray, Deirdre M
2015-01-01
Metabolomics, the latest "omic" technology, is defined as the comprehensive study of all low molecular weight biochemicals, "metabolites" present in an organism. As a systems biology approach, metabolomics has huge potential to progress our understanding of perinatal asphyxia and neonatal hypoxic-ischaemic encephalopathy, by uniquely detecting rapid biochemical pathway alterations in response to the hypoxic environment. The study of metabolomic biomarkers in the immediate neonatal period is not a trivial task and requires a number of specific considerations, unique to this disease and population. Recruiting a clearly defined cohort requires standardised multicentre recruitment with broad inclusion criteria and the participation of a range of multidisciplinary staff. Minimally invasive biospecimen collection is a priority for biomarker discovery. Umbilical cord blood presents an ideal medium as large volumes can be easily extracted and stored and the sample is not confounded by postnatal disease progression. Pristine biobanking and phenotyping are essential to ensure the validity of metabolomic findings. This paper provides an overview of the current state of the art in the field of metabolomics in perinatal asphyxia and neonatal hypoxic-ischaemic encephalopathy. We detail the considerations required to ensure high quality sampling and analysis, to support scientific progression in this important field.
Biological Sterilization of Returned Mars Samples
NASA Technical Reports Server (NTRS)
Allen, C. C.; Albert, F. G.; Combie, J.; Bodnar, R. J.; Hamilton, V. E.; Jolliff, B. L.; Kuebler, K.; Wang, A.; Lindstrom, D. J.; Morris, P. A.
1999-01-01
Martian rock and soil, collected by robotic spacecraft, will be returned to terrestrial laboratories early in the next century. Current plans call for the samples to be immediately placed into biological containment and tested for signs of present or past life and biological hazards. It is recommended that "Controlled distribution of unsterilized materials from Mars should occur only if rigorous analyses determine that the materials do not constitute a biological hazard. If any portion of the sample is removed from containment prior to completion of these analyses it should first be sterilized." While sterilization of Mars samples may not be required, an acceptable method must be available before the samples are returned to Earth. The sterilization method should be capable of destroying a wide range of organisms with minimal effects on the geologic samples. A variety of biological sterilization techniques and materials are currently in use, including dry heat, high pressure steam, gases, plasmas and ionizing radiation. Gamma radiation is routinely used to inactivate viruses and destroy bacteria in medical research. Many commercial sterilizers use Co-60, which emits gamma photons of 1.17 and 1.33 MeV. Absorbed doses of approximately 1 Mrad (10^8 ergs/g) destroy most bacteria. This study investigates the effects of lethal doses of Co-60 gamma radiation on materials similar to those anticipated to be returned from Mars. The goals are to determine the gamma dose required to kill microorganisms in rock and soil samples and to determine the effects of gamma sterilization on the samples' isotopic, chemical and physical properties. Additional information is contained in the original extended abstract.
Léger, Julie; Tandé, Didier; Plouzeau, Chloé; Valentin, Anne Sophie; Jolivet-Gougeon, Anne; Lemarié, Carole; Kempf, Marie; Héry-Arnaud, Geneviève; Bret, Laurent; Juvin, Marie Emmanuelle; Giraudeau, Bruno; Burucoa, Christophe
2015-01-01
Although numerous perioperative samples and culture media are required to diagnose prosthetic joint infection (PJI), their exact number and types have not yet been definitively determined with a high level of proof. We conducted a prospective multicenter study to determine the minimal number of samples and culture media required for accurate diagnosis of PJI. Over a 2-year period, consecutive patients with clinical signs suggesting PJI were included, with five perioperative samples per patient. The bacteriological and PJI diagnosis criteria were assessed using a random selection of two, three, or four samples and compared with those obtained using the recommended five samples (reference guidelines). The results obtained with two or three culture media were then compared with those obtained with five culture media for both criteria. The times-to-positivity of the different culture media were calculated. PJI was confirmed in 215/264 suspected cases, with a bacteriological criterion in 192 (89%). The PJI was monomicrobial (85%) or polymicrobial (15%). Percentages of agreement of 98.1% and 99.7%, respectively, for the bacteriological criterion and confirmed PJI diagnosis were obtained when four perioperative samples were considered. The highest percentages of agreement were obtained with the association of three culture media, a blood culture bottle, a chocolate agar plate, and Schaedler broth, incubated for 5, 7, and 14 days, respectively. This new procedure leads to significant cost saving. Our prospective multicenter study showed that four samples seeded on three culture media are sufficient for diagnosing PJI. PMID:26637380
2013-01-01
Background: Many problems in protein modeling require obtaining a discrete representation of the protein conformational space as an ensemble of conformations. In ab-initio structure prediction, in particular, where the goal is to predict the native structure of a protein chain given its amino-acid sequence, the ensemble needs to satisfy energetic constraints. Given the thermodynamic hypothesis, an effective ensemble contains low-energy conformations which are similar to the native structure. The high-dimensionality of the conformational space and the ruggedness of the underlying energy surface currently make it very difficult to obtain such an ensemble. Recent studies have proposed that Basin Hopping is a promising probabilistic search framework to obtain a discrete representation of the protein energy surface in terms of local minima. Basin Hopping performs a series of structural perturbations followed by energy minimizations with the goal of hopping between nearby energy minima. This approach has been shown to be effective in obtaining conformations near the native structure for small systems. Recent work by us has extended this framework to larger systems through employment of the molecular fragment replacement technique, resulting in rapid sampling of large ensembles. Methods: This paper investigates the algorithmic components in Basin Hopping to both understand and control their effect on the sampling of near-native minima. Realizing that such an ensemble is reduced before further refinement in full ab-initio protocols, we take an additional step and analyze the quality of the ensemble retained by ensemble reduction techniques. We propose a novel multi-objective technique based on the Pareto front to filter the ensemble of sampled local minima.
Results and conclusions: We show that controlling the magnitude of the perturbation allows directly controlling the distance between consecutively-sampled local minima and, in turn, steering the exploration towards conformations near the native structure. For the minimization step, we show that the addition of Metropolis Monte Carlo-based minimization is no more effective than a simple greedy search. Finally, we show that the size of the ensemble of sampled local minima can be effectively and efficiently reduced by a multi-objective filter to obtain a simpler representation of the probed energy surface. PMID:24564970
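The Basin Hopping loop analyzed above — perturb the current minimum, locally minimize, accept greedily — can be sketched on a one-dimensional rugged energy function. The toy function, step sizes, and names are illustrative assumptions; real protein systems perturb conformations via molecular fragment replacement, not scalar offsets.

```python
import math
import random

def basin_hopping(f, x0, hop_step, n_hops=200, local_steps=60, seed=0):
    """Basin Hopping sketch: a structural perturbation of magnitude hop_step,
    followed by a greedy local minimization (the paper finds greedy
    acceptance no less effective than Metropolis Monte Carlo here)."""
    rng = random.Random(seed)

    def local_min(x):
        # crude stochastic descent standing in for a real energy minimizer
        fx = f(x)
        for _ in range(local_steps):
            cand = x + rng.uniform(-0.05, 0.05)
            if f(cand) < fx:
                x, fx = cand, f(cand)
        return x, fx

    x, fx = local_min(x0)
    minima = [(x, fx)]          # sampled local minima = the energy-surface map
    for _ in range(n_hops):
        # hop_step directly controls the distance between consecutive minima
        y, fy = local_min(x + rng.uniform(-hop_step, hop_step))
        minima.append((y, fy))
        if fy < fx:             # greedy acceptance
            x, fx = y, fy
    return x, fx, minima

# rugged toy "energy surface": a quadratic well with sinusoidal ripples
rugged = lambda x: x * x + 2.0 * math.sin(5.0 * x)
```

Starting from x = 5.0, the loop records every local minimum it visits; increasing `hop_step` spreads consecutive minima further apart, mirroring the perturbation-magnitude result in the abstract.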
Efficiency of unconstrained minimization techniques in nonlinear analysis
NASA Technical Reports Server (NTRS)
Kamat, M. P.; Knight, N. F., Jr.
1978-01-01
Unconstrained minimization algorithms have been critically evaluated for their effectiveness in solving structural problems involving geometric and material nonlinearities. The algorithms have been categorized as being zeroth, first, or second order depending upon the highest derivative of the function required by the algorithm. The sensitivity of these algorithms to the accuracy of derivatives clearly suggests using analytically derived gradients instead of finite difference approximations. The use of analytic gradients results in better control of the number of minimizations required for convergence to the exact solution.
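The point above about analytic versus finite-difference gradients can be illustrated with a first-order method: on a smooth function, central differences reproduce the analytic gradient closely, but each gradient costs 2n extra function evaluations, and a poorly chosen difference step degrades the accuracy the convergence depends on. The quadratic test problem, step sizes, and names below are illustrative assumptions.

```python
def fd_gradient(f, x, h=1e-6):
    """Central finite-difference gradient: 2 * len(x) function evaluations."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

def gradient_descent(grad, x0, lr=0.05, steps=300):
    """First-order unconstrained minimization with a fixed step length."""
    x = list(x0)
    for _ in range(steps):
        x = [xi - lr * gi for xi, gi in zip(x, grad(x))]
    return x

# smooth test problem with minimum at (2, -1)
f = lambda v: (v[0] - 2.0) ** 2 + 10.0 * (v[1] + 1.0) ** 2
analytic = lambda v: [2.0 * (v[0] - 2.0), 20.0 * (v[1] + 1.0)]
```

Both gradient sources converge to the same minimum here; with a too-large `h` or a noisy objective, the finite-difference path degrades first, which is the sensitivity to derivative accuracy the abstract reports.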
Why minimally invasive skin sampling techniques? A bright scientific future.
Wang, Christina Y; Maibach, Howard I
2011-03-01
There is increasing interest in minimally invasive skin sampling techniques to assay markers of molecular biology and biochemical processes. This overview examines methodology strengths and limitations, and exciting developments pending in the scientific community. Publications were searched via PubMed, the U.S. Patent and Trademark Office Website, the DermTech Website and the CuDerm Website. The keywords used were noninvasive skin sampling, skin stripping, skin taping, detergent method, ring method, mechanical scrub, reverse iontophoresis, glucose monitoring, buccal smear, hair root sampling, mRNA, DNA, RNA, and amino acid. There is strong interest in finding methods to access internal biochemical, molecular, and genetic processes through noninvasive and minimally invasive external means. Minimally invasive techniques include the widely used skin tape stripping, the abrasion method that includes scraping and detergent, and reverse iontophoresis. The first 2 methods harvest largely the stratum corneum. Hair root sampling (material deeper than the epidermis), buccal smear, shave biopsy, punch biopsy, and suction blistering are also methods used to obtain cellular material for analysis, but involve some degree of increased invasiveness and thus are only briefly mentioned. Existing and new sampling methods are being refined and validated, offering exciting, different noninvasive means of quickly and efficiently obtaining molecular material with which to monitor bodily functions and responses, assess drug levels, and follow disease processes without subjecting patients to unnecessary discomfort and risk.
Astigmatism corrected common path probe for optical coherence tomography.
Singh, Kanwarpal; Yamada, Daisuke; Tearney, Guillermo
2017-03-01
Optical coherence tomography (OCT) catheters for intraluminal imaging are subject to various artifacts due to reference-sample arm dispersion imbalances and sample arm beam astigmatism. The goal of this work was to develop a probe that minimizes such artifacts. Our probe was fabricated using a single mode fiber at the tip of which a glass spacer and graded index objective lens were spliced to achieve the desired focal distance. The signal was reflected using a curved reflector to correct for astigmatism caused by the thin, protective, transparent sheath that surrounds the optics. The probe design was optimized using Zemax, commercially available optical design software. Common path interferometric operation was achieved using Fresnel reflection from the tip of the focusing graded index objective lens. The performance of the probe was tested using a custom designed spectrometer-based OCT system. The probe achieved an axial resolution of 15.6 μm in air, a lateral resolution of 33 μm, and a sensitivity of 103 dB. A scattering tissue phantom was imaged to test the performance of the probe for astigmatism correction. Images of the phantom confirmed that this common-path, astigmatism-corrected OCT imaging probe had minimal artifacts in the axial, and lateral dimensions. In this work, we developed an astigmatism-corrected, common path probe that minimizes artifacts associated with standard OCT probes. This design may be useful for OCT applications that require high axial and lateral resolutions. Lasers Surg. Med. 49:312-318, 2017. © 2016 Wiley Periodicals, Inc.
Minimal T-wave representation and its use in the assessment of drug arrhythmogenicity.
Shakibfar, Saeed; Graff, Claus; Kanters, Jørgen K; Nielsen, Jimmi; Schmidt, Samuel; Struijk, Johannes J
2017-05-01
Recently, numerous models and techniques have been developed for analyzing and extracting features from the T wave which could be used as biomarkers for drug-induced abnormalities. The majority of these techniques and algorithms use features that determine readily apparent characteristics of the T wave, such as duration, area, amplitude, and slopes. In the present work the T wave was down-sampled to a minimal rate, such that a good reconstruction was still possible. The entire T wave was then used as a feature vector to assess drug-induced repolarization effects. The ability of the samples or combinations of samples obtained from the minimal T-wave representation to correctly classify a group of subjects before and after receiving d,l-sotalol 160 mg and 320 mg was evaluated using a linear discriminant analysis (LDA). The results showed that a combination of eight samples from the minimal T-wave representation can be used to identify normal from abnormal repolarization significantly better compared to the heart rate-corrected QT interval (QTc). It was further indicated that the interval from the peak of the T wave to the end of the T wave (Tpe) becomes relatively shorter after I K r inhibition by d,l-sotalol and that the most pronounced repolarization changes were present in the ascending segment of the minimal T-wave representation. The minimal T-wave representation can potentially be used as a new tool to identify normal from abnormal repolarization in drug safety studies. © 2016 Wiley Periodicals, Inc.
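The core idea of the minimal T-wave representation, downsampling the wave to a handful of points from which the shape can still be reconstructed, can be sketched in a few lines. The Gaussian waveform, sample counts, and linear-interpolation scheme below are illustrative assumptions, not the paper's actual data or algorithm:

```python
import math

def t_wave(t):
    # Idealized T wave as a Gaussian bump (illustrative shape only, not
    # taken from the paper): peak at t = 0.5, width 0.1.
    return math.exp(-((t - 0.5) ** 2) / (2 * 0.1 ** 2))

N_FULL = 200  # "full-rate" waveform: 200 samples over one T-wave interval
full = [t_wave(i / (N_FULL - 1)) for i in range(N_FULL)]

def downsample(signal, keep):
    # Keep `keep` roughly evenly spaced samples, first and last included.
    step = (len(signal) - 1) / (keep - 1)
    return [signal[round(i * step)] for i in range(keep)]

def reconstruct(samples, n_out):
    # Rebuild the full-rate waveform by linear interpolation.
    out = []
    for i in range(n_out):
        x = i * (len(samples) - 1) / (n_out - 1)
        lo = int(x)
        hi = min(lo + 1, len(samples) - 1)
        frac = x - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

def max_error(keep):
    # Worst-case reconstruction error for a `keep`-sample representation.
    approx = reconstruct(downsample(full, keep), N_FULL)
    return max(abs(a - b) for a, b in zip(full, approx))

err8, err16 = max_error(8), max_error(16)  # denser sampling reconstructs better
```

In the study, a combination of eight such samples fed the linear discriminant analysis; this sketch only illustrates the trade-off that reconstruction error shrinks as the minimal representation grows.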
EFFECT OF SHORT-TERM ART INTERRUPTION ON LEVELS OF INTEGRATED HIV DNA.
Strongin, Zachary; Sharaf, Radwa; VanBelzen, D Jake; Jacobson, Jeffrey M; Connick, Elizabeth; Volberding, Paul; Skiest, Daniel J; Gandhi, Rajesh T; Kuritzkes, Daniel R; O'Doherty, Una; Li, Jonathan Z
2018-03-28
Analytic treatment interruption (ATI) studies are required to evaluate strategies aimed at achieving ART-free HIV remission, but the impact of ATI on the viral reservoir remains unclear. We validated a DNA size selection-based assay for measuring levels of integrated HIV DNA and applied it to assess the effects of short-term ATI on the HIV reservoir. Samples from participants in four AIDS Clinical Trials Group (ACTG) ATI studies were assayed for integrated HIV DNA levels. Cryopreserved PBMCs were obtained for 12 participants with available samples pre-ATI and approximately 6 months after ART resumption. Four participants also had samples available during the ATI. The median duration of ATI was 12 weeks. Validation of the HIV Integrated DNA size-Exclusion (HIDE) assay was performed using samples spiked with unintegrated HIV DNA, HIV-infected cell lines, and participant PBMCs. The HIDE assay eliminated 99% of unintegrated HIV DNA species and strongly correlated with the established Alu-gag assay. For the majority of individuals, integrated DNA levels increased during ATI and subsequently declined upon ART resumption. There was no significant difference in levels of integrated HIV DNA between the pre- and post-ATI time points, with a median post- to pre-ATI HIV DNA ratio of 0.95. Using a new integrated HIV DNA assay, we found minimal change in the levels of integrated HIV DNA in participants who underwent an ATI followed by 6 months of ART. This suggests that short-term ATI can be conducted without a significant impact on levels of integrated proviral DNA in the peripheral blood. IMPORTANCE Interventions aimed at achieving sustained antiretroviral therapy (ART)-free HIV remission require treatment interruption trials to assess their efficacy. However, these trials are accompanied by safety concerns related to the expansion of the viral reservoir.
We validated an assay that uses an automated DNA size-selection platform for quantifying levels of integrated HIV DNA and is less sample- and labor-intensive than current assays. Using stored samples from AIDS Clinical Trials Group studies, we found that short-term ART discontinuation had minimal impact on integrated HIV DNA levels after ART resumption, providing reassurance about the reservoir effects of short-term treatment interruption trials. Copyright © 2018 American Society for Microbiology.
Revesz, Kinga M.; Landwehr, Jurate M.
2002-01-01
A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H3PO4/CaCO3) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous flow isotope ratio mass spectrometer. Conditions for which the H3PO4/CaCO3 reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 °C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was ≤0.1 and ≤0.2 per mill or ‰, respectively, although later analysis showed that materials from one specific standard required reaction time between 34 and 54 h for δ18O to achieve this level of precision. Aliquot screening methods were shown to further minimize the total error. The accuracy and precision of the new method were analyzed and confirmed by statistical analysis. The utility of the method was verified by analyzing calcite from Devils Hole, Nevada, for which isotope-ratio values had previously been obtained by the classical method. Devils Hole core DH-11 recently had been re-cut and re-sampled, and isotope-ratio values were obtained using the new method. The results were comparable with those obtained by the classical method with correlation = +0.96 for both isotope ratios. The consistency of the isotopic results is such that an alignment offset could be identified in the re-sampled core material, and two cutting errors that occurred during re-sampling then were confirmed independently. This result indicates that the new method is a viable alternative to the classical reaction method. 
In particular, the new method requires less sample material permitting finer resolution and allows automation of some processes resulting in considerable time savings.
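The delta notation and the precision criterion reported above can be made concrete with a short sketch. The replicate ratios below are hypothetical, and the VPDB reference ratio is the commonly cited value, used here only for illustration:

```python
from statistics import stdev, mean

# 13C/12C ratio of the VPDB reference standard (commonly cited value;
# treated as an assumption here, for illustration only).
R_VPDB = 0.0112372

def delta13C(r_sample):
    # Standard delta notation, in per mil:
    #   delta = (R_sample / R_standard - 1) * 1000
    return (r_sample / R_VPDB - 1) * 1000.0

# Hypothetical replicate 13C/12C measurements of one reference material.
replicate_ratios = [0.0112301, 0.0112305, 0.0112298, 0.0112303]
deltas = [delta13C(r) for r in replicate_ratios]

precision = stdev(deltas)  # 1-sigma precision of the pooled replicates, per mil
```

The acceptance criterion the authors report corresponds to a pooled precision of ≤0.1‰ for the carbon isotope ratio (≤0.2‰ for oxygen); the hypothetical replicates above would pass it.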
36 CFR 228.8 - Requirements for environmental protection.
Code of Federal Regulations, 2012 CFR
2012-07-01
... shall be conducted so as, where feasible, to minimize adverse environmental impacts on National Forest..., shall either be removed from National Forest lands or disposed of or treated so as to minimize, so far... 36 Parks, Forests, and Public Property 2 2012-07-01 2012-07-01 false Requirements for...
40 CFR 63.6605 - What are my general requirements for complying with this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... maintain any affected source, including associated air pollution control equipment and monitoring equipment, in a manner consistent with safety and good air pollution control practices for minimizing emissions. The general duty to minimize emissions does not require you to make any further efforts to reduce...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
...The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a pre-clearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and continuing collections of information in accordance with the Paperwork Reduction Act of 1995, 44 U.S.C. 3506(c)(2)(A). This program helps to assure that requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration (MSHA) is soliciting comments concerning the proposed information collection for updating Radiation Sampling and Exposure Records.
Selective electron spin resonance measurements of micrometer-scale thin samples on a substrate
NASA Astrophysics Data System (ADS)
Dikarov, Ekaterina; Fehr, Matthias; Schnegg, Alexander; Lips, Klaus; Blank, Aharon
2013-11-01
An approach to the selective observation of paramagnetic centers in thin samples or surfaces with electron spin resonance (ESR) is presented. The methodology is based on the use of a surface microresonator that enables ESR data to be obtained selectively from thin layers with minimal background signals from the supporting substrate. An experimental example is provided in which the ESR signal is measured from a 1.2 µm polycrystalline silicon layer on a glass substrate used in modern solar-cell technology. The ESR results obtained with the surface microresonator show the effective elimination of background signals, especially at low cryogenic temperatures, compared to the use of a conventional resonator. The surface microresonator also provides much higher absolute spin sensitivity, requiring much smaller surfaces for the measurement.
Responsible gambling: general principles and minimal requirements.
Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc
2011-12-01
Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program.
PLA realizations for VLSI state machines
NASA Technical Reports Server (NTRS)
Gopalakrishnan, S.; Whitaker, S.; Maki, G.; Liu, K.
1990-01-01
A major problem associated with state assignment procedures for VLSI controllers is obtaining an assignment that produces minimal or near minimal logic. The key item in Programmable Logic Array (PLA) area minimization is the number of unique product terms required by the design equations. This paper presents a state assignment algorithm for minimizing the number of product terms required to implement a finite state machine using a PLA. Partition algebra with predecessor state information is used to derive a near optimal state assignment. A maximum bound on the number of product terms required can be obtained by inspecting the predecessor state information. The state assignment algorithm presented is much simpler than existing procedures and leads to the same number of product terms or less. An area-efficient PLA structure implemented in a 1.0 micron CMOS process is presented along with a summary of the performance for a controller implemented using this design procedure.
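The quantity being minimized, the number of unique product terms shared across the PLA's AND plane, can be illustrated with a toy example. The design equations and cube notation below are hypothetical, not from the paper:

```python
# Each design equation is a sum of product terms; a product term is written
# as a cube over the state/input variables ('0', '1', '-' = don't care).
# The PLA's AND plane needs one row per *unique* product term, because a
# term shared by several outputs occupies a single row.
design_equations = {        # hypothetical next-state and output equations
    "Y1": {"1-0", "01-"},
    "Y2": {"1-0", "110"},
    "Z":  {"01-", "111"},
}

unique_terms = set().union(*design_equations.values())
pla_rows = len(unique_terms)  # rows required in the AND plane
```

Here six term slots collapse to four AND-plane rows because "1-0" and "01-" are each shared by two equations; a good state assignment increases exactly this kind of term sharing, which is what drives PLA area down.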
Integrated science and engineering for the OSIRIS-REx asteroid sample return mission
NASA Astrophysics Data System (ADS)
Lauretta, D.
2014-07-01
Introduction: The Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) asteroid sample return mission will survey near-Earth asteroid (101955) Bennu to understand its physical, mineralogical, and chemical properties, assess its resource potential, refine the impact hazard, and return a sample of this body to the Earth [1]. This mission is scheduled for launch in 2016 and will rendezvous with the asteroid in 2018. Sample return to the Earth follows in 2023. The OSIRIS-REx mission has the challenge of visiting asteroid Bennu, characterizing it at global and local scales, then selecting the best site on the asteroid surface to acquire a sample for return to the Earth. Minimizing the risk of exploring an unknown world requires a tight integration of science and engineering to inform flight system and mission design. Defining the Asteroid Environment: We have performed an extensive astronomical campaign in support of OSIRIS-REx. Lightcurve and phase function observations were obtained with UA Observatories telescopes located in southeastern Arizona during the 2005--2006 and 2011--2012 apparitions [2]. We observed Bennu using the 12.6-cm radar at the Arecibo Observatory in 1999, 2005, and 2011 and the 3.5-cm radar at the Goldstone tracking station in 1999 and 2005 [3]. We conducted near-infrared measurements using the NASA Infrared Telescope Facility at the Mauna Kea Observatory in Hawaii in September 2005 [4]. Additional spectral observations were obtained in July 2011 and May 2012 with the Magellan 6.5-m telescope [5]. We used the Spitzer space telescope to observe Bennu in May 2007 [6]. The extensive knowledge gained as a result of our telescopic characterization of Bennu was critical in the selection of this object as the OSIRIS-REx mission target. 
In addition, we use these data, combined with models of the asteroid, to constrain over 100 different asteroid parameters covering orbital, bulk, rotational, radar, photometric, spectroscopic, thermal, regolith, and asteroid environmental properties. We have captured this information in a mission configuration-controlled document called the Design Reference Asteroid. This information is used across the project to establish the environmental requirements for the flight system and for overall mission design. Maintaining a Pristine Sample: OSIRIS-REx is driven by the top-level science objective to return >60 g of pristine, carbonaceous regolith from asteroid Bennu. We define a "pristine sample" to mean that no foreign material introduced into the sample hampers our scientific analysis. Basically, we know that some contamination will take place --- we just have to document it so that we can subtract it from our analysis of the returned sample. Engineering contamination requirements specify cleanliness in terms of particle counts and thin-film residues --- scientists define it in terms of bulk elemental and organic abundances. After initial discussions with our Contamination Engineers, we agreed on known, albeit challenging, particle and thin-film contamination levels for the Touch-and-Go Sample Acquisition Mechanism (TAGSAM) and the Sample Return Capsule. These levels are achieved using established cleaning procedures while minimizing interferences for sample analysis. Selecting a Sample Site: The Sample Site Selection decision is based on four key data products: Deliverability, Safety, Sampleability, and Science Value Maps. Deliverability quantifies the probability that the Flight Dynamics team can deliver the spacecraft to the desired location on the asteroid surface. Safety maps assess candidate sites against the capabilities of the spacecraft. Sampleability requires an assessment of the asteroid surface properties vs. TAGSAM capabilities.
Scientific value maximizes the probability that the collected sample contains organics and volatiles and can be placed in a geological context definitive enough to determine sample history. Science and engineering teams work collaboratively to produce these key decision-making maps.
76 FR 30550 - Federal Management Regulation; Change in Consumer Price Index Minimal Value
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... Minimal Value AGENCY: Office of Governmentwide Policy, GSA. ACTION: Final rule. SUMMARY: Pursuant to 5 U.S.C. 7342, at three-year intervals following January 1, 1981, the minimal value for foreign gifts must... required consultation has been completed and the minimal value has been increased to $350 or less as of...
Optimal sampling strategies for detecting zoonotic disease epidemics.
Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W
2014-06-01
The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.
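A static building block underlying such surveillance designs is the probability of detecting at least one positive in a random sample. The paper's framework is dynamic, but this simpler standard calculation (not taken from the paper) shows why required sample sizes grow sharply at low prevalence:

```python
import math

def detection_probability(prevalence, n_samples):
    # Chance that at least one of n random samples is positive, assuming
    # a perfect test and independent draws from a large population.
    return 1.0 - (1.0 - prevalence) ** n_samples

def min_sample_size(prevalence, confidence=0.95):
    # Smallest n whose detection probability reaches the target confidence:
    # solve 1 - (1 - p)^n >= c for n.
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# At 2% prevalence, how many samples give 95% confidence of detection?
n = min_sample_size(0.02, 0.95)
```

At 2% prevalence, 149 samples are needed for 95% detection confidence; halving the prevalence roughly doubles the requirement, which is why allocating samples between host and vector populations matters.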
Zallman, Leah; Nardin, Rachel; Sayah, Assaad; McCormick, Danny
2015-10-29
Under the Massachusetts health reform, low income residents (those with incomes below 150 % of the Federal Poverty Level [FPL]) were eligible for Medicaid and health insurance exchange-based plans with minimal cost-sharing and no premiums. Those with slightly higher incomes (150 %-300 % FPL) were eligible for exchange-based plans that required cost-sharing and premium payments. We conducted face to face surveys in four languages with a convenience sample of 976 patients seeking care at three hospital emergency departments five years after Massachusetts reform. We compared perceived affordability of insurance, financial burden, and satisfaction among low cost sharing plan recipients (recipients of Medicaid and insurance exchange-based plans with minimal cost-sharing and no premiums), high cost sharing plan recipients (recipients of exchange-based plans that required cost-sharing and premium payments) and the commercially insured. We found that despite having higher incomes, higher cost-sharing plan recipients were less satisfied with their insurance plans and perceived more difficulty affording their insurance than those with low cost-sharing plans. Higher cost-sharing plan recipients also reported more difficulty affording medical and non-medical health care as well as insurance premiums than those with commercial insurance. In contrast, patients with low cost-sharing public plans reported higher plan satisfaction and less financial concern than the commercially insured. Policy makers with responsibility for the benefit design of public insurance available under health care reforms in the U.S. should calibrate cost-sharing to income level so as to minimize difficulty affording care and financial burdens.
Collecting Quality Infrared Spectra from Microscopic Samples of Suspicious Powders in a Sealed Cell.
Kammrath, Brooke W; Leary, Pauline E; Reffner, John A
2017-03-01
The infrared (IR) microspectroscopical analysis of samples within a sealed-cell containing barium fluoride is a critical need when identifying toxic agents or suspicious powders of unidentified composition. The dispersive nature of barium fluoride is well understood and experimental conditions can be easily adjusted during reflection-absorption measurements to account for differences in focus between the visible and IR regions of the spectrum. In most instances, the ability to collect a viable spectrum is possible when using the sealed cell regardless of whether visible or IR focus is optimized. However, when IR focus is optimized, it is possible to collect useful data from even smaller samples. This is important when a minimal sample is available for analysis or the desire to minimize risk of sample exposure is important. While the use of barium fluoride introduces dispersion effects that are unavoidable, it is possible to adjust instrument settings when collecting IR spectra in the reflection-absorption mode to compensate for dispersion and minimize impact on the quality of the sample spectrum.
A method to improve the range resolution in stepped frequency continuous wave radar
NASA Astrophysics Data System (ADS)
Kaczmarek, Paweł
2018-04-01
In the paper, one of the high range resolution methods, Aperture Sampling (AS), is analysed. Unlike MUSIC-based techniques, it proved very efficient at achieving an unambiguous synthetic range profile for an ultra-wideband stepped frequency continuous wave radar. Assuming that the minimal distance required to separate two targets in range corresponds to the -3 dB width of the received echo, AS provided a 30.8% improvement in range resolution in the analysed scenario compared to the results of applying the IFFT. The output data are far superior, in terms of both improved range resolution and reduced side-lobe level, to the Inverse Fourier Transform typically used in this area. Furthermore, the method does not require prior knowledge or an estimate of the number of targets to be detected in a given scan.
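The resolution gain reported above is measured against the classical stepped-frequency limits, which follow from two standard SFCW relations: range resolution is set by the total synthesized bandwidth, and unambiguous range by the frequency step. The sweep parameters below are hypothetical, chosen only to illustrate the formulas:

```python
C = 3e8  # speed of light, m/s

def sfcw_parameters(n_steps, freq_step_hz):
    # Stepped-frequency CW radar synthesizes a total bandwidth B = N * df.
    bandwidth = n_steps * freq_step_hz
    range_resolution = C / (2 * bandwidth)       # Delta_R = c / (2B)
    unambiguous_range = C / (2 * freq_step_hz)   # R_max  = c / (2 df)
    return range_resolution, unambiguous_range

# Hypothetical UWB sweep: 512 steps of 2 MHz (1.024 GHz total bandwidth).
dr, rmax = sfcw_parameters(512, 2e6)
```

The trade-off is visible directly: widening the frequency step improves nothing but shortens the unambiguous range, so super-resolution processing such as AS is attractive when the bandwidth is fixed.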
Experimental demonstration of cheap and accurate phase estimation
NASA Astrophysics Data System (ADS)
Rudinger, Kenneth; Kimmel, Shelby; Lobser, Daniel; Maunz, Peter
We demonstrate experimental implementation of robust phase estimation (RPE) to learn the phases of X and Y rotations on a trapped Yb+ ion qubit. Unlike many other phase estimation protocols, RPE requires neither ancillae nor near-perfect state preparation and measurement operations. Additionally, its computational requirements are minimal. Via RPE, using only 352 experimental samples per phase, we estimate phases of implemented gates with errors as small as 10^-4 radians, as validated using gate set tomography. We also demonstrate that these estimates exhibit Heisenberg scaling in accuracy. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Steady state method to determine unsaturated hydraulic conductivity at the ambient water potential
HUbbell, Joel M.
2014-08-19
The present invention relates to a new laboratory apparatus for measuring the unsaturated hydraulic conductivity at a single water potential. One or more embodiments of the invented apparatus can be used over a wide range of water potential values within the tensiometric range, requires minimal laboratory preparation, and operates unattended for extended periods with minimal supervision.
NASA Astrophysics Data System (ADS)
Wei, Yiping; Chen, Liru; Zhou, Wei; Chingin, Konstantin; Ouyang, Yongzhong; Zhu, Tenggao; Wen, Hua; Ding, Jianhua; Xu, Jianjun; Chen, Huanwen
2015-05-01
Tissue spray ionization mass spectrometry (TSI-MS) directly on small tissue samples has been shown to provide highly specific molecular information. In this study, we apply this method to the analysis of 38 pairs of human lung squamous cell carcinoma tissue (cancer) and adjacent normal lung tissue (normal). The main components of pulmonary surfactants, dipalmitoyl phosphatidylcholine (DPPC, m/z 757.47), palmitoyl-oleoyl phosphatidylcholine (POPC, m/z 782.52), dioleoyl phosphatidylcholine (DOPC, m/z 808.49), and stearoyl-arachidonoyl phosphatidylcholine (SAPC, m/z 832.43), were identified using high-resolution tandem mass spectrometry. Monte Carlo sampling partial least squares linear discriminant analysis (PLS-LDA) was used to distinguish full-mass-range mass spectra of cancer samples from the mass spectra of normal tissues. With 5 principal components and 30-40 Monte Carlo samplings, the accuracy of cancer identification in matched tissue samples reached 94.42%. Classification of a tissue sample required less than 1 min, which is much faster than the analysis of frozen sections. The rapid, in situ diagnosis with minimal sample consumption provided by TSI-MS allows surgeons to make more informed decisions during surgery.
Laser Induced Breakdown Spectroscopy of Glass and Crystal Samples
NASA Astrophysics Data System (ADS)
Sharma, Prakash; Sandoval, Alejandra; Carter, Michael; Kumar, Akshaya
2015-03-01
Different types of quartz crystals and rare-earth-ion-doped glasses have been identified using the laser-induced breakdown spectroscopy (LIBS) technique. LIBS is a real-time technique that can be used to identify samples in the solid, liquid, and gas phases. The advantage of the LIBS technique is that no sample preparation is required and the laser causes extremely minimal damage to the sample surface. The LIBS spectrum of silicate glasses, prepared by the sol-gel method and doped with different concentrations of rare earth ions, has been recorded. The limit of detection of rare earth ions in the glass samples has been calculated. A total of 10 spectra of each sample were recorded and then averaged to obtain a final spectrum. An Ocean Optics LIBS2500plus spectrometer along with a Q-switched Nd:YAG laser (Quantel, Big Sky) was used to record the LIBS spectra. This spectrometer can analyze samples in the spectral range of 200 nm to 980 nm. The spectra were processed with OOILIBS-plus (v1.0) software. This study has application in industry, where different crystals can be easily identified before they go for shaping and polishing. Also, the concentration of rare earth ions in glass can be monitored in real time for quality control.
Ashtiani, Dariush; Venugopal, Hari; Belousoff, Matthew; Spicer, Bradley; Mak, Johnson; Neild, Adrian; de Marco, Alex
2018-04-06
Cryo-Electron Microscopy (cryo-EM) has become an invaluable tool for structural biology. Over the past decade, the advent of direct electron detectors and automated data acquisition has established cryo-EM as a central method in structural biology. However, challenges remain in the reliable and efficient preparation of samples in a manner which is compatible with high time resolution. The delivery of sample onto the grid is recognized as a critical step in the workflow as it is a source of variability and loss of material due to the blotting which is usually required. Here, we present a method for sample delivery and plunge freezing based on the use of Surface Acoustic Waves to deploy 6-8 µm droplets to the EM grid. This method minimises the sample dead volume and ensures vitrification within 52.6 ms from the moment the sample leaves the microfluidics chip. We demonstrate a working protocol to minimize the atomised volume and apply it to plunge freeze three different samples and provide proof that no damage occurs due to the interaction between the sample and the acoustic waves. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Mukhopadhyay, V.
1988-01-01
A generic procedure for the parameter optimization of a digital control law for a large-order flexible flight vehicle or large space structure modeled as a sampled data system is presented. A linear quadratic Gaussian type cost function was minimized, while satisfying a set of constraints on the steady-state rms values of selected design responses, using a constrained optimization technique to meet multiple design requirements. Analytical expressions for the gradients of the cost function and the design constraints on mean square responses with respect to the control law design variables are presented.
Matching technique yields optimum LNA performance. [Low Noise Amplifiers
NASA Technical Reports Server (NTRS)
Sifri, J. D.
1986-01-01
The present article is concerned with a case in which an optimum noise figure and unconditional stability have been designed into a 2.385-GHz low-noise preamplifier via an unusual method for matching the input with a suspended line. The results obtained with several conventional line-matching techniques were not satisfactory. Attention is given to the minimization of thermal noise, the design procedure, requirements for a high-impedance line, a sampling of four matching networks, the noise figure of the single-line matching network as a function of frequency, and the approaches used to achieve unconditional stability.
Dunn, Kevin H.; Tsai, Candace Su-Jung; Woskie, Susan R.; Bennett, James S.; Garcia, Alberto; Ellenbecker, Michael J.
2015-01-01
The most commonly reported control used to minimize workplace exposures to nanomaterials is the chemical fume hood. Studies have shown, however, that significant releases of nanoparticles can occur when materials are handled inside fume hoods. This study evaluated the performance of a new commercially available nano fume hood using three different test protocols. Tracer gas, tracer nanoparticle, and nanopowder handling protocols were used to evaluate the hood. A static test procedure using tracer gas (sulfur hexafluoride) and nanoparticles as well as an active test using an operator handling nanoalumina were conducted. A commercially available particle generator was used to produce sodium chloride tracer nanoparticles. Containment effectiveness was evaluated by sampling both in the breathing zone (BZ) of a mannequin and operator as well as across the hood opening. These containment tests were conducted across a range of hood face velocities (60, 80, and 100 feet/minute) and with the room ventilation system turned off and on. For the tracer gas and tracer nanoparticle tests, leakage was much more prominent on the left side of the hood (closest to the room supply air diffuser) although some leakage was noted on the right side and in the BZ sample locations. During the tracer gas and tracer nanoparticle tests, leakage was primarily noted when the room air conditioner was on for both the low and medium hood exhaust air flows. When the room air conditioner was turned off, the static tracer gas tests showed good containment across most test conditions. The tracer gas and nanoparticle test results were well correlated showing hood leakage under the same conditions and at the same sample locations. The impact of a room air conditioner was demonstrated with containment being adversely impacted during the use of room air ventilation. The tracer nanoparticle approach is a simple method requiring minimal setup and instrumentation. 
However, the method requires a reduction in background concentrations to allow for increased sensitivity. PMID:25175285
Dunn, Kevin H; Tsai, Candace Su-Jung; Woskie, Susan R; Bennett, James S; Garcia, Alberto; Ellenbecker, Michael J
2014-01-01
The most commonly reported control used to minimize workplace exposures to nanomaterials is the chemical fume hood. Studies have shown, however, that significant releases of nanoparticles can occur when materials are handled inside fume hoods. This study evaluated the performance of a new commercially available nano fume hood using three different test protocols. Tracer gas, tracer nanoparticle, and nanopowder handling protocols were used to evaluate the hood. A static test procedure using tracer gas (sulfur hexafluoride) and nanoparticles as well as an active test using an operator handling nanoalumina were conducted. A commercially available particle generator was used to produce sodium chloride tracer nanoparticles. Containment effectiveness was evaluated by sampling both in the breathing zone (BZ) of a mannequin and operator as well as across the hood opening. These containment tests were conducted across a range of hood face velocities (60, 80, and 100 ft/min) and with the room ventilation system turned off and on. For the tracer gas and tracer nanoparticle tests, leakage was much more prominent on the left side of the hood (closest to the room supply air diffuser) although some leakage was noted on the right side and in the BZ sample locations. During the tracer gas and tracer nanoparticle tests, leakage was primarily noted when the room air conditioner was on for both the low and medium hood exhaust airflows. When the room air conditioner was turned off, the static tracer gas tests showed good containment across most test conditions. The tracer gas and nanoparticle test results were well correlated showing hood leakage under the same conditions and at the same sample locations. The impact of a room air conditioner was demonstrated with containment being adversely impacted during the use of room air ventilation. The tracer nanoparticle approach is a simple method requiring minimal setup and instrumentation. 
However, the method requires a reduction in background concentrations to allow for increased sensitivity.
Nikiforova, Marina N; Mercurio, Stephanie; Wald, Abigail I; Barbi de Moura, Michelle; Callenberg, Keith; Santana-Santos, Lucas; Gooding, William E; Yip, Linwah; Ferris, Robert L; Nikiforov, Yuri E
2018-04-15
Molecular tests have clinical utility for thyroid nodules with indeterminate fine-needle aspiration (FNA) cytology, although their performance requires further improvement. This study evaluated the analytical performance of the newly created ThyroSeq v3 test. ThyroSeq v3 is a DNA- and RNA-based next-generation sequencing assay that analyzes 112 genes for a variety of genetic alterations, including point mutations, insertions/deletions, gene fusions, copy number alterations, and abnormal gene expression, and it uses a genomic classifier (GC) to separate malignant lesions from benign lesions. It was validated in 238 tissue samples and 175 FNA samples with known surgical follow-up. Analytical performance studies were conducted. In the training tissue set of samples, ThyroSeq GC detected more than 100 genetic alterations, including BRAF, RAS, TERT, and DICER1 mutations, NTRK1/3, BRAF, and RET fusions, 22q loss, and gene expression alterations. GC cutoffs were established to distinguish cancer from benign nodules with 93.9% sensitivity, 89.4% specificity, and 92.1% accuracy. This correctly classified most papillary, follicular, and Hurthle cell lesions, medullary thyroid carcinomas, and parathyroid lesions. In the FNA validation set, the GC sensitivity was 98.0%, the specificity was 81.8%, and the accuracy was 90.9%. Analytical accuracy studies demonstrated a minimal required nucleic acid input of 2.5 ng, a 12% minimal acceptable tumor content, and reproducible test results under variable stress conditions. The ThyroSeq v3 GC analyzes 5 different classes of molecular alterations and provides high accuracy for detecting all common types of thyroid cancer and parathyroid lesions. The analytical sensitivity, specificity, and robustness of the test have been successfully validated and indicate its suitability for clinical use. Cancer 2018;124:1682-90. © 2018 American Cancer Society.
High-concentration zeta potential measurements using light-scattering techniques
Kaszuba, Michael; Corbett, Jason; Watson, Fraser Mcneil; Jones, Andrew
2010-01-01
Zeta potential is the key parameter that controls electrostatic interactions in particle dispersions. Laser Doppler electrophoresis is an accepted method for the measurement of particle electrophoretic mobility and hence zeta potential of dispersions of colloidal size materials. Traditionally, samples measured by this technique have to be optically transparent. Therefore, depending upon the size and optical properties of the particles, many samples will be too concentrated and will require dilution. The ability to measure samples at or close to their neat concentration would be desirable as it would minimize any changes in the zeta potential of the sample owing to dilution. However, the ability to measure turbid samples using light-scattering techniques presents a number of challenges. This paper discusses electrophoretic mobility measurements made on turbid samples at high concentration using a novel cell with reduced path length. Results are presented on two different sample types, titanium dioxide and a polyurethane dispersion, as a function of sample concentration. For both of the sample types studied, the electrophoretic mobility results show a gradual decrease as the sample concentration increases and the possible reasons for these observations are discussed. Further, a comparison of the data against theoretical models is presented and discussed. Conclusions and recommendations are made from the zeta potential values obtained at high concentrations. PMID:20732896
NASA Technical Reports Server (NTRS)
Fries, M. D.; Allen, C. C.; Calaway, M. J.; Evans, C. A.; Stansbery, E. K.
2015-01-01
Curation of NASA's astromaterials sample collections is a demanding and evolving activity that supports valuable science from NASA missions for generations, long after the samples are returned to Earth. For example, NASA continues to loan hundreds of Apollo program samples to investigators every year and those samples are often analyzed using instruments that did not exist at the time of the Apollo missions themselves. The samples are curated in a manner that minimizes overall contamination, enabling clean, new high-sensitivity measurements and new science results over 40 years after their return to Earth. As our exploration of the Solar System progresses, upcoming and future NASA sample return missions will return new samples with stringent contamination control, sample environmental control, and Planetary Protection requirements. Therefore, an essential element of a healthy astromaterials curation program is a research and development (R&D) effort that characterizes and employs new technologies to maintain current collections and enable new missions - an Advanced Curation effort. JSC's Astromaterials Acquisition & Curation Office is continually performing Advanced Curation research, identifying and defining knowledge gaps about research, development, and validation/verification topics that are critical to support current and future NASA astromaterials sample collections. The following are highlighted knowledge gaps and research opportunities.
Rapid determination of 228Ra in environmental samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Sherrod L.; Culligan, Brian K.
A new rapid method for the determination of 228Ra in natural water samples has been developed at the SRNL/EBL (Savannah River National Laboratory/Environmental Bioassay Laboratory) that can be used for emergency response or routine samples. While gamma spectrometry can be employed with sufficient detection limits to determine 228Ra in solid samples (via 228Ac), radiochemical methods that employ gas flow proportional counting techniques typically provide lower MDA (Minimum Detectable Activity) levels for the determination of 228Ra in water samples. Most radiochemical methods for 228Ra collect and purify 228Ra and then allow for 228Ac daughter ingrowth for ~36 hours. In this new SRNL/EBL approach, 228Ac is collected and purified from the water sample directly, eliminating this delay. The sample preparation requires only about 4 hours, so 228Ra assay results on water samples can be achieved in <6 hours. The method uses a rapid calcium carbonate precipitation with a small amount of phosphate added to enhance chemical yields (typically >90%), followed by rapid cation exchange removal of calcium. Lead, bismuth, uranium, thorium, and protactinium isotopes are also removed by the cation exchange separation. 228Ac is eluted from the cation resin directly onto a DGA Resin cartridge attached to the bottom of the cation column to purify the 228Ac. DGA Resin also removes lead and bismuth isotopes, along with Sr isotopes and 90Y. La is used to determine the 228Ac chemical yield via ICP-MS, but 133Ba can be used instead if ICP-MS assay is not available. Unlike some older methods, no lead or strontium holdback carriers or continual readjustment of sample pH is required.
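The ~36-hour delay that this method eliminates comes from waiting for 228Ac to grow back toward secular equilibrium with 228Ra. As an illustrative sketch (not from the paper; function name and the commonly cited 228Ac half-life of about 6.15 h are assumptions), the ingrown fraction follows first-order kinetics:

```python
import math

AC228_HALF_LIFE_H = 6.15  # commonly cited Ac-228 half-life, in hours

def ac228_ingrowth_fraction(t_hours: float) -> float:
    """Fraction of the secular-equilibrium Ac-228 activity present
    t_hours after a pure Ra-228 separation: 1 - exp(-lambda * t)."""
    lam = math.log(2) / AC228_HALF_LIFE_H
    return 1.0 - math.exp(-lam * t_hours)

# After ~36 h (about six half-lives) ingrowth is essentially complete,
# which is why older methods wait that long before counting.
print(round(ac228_ingrowth_fraction(36.0), 3))  # close to 1
```

By collecting the already-ingrown 228Ac instead of waiting for a fresh ingrowth, the assay avoids this six-half-life delay entirely.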
Irradiation treatment of minimally processed carrots for ensuring microbiological safety
NASA Astrophysics Data System (ADS)
Ashraf Chaudry, Muhammad; Bibi, Nizakat; Khan, Misal; Khan, Maazullah; Badshah, Amal; Jamil Qureshi, Muhammad
2004-09-01
Minimally processed fruits and vegetables are very common in developed countries and are gaining popularity in developing countries due to their convenience and freshness. However, minimal processing may result in undesirable changes in colour, taste and appearance due to the transfer of microbes from the skin to the flesh. Irradiation is a well-known technology for the elimination of microbial contamination. Food irradiation has been approved by 50 countries and is applied commercially in the USA. The purpose of this study was to evaluate the effect of irradiation on the quality of minimally processed carrots. Fresh carrots were peeled, sliced and packaged in polyethylene. The samples were irradiated (0, 0.5, 1.0, 2.0, 2.5 and 3.0 kGy) and stored at 5°C for 2 weeks. The samples were analyzed for hardness, organoleptic acceptance and microbial load on days 0, 7 and 15. The mean firmness of the control and all irradiated samples remained between 4.31 and 4.42 kg of force, showing no adverse effect of radiation dose. The effect of storage (2 weeks) was significant (P<0.05), with values ranging between 4.28 and 4.39 kg of force. After 2 weeks of storage at 5°C, total bacterial counts were 6.3×10⁵ cfu/g for non-irradiated samples, 3.0×10² cfu/g for samples irradiated at 0.5 kGy, and only a few colonies (<10) for all other irradiated samples (1.0, 2.0, 2.5 and 3.0 kGy). No coliforms or E. coli were detected in any of the samples (irradiated or control), either immediately after irradiation or during the entire storage period. A dose of 2.0 kGy completely controlled the fungal and bacterial counts, and the 2.0 kGy samples were also sensorially acceptable.
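The reported counts translate directly into the decimal (log10) reductions conventionally used to compare treatment doses. A minimal sketch using the abstract's own numbers (the function name is hypothetical):

```python
import math

def log10_reduction(n0_cfu_per_g: float, n_cfu_per_g: float) -> float:
    """Decimal (log10) reduction in viable count achieved by a treatment."""
    return math.log10(n0_cfu_per_g / n_cfu_per_g)

# Control vs 0.5 kGy after 2 weeks at 5 deg C, per the reported counts:
print(round(log10_reduction(6.3e5, 3.0e2), 2))  # roughly a 3.3-log reduction
```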
Wu, Mixia; Shu, Yu; Li, Zhaohai; Liu, Aiyi
2016-01-01
A sequential design is proposed to test whether the accuracy of a binary diagnostic biomarker meets the minimal level of acceptance. The accuracy of a binary diagnostic biomarker is a linear combination of the marker’s sensitivity and specificity. The objective of the sequential method is to minimize the maximum expected sample size under the null hypothesis that the marker’s accuracy is below the minimal level of acceptance. The exact results of two-stage designs based on Youden’s index and efficiency indicate that the maximum expected sample sizes are smaller than the sample sizes of the fixed designs. Exact methods are also developed for estimation, confidence interval and p-value concerning the proposed accuracy index upon termination of the sequential testing. PMID:26947768
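The accuracy index described above is a linear combination of sensitivity and specificity, of which Youden's index (J = Se + Sp − 1) is the best-known special case. A hedged sketch of these standard definitions (the counts are hypothetical, not from the study):

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

def youden_index(se: float, sp: float) -> float:
    """Youden's J statistic: Se + Sp - 1, from 0 (useless) to 1 (perfect)."""
    return se + sp - 1.0

def weighted_accuracy(se: float, sp: float, w: float = 0.5) -> float:
    """General linear combination of Se and Sp; w weights sensitivity."""
    return w * se + (1.0 - w) * sp

# Hypothetical 2x2 counts: 45/5 diseased, 40/10 non-diseased
se, sp = sensitivity(45, 5), specificity(40, 10)
print(youden_index(se, sp), weighted_accuracy(se, sp))
```

A minimal acceptance level for the marker then becomes a threshold on this combined index, which is what the sequential test decides against.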
Satkunasivam, Raj; Tsai, Sheaumei; Syan, Sumeet; Bernhard, Jean-Christophe; de Castro Abreu, Andre Luis; Chopra, Sameer; Berger, Andre K; Lee, Dennis; Hung, Andrew J; Cai, Jie; Desai, Mihir M; Gill, Inderbir S
2015-10-01
Anatomic partial nephrectomy (PN) techniques aim to decrease or eliminate global renal ischemia. To report the technical feasibility of completely unclamped "minimal-margin" robotic PN. We also illustrate the stepwise evolution of anatomic PN surgery with related outcomes data. This study was a retrospective analysis of 179 contemporary patients undergoing anatomic PN at a tertiary academic institution between October 2009 and February 2013. Consecutive consented patients were grouped into three cohorts: group 1, with superselective clamping and developmental-curve experience (n = 70); group 2, with superselective clamping and mature experience (n = 60); and group 3, which had completely unclamped, minimal-margin PN (n = 49). Patients in groups 1 and 2 underwent superselective tumor-specific devascularization, whereas patients in group 3 underwent completely unclamped minimal-margin PN adjacent to the tumor edge, a technique that takes advantage of the radially oriented intrarenal architecture and anatomy. Primary outcomes assessed the technical feasibility of robotic, completely unclamped, minimal-margin PN; short-term changes in estimated glomerular filtration rate (eGFR); and development of new-onset chronic kidney disease (CKD) stage >3. Secondary outcome measures included perioperative variables, 30-d complications, and histopathologic outcomes. Demographic data were similar among groups. For similarly sized tumors (p = 0.13), percentage of kidney preserved was greater (p = 0.047) and margin width was narrower (p = 0.0004) in group 3. In addition, group 3 had less blood loss (200, 225, and 150ml; p = 0.04), lower transfusion rates (21%, 23%, and 4%; p = 0.008), and shorter hospital stay (p = 0.006), whereas operative time and 30-d complication rates were similar. 
At 1-mo postoperatively, median percentage reduction in eGFR was similar (7.6%, 0%, and 3.0%; p = 0.53); however, new-onset CKD stage >3 occurred less frequently in group 3 (23%, 10%, and 2%; p = 0.003). Study limitations included retrospective analysis, small sample size, and short follow-up. We developed an anatomically based technique of robotic, unclamped, minimal-margin PN. This evolution from selective clamped to unclamped PN may further optimize functional outcomes but requires external validation and longer follow-up. The technical evolution of partial nephrectomy surgery is aimed at eliminating global renal damage from the cessation of blood flow. An unclamped minimal-margin technique is described and may offer renal functional advantage but requires long-term follow-up and validation at other institutions. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
2011-01-01
Background: One of the most critical problems in antimicrobial therapy is increasing resistance to antibiotics. Previous studies have shown a direct relation between erroneous prescription, dosage, route and duration of therapy and antibiotic resistance. Another important issue is uncertainty about the quality of the prescribed medicines. Some physicians believe that generic drugs are not as effective as innovator products, so it is very important to have evidence that all commercialized drugs are suitable for therapeutic use. Methods: Microbial assays were used to establish the potency, the minimal inhibitory concentrations (MICs), the minimal bactericidal concentrations (MBCs), the critical concentrations, and the production of spontaneous mutants resistant to vancomycin. Results: The microbial assay was validated in order to determine the vancomycin potency of the tested samples. All products had potency values between 90 and 115% (USP requirement). The products behaved similarly: the MICs, the MBCs, the critical concentrations, the ratios of critical concentrations between standard and samples, and the production of spontaneous mutants showed no significant differences. Conclusions: For all products analyzed, the microbiological tests show that trademark and generic formulations have no statistical variability in antimicrobial activity and are pharmaceutical equivalents. PMID:21777438
Diaz, Jorge A; Silva, Edelberto; Arias, Maria J; Garzón, María
2011-07-21
One of the most critical problems in antimicrobial therapy is increasing resistance to antibiotics. Previous studies have shown a direct relation between erroneous prescription, dosage, route and duration of therapy and antibiotic resistance. Another important issue is uncertainty about the quality of the prescribed medicines. Some physicians believe that generic drugs are not as effective as innovator products, so it is very important to have evidence that all commercialized drugs are suitable for therapeutic use. Microbial assays were used to establish the potency, the minimal inhibitory concentrations (MICs), the minimal bactericidal concentrations (MBCs), the critical concentrations, and the production of spontaneous mutants resistant to vancomycin. The microbial assay was validated in order to determine the vancomycin potency of the tested samples. All products had potency values between 90 and 115% (USP requirement). The products behaved similarly: the MICs, the MBCs, the critical concentrations, the ratios of critical concentrations between standard and samples, and the production of spontaneous mutants showed no significant differences. For all products analyzed, the microbiological tests show that trademark and generic formulations have no statistical variability in antimicrobial activity and are pharmaceutical equivalents.
Poritz, Mark A.; Blaschke, Anne J.; Byington, Carrie L.; Meyers, Lindsay; Nilsson, Kody; Jones, David E.; Thatcher, Stephanie A.; Robbins, Thomas; Lingenfelter, Beth; Amiott, Elizabeth; Herbener, Amy; Daly, Judy; Dobrowolski, Steven F.; Teng, David H. -F.; Ririe, Kirk M.
2011-01-01
The ideal clinical diagnostic system should deliver rapid, sensitive, specific and reproducible results while minimizing the requirements for specialized laboratory facilities and skilled technicians. We describe an integrated diagnostic platform, the “FilmArray”, which fully automates the detection and identification of multiple organisms from a single sample in about one hour. An unprocessed biologic/clinical sample is subjected to nucleic acid purification, reverse transcription, a high-order nested multiplex polymerase chain reaction and amplicon melt curve analysis. Biochemical reactions are enclosed in a disposable pouch, minimizing the PCR contamination risk. FilmArray has the potential to detect greater than 100 different nucleic acid targets at one time. These features make the system well-suited for molecular detection of infectious agents. Validation of the FilmArray technology was achieved through development of a panel of assays capable of identifying 21 common viral and bacterial respiratory pathogens. Initial testing of the system using both cultured organisms and clinical nasal aspirates obtained from children demonstrated an analytical and clinical sensitivity and specificity comparable to existing diagnostic platforms. We demonstrate that automated identification of pathogens from their corresponding target amplicon(s) can be accomplished by analysis of the DNA melting curve of the amplicon. PMID:22039434
redMaGiC: selecting luminous red galaxies from the DES Science Verification data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rozo, E.
We introduce redMaGiC, an automated algorithm for selecting Luminous Red Galaxies (LRGs). The algorithm was developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the color cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. Additionally, we demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine-learning based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalog sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10⁻³ (h⁻¹ Mpc)⁻³, and a median photo-z bias (z_spec − z_photo) and scatter (σ_z/(1+z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4%. We also test our algorithm with Sloan Digital Sky Survey (SDSS) Data Release 8 (DR8) and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1% level.
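The quoted figures (median bias, scatter σ_z/(1+z), 5σ outlier fraction) are standard photo-z summary statistics. A hedged sketch of how such metrics are commonly computed (the paper's exact estimators may differ; the MAD-based robust scatter is an assumption on my part):

```python
import numpy as np

def photoz_metrics(z_spec, z_phot, outlier_sigma=5.0):
    """Median bias (z_phot - z_spec), robust normalized scatter
    sigma_z/(1+z) via the scaled median absolute deviation, and the
    fraction of objects beyond outlier_sigma * sigma."""
    z_spec, z_phot = np.asarray(z_spec), np.asarray(z_phot)
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    bias = np.median(z_phot - z_spec)
    # MAD scaled by 1.4826 matches the Gaussian standard deviation
    sigma = 1.4826 * np.median(np.abs(dz - np.median(dz)))
    outlier_fraction = np.mean(np.abs(dz) > outlier_sigma * sigma)
    return bias, sigma, outlier_fraction
```

On synthetic Gaussian errors of scale 0.017(1+z), this recovers sigma ≈ 0.017 with near-zero bias, mirroring how the catalog-level numbers above are summarized.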
Rapid detection of potyviruses from crude plant extracts.
Silva, Gonçalo; Oyekanmi, Joshua; Nkere, Chukwuemeka K; Bömer, Moritz; Kumar, P Lava; Seal, Susan E
2018-04-01
Potyviruses (genus Potyvirus; family Potyviridae) are widely distributed and represent one of the most economically important genera of plant viruses. Therefore, their accurate detection is a key factor in developing efficient control strategies. However, this can sometimes be problematic particularly in plant species containing high amounts of polysaccharides and polyphenols such as yam (Dioscorea spp.). Here, we report the development of a reliable, rapid and cost-effective detection method for the two most important potyviruses infecting yam based on reverse transcription-recombinase polymerase amplification (RT-RPA). The developed method, named 'Direct RT-RPA', detects each target virus directly from plant leaf extracts prepared with a simple and inexpensive extraction method avoiding laborious extraction of high-quality RNA. Direct RT-RPA enables the detection of virus-positive samples in under 30 min at a single low operation temperature (37 °C) without the need for any expensive instrumentation. The Direct RT-RPA tests constitute robust, accurate, sensitive and quick methods for detection of potyviruses from recalcitrant plant species. The minimal sample preparation requirements and the possibility of storing RPA reagents without cold chain storage, allow Direct RT-RPA to be adopted in minimally equipped laboratories and with potential use in plant clinic laboratories and seed certification facilities worldwide. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Technical and financial evaluation of assays for progesterone in canine practice in the UK.
Moxon, R; Copley, D; England, G C W
2010-10-02
The concentration of progesterone was measured in 60 plasma samples from bitches at various stages of the oestrous cycle, using commercially available quantitative and semi-quantitative ELISA test kits, as well as by two commercial laboratories undertaking radioimmunoassay (RIA). The RIA, which was assumed to be the 'gold standard' in terms of reliability and accuracy, was the most expensive method when analysing more than one sample per week, and had the longest delay in obtaining results, but had minimal requirements for practice staff time. When compared with the RIA, the quantitative ELISA had a strong positive correlation (r=0.97, P<0.05) and a sensitivity and specificity of 70.6 per cent and 100.0 per cent, respectively, and positive and negative predictive values of 100.0 per cent and 71.0 per cent, respectively, with an overall accuracy of 90.0 per cent. This method was the least expensive when analysing five or more samples per week, but had longer turnaround times than that of the semi-quantitative ELISA and required more staff time. When compared with the RIA, the semi-quantitative ELISA had a sensitivity and specificity of 100.0 per cent and 95.5 per cent, respectively, and positive and negative predictive values of 73.9 per cent and 77.8 per cent, respectively, with an overall accuracy of 89.2 per cent. This method was more expensive than the quantitative ELISA when analysing five or more samples per week, but had the shortest turnaround time and low requirements in terms of staff time.
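The comparison metrics quoted above (sensitivity, specificity, positive and negative predictive values, overall accuracy) all derive from a 2×2 confusion matrix against the gold-standard RIA. A generic sketch with hypothetical counts (not the study's data):

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard binary diagnostic-test metrics from a 2x2 confusion
    matrix, with the reference assay as ground truth."""
    return {
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts for illustration:
print(diagnostic_metrics(tp=40, fp=5, tn=45, fn=10))
```

Note that, as the ELISA results above illustrate, a test can pair perfect specificity with modest sensitivity (or vice versa) while still achieving high overall accuracy, so all five numbers are needed to judge a kit.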
Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie
2013-01-01
A contactless atmospheric pressure ionization (C-API) method has recently been developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and the spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, while the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet in the well is readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, a sample spray readily forms in the proximity of the mass spectrometer, to which a high electric field is applied. The gas-phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, wells containing rinsing solvent are arranged alternately between the sample wells, so the C-API capillary can be readily flushed between runs. No carryover problems are observed during the analyses. The sample volume required for C-API MS analysis is minimal: less than 1 nL of sample solution is sufficient. The feasibility of using this setup for quantitative analysis is also demonstrated.
Smartphone-Based Food Diagnostic Technologies: A Review.
Rateni, Giovanni; Dario, Paolo; Cavallo, Filippo
2017-06-20
A new generation of mobile sensing approaches offers significant advantages over traditional platforms in terms of test speed, control, low cost, ease-of-operation, and data management, and requires minimal equipment and user involvement. The marriage of novel sensing technologies with cellphones enables the development of powerful lab-on-smartphone platforms for many important applications including medical diagnosis, environmental monitoring, and food safety analysis. This paper reviews the recent advancements and developments in the field of smartphone-based food diagnostic technologies, with an emphasis on custom modules to enhance smartphone sensing capabilities. These devices typically comprise multiple components such as detectors, sample processors, disposable chips, batteries and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. To date, researchers have demonstrated several promising approaches employing various sensing techniques and device configurations. We aim to provide a systematic classification according to the detection strategy, providing a critical discussion of strengths and weaknesses. We have also extended the analysis to the food scanning devices that are increasingly populating the Internet of Things (IoT) market, demonstrating how this field is indeed promising, as the research outputs are quickly capitalized on new start-up companies.
Smartphone-Based Food Diagnostic Technologies: A Review
Rateni, Giovanni; Dario, Paolo; Cavallo, Filippo
2017-01-01
A new generation of mobile sensing approaches offers significant advantages over traditional platforms in terms of test speed, control, low cost, ease-of-operation, and data management, and requires minimal equipment and user involvement. The marriage of novel sensing technologies with cellphones enables the development of powerful lab-on-smartphone platforms for many important applications including medical diagnosis, environmental monitoring, and food safety analysis. This paper reviews the recent advancements and developments in the field of smartphone-based food diagnostic technologies, with an emphasis on custom modules to enhance smartphone sensing capabilities. These devices typically comprise multiple components such as detectors, sample processors, disposable chips, batteries and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. To date, researchers have demonstrated several promising approaches employing various sensing techniques and device configurations. We aim to provide a systematic classification according to the detection strategy, providing a critical discussion of strengths and weaknesses. We have also extended the analysis to the food scanning devices that are increasingly populating the Internet of Things (IoT) market, demonstrating how this field is indeed promising, as the research outputs are quickly capitalized on new start-up companies. PMID:28632188
Smartphones for cell and biomolecular detection.
Liu, Xiyuan; Lin, Tung-Yi; Lillehoj, Peter B
2014-11-01
Recent advances in biomedical science and technology have played a significant role in the development of new sensors and assays for cell and biomolecular detection. Generally, these efforts are aimed at reducing the complexity and costs associated with diagnostic testing so that it can be performed outside of a laboratory or hospital setting, requiring minimal equipment and user involvement. In particular, point-of-care (POC) testing offers immense potential for many important applications including medical diagnosis, environmental monitoring, food safety, and biosecurity. When coupled with smartphones, POC systems can offer portability, ease of use and enhanced functionality while maintaining performance. This review article focuses on recent advancements and developments in smartphone-based POC systems within the last 6 years with an emphasis on cell and biomolecular detection. These devices typically comprise multiple components, such as detectors, sample processors, disposable chips, batteries, and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. Researchers have demonstrated several promising approaches employing various detection schemes and device configurations, and it is expected that further developments in biosensors, battery technology and miniaturized electronics will enable smartphone-based POC technologies to become more mainstream tools in the scientific and biomedical communities.
Optimal probes for withdrawal of uncontaminated fluid samples
NASA Astrophysics Data System (ADS)
Sherwood, J. D.
2005-08-01
Withdrawal of fluid by a composite probe pushed against the face z =0 of a porous half-space z >0 is modeled assuming incompressible Darcy flow. The probe is circular, of radius a, with an inner sampling section of radius αa and a concentric outer guard probe αa
NASA Astrophysics Data System (ADS)
Yuan, Chao; Chareyre, Bruno; Darve, Félix
2016-09-01
A pore-scale model is introduced for two-phase flow in dense packings of polydisperse spheres. The model is developed as a component of a more general hydromechanical coupling framework based on the discrete element method, which will be elaborated in future papers and will apply to various processes of interest in soil science, in geomechanics, and in oil and gas production. Here the emphasis is on the generation of a network of pores mapping the void space between spherical grains, and on the definition of local criteria governing the primary drainage process. The pore space is decomposed by Regular Triangulation, from which a set of pores connected by throats is identified. A local entry capillary pressure is evaluated for each throat, based on the balance of capillary pressure and surface tension at equilibrium. The model reflects the possible entrapment of disconnected patches of the receding wetting phase. It is validated by comparison with drainage experiments. In the last part of the paper, a series of simulations is reported to illustrate size and boundary effects, key questions when studying small samples made of spherical particles, be it in simulations or experiments. Repeated tests on samples of different sizes yield water-content evolutions that are not only scattered but also strongly biased for small sample sizes. More than 20,000 spheres are needed to reduce the bias on saturation below 0.02. Additional statistics are generated by subsampling a large sample of 64,000 spheres. They suggest that the minimal sampling volume for evaluating saturation is one hundred times greater than the sampling volume needed for measuring porosity with the same accuracy. This requirement in terms of sample size induces a need for efficient computer codes. The method described herein has a low algorithmic complexity in order to satisfy this requirement. It will be well suited to further developments toward coupled flow-deformation problems in which evolution of the microstructure requires frequent updates of the pore network.
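The scaling behind the subsampling result can be illustrated with a crude standard-library sketch (not the paper's pore-network model): treating a saturation estimate as a binomial fraction over N pores, the standard error falls as 1/sqrt(N), so tightening an accuracy target tenfold multiplies the required sample size by one hundred. All numbers are illustrative.

```python
import math

def min_samples_for_stderr(p, target_se):
    """Smallest N such that the binomial standard error sqrt(p*(1-p)/N)
    does not exceed target_se."""
    return math.ceil(p * (1.0 - p) / target_se ** 2)

# A saturation near 0.5 maximizes the per-sample variance of the estimate.
n_loose = min_samples_for_stderr(0.5, 0.02)    # accuracy target from the abstract
n_tight = min_samples_for_stderr(0.5, 0.002)   # ten times tighter
print(n_loose, n_tight, n_tight // n_loose)    # 625 62500 100
```

The 1/sqrt(N) law is only a rough analogy for correlated pore networks, but it conveys why matching accuracy across quantities with different per-pore variance demands vastly larger sampling volumes.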
Barcoding of live human peripheral blood mononuclear cells for multiplexed mass cytometry.
Mei, Henrik E; Leipold, Michael D; Schulz, Axel Ronald; Chester, Cariad; Maecker, Holden T
2015-02-15
Mass cytometry is developing as a means of multiparametric single-cell analysis. In this study, we present an approach to barcoding separate live human PBMC samples for combined preparation and acquisition on a cytometry by time-of-flight (CyTOF) instrument. Using six different anti-CD45 Ab conjugates labeled with Pd104, Pd106, Pd108, Pd110, In113, and In115, respectively, we barcoded up to 20 samples with unique combinations of exactly three different CD45 Ab tags. Cell events carrying more than or fewer than three different tags were excluded from analyses during Boolean data deconvolution, allowing for precise sample assignment and the electronic removal of cell aggregates. Data from barcoded samples matched data from corresponding individually stained and acquired samples, at cell event recoveries similar to individual sample analyses. The approach greatly reduces technical noise and minimizes unwanted cell doublet events in mass cytometry data, and it reduces wet work and Ab consumption. It also eliminates sample-to-sample carryover and the requirement of instrument cleaning between samples, thereby effectively reducing overall instrument runtime. Hence, CD45 barcoding improves the accuracy of mass cytometric immunophenotyping studies, thus supporting biomarker discovery efforts, and it should be applicable to fluorescence flow cytometry as well. Copyright © 2015 by The American Association of Immunologists, Inc.
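The combinatorics of the 3-of-6 scheme can be sketched in a few lines of Python. This is a schematic illustration, not the authors' deconvolution software; the tag names follow the abstract, while the function names are invented.

```python
from itertools import combinations

TAGS = ["Pd104", "Pd106", "Pd108", "Pd110", "In113", "In115"]

# Every unique combination of exactly three tags defines one sample barcode.
barcodes = list(combinations(TAGS, 3))
print(len(barcodes))  # 6 choose 3 = 20 samples per batch

def assign_sample(positive_tags):
    """Boolean deconvolution: keep only events carrying exactly three known
    tags; doublets (>3 tags) and debris (<3 tags) are rejected as None."""
    key = tuple(t for t in TAGS if t in positive_tags)
    return key if len(positive_tags) == 3 and key in barcodes else None

print(assign_sample({"Pd104", "Pd106", "In115"}))           # a valid 3-tag event
print(assign_sample({"Pd104", "Pd106", "Pd108", "Pd110"}))  # doublet -> None
```

Requiring exactly three positive tags is what lets aggregates be removed electronically: a doublet of two differently barcoded cells carries four to six tags and matches no barcode.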
Effect of Processing on Silk-Based Biomaterials: Reproducibility and Biocompatibility
Wray, Lindsay S.; Hu, Xiao; Gallego, Jabier; Georgakoudi, Irene; Omenetto, Fiorenzo G.; Schmidt, Daniel; Kaplan, David L.
2012-01-01
Silk fibroin has been successfully used as a biomaterial for tissue regeneration. In order to prepare silk fibroin biomaterials for human implantation, a series of processing steps is required to purify the protein. Degumming to remove inflammatory sericin is a crucial step related to biocompatibility and variability in the material. Detailed characterization of silk fibroin degumming is reported. The degumming conditions significantly affected cell viability on the silk fibroin material and the ability to form three-dimensional porous scaffolds from the silk fibroin, but did not affect macrophage activation or β-sheet content in the materials formed. Methods are also provided to determine the content of residual sericin in silk fibroin solutions and to assess changes in silk fibroin molecular weight. Amino acid composition analysis was used to detect sericin residuals in silk solutions with a detection limit between 1.0% and 10% wt/wt, while fluorescence spectroscopy was used to reproducibly distinguish between silk samples with different molecular weights. Both methods are simple and require minimal sample volume, providing useful quality control tools for silk fibroin preparation processes. PMID:21695778
Cutting-edge analysis of extracellular microparticles using ImageStream(X) imaging flow cytometry.
Headland, Sarah E; Jones, Hefin R; D'Sa, Adelina S V; Perretti, Mauro; Norling, Lucy V
2014-06-10
Interest in extracellular vesicle biology has exploded in the past decade, since these microstructures seem endowed with multiple roles, from blood coagulation to inter-cellular communication in pathophysiology. In order for microparticle research to evolve as a preclinical and clinical tool, accurate quantification of microparticle levels is a fundamental requirement, but their size and the complexity of sample fluids present major technical challenges. Flow cytometry is commonly used, but suffers from low sensitivity and accuracy. Use of the Amnis ImageStream(X) Mk II imaging flow cytometer afforded accurate analysis of calibration beads ranging from 1 μm down to 20 nm, and of microparticles, which could be observed and quantified in whole blood, platelet-rich and platelet-free plasma, and in leukocyte supernatants. Another advantage was the minimal sample preparation and volume required. Use of this high throughput analyzer allowed simultaneous phenotypic definition of the parent cells and offspring microparticles along with real time microparticle generation kinetics. With the current paucity of reliable techniques for the analysis of microparticles, we propose that the ImageStream(X) could be used effectively to advance this scientific field.
Beaghler, M; Grasso, M
1994-11-01
Routine urothelial biopsies of the lower urinary tract are obtained using the cold cup biopsy technique. This procedure is most often performed in the surgical suite and requires rigid endoscopic access and the use of biopsy forceps and Bugbee electrodes to obtain tissue for histologic examination. A new single-step biopsy forceps has been used through the flexible cystoscope. Using a 16 F actively deflectable, flexible cystoscope and the 5.4 F Therma Jaw Hot Urologic Forceps, bladder biopsies were obtained in 27 patients for a variety of indications. This biopsy forceps allows simultaneous tissue sampling and electrocoagulation of the biopsy site, thus eliminating the need for exchange of instruments through the flexible cystoscope. Tissue samples are somewhat protected from thermal changes during coagulation through the use of a Faraday cage. Biopsies were frequently obtained in an outpatient setting, requiring only local topical anesthesia (2% lidocaine jelly). Carcinoma in situ, transitional cell carcinoma, acute and chronic inflammation, and normal bladder mucosa were differentiated histologically. Using this technique, lower urinary tract urothelial mapping can be performed safely in the office with minimal patient discomfort.
Villanelli, Fabio; Giocaliere, Elisa; Malvagia, Sabrina; Rosati, Anna; Forni, Giulia; Funghini, Silvia; Shokry, Engy; Ombrone, Daniela; Della Bona, Maria Luisa; Guerrini, Renzo; la Marca, Giancarlo
2015-02-02
Phenytoin (PHT) is one of the most commonly used anticonvulsant drugs for the treatment of epilepsy and bipolar disorders. The large amount of plasma required by conventional methods for drug quantification makes mass spectrometry combined with dried blood spot (DBS) sampling crucial for pediatric patients, where therapeutic drug monitoring or pharmacokinetic studies may be difficult to realize. DBS represents a new, convenient sampling support requiring minimally invasive blood drawing and providing long-term stability of samples and less expensive shipment and storage. The aim of this study was to develop an LC-MS/MS method for the quantification of PHT on DBS. This analytical method was validated and gave good linearity (r² = 0.999) in the range of 0-100 mg/l. The LOQ and LOD were 1.0 mg/l and 0.3 mg/l, respectively. The drug extraction from paper was performed in a few minutes using a mixture composed of 80% organic solvent. The recovery ranged from 85 to 90%; PHT in DBS was shown to be stable at different storage temperatures for one month. A good correlation was also obtained between PHT plasma and DBS concentrations. This method is both precise and accurate and appears to be particularly suitable to monitor treatment with a simple and convenient sample collection procedure. Copyright © 2014 Elsevier B.V. All rights reserved.
Temporal and spatial resolution required for imaging myocardial function
NASA Astrophysics Data System (ADS)
Eusemann, Christian D.; Robb, Richard A.
2004-05-01
4-D functional analysis of myocardial mechanics is an area of significant interest and research in cardiology and vascular/interventional radiology. Current multidimensional analysis is limited by the insufficient temporal resolution of x-ray and magnetic resonance based techniques, but recent improvements in system design hold hope for faster and higher resolution scans that improve images of moving structures, allowing more accurate functional studies, such as in the heart. This paper provides a basis for the requisite temporal and spatial resolution for useful imaging during individual segments of the cardiac cycle. Multiple sample rates during systole and diastole are compared to determine an adequate sample frequency to reduce regional myocardial tracking errors. Concurrently, out-of-plane resolution has to be sufficiently high to minimize the partial volume effect. Temporal resolution and out-of-plane spatial resolution are related factors that must be considered together. The data used for this study is a DSR dynamic volume image dataset with high temporal and spatial resolution using implanted fiducial markers to track myocardial motion. The results of this study suggest a reduced exposure and scan time for x-ray and magnetic resonance imaging methods, since a lower sample rate during systole is sufficient, whereas the period of rapid filling during diastole requires higher sampling. This could potentially reduce the cost of these procedures and allow higher patient throughput.
Porsdam Mann, Sebastian; Sahakian, Barbara J.
2016-01-01
Advances in data science allow for sophisticated analysis of increasingly large datasets. In the medical context, large volumes of data collected for healthcare purposes are contained in electronic health records (EHRs). The real-life character and sheer amount of data contained in them make EHRs an attractive resource for public health and biomedical research. However, medical records contain sensitive information that could be misused by third parties. Medical confidentiality and respect for patients' privacy and autonomy protect patient data, barring access to health records unless consent is given by the data subject. This creates a situation in which much of the beneficial records-based research is prevented from being used or is seriously undermined, because the refusal of consent by some patients introduces a systematic deviation, known as selection bias, from a representative sample of the general population, thus distorting research findings. Although research exemptions for the requirement of informed consent exist, they are rarely used in practice due to concerns over liability and a general culture of caution. In this paper, we argue that the problem of research access to sensitive data can be understood as a tension between the medical duties of confidentiality and beneficence. We attempt to show that the requirement of informed consent is not appropriate for all kinds of records-based research by distinguishing studies involving minimal risk from those that feature moderate or greater risks. We argue that the duty of easy rescue—the principle that persons should benefit others when this can be done at no or minimal risk to themselves—grounds the removal of consent requirements for minimally risky records-based research. Drawing on this discussion, we propose a risk-adapted framework for the facilitation of ethical uses of health data for the benefit of society. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336803
Porsdam Mann, Sebastian; Savulescu, Julian; Sahakian, Barbara J
2016-12-28
Advances in data science allow for sophisticated analysis of increasingly large datasets. In the medical context, large volumes of data collected for healthcare purposes are contained in electronic health records (EHRs). The real-life character and sheer amount of data contained in them make EHRs an attractive resource for public health and biomedical research. However, medical records contain sensitive information that could be misused by third parties. Medical confidentiality and respect for patients' privacy and autonomy protect patient data, barring access to health records unless consent is given by the data subject. This creates a situation in which much of the beneficial records-based research is prevented from being used or is seriously undermined, because the refusal of consent by some patients introduces a systematic deviation, known as selection bias, from a representative sample of the general population, thus distorting research findings. Although research exemptions for the requirement of informed consent exist, they are rarely used in practice due to concerns over liability and a general culture of caution. In this paper, we argue that the problem of research access to sensitive data can be understood as a tension between the medical duties of confidentiality and beneficence. We attempt to show that the requirement of informed consent is not appropriate for all kinds of records-based research by distinguishing studies involving minimal risk from those that feature moderate or greater risks. We argue that the duty of easy rescue-the principle that persons should benefit others when this can be done at no or minimal risk to themselves-grounds the removal of consent requirements for minimally risky records-based research. Drawing on this discussion, we propose a risk-adapted framework for the facilitation of ethical uses of health data for the benefit of society.This article is part of the themed issue 'The ethical impact of data science'. 
© 2015 The Authors.
Diet optimization methods can help translate dietary guidelines into a cancer prevention food plan.
Masset, Gabriel; Monsivais, Pablo; Maillot, Matthieu; Darmon, Nicole; Drewnowski, Adam
2009-08-01
Mathematical diet optimization models are used to create food plans that best resemble current eating habits while meeting prespecified nutrition and cost constraints. This study used linear programming to generate food plans meeting the key 2007 dietary recommendations issued by the World Cancer Research Fund/American Institute of Cancer Research (WCRF/AICR). The models were constructed to minimize deviations in food intake between the observed and the WCRF/AICR-recommended diets. Consumption constraints were imposed to prevent food plans from including unreasonable amounts of food from a single group. Consumption norms for nutrients and food groups were taken from dietary intake data for a sample of adult men and women (n = 161) in the Pacific Northwest. Food plans meeting the WCRF/AICR dietary guidelines numbers 3-5 and 7 were lower in refined grains and higher in vegetables and fruits than the existing diets. For this group, achieving cancer prevention goals required little modification of existing diets and had minimal impact on diet quality and cost. By contrast, the need to meet all nutritional needs through diet alone (guideline no. 8) required a large food volume increase and dramatic shifts from the observed food intake patterns. Putting dietary guidelines into practice may require the creation of detailed food plans that are sensitive to existing consumption patterns and food costs. Optimization models provide an elegant mathematical solution that can help determine whether sets of dietary guidelines are achievable by diverse U.S. population subgroups.
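As a hedged illustration of the optimization idea, the toy below minimizes the L1 deviation from an observed two-food intake subject to a single nutrient constraint by brute-force grid search. Real diet models like the one in this study use linear programming over many foods and constraints; every number here is invented.

```python
# Toy two-food version of a deviation-minimizing food plan.
# Observed intake (servings/day) and fiber content (g/serving) are invented.
observed = (2.0, 1.0)
fiber = (3.0, 2.0)
fiber_goal = 10.0  # g/day, standing in for a guideline-derived constraint

def best_plan(step=0.01, hi=5.0):
    """Grid search: find the feasible plan closest (L1) to observed intake."""
    best = None
    n = round(hi / step) + 1
    for i in range(n):
        for j in range(n):
            x = (i * step, j * step)
            if fiber[0] * x[0] + fiber[1] * x[1] < fiber_goal:
                continue  # infeasible: nutrient target not met
            dev = abs(x[0] - observed[0]) + abs(x[1] - observed[1])
            if best is None or dev < best[0]:
                best = (dev, x)
    return best

dev, plan = best_plan()
print(round(dev, 2))  # 0.67: the cheapest fix adds ~2/3 serving of food 1
```

The minimum lands on the food with the most fiber per serving of deviation, which is exactly the behavior an LP objective of this form produces at scale.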
Ha, Ji Won; Hahn, Jong Hoon
2017-02-01
Acupuncture sample injection is a simple method to deliver well-defined nanoliter-scale sample plugs in PDMS microfluidic channels. This acupuncture injection method in microchip CE has several advantages, including minimization of sample consumption, the capability of serial injections of different sample solutions into the same microchannel, and the capability of injecting sample plugs into any desired position of a microchannel. Herein, we demonstrate that the simple and cost-effective acupuncture sample injection method can be used for PDMS microchip-based field-amplified sample stacking in the most simplified straight channel by applying a single potential. We achieved increased electropherogram signals in the case of sample stacking. Furthermore, we show that microchip CGE of a ΦX174 DNA-HaeIII digest can be performed with the acupuncture injection method on a glass microchip while minimizing sample loss and voltage control hardware. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
43 CFR 3272.12 - What environmental protection measures must I include in my utilization plan?
Code of Federal Regulations, 2014 CFR
2014-10-01
... resources; (6) Minimize air and noise pollution; and (7) Minimize hazards to public health and safety during... operations to ensure that they comply with the requirements of § 3200.4, and applicable noise, air, and water... may require you to collect data concerning existing air and water quality, noise, seismicity...
43 CFR 3272.12 - What environmental protection measures must I include in my utilization plan?
Code of Federal Regulations, 2013 CFR
2013-10-01
... resources; (6) Minimize air and noise pollution; and (7) Minimize hazards to public health and safety during... operations to ensure that they comply with the requirements of § 3200.4, and applicable noise, air, and water... may require you to collect data concerning existing air and water quality, noise, seismicity...
43 CFR 3272.12 - What environmental protection measures must I include in my utilization plan?
Code of Federal Regulations, 2012 CFR
2012-10-01
... resources; (6) Minimize air and noise pollution; and (7) Minimize hazards to public health and safety during... operations to ensure that they comply with the requirements of § 3200.4, and applicable noise, air, and water... may require you to collect data concerning existing air and water quality, noise, seismicity...
Efficiencies of Tritium (3H) bubbling systems.
Duda, Jean-Marie; Le Goff, Pierre; Leblois, Yoan; Ponsard, Samuel
2018-09-01
Bubbling systems are among the devices most used by nuclear operators to measure atmospheric tritium activity in their facilities or the neighbouring environment. However, information about trapping efficiency and bubbling system oxidation is not accessible and/or, at best, only minimally supported by demonstrations in actual operating conditions. In order to evaluate these parameters easily and thereby meet actual normative and regulatory requirements, a statistical study was carried out over 2000 monitoring records from the CEA Valduc site. From this data collection obtained over recent years of monitoring the CEA Valduc facilities and environment, a direct relation was highlighted between the trapping efficiency of the 3H-samplers for tritium as tritiated water and the sampling time and conditions of use: temperature and atmospheric moisture. It was thus demonstrated that this efficiency originated from two sources. The first is intrinsic to the bubbling system operating parameters and the sampling time; that part applies equally to all four bubblers. The second, however, is specific to the first bubbler; in essence, it depends on the sampling time and the sampled air characteristics. It was also highlighted that the water volume variation in the first bubbler, between the beginning and the end of the sampling process, is directly related to the average water concentration of the sampled air. In this way, it was possible to model the variations in trapping efficiency of the 3H-samplers relative to the sampling time and the water volume variation in the first bubbler. This model makes it possible to obtain the quantities required to comply with the current standards governing the monitoring of radionuclides in the environment and to associate an uncertainty with the measurements as well as the sampling parameters. Copyright © 2018 Elsevier Ltd. All rights reserved.
Current trends in treatment of obesity in Karachi and possibilities of cost minimization.
Hussain, Mirza Izhar; Naqvi, Baqir Shyum
2015-03-01
Our study identifies drug usage trends in overweight and obese patients without any compelling indications in Karachi, looks for deviations of current practices from evidence-based antihypertensive therapeutic guidelines, and identifies not only cost minimization opportunities but also communication strategies to improve patients' awareness and compliance to achieve therapeutic goals. In the present study, two respondent sets were used. Randomized stratified independent surveys were conducted among hospital doctors and family physicians (general practitioners), using pretested questionnaires. The sample size was 100. Statistical analysis was conducted with the Statistical Package for the Social Sciences (SPSS). Opportunities for cost minimization were also analyzed. On the basis of doctors' feedback, preference is given to non-pharmacologic management of obesity. A mass media campaign and media usage were recommended to increase patients' awareness, and patients' education along with strengthening family support systems was recommended for better compliance of the patients with the doctor's advice. Local therapeutic guidelines for weight reduction were not found. Feedback showed that global therapeutic guidelines were followed by the doctors practicing in the community and hospitals in Karachi. However, high-price branded drugs were used instead of low-priced generic therapeutic equivalents. Patients' education is required for better awareness and improved compliance. The doctors were found to prefer brand leaders instead of low-cost options. This trend increases the cost of therapy by 0.59 to 4.17 times. Therefore, there are great opportunities for cost minimization by using evidence-based, clinically effective and safe medicines.
An autonomous payload controller for the Space Shuttle
NASA Technical Reports Server (NTRS)
Hudgins, J. I.
1979-01-01
The Autonomous Payload Control (APC) system discussed in the present paper was designed on the basis of such criteria as minimal cost of implementation, minimal space required in the flight-deck area, simple operation with verification of the results, minimal additional weight, minimal impact on Orbiter design, and minimal impact on Orbiter payload integration. In its present configuration, the APC provides a means for the Orbiter crew to control as many as 31 autonomous payloads. The avionics and human engineering aspects of the system are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-17
...The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a pre-clearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and continuing collections of information in accordance with the Paperwork Reduction Act of 1995 [44 U.S.C. 3506(c)(2)(A)]. This program helps to assure that requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration (MSHA) is soliciting comments concerning the extension of the information collection for Radiation Sampling and Exposure Records, 30 CFR 57.5037 and 57.5040.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-21
...The Department of Labor, as part of its continuing effort to reduce paperwork and respondent burden, conducts a pre-clearance consultation program to provide the general public and Federal agencies with an opportunity to comment on proposed and continuing collections of information in accordance with the Paperwork Reduction Act of 1995 [44 U.S.C. 3506(c)(2)(A)]. This program helps to assure that requested data can be provided in the desired format, reporting burden (time and financial resources) is minimized, collection instruments are clearly understood, and the impact of collection requirements on respondents can be properly assessed. Currently, the Mine Safety and Health Administration (MSHA) is soliciting comments concerning the extension of the information collection for Radiation Sampling and Exposure Records, 30 CFR 57.5037 and 57.5040.
Stambaugh, Corey; Durand, Mathieu; Kemiktarak, Utku; Lawall, John
2014-08-01
The material properties of silicon nitride (SiN) play an important role in the performance of SiN membranes used in optomechanical applications. An optimum design of a subwavelength high-contrast grating requires accurate knowledge of the membrane thickness and index of refraction, and its performance is ultimately limited by material absorption. Here we describe a cavity-enhanced method to measure the thickness and complex index of refraction of dielectric membranes with small, but nonzero, absorption coefficients. By determining Brewster's angle and an angle at which reflection is minimized by means of destructive interference, both the real part of the index of refraction and the sample thickness can be measured. A comparison of the losses in the empty cavity and the cavity containing the dielectric sample provides a measurement of the absorption.
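The Brewster-angle step of this method reduces to elementary optics: for light incident from air (or vacuum), the real index follows from n = tan(theta_B). A minimal sketch, where the illustrative SiN value is an assumption rather than a number from the paper:

```python
import math

def index_from_brewster(theta_b_deg):
    """Real refractive index from a measured Brewster angle (degrees):
    at Brewster incidence, tan(theta_B) = n2 / n1 with n1 = 1 (air/vacuum)."""
    return math.tan(math.radians(theta_b_deg))

# Stoichiometric SiN membranes have n close to 2 in the visible/near-IR,
# putting the Brewster angle near 63.4 degrees (illustrative value).
print(round(index_from_brewster(63.435), 3))  # ≈ 2.0
```

The cavity enhancement in the paper serves a different quantity, the small absorption loss; the Brewster relation alone fixes only the real part of the index.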
Arsenyev, P A; Trezvov, V V; Saratovskaya, N V
1997-01-01
This work presents a method that allows the phase composition of calcium hydroxylapatite to be determined from its infrared spectrum. The method uses factor analysis of the spectral data of a calibration set of samples to determine the minimal number of factors required to reproduce the spectra within experimental error. Multiple linear regression is applied to establish a correlation between the factor scores of the calibration standards and their properties. The regression equations can be used to predict the property values of unknown samples. The regression model was built for determination of the beta-tricalcium phosphate content in hydroxylapatite. Statistical estimation of the quality of the model was carried out. Application of factor analysis to the spectral data increases the accuracy of beta-tricalcium phosphate determination and extends the range of determination toward lower concentrations. Reproducibility of the results is retained.
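A drastically simplified stand-in for the calibration step can be written with the standard library alone: ordinary least squares of the property of interest on a single spectral score, whereas the paper regresses on scores from several factors. All data below are synthetic.

```python
# Simplified calibration sketch: least-squares fit of a property (e.g.
# beta-TCP content, wt%) against one spectral factor score. The scores and
# contents are invented and exactly linear so the fit is easy to verify.
def fit_line(xs, ys):
    """Ordinary least squares for y ≈ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Synthetic calibration standards: content = 20*score - 1.
scores = [0.10, 0.25, 0.40, 0.55, 0.70]
content = [1.0, 4.0, 7.0, 10.0, 13.0]
a, b = fit_line(scores, content)
print(round(a, 6), round(b, 6))  # -1.0 20.0
prediction = a + b * 0.33  # property predicted for an unknown sample's score
```

In the actual method, the predictors are the first few factor scores retained after factor analysis, which is what suppresses spectral noise and improves accuracy at low beta-TCP concentrations.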
Rogers, Stephen C.; Gibbons, Lindsey B.; Griffin, Sherraine; Doctor, Allan
2012-01-01
This chapter summarizes the principles of RSNO measurement in the gas phase, utilizing ozone-based chemiluminescence and the copper cysteine (2C) ± carbon monoxide (3C) reagent. Although an indirect method for quantifying RSNOs, this assay represents one of the most robust methodologies available. It exploits the NO• detection sensitivity of ozone-based chemiluminescence, which is within the range required to detect physiological concentrations of RSNO metabolites. Additionally, the specificity of the copper cysteine (2C and 3C) reagent for RSNOs negates the need for sample pretreatment, thereby minimizing the likelihood of sample contamination (false positive results), NO species inter-conversion, or the loss of certain highly labile RSNO species. Herein, we outline the principles of this methodology, summarizing key issues, potential pitfalls, and corresponding solutions. PMID:23116707
Wason, James M. S.; Mander, Adrian P.
2012-01-01
Two-stage designs are commonly used for Phase II trials. Optimal two-stage designs have the lowest expected sample size for a specific treatment effect, for example, the null value, but can perform poorly if the true treatment effect differs. Here we introduce a design for continuous treatment responses that minimizes the maximum expected sample size across all possible treatment effects. The proposed design performs well for a wider range of treatment effects and so is useful for Phase II trials. We compare the design to a previously used optimal design and show it has superior expected sample size properties. PMID:22651118
Jeddi, Maryam Zare; Yunesian, Masud; Gorji, Mohamad Es'haghi; Noori, Negin; Pourmand, Mohammad Reza
2014-01-01
The aim of this study was to evaluate the bacterial and fungal quality of minimally-processed vegetables (MPV) and sprouts. A total of 116 samples of fresh-cut vegetables, ready-to-eat salads, and mung bean and wheat sprouts were randomly collected and analyzed. The load of aerobic mesophilic bacteria was minimum and maximum in the fresh-cut vegetables and fresh mung bean sprouts respectively, corresponding to populations of 5.3 and 8.5 log CFU/g. E. coli O157:H7 was found to be absent in all samples; however, other E. coli strains were detected in 21 samples (18.1%), and Salmonella spp. were found in one mung bean (3.1%) and one ready-to-eat salad sample (5%). Yeasts were the predominant organisms and were found in 100% of the samples. Geotrichum, Fusarium, and Penicillium spp. were the most prevalent molds in mung sprouts while Cladosporium and Penicillium spp. were most frequently found in ready-to-eat salad samples. According to results from the present study, effective control measures should be implemented to minimize the microbiological contamination of fresh produce sold in Tehran, Iran. PMID:25395902
Jeddi, Maryam Zare; Yunesian, Masud; Gorji, Mohamad Es'haghi; Noori, Negin; Pourmand, Mohammad Reza; Khaniki, Gholam Reza Jahed
2014-09-01
The aim of this study was to evaluate the bacterial and fungal quality of minimally-processed vegetables (MPV) and sprouts. A total of 116 samples of fresh-cut vegetables, ready-to-eat salads, and mung bean and wheat sprouts were randomly collected and analyzed. The load of aerobic mesophilic bacteria was minimum and maximum in the fresh-cut vegetables and fresh mung bean sprouts respectively, corresponding to populations of 5.3 and 8.5 log CFU/g. E. coli O157:H7 was found to be absent in all samples; however, other E. coli strains were detected in 21 samples (18.1%), and Salmonella spp. were found in one mung bean (3.1%) and one ready-to-eat salad sample (5%). Yeasts were the predominant organisms and were found in 100% of the samples. Geotrichum, Fusarium, and Penicillium spp. were the most prevalent molds in mung sprouts while Cladosporium and Penicillium spp. were most frequently found in ready-to-eat salad samples. According to results from the present study, effective control measures should be implemented to minimize the microbiological contamination of fresh produce sold in Tehran, Iran.
Reconnaissance and Autonomy for Small Robots (RASR) team: MAGIC 2010 challenge
NASA Astrophysics Data System (ADS)
Lacaze, Alberto; Murphy, Karl; Del Giorno, Mark; Corley, Katrina
2012-06-01
The Reconnaissance and Autonomy for Small Robots (RASR) team developed a system for the coordination of groups of unmanned ground vehicles (UGVs) that can execute a variety of military relevant missions in dynamic urban environments. Historically, UGV operations have been primarily performed via tele-operation, requiring at least one dedicated operator per robot, and requiring substantial real-time bandwidth to accomplish those missions. Our team goal was to develop a system that can provide long-term value to the war-fighter, utilizing MAGIC-2010 as a stepping stone. To that end, we self-imposed a set of constraints that would force us to develop technology that could readily be used by the military in the near term:
• Use a relevant (deployed) platform
• Use low-cost, reliable sensors
• Develop an expandable and modular control system with innovative software algorithms to minimize the computing footprint required
• Minimize required communications bandwidth and handle communication losses
• Minimize additional power requirements to maximize battery life and mission duration
A compact, fast UV photometer for measurement of ozone from research aircraft
NASA Astrophysics Data System (ADS)
Gao, R. S.; Ballard, J.; Watts, L. A.; Thornberry, T. D.; Ciciora, S. J.; McLaughlin, R. J.; Fahey, D. W.
2012-09-01
In situ measurements of atmospheric ozone (O3) are performed routinely from many research aircraft platforms. The most common technique depends on the strong absorption of ultraviolet (UV) light by ozone. As atmospheric science advances to the widespread use of unmanned aircraft systems (UASs), there is an increasing requirement for minimizing instrument space, weight, and power while maintaining instrument accuracy, precision and time response. The design and use of a new, dual-beam, UV photometer instrument for in situ O3 measurements is described. A polarization optical-isolator configuration is utilized to fold the UV beam inside the absorption cells, yielding a 60-cm absorption length with a 30-cm cell. The instrument has a fast sampling rate (2 Hz at <200 hPa, 1 Hz at 200-500 hPa, and 0.5 Hz at ≥ 500 hPa), high accuracy (3% excluding operation in the 300-450 hPa range, where the accuracy may be degraded to about 5%), and excellent precision (1.1 × 10^10 O3 molecules cm^-3 at 2 Hz, which corresponds to 3.0 ppb at 200 K and 100 hPa, or 0.41 ppb at 273 K and 1013 hPa). The size (36 L), weight (18 kg), and power (50-200 W) make the instrument suitable for many UASs and other airborne platforms. Inlet and exhaust configurations are also described for ambient sampling in the troposphere and lower stratosphere (1000-50 hPa) that control the sample flow rate to maximize time response while minimizing loss of precision due to induced turbulence in the sample cell. In-flight and laboratory intercomparisons with existing O3 instruments show that measurement accuracy is maintained in flight.
Stein, Eric D; White, Bryan P; Mazor, Raphael D; Miller, Peter E; Pilgrim, Erik M
2013-01-01
Molecular methods, such as DNA barcoding, have the potential to enhance biomonitoring programs worldwide. Altering routinely used sample preservation methods to protect DNA from degradation may pose a potential impediment to application of DNA barcoding and metagenomics for biomonitoring using benthic macroinvertebrates. Using higher volumes or concentrations of ethanol, requirements for shorter holding times, or the need to include additional filtering may increase cost and logistical constraints to existing biomonitoring programs. To address this issue we evaluated the efficacy of various ethanol-based sample preservation methods at maintaining DNA integrity. We evaluated a series of methods that were minimally modified from typical field protocols in order to identify an approach that can be readily incorporated into existing monitoring programs. Benthic macroinvertebrates were collected from a minimally disturbed stream in southern California, USA and subjected to one of six preservation treatments. Ten individuals from five taxa were selected from each treatment and processed to produce DNA barcodes from the mitochondrial gene cytochrome c oxidase I (COI). On average, we obtained successful COI sequences (i.e. either full or partial barcodes) for between 93-99% of all specimens across all six treatments. As long as samples were initially preserved in 95% ethanol, successful sequencing of COI barcodes was not affected by a low dilution ratio of 2∶1, transfer to 70% ethanol, presence of abundant organic matter, or holding times of up to six months. Barcoding success varied by taxa, with Leptohyphidae (Ephemeroptera) producing the lowest barcode success rate, most likely due to poor PCR primer efficiency. Differential barcoding success rates have the potential to introduce spurious results. However, routine preservation methods can largely be used without adverse effects on DNA integrity.
Stein, Eric D.; White, Bryan P.; Mazor, Raphael D.; Miller, Peter E.; Pilgrim, Erik M.
2013-01-01
Molecular methods, such as DNA barcoding, have the potential to enhance biomonitoring programs worldwide. Altering routinely used sample preservation methods to protect DNA from degradation may pose a potential impediment to application of DNA barcoding and metagenomics for biomonitoring using benthic macroinvertebrates. Using higher volumes or concentrations of ethanol, requirements for shorter holding times, or the need to include additional filtering may increase cost and logistical constraints to existing biomonitoring programs. To address this issue we evaluated the efficacy of various ethanol-based sample preservation methods at maintaining DNA integrity. We evaluated a series of methods that were minimally modified from typical field protocols in order to identify an approach that can be readily incorporated into existing monitoring programs. Benthic macroinvertebrates were collected from a minimally disturbed stream in southern California, USA and subjected to one of six preservation treatments. Ten individuals from five taxa were selected from each treatment and processed to produce DNA barcodes from the mitochondrial gene cytochrome c oxidase I (COI). On average, we obtained successful COI sequences (i.e. either full or partial barcodes) for between 93–99% of all specimens across all six treatments. As long as samples were initially preserved in 95% ethanol, successful sequencing of COI barcodes was not affected by a low dilution ratio of 2∶1, transfer to 70% ethanol, presence of abundant organic matter, or holding times of up to six months. Barcoding success varied by taxa, with Leptohyphidae (Ephemeroptera) producing the lowest barcode success rate, most likely due to poor PCR primer efficiency. Differential barcoding success rates have the potential to introduce spurious results. However, routine preservation methods can largely be used without adverse effects on DNA integrity. PMID:23308097
LST and instrument considerations. [modular design
NASA Technical Reports Server (NTRS)
Levin, G. M.
1974-01-01
In order that the LST meet its scientific objectives and also be a National Astronomical Space Facility during the 1980s and 1990s, broad requirements have been levied by the scientific community. These scientific requirements can be directly translated into design requirements and specifications for the scientific instruments. The instrument ensemble design must be consistent with a 15-year operational lifetime. Downtime for major repair/refurbishment or instrument updating must be minimized. The overall efficiency and performance of the instruments should be maximized. Modularization of instruments and instrument subsystems, some degree of on-orbit servicing (both repair and replacement), on-axis location, minimizing the number of reflections within instruments, minimizing polarization effects, and simultaneous operation of the F/24 camera with other instruments, are just a few of the design guidelines and specifications which can and will be met in order that these broader scientific requirements be satisfied.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maisel, B.E.; Hunt, G.T.; Devaney, R.J. Jr.
EPA's Brownfields Economic Redevelopment Initiative has sparked renewal of industrial and commercial parcels otherwise idled or under-utilized because of real or perceived environmental contamination. In certain cases, restoring such parcels to productive economic use requires a redevelopment effort protective of human health and welfare through minimizing offsite migration of environmental contaminants during cleanup, demolition and remediation activities. To support these objectives, an air monitoring program is often required as an integral element of a comprehensive brownfields redevelopment effort. This paper presents a strategic framework for design and execution of an ambient air monitoring program in support of a brownfields remediation effort ongoing in Lawrence, MA. Based on site characterization, the program included sample collection and laboratory analysis of ambient air samples for polychlorinated biphenyls (PCBs), polychlorinated dibenzodioxins and polychlorinated dibenzofurans (PCDDs/PCDFs), total suspended particulate (TSP), inhalable particulate (PM10), and lead. The program included four monitoring phases, identified as background, wintertime, demolition/remediation and post-demolition. Air sampling occurred over a 16 month period during 1996-97, during which time nine sampling locations were utilized to produce approximately 1,500 ambient air samples. Following strict data review and validation procedures, ambient air data interpretation focused on the following: evaluation of upwind/downwind sample pairs, comparison of ambient levels to existing regulatory standards, relation of ambient levels to data reported in the open literature, determination of normal seasonal variations in existing background burden, and comparison of ambient levels measured during site activity to background levels.
Glass wool filters for concentrating waterborne viruses and agricultural zoonotic pathogens
Millen, Hana T.; Gonnering, Jordan C.; Berg, Ryan K.; Spencer, Susan K.; Jokela, William E.; Pearce, John M.; Borchardt, Jackson S.; Borchardt, Mark A.
2012-01-01
The key first step in evaluating pathogen levels in suspected contaminated water is concentration. Concentration methods tend to be specific for a particular pathogen group, for example US Environmental Protection Agency Method 1623 for Giardia and Cryptosporidium1, which means multiple methods are required if the sampling program is targeting more than one pathogen group. Another drawback of current methods is the equipment can be complicated and expensive, for example the VIRADEL method with the 1MDS cartridge filter for concentrating viruses2. In this article we describe how to construct glass wool filters for concentrating waterborne pathogens. After filter elution, the concentrate is amenable to a second concentration step, such as centrifugation, followed by pathogen detection and enumeration by cultural or molecular methods. The filters have several advantages. Construction is easy and the filters can be built to any size for meeting specific sampling requirements. The filter parts are inexpensive, making it possible to collect a large number of samples without severely impacting a project budget. Large sample volumes (100s to 1,000s L) can be concentrated depending on the rate of clogging from sample turbidity. The filters are highly portable and with minimal equipment, such as a pump and flow meter, they can be implemented in the field for sampling finished drinking water, surface water, groundwater, and agricultural runoff. Lastly, glass wool filtration is effective for concentrating a variety of pathogen types so only one method is necessary. Here we report on filter effectiveness in concentrating waterborne human enterovirus, Salmonella enterica, Cryptosporidium parvum, and avian influenza virus.
Glass Wool Filters for Concentrating Waterborne Viruses and Agricultural Zoonotic Pathogens
Millen, Hana T.; Gonnering, Jordan C.; Berg, Ryan K.; Spencer, Susan K.; Jokela, William E.; Pearce, John M.; Borchardt, Jackson S.; Borchardt, Mark A.
2012-01-01
The key first step in evaluating pathogen levels in suspected contaminated water is concentration. Concentration methods tend to be specific for a particular pathogen group, for example US Environmental Protection Agency Method 1623 for Giardia and Cryptosporidium1, which means multiple methods are required if the sampling program is targeting more than one pathogen group. Another drawback of current methods is the equipment can be complicated and expensive, for example the VIRADEL method with the 1MDS cartridge filter for concentrating viruses2. In this article we describe how to construct glass wool filters for concentrating waterborne pathogens. After filter elution, the concentrate is amenable to a second concentration step, such as centrifugation, followed by pathogen detection and enumeration by cultural or molecular methods. The filters have several advantages. Construction is easy and the filters can be built to any size for meeting specific sampling requirements. The filter parts are inexpensive, making it possible to collect a large number of samples without severely impacting a project budget. Large sample volumes (100s to 1,000s L) can be concentrated depending on the rate of clogging from sample turbidity. The filters are highly portable and with minimal equipment, such as a pump and flow meter, they can be implemented in the field for sampling finished drinking water, surface water, groundwater, and agricultural runoff. Lastly, glass wool filtration is effective for concentrating a variety of pathogen types so only one method is necessary. Here we report on filter effectiveness in concentrating waterborne human enterovirus, Salmonella enterica, Cryptosporidium parvum, and avian influenza virus. PMID:22415031
Impact of specimen adequacy on the assessment of renal allograft biopsy specimens.
Cimen, S; Geldenhuys, L; Guler, S; Imamoglu, A; Molinari, M
2016-01-01
The Banff classification was introduced to achieve uniformity in the assessment of renal allograft biopsies. The primary aim of this study was to evaluate the impact of specimen adequacy on the Banff classification. All renal allograft biopsies obtained between July 2010 and June 2012 for suspicion of acute rejection were included. Pre-biopsy clinical data on suspected diagnosis and time from renal transplantation were provided to a nephropathologist who was blinded to the original pathological report. Second pathological readings were compared with the original to assess agreement stratified by specimen adequacy. Cohen's kappa test and Fisher's exact test were used for statistical analyses. Forty-nine specimens were reviewed. Among these specimens, 81.6% were classified as adequate, 6.12% as minimal, and 12.24% as unsatisfactory. The agreement analysis between the first and second readings revealed a kappa value of 0.97. Full agreement between readings was found in 75% of the adequate specimens, and in 66.7% and 50% of the minimal and unsatisfactory specimens, respectively. There was no agreement between readings in 5% of the adequate specimens and 16.7% of the unsatisfactory specimens. For the entire sample, full agreement was found in 71.4%, partial agreement in 20.4%, and no agreement in 8.2% of the specimens. Statistical analysis using Fisher's exact test yielded a P value above 0.25, indicating that, probably due to the small sample size, the results were not statistically significant. Specimen adequacy may be a determinant of diagnostic agreement in renal allograft specimen assessment. While additional studies including larger case numbers are required to further delineate the impact of specimen adequacy on the reliability of histopathological assessments, specimen quality must be considered during clinical decision making when dealing with biopsy reports based on minimal or unsatisfactory specimens.
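Inter-reader agreement of the kind reported above is conventionally quantified with Cohen's kappa. A minimal sketch of the computation follows; the ratings below are hypothetical illustrations, not the study's data:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who labelled the same items."""
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    # observed proportion of exact agreement
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # expected agreement if the raters labelled independently at random
    p_exp = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# two pathologists' readings (hypothetical): 1 = rejection, 0 = no rejection
first_reading = [1, 1, 1, 0, 0, 0, 1, 0]
second_reading = [1, 1, 0, 0, 0, 0, 1, 0]
kappa = cohens_kappa(first_reading, second_reading)  # 0.75 for this toy data
```

A kappa of 0.97, as reported in the abstract, indicates near-perfect agreement; values are 1 for perfect agreement and 0 when agreement is no better than chance.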
NASA Astrophysics Data System (ADS)
Hamelin, Elizabeth I.; Blake, Thomas A.; Perez, Jonas W.; Crow, Brian S.; Shaner, Rebecca L.; Coleman, Rebecca M.; Johnson, Rudolph C.
2016-05-01
Public health response to large scale chemical emergencies presents logistical challenges for sample collection, transport, and analysis. Diagnostic methods used to identify and determine exposure to chemical warfare agents, toxins, and poisons traditionally involve blood collection by phlebotomists, cold transport of biomedical samples, and costly sample preparation techniques. Use of dried blood spots, which consist of dried blood on an FDA-approved substrate, can increase analyte stability, decrease infection hazard for those handling samples, greatly reduce the cost of shipping/storing samples by removing the need for refrigeration and cold chain transportation, and be self-prepared by potentially exposed individuals using a simple finger prick and blood spot compatible paper. Our laboratory has developed clinical assays to detect human exposures to nerve agents through the analysis of specific protein adducts and metabolites, for which a simple extraction from a dried blood spot is sufficient for removing matrix interferents and attaining sensitivities on par with traditional sampling methods. The use of dried blood spots can bridge the gap between the laboratory and the field allowing for large scale sample collection with minimal impact on hospital resources while maintaining sensitivity, specificity, traceability, and quality requirements for both clinical and forensic applications.
Development of Standardized Material Testing Protocols for Prosthetic Liners
Cagle, John C.; Reinhall, Per G.; Hafner, Brian J.; Sanders, Joan E.
2017-01-01
A set of protocols was created to characterize prosthetic liners across six clinically relevant material properties. Properties included compressive elasticity, shear elasticity, tensile elasticity, volumetric elasticity, coefficient of friction (CoF), and thermal conductivity. Eighteen prosthetic liners representing the diverse range of commercial products were evaluated to create test procedures that maximized repeatability, minimized error, and provided clinically meaningful results. Shear and tensile elasticity test designs were augmented with finite element analysis (FEA) to optimize specimen geometries. Results showed that because of the wide range of available liner products, the compressive elasticity and tensile elasticity tests required two test maxima; samples were tested until they met either a strain-based or a stress-based maximum, whichever was reached first. The shear and tensile elasticity tests required that no cyclic conditioning be conducted because of limited endurance of the mounting adhesive with some liner materials. The coefficient of friction test was based on dynamic coefficient of friction, as it proved to be a more reliable measurement than static coefficient of friction. The volumetric elasticity test required that air be released beneath samples in the test chamber before testing. The thermal conductivity test best reflected the clinical environment when thermal grease was omitted and when liner samples were placed under pressure consistent with load bearing conditions. The developed procedures provide a standardized approach for evaluating liner products in the prosthetics industry. Test results can be used to improve clinical selection of liners for individual patients and guide development of new liner products. PMID:28233885
A Concept for Power Cycling the Electronics of CALICE-AHCAL with the Train Structure of ILC
NASA Astrophysics Data System (ADS)
Göttlicher, Peter; The Calice-Collaboration
Particle flow algorithm calorimetry requires high-granularity three-dimensional readout. The tight power requirement of 40 μW/channel is reached by enabling readout ASIC currents only during beam delivery, corresponding to a 1% duty cycle. EMI noise caused by current switching needs to be minimized by the power system, and this paper presents ideas, simulations and first measurements for minimizing disturbances. Careful design of the circuits, printed circuit boards, and grounding scheme, together with the use of floating supplies, allows current loops to be closed locally, stabilizes voltages, and minimizes currents in the metal structures.
Bockman, Alexander; Fackler, Cameron; Xiang, Ning
2015-04-01
Acoustic performance for an interior requires an accurate description of the boundary materials' surface acoustic impedance. Analytical methods may be applied to a small class of test geometries, but inverse numerical methods provide greater flexibility. The parameter estimation problem requires minimizing the misfit between predicted and observed acoustic field pressure. The Bayesian-network sampling approach presented here mitigates other methods' susceptibility to noise inherent to the experiment, model, and numerics. A geometry-agnostic method is developed here and its parameter estimation performance is demonstrated for an air-backed micro-perforated panel in an impedance tube. Good agreement is found with predictions from the ISO standard two-microphone, impedance-tube method, and a theoretical model for the material. Data by-products exclusive to a Bayesian approach are analyzed to assess sensitivity of the method to nuisance parameters.
Depth assisted compression of full parallax light fields
NASA Astrophysics Data System (ADS)
Graziosi, Danillo B.; Alpaslan, Zahir Y.; El-Ghoroury, Hussein S.
2015-03-01
Full parallax light field displays require high pixel density and huge amounts of data. Compression is a necessary tool used by 3D display systems to cope with the high bandwidth requirements. One of the formats adopted by MPEG for 3D video coding standards is the use of multiple views with associated depth maps. Depth maps enable the coding of a reduced number of views, and are used by compression and synthesis software to reconstruct the light field. However, most of the developed coding and synthesis tools target linearly arranged cameras with small baselines. Here we propose to use the 3D video coding format for full parallax light field coding. We introduce a view selection method inspired by plenoptic sampling followed by transform-based view coding and view synthesis prediction to code residual views. We determine the minimal requirements for view sub-sampling and present the rate-distortion performance of our proposal. We also compare our method with established video compression techniques, such as H.264/AVC, H.264/MVC, and the new 3D video coding algorithm, 3DV-ATM. Our results show that our method not only has improved rate-distortion performance but also preserves the structure of the perceived light fields better.
A Novel Stimulus Artifact Removal Technique for High-Rate Electrical Stimulation
Heffer, Leon F; Fallon, James B
2008-01-01
Electrical stimulus artifacts corrupting electrophysiological recordings often make the subsequent analysis of the underlying neural response difficult. This is particularly evident when investigating short-latency neural activity in response to high-rate electrical stimulation. We developed and evaluated an off-line technique for the removal of stimulus artifact from electrophysiological recordings. Pulsatile electrical stimulation was presented at rates of up to 5000 pulses/s during extracellular recordings of guinea pig auditory nerve fibers. Stimulus artifact was removed by replacing the sample points at each stimulus artifact event with values interpolated along a straight line, computed from neighbouring sample points. This technique required only that artifact events be identifiable and that the artifact duration remained less than both the inter-stimulus interval and the time course of the action potential. We have demonstrated that this computationally efficient sample-and-interpolate technique removes the stimulus artifact with minimal distortion of the action potential waveform. We suggest that this technique may have potential applications in a range of electrophysiological recording systems. PMID:18339428
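The sample-and-interpolate technique described above is simple to express in code. The following is a minimal sketch, assuming artifact onset indices and a fixed artifact duration (in samples) are already known, and that artifacts do not touch the ends of the record; function and variable names are illustrative, not from the paper:

```python
import numpy as np

def remove_artifacts(signal, artifact_starts, artifact_len):
    """Replace each artifact segment with values interpolated along a
    straight line between the clean samples immediately before and after it."""
    cleaned = signal.astype(float).copy()
    for s in artifact_starts:
        a, b = s - 1, s + artifact_len  # neighbouring clean sample indices
        # linspace includes both endpoints, so drop them with [1:-1]
        cleaned[s:b] = np.linspace(cleaned[a], cleaned[b], artifact_len + 2)[1:-1]
    return cleaned
```

For a ramp signal whose samples 4-5 are corrupted by an artifact, the interpolated replacement exactly restores the ramp, illustrating why distortion is minimal when the artifact is short relative to the waveform's time course.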
Eichmann, Cordula; Parson, Walther
2008-09-01
The traditional protocol for forensic mitochondrial DNA (mtDNA) analyses involves the amplification and sequencing of the two hypervariable segments HVS-I and HVS-II of the mtDNA control region. The primers usually span fragment sizes of 300-400 bp for each region, which may result in weak or failed amplification in highly degraded samples. Here we introduce an improved and more stable approach using shortened amplicons in the fragment range between 144 and 237 bp. Ten such amplicons were required to produce overlapping fragments that cover the entire human mtDNA control region. These were co-amplified in two multiplex polymerase chain reactions and sequenced with the individual amplification primers. The primers were carefully selected to minimize binding on homoplasic and haplogroup-specific sites that would otherwise result in loss of amplification due to mis-priming. The multiplexes have successfully been applied to ancient and forensic samples such as bones and teeth that showed a high degree of degradation.
NASA Astrophysics Data System (ADS)
Miley, H.; Forrester, J. B.; Greenwood, L. R.; Keillor, M. E.; Eslinger, P. W.; Regmi, R.; Biegalski, S.; Erikson, L. E.
2013-12-01
The aerosol samples taken from the CTBT International Monitoring Systems stations are measured in the field with a minimum detectable concentration (MDC) of ~30 μBq/m3 of Ba-140. This is sufficient to detect far less than 1 kt of aerosol fission products in the atmosphere when the station is in the plume from such an event. Recent thinking about minimizing the potential source region (PSR) from a detection has led to a desire for a multi-station or multi-time period detection. These would be connected through the concept of 'event formation', analogous to event formation in seismic event studies. However, to form such events, samples from the nearest neighbors of the detection would require re-analysis in a more sensitive laboratory to gain a substantially lower MDC, and potentially find radionuclide concentrations undetected by the station. The authors will present recent laboratory work with air filters showing various cost-effective means for enhancing laboratory sensitivity.
Giudice, Valentina; Feng, Xingmin; Kajigaya, Sachiko; Young, Neal S.; Biancotto, Angélique
2017-01-01
Fluorescent cell barcoding (FCB) is a cell-based multiplexing technique for high-throughput flow cytometry. Barcoded samples can be stained and acquired collectively, minimizing staining variability and antibody consumption, and decreasing required sample volumes. Combined with functional measurements, FCB can be used for drug screening, signaling profiling, and cytokine detection, but technical issues are present. We optimized the FCB technique for routine utilization using DyLight 350, DyLight 800, Pacific Orange, and CBD500 for barcoding six, nine, or 36 human peripheral blood specimens. Working concentrations of FCB dyes ranging from 0 to 500 μg/ml were tested, and viability dye staining was optimized to increase robustness of data. A five-color staining with surface markers for Vβ usage analysis in CD4+ and CD8+ T cells was achieved in combination with nine sample barcoding. We provide improvements of the FCB technique that should be useful for multiplex drug screening and for lymphocyte characterization and perturbations in the diagnosis and during the course of disease. PMID:28692789
Zhu, Ying; Piehowski, Paul D; Zhao, Rui; Chen, Jing; Shen, Yufeng; Moore, Ronald J; Shukla, Anil K; Petyuk, Vladislav A; Campbell-Thompson, Martha; Mathews, Clayton E; Smith, Richard D; Qian, Wei-Jun; Kelly, Ryan T
2018-02-28
Nanoscale or single-cell technologies are critical for biomedical applications. However, current mass spectrometry (MS)-based proteomic approaches require samples comprising a minimum of thousands of cells to provide in-depth profiling. Here, we report the development of a nanoPOTS (nanodroplet processing in one pot for trace samples) platform for small cell population proteomics analysis. NanoPOTS enhances the efficiency and recovery of sample processing by downscaling processing volumes to <200 nL to minimize surface losses. When combined with ultrasensitive liquid chromatography-MS, nanoPOTS allows identification of ~1500 to ~3000 proteins from ~10 to ~140 cells, respectively. By incorporating the Match Between Runs algorithm of MaxQuant, >3000 proteins are consistently identified from as few as 10 cells. Furthermore, we demonstrate quantification of ~2400 proteins from single human pancreatic islet thin sections from type 1 diabetic and control donors, illustrating the application of nanoPOTS for spatially resolved proteome measurements from clinical tissues.
Pu239 Cross-Section Variations Based on Experimental Uncertainties and Covariances
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sigeti, David Edward; Williams, Brian J.; Parsons, D. Kent
2016-10-18
Algorithms and software have been developed for producing variations in plutonium-239 neutron cross sections based on experimental uncertainties and covariances. The varied cross-section sets may be produced as random samples from the multi-variate normal distribution defined by an experimental mean vector and covariance matrix, or they may be produced as Latin-Hypercube/Orthogonal-Array samples (based on the same means and covariances) for use in parametrized studies. The variations obey two classes of constraints that are obligatory for cross-section sets and which put related constraints on the mean vector and covariance matrix that determine the sampling. Because the experimental means and covariances do not obey some of these constraints to sufficient precision, imposing the constraints requires modifying the experimental mean vector and covariance matrix. Modification is done with an algorithm based on linear algebra that minimizes changes to the means and covariances while ensuring that the operations that impose the different constraints do not conflict with each other.
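The core idea of drawing multivariate-normal variations and then imposing linear constraints with minimal change can be sketched generically. The projection below is the standard minimal-Euclidean-norm correction onto an affine subspace; it is an illustration of the technique, not the authors' actual algorithm (which operates on the means and covariances themselves and handles additional cross-section-specific constraints):

```python
import numpy as np

rng = np.random.default_rng(0)

def constrained_mvn_samples(mean, cov, A, b, n):
    """Draw n samples from N(mean, cov), then project each sample onto the
    affine subspace {x : A x = b} with the minimal Euclidean-norm change:
    x' = x - A^T (A A^T)^{-1} (A x - b)."""
    x = rng.multivariate_normal(mean, cov, size=n)  # raw variations
    AAt_inv = np.linalg.inv(A @ A.T)
    resid = x @ A.T - b                             # constraint violations, (n, m)
    return x - resid @ AAt_inv @ A

# toy example: 3 cross-section values whose sum must stay fixed at 1.0
mean = np.array([0.2, 0.3, 0.5])
cov = 0.01 * np.eye(3)
A = np.ones((1, 3))
b = np.array([1.0])
samples = constrained_mvn_samples(mean, cov, A, b, 1000)
```

Every projected sample satisfies the sum constraint exactly, while the perturbations away from the mean are altered as little as possible.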
Zhu, Ying; Piehowski, Paul D.; Zhao, Rui; ...
2018-02-28
Nanoscale or single-cell technologies are critical for biomedical applications. However, current mass spectrometry (MS)-based proteomic approaches require samples comprising a minimum of thousands of cells to provide in-depth profiling. Here we report the development of a nanoPOTS (nanodroplet processing in one pot for trace samples) platform for small cell population proteomics analysis. NanoPOTS enhances the efficiency and recovery of sample processing by downscaling processing volumes to <200 nL to minimize surface losses. When combined with ultrasensitive liquid chromatography-MS, nanoPOTS allows identification of ~1500 to ~3000 proteins from ~10 to ~140 cells, respectively. By incorporating the Match Between Runs algorithm of MaxQuant, >3000 proteins are consistently identified from as few as 10 cells. Furthermore, we demonstrate quantification of ~2400 proteins from single human pancreatic islet thin sections from type 1 diabetic and control donors, illustrating the application of nanoPOTS for spatially resolved proteome measurements from clinical tissues.
Xun-Ping, W; An, Z
2017-07-27
Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland regions endemic for schistosomiasis, so as to improve the precision, efficiency, and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which takes plant abundance as an auxiliary variable, was explored in an experimental study in a 50 m × 50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; secondly, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA were compared. Results The method proposed in this study (SOPA) had the minimal absolute error, 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.
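As a rough illustration of why stratifying on an auxiliary variable such as plant abundance helps, here is a minimal proportional-allocation stratified estimator on synthetic data; the field, the abundance-to-snail relationship, and the five quantile layers are all made up, and this does not reproduce SOPA's actual allocation equations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic field: snail counts loosely track plant abundance
# (an assumption for illustration only)
plants = rng.gamma(shape=2.0, scale=1.0, size=2500)
snails = rng.poisson(lam=plants)           # snail count per quadrat

# Stratify quadrats into 5 layers by plant abundance quantiles
edges = np.quantile(plants, [0.2, 0.4, 0.6, 0.8])
layer = np.digitize(plants, edges)

def stratified_mean(values, layer, n_total, rng):
    """Proportionally allocated stratified estimate of the mean."""
    est = 0.0
    for h in range(layer.max() + 1):
        idx = np.flatnonzero(layer == h)
        n_h = max(1, round(n_total * len(idx) / len(values)))
        pick = rng.choice(idx, size=min(n_h, len(idx)), replace=False)
        est += len(idx) / len(values) * values[pick].mean()
    return est

est = stratified_mean(snails, layer, 100, rng)
print(est, snails.mean())   # stratified estimate vs. true field mean
```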
Optimal Inspection of Imports to Prevent Invasive Pest Introduction.
Chen, Cuicui; Epanchin-Niell, Rebecca S; Haight, Robert G
2018-03-01
The United States imports more than 1 billion live plants annually, an important and growing pathway for the introduction of damaging nonnative invertebrates and pathogens. Inspection of imports is one safeguard for reducing pest introductions, but capacity constraints limit inspection effort. We develop an optimal sampling strategy to minimize the costs of pest introductions from trade by posing inspection as an acceptance sampling problem that incorporates key features of the decision context, including (i) simultaneous inspection of many heterogeneous lots, (ii) a lot-specific sampling effort, (iii) a budget constraint that limits total inspection effort, (iv) inspection error, and (v) an objective of minimizing the cost of accepted defective units. We derive a formula for the expected number of accepted infested units (expected slippage) given lot size, sample size, infestation rate, and detection rate, and we formulate and analyze the inspector's optimization problem of allocating a sampling budget among incoming lots to minimize the cost of slippage. We conduct an empirical analysis of live plant inspection, including estimation of plant infestation rates from historical data, and find that inspections optimally target the largest lots with the highest plant infestation rates, leaving some lots unsampled. We also consider that USDA-APHIS, which administers inspections, may want to continue inspecting all lots at a baseline level; we find that allocating any additional capacity, beyond a comprehensive baseline inspection, to the largest lots with the highest infestation rates allows inspectors to meet the dual goals of minimizing the costs of slippage and maintaining baseline sampling without substantial compromise. © 2017 Society for Risk Analysis.
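Under a deliberately simplified unit-level model (each infested unit lands in the sample with probability n/N and, if sampled, is detected with probability d), expected slippage has a one-line closed form that a Monte Carlo check confirms. This is a sketch of the idea, not the paper's exact formula.

```python
import numpy as np

rng = np.random.default_rng(2)

def expected_slippage(N, n, K, d):
    """Expected accepted infested units under a simplified unit-level model
    (an illustration, not the paper's exact formula): each of K infested
    units in a lot of N lands in the sample w.p. n/N and, if sampled, is
    detected and removed w.p. d; everything undetected is accepted."""
    return K * (1.0 - (n / N) * d)

def simulate_slippage(N, n, K, d, trials=10000):
    """Monte Carlo check of the closed form."""
    slipped = 0
    for _ in range(trials):
        sampled = rng.choice(N, size=n, replace=False)
        in_sample = np.isin(np.arange(K), sampled)   # first K units infested
        detected = in_sample & (rng.random(K) < d)
        slipped += K - detected.sum()
    return slipped / trials

print(expected_slippage(1000, 100, 50, 0.8))   # 46.0
print(simulate_slippage(1000, 100, 50, 0.8))   # close to 46
```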
Flow-through Fourier transform infrared sensor for total hydrocarbons determination in water.
Pérez-Palacios, David; Armenta, Sergio; Lendl, Bernhard
2009-09-01
A new flow-through Fourier transform infrared (FT-IR) sensor for oil-in-water analysis, based on solid-phase spectroscopy on octadecyl (C18) silica particles, has been developed. The C18 non-polar sorbent is placed inside the sensor and retains hydrocarbons from water samples. The system does not require the use of chlorinated solvents, reducing the environmental impact, and the minimal sample handling ensures sample integrity while reducing the analyst's exposure to any toxic hydrocarbons present in the samples. FT-IR spectra were recorded by co-adding 32 scans at a resolution of 4 cm⁻¹, and the band located at 1462 cm⁻¹, due to the CH₂ bending, was integrated from 1475 to 1450 cm⁻¹ using a baseline correction established between 1485 and 1440 cm⁻¹, with the band areas as the analytical signal. The technique, which provides a limit of detection (LOD) of 22 mg L⁻¹ and a precision, expressed as relative standard deviation (RSD), lower than 5%, is considerably rapid and allows for a high level of automation.
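The band-integration recipe quoted above (integrate 1475-1450 cm⁻¹ over a linear baseline anchored at 1485 and 1440 cm⁻¹) is straightforward to reproduce; the limits follow the abstract, while the spectrum below is synthetic.

```python
import numpy as np

def band_area(wavenumbers, absorbance, band=(1450.0, 1475.0),
              baseline=(1440.0, 1485.0)):
    """Integrate an IR band after subtracting a linear baseline anchored
    at the two baseline wavenumbers. Limits follow the abstract; the
    spectrum fed in below is synthetic."""
    x0, x1 = baseline
    y0 = np.interp(x0, wavenumbers, absorbance)
    y1 = np.interp(x1, wavenumbers, absorbance)
    corrected = absorbance - (y0 + (wavenumbers - x0) * (y1 - y0) / (x1 - x0))
    mask = (wavenumbers >= band[0]) & (wavenumbers <= band[1])
    x, y = wavenumbers[mask], corrected[mask]
    return float(np.sum((y[1:] + y[:-1]) / 2 * np.diff(x)))  # trapezoid rule

# Synthetic spectrum: Gaussian CH2 bend at 1462 cm-1 on a sloping background
wn = np.linspace(1430.0, 1495.0, 500)
spec = 0.002 * wn + 0.5 * np.exp(-(((wn - 1462.0) / 4.0) ** 2))
area = band_area(wn, spec)
print(area)   # the linear background cancels; only the band contributes
```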
Statistical modelling as an aid to the design of retail sampling plans for mycotoxins in food.
MacArthur, Roy; MacDonald, Susan; Brereton, Paul; Murray, Alistair
2006-01-01
A study has been carried out to assess appropriate statistical models for use in evaluating retail sampling plans for the determination of mycotoxins in food. A compound gamma model was found to be a suitable fit. A simulation model based on the compound gamma model was used to produce operating characteristic curves for a range of parameters relevant to retail sampling. The model was also used to estimate the minimum number of increments necessary to minimize the overall measurement uncertainty. Simulation results showed that measurements based on retail samples (for which the maximum number of increments is constrained by cost) may produce fit-for-purpose results for the measurement of ochratoxin A in dried fruit, but are unlikely to do so for the measurement of aflatoxin B1 in pistachio nuts. In order to produce a more accurate simulation, further work is required to determine the degree of heterogeneity associated with batches of food products. With appropriate parameterization in terms of physical and biological characteristics, the systems developed in this study could be applied to other analyte/matrix combinations.
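A toy Monte Carlo in the spirit of the paper's simulation shows how operating-characteristic behavior steepens as the number of increments grows; the gamma shape parameter, batch mean, and limit are invented and are not the paper's fitted compound gamma parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

def acceptance_probability(batch_mean, limit, n_increments,
                           shape=2.0, trials=5000):
    """Monte Carlo point on an operating-characteristic curve.

    Increment concentrations are drawn from a gamma distribution whose
    mean equals the batch mean (a stand-in for the fitted compound gamma
    model; `shape` controls heterogeneity and is a made-up value).
    A batch is accepted if the mean of the n increments is below the limit."""
    draws = rng.gamma(shape, batch_mean / shape, size=(trials, n_increments))
    return float(np.mean(draws.mean(axis=1) < limit))

# More increments -> less measurement scatter -> fewer wrong decisions
probs = {n: acceptance_probability(8.0, 10.0, n) for n in (1, 5, 20)}
print(probs)
```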
Cross-sensor iris recognition through kernel learning.
Pillai, Jaishanker K; Puertas, Maria; Chellappa, Rama
2014-01-01
Due to the increasing popularity of iris biometrics, new sensors are being developed for acquiring iris images and existing ones are being continuously upgraded. Re-enrolling users every time a new sensor is deployed is expensive and time-consuming, especially in applications with a large number of enrolled users. However, recent studies show that cross-sensor matching, where the test samples are verified using data enrolled with a different sensor, often leads to reduced performance. In this paper, we propose a machine learning technique to mitigate the cross-sensor performance degradation by adapting the iris samples from one sensor to another. We first present a novel optimization framework for learning transformations on iris biometrics. We then utilize this framework for sensor adaptation, by reducing the distance between samples of the same class, and increasing it between samples of different classes, irrespective of the sensors acquiring them. Extensive evaluations on iris data from multiple sensors demonstrate that the proposed method leads to improvement in cross-sensor recognition accuracy. Furthermore, since the proposed technique requires minimal changes to the iris recognition pipeline, it can easily be incorporated into existing iris recognition systems.
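As a rough sketch of sensor adaptation by shrinking same-class distances and stretching different-class distances, the following uses a plain LDA-style linear transform on synthetic two-sensor data; it is a stand-in for the paper's kernel-learning formulation, and the sensors, offset, and dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(4)

def adaptation_transform(X, labels, n_dims):
    """LDA-style linear map that shrinks within-class scatter (same subject,
    either sensor) relative to between-class scatter. A plain linear-algebra
    stand-in for the paper's kernel-learning formulation."""
    d = X.shape[1]
    mu = X.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(labels):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
    # Top generalized eigenvectors of (Sw + reg*I)^-1 Sb
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-3 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_dims]]

def mean_pair_dist(Z, labels, same):
    ds = [np.linalg.norm(Z[i] - Z[j])
          for i in range(len(Z)) for j in range(i + 1, len(Z))
          if (labels[i] == labels[j]) == same]
    return float(np.mean(ds))

# Two hypothetical "sensors" observe the same 3 subjects; sensor B adds an offset
base = rng.normal(size=(3, 6))
X = np.vstack([base + 0.1 * rng.normal(size=(3, 6)),          # sensor A
               base + 0.8 + 0.1 * rng.normal(size=(3, 6))])   # sensor B
labels = np.array([0, 1, 2, 0, 1, 2])
Z = X @ adaptation_transform(X, labels, n_dims=2)
print(mean_pair_dist(Z, labels, True), mean_pair_dist(Z, labels, False))
```

After projection, same-subject cross-sensor pairs sit much closer together than different-subject pairs, which is the property the adaptation aims for.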
Chandu, Dilip; Paul, Sudakshina; Parker, Mathew; Dudin, Yelena; King-Sitzes, Jennifer; Perez, Tim; Mittanck, Don W; Shah, Manali; Glenn, Kevin C; Piepenburg, Olaf
2016-01-01
Testing for the presence of genetically modified material in seed samples is of critical importance for all stakeholders in the agricultural industry, including growers, seed manufacturers, and regulatory bodies. While rapid antibody-based testing for the transgenic protein has fulfilled this need in the past, the introduction of new variants of a given transgene demands a new diagnostic regimen that allows different traits to be distinguished at the nucleic acid level. Although such molecular tests can be performed by PCR in the laboratory, their requirement for expensive equipment and sophisticated operation has prevented their uptake in point-of-use applications. A recently developed isothermal DNA amplification technique, recombinase polymerase amplification (RPA), combines simple sample preparation and amplification workflow procedures with the use of minimal detection equipment in real time. Here, we report the development of a highly sensitive and specific RPA-based detection system for Genuity Roundup Ready 2 Yield (RR2Y) material in soybean (Glycine max) seed samples and present the results of studies applying the method in both laboratory and field-type settings.
New minimal access hydrocelectomy.
Saber, Aly
2011-02-01
To ascertain the acceptability of minimal access hydrocelectomy through a 2-cm incision and the outcome in terms of morbidity reduction and recurrence rate. Although controversy exists regarding the treatment of hydrocele, hydrocelectomy remains the treatment of choice for hydroceles. However, the standard surgical procedures for hydrocele can cause postoperative discomfort and complications. A total of 42 adult patients, aged 18-56 years, underwent hydrocelectomy as an outpatient procedure using a 2-cm scrotal skin incision and excision of only a small disk of the parietal tunica vaginalis. The operative time was 12-18 minutes (mean 15). The outcome measures included patient satisfaction and postoperative complications. The procedure required only minor dissection and minimal manipulation, resulted in no recurrences and minimal complications, and had a short operative time. Copyright © 2011 Elsevier Inc. All rights reserved.
Halling, V W; Jones, M F; Bestrom, J E; Wold, A D; Rosenblatt, J E; Smith, T F; Cockerill, F R
1999-10-01
Recently, a treponema-specific immunoglobulin G (IgG) enzyme immunoassay (EIA), the CAPTIA Syphilis-G (Trinity Biotech, Jamestown, N.Y.), has become available as a diagnostic test for syphilis. A total of 89 stored sera previously tested by the fluorescent treponemal antibody absorption (FTA-ABS) IgG assay were evaluated by the CAPTIA EIA. The FTA-ABS IgG procedure was performed by technologists unblinded to the results of rapid plasma reagin (RPR) testing of the same specimens. Borderline CAPTIA-positive samples (antibody indices of ≥0.650 and ≤0.900) were retested; if the second analysis produced an index of >0.900, the sample was considered positive. Thirteen of 89 (15%) samples had discrepant results. Compared to the FTA-ABS assay, the CAPTIA EIA had a sensitivity, specificity, and positive and negative predictive values of 70.7, 97.9, 96.7, and 79.7%, respectively. In another analysis, discrepancies between results were resolved by repeated FTA-ABS testing (with technologists blinded to previous RPR results) and patient chart reviews. Seven CAPTIA-negative samples which were previously interpreted (unblinded) as minimally reactive by the FTA method were subsequently interpreted (blinded) as nonreactive. One other discrepant sample (CAPTIA negative and FTA-ABS positive [at an intensity of 3+], unblinded) was FTA negative on repeated testing (blinded). For the five remaining discrepant samples, chart reviews indicated that one patient (CAPTIA negative and FTA-ABS positive [minimally reactive], blinded) had possible syphilis. These five samples were also evaluated and found to be negative by another treponema-specific test, the Treponema pallidum microhemagglutination assay. Therefore, after repeated testing and chart reviews, 2 of the 89 (2%) samples had discrepant results; the adjusted sensitivity, specificity, and positive and negative predictive values were 96.7, 98.3, 96.7, and 98.3%, respectively.
This study demonstrates that the CAPTIA IgG EIA is a reliable method for syphilis testing and that personnel performing tests which require subjective interpretation, like the FTA-ABS test, may be biased by RPR test results.
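The four reported metrics follow from a standard 2×2 comparison table. The counts below are one hypothetical set back-solved to be consistent with the abstract's 89 samples, 13 discrepancies, and quoted percentages; they are not taken from the paper itself.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 comparison table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts consistent with the abstract: 89 samples total,
# 13 discrepant (1 false positive + 12 false negatives vs. FTA-ABS)
m = diagnostic_metrics(tp=29, fp=1, fn=12, tn=47)
print({k: round(100 * v, 1) for k, v in m.items()})
# -> sensitivity 70.7, specificity 97.9, PPV 96.7, NPV 79.7
```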
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Low-Friction, High-Stiffness Joint for Uniaxial Load Cell
NASA Technical Reports Server (NTRS)
Lewis, James L.; Le, Thang; Carroll, Monty B.
2007-01-01
A universal-joint assembly has been devised for transferring axial tension or compression to a load cell. To maximize measurement accuracy, the assembly is required to minimize any moments and non-axial forces on the load cell and to exhibit little or no hysteresis. The requirement to minimize hysteresis translates to a requirement to maximize axial stiffness (including minimizing backlash) and a simultaneous requirement to minimize friction. In practice, these are competing requirements, encountered repeatedly in efforts to design universal joints. Often, universal-joint designs represent compromises between these requirements. The improved universal-joint assembly contains two universal joints, each containing two adjustable pairs of angular-contact ball bearings. One might be tempted to ask why one could not use simple ball-and-socket joints rather than something as complex as universal joints containing adjustable pairs of angular-contact ball bearings. The answer is that ball-and-socket joints do not offer sufficient latitude to trade stiffness versus friction: the inevitable result of an attempt to make such a trade in a ball-and-socket joint is either too much backlash or too much friction. The universal joints are located at opposite ends of an axial subassembly that contains the load cell. The axial subassembly includes an axial shaft, an axial housing, and a fifth adjustable pair of angular-contact ball bearings that allows rotation of the axial housing relative to the shaft. The preload on each pair of angular-contact ball bearings can be adjusted to obtain the required stiffness with minimal friction, tailored for a specific application. The universal joint at each end affords two degrees of freedom, allowing only axial force to reach the load cell regardless of application of moments and non-axial forces. The rotational joint on the axial subassembly affords a fifth degree of freedom, preventing application of a torsion load to the load cell.
Preservation of samples for dissolved mercury
Hamlin, S.N.
1989-01-01
Water samples for dissolved mercury require special treatment because of the high chemical mobility and volatility of this element. Widespread use of mercury and its compounds has provided many avenues for contamination of water. Two laboratory tests were done to determine the relative permeabilities of glass and plastic sample bottles to mercury vapor. Plastic containers were confirmed to be quite permeable to airborne mercury; glass containers were virtually impermeable. Methods of preservation include the use of various combinations of acids, oxidants, and complexing agents. The combination of nitric acid and potassium dichromate successfully preserved mercury in a large variety of concentrations and dissolved forms. Because this acid-oxidant preservative acts as a sink for airborne mercury and plastic containers are permeable to mercury vapor, glass bottles are preferred for sample collection. To maintain a healthy work environment and minimize the potential for contamination of water samples, mercury and its compounds are isolated from the atmosphere while in storage. Concurrently, a program to monitor environmental levels of mercury vapor in areas of potential contamination is needed to define the extent of mercury contamination and to assess the effectiveness of mercury clean-up procedures.
Bryce, Thomas N.; Dijkers, Marcel P.
2015-01-01
Background: Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. Objective: To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. Methods: A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort were quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Results: Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. Conclusion: This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device. PMID:26364280
Separation-Compliant, Optimal Routing and Control of Scheduled Arrivals in a Terminal Airspace
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.; Davis, Damek; Isaacson, Douglas R.
2013-01-01
We address the problem of navigating a set (fleet) of aircraft in an aerial route network so as to bring each aircraft to its destination at a specified time and with minimal distance separation assured between all aircraft at all times. The speed range, initial position, required destination, and required time of arrival at destination for each aircraft are assumed provided. Each aircraft's movement is governed by a controlled differential equation (state equation). The problem consists in choosing for each aircraft a path in the route network and a control strategy so as to meet the constraints and reach the destination at the required time. The main contribution of the paper is a model that allows this problem to be recast as a decoupled collection of problems in classical optimal control and is easily generalized to the case when inertia cannot be neglected. Some qualitative insight into solution behavior is obtained using the Pontryagin Maximum Principle. Sample numerical solutions are computed using a numerical optimal control solver. The proposed model is a first step toward increasing the fidelity of continuous time control models of air traffic in a terminal airspace. The Pontryagin Maximum Principle implies the polygonal shape of those portions of the state trajectories away from those states in which one or more aircraft pairs are at minimal separation. The model also confirms the intuition that the narrower the allowed speed ranges of the aircraft, the smaller the space of optimal solutions, and that an instance of the optimal control problem may not have a solution at all (i.e., no control strategy that meets the separation requirement and other constraints).
Kozlowski, Allan J; Bryce, Thomas N; Dijkers, Marcel P
2015-01-01
Powered exoskeletons have been demonstrated as being safe for persons with spinal cord injury (SCI), but little is known about how users learn to manage these devices. To quantify the time and effort required by persons with SCI to learn to use an exoskeleton for assisted walking. A convenience sample was enrolled to learn to use the first-generation Ekso powered exoskeleton to walk. Participants were given up to 24 weekly sessions of instruction. Data were collected on assistance level, walking distance and speed, heart rate, perceived exertion, and adverse events. Time and effort were quantified by the number of sessions required for participants to stand up, walk for 30 minutes, and sit down, initially with minimal and subsequently with contact guard assistance. Of 22 enrolled participants, 9 screen-failed, and 7 had complete data. All of these 7 were men; 2 had tetraplegia and 5 had motor-complete injuries. Of these, 5 participants could stand, walk, and sit with contact guard or close supervision assistance, and 2 required minimal to moderate assistance. Walk times ranged from 28 to 94 minutes with average speeds ranging from 0.11 to 0.21 m/s. For all participants, heart rate changes and reported perceived exertion were consistent with light to moderate exercise. This study provides preliminary evidence that persons with neurological weakness due to SCI can learn to walk with little or no assistance and light to somewhat hard perceived exertion using a powered exoskeleton. Persons with different severities of injury, including those with motor complete C7 tetraplegia and motor incomplete C4 tetraplegia, may be able to learn to use this device.
A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0
NASA Technical Reports Server (NTRS)
DeChant, Lawrence J.; Nadell, Shari-Beth
1999-01-01
A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.
Chemical fractionation-enhanced structural characterization of marine dissolved organic matter
NASA Astrophysics Data System (ADS)
Arakawa, N.; Aluwihare, L.
2016-02-01
Describing the molecular fingerprint of dissolved organic matter (DOM) requires sample processing methods and separation techniques that can adequately minimize its complexity. We have employed acid hydrolysis as a way to make the subcomponents of marine solid-phase-extracted (PPL) DOM more accessible to analytical techniques. Using a combination of NMR and chemical derivatization or reduction analyzed by comprehensive two-dimensional gas chromatography (GC×GC), we observed chemical features strikingly similar to terrestrial DOM. In particular, we observed reduced alicyclic hydrocarbons believed to be the backbone of previously identified carboxylic-rich alicyclic material (CRAM). Additionally, we found carbohydrates, amino acids, and small lipids and acids.
Lebedev, Vyacheslav; Bartlett, Joshua H.; Malyzhenkov, Alexander; ...
2017-12-06
Here, we present a novel compact design for a multichannel atomic oven which generates collimated beams of refractory atoms for fieldable laser spectroscopy. Using this resistively heated crucible, we demonstrate spectroscopy of an erbium sample at 1300 °C with improved isotopic resolution with respect to a single-channel design. In addition, our oven has a high thermal efficiency. By minimizing the surface area of the crucible, we achieve 2000 °C at 140 W of applied electrical power. As a result, the design does not require any active cooling and is compact enough to allow for its incorporation into fieldable instruments.
On framing the research question and choosing the appropriate research design.
Parfrey, Patrick S; Ravani, Pietro
2015-01-01
Clinical epidemiology is the science of human disease investigation with a focus on diagnosis, prognosis, and treatment. The generation of a reasonable question requires definition of patients, interventions, controls, and outcomes. The goal of research design is to minimize error, to ensure adequate samples, to measure input and output variables appropriately, to consider external and internal validities, to limit bias, and to address clinical as well as statistical relevance. The hierarchy of evidence for clinical decision-making places randomized controlled trials (RCT) or systematic review of good quality RCTs at the top of the evidence pyramid. Prognostic and etiologic questions are best addressed with longitudinal cohort studies.
Reliability-Based Control Design for Uncertain Systems
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.
2005-01-01
This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.
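The reliability metric at the core of this approach, the probability of violating a design requirement, is often estimated by sampling. Below is a minimal Monte Carlo version on a toy first-order system invented purely for illustration; the paper's hybrid deterministic-sampling/asymptotic estimator is more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(5)

def violation_probability(gain, n_samples=50000):
    """Monte Carlo estimate of P(requirement violated) for a made-up
    first-order plant: closed-loop pole at -(a + gain) with uncertain
    parameter a ~ N(1, 0.3); the requirement is a settling-rate
    constraint pole < -1.5."""
    a = rng.normal(1.0, 0.3, size=n_samples)
    return float(np.mean(-(a + gain) >= -1.5))

# A designer would pick the gain to drive this probability down
probs = {g: violation_probability(g) for g in (0.3, 0.6, 0.9)}
print(probs)   # higher gain -> lower violation probability
```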
On framing the research question and choosing the appropriate research design.
Parfrey, Patrick; Ravani, Pietro
2009-01-01
Clinical epidemiology is the science of human disease investigation with a focus on diagnosis, prognosis, and treatment. The generation of a reasonable question requires the definition of patients, interventions, controls, and outcomes. The goal of research design is to minimize error, ensure adequate samples, measure input and output variables appropriately, consider external and internal validities, limit bias, and address clinical as well as statistical relevance. The hierarchy of evidence for clinical decision making places randomized controlled trials (RCT) or systematic review of good quality RCTs at the top of the evidence pyramid. Prognostic and etiologic questions are best addressed with longitudinal cohort studies.
Minimization of reflection cracks in flexible pavements.
DOT National Transportation Integrated Search
1977-01-01
This report describes the performance of fabrics used under overlays in an effort to minimize longitudinal and alligator cracking in flexible pavements. It is concluded, although the sample size is small, that the treatments will extend the pavement ...
NASA Astrophysics Data System (ADS)
Imada, Keita; Nakamura, Katsuhiko
This paper describes recent improvements to the Synapse system for incremental learning of general context-free grammars (CFGs) and definite clause grammars (DCGs) from positive and negative sample strings. An important feature of our approach is incremental learning, which is realized by a rule generation mechanism called “bridging,” based on bottom-up parsing of positive samples, together with a search over rule sets. The sizes of the rule sets and the computation time depend on the search strategy. In addition to the global search for synthesizing minimal rule sets and the serial search, another method for synthesizing semi-optimum rule sets, we incorporate beam search into the system for synthesizing semi-minimal rule sets. The paper shows several experimental results on learning CFGs and DCGs, and we analyze the sizes of the rule sets and the computation time.
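Beam search itself is generic: keep only the k best candidates at each expansion step. The sketch below applies it to a toy numeric search rather than grammar-rule synthesis (the Synapse rule representation is not given in the abstract); the expand and score callbacks are where a rule-set search would plug in.

```python
def beam_search(start, expand, score, beam_width, steps):
    """Generic beam search: keep only the beam_width best candidates per
    step, remembering the best state seen so far. A minimal stand-in for
    the rule-set search described above; expand/score are problem-specific."""
    frontier = [start]
    best = start
    for _ in range(steps):
        candidates = [c for s in frontier for c in expand(s)]
        if not candidates:
            break
        frontier = sorted(candidates, key=score)[:beam_width]
        if score(frontier[0]) < score(best):
            best = frontier[0]
    return best

# Toy search problem (not grammar learning): reach 41 from 1 via +3 / *2
# moves, scoring each candidate by its distance to the target.
target = 41
best = beam_search(
    start=(1, ()),
    expand=lambda s: [(s[0] + 3, s[1] + ("+3",)), (s[0] * 2, s[1] + ("*2",))],
    score=lambda s: abs(s[0] - target),
    beam_width=4,
    steps=8,
)
print(best)   # hits the target exactly, with the move sequence used
```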
Interpretive biases in chronic insomnia: an investigation using a priming paradigm.
Ree, Melissa J; Harvey, Allison G
2006-09-01
Disorder-congruent interpretations of ambiguous stimuli characterize several psychological disorders and have been implicated in their maintenance. Models of insomnia have highlighted the importance of cognitive processes, but the possibility that biased interpretations are important has been minimally investigated. Hence, a priming methodology was employed to investigate the presence of an interpretive bias in insomnia. A sample of 78 participants, differing in the presence of a diagnosis of insomnia, severity of sleep disturbance, and sleepiness, was required to read ambiguous sentences and make a lexical decision about target words that followed. Sleepiness at the time of the experiment was associated with the likelihood with which participants made insomnia- and threat-consistent interpretations of ambiguous sentences. The results suggest that there is a general bias towards threatening interpretations when individuals are sleepy, and that cognitive accounts of insomnia require revision to include a role for interpretive bias when people are sleepy. Future research is required to investigate whether this interpretive bias plays a causal role in the maintenance of insomnia.
Lower Limits on Aperture Size for an ExoEarth Detecting Coronagraphic Mission
NASA Technical Reports Server (NTRS)
Stark, Christopher C.; Roberge, Aki; Mandell, Avi; Clampin, Mark; Domagal-Goldman, Shawn D.; McElwain, Michael W.; Stapelfeldt, Karl R.
2015-01-01
The yield of Earth-like planets will likely be a primary science metric for future space-based missions that will drive telescope aperture size. Maximizing the exoEarth candidate yield is therefore critical to minimizing the required aperture. Here we describe a method for exoEarth candidate yield maximization that simultaneously optimizes, for the first time, the targets chosen for observation, the number of visits to each target, the delay time between visits, and the exposure time of every observation. This code calculates both the detection time and multiwavelength spectral characterization time required for planets. We also refine the astrophysical assumptions used as inputs to these calculations, relying on published estimates of planetary occurrence rates as well as theoretical and observational constraints on terrestrial planet sizes and classical habitable zones. Given these astrophysical assumptions, optimistic telescope and instrument assumptions, and our new completeness code that produces the highest yields to date, we suggest lower limits on the aperture size required to detect and characterize a statistically motivated sample of exoEarths.
Cash Management in the United States Marine Corps.
1984-12-01
procedures and requires that such departments and agencies conduct financial activities in a manner that will make cash holding requirements...balances so as to minimize the overall cost of holding cash" [Ref. 3: 2]. Simply stated, effective cash management implies the minimization of cash...balances held, as opposed to invested, as well as timely receipt and disbursement of government funds. B. ORGANIZATION RESPONSIBILITY FOR FINANCIAL
ERIC Educational Resources Information Center
Kroeze, Willemieke; Oenema, Anke; Dagnelie, Pieter C.; Brug, Johannes
2008-01-01
This study investigated the minimally required feedback elements of a computer-tailored dietary fat reduction intervention needed to be effective in improving fat intake. In all, 588 healthy Dutch adults were randomly allocated to one of four conditions in a randomized controlled trial: (i) feedback on dietary fat intake [personal feedback (P feedback)],…
The ethical use of existing samples for genome research.
Bathe, Oliver F; McGuire, Amy L
2009-10-01
Modern biobanking efforts consist of prospective collections of tissues linked to clinical data for patients who have given informed consent for the research use of their specimens and data, including their DNA. In such efforts, patient autonomy and privacy are well respected because of the prospective nature of the informed consent process. However, one of the richest sources of tissue for research continues to be the millions of archived samples collected by pathology departments during normal clinical care or for research purposes without specific consent for future research or genetic analysis. Because specific consent was not obtained a priori, issues related to individual privacy and autonomy are much more complicated. A framework for accessing these existing samples and related clinical data for research is presented. Archival tissues may be accessed only when there is a reasonable likelihood of generating beneficial and scientifically valid information. To minimize risks, databases containing information related to the tissue and to clinical data should be coded, no personally identifying phenotypic information should be included, and access should be restricted to bona fide researchers for legitimate research purposes. These precautions, if implemented appropriately, should ensure that the research use of archival tissue and data is no more than minimal risk. A waiver of the requirement for informed consent would then be justified if reconsent is shown to be impracticable. A waiver of consent should not be granted, however, if there is a significant risk to privacy, if the proposed research use is inconsistent with the original consent (where there is one), or if the potential harm from a privacy breach is considerable.
Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H.M.; Rogan, Peter K.
2017-01-01
Accurate digital image analysis of abnormal microscopic structures relies on high quality images and on minimizing the rates of false positive (FP) and negative objects in images. Cytogenetic biodosimetry detects dicentric chromosomes (DCs) that arise from exposure to ionizing radiation, and determines radiation dose received based on DC frequency. Improvements in automated DC recognition increase the accuracy of dose estimates by reclassifying FP DCs as monocentric chromosomes or chromosome fragments. We also present image segmentation methods to rank high quality digital metaphase images and eliminate suboptimal metaphase cells. A set of chromosome morphology segmentation methods selectively filtered out FP DCs arising primarily from sister chromatid separation, chromosome fragmentation, and cellular debris. This reduced FPs by an average of 55% and was highly specific to these abnormal structures (≥97.7%) in three samples. Additional filters selectively removed images with incomplete, highly overlapped, or missing metaphase cells, or with poor overall chromosome morphologies that increased FP rates. Image selection is optimized and FP DCs are minimized by combining multiple feature based segmentation filters and a novel image sorting procedure based on the known distribution of chromosome lengths. Applying the same image segmentation filtering procedures to both calibration and test samples reduced the average dose estimation error from 0.4 Gy to <0.2 Gy, obviating the need to first manually review these images. This reliable and scalable solution enables batch processing for multiple samples of unknown dose, and meets current requirements for triage radiation biodosimetry of high quality metaphase cell preparations. PMID:29026522
Lian, Ru; Wu, Zhongping; Lv, Xiaobao; Rao, Yulan; Li, Haiyang; Li, Jinghua; Wang, Rong; Ni, Chunfang; Zhang, Yurong
2017-10-01
An increase in cases involving drugs of abuse places a heavy burden on law enforcement agencies, exacerbating the demand for rapid screening techniques. In this study, atmospheric pressure ionization technologies, including a direct analysis in real time (DART) ion source coupled to a time-of-flight mass spectrometer (DART-TOF-MS) as well as dopant-assisted positive photoionization ion mobility spectrometry (DAPP-IMS) without radioactivity, were utilized together as a powerful analytical tool for the rapid screening and identification of 53 abused drugs. The limits of detection (LOD) were 0.05-2 μg/mL when using DART-TOF-MS and 0.02-2 μg when using DAPP-IMS, which could satisfy the actual requirements in the forensic science laboratory. The advantages of this method included fast response, high-throughput potential, high specificity, and minimal sample preparation. A screening library of reduced mobility (K0), accurate mass of informative precursor ions ([M+H]+), and fragment ions was established by employing a bench-top DAPP-IMS and TOF-MS in-source collision-induced dissociation (CID) mode. A standardized screening procedure was then developed, with criteria for the confirmation of positive results. A total of 50 seized drug samples provided by a local forensic laboratory were analyzed to verify the utility of the method. This study suggests that a method combining DART-TOF-MS and DAPP-IMS is promising for the rapid screening and identification of abused drugs with minimal sample preparation and without chromatography. Copyright © 2017 Elsevier B.V. All rights reserved.
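The dual-technique confirmation step described above (a positive screen requires agreement of both the reduced mobility and the precursor mass) can be sketched as a simple tolerance lookup. The library entries, field layout, and tolerance values below are invented for illustration and are not taken from the study:

```python
def match_drug(k0, mz, library, k0_tol=0.02, mz_tol=0.005):
    """Return library drugs whose reduced mobility (K0) and precursor m/z
    both fall within tolerance of the measured values; requiring both to
    agree mimics a two-technique confirmation criterion."""
    return [name for name, (lib_k0, lib_mz) in library.items()
            if abs(k0 - lib_k0) <= k0_tol and abs(mz - lib_mz) <= mz_tol]

# Hypothetical entries: (reduced mobility K0 in cm^2/V/s, [M+H]+ m/z)
library = {
    "methamphetamine": (1.63, 150.128),
    "ketamine":        (1.48, 238.099),
    "heroin":          (1.04, 370.165),
}
hits = match_drug(1.64, 150.130, library)
```

A measurement matching one entry on both dimensions yields a single hit; a measurement matching neither dimension screens negative.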
NASA Astrophysics Data System (ADS)
Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl
2018-06-01
In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
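The baseline this work improves on can be sketched for a toy problem. Everything below — the linear forward model, the standard-normal prior, the noise level, and the sample counts — is an illustrative assumption, not the paper's setup, and the sketch shows only the plain double-loop Monte Carlo estimator, not the Laplace-based importance sampling:

```python
import numpy as np

rng = np.random.default_rng(0)
NOISE = 0.1  # assumed observation noise standard deviation

def loglik(y, theta, xi):
    # Gaussian log-likelihood for the toy forward model y = xi * theta + noise
    return -0.5 * ((y - xi * theta) / NOISE) ** 2 - np.log(NOISE * np.sqrt(2 * np.pi))

def eig_double_loop(xi, n_outer=400, n_inner=400):
    """Plain double-loop Monte Carlo estimate of expected information gain
    for design xi: the outer loop samples (theta, y) from the prior and
    likelihood; the inner loop re-estimates the evidence p(y | xi) from
    fresh prior draws.  The np.mean(np.exp(...)) step is the one that needs
    many samples and can underflow, which motivates importance sampling."""
    total = 0.0
    for _ in range(n_outer):
        theta = rng.standard_normal()                   # prior N(0, 1)
        y = xi * theta + NOISE * rng.standard_normal()  # simulated observation
        inner = rng.standard_normal(n_inner)            # fresh prior samples
        log_evidence = np.log(np.mean(np.exp(loglik(y, inner, xi))))
        total += loglik(y, theta, xi) - log_evidence
    return total / n_outer
```

For this linear-Gaussian toy the exact gain is 0.5 * log(1 + (xi / NOISE)**2), so a stronger design (larger |xi|) should score higher.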
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevcik, R. S.; Hyman, D. A.; Basumallich, L.
2013-01-01
A technique for carbohydrate analysis of bioprocess samples has been developed, providing reduced analysis time compared to current practice in the biofuels R&D community. The Thermofisher CarboPac SA10 anion-exchange column enables isocratic separation of monosaccharides, sucrose, and cellobiose in approximately 7 minutes. Additionally, use of a low-volume (0.2 mL) injection valve in combination with a high-volume detection cell minimizes the extent of sample dilution required to bring sugar concentrations into the linear range of the pulsed amperometric detector (PAD). Three laboratories, representing academia, industry, and government, participated in an interlaboratory study which analyzed twenty-one opportunistic samples representing biomass pretreatment, enzymatic saccharification, and fermentation samples. The technique's robustness, linearity, and interlaboratory reproducibility were evaluated and showed excellent-to-acceptable characteristics. Additionally, quantitation by the CarboPac SA10/PAD was compared with the current practice method utilizing a HPX-87P/RID. While these two methods showed good agreement, a statistical comparison found significant quantitation differences between them, highlighting the difference between selective and universal detection modes.
Inspection error and its adverse effects - A model with implications for practitioners
NASA Technical Reports Server (NTRS)
Collins, R. D., Jr.; Case, K. E.; Bennett, G. K.
1978-01-01
Inspection error has clearly been shown to have adverse effects upon the results desired from a quality assurance sampling plan. These effects upon performance measures have been well documented from a statistical point of view. However, little work has been presented to convince the QC manager of the unfavorable cost consequences resulting from inspection error. This paper develops a very general, yet easily used, mathematical cost model. The basic format of the well-known Guthrie-Johns model is used. However, it is modified as required to assess the effects of attributes sampling errors of the first and second kind. The economic results, under different yet realistic conditions, will no doubt be of interest to QC practitioners who face similar problems daily. Sampling inspection plans are optimized to minimize economic losses due to inspection error. Unfortunately, any error at all results in some economic loss which cannot be compensated for by sampling plan design; however, improvements over plans which neglect the presence of inspection error are possible. Implications for human performance betterment programs are apparent, as are trade-offs between sampling plan modification and inspection and training improvements economics.
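One ingredient of such a model — how misclassification shifts the acceptance probability of an attributes plan — can be illustrated with a short calculation. The plan parameters and error rates below are hypothetical, and this sketch shows only the acceptance-probability component, not the full Guthrie-Johns cost structure:

```python
from math import comb

def accept_prob(n, c, p, e1=0.0, e2=0.0):
    """Probability of accepting a lot under an attributes sampling plan
    (sample n items, accept if at most c are classed defective) when the
    true fraction defective is p, inspectors falsely reject good items with
    probability e1 (error of the first kind), and miss defectives with
    probability e2 (error of the second kind)."""
    # Apparent fraction defective as seen by an error-prone inspector
    pe = p * (1 - e2) + (1 - p) * e1
    return sum(comb(n, k) * pe**k * (1 - pe)**(n - k) for k in range(c + 1))

perfect = accept_prob(50, 2, 0.01)                     # error-free inspection
errored = accept_prob(50, 2, 0.01, e1=0.02, e2=0.05)   # hypothetical error rates
```

With these numbers the apparent defective rate is 0.01 × 0.95 + 0.99 × 0.02 ≈ 0.0293, so the error-prone plan rejects good lots more often — the kind of economic loss the model quantifies.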
Isothermal Amplification Methods for the Detection of Nucleic Acids in Microfluidic Devices
Zanoli, Laura Maria; Spoto, Giuseppe
2012-01-01
Diagnostic tools for biomolecular detection need to fulfill specific requirements in terms of sensitivity, selectivity and high-throughput in order to widen their applicability and to minimize the cost of the assay. The nucleic acid amplification is a key step in DNA detection assays. It contributes to improving the assay sensitivity by enabling the detection of a limited number of target molecules. The use of microfluidic devices to miniaturize amplification protocols reduces the required sample volume and the analysis times and offers new possibilities for the process automation and integration in one single device. The vast majority of miniaturized systems for nucleic acid analysis exploit the polymerase chain reaction (PCR) amplification method, which requires repeated cycles of three or two temperature-dependent steps during the amplification of the nucleic acid target sequence. In contrast, low temperature isothermal amplification methods have no need for thermal cycling thus requiring simplified microfluidic device features. Here, the use of miniaturized analysis systems using isothermal amplification reactions for the nucleic acid amplification will be discussed. PMID:25587397
Cell-free protein synthesis in micro compartments: building a minimal cell from biobricks.
Jia, Haiyang; Heymann, Michael; Bernhard, Frank; Schwille, Petra; Kai, Lei
2017-10-25
The construction of a minimal cell that exhibits the essential characteristics of life is a great challenge in the field of synthetic biology. Assembling a minimal cell requires multidisciplinary expertise from physics, chemistry and biology. Scientists from different backgrounds tend to define the essence of 'life' differently and have thus proposed different artificial cell models possessing one or several essential features of living cells. Using the tools and methods of molecular biology, the bottom-up engineering of a minimal cell appears within reach. However, several challenges still remain. In particular, the integration of individual sub-systems that is required to achieve a self-reproducing cell model presents a complex optimization challenge. For example, multiple self-organisation and self-assembly processes have to be carefully tuned. We review advances and developments of new methods and techniques, for cell-free protein synthesis as well as micro-fabrication, for their potential to resolve challenges and to accelerate the development of minimal cells. Copyright © 2017 Elsevier B.V. All rights reserved.
Analysis of problems and failures in the measurement of soil-gas radon concentration.
Neznal, Martin; Neznal, Matěj
2014-07-01
Long-term experience in the field of soil-gas radon concentration measurements makes it possible to describe and explain the most frequent causes of failures that can appear in practice when various types of measurement methods and soil-gas sampling techniques are used. The concept of minimal sampling depth, which depends on the volume of the soil-gas sample and on the soil properties, is shown in detail. Consideration of the minimal sampling depth at the time of measurement planning allows one to avoid the most common mistakes. The ways to identify influencing parameters, to avoid dilution of soil-gas samples by atmospheric air, and to recognise inappropriate sampling methods are discussed. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.
ERIC Educational Resources Information Center
McCarthy, John C.; And Others
1993-01-01
Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…
Sorzano, Carlos Oscars S; Pérez-De-La-Cruz Moreno, Maria Angeles; Burguet-Castell, Jordi; Montejo, Consuelo; Ros, Antonio Aguilar
2015-06-01
Pharmacokinetics (PK) applications can be seen as a special case of nonlinear, causal systems with memory. There are cases in which prior knowledge exists about the distribution of the system parameters in a population. However, for a specific patient in a clinical setting, we need to determine her system parameters so that the therapy can be personalized. This system identification is often performed by measuring drug concentrations in plasma. The objective of this work is to provide an irregular sampling strategy that minimizes the uncertainty about the system parameters with a fixed number of samples (a cost constraint). We use Monte Carlo simulations to estimate the average Fisher information matrix associated with the PK problem, and then estimate the sampling points that minimize the maximum uncertainty associated with the system parameters (a minimax criterion). The minimization is performed employing a genetic algorithm. We show that such a sampling scheme can be designed in a way that is adapted to a particular patient, that it can accommodate any dosing regimen, and that it allows flexible therapeutic strategies. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
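A much-simplified version of this design problem can be sketched as follows. The one-compartment oral-absorption model, parameter values, noise level, and candidate times are all assumptions for illustration; exhaustive search over time subsets stands in for the paper's genetic algorithm, and finite differences stand in for analytic sensitivities:

```python
import numpy as np
from itertools import combinations

def conc(t, ka, ke, V, dose=100.0):
    # Hypothetical one-compartment oral-absorption concentration curve
    return dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

def fisher(times, theta, sigma=0.5, h=1e-5):
    """Fisher information matrix for additive Gaussian error, using central
    finite differences for the sensitivity of conc() to each parameter."""
    theta = np.asarray(theta, float)
    J = np.empty((len(times), len(theta)))
    for j in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[j] += h
        tm[j] -= h
        J[:, j] = (conc(times, *tp) - conc(times, *tm)) / (2 * h)
    return J.T @ J / sigma**2

def worst_variance(times, theta):
    """Minimax objective: the largest parameter variance on the diagonal
    of the inverse Fisher information matrix."""
    F = fisher(np.asarray(times, float), theta)
    return np.diag(np.linalg.inv(F)).max()

def best_design(candidates, n_samples, theta):
    """Exhaustive search over candidate time subsets — a stand-in for the
    genetic algorithm used in the paper."""
    return min(combinations(candidates, n_samples),
               key=lambda ts: worst_variance(ts, theta))

theta = (1.5, 0.2, 10.0)                      # assumed ka (1/h), ke (1/h), V (L)
candidates = (0.25, 0.5, 1, 2, 4, 8, 12, 24)  # feasible sampling times (h)
design = best_design(candidates, 4, theta)
```

The returned design is the 4-time subset whose worst-case parameter uncertainty is smallest, mirroring the minimax criterion.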
Barca, E; Castrignanò, A; Buttafuoco, G; De Benedetto, D; Passarella, G
2015-07-01
Soil survey is generally time-consuming, labor-intensive, and costly. Optimization of the sampling scheme allows one to reduce the number of sampling points without decreasing, or even while increasing, the accuracy of the investigated attribute. Maps of bulk soil electrical conductivity (ECa) recorded with electromagnetic induction (EMI) sensors could be effectively used to direct soil sampling design for assessing the spatial variability of soil moisture. A protocol using a field-scale bulk ECa survey has been applied in an agricultural field in the Apulia region (southeastern Italy). Spatial simulated annealing was used as a method to optimize the spatial soil sampling scheme, taking into account sampling constraints, field boundaries, and preliminary observations. Three optimization criteria were used: the first criterion (minimization of the mean of the shortest distances, MMSD) optimizes the spreading of the point observations over the entire field by minimizing the expectation of the distance between an arbitrarily chosen point and its nearest observation; the second criterion (minimization of the weighted mean of the shortest distances, MWMSD) is a weighted version of the MMSD, which uses the digital gradient of the grid ECa data as the weighting function; and the third criterion (mean of the average ordinary kriging variance, MAOKV) minimizes the mean kriging estimation variance of the target variable. The last criterion utilizes the variogram model of soil water content estimated in a previous trial. The procedures, or a combination of them, were tested and compared in a real case. Simulated annealing was implemented with the software MSANOS, which is able to define or redesign any sampling scheme by increasing or decreasing the original sampling locations. The output consists of the computed sampling scheme, the convergence time, and the cooling law, which can be an invaluable support to the process of sampling design. The proposed approach has found the optimal solution in a reasonable computation time.
The use of the bulk ECa gradient as an exhaustive variable, known at any node of an interpolation grid, has allowed the optimization of the sampling scheme, distinguishing among areas with different priority levels.
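The MMSD criterion and the spatial simulated annealing loop can be sketched in a few lines. This is not the MSANOS implementation: the unit-square field, grid resolution, jitter size, and cooling schedule below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def mmsd(design, grid):
    """Mean of the shortest distances: the average distance from each grid
    node to its nearest sampling point (the MMSD criterion)."""
    d = np.linalg.norm(grid[:, None, :] - design[None, :, :], axis=2)
    return d.min(axis=1).mean()

def anneal(grid, n_points=10, n_iter=2000, t0=1.0, cooling=0.995):
    """Spatial simulated annealing: jitter one sampling point at a time and
    accept worse designs with a probability that shrinks as the temperature
    cools, keeping the best design seen so far."""
    design = grid[rng.choice(len(grid), n_points, replace=False)]
    cost = mmsd(design, grid)
    best, best_cost, t = design.copy(), cost, t0
    for _ in range(n_iter):
        cand = design.copy()
        i = rng.integers(n_points)
        # Small random move, clipped to stay inside the field boundary
        cand[i] = np.clip(cand[i] + rng.normal(scale=0.05, size=2), 0.0, 1.0)
        c = mmsd(cand, grid)
        if c < cost or rng.random() < np.exp((cost - c) / t):
            design, cost = cand, c
            if c < best_cost:
                best, best_cost = cand.copy(), c
        t *= cooling
    return best, best_cost

# The field: a unit square discretized as a 20 x 20 interpolation grid
xs = np.linspace(0.0, 1.0, 20)
grid = np.array(np.meshgrid(xs, xs)).reshape(2, -1).T
best, best_cost = anneal(grid, n_points=10)
```

The MWMSD variant would simply multiply each node's shortest distance by a weight (e.g., the ECa gradient at that node) before averaging.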
Ho, Sut Kam; Garcia, Dario Machado
2017-04-01
A two-pulse laser-excited atomic fluorescence (LEAF) technique at 193 nm wavelength was applied to the analysis of an indium tin oxide (ITO) layer on polyethylene terephthalate (PET) film. Fluorescence emissions from analytes were induced from plumes generated by the first laser pulse. Using this approach, non-selective LEAF can be accomplished for simultaneous multi-element analysis, and it overcomes the handicap of a strict requirement on the laser excitation wavelength. In this study, experimental conditions including laser fluences, gating times, and the time delay between pulses were optimized to achieve high sensitivity with minimal sample destruction and penetration. With weak laser fluences of 100 and 125 mJ/cm² for the 355 and 193 nm pulses, detection limits were estimated to be 0.10% and 0.43% for Sn and In, respectively. In addition, the relation between fluorescence emissions and the number of laser shots was investigated; reproducible results were obtained for Sn and In. This shows the feasibility of depth profiling by this technique. Morphologies of samples were characterized at various laser fluences and numbers of shots to accurately examine the penetration. Images of craters were also investigated using scanning electron microscopy (SEM). The results demonstrate the imperceptible destruction of the film after a laser shot. With such weak laser fluences and minimal destructiveness, this LEAF technique is suitable for thin-film analysis.
Riley, Paul W.; Gallea, Benoit; Valcour, Andre
2017-01-01
Background: Testing coagulation factor activities requires that multiple dilutions be assayed and analyzed to produce a single result. The slope of the line created by plotting measured factor concentration against sample dilution is evaluated to discern the presence of inhibitors giving rise to nonparallelism. Moreover, samples producing results on initial dilution falling outside the analytic measurement range of the assay must be tested at additional dilutions to produce reportable results. Methods: The complexity of this process has motivated a large clinical reference laboratory to develop advanced computer algorithms with automated reflex testing rules to complete coagulation factor analysis. A method was developed for autoverification of coagulation factor activity using expert rules developed with an off-the-shelf, commercially available data manager system integrated into an automated coagulation platform. Results: Here, we present an approach allowing for the autoverification and reporting of factor activity results with greatly diminished technologist effort. Conclusions: To the best of our knowledge, this is the first report of its kind providing a detailed procedure for implementation of autoverification expert rules as applied to coagulation factor activity testing. Advantages of this system include ease of training for new operators, minimization of technologist time spent, reduction of staff fatigue, minimization of unnecessary reflex tests, optimization of turnaround time, and assurance of the consistency of the testing and reporting process. PMID:28706751
Waste reduction plan for The Oak Ridge National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, R.M.
1990-04-01
The Oak Ridge National Laboratory (ORNL) is a multipurpose Research and Development (R&D) facility. These R&D activities generate numerous small waste streams. Waste minimization is defined as any action that minimizes the volume or toxicity of waste by avoiding its generation or by recycling. This is accomplished by material substitution, changes to processes, or recycling wastes for reuse. Waste reduction is defined as waste minimization plus treatment which results in volume or toxicity reduction. The ORNL Waste Reduction Program will include both waste minimization and waste reduction efforts. Federal regulations, DOE policies and guidelines, increased costs and liabilities associated with the management of wastes, limited disposal options and facility capacities, and public consciousness have been motivating factors for implementing comprehensive waste reduction programs. DOE Order 5820.2A, Section 3.c.2.4 requires DOE facilities to establish an auditable waste reduction program for all LLW generators. In addition, it further states that any new facilities, or changes to existing facilities, incorporate waste minimization into design considerations. A more recent DOE Order, 3400.1, Section 4.b, requires the preparation of a waste reduction program plan which must be reviewed annually and updated every three years. Implementation of a waste minimization program for hazardous and radioactive mixed wastes is cited in DOE Order 5400.3, Section 7.d.5. This document has been prepared to address these requirements. 6 refs., 1 fig., 2 tabs.
Leck, Kira
2006-10-01
Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.
Krleza, Jasna Lenicek; Dorotic, Adrijana; Grzunov, Ana; Maradin, Miljenka
2015-01-01
Capillary blood sampling is a medical procedure aimed at assisting in patient diagnosis, management and treatment, and is increasingly used worldwide, in part because of the increasing availability of point-of-care testing. It is also frequently used to obtain small blood volumes for laboratory testing because it minimizes pain. The capillary blood sampling procedure can influence the quality of the sample as well as the accuracy of test results, highlighting the need for immediate, widespread standardization. A recent nationwide survey of policies and practices related to capillary blood sampling in medical laboratories in Croatia has shown that capillary sampling procedures are not standardized and that only a small proportion of Croatian laboratories comply with guidelines from the Clinical Laboratory Standards Institute (CLSI) or the World Health Organization (WHO). The aim of this document is to provide recommendations for capillary blood sampling. This document has been produced by the Working Group for Capillary Blood Sampling within the Croatian Society of Medical Biochemistry and Laboratory Medicine. Our recommendations are based on existing available standards and recommendations (WHO Best Practices in Phlebotomy, CLSI GP42-A6 and CLSI C46-A2), which have been modified based on local logistical, cultural, legal and regulatory requirements. We hope that these recommendations will be a useful contribution to the standardization of capillary blood sampling in Croatia. PMID:26524965
NASA Astrophysics Data System (ADS)
Roether, Wolfgang; Vogt, Martin; Vogel, Sandra; Sültenfuß, Jürgen
2013-06-01
We present a new method to obtain samples for the measurement of helium isotopes and neon in water, to replace the classical sampling procedure using clamped-off Cu tubing containers that we have been using so far. The new method saves the gas extraction step prior to admission to the mass spectrometer, which the classical method requires. Water is drawn into evacuated glass ampoules with subsequent flame sealing. Approximately 50% headspace is left, from which admission into the mass spectrometer occurs without further treatment. Extensive testing has shown that, with due care and with small corrections applied, the samples represent the gas concentrations in the water within ±0.07% (95% confidence level; ±0.05% with special handling). Fast evacuation is achieved by pumping on a small charge of water placed in the ampoule. The new method was successfully tested at sea in comparison with Cu-tubing sampling. We found that the ampoule samples were superior in data precision and that a lower percentage of samples were lost prior to measurement. Further measurements revealed agreement between the two methods in helium, 3He and neon within ±0.1%. The new method facilitates dealing with large sample sets and minimizes the delay between sampling and measurement. The method is applicable also for gases other than helium and neon.
Liu, Hongtao; Johnson, Jeffrey L.; Koval, Greg; Malnassy, Greg; Sher, Dorie; Damon, Lloyd E.; Hsi, Eric D.; Bucci, Donna Marie; Linker, Charles A.; Cheson, Bruce D.; Stock, Wendy
2012-01-01
Background In the present study, the prognostic impact of minimal residual disease during treatment on time to progression and overall survival was analyzed prospectively in patients with mantle cell lymphoma treated on the Cancer and Leukemia Group B 59909 clinical trial. Design and Methods Peripheral blood and bone marrow samples were collected during different phases of the Cancer and Leukemia Group B 59909 study for minimal residual disease analysis. Minimal residual disease status was determined by quantitative polymerase chain reaction of IgH and/or BCL-1/JH gene rearrangement. Correlation of minimal residual disease status with time to progression and overall survival was determined. In multivariable analysis, minimal residual disease, and other risk factors were correlated with time to progression. Results Thirty-nine patients had evaluable, sequential peripheral blood and bone marrow samples for minimal residual disease analysis. Using peripheral blood monitoring, 18 of 39 (46%) achieved molecular remission following induction therapy. The molecular remission rate increased from 46 to 74% after one course of intensification therapy. Twelve of 21 minimal residual disease positive patients (57%) progressed within three years of follow up compared to 4 of 18 (22%) molecular remission patients (P=0.049). Detection of minimal residual disease following induction therapy predicted disease progression with a hazard ratio of 3.7 (P=0.016). The 3-year probability of time to progression among those who were in molecular remission after induction chemotherapy was 82% compared to 48% in patients with detectable minimal residual disease. The prediction of time to progression by post-induction minimal residual disease was independent of other prognostic factors in multivariable analysis. 
Conclusions Detection of minimal residual disease following induction immunochemotherapy was an independent predictor of time to progression following immunochemotherapy and autologous stem cell transplantation for mantle cell lymphoma. The clinical trial was registered at ClinicalTrials.gov: NCT00020943. PMID:22102709
Design and Sizing of the Air Revitalization System for Altair Lunar Lander
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar
2009-01-01
Designing closed-loop Air Revitalization Systems (ARS) for human spaceflight applications requires a delicate balance between designing for system robustness while minimizing system power and mass requirements. This presentation will discuss the design of the ARS for the Altair Lunar Lander. The presentation will illustrate how dynamic simulations, using Aspen Custom Modeler, were used to develop a system configuration with the ability to control atmospheric conditions under a wide variety of circumstances while minimizing system mass/volume and the impact on overall power requirements for the Lander architecture.
[Surgical renal biopsies: technique, effectiveness and complications].
Pinsach Elías, L; Blasco Casares, F J; Ibarz Servió, L; Valero Milián, J; Areal Calama, J; Bucar Terrades, S; Saladié Roig, J M
1991-01-01
Retrospective study of 140 renal surgical biopsies (RSB) performed over the past 4 years in our Unit. The effectiveness and morbidity of the technique are emphasized, and the surgical technique and type of anaesthesia are described. The sample obtained was sufficient to perform an assay in 100% of cases, and a diagnosis was reached in 98.5%. Thirty-nine patients (27.8%) presented complications, 13 (9.2%) of which were directly related to the surgical technique. No case required blood transfusion and no deaths were reported. The type of anaesthesia used was: local plus sedation in 104 (74.2%) cases, rachianaesthesia in 10 (7.1%) and general in 26 (18.5%). The same approach was used in all patients: minimal subcostal lumbotomy, using Wilde's forceps to obtain the samples. It is believed that RSB is a highly effective, low-mortality procedure, easy and quick to perform, and suitable for selected patients.
Microsystems for the Capture of Low-Abundance Cells
NASA Astrophysics Data System (ADS)
Dharmasiri, Udara; Witek, Małgorzata A.; Adams, Andre A.; Soper, Steven A.
2010-07-01
Efficient selection and enumeration of low-abundance biological cells are highly important in a variety of applications. For example, the clinical utility of circulating tumor cells (CTCs) in peripheral blood is recognized as a viable biomarker for the management of various cancers, in which the clinically relevant number of CTCs per 7.5 ml of blood is two to five. Although there are several methods for isolating rare cells from a variety of heterogeneous samples, such as immunomagnetic-assisted cell sorting and fluorescence-activated cell sorting, they are fraught with challenges. Microsystem-based technologies are providing new opportunities for selecting and isolating rare cells from complex, heterogeneous samples. Such approaches involve reductions in target-cell loss, process automation, and minimization of contamination issues. In this review, we introduce different application areas requiring rare cell analysis, conventional techniques for their selection, and finally microsystem approaches for low-abundance-cell isolation and enumeration.
Optical+Near-IR Bayesian Classification of Quasars
NASA Astrophysics Data System (ADS)
Mehta, Sajjan S.; Richards, G. T.; Myers, A. D.
2011-05-01
We describe the details of an optimal Bayesian classification of quasars with combined optical+near-IR photometry from the SDSS and UKIDSS LAS surveys. Using only deep co-added SDSS photometry from the "Stripe 82" region and requiring full four-band UKIDSS detections, we reliably identify 2665 quasar candidates with a computed efficiency in excess of 99%. Relaxing the data constraints to combinations of two-band detections yields up to 6424 candidates with minimal trade-off in completeness and efficiency. The completeness and efficiency of the sample are investigated with existing spectra from the SDSS, 2SLAQ, and AUS surveys in addition to recent single-slit observations from Palomar Observatory, which revealed 22 quasars from a subsample of 29 high-z candidates. SDSS-III/BOSS observations will allow further exploration of the completeness/efficiency of the sample over 2.2
Singular spectrum analysis in nonlinear dynamics, with applications to paleoclimatic time series
NASA Technical Reports Server (NTRS)
Vautard, R.; Ghil, M.
1989-01-01
Two dimensions of a dynamical system given by experimental time series are distinguished. Statistical dimension gives a theoretical upper bound for the minimal number of degrees of freedom required to describe the attractor up to the accuracy of the data, taking into account sampling and noise problems. The dynamical dimension is the intrinsic dimension of the attractor and does not depend on the quality of the data. Singular Spectrum Analysis (SSA) provides estimates of the statistical dimension. SSA also describes the main physical phenomena reflected by the data. It gives adaptive spectral filters associated with the dominant oscillations of the system and clarifies the noise characteristics of the data. SSA is applied to four paleoclimatic records. The principal climatic oscillations and the regime changes in their amplitude are detected. About 10 degrees of freedom are statistically significant in the data. Large noise and insufficient sample length do not allow reliable estimates of the dynamical dimension.
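The trajectory-matrix construction behind SSA can be sketched in a few lines. The following Python sketch is an illustration only (not the authors' code): it embeds a toy two-oscillation series into a lagged Hankel matrix and computes its singular spectrum; the number of singular values standing clearly above the noise floor plays the role of the statistical dimension discussed above.

```python
import numpy as np

def ssa_spectrum(x, window):
    """Return the singular spectrum of a 1-D time series.

    Embeds the series into a trajectory (Hankel) matrix of lagged
    windows, then takes its singular values.  Dominant singular values
    correspond to the statistically significant degrees of freedom.
    """
    n = len(x)
    k = n - window + 1
    # Trajectory matrix: each row is a lagged window of the series.
    traj = np.array([x[i:i + window] for i in range(k)])
    return np.linalg.svd(traj, compute_uv=False)

# Toy record: two oscillations plus noise, loosely mimicking the
# orbital periodicities seen in paleoclimatic series (assumed values).
rng = np.random.default_rng(0)
t = np.arange(500)
x = np.sin(2 * np.pi * t / 41) + 0.5 * np.sin(2 * np.pi * t / 23)
x += 0.1 * rng.standard_normal(t.size)

s = ssa_spectrum(x, window=60)
print(s[:6] / s[0])
```

Each oscillation contributes a pair of dominant singular values, so here roughly four values stand above the noise floor.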
Scanning electron microscope fine tuning using four-bar piezoelectric actuated mechanism
NASA Astrophysics Data System (ADS)
Hatamleh, Khaled S.; Khasawneh, Qais A.; Al-Ghasem, Adnan; Jaradat, Mohammad A.; Sawaqed, Laith; Al-Shabi, Mohammad
2018-01-01
Scanning electron microscopes are extensively used for accurate micro/nano image exploration. Several strategies have been proposed in the past few years to fine tune these microscopes. This work presents a new fine-tuning strategy for a scanning electron microscope sample table using four-bar piezoelectric-actuated mechanisms. The paper presents an algorithm to find all possible inverse kinematics solutions of the proposed mechanism. In addition, another algorithm is presented to search for the optimal inverse kinematics solution. Both algorithms are used simultaneously in a simulation study to fine tune a scanning electron microscope sample table through a pre-specified circular or linear path of motion. Results of the study show that the proposed algorithms were able to reduce the power required to drive the piezoelectric-actuated mechanism by 97.5% for all simulated paths of motion when compared to a general non-optimized solution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boedicker, J.; Li, L; Kline, T
2008-01-01
This article describes plug-based microfluidic technology that enables rapid detection and drug susceptibility screening of bacteria in samples, including complex biological matrices, without pre-incubation. Unlike conventional bacterial culture and detection methods, which rely on incubation of a sample to increase the concentration of bacteria to detectable levels, this method confines individual bacteria into droplets nanoliters in volume. When single cells are confined into plugs of small volume such that the loading is less than one bacterium per plug, the detection time is proportional to plug volume. Confinement increases cell density and allows released molecules to accumulate around the cell, eliminating the pre-incubation step and reducing the time required to detect the bacteria. We refer to this approach as stochastic confinement. Using the microfluidic hybrid method, this technology was used to determine the antibiogram - or chart of antibiotic sensitivity - of methicillin-resistant Staphylococcus aureus (MRSA) to many antibiotics in a single experiment and to measure the minimal inhibitory concentration (MIC) of the drug cefoxitin (CFX) against this strain. In addition, this technology was used to distinguish between sensitive and resistant strains of S. aureus in samples of human blood plasma. High-throughput microfluidic techniques combined with single-cell measurements also enable multiple tests to be performed simultaneously on a single sample containing bacteria. This technology may provide a method of rapid and effective patient-specific treatment of bacterial infections and could be extended to a variety of applications that require multiple functional tests of bacterial samples on reduced timescales.
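The loading condition of "less than one bacterium per plug" follows Poisson statistics. The minimal sketch below is illustrative only (the mean of 0.3 cells per plug is an assumed value, not taken from the study); it shows why dilute loading makes most occupied plugs single-cell.

```python
import math

def plug_occupancy(mean_cells_per_plug):
    """Poisson probabilities for plug loading under stochastic confinement.

    With a dilute sample, the number of bacteria per nanoliter plug is
    Poisson distributed; keeping the mean below one ensures that most
    occupied plugs hold exactly one cell.
    """
    lam = mean_cells_per_plug
    p_empty = math.exp(-lam)
    p_single = lam * math.exp(-lam)
    p_multi = 1.0 - p_empty - p_single
    return p_empty, p_single, p_multi

# Assumed mean of 0.3 cells per plug.
p0, p1, pm = plug_occupancy(0.3)
print(f"empty={p0:.3f} single={p1:.3f} multiple={pm:.3f}")
# Fraction of occupied plugs that contain a single cell:
print(f"single | occupied = {p1 / (1 - p0):.3f}")
```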
Lin, Meng-Hsien; Anderson, Jonathan; Pinnaratip, Rattapol; Meng, Hao; Konst, Shari; DeRouin, Andrew J.; Rajachar, Rupak
2015-01-01
The degradation behavior of a tissue adhesive is critical to its ability to repair a wound while minimizing prolonged inflammatory response. Traditional degradation tests can be expensive to perform, as they require large numbers of samples. The potential for using magnetoelastic resonant sensors to track bioadhesive degradation behavior was investigated. Specifically, biomimetic poly(ethylene glycol)- (PEG-) based adhesive was coated onto magnetoelastic (ME) sensor strips. Adhesive-coated samples were submerged in solutions buffered at multiple pH levels (5.7, 7.4 and 10.0) at body temperature (37°C) and the degradation behavior of the adhesive was tracked wirelessly by monitoring the changes in the resonant amplitude of the sensors for over 80 days. Adhesive incubated at pH 7.4 degraded over 75 days, which matched previously published data for bulk degradation behavior of the adhesive while utilizing significantly less material (~10³ times less). Adhesive incubated at pH 10.0 degraded within 25 days, while samples incubated at pH 5.7 did not completely degrade even after 80 days of incubation. As expected, the rate of degradation increased with increasing pH, as the rate of ester bond hydrolysis is higher under basic conditions. Because it requires significantly less sample material than traditional methods, the ME sensing technology is highly attractive for fully characterizing the degradation behavior of tissue adhesives in a wide range of physiological conditions. PMID:26087077
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carvalho, M.L.; Amorim, P.; Marques, M.I.M.
1997-04-01
Fucus vesiculosus L. seaweeds from three estuarine stations were analyzed by X-ray fluorescence, providing results for the concentration of total K, Ca, Ti, Mn, Fe, Co, Ni, Cu, Zn, As, Br, Sr, and Pb. Four different structures of the algae (base, stipe, reproductive organs, and growing tips) were analyzed to study the differential accumulation of heavy metals by different parts of Fucus. Some elements (e.g., Cu and Fe) are preferentially accumulated in the base of the algae, whereas others (e.g., As) exhibit higher concentrations in the reproductive organs and growing tips. The pattern of accumulation in different structures is similar for Cu, Zn, and Pb, but for other metals there is considerable variability in accumulation between parts of the plant. This is important in determining which structures of the plant should be used for biomonitoring. For samples collected at stations subject to differing metal loads, the relative elemental composition is approximately constant, notwithstanding significant variation in absolute values. The proportion of metals in Fucus is similar to that found in other estuaries, where metal concentrations are significantly lower. Energy-dispersive X-ray fluorescence has been shown to be a suitable technique for multielement analysis in this type of sample. No chemical pretreatment is required, minimizing sample contamination. The small amount of sample required, and the wide range of elements that can be detected simultaneously make energy-dispersive X-ray fluorescence a valuable tool for pollution studies.
You, David J; Geshell, Kenneth J; Yoon, Jeong-Yeol
2011-10-15
Direct and sensitive detection of foodborne pathogens from fresh produce samples was accomplished using a handheld lab-on-a-chip device, requiring little to no sample processing and enrichment steps for a near-real-time detection and truly field-deployable device. The detection of Escherichia coli K12 and O157:H7 in iceberg lettuce was achieved utilizing optimized Mie light scatter parameters with a latex particle immunoagglutination assay. The system exhibited good sensitivity, with a limit of detection of 10 CFU mL⁻¹ and an assay time of <6 min. Minimal pretreatment with no detrimental effects on assay sensitivity and reproducibility was accomplished with a simple and cost-effective KimWipes filter and disposable syringe. Mie simulations were used to determine the optimal parameters (particle size d, wavelength λ, and scatter angle θ) for the assay that maximize light scatter intensity of agglutinated latex microparticles and minimize light scatter intensity of the tissue fragments of iceberg lettuce, which were experimentally validated. This introduces a powerful method for detecting foodborne pathogens in fresh produce and other potential sample matrices. The integration of a multi-channel microfluidic chip allowed for differential detection of the agglutinated particles in the presence of the antigen, revealing a true field-deployable detection system with decreased assay time and improved robustness over comparable benchtop systems. Additionally, two sample preparation methods were evaluated through simulated field studies based on overall sensitivity, protocol complexity, and assay time. Preparation of the plant tissue sample by grinding resulted in a two-fold improvement in scatter intensity over washing, accompanied with a significant increase in assay time: ∼5 min (grinding) versus ∼1 min (washing). Specificity studies demonstrated binding of E. coli O157:H7 EDL933 to only O157:H7 antibody conjugated particles, with no cross-reactivity to K12.
This suggests the adaptability of the system for use with a wide variety of pathogens, and the potential to detect in a variety of biological matrices with little to no sample pretreatment. Copyright © 2011 Elsevier B.V. All rights reserved.
Bowers, Len; James, Karen; Quirk, Alan; Wright, Steve; Williams, Hilary; Stewart, Duncan
2013-07-01
Although individual conflict and containment events among acute psychiatric inpatients have been studied in some detail, the relationship of these events to each other has not. In particular, little is known about the temporal order of events for individual patients. This study aimed to identify the most common pathways from event to event. A sample of 522 patients was recruited from 84 acute psychiatric wards in 31 hospital locations in London and the surrounding areas during 2009-2010. Data on the order of conflict and containment events were collected for the first two weeks of admission from patients' case notes. Event-to-event transitions were tabulated and depicted diagrammatically. Event types were tested for their most common temporal placing in sequences of events. Most conflict and containment occurs within and between events of the minimal triangle (verbal aggression, de-escalation, and PRN medication), and the majority of these event sequences conclude in no further events; a minority transition to other, more severe, events. Verbal abuse and medication refusal were more likely to start sequences of disturbed behaviour. Training in the prevention and management of violence needs to acknowledge that a gradual escalation of patient behaviour does not always occur. Verbal aggression is a critical initiator of conflict events, and requires more detailed and sustained research on optimal management and prevention strategies. Similar research is required into medication refusal by inpatients.
Soil sampling kit and a method of sampling therewith
Thompson, Cyril V.
1991-01-01
A soil sampling device and a sample containment device for containing a soil sample is disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close-fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed-end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds.
Soil sampling kit and a method of sampling therewith
Thompson, C.V.
1991-02-05
A soil sampling device and a sample containment device for containing a soil sample is disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close-fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed-end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds. 11 figures.
36 CFR 223.218 - Consistency with plans, environmental standards, and other management requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) Minimize soil erosion; (e) Maintain favorable conditions of water flow and quality; (f) Minimize adverse effects on, protect, or enhance other national forest resources, uses, and improvements; and (g) Deposit...
Chuang, Jane C; Emon, Jeanette M Van; Durnford, Joyce; Thomas, Kent
2005-09-15
An enzyme-linked immunosorbent assay (ELISA) method was developed to quantitatively measure 2,4-dichlorophenoxyacetic acid (2,4-D) in human urine. Samples were diluted (1:5) with phosphate-buffered saline containing 0.05% Tween and 0.02% sodium azide, with analysis by a 96-microwell plate immunoassay format. No clean-up was required, as the dilution step minimized sample interferences. Fifty urine samples were received without identifiers from a subset of pesticide applicators and their spouses in an EPA pesticide exposure study (PES) and analyzed by the ELISA method and a conventional gas chromatography/mass spectrometry (GC/MS) procedure. For the GC/MS analysis, urine samples were extracted with acidic dichloromethane (DCM), methylated by diazomethane and fractionated by a Florisil solid phase extraction (SPE) column prior to GC/MS detection. The percent relative standard deviation (%R.S.D.) of the 96-microwell plate triplicate assays ranged from 1.2 to 22% for the urine samples. Day-to-day variation of the assay results was within ±20%. Quantitative recoveries (>70%) of 2,4-D were obtained for the spiked urine samples by the ELISA method. Quantitative recoveries (>80%) of 2,4-D were also obtained for these samples by the GC/MS procedure. The overall method precision for these samples was within ±20% for both the ELISA and GC/MS methods. The estimated quantification limit for 2,4-D in urine was 30 ng/mL by ELISA and 0.2 ng/mL by GC/MS. The higher quantification limit for the ELISA method is partly due to the required 1:5 dilution to remove the urine sample matrix effect. The GC/MS method can accommodate a 10:1 concentration factor (10 mL of urine converted into 1 mL of organic solvent for analysis) but requires extraction, methylation and clean-up on a solid phase column. The immunoassay and GC/MS data were highly correlated, with a correlation coefficient of 0.94 and a slope of 1.00.
Favorable results between the two methods were achieved despite the vast differences in sample preparation. Results indicated that the ELISA method could be used as a high throughput, quantitative monitoring tool for human urine samples to identify individuals with exposure to 2,4-D above the typical background levels.
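The effect of sample preparation on method quantification limits described above is simple arithmetic: dilution raises the effective limit, preconcentration lowers it. In the sketch below, the instrument-level limits (6 ng/mL for ELISA, 2 ng/mL for GC/MS) are hypothetical back-calculated values chosen only to reproduce the reported method limits of 30 and 0.2 ng/mL.

```python
def effective_limit(instrument_limit_ng_ml, prep_factor):
    """Method quantification limit after sample preparation.

    prep_factor < 1 is a dilution (a 1:5 dilution gives 0.2), which
    raises the limit; prep_factor > 1 is a preconcentration (10:1
    gives 10), which lowers it.  Instrument limits here are assumed.
    """
    return instrument_limit_ng_ml / prep_factor

print(effective_limit(6.0, 0.2))   # ELISA with 1:5 dilution -> 30.0 ng/mL
print(effective_limit(2.0, 10.0))  # GC/MS with 10:1 concentration -> 0.2 ng/mL
```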
NASA Astrophysics Data System (ADS)
Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.
2012-12-01
Wood stable isotope ratios (δ13C and δ18O) offer insight into water source and plant water use efficiency (WUE), which in turn provide a glimpse of potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout the southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined recent advances in α-cellulose extraction from plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while reducing labor requirements to 2-3 days per batch of samples.
Direct injection GC method for measuring light hydrocarbon emissions from cooling-tower water.
Lee, Max M; Logan, Tim D; Sun, Kefu; Hurley, N Spencer; Swatloski, Robert A; Gluck, Steve J
2003-12-15
A Direct Injection GC method for quantifying low levels of light hydrocarbons (C6 and below) in cooling water has been developed. It is intended to overcome the limitations of the currently available technology. The principle of this method is to use a stripper column in a GC to strip water from the hydrocarbons prior to entering the separation column. No sample preparation is required since the water sample is introduced directly into the GC. Method validation indicates that the Direct Injection GC method offers approximately 15 min analysis time with excellent precision and recovery. The calibration studies with ethylene and propylene show that both liquid and gas standards are suitable for routine calibration and calibration verification. The sampling method using zero-headspace traditional VOA (Volatile Organic Analysis) vials and a sample chiller has also been validated. It is apparent that the sampling method is sufficient to minimize the potential for losses of light hydrocarbons, and samples can be held at 4 degrees C for up to 7 days with more than 93% recovery. The Direct Injection GC method also offers <1 ppb (w/v) level method detection limits for ethylene, propylene, and benzene. It is superior to the existing El Paso stripper method. In addition to lower detection limits for ethylene and propylene, the Direct Injection GC method quantifies individual light hydrocarbons in cooling water, provides better recoveries, and requires less maintenance and setup costs. Since the instrumentation and supplies are readily available, this technique could easily be established as a standard or alternative method for routine emission monitoring and leak detection of light hydrocarbons in cooling-tower water.
A Simple Application of Compressed Sensing to Further Accelerate Partially Parallel Imaging
Miao, Jun; Guo, Weihong; Narayan, Sreenath; Wilson, David L.
2012-01-01
Compressed Sensing (CS) and partially parallel imaging (PPI) enable fast MR imaging by reducing the amount of k-space data required for reconstruction. Past attempts to combine these two have been limited by the incoherent sampling requirement of CS, since PPI routines typically sample on a regular (coherent) grid. Here, we developed a new method, "CS+GRAPPA," to overcome this limitation. We decomposed sets of equidistant samples into multiple random subsets. Then, we reconstructed each subset using CS, and averaged the results to get a final CS k-space reconstruction. We used both a standard CS reconstruction and an edge- and joint-sparsity-guided CS reconstruction. We tested these intermediate results on both synthetic and real MR phantom data, and performed a human observer experiment to determine the effectiveness of decomposition and to optimize the number of subsets. We then used these CS reconstructions to calibrate the GRAPPA complex coil weights. In vivo parallel MR brain and heart data sets were used. An objective image quality evaluation metric, Case-PDM, was used to quantify image quality. Coherent aliasing and noise artifacts were significantly reduced using two decompositions. More decompositions further reduced coherent aliasing and noise artifacts but introduced blurring. However, the blurring was effectively minimized using our new edge- and joint-sparsity-guided CS with two decompositions. Numerical results on parallel data demonstrated that the combined method greatly improved image quality as compared to standard GRAPPA, on average halving Case-PDM scores across a range of sampling rates. The proposed technique allowed the same Case-PDM scores as standard GRAPPA, using about half the number of samples. We conclude that the new method augments GRAPPA by combining it with CS, allowing CS to work even when the k-space sampling pattern is equidistant. PMID:22902065
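The decomposition step — turning one equidistant (coherent) sampling pattern into several random (incoherent) subsets — can be sketched as follows. This is an illustration of the partitioning idea only; the CS reconstruction of each subset and the GRAPPA calibration are not shown, and the grid size and acceleration factor are assumed values.

```python
import numpy as np

def decompose_equidistant(indices, n_subsets, rng):
    """Randomly partition equidistant k-space sample indices into subsets.

    Each subset forms an incoherent (random) sampling pattern, which is
    what a CS reconstruction requires; the full equidistant grid is not.
    """
    perm = rng.permutation(len(indices))
    return [np.sort(indices[perm[i::n_subsets]]) for i in range(n_subsets)]

# Equidistant acquisition: every 2nd k-space line out of 256 (R = 2).
indices = np.arange(0, 256, 2)
rng = np.random.default_rng(7)
subsets = decompose_equidistant(indices, n_subsets=2, rng=rng)

# The subsets partition the acquired lines: nothing is lost or duplicated.
print([len(s) for s in subsets])
```

In the full method, each subset would be reconstructed with CS and the resulting k-spaces averaged before calibrating the GRAPPA coil weights.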
NanoDrop Microvolume Quantitation of Nucleic Acids
Desjardins, Philippe; Conklin, Deborah
2010-01-01
Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of those that can perform microvolume quantitation. The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results. The need for high-sensitivity fluorescent analysis of limited mass has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with 2 μL of material, allowing fluorescent assay volume requirements to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer. Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer.
These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/ μL to 15,000 ng/ μL with minimal consumption of sample. PMID:21189466
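The broadened concentration range from shorter path lengths follows directly from the Beer-Lambert law. The sketch below uses the standard dsDNA conversion of 50 ng/μL per A260 unit at a 10 mm path (a textbook value, not specific to any instrument); the absorbance values are assumed for illustration.

```python
def dsdna_conc_ng_per_ul(a260, path_length_mm=10.0):
    """Estimate dsDNA concentration from A260 via the Beer-Lambert law.

    Uses the standard conversion of 50 ng/uL per absorbance unit at a
    10 mm path.  Shorter path lengths, as in microvolume instruments,
    scale the measurable range upward: the same reading at a 1 mm path
    corresponds to 10x the concentration.
    """
    factor = 50.0  # ng/uL per A260 unit for dsDNA at a 10 mm path
    return a260 * factor * (10.0 / path_length_mm)

print(dsdna_conc_ng_per_ul(0.5))                      # 10 mm cuvette: 25.0
print(dsdna_conc_ng_per_ul(0.5, path_length_mm=1.0))  # 1 mm microvolume: 250.0
```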
NASA Astrophysics Data System (ADS)
Resano, Martín; Flórez, María del Rosario; Queralt, Ignasi; Marguí, Eva
2015-03-01
This work investigates the potential of high-resolution continuum source graphite furnace atomic absorption spectrometry for the direct determination of Pd, Pt and Rh in two samples of very different nature. While analysis of active pharmaceutical ingredients is straightforward and it is feasible to minimize matrix effects, to the point that calibration can be carried out against aqueous standard solutions, the analysis of used automobile catalysts is more challenging, requiring the addition of a chemical modifier (NH4F·HF) to help in releasing the analytes, a more vigorous temperature program and the use of a solid standard (CRM ERM®-EB504) for calibration. However, in both cases it was possible to obtain accurate results and precision values typically better than 10% RSD in a fast and simple way, while only two determinations are needed for the three analytes, since Pt and Rh can be simultaneously monitored in both types of samples. Overall, the methods proposed seem suited for the determination of these analytes in such types of samples, offering a greener and faster alternative that circumvents the traditional problems associated with sample digestion, requiring only a small amount of sample (0.05 mg per replicate for catalysts, and a few milligrams for the pharmaceuticals) and providing sufficient sensitivity to easily comply with regulations. The LODs achieved were 6.5 μg g⁻¹ (Pd), 8.3 μg g⁻¹ (Pt) and 9.3 μg g⁻¹ (Rh) for catalysts, which decreased to 0.08 μg g⁻¹ (Pd), 0.15 μg g⁻¹ (Pt) and 0.10 μg g⁻¹ (Rh) for pharmaceuticals.
NASA Technical Reports Server (NTRS)
Allton, Judith H.
2012-01-01
The Genesis mission, which captured solar wind samples and returned them to Earth, had very stringent contamination control requirements in order to distinguish the solar atoms from terrestrial ones. Genesis mission goals were to measure solar composition for most of the periodic table, so great care was taken to avoid particulate contamination. Since the number 1 and 2 science goals were to determine the oxygen and nitrogen isotopic composition, organic contamination was minimized by tightly controlling offgassing. The total amount of solar material captured in two years is about 400 micrograms spread across one sq m. The contamination limit requirement for each of C, N, and O was <10^15 atoms/sq cm. For carbon, this is equivalent to 10 ng/sq cm. Extreme vigilance was used in preparing Genesis collectors and cleaning hardware for flight. Surface contamination on polished silicon wafers, measured in the Genesis laboratory, is approximately 10 ng/sq cm.
Statistical analysis of target acquisition sensor modeling experiments
NASA Astrophysics Data System (ADS)
Deaver, Dawne M.; Moyer, Steve
2015-05-01
The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize uncertainty of the physics based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners which recommend the number and types of test samples required to yield a statistically significant result.
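The test-planning question raised above, predicting the minimum number of test subjects, is a standard power calculation. A minimal sketch, assuming a two-sided z-test comparing task-performance proportions under two sensors; the 70% and 80% rates are illustrative, not from any NVESD experiment:

```python
import math
from statistics import NormalDist

def subjects_needed(p1, p2, alpha=0.05, power=0.8):
    """Per-group sample size for a two-sided z-test comparing two task
    performance proportions (normal-approximation formula)."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)        # two-sided significance level
    z_b = z.inv_cdf(power)                # desired statistical power
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# e.g. to distinguish hypothetical identification rates of 70% vs. 80%:
n = subjects_needed(0.70, 0.80)   # 294 observers per group
```

Small performance differences between sensors drive the required sample size up quadratically, which is why this estimate belongs in test planning rather than after the fact.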
Centrifugal sedimentation immunoassays for multiplexed detection of enteric bacteria in ground water
Litvinov, Julia; Moen, Scott T.; Koh, Chung-Yan; ...
2016-01-01
Water-borne pathogens pose a significant threat to the global population, and early detection plays an important role both in making drinking water safe and in the diagnosis and treatment of water-borne diseases. We present an innovative centrifugal microfluidic platform (SpinDx) for detection of bacterial pathogens using bead-based immunoassays. Our approach is based on binding of pathogens to antibody-functionalized capture particles, followed by sedimentation of the particles through density media in a microfluidic disk and quantification by fluorescence microscopy. Our platform is fast (20 min), sensitive (10³ CFU/mL), requires minimal sample preparation, and can detect multiple pathogens simultaneously with sensitivity similar to that required by the EPA. We demonstrate detection of a panel of enteric bacteria (Escherichia coli, Salmonella typhimurium, Shigella, Listeria, and Campylobacter) at concentrations as low as 10³ CFU/mL, or 30 bacteria per reaction.
Proof of concept demonstration of optimal composite MRI endpoints for clinical trials.
Edland, Steven D; Ard, M Colin; Sridhar, Jaiashre; Cobia, Derin; Martersteck, Adam; Mesulam, M Marsel; Rogalski, Emily J
2016-09-01
Atrophy measures derived from structural MRI are promising outcome measures for early phase clinical trials, especially for rare diseases such as primary progressive aphasia (PPA), where the small available subject pool limits our ability to perform meaningfully powered trials with traditional cognitive and functional outcome measures. We investigated a composite atrophy index in 26 PPA participants with longitudinal MRIs separated by two years. Rogalski et al. [Neurology 2014;83:1184-1191] previously demonstrated that atrophy of the left perisylvian temporal cortex (PSTC) is a highly sensitive measure of disease progression in this population and a promising endpoint for clinical trials. Using methods described by Ard et al. [Pharmaceutical Statistics 2015;14:418-426], we constructed a composite atrophy index composed of a weighted sum of volumetric measures of 10 regions of interest within the left perisylvian cortex using weights that maximize signal-to-noise and minimize sample size required of trials using the resulting score. Sample size required to detect a fixed percentage slowing in atrophy in a two-year clinical trial with equal allocation of subjects across arms and 90% power was calculated for the PSTC and optimal composite surrogate biomarker endpoints. The optimal composite endpoint required 38% fewer subjects to detect the same percent slowing in atrophy than required by the left PSTC endpoint. Optimal composites can increase the power of clinical trials and increase the probability that smaller trials are informative, an observation especially relevant for PPA, but also for related neurodegenerative disorders including Alzheimer's disease.
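The weighting scheme can be illustrated in miniature: for a linear composite, the mean-change-to-SD ratio (w·μ)/√(wᵀΣw) is maximized by w ∝ Σ⁻¹μ, and the required sample size scales inversely with the square of that ratio. A toy two-region sketch with illustrative numbers (not values from the study, which used 10 regions):

```python
# Toy two-region version of an optimal composite: w = inverse(cov) @ mu
# maximizes the mean-change-to-SD ratio of the weighted sum.
mu = [1.0, 0.6]                      # hypothetical mean two-year change per region
cov = [[1.0, 0.3], [0.3, 0.8]]       # hypothetical covariance of those changes

det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
inv = [[cov[1][1] / det, -cov[0][1] / det],
       [-cov[1][0] / det, cov[0][0] / det]]
w = [inv[0][0] * mu[0] + inv[0][1] * mu[1],   # optimal weights, up to scale
     inv[1][0] * mu[0] + inv[1][1] * mu[1]]

def snr(weights):
    """Mean change of the composite divided by its standard deviation."""
    mean = weights[0] * mu[0] + weights[1] * mu[1]
    var = (weights[0] * (weights[0] * cov[0][0] + weights[1] * cov[0][1])
           + weights[1] * (weights[0] * cov[1][0] + weights[1] * cov[1][1]))
    return mean / var ** 0.5

# required n per arm scales as 1/snr^2, so the saving over region 1 alone is:
saving = 1 - (snr([1.0, 0.0]) / snr(w)) ** 2
```

With more regions and favorable correlations the saving grows, which is consistent with the 38% reduction reported for the 10-region composite.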
Jones, Sandra R.; Garbarino, John R.
1999-01-01
Graphite furnace-atomic absorption spectrometry (GF-AAS) is a sensitive, precise, and accurate technique that can be used to determine arsenic and selenium in samples of water and sediment. The GF-AAS method has been developed to replace the hydride generation-atomic absorption spectrometry (HG-AAS) methods because the method detection limits are similar, bias and variability are comparable, and interferences are minimal. Advantages of the GF-AAS method include shorter sample preparation time, increased sample throughput from simultaneous multielement analysis, reduced amount of chemical waste, reduced sample volume requirements, increased linear concentration range, and the use of a more accurate digestion procedure. The linear concentration range for arsenic and selenium is 1 to 50 micrograms per liter in solution; the current method detection limit for arsenic in solution is 0.9 microgram per liter; the method detection limit for selenium in solution is 1 microgram per liter. This report describes results that were obtained using stop-flow and low-flow conditions during atomization. The bias and variability of the simultaneous determination of arsenic and selenium by GF-AAS under both conditions are supported with results from standard reference materials--water and sediment, real water samples, and spike recovery measurements. Arsenic and selenium results for all Standard Reference Water Samples analyzed were within one standard deviation of the most probable values. Long-term spike recoveries at 6.25, 25.0, 37.5 micrograms per liter in reagent-, ground-, and surface-water samples for arsenic averaged 103 plus or minus 2 percent using low-flow conditions and 104 plus or minus 4 percent using stop-flow conditions. Corresponding recoveries for selenium were 98 plus or minus 13 percent using low-flow conditions and 87 plus or minus 24 percent using stop-flow conditions. 
Spike recoveries at 25 micrograms per liter in 120 water samples ranged from 97 to 99 percent for arsenic and from 82 to 93 percent for selenium, depending on the flow conditions used. Statistical analysis of dissolved and whole-water recoverable analytical results for the same set of water samples indicated that there is no significant difference between the GF-AAS and HG-AAS methods. Interferences related to various chemical constituents were also identified. Although sulfate and chloride in association with various cations might interfere with the determination of arsenic and selenium by GF-AAS, the use of a magnesium nitrate/palladium matrix modifier and low-flow argon during atomization helped to minimize such interferences. When using stabilized temperature platform furnace conditions where stop flow is used during atomization, the addition of hydrogen (5 percent volume/volume) to the argon minimized chemical interferences. Nevertheless, stop flow during atomization was found to be less effective than low flow in reducing interference effects.
NASA Astrophysics Data System (ADS)
Stadler, Philipp; Farnleitner, Andreas H.; Zessner, Matthias
2016-04-01
This presentation describes in depth how a low-cost micro-computer was used to substantially improve established measuring systems through the construction and implementation of a purpose-built complementary device for on-site sample pretreatment. A fully automated on-site device was developed and field-tested that enables water sampling with simultaneous filtration as well as an effective cleaning procedure for the device's components. The described auto-sampler is controlled by a low-cost one-board computer and designed for sample pretreatment, with minimal sample alteration, to meet requirements of on-site measurement devices that cannot handle coarse suspended solids within the measurement procedure or cycle. The automated sample pretreatment was tested for over one year for rapid, on-site enzymatic activity (beta-D-glucuronidase, GLUC) determination in sediment-laden stream water. The formerly used proprietary sampling set-up was assumed to lead to a significant damping of the measurement signal due to its susceptibility to clogging and to debris and biofilm accumulation. Results show that the installation of the developed apparatus considerably enhanced the error-free running time of connected measurement devices and increased the measurement accuracy to an up-to-now unmatched quality.
Nascimento, Paloma Andrade Martins; Barsanelli, Paulo Lopes; Rebellato, Ana Paula; Pallone, Juliana Azevedo Lima; Colnago, Luiz Alberto; Pereira, Fabíola Manhas Verbi
2017-03-01
This study shows the use of time-domain (TD)-NMR transverse relaxation (T2) data and chemometrics in the nondestructive determination of fat content for powdered food samples such as commercial dried milk products. Most proposed NMR spectroscopy methods for measuring fat content correlate free induction decay or echo intensities with the sample's mass. The need for the sample's mass limits the analytical frequency of NMR determination, because weighing the samples is an additional step in this procedure. Therefore, the method proposed here is based on a multivariate model of T2 decay, measured with Carr-Purcell-Meiboom-Gill pulse sequence and reference values of fat content. The TD-NMR spectroscopy method shows high correlation (r = 0.95) with the lipid content, determined by the standard extraction method of Bligh and Dyer. For comparison, fat content determination was also performed using a multivariate model with near-IR (NIR) spectroscopy, which is also a nondestructive method. The advantages of the proposed TD-NMR method are that it (1) minimizes toxic residue generation, (2) performs measurements with high analytical frequency (a few seconds per analysis), and (3) does not require sample preparation (such as pelleting, needed for NIR spectroscopy analyses) or weighing the samples.
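The idea behind the calibration can be caricatured with synthetic data: a CPMG T2 decay is (roughly) a mixture of a slowly relaxing fat pool and a faster-relaxing remainder, so a decay-derived feature varies with fat content and can be regressed against reference values. The pool T2 values and the single-feature least-squares fit below are illustrative stand-ins for the paper's multivariate chemometric model, not its actual parameters:

```python
import math

def cpmg_decay(fat_frac, times, t2_fat=180.0, t2_rest=35.0):
    """Synthetic CPMG T2 decay (times in ms): mixture of a slowly relaxing
    'fat' pool and a faster-relaxing remainder. T2 values are illustrative."""
    return [fat_frac * math.exp(-t / t2_fat)
            + (1 - fat_frac) * math.exp(-t / t2_rest) for t in times]

times = [i * 4.0 for i in range(1, 120)]           # echo times, ms
fats = [0.05 * i for i in range(1, 11)]            # reference fat fractions, 5-50%
areas = [sum(cpmg_decay(f, times)) for f in fats]  # crude decay-shape feature

# one-variable least squares: fat ~ a * area + b
n = len(fats)
mean_x = sum(areas) / n
mean_y = sum(fats) / n
a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(areas, fats))
     / sum((x - mean_x) ** 2 for x in areas))
b = mean_y - a * mean_x
pred = [a * x + b for x in areas]   # calibration predictions, no weighing step
```

Because the feature depends only on the decay shape and not on sample mass, no weighing step is needed, which is the point of the T2-based approach.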
Electrodrift purification of materials for room temperature radiation detectors
James, R.B.; Van Scyoc, J.M. III; Schlesinger, T.E.
1997-06-24
A method of purifying nonmetallic, crystalline semiconducting materials useful for room temperature radiation detecting devices by applying an electric field across the material is disclosed. The present invention discloses a simple technology for producing purified ionic semiconducting materials, in particular PbI₂ and preferably HgI₂, which produces high yields of purified product, requires minimal handling of the material thereby reducing the possibility of introducing or reintroducing impurities into the material, is easy to control, is highly selective for impurities, retains the stoichiometry of the material and employs neither high temperatures nor hazardous materials such as solvents or liquid metals. An electric field is applied to a bulk sample of the material causing impurities present in the sample to drift in a preferred direction. After all of the impurities have been transported to the ends of the sample the current flowing through the sample, a measure of the rate of transport of mobile impurities, falls to a low, steady state value, at which time the end sections of the sample where the impurities have concentrated are removed leaving a bulk sample of higher purity material. Because the method disclosed here only acts on the electrically active impurities, the stoichiometry of the host material remains substantially unaffected. 4 figs.
Yan, Xu; Zhou, Minxiong; Ying, Lingfang; Yin, Dazhi; Fan, Mingxia; Yang, Guang; Zhou, Yongdi; Song, Fan; Xu, Dongrong
2013-01-01
Diffusion kurtosis imaging (DKI) is a new method of magnetic resonance imaging (MRI) that provides non-Gaussian information that is not available in conventional diffusion tensor imaging (DTI). DKI requires data acquisition at multiple b-values for parameter estimation; this process is usually time-consuming. Therefore, fewer b-values are preferable to expedite acquisition. In this study, we carefully evaluated various acquisition schemas using different numbers and combinations of b-values. The optimized schemas sampled b-values distributed toward the two ends of the range. Compared to conventional schemas using equally spaced b-values (ESB), optimized schemas require fewer b-values to minimize fitting errors in parameter estimation and may thus significantly reduce scanning time. Following a ranked list of optimized schemas resulting from the evaluation, we recommend the 3b schema based on its estimation accuracy and time efficiency, which needs data from only 3 b-values: 0, around 800, and around 2600 s/mm². Analyses using voxel-based analysis (VBA) and region-of-interest (ROI) analysis with human DKI datasets support the use of the optimized 3b (0, 1000, 2500 s/mm²) DKI schema in practical clinical applications. PMID:23735303
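Why three b-values suffice follows from the standard DKI signal model, ln S(b) = ln S0 − bD + b²D²K/6: the log-signal is linear in the three unknowns (ln S0, D, c) with c = D²K/6, so three b-values determine them exactly. A minimal sketch with synthetic data (the tissue parameters below are illustrative, not from the study):

```python
import math

def fit_dki(bvals, signals):
    """Solve ln S(b) = ln S0 - b*D + b^2*c for (ln S0, D, c), where
    c = D^2*K/6 in the standard DKI signal model; then K = 6c/D^2."""
    A = [[1.0, -b, b * b] for b in bvals]
    y = [math.log(s) for s in signals]
    for i in range(3):                     # Gaussian elimination with pivoting
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p], y[i], y[p] = A[p], A[i], y[p], y[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * q for a, q in zip(A[r], A[i])]
            y[r] -= f * y[i]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                    # back-substitution
        x[i] = (y[i] - sum(A[i][j] * x[j] for j in (1, 2) if j > i)) / A[i][i]
    ln_s0, D, c = x
    return math.exp(ln_s0), D, 6 * c / D ** 2

# synthetic signal at the recommended 3b schema (D, K values illustrative)
D_true, K_true, s0 = 1.0e-3, 1.0, 1000.0
b3 = [0.0, 1000.0, 2500.0]                 # b-values, s/mm^2
sig = [s0 * math.exp(-b * D_true + b * b * D_true ** 2 * K_true / 6) for b in b3]
s0_fit, D_fit, K_fit = fit_dki(b3, sig)    # recovers D_true and K_true
```

With noisy data the same linear system is solved in the least-squares sense over more b-values, which is where the choice of schema affects fitting error.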
The laboratory of the 1990s—Planning for total automation
Brunner, Linda A.
1992-01-01
The analytical laboratory of the 1990s must be able to meet and accommodate the rapid evolution of modern-day technology. One such area is laboratory automation. Total automation may be seen as the coupling of computerized sample tracking, electronic documentation and data reduction with automated sample handling, preparation and analysis, resulting in a complete analytical procedure with minimal human involvement. Requirements may vary from one laboratory or facility to another, so the automation has to be flexible enough to cover a wide range of applications, and yet fit into specific niches depending on individual needs. Total automation must be planned for, well in advance, if the endeavour is to be a success. Space, laboratory layout, proper equipment, and the availability and access to necessary utilities must be taken into account. Adequate training and experience of the personnel working with the technology must also be ensured. In addition, responsibilities of installation, programming maintenance and operation have to be addressed. Proper time management and the efficient implementation and use of total automation are also crucial to successful operations. This paper provides insights into laboratory organization and requirements, as well as discussing the management issues that must be faced when automating laboratory procedures. PMID:18924925
Fischer, Jesse R.; Quist, Michael C.
2014-01-01
All freshwater fish sampling methods are biased toward particular species, sizes, and sexes and are further influenced by season, habitat, and fish behavior changes over time. However, little is known about gear-specific biases for many common fish species because few multiple-gear comparison studies exist that have incorporated seasonal dynamics. We sampled six lakes and impoundments representing a diversity of trophic and physical conditions in Iowa, USA, using multiple gear types (i.e., standard modified fyke net, mini-modified fyke net, sinking experimental gill net, bag seine, benthic trawl, boat-mounted electrofisher used diurnally and nocturnally) to determine the influence of sampling methodology and season on fisheries assessments. Specifically, we describe the influence of season on catch per unit effort, proportional size distribution, and the number of samples required to obtain 125 stock-length individuals for 12 species of recreational and ecological importance. Mean catch per unit effort generally peaked in the spring and fall as a result of increased sampling effectiveness in shallow areas and seasonal changes in habitat use (e.g., movement offshore during summer). Mean proportional size distribution decreased from spring to fall for white bass Morone chrysops, largemouth bass Micropterus salmoides, bluegill Lepomis macrochirus, and black crappie Pomoxis nigromaculatus, suggesting selectivity for large and presumably sexually mature individuals in the spring and summer. Overall, the mean number of samples required to sample 125 stock-length individuals was minimized in the fall with sinking experimental gill nets, a boat-mounted electrofisher used at night, and standard modified nets for 11 of the 12 species evaluated. 
Our results provide fisheries scientists with relative comparisons between several recommended standard sampling methods and illustrate the effects of seasonal variation on estimates of population indices that will be critical to the future development of standardized sampling methods for freshwater fish in lentic ecosystems.
Sanderson, Nicholas D.; Atkins, Bridget L.; Brent, Andrew J.; Cole, Kevin; Foster, Dona; McNally, Martin A.; Oakley, Sarah; Peto, Leon; Taylor, Adrian; Peto, Tim E. A.; Crook, Derrick W.; Eyre, David W.
2017-01-01
Culture of multiple periprosthetic tissue samples is the current gold standard for microbiological diagnosis of prosthetic joint infections (PJI). Additional diagnostic information may be obtained through culture of sonication fluid from explants. However, current techniques can have relatively low sensitivity, with prior antimicrobial therapy and infection by fastidious organisms influencing results. We assessed if metagenomic sequencing of total DNA extracts obtained direct from sonication fluid can provide an alternative rapid and sensitive tool for diagnosis of PJI. We compared metagenomic sequencing with standard aerobic and anaerobic culture in 97 sonication fluid samples from prosthetic joint and other orthopedic device infections. Reads from Illumina MiSeq sequencing were taxonomically classified using Kraken. Using 50 derivation samples, we determined optimal thresholds for the number and proportion of bacterial reads required to identify an infection and confirmed our findings in 47 independent validation samples. Compared to results from sonication fluid culture, the species-level sensitivity of metagenomic sequencing was 61/69 (88%; 95% confidence interval [CI], 77 to 94%; for derivation samples 35/38 [92%; 95% CI, 79 to 98%]; for validation samples, 26/31 [84%; 95% CI, 66 to 95%]), and genus-level sensitivity was 64/69 (93%; 95% CI, 84 to 98%). Species-level specificity, adjusting for plausible fastidious causes of infection, species found in concurrently obtained tissue samples, and prior antibiotics, was 85/97 (88%; 95% CI, 79 to 93%; for derivation samples, 43/50 [86%; 95% CI, 73 to 94%]; for validation samples, 42/47 [89%; 95% CI, 77 to 96%]). High levels of human DNA contamination were seen despite the use of laboratory methods to remove it. Rigorous laboratory good practice was required to minimize bacterial DNA contamination. We demonstrate that metagenomic sequencing can provide accurate diagnostic information in PJI.
Our findings, combined with the increasing availability of portable, random-access sequencing technology, offer the potential to translate metagenomic sequencing into a rapid diagnostic tool in PJI. PMID:28490492
GROUND WATER ISSUE: LOW-FLOW (MINIMAL DRAWDOWN) GROUND-WATER SAMPLING PROCEDURES
This paper is intended to provide background information on the development of low-flow sampling procedures and its application under a variety of hydrogeologic settings. The sampling methodology described in this paper assumes that the monitoring goal is to sample monitoring wel...
An interface for the direct coupling of small liquid samples to AMS
Ognibene, T. J.; Thomas, A. T.; Daley, P. F.; ...
2015-05-28
We describe the moving wire interface attached to the 1-MV AMS system at LLNL’s Center for Accelerator Mass Spectrometry for the analysis of nonvolatile liquid samples as either discrete drops or from the direct output of biochemical separatory instrumentation, such as high-performance liquid chromatography (HPLC). Discrete samples containing at least a few tens of nanograms of carbon and as little as 50 zmol of ¹⁴C can be measured with 3–5% precision in a few minutes. The dynamic range of our system spans approximately 3 orders of magnitude. Sample-to-sample memory is minimized by the use of fresh targets for each discrete sample, or by minimizing the amount of carbon present in an HPLC peak containing a significant amount of ¹⁴C. As a result, liquid sample AMS provides a new technology to expand our biomedical AMS program by enabling the capability to measure low-level biochemicals in extremely small samples that would otherwise be inaccessible.
Martini, Marinna A.; Sherwood, Chris; Horwitz, Rachel; Ramsey, Andree; Lightsom, Fran; Lacy, Jessie; Xu, Jingping
2006-01-01
3. preserving minimally processed and partially processed versions of data sets. STG usually deploys ADV and PCADP probes configured as downward looking, mounted on bottom tripods, with the objective of measuring high-resolution near-bed currents. The velocity profiles are recorded with minimal internal data processing. Also recorded are parameters such as temperature, conductivity, optical backscatter, light transmission, and high-frequency pressure. Sampling consists of high-frequency (1–10 Hz) bursts of long duration (5–30 minutes) at regular and recurring intervals for a duration of 1 to 6 months. The result is very large data files, often 500 MB per Hydra, per deployment, in Sontek's compressed binary format. This section introduces the Hydratools toolbox and provides information about the history of the system's development. The USGS philosophy regarding data quality is discussed to provide an understanding of the motivation for creating the system. General information about the following topics will also be discussed: hardware and software required for the system, basic processing steps, limitations of program usage, and features that are unique to the programs.
Evolution of the INMARSAT aeronautical system: Service, system, and business considerations
NASA Technical Reports Server (NTRS)
Sengupta, Jay R.
1995-01-01
A market-driven approach was adopted to develop enhancements to the Inmarsat-Aeronautical system, to address the requirements of potential new market segments. An evolutionary approach and a well-differentiated product/service portfolio were required, to minimize system upgrade costs and maximize market penetration, respectively. The evolved system definition serves to minimize equipment cost/size/mass for short/medium range aircraft, by reducing the antenna gain requirement and relaxing the performance requirements for non-safety-related communications. A validation program involving simulation, laboratory tests, over-satellite tests and flight trials is being conducted to confirm the system definition. Extensive market research has been conducted to determine user requirements and to quantify market demand for future Inmarsat Aero-1 AES, using sophisticated computer-assisted survey techniques.
Turbulence flight director analysis and preliminary simulation
NASA Technical Reports Server (NTRS)
Johnson, D. E.; Klein, R. E.
1974-01-01
A control column and throttle flight director display system is synthesized for use during flight through severe turbulence. The column system is designed to minimize airspeed excursions without overdriving attitude. The throttle system is designed to augment the airspeed regulation and provide an indication of the trim thrust required for any desired flight path angle. Together they form an energy management system to provide harmonious display indications of current aircraft motions and required corrective action, minimize gust upset tendencies, minimize unsafe aircraft excursions, and maintain satisfactory ride qualities. A preliminary fixed-base piloted simulation verified the analysis and provided a shakedown for a more sophisticated moving-base simulation to be accomplished next. This preliminary simulation utilized a flight scenario concept combining piloting tasks, random turbulence, and discrete gusts to create a high but realistic pilot workload conducive to pilot error and potential upset. The turbulence director (energy management) system significantly reduced pilot workload and minimized unsafe aircraft excursions.
Computational design optimization for microfluidic magnetophoresis
Plouffe, Brian D.; Lewis, Laura H.; Murthy, Shashi K.
2011-01-01
Current macro- and microfluidic approaches for the isolation of mammalian cells are limited in both efficiency and purity. In order to design a robust platform for the enumeration of a target cell population, high collection efficiencies are required. Additionally, the ability to isolate pure populations with minimal biological perturbation and efficient off-chip recovery will enable subcellular analyses of these cells for applications in personalized medicine. Here, a rational design approach for a simple and efficient device that isolates target cell populations via magnetic tagging is presented. In this work, two magnetophoretic microfluidic device designs are described, with optimized dimensions and operating conditions determined from a force balance equation that considers two dominant and opposing driving forces exerted on a magnetic-particle-tagged cell, namely, magnetic and viscous drag. Quantitative design criteria for an electromagnetic field displacement-based approach are presented, wherein target cells labeled with commercial magnetic microparticles flowing in a central sample stream are shifted laterally into a collection stream. Furthermore, the final device design is constrained to fit on a standard rectangular glass coverslip (60 (L)×24 (W)×0.15 (H) mm³) to accommodate small sample volume and point-of-care design considerations. The anticipated performance of the device is examined via a parametric analysis of several key variables within the model. It is observed that minimal currents (<500 mA) are required to generate magnetic fields sufficient to separate cells from sample streams flowing at rates as high as 7 mL/h, comparable to the performance of current state-of-the-art magnet-activated cell sorting systems used in clinical settings.
Experimental validation of the presented model illustrates that a device designed according to the derived rational optimization can effectively isolate (∼100%) a magnetic-particle-tagged cell population from a homogeneous suspension, even at low abundance. Overall, this design analysis provides a rational basis for selecting the operating conditions, including chamber and wire geometry, flow rates, and applied currents, for a magnetic-microfluidic cell separation device. PMID:21526007
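The dominant-force balance described above reduces to a short calculation: the magnetic force on a linearly magnetizable bead, F = Vχ/μ₀ · B(dB/dx), equals Stokes drag 6πηrv at the terminal lateral velocity, and the lateral shift is that velocity times the residence time in the channel. A minimal sketch; every number below is illustrative, not one of the paper's design values:

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, T*m/A

def lateral_shift(r_bead, chi, B, dBdx, eta, u_flow, length):
    """Lateral drift of a magnetic-particle-tagged cell in a channel:
    magnetic force on a linearly magnetizable bead, F = V*chi/mu0 * B*dB/dx,
    balanced against Stokes drag 6*pi*eta*r*v (terminal velocity)."""
    vol = 4.0 / 3.0 * math.pi * r_bead ** 3
    f_mag = vol * chi / MU0 * B * dBdx
    v_lat = f_mag / (6 * math.pi * eta * r_bead)   # terminal lateral velocity
    t_res = length / u_flow                        # residence time in channel
    return v_lat * t_res

# illustrative numbers: 2.8 um diameter bead in water, modest field and
# gradient, 2 cm long channel at ~1 mm/s mean flow
shift = lateral_shift(r_bead=1.4e-6, chi=1.0, B=0.01, dBdx=10.0,
                      eta=1.0e-3, u_flow=1.0e-3, length=0.02)
```

The shift comes out on the order of hundreds of micrometres, i.e. enough to carry a tagged cell from the sample stream into the collection stream; this is the kind of parametric trade-off (geometry, flow rate, current) the model explores.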
1993-05-01
obtained to provide a nominal control history. The guidance law is found by minimizing the second variation of the suboptimal trajectory, relating deviations from the suboptimal trajectory to required changes in the nominal control history. The deviations from the suboptimal trajectory, used together with the precomputed gains, determine the change in the nominal control history required to meet the final constraints while minimizing the change in
DOE Office of Scientific and Technical Information (OSTI.GOV)
Filho, Faete J; Tolbert, Leon M; Ozpineci, Burak
2012-01-01
The work developed here proposes a methodology for calculating switching angles for varying DC sources in a multilevel cascaded H-bridge converter. In this approach the required fundamental is achieved, the lower harmonics are minimized, and the system can be implemented in real time with low memory requirements. A genetic algorithm (GA) is the stochastic search method used to solve the set of equations in which the input voltages are the known variables and the switching angles are the unknown variables. With the dataset generated by the GA, an artificial neural network (ANN) is trained to store the solutions without excessive memory storage requirements. This trained ANN then senses the voltage of each cell and produces the switching angles in order to regulate the fundamental at 120 V and eliminate or minimize the low-order harmonics while operating in real time.
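The set of equations being searched is the staircase-modulation Fourier series: for cells with DC voltages V_i and one switching angle θ_i each, harmonic h has peak amplitude (4/(hπ)) Σ V_i cos(hθ_i). The sketch below sets up that cost function and uses a simple stochastic hill-climb as a stand-in for the paper's GA (and omits the ANN stage entirely); the cell voltages are illustrative:

```python
import math, random

def harmonic(vdc, thetas, h):
    """Peak of harmonic h for a cascaded H-bridge under staircase
    modulation: (4/(h*pi)) * sum_i V_i * cos(h * theta_i)."""
    return 4 / (h * math.pi) * sum(v * math.cos(h * t)
                                   for v, t in zip(vdc, thetas))

def cost(vdc, thetas, target):
    """Fundamental error plus magnitudes of the 5th and 7th harmonics."""
    return (abs(harmonic(vdc, thetas, 1) - target)
            + abs(harmonic(vdc, thetas, 5))
            + abs(harmonic(vdc, thetas, 7)))

def search_angles(vdc, target, iters=20000, seed=1):
    """Simple stochastic hill-climb standing in for the paper's GA."""
    rng = random.Random(seed)
    best = sorted(rng.uniform(0, math.pi / 2) for _ in vdc)
    best_c = cost(vdc, best, target)
    for _ in range(iters):
        cand = sorted(min(max(t + rng.gauss(0, 0.05), 0.0), math.pi / 2)
                      for t in best)
        c = cost(vdc, cand, target)
        if c < best_c:
            best, best_c = cand, c
    return best, best_c

# three cells with unequal DC sources; regulate the fundamental at 120 V rms
best_angles, best_cost = search_angles([60.0, 58.0, 62.0], 120 * math.sqrt(2))
```

Because the sources drift, the search must be re-run (or, as in the paper, its solutions stored in a trained network) for each measured set of cell voltages.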
Superiorization with level control
NASA Astrophysics Data System (ADS)
Cegielski, Andrzej; Al-Musallam, Fadhel
2017-04-01
The convex feasibility problem is to find a common point of a finite family of closed convex subsets. In many applications one requires something more, namely finding a common point of closed convex subsets that also minimizes a continuous convex function. The latter requirement leads to the superiorization methodology, which sits between methods for the convex feasibility problem and methods for convex constrained minimization. Inspired by the superiorization idea, we introduce a method which sequentially applies a long-step algorithm to a sequence of convex feasibility problems; the method employs quasi-nonexpansive operators as well as subgradient projections with level control and does not require evaluation of the metric projection. We replace a perturbation of the iterates (applied in the superiorization methodology) by a perturbation of the current level in minimizing the objective function. We consider the method in Euclidean space in order to guarantee strong convergence, although the method is well defined in a Hilbert space.
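The basic superiorization idea, feasibility-seeking projections interleaved with summable objective-reducing perturbations, can be sketched in two dimensions. The half-plane constraints and the objective f(x) = ‖x‖²/2 below are illustrative choices, not the authors' algorithm (which perturbs the level rather than the iterates):

```python
def proj_halfplane(x, a, b):
    """Euclidean projection of x onto the half-plane {y : a.y >= b}."""
    s = a[0] * x[0] + a[1] * x[1] - b
    if s >= 0:
        return x
    n2 = a[0] ** 2 + a[1] ** 2
    return (x[0] - s * a[0] / n2, x[1] - s * a[1] / n2)

def superiorized(x, constraints, beta, iters=50):
    """Cyclic feasibility-seeking projections, perturbed between sweeps by
    a summable step down f(x) = ||x||^2 / 2 (whose gradient is x)."""
    for k in range(iters):
        step = beta * 0.5 ** k                 # summable perturbation sizes
        x = (x[0] - step * x[0], x[1] - step * x[1])
        for a, b in constraints:
            x = proj_halfplane(x, a, b)
    return x

cons = [((1.0, 1.0), 2.0), ((1.0, -1.0), 0.0)]
plain = superiorized((5.0, -4.0), cons, beta=0.0)  # pure feasibility seeking
sup = superiorized((5.0, -4.0), cons, beta=0.5)    # superiorized variant
```

Both runs end at feasible points, but the perturbed run ends at one with a markedly smaller objective value, which is exactly the "something more" superiorization buys over plain feasibility seeking.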
NASA Technical Reports Server (NTRS)
Montesano, P. M.; Cook, B. D.; Sun, G.; Simard, M.; Zhang, Z.; Nelson, R. F.; Ranson, K. J.; Lutchke, S.; Blair, J. B.
2012-01-01
The synergistic use of active and passive remote sensing (i.e., data fusion) demonstrates the ability of spaceborne light detection and ranging (LiDAR), synthetic aperture radar (SAR) and multispectral imagery for achieving the accuracy requirements of a global forest biomass mapping mission. This data fusion approach also provides a means to extend 3D information from discrete spaceborne LiDAR measurements of forest structure across scales much larger than that of the LiDAR footprint. For estimating biomass, these measurements carry a number of errors, including those associated with LiDAR footprint sampling over regional - global extents. A general framework for mapping above ground live forest biomass (AGB) with a data fusion approach is presented and verified using data from NASA field campaigns near Howland, ME, USA, to assess AGB and LiDAR sampling errors across a regionally representative landscape. We combined SAR and Landsat-derived optical (passive optical) image data to identify forest patches, and used image and simulated spaceborne LiDAR data to compute AGB and estimate LiDAR sampling error for forest patches and 100m, 250m, 500m, and 1km grid cells. Forest patches were delineated with Landsat-derived data and airborne SAR imagery, and simulated spaceborne LiDAR (SSL) data were derived from orbit and cloud cover simulations and airborne data from NASA's Laser Vegetation Imaging Sensor (LVIS). At both the patch and grid scales, we evaluated differences in AGB estimation and sampling error from the combined use of LiDAR with both SAR and passive optical and with either SAR or passive optical alone.
This data fusion approach demonstrates that incorporating forest patches into the AGB mapping framework can provide sub-grid forest information for coarser grid-level AGB reporting, and that combining simulated spaceborne LiDAR with SAR and passive optical data is most useful for estimating AGB when measurements from LiDAR are limited, because it reduced forest AGB sampling errors by 15-38%. Furthermore, spaceborne global scale accuracy requirements were achieved. At least 80% of the grid cells at the 100m, 250m, 500m, and 1km grid levels met AGB density accuracy requirements using a combination of passive optical and SAR along with machine learning methods to predict vegetation structure metrics for forested areas without LiDAR samples. Finally, using either passive optical or SAR alone, accuracy requirements were met at the 500m and 250m grid levels, respectively.
Fast de novo discovery of low-energy protein loop conformations.
Wong, Samuel W K; Liu, Jun S; Kou, S C
2017-08-01
In the prediction of protein structure from amino acid sequence, loops are challenging regions for computational methods. Since loops are often located on the protein surface, they can have significant roles in determining protein functions and binding properties. Loop prediction without the aid of a structural template requires extensive conformational sampling and energy minimization, which are computationally difficult. In this article we present a new de novo loop sampling method, the Parallely filtered Energy Targeted All-atom Loop Sampler (PETALS), to rapidly locate low energy conformations. PETALS explores both backbone and side-chain positions of the loop region simultaneously according to the energy function selected by the user, and constructs a nonredundant ensemble of low energy loop conformations using filtering criteria. The method is illustrated with the DFIRE potential and DiSGro energy function for loops, and shown to be highly effective at discovering conformations with near-native (or better) energy. Using the same energy function as the DiSGro algorithm, PETALS samples conformations with both lower RMSDs and lower energies. PETALS is also useful for assessing the accuracy of different energy functions. PETALS runs rapidly, requiring an average time cost of 10 minutes for a length-12 loop on a single 3.2 GHz processor core, comparable to the fastest existing de novo methods for generating an ensemble of conformations. Proteins 2017; 85:1402-1412. © 2017 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Nunez, J. I.; Farmer, J. D.; Sellar, R. G.; Allen, Carlton C.
2010-01-01
To maximize the scientific return, future robotic and human missions to the Moon will need to have in-situ capabilities to enable the selection of the highest-value samples for return to Earth, or to a lunar base, for analysis. In order to accomplish this task efficiently, samples will need to be characterized using a suite of robotic instruments that can provide crucial information about elemental composition, mineralogy, volatiles and ices. Such spatially-correlated data sets, which place mineralogy into a microtextural context, are considered crucial for correct petrogenetic interpretations. Combining microscopic imaging with visible/near-infrared reflectance spectroscopy provides a powerful in-situ approach for obtaining mineralogy within a microtextural context. The approach is non-destructive and requires minimal mechanical sample preparation. This approach provides data sets that are comparable to what geologists routinely acquire in the field using a hand lens and in the lab using thin section petrography, and provides essential information for interpreting the primary formational processes in rocks and soils as well as the effects of secondary (diagenetic) alteration processes. Such observations lay a foundation for inferring geologic histories and provide "ground truth" for similar instruments on orbiting satellites; they support astronaut EVA activities, provide basic information about the physical properties of soils required for assessing associated health risks, and are basic tools in the exploration for in-situ resources to support human exploration of the Moon.
Mars Orbiter Sample Return Power Design
NASA Technical Reports Server (NTRS)
Mardesich, N.; Dawson, S.
1999-01-01
The NASA/JPL 2003/2005 Mars Sample Return (MSR) missions will each have a sample return canister that will be filled with samples cored from the surface of Mars. These spherical canisters will be 14.8 cm in diameter, must be powered only by solar cells on the surface, and must communicate by RF transmission with the recovery vehicle that will come in 2006 or 2009 to retrieve the canister. This paper considers the aspects and conclusions that went into the design of a power system that achieves maximum power with minimum risk. The power output for the spherical orbiting canister was modeled and plotted in various views of the orbit with the SOAP program developed by JPL. The requirements and geometry for a solar array on a sphere are unique and place special constraints on the design. These requirements include: 1) accommodating a lid for sample loading into the canister, which excluded the surface area at the Northern pole of the spherical canister; 2) minimal cell surface coverage (maximum cell efficiency), less than 40%, so that the recovery vehicle can locate the canister by optical techniques; and 3) RF transmission during 50% of the Mars orbit time on any spin axis, which requires optimum placement of the solar cell circuits on the spherical canister. The best configuration would have been a 4.5 volt round cell, but in practice we compromised with six triangular silicon cells connected in series to form a hexagon. These hexagon circuits are mounted onto flat facets cut into the spherical canister. The flats are required in order to maximize power: the surfaces of the cells connected in series must be at the same angle relative to the sun. The flat facets intersect each other to allow twelve circuits evenly spaced just North and twelve circuits just South of the equator of the spherical canister.
Connecting these circuits in parallel allows sufficient power to operate the transmitter at minimum solar exposure (Northern pole of the canister facing the sun). Additional power, as much as 20%, is also generated by the circuits facing Mars due to the planet's albedo.
Feedback Augmented Sub-Ranging (FASR) Quantizer
NASA Technical Reports Server (NTRS)
Guilligan, Gerard
2012-01-01
This innovation is intended to reduce the size, power, and complexity of pipeline analog-to-digital converters (ADCs) that require high resolution and speed along with low power. Digitizers are important components in any application where analog signals (such as light, sound, temperature, etc.) need to be digitally processed. The innovation implements amplification of a sampled residual voltage in a switched capacitor amplifier stage that does not depend on charge redistribution. The result is less sensitive to capacitor mismatches that cause gain errors, which are the main limitation of such amplifiers in pipeline ADCs. The residual errors due to mismatch are reduced by at least a factor of 16, which is equivalent to at least 4 bits of improvement. The settling time is also faster because of a higher feedback factor. In traditional switched capacitor residue amplifiers, closed-loop amplification of a sampled and held residue signal is achieved by redistributing sampled charge onto a feedback capacitor around a high-gain transconductance amplifier. The residual charge that was sampled during the acquisition or sampling phase is stored on two or more capacitors, often equal in value or integral multiples of each other. During the hold or amplification phase, all of the charge is redistributed onto one capacitor in the feedback loop of the amplifier to produce an amplified voltage. The key error source is the non-ideal ratios of feedback and input capacitors caused by manufacturing tolerances, called mismatches. The mismatches cause non-ideal closed-loop gain, leading to higher differential non-linearity. Traditional solutions to the mismatch errors are to use larger capacitor values (than dictated by thermal noise requirements) and/or complex calibration schemes, both of which increase the die size and power dissipation. 
The key features of this innovation are (1) the elimination of the need for charge redistribution to achieve an accurate closed-loop gain of two, (2) a higher feedback factor in the amplifier stage, giving a higher closed-loop bandwidth compared to the prior art, and (3) a reduced requirement for calibration. The accuracy of the new amplifier is mainly limited by the sampling network's parasitic capacitances, which should be minimized in relation to the sampling capacitors.
System health monitoring using multiple-model adaptive estimation techniques
NASA Astrophysics Data System (ADS)
Sifford, Stanley Ryan
Monitoring system health for fault detection and diagnosis by tracking system parameters concurrently with state estimates is approached using a new multiple-model adaptive estimation (MMAE) method. This novel method is called GRid-based Adaptive Parameter Estimation (GRAPE). GRAPE expands existing MMAE methods by using new techniques to sample the parameter space, based on the hypothesis that sample models can be applied and resampled without relying on a predefined set of models. GRAPE is initially implemented in a linear framework using Kalman filter models. A more generalized GRAPE formulation is presented using extended Kalman filter (EKF) models to represent nonlinear systems. GRAPE can handle both time-invariant and time-varying systems, as it is designed to track parameter changes. Two techniques are presented to generate parameter samples for the parallel filter models. The first approach is called selected grid-based stratification (SGBS). SGBS divides the parameter space into equally spaced strata. The second approach uses Latin Hypercube Sampling (LHS) to determine the parameter locations and minimize the total number of required models. LHS is particularly useful when the parameter dimensions grow, since adding more parameters does not require the model count to increase. Each resample is independent of the prior sample set other than the location of the parameter estimate. SGBS and LHS can be used for both the initial sample and subsequent resamples, and resamples are not required to use the same technique. Both techniques are demonstrated in both linear and nonlinear frameworks. The GRAPE framework further formalizes the parameter tracking process through a general approach for nonlinear systems. These additional methods allow GRAPE either to narrow the focus to converged values within a parameter range or to expand the range in the appropriate direction to track parameters outside the current parameter range boundary.
Customizable rules define the specific resample behavior when the GRAPE parameter estimates converge. Convergence itself is determined from the derivatives of the parameter estimates using a simple moving average window to filter out noise. The system can be tuned to match the desired performance goals by making adjustments to parameters such as the sample size, convergence criteria, resample criteria, initial sampling method, resampling method, confidence in prior sample covariances, sample delay, and others.
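The LHS idea central to the method — one stratum per sample in each dimension, so adding parameter dimensions does not force more samples — can be sketched generically (this is textbook LHS, not GRAPE's own implementation; the parameter ranges are hypothetical):

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    # One stratum per sample in each dimension: permute stratum indices
    # independently per dimension, then jitter uniformly within each stratum.
    d = len(bounds)
    strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
    u = (strata + rng.random((n_samples, d))) / n_samples
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(42)
# Three hypothetical parameter ranges; 10 samples cover all 10 strata
# of every dimension, regardless of how many dimensions are added.
samples = latin_hypercube(10, [(0.0, 1.0), (-5.0, 5.0), (100.0, 200.0)], rng)
```

Each of the 10 rows would seed one parallel filter model; a resample simply calls the function again around the current parameter estimate.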
Asymptotically Optimal Motion Planning for Learned Tasks Using Time-Dependent Cost Maps
Bowen, Chris; Ye, Gu; Alterovitz, Ron
2015-01-01
In unstructured environments in people’s homes and workspaces, robots executing a task may need to avoid obstacles while satisfying task motion constraints, e.g., keeping a plate of food level to avoid spills or properly orienting a finger to push a button. We introduce a sampling-based method for computing motion plans that are collision-free and minimize a cost metric that encodes task motion constraints. Our time-dependent cost metric, learned from a set of demonstrations, encodes features of a task’s motion that are consistent across the demonstrations and, hence, are likely required to successfully execute the task. Our sampling-based motion planner uses the learned cost metric to compute plans that simultaneously avoid obstacles and satisfy task constraints. The motion planner is asymptotically optimal and minimizes the Mahalanobis distance between the planned trajectory and the distribution of demonstrations in a feature space parameterized by the locations of task-relevant objects. The motion planner also leverages the distribution of the demonstrations to significantly reduce plan computation time. We demonstrate the method’s effectiveness and speed using a small humanoid robot performing tasks requiring both obstacle avoidance and satisfaction of learned task constraints. Note to Practitioners: Motivated by the desire to enable robots to autonomously operate in cluttered home and workplace environments, this paper presents an approach for intuitively training a robot in a manner that enables it to repeat the task in novel scenarios and in the presence of unforeseen obstacles in the environment. Based on user-provided demonstrations of the task, our method learns features of the task that are consistent across the demonstrations and that we expect should be repeated by the robot when performing the task. We next present an efficient algorithm for planning robot motions to perform the task based on the learned features while avoiding obstacles.
We demonstrate the effectiveness of our motion planner for scenarios requiring transferring a powder and pushing a button in environments with obstacles, and we plan to extend our results to more complex tasks in the future. PMID:26279642
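The core scoring idea — penalize a candidate plan by its Mahalanobis distance from the demonstration distribution in feature space — can be sketched as follows. The feature vectors here (e.g., a plate's tilt and height) are hypothetical placeholders, not the paper's features:

```python
import numpy as np

def mahalanobis_cost(features, demo_features, reg=1e-6):
    # Distance of a candidate's feature vector from the demonstration
    # distribution (sample mean and covariance), used as a plan cost.
    # A small ridge term keeps the covariance invertible.
    mu = demo_features.mean(axis=0)
    cov = np.cov(demo_features, rowvar=False) + reg * np.eye(demo_features.shape[1])
    diff = features - mu
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))

rng = np.random.default_rng(1)
# 50 hypothetical demonstrations of a 2-feature task (e.g. tilt, height).
demos = rng.normal([0.0, 1.0], [0.1, 0.2], size=(50, 2))
cost_good = mahalanobis_cost(np.array([0.0, 1.0]), demos)  # consistent plan
cost_bad = mahalanobis_cost(np.array([1.0, 0.0]), demos)   # violates the task
```

Features that vary little across demonstrations get a tight covariance, so deviating from them is expensive; inconsistent features are cheap to deviate from, which is exactly the "learn what is required" behavior described above.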
NASA Astrophysics Data System (ADS)
Kim, Min-Suk; Won, Hwa-Yeon; Jeong, Jong-Mun; Böcker, Paul; Vergaij-Huizer, Lydia; Kupers, Michiel; Jovanović, Milenko; Sochal, Inez; Ryan, Kevin; Sun, Kyu-Tae; Lim, Young-Wan; Byun, Jin-Moo; Kim, Gwang-Gon; Suh, Jung-Joon
2016-03-01
In order to optimize yield in DRAM semiconductor manufacturing for 2x nodes and beyond, the (processing induced) overlay fingerprint towards the edge of the wafer needs to be reduced. Traditionally, this is achieved by acquiring denser overlay metrology at the edge of the wafer, to feed field-by-field corrections. Although field-by-field corrections can be effective in reducing localized overlay errors, the requirement for dense metrology to determine the corrections can become a limiting factor due to a significant increase of metrology time and cost. In this study, a more cost-effective solution has been found in extending the regular correction model with an edge-specific component. This new overlay correction model can be driven by an optimized, sparser sampling especially at the wafer edge area, and also allows for a reduction of noise propagation. Lithography correction potential has been maximized, with significantly less metrology needs. Evaluations have been performed, demonstrating the benefit of edge models in terms of on-product overlay performance, as well as cell based overlay performance based on metrology-to-cell matching improvements. Performance can be increased compared to POR modeling and sampling, which can contribute to (overlay based) yield improvement. Based on advanced modeling including edge components, metrology requirements have been optimized, enabling integrated metrology which drives down overall metrology fab footprint and lithography cycle time.
Rankin, Jeffery W; Kwarciak, Andrew M; Richter, W Mark; Neptune, Richard R
2012-11-01
The majority of manual wheelchair users will experience upper extremity injuries or pain, in part due to the high force requirements, repetitive motion and extreme joint postures associated with wheelchair propulsion. Recent studies have identified cadence, contact angle and peak force as important factors for reducing upper extremity demand during propulsion. However, studies often make comparisons between populations (e.g., able-bodied vs. paraplegic) or do not investigate specific measures of upper extremity demand. The purpose of this study was to use a musculoskeletal model and forward dynamics simulations of wheelchair propulsion to investigate how altering cadence, peak force and contact angle influence individual muscle demand. Forward dynamics simulations of wheelchair propulsion were generated to emulate group-averaged experimental data during four conditions: 1) self-selected propulsion technique, and while 2) minimizing cadence, 3) maximizing contact angle, and 4) minimizing peak force using biofeedback. Simulations were used to determine individual muscle mechanical power and stress as measures of muscle demand. Minimizing peak force and cadence had the lowest muscle power requirements. However, minimizing peak force increased cadence and recovery power, while minimizing cadence increased average muscle stress. Maximizing contact angle increased muscle stress and had the highest muscle power requirements. Minimizing cadence appears to have the most potential for reducing muscle demand and fatigue, which could decrease upper extremity injuries and pain. However, altering any of these variables to extreme values appears to be less effective; instead small to moderate changes may better reduce overall muscle demand. Copyright © 2012 Elsevier Ltd. All rights reserved.
Accurate and reproducible measurements of RhoA activation in small samples of primary cells.
Nini, Lylia; Dagnino, Lina
2010-03-01
Rho GTPase activation is essential in a wide variety of cellular processes. Measurement of Rho GTPase activation is difficult with limited material, such as tissues or primary cells that exhibit stringent culture requirements for growth and survival. We defined parameters to accurately and reproducibly measure RhoA activation (i.e., RhoA-GTP) in cultured primary keratinocytes in response to serum and growth factor stimulation using enzyme-linked immunosorbent assay (ELISA)-based G-LISA assays. We also established conditions that minimize RhoA-GTP in unstimulated cells without affecting viability, allowing accurate measurements of RhoA activation on stimulation or induction of exogenous GTPase expression. Copyright 2009 Elsevier Inc. All rights reserved.
Fundamentals and Recent Developments in Approximate Bayesian Computation
Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka
2017-01-01
Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
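The classical rejection-ABC algorithm the review covers fits in a few lines: draw a parameter from the prior, simulate data, and keep the draw if the simulated summary is close to the observed one. The toy model below (uniform prior over a Gaussian mean, sample mean as summary statistic) is an illustrative choice:

```python
import numpy as np

def abc_rejection(observed, simulate, prior_sample, n, eps, rng):
    # Rejection ABC: accepted parameter draws approximate the posterior
    # without ever evaluating a likelihood -- only simulation is required.
    accepted = []
    for _ in range(n):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed) <= eps:
            accepted.append(theta)
    return np.array(accepted)

rng = np.random.default_rng(0)
# Toy model: the summary statistic is the mean of 20 Gaussian draws
# whose mean theta is unknown; the observed summary is 3.0.
posterior = abc_rejection(
    observed=3.0,
    simulate=lambda th, r: r.normal(th, 1.0, size=20).mean(),
    prior_sample=lambda r: r.uniform(-10.0, 10.0),
    n=20000,
    eps=0.2,
    rng=rng,
)
```

Shrinking `eps` sharpens the approximation at the cost of fewer acceptances; the more elaborate ABC-MCMC and sequential variants the review discusses address exactly this trade-off.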
Application of quadratic optimization to supersonic inlet control.
NASA Technical Reports Server (NTRS)
Lehtinen, B.; Zeller, J. R.
1972-01-01
This paper describes the application of linear stochastic optimal control theory to the design of the control system for the air intake, the inlet, of a supersonic air-breathing propulsion system. The controls must maintain a stable inlet shock position in the presence of random airflow disturbances and prevent inlet unstart. Two different linear time invariant controllers are developed. One is designed to minimize a nonquadratic index, the expected frequency of inlet unstart, and the other is designed to minimize the mean square value of inlet shock motion. The quadratic equivalence principle is used to obtain a linear controller that minimizes the nonquadratic index. The two controllers are compared on the basis of unstart prevention, control effort requirements, and frequency response. It is concluded that while controls designed to minimize unstarts are desirable in that the index minimized is physically meaningful, computation time required is longer than for the minimum mean square shock position approach. The simpler minimum mean square shock position solution produced expected unstart frequency values which were not significantly larger than those of the nonquadratic solution.
NASA Technical Reports Server (NTRS)
Hammond, Ernest C., Jr.; Peters, Kevein; Boone, Kevin
1995-01-01
The Laboratory for Astronomy and Solar Physics currently sends rockets and satellites, and in the near future will fly shuttle experiments, to the upper reaches of the Earth's atmosphere, where they will be subjected to the atomic particles and electromagnetic radiation produced by the Sun and other cosmic radiation. It is therefore appropriate to examine the effect of neutrons, gamma rays, beta particles, and X-rays on the film currently being used by the Laboratory for current and future research requirements. It is also hoped that by examining these particles and their effects we will have simulated the space environment of the rockets, satellites, and shuttles. Several samples of the IIaO film were exposed to a neutron howitzer with a source output of approximately 10^6 neutrons/steradian. We exposed several samples of the film to a 10 second blast of neutrons in both metal and plastic containers; the metal container exhibited higher density readings, which indicated the possibility of some secondary nuclear interactions between the neutrons and the aluminum container. The plastic container showed some variations at the higher densities. Exposure of the samples of IIaO film to a neutron beam of approximately 10 neutrons per steradian for eight minutes produced approximately a 13% difference in the density readings of the dark density grids. It is noticeable that at the lighter density grids the neutrons have minimal effect, but on the whole the density grids of the eight-minute exposed IIaO film at the darker end differed from the control by 7.1%. Further analysis is anticipated by increasing the exposure time. Two sets of film were exposed to a beta source in a plastic container. The beta source was placed at the bottom so that the rays striking the film formed a cone, for a period of seven days. In the films designated 4a and 4b, a dramatic increase in the grid densities was observed.
The attenuation of beta particles due to the presence of air was observed. The darker density grids, whose positions were furthest from the beta source, displayed minimal fluctuations compared with the control. It is suspected that the orientation of the film in the canister with respect to the beta source is the key factor responsible for the dramatic increases in the lighter density grids. Emulsions 3a and 3b, exposed for a period of six days with the grid orientation reversed, produced substantial differences in the darker grids, as shown in the graphs. There is a great deal of fluctuation in this sample between the beta-exposed density grids and the control density grids. The lighter density grids whose orientations were reversed displayed minimal fluctuations due to the presence of this beta source and the attenuation that is taking place.
Omasal sampling technique for assessing fermentative digestion in the forestomach of dairy cows.
Huhtanen, P; Brotz, P G; Satter, L D
1997-05-01
A procedure allowing digesta sampling from the omasum via a ruminal cannula without repeated entry into the omasum was developed. The sampling system consisted of a device inserted into the omasum via the ruminal cannula, a tube connecting the device to the ruminal cannula, and a single compressor/vacuum pump. Eight cows given ad libitum access to a total mixed diet were used in a crossover design to evaluate the effects of the sampling system on digestive activity, animal performance, and animal behavior. Results indicated that the omasal sampling system has minimal effect on normal digestive and productive functions of high-producing dairy cows. Dry matter intake was reduced (24.0 vs 21.8 kg/d; P < .02) and seemed related more to the sampling procedures than to the device in the omasum. Observations of animal behavior indicated that cows with the sampling device were similar to control cows, although rumination and total chewing times were reduced slightly. The composition of digesta samples was biased toward an over-abundance of the liquid phase, but using a double-marker system to calculate digesta flow resulted in fairly small coefficients of variation for measurements of ruminal digestion variables. This technique may prove useful for partitioning digestion between the fermentative portion of the forestomach and the lower gastrointestinal tract. The omasal sampling procedure requires less surgical intervention than the traditional methods using abomasal or duodenal cannulas as sampling sites to study forestomach digestion and avoids potentially confounding endogenous secretions of the abomasum.
Manju, Md Abu; Candel, Math J J M; Berger, Martijn P F
2014-07-10
In this paper, the optimal sample sizes at the cluster and person levels for each of two treatment arms are obtained for cluster randomized trials where the cost-effectiveness of treatments on a continuous scale is studied. The optimal sample sizes maximize the efficiency or power for a given budget or minimize the budget for a given efficiency or power. Optimal sample sizes require information on the intra-cluster correlations (ICCs) for effects and costs, the correlations between costs and effects at the individual and cluster levels, the ratio of the variance of effects translated into costs to the variance of the costs (the variance ratio), sampling and measuring costs, and the budget. When planning a study, information on the model parameters is usually not available. To overcome this local optimality problem, the current paper also presents maximin sample sizes. The maximin sample sizes turn out to be rather robust against misspecifying the correlation between costs and effects at the cluster and individual levels, but may lose much efficiency when misspecifying the variance ratio. The robustness of the maximin sample sizes against misspecifying the ICCs depends on the variance ratio. The maximin sample sizes are robust under misspecification of the ICC for costs for realistic values of the variance ratio greater than one, but not robust under misspecification of the ICC for effects. Finally, we show how to calculate optimal or maximin sample sizes that yield sufficient power for a test on the cost-effectiveness of an intervention.
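A stripped-down version of the budget-constrained design problem can be sketched numerically. Assume (illustratively — this is far simpler than the paper's cost-effectiveness model) that the variance of the estimate with k clusters of n persons is proportional to (ICC + (1 − ICC)/n)/k, and that a cluster costs a fixed amount plus a per-person cost; then search over n for the lowest-variance design within the budget:

```python
def optimal_sizes(budget, c_cluster, c_person, icc, n_max=200):
    # Assumed variance model for a cluster randomized design:
    #   var ∝ (icc + (1 - icc) / n) / k,
    # subject to the per-arm budget k * (c_cluster + n * c_person) <= budget.
    best = None
    for n in range(1, n_max + 1):
        k = budget // (c_cluster + n * c_person)   # clusters affordable
        if k < 2:
            continue
        var = (icc + (1 - icc) / n) / k
        if best is None or var < best[2]:
            best = (k, n, var)
    return best

# Hypothetical costs: 100 per cluster, 10 per person, ICC = 0.05.
k, n, var = optimal_sizes(budget=10000, c_cluster=100, c_person=10, icc=0.05)
```

The trade-off is visible in the result: a higher ICC pushes money toward more clusters with fewer persons each, while cheap persons and expensive clusters push the other way.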
NASA Astrophysics Data System (ADS)
Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco
2017-04-01
Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or hydraulic concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. 
This process is repeated until a threshold in the objective function is met or insufficient changes are produced in successive iterations.
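The core mixing step can be sketched as follows: weights (cos θ, sin θ) lie on the unit circle, so mixing two uncorrelated fields of equal covariance yields a field with that same covariance, and the objective is evaluated at equidistant angles to pick the best mixture. The interpolation refinement described above is omitted, and the fields and objective are toy stand-ins:

```python
import numpy as np

def mix(z1, z2, theta):
    # cos^2(t) + sin^2(t) = 1, so for uncorrelated fields with equal
    # covariance the mixture preserves the covariance structure.
    return np.cos(theta) * z1 + np.sin(theta) * z2

def best_angle(z1, z2, objective, n=36):
    # Evaluate the objective at n equidistant mixing angles; keep the best.
    thetas = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    costs = [objective(mix(z1, z2, t)) for t in thetas]
    return thetas[int(np.argmin(costs))]

rng = np.random.default_rng(3)
z1 = rng.standard_normal(100)
z2 = rng.standard_normal(100)
# A hypothetical "observation" that the angle atan2(0.8, 0.6) reproduces
# exactly, standing in for simulated hydraulic heads at data locations.
target = 0.6 * z1 + 0.8 * z2
theta = best_angle(z1, z2, lambda z: float(np.sum((z - target) ** 2)))
```

In the full algorithm the winning mixture becomes one input field of the next iteration, so the search proceeds two fields at a time exactly as the abstract describes.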
NASA Astrophysics Data System (ADS)
Campo, Lorenzo; Castelli, Fabio; Caparrini, Francesca
2010-05-01
Modern distributed hydrological models allow the representation of the different surface and subsurface phenomena with great accuracy and high spatial and temporal resolution. Such complexity requires, in general, an equally accurate parametrization. A number of approaches have been followed in this respect, from simple local search methods (like the Nelder-Mead algorithm) that minimize a cost function representing some distance between the model's output and the available measurements, to more complex approaches like dynamic filters (such as the Ensemble Kalman Filter) that carry out assimilation of the observations. In this work the first approach was followed in order to compare the performances of three different direct search algorithms on the calibration of a distributed hydrological balance model. The direct search family can be defined as the category of algorithms that make no use of derivatives of the cost function (which is, in general, a black box) and comprises a large number of possible approaches. The main benefit of this class of methods is that they do not require changes in the implementation of the numerical codes to be calibrated. The first algorithm is the classical Nelder-Mead, often used in many applications and utilized here as a reference. The second algorithm is a GSS (Generating Set Search) algorithm, built to guarantee conditions of global convergence and suitable for the parallel, multi-start implementation presented here. The third is the EGO algorithm (Efficient Global Optimization), which is particularly suitable for calibrating black-box cost functions that require expensive computational resources (like a hydrological simulation). EGO minimizes the number of evaluations of the cost function by balancing the need to minimize a response surface that approximates the problem against the need to improve the approximation by sampling where the prediction error may be high.
The hydrological model to be calibrated was MOBIDIC, a complete distributed balance model developed at the Department of Civil and Environmental Engineering of the University of Florence. A discussion comparing the effectiveness of the different algorithms on several case studies of central Italy basins is provided.
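As a point of reference for the first of these methods, the basic Nelder-Mead loop is easy to sketch. The following Python is a minimal illustration only: a toy quadratic stands in for the real cost function, which in the study would be a full MOBIDIC misfit evaluation.

```python
def nelder_mead(cost, x0, step=0.5, tol=1e-8, max_iter=500):
    """Derivative-free minimization of a black-box cost function."""
    n = len(x0)
    # Build the initial simplex: the start point plus one offset per axis.
    simplex = [list(x0)]
    for i in range(n):
        v = list(x0)
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=cost)                    # best vertex first
        best, worst = simplex[0], simplex[-1]
        if abs(cost(worst) - cost(best)) < tol:
            break
        # Centroid of all vertices except the worst.
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        # Reflect the worst vertex through the centroid.
        refl = [centroid[i] + (centroid[i] - worst[i]) for i in range(n)]
        if cost(refl) < cost(best):
            # Reflection is the new best: try expanding further.
            exp = [centroid[i] + 2 * (centroid[i] - worst[i]) for i in range(n)]
            simplex[-1] = exp if cost(exp) < cost(refl) else refl
        elif cost(refl) < cost(simplex[-2]):
            simplex[-1] = refl
        else:
            # Contract toward the centroid; shrink the whole simplex if that fails.
            con = [centroid[i] + 0.5 * (worst[i] - centroid[i]) for i in range(n)]
            if cost(con) < cost(worst):
                simplex[-1] = con
            else:
                simplex = [best] + [[(v[i] + best[i]) / 2 for i in range(n)]
                                    for v in simplex[1:]]
    return min(simplex, key=cost)

# Toy "model misfit" cost with its minimum at (3, -2).
xmin = nelder_mead(lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2, [0.0, 0.0])
```

Note that the loop only ever calls `cost` as a black box, which is exactly why no change to the simulated model's code is needed.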
Asakawa, Naoya; Uchida, Keisuke; Sakakibara, Mamoru; Omote, Kazunori; Noguchi, Keiji; Tokuda, Yusuke; Kamiya, Kiwamu; Hatanaka, Kanako C; Matsuno, Yoshihiro; Yamada, Shiro; Asakawa, Kyoko; Fukasawa, Yuichiro; Nagai, Toshiyuki; Anzai, Toshihisa; Ikeda, Yoshihiko; Ishibashi-Ueda, Hatsue; Hirota, Masanori; Orii, Makoto; Akasaka, Takashi; Uto, Kenta; Shingu, Yasushige; Matsui, Yoshiro; Morimoto, Shin-Ichiro; Tsutsui, Hiroyuki; Eishi, Yoshinobu
2017-01-01
Although rare, cardiac sarcoidosis (CS) is potentially fatal. Early diagnosis and intervention are essential, but histopathologic diagnosis is limited. We aimed to detect Propionibacterium acnes, a commonly implicated etiologic agent of sarcoidosis, in myocardial tissues obtained from CS patients. We examined formalin-fixed paraffin-embedded myocardial tissues obtained by surgery or autopsy and endomyocardial biopsy from patients with CS (n = 26; CS-group), myocarditis (n = 15; M-group), or other cardiomyopathies (n = 39; CM-group) using immunohistochemistry (IHC) with a P. acnes-specific monoclonal antibody. We found granulomas in 16 (62%) CS-group samples. Massive (≥14 inflammatory cells) and minimal (<14 inflammatory cells) inflammatory foci, respectively, were detected in 16 (62%) and 11 (42%) of the CS-group samples, 10 (67%) and 10 (67%) of the M-group samples, and 1 (3%) and 18 (46%) of the CM-group samples. P. acnes-positive reactivity in granulomas, massive inflammatory foci, and minimal inflammatory foci was detected in 10 (63%), 10 (63%), and 8 (73%) of the CS-group samples, respectively, and in none of the M-group and CM-group samples. Frequent identification of P. acnes in sarcoid granulomas of originally aseptic myocardial tissues suggests that this indigenous bacterium causes granuloma in many CS patients. IHC detection of P. acnes in massive or minimal inflammatory foci of myocardial biopsy samples without granulomas may be useful for differentiating sarcoidosis from myocarditis or other cardiomyopathies.
A weighted ℓ₁-minimization approach for sparse polynomial chaos expansions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peng, Ji; Hampton, Jerrad; Doostan, Alireza, E-mail: alireza.doostan@colorado.edu
2014-06-15
This work proposes a method for sparse polynomial chaos (PC) approximation of high-dimensional stochastic functions based on non-adapted random sampling. We modify the standard ℓ₁-minimization algorithm, originally proposed in the context of compressive sampling, using a priori information about the decay of the PC coefficients, when available, and refer to the resulting algorithm as weighted ℓ₁-minimization. We provide conditions under which we may guarantee recovery using this weighted scheme. Numerical tests are used to compare the weighted and non-weighted methods for the recovery of solutions to two differential equations with high-dimensional random inputs: a boundary value problem with a random elliptic operator and a 2-D thermally driven cavity flow with random boundary condition.
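A common way to solve such a weighted ℓ₁ problem, min 0.5‖Ax − b‖² + λ Σᵢ wᵢ|xᵢ|, is proximal gradient descent (ISTA) with a per-coefficient soft-threshold. The sketch below uses synthetic data and illustrates the weighting idea only; it is not the authors' solver or recovery guarantee.

```python
import numpy as np

def weighted_ista(A, b, weights, lam=0.1, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam * sum_i weights[i] * |x_i|
    by proximal gradient descent (ISTA) with a weighted
    soft-thresholding proximal step."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # gradient of the smooth data-fit term
        z = x - grad / L                 # plain gradient step
        thresh = lam * weights / L       # heavier shrinkage where w_i is large
        x = np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)
    return x

# Synthetic sparse recovery: two active coefficients, decay-based weights.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[0], x_true[3] = 2.0, -1.5
b = A @ x_true
weights = 1.0 + 0.1 * np.arange(20)      # mimic a priori coefficient decay
x_hat = weighted_ista(A, b, weights, lam=0.05)
```

On this noiseless toy problem the weighted scheme recovers both active coefficients, with only the small bias introduced by the ℓ₁ penalty.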
[Bacteriological quality of air in a ward for sterile pharmaceutical preparations].
Caorsi P, Beatriz; Sakurada Z, Andrea; Ulloa F, M Teresa; Pezzani V, Marcela; Latorre O, Paz
2011-02-01
An extremely clean area is required for preparation of sterile pharmaceutical compounds, in compliance with international standards, to minimize the probability of microbial contamination. To evaluate the bacteriological quality of the air in the Sterile Pharmaceutical Preparation Unit of the University of Chile's Clinical Hospital and to set up alert and action levels of bacterial growth, we studied eight representative sites of our Unit on a daily basis from January to February 2005 and twice a week from June 2005 to February 2006. We collected 839 samples of air by impact on Petri dishes. 474 (56.5%) samples were positive; 17 (3.5%) of them had inappropriate bacterial growth (2% of total samples). The samples from sites 1 and 2 (big and small biosafety cabinets) were negative. The countertop and transfer area occasionally exceeded the bacterial growth limits. The most frequently isolated bacteria were coagulase-negative staphylococci, Micrococcus spp and Corynebacterium spp, from skin microbiota, and Bacillus spp, environmental bacteria. From a microbiological perspective, the air quality in our sterile preparation unit complied with international standards. Setting institutional alert and action levels and appropriately identifying bacteria in sensitive areas permits quantification of the microbial load and application of preventive measures.
Bal, Dominika; Gradowska, Wanda; Gryff-Keller, Adam
2002-06-15
Determination of the absolute configuration of some metabolites in body fluids is important for the diagnosis of some inborn errors of metabolism. Presently available methods of such determinations are tedious and usually require highly specialized instrumentation. In this work, an alternative method, based on high-resolution nuclear magnetic resonance spectroscopy in the presence of a chiral lanthanide shift reagent as an auxiliary additive, has been proposed (NMR/LSR). The method involves lineshape analysis of a chosen multiplet of the one-dimensional 1H NMR spectrum or application of two-dimensional 1H-13C correlation spectroscopy (HSQC). In order to confirm the resonance assignments and to boost the signal-to-noise ratio, the addition of racemic analyte to the urine sample is recommended. The entire procedure is simple in application and demands minimal or no preprocessing of urine samples. The effectiveness of the method has been confirmed by finding the expected forms of 2-hydroxyglutaric acid and 5-oxoproline in the urine samples of an independently diagnosed patient with 2-D-hydroxyglutaric aciduria and 5-L-oxoprolinuria, respectively.
Chandu, Dilip; Paul, Sudakshina; Parker, Mathew; Dudin, Yelena; King-Sitzes, Jennifer; Perez, Tim; Mittanck, Don W.; Shah, Manali; Glenn, Kevin C.; Piepenburg, Olaf
2016-01-01
Testing for the presence of genetically modified material in seed samples is of critical importance for all stakeholders in the agricultural industry, including growers, seed manufacturers, and regulatory bodies. While rapid antibody-based testing for the transgenic protein has fulfilled this need in the past, the introduction of new variants of a given transgene demands a new diagnostic regimen that allows distinguishing different traits at the nucleic acid level. Although such molecular tests can be performed by PCR in the laboratory, their requirement for expensive equipment and sophisticated operation has prevented their uptake in point-of-use applications. A recently developed isothermal DNA amplification technique, recombinase polymerase amplification (RPA), combines simple sample preparation and amplification work-flow procedures with the use of minimal detection equipment in real time. Here, we report the development of a highly sensitive and specific RPA-based detection system for Genuity Roundup Ready 2 Yield (RR2Y) material in soybean (Glycine max) seed samples and present the results of studies applying the method in both laboratory and field-type settings. PMID:27314015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, T.; Jones, H.; Wong, K.
The Marshall Islands Environmental Characterization and Dose Assessment Program has recently implemented waste minimization measures to reduce low-level radioactive (LLW) and low-level mixed (LLW-MIXED) waste streams at the Lawrence Livermore National Laboratory (LLNL). Several thousand environmental samples are collected annually from former US nuclear test sites in the Marshall Islands and returned to LLNL for processing and radiometric analysis. In the past, we analyzed coconut milk directly by gamma-spectrometry after adding formaldehyde (as preservative) and sealing the fluid in metal cans. This procedure was not only tedious and time consuming but generated storage and waste disposal problems. We have now reduced the number of coconut milk samples required for analysis from 1500 per year to approximately 250, and developed a new analytical procedure which essentially eliminates the associated mixed radioactive waste stream. Coconut milk samples are mixed with a few grams of ammonium molybdophosphate (AMP), which quantitatively scavenges the target radionuclide cesium-137 in an ion-exchange process. The AMP is then separated from the mixture and sealed in a plastic container. The bulk sample material can be disposed of as a non-radioactive, non-hazardous waste, and the relatively small amount of AMP conveniently counted by gamma-spectrometry, packaged, and stored for future use.
A magnetic resonance (MR) microscopy system using a microfluidically cryo-cooled planar coil.
Koo, Chiwan; Godley, Richard F; Park, Jaewon; McDougall, Mary P; Wright, Steven M; Han, Arum
2011-07-07
We present the development of a microfluidically cryo-cooled planar coil for magnetic resonance (MR) microscopy. Cryogenically cooling radiofrequency (RF) coils for magnetic resonance imaging (MRI) can improve the signal to noise ratio (SNR) of the experiment. Conventional cryostats typically use a vacuum gap to keep samples to be imaged, especially biological samples, at or near room temperature during cryo-cooling. This limits how close a cryo-cooled coil can be placed to the sample. At the same time, a small coil-to-sample distance significantly improves the MR imaging capability due to the limited imaging depth of planar MR microcoils. These two conflicting requirements pose challenges to the use of cryo-cooling in MR microcoils. The use of a microfluidic based cryostat for localized cryo-cooling of MR microcoils is a step towards eliminating these constraints. The system presented here consists of planar receive-only coils with integrated cryo-cooling microfluidic channels underneath, and an imaging surface on top of the planar coils separated by a thin nitrogen gas gap. Polymer microfluidic channel structures fabricated through soft lithography processes were used to flow liquid nitrogen under the coils in order to cryo-cool the planar coils to liquid nitrogen temperature (-196 °C). Two unique features of the cryo-cooling system minimize the distance between the coil and the sample: (1) the small dimension of the polymer microfluidic channel enables localized cooling of the planar coils, while minimizing thermal effects on the nearby imaging surface. (2) The imaging surface is separated from the cryo-cooled planar coil by a thin gap through which nitrogen gas flows to thermally insulate the imaging surface, keeping it above 0 °C and preventing potential damage to biological samples. The localized cooling effect was validated by simulations, bench testing, and MR imaging experiments. 
Using this cryo-cooled planar coil system inside a 4.7 Tesla MR system resulted in an average image SNR enhancement of 1.47 ± 0.11 times relative to similar room-temperature coils. This journal is © The Royal Society of Chemistry 2011
System automatically supplies precise analytical samples of high-pressure gases
NASA Technical Reports Server (NTRS)
Langdon, W. M.
1967-01-01
High-pressure-reducing and flow-stabilization system delivers analytical gas samples from a gas supply. The system employs parallel capillary restrictors for pressure reduction and downstream throttling valves for flow control. It is used in conjunction with a sampling valve and minimizes alterations of the sampled gas.
40 CFR 1065.1107 - Sample media and sample system preparation; sample system assembly.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) For capturing PM, we recommend using pure quartz filters with no binder. Select the filter diameter to minimize filter change intervals, accounting for the expected PM emission rate, sample flow rate, and... filter without replacing the sorbent or otherwise disassembling the batch sampler. In those cases...
Sampling Error in a Particulate Mixture: An Analytical Chemistry Experiment.
ERIC Educational Resources Information Center
Kratochvil, Byron
1980-01-01
Presents an undergraduate experiment demonstrating sampling error. The sampling system selected is a mixture of potassium hydrogen phthalate and sucrose; a self-zeroing, automatically refillable buret minimizes the titration time for multiple samples, and a dilute back-titrant yields high end-point precision. (CS)
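The 1/√n scaling of sampling error that such an experiment demonstrates can be simulated directly. This sketch uses hypothetical parameters (particle counts, mixture fraction) not taken from the article: it draws repeated samples from a well-mixed two-component particulate mixture and compares the relative standard deviation of the measured composition.

```python
import random

def mixture_rsd(n_particles, frac_khp=0.2, trials=500, seed=42):
    """Simulate drawing n_particles from a well-mixed two-component
    particulate mixture (KHP + sucrose) and return the relative standard
    deviation of the measured KHP fraction across repeated samples."""
    rng = random.Random(seed)
    fracs = []
    for _ in range(trials):
        # Each particle is KHP with probability frac_khp.
        k = sum(rng.random() < frac_khp for _ in range(n_particles))
        fracs.append(k / n_particles)
    mean = sum(fracs) / trials
    var = sum((f - mean) ** 2 for f in fracs) / trials
    return (var ** 0.5) / mean

rsd_small = mixture_rsd(100)     # small sample: large sampling error
rsd_large = mixture_rsd(2500)    # 25x more particles per sample
```

With 25 times more particles per sample, the relative standard deviation drops by about a factor of 5, consistent with the theoretical √n law for binomial sampling of a particulate mixture.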
Open-target sparse sensing of biological agents using DNA microarray
2011-01-01
Background Current biosensors are designed to target and react to specific nucleic acid sequences or structural epitopes. These 'target-specific' platforms require creation of new physical capture reagents when new organisms are targeted. An 'open-target' approach to DNA microarray biosensing is proposed and substantiated using laboratory generated data. The microarray consisted of 12,900 25-bp oligonucleotide capture probes derived from a statistical model trained on randomly selected genomic segments of pathogenic prokaryotic organisms. Open-target detection of organisms was accomplished using a reference library of hybridization patterns for three test organisms whose DNA sequences were not included in the design of the microarray probes. Results A multivariate mathematical model based on partial least squares regression (PLSR) was developed to detect the presence of three test organisms in mixed samples. When all 12,900 probes were used, the model correctly detected the signature of the three test organisms in all mixed samples (mean R² = 0.76, CI = 0.95), with a 6% false positive rate. A sampling algorithm was then developed to sparsely sample the probe space for a minimal number of probes required to capture the hybridization imprints of the test organisms. The PLSR detection model was capable of correctly identifying the presence of the three test organisms in all mixed samples using only 47 probes (mean R² = 0.77, CI = 0.95) with nearly 100% specificity. Conclusions We conceived an 'open-target' approach to biosensing, and hypothesized that a relatively small, non-specifically designed DNA microarray is capable of identifying the presence of multiple organisms in mixed samples. Coupled with a mathematical model applied to laboratory generated data, and sparse sampling of capture probes, the prototype microarray platform was able to capture the signature of each organism in all mixed samples with high sensitivity and specificity. 
It was demonstrated that this new approach to biosensing closely follows the principles of sparse sensing. PMID:21801424
NASA Astrophysics Data System (ADS)
Mota, Mariana F. B.; Gama, Ednilton M.; Rodrigues, Gabrielle de C.; Rodrigues, Guilherme D.; Nascentes, Clésia C.; Costa, Letícia M.
2018-01-01
In this work, a dilute-and-shoot method was developed for Ca, P, S and Zn determination in new and used lubricating oil samples by total reflection X-ray fluorescence (TXRF). The oil samples were diluted with organic solvents followed by addition of yttrium as internal standard, and the TXRF measurements were performed after solvent evaporation. The method was optimized using an interlaboratory reference material. The experimental parameters evaluated were sample volume (50 or 100 μL), measurement time (250 or 500 s) and volume deposited on the quartz glass sample carrier (5 or 10 μL). All of them were evaluated and optimized using xylene, kerosene and hexane. Analytical figures of merit (accuracy, precision, limits of detection and quantification) were used to evaluate the performance of the analytical method for all solvents. The recovery rates varied from 99 to 111% and the relative standard deviation remained between 1.7% and 10% (n = 8). For all elements, the results obtained by applying the new method were in agreement with the certified value. After the validation step, the method was applied for Ca, P, S and Zn quantification in eight new and four used lubricating oil samples, for all solvents. The concentration of the elements in the samples varied in the ranges of 1620-3711 mg L⁻¹ for Ca, 704-1277 mg L⁻¹ for P, 2027-9147 mg L⁻¹ for S, and 898-1593 mg L⁻¹ for Zn. The association of TXRF with a dilute-and-shoot sample preparation strategy was efficient for Ca, P, S and Zn determination in lubricating oils, presenting accurate results. Additionally, the time required for analysis is short, the reagent volumes are low, minimizing waste generation, and the technique does not require calibration curves.
Khoo, T-L; Xiros, N; Guan, F; Orellana, D; Holst, J; Joshua, D E; Rasko, J E J
2013-08-01
The CELL-DYN Emerald is a compact bench-top hematology analyzer that can be used for a three-part white cell differential analysis. To determine its utility for analysis of human and mouse samples, we evaluated this machine against the larger CELL-DYN Sapphire and Sysmex XT2000iV hematology analyzers. 120 human (normal and abnormal) and 30 mouse (normal and abnormal) samples were analyzed on both the CELL-DYN Emerald and CELL-DYN Sapphire or Sysmex XT2000iV analyzers. For mouse samples, the CELL-DYN Emerald analyzer required manual recalibration based on the histogram populations. Analysis of the CELL-DYN Emerald showed excellent precision, within accepted ranges (white cell count CV% = 2.09%; hemoglobin CV% = 1.68%; platelets CV% = 4.13%). Linearity was excellent (R² ≥ 0.99), carryover was minimal (<1%), and overall interinstrument agreement was acceptable for both human and mouse samples. Comparison between the CELL-DYN Emerald and Sapphire analyzers for human samples or Sysmex XT2000iV analyzer for mouse samples showed excellent correlation for all parameters. The CELL-DYN Emerald was generally comparable to the larger reference analyzer for both human and mouse samples. It would be suitable for use in satellite research laboratories or as a backup system in larger laboratories. © 2012 John Wiley & Sons Ltd.
Design criteria for developing low-resource magnetic bead assays using surface tension valves
Adams, Nicholas M.; Creecy, Amy E.; Majors, Catherine E.; Wariso, Bathsheba A.; Short, Philip A.; Wright, David W.; Haselton, Frederick R.
2013-01-01
Many assays for biological sample processing and diagnostics are not suitable for use in settings that lack laboratory resources. We have recently described a simple, self-contained format based on magnetic beads for extracting infectious disease biomarkers from complex biological samples, which significantly reduces the time, expertise, and infrastructure required. This self-contained format has the potential to facilitate the application of other laboratory-based sample processing assays in low-resource settings. The technology is enabled by immiscible fluid barriers, or surface tension valves, which stably separate adjacent processing solutions within millimeter-diameter tubing and simultaneously permit the transit of magnetic beads across the interfaces. In this report, we identify the physical parameters of the materials that maximize fluid stability and bead transport and minimize solution carryover. We found that fluid stability is maximized with ≤0.8 mm i.d. tubing, valve fluids of similar density to the adjacent solutions, and tubing with ≤20 dyn/cm surface energy. Maximizing bead transport was achieved using ≥2.4 mm i.d. tubing, mineral oil valve fluid, and a mass of 1-3 mg beads. The amount of solution carryover across a surface tension valve was minimized using ≤0.2 mg of beads, tubing with ≤20 dyn/cm surface energy, and air separators. The most favorable parameter space for valve stability and bead transport was identified by combining our experimental results into a single plot using two dimensionless numbers. A strategy is presented for developing additional self-contained assays based on magnetic beads and surface tension valves for low-resource diagnostic applications. PMID:24403996
NASA Technical Reports Server (NTRS)
Meneghini, Robert; Kim, Hyokyung
2016-01-01
For an airborne or spaceborne radar, the precipitation-induced path attenuation can be estimated from measurements of the normalized surface cross section, σ0, in the presence and absence of precipitation. In one implementation, the mean rain-free estimate and its variability are found from a lookup table (LUT) derived from previously measured data. For the dual-frequency precipitation radar aboard the Global Precipitation Measurement satellite, the nominal table consists of the statistics of the rain-free σ0 over a 0.5 deg x 0.5 deg latitude-longitude grid using a three-month set of input data. However, a problem with the LUT is an insufficient number of samples in many cells. An alternative table is constructed by a stepwise procedure that begins with the statistics over a 0.25 deg x 0.25 deg grid. If the number of samples at a cell is too few, the area is expanded, cell by cell, choosing at each step the cell that minimizes the variance of the data. The question arises, however, as to whether the selected region corresponds to the smallest variance. To address this question, a second type of variable-averaging grid is constructed using all possible spatial configurations and computing the variance of the data within each region. Comparisons of the standard deviations for the fixed and variable-averaged grids are given as a function of incidence angle and surface type using a three-month set of data. The advantage of variable spatial averaging is that the average standard deviation can be reduced relative to the fixed grid while satisfying the minimum sample requirement.
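The stepwise cell-by-cell expansion can be sketched as a greedy search. The following Python is illustrative only: the grid layout, 4-neighbor rule, and sample threshold are assumptions for the sketch, not the paper's exact procedure. It grows a region around a starting cell by always adding the neighboring cell that keeps the pooled variance smallest.

```python
import numpy as np

def expand_region(grid_samples, start, min_count):
    """Greedy variable-averaging: starting from one cell, repeatedly add the
    neighboring cell that keeps the pooled variance of the region's samples
    smallest, until the region holds at least min_count samples.
    grid_samples: dict mapping (row, col) -> 1-D array of sigma0 samples."""
    region = {start}
    pooled = list(grid_samples[start])
    while len(pooled) < min_count:
        # Candidate cells: 4-neighbors of the current region, not yet included.
        cands = {(r + dr, c + dc) for (r, c) in region
                 for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))} - region
        cands = {c for c in cands if c in grid_samples}
        if not cands:
            break   # nothing left to add
        # Pick the candidate that minimizes the variance of the enlarged pool.
        best = min(cands, key=lambda c: np.var(pooled + list(grid_samples[c])))
        region.add(best)
        pooled += list(grid_samples[best])
    return region, np.mean(pooled), np.std(pooled)

# Toy grid: two cells with similar sigma0 statistics, one over a very
# different surface type. The greedy step should absorb the similar cell.
rng = np.random.default_rng(1)
grid = {
    (0, 0): rng.normal(10.0, 0.1, 5),
    (0, 1): rng.normal(10.0, 0.1, 5),   # statistically similar neighbor
    (1, 0): rng.normal(25.0, 0.1, 5),   # different surface type
}
region, mean, std = expand_region(grid, (0, 0), min_count=10)
```

As the abstract notes, this greedy choice does not guarantee the globally minimum-variance region, which is why the paper also evaluates all possible spatial configurations.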
Aziz, Fahad
2012-09-01
Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) offers a minimally invasive alternative to mediastinoscopy with additional access to the hilar nodes, a better safety profile, and it removes the costs and hazards of theatre time and general anesthesia with comparable sensitivity, although the negative predictive value of mediastinoscopy (and sample size) is greater. EBUS-TBNA also obtains larger samples than conventional TBNA, has superior performance and theoretically is safer, allowing real-time sampling under direct vision. It can also have predictive value both in sonographic appearance of the nodes and histological characteristics. EBUS-TBNA is therefore indicated for NSCLC staging, diagnosis of lung cancer when there is no endobronchial lesion, and diagnosis of both benign (especially tuberculosis and sarcoidosis) and malignant mediastinal lesions. The procedure is different than for flexible bronchoscopy, takes longer, and requires more training. EBUS-TBNA is more expensive than conventional TBNA but can save costs by reducing the number of more costly mediastinoscopies. In the future, endobronchial ultrasound may have applications in airways disease and pulmonary vascular disease.
Nanoscale imaging of clinical specimens using pathology-optimized expansion microscopy
Zhao, Yongxin; Bucur, Octavian; Irshad, Humayun; Chen, Fei; Weins, Astrid; Stancu, Andreea L.; Oh, Eun-Young; DiStasio, Marcello; Torous, Vanda; Glass, Benjamin; Stillman, Isaac E.; Schnitt, Stuart J.; Beck, Andrew H.; Boyden, Edward S.
2017-01-01
Expansion microscopy (ExM), a method for improving the resolution of light microscopy by physically expanding the specimen, has not been applied to clinical tissue samples. Here we report a clinically optimized form of ExM that supports nanoscale imaging of human tissue specimens that have been fixed with formalin, embedded in paraffin, stained with hematoxylin and eosin (H&E), and/or fresh frozen. The method, which we call expansion pathology (ExPath), converts clinical samples into an ExM-compatible state, then applies an ExM protocol with protein anchoring and mechanical homogenization steps optimized for clinical samples. ExPath enables ~70 nm resolution imaging of diverse biomolecules in intact tissues using conventional diffraction-limited microscopes, and standard antibody and fluorescent DNA in situ hybridization reagents. We use ExPath for optical diagnosis of kidney minimal-change disease, which previously required electron microscopy (EM), and demonstrate high-fidelity computational discrimination between early breast neoplastic lesions that to date have challenged human judgment. ExPath may enable the routine use of nanoscale imaging in pathology and clinical research. PMID:28714966
INAPPROPRIATE CONFIDENCE AND RETIREMENT PLANNING: FOUR STUDIES WITH A NATIONAL SAMPLE
Parker, Andrew M.; de Bruin, Wändi Bruine; Yoong, Joanne; Willis, Robert
2011-01-01
Financial decisions about investing and saving for retirement are increasingly complex, requiring financial knowledge and confidence in that knowledge. Few studies have examined whether direct assessments of individuals’ confidence are related to the outcomes of their financial decisions. Here, we analyzed data from a national sample recruited through RAND’s American Life Panel (ALP), an internet panel of U.S. adults aged 18 to 88. We examined the relationship of confidence with self-reported and actual financial decisions, using four different tasks, each performed by overlapping samples of ALP participants. The four tasks were designed by different researchers for different purposes, using different methods to assess confidence. Yet, measures of confidence were correlated across tasks, and results were consistent across methodologies. Confidence and knowledge showed only modest positive correlations. However, even after controlling for actual knowledge, individuals with greater confidence were more likely to report financial planning for retirement and to successfully minimize fees on a hypothetical investment task. Implications for the role of confidence (even if it is unjustified) in investment behavior are discussed. PMID:23049164
SU-G-IeP1-13: Sub-Nyquist Dynamic MRI Via Prior Rank, Intensity and Sparsity Model (PRISM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, B; Gao, H
Purpose: Accelerated dynamic MRI is important for MRI-guided radiotherapy. Inspired by compressive sensing (CS), sub-Nyquist dynamic MRI has been an active research area, i.e., sparse sampling in k-t space for accelerated dynamic MRI. This work is to investigate sub-Nyquist dynamic MRI via a previously developed CS model, namely the Prior Rank, Intensity and Sparsity Model (PRISM). Methods: The proposed method utilizes PRISM with rank minimization and incoherent sampling patterns for sub-Nyquist reconstruction. In PRISM, the low-rank background image, which is automatically calculated by rank minimization, is excluded from the L1-minimization step of the CS reconstruction to further sparsify the residual image, thus allowing for higher acceleration rates. Furthermore, the sampling pattern in k-t space is made more incoherent by sampling a different set of k-space points at different temporal frames. Results: Reconstruction results from the L1-sparsity method and the PRISM method with 30% undersampled data and 15% undersampled data are compared to demonstrate the power of PRISM for dynamic MRI. Conclusion: A sub-Nyquist MRI reconstruction method based on PRISM is developed with improved image quality over the L1-sparsity method.
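The rank-minimization idea behind PRISM, splitting the data into a low-rank background plus a sparse residual, can be illustrated with an alternating scheme of singular-value thresholding and soft thresholding. This is a robust-PCA-style sketch on a fully sampled synthetic matrix; the actual PRISM reconstruction additionally involves the undersampled k-t acquisition operator, which is omitted here.

```python
import numpy as np

def low_rank_plus_sparse(M, lam=0.2, mu=1.0, n_iter=100):
    """Split matrix M (pixels x time frames) into a low-rank background B
    and a sparse residual S by alternating a singular-value-thresholding
    update for B with a soft-thresholding update for S."""
    B = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank update: shrink the singular values of M - S by mu.
        U, sig, Vt = np.linalg.svd(M - S, full_matrices=False)
        B = U @ np.diag(np.maximum(sig - mu, 0.0)) @ Vt
        # Sparse update: soft-threshold the residual image by lam.
        R = M - B
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)
    return B, S

# Synthetic data: a rank-1 "static background" plus two sparse outliers
# standing in for the dynamic (changing) pixels.
rng = np.random.default_rng(2)
u = rng.standard_normal((30, 1))
v = rng.standard_normal((1, 20))
background = u @ v
spikes = np.zeros((30, 20))
spikes[5, 7] = 8.0
spikes[12, 3] = -6.0
M = background + spikes
B, S = low_rank_plus_sparse(M)
```

After a few iterations the nuclear-norm penalty keeps the background in B while the ℓ₁ penalty absorbs the localized changes into S, mirroring how PRISM removes the low-rank background before the L1 step.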
Advanced Curation of Current and Future Extraterrestrial Samples
NASA Technical Reports Server (NTRS)
Allen, Carlton C.
2013-01-01
Curation of extraterrestrial samples is the critical interface between sample return missions and the international research community. Curation includes documentation, preservation, preparation, and distribution of samples. The current collections of extraterrestrial samples include: lunar rocks and soils collected by the Apollo astronauts; meteorites, including samples of asteroids, the Moon, and Mars; "cosmic dust" (asteroid and comet particles) collected by high-altitude aircraft; solar wind atoms collected by the Genesis spacecraft; comet particles collected by the Stardust spacecraft; interstellar dust collected by the Stardust spacecraft; and asteroid particles collected by the Hayabusa spacecraft. These samples were formed in environments strikingly different from that on Earth. Terrestrial contamination can destroy much of the scientific significance of many extraterrestrial materials. In order to preserve the research value of these precious samples, contamination must be minimized, understood, and documented. In addition the samples must be preserved - as far as possible - from physical and chemical alteration. In 2011 NASA selected the OSIRIS-REx mission, designed to return samples from the primitive asteroid 1999 RQ36 (Bennu). JAXA will sample C-class asteroid 1999 JU3 with the Hayabusa-2 mission. ESA is considering the near-Earth asteroid sample return mission Marco Polo-R. The Decadal Survey listed the first lander in a Mars sample return campaign as its highest priority flagship-class mission, with sample return from the South Pole-Aitken basin and the surface of a comet among additional top priorities. The latest NASA budget proposal includes a mission to capture a 5-10 m asteroid and return it to the vicinity of the Moon as a target for future sampling. Samples, tools, containers, and contamination witness materials from any of these missions carry unique requirements for acquisition and curation.
Some of these requirements represent significant advances over methods currently used. New analytical and screening techniques will increase the value of current sample collections. Improved web-based tools will make information on all samples more accessible to researchers and the public. Advanced curation of current and future extraterrestrial samples includes: Contamination Control - inorganic / organic Temperature of preservation - subfreezing / cryogenic Non-destructive preliminary examination - X-ray tomography / XRF mapping / Raman mapping Microscopic samples - handling / sectioning / transport Special samples - unopened lunar cores Informatics - online catalogs / community-based characterization.
Tsaur, G A; Riger, T O; Popov, A M; Nasedkina, T V; Kustanovich, A M; Solodovnikov, A G; Streneva, O V; Shorikov, E V; Tsvirenko, S V; Saveliev, L I; Fechina, L G
2015-04-01
Minimal residual disease is an important prognostic factor in acute lymphoblastic leukemia in children and adults. In the overwhelming majority of studies, bone marrow is used to detect minimal residual disease. We compared the detection of minimal residual disease in peripheral blood and bone marrow, and evaluated the prognostic significance of minimal residual disease in each compartment during therapy according to the MLL-Baby protocol. The analysis included 142 paired samples from 53 patients younger than 365 days with acute lymphoblastic leukemia and various MLL gene rearrangements. Minimal residual disease was detected by identification of chimeric transcripts using real-time polymerase chain reaction at 7 sequential time points established by the therapy protocol. The concordance of qualitative detection of minimal residual disease between bone marrow and peripheral blood was 84.5%; in all 22 (15.5%) discordant samples, minimal residual disease was detected only in bone marrow. Despite this high concordance, the presence of minimal residual disease in peripheral blood at the various stages of therapy showed no independent prognostic significance. The observed differences were not related to the sensitivity of the method, as determined by the absolute expression of the ABL gene; most likely, they reflect the real distribution of tumor cells. These results demonstrate that using peripheral blood instead of bone marrow to monitor minimal residual disease in infant acute lymphoblastic leukemia is inappropriate.
At the same time, persistence of minimal residual disease at time point 4 in bone marrow was an independent unfavorable prognostic factor during therapy of infant acute lymphoblastic leukemia according to the MLL-Baby protocol (OR = 7.326, confidence interval 2.378-22.565).
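The reported concordance follows directly from the counts given in the abstract. As a quick arithmetic check (an illustrative sketch, not material from the study itself):

```python
# Concordance of paired peripheral-blood / bone-marrow MRD results,
# using the counts reported in the abstract.
total_pairs = 142
discordant = 22  # samples where MRD was detected only in bone marrow

concordance = (total_pairs - discordant) / total_pairs
discordance = discordant / total_pairs

print(f"{concordance:.1%}")  # -> 84.5%
print(f"{discordance:.1%}")  # -> 15.5%
```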
Rapid DNA analysis for automated processing and interpretation of low DNA content samples.
Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F
2016-01-01
Casework samples with low DNA content subjected to short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, sophisticated equipment, transport time, and complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swabs and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types.
The technology broadens the range of sample types that can be processed and minimizes the time between sample collection, sample processing and analysis, and generation of actionable intelligence. The fully integrated Expert System is capable of interpreting a wide range of sample types and input DNA quantities, allowing samples to be processed and interpreted without a technical operator.
MacDonald, Russell D; Thomas, Laura; Rusk, Frederick C; Marques, Shauna D; McGuire, Dan
2010-01-01
Transport medicine personnel are potentially exposed to jet fuel combustion products. Setting-specific data are required to determine whether this poses a risk. This study assessed exposure to jet fuel combustion products, compared various engine ignition scenarios, and determined methods to minimize exposure. The Beechcraft King Air B200 turboprop aircraft equipped with twin turbine engines, using a kerosene-based jet fuel (Jet A-1), was used to measure products of combustion during boarding, engine startup, and flight in three separate engine start scenarios ("shielded": internal engine start, door closed; "exposed": ground power unit start, door open; and "minimized": ground power unit right engine start, door open). Real-time continuous monitoring equipment was used for oxygen, carbon dioxide, carbon monoxide, nitrogen dioxide, hydrogen sulfide, sulfur dioxide, volatile organic compounds, and particulate matter. Integrated methods were used for aldehydes, polycyclic aromatic hydrocarbons, volatile organic compounds, and aliphatic hydrocarbons. Samples were taken in the paramedic breathing zone for approximately 60 minutes, starting just before the paramedics boarded the aircraft. Data were compared against regulated time-weighted exposure thresholds to determine the presence of potentially harmful products of combustion. Polycyclic aromatic hydrocarbons, aldehydes, volatile organic compounds, and aliphatic hydrocarbons were found at very low concentrations or beneath the limits of detection. There were significant differences in exposures to particulates, carbon monoxide, and total volatile organic compounds between the "exposed" and "minimized" scenarios. Elevated concentrations of carbon monoxide and total volatile organic compounds were present during the ground power unit-assisted dual-engine start. There were no appreciable exposures during the "minimized" or "shielded" scenarios.
Air medical personnel exposures to jet fuel combustion products were generally low and did not exceed established U.S. or Canadian health and safety exposure limits. Avoidance of ground power unit-assisted dual-engine starts and closing the hangar door prior to start minimize or eliminate the occupational exposure.
A field-deployable mobile molecular diagnostic system for malaria at the point of need.
Choi, Gihoon; Song, Daniel; Shrestha, Sony; Miao, Jun; Cui, Liwang; Guan, Weihua
2016-11-01
In response to the urgent need for field-deployable and highly sensitive malaria diagnosis, we developed a standalone, "sample-in-answer-out" molecular diagnostic system (AnyMDx) to enable quantitative molecular analysis of blood-borne malaria in low resource areas. The system consists of a durable battery-powered analyzer and a disposable microfluidic compact disc loaded with reagents ready for use. A low power thermal module and a novel fluorescence-sensing module are integrated into the analyzer for real-time monitoring of loop-mediated isothermal nucleic acid amplification (LAMP) of target parasite DNA. With 10 μL of raw blood sample, the AnyMDx system automates the nucleic acid sample preparation and subsequent LAMP and real-time detection. Under laboratory conditions with whole-blood samples spiked with cultured Plasmodium falciparum, we achieved a detection limit of ∼0.6 parasites per μL, much lower than those of conventional microscopy and rapid diagnostic tests (∼50-100 parasites per μL). The turnaround time from sample to answer is less than 40 minutes. The AnyMDx system is user-friendly, requiring minimal technical training. The analyzer and the disposable reagent compact discs are cost-effective, making AnyMDx a potential tool for malaria molecular diagnosis under field settings for malaria elimination.
Lorenz, J.J.; McIvor, C.C.; Powell, G.V.N.; Frederick, P.C.
1997-01-01
We describe a 9 m2 drop net and removable walkways designed to quantify densities of small fishes in wetland habitats with low to moderate vegetation density. The method permits the collection of small, quantitative, discrete samples in ecologically sensitive areas by combining rapid net deployment from fixed sites with the carefully contained use of the fish toxicant rotenone. This method requires very little contact with the substrate, causes minimal alteration to the habitat being sampled, samples small fishes in an unbiased manner, and allows for differential sampling of microhabitats within a wetland. When used in dwarf red mangrove (Rhizophora mangle) habitat in southern Everglades National Park and adjacent areas (September 1990 to March 1993), we achieved high recovery efficiencies (78–90%) for five common species <110 mm in length. We captured 20,193 individuals of 26 species. The most abundant fishes were sheepshead minnow Cyprinodon variegatus, goldspotted killifish Floridichthys carpio, rainwater killifish Lucania parva, sailfin molly Poecilia latipinna, and the exotic Mayan cichlid Cichlasoma urophthalmus. The 9 m2 drop net and associated removable walkways are versatile and can be used in a variety of wetland types, including both interior and coastal wetlands with either herbaceous or woody vegetation.
Simultaneous extraction of proteins and metabolites from cells in culture
Sapcariu, Sean C.; Kanashova, Tamara; Weindl, Daniel; Ghelfi, Jenny; Dittmar, Gunnar; Hiller, Karsten
2014-01-01
Proper sample preparation is an integral part of all omics approaches, and can drastically impact the results of a wide number of analyses. As metabolomics and proteomics research approaches often yield complementary information, it is desirable to have a sample preparation procedure which can yield information for both types of analyses from the same cell population. This protocol explains a method for the separation and isolation of metabolites and proteins from the same biological sample, for downstream use in metabolomics and proteomics analyses simultaneously. In this way, two different levels of biological regulation can be studied in a single sample, minimizing the variance that would result from multiple experiments. This protocol can be used with both adherent and suspension cell cultures, and the extraction of metabolites from cellular medium is also detailed, so that cellular uptake and secretion of metabolites can be quantified. Advantages of this technique include:
1. Inexpensive and quick to perform; this method does not require any kits.
2. Can be used on any cells in culture, including cell lines and primary cells extracted from living organisms.
3. A wide variety of different analysis techniques can be used, adding additional value to metabolomics data analyzed from a sample; this is of high value in experimental systems biology.
PMID:26150938
A Dual Wedge Microneedle for sampling of perilymph solution via round window membrane
Watanabe, Hirobumi; Cardoso, Luis; Lalwani, Anil K.; Kysar, Jeffrey W.
2017-01-01
Objective: Precision medicine for inner-ear disease is hampered by the absence of a methodology to sample inner-ear fluid atraumatically. The round window membrane (RWM) is an attractive portal for accessing cochlear fluids as it heals spontaneously. In this study, we report on the development of a microneedle for perilymph sampling that minimizes the size of the RWM perforation, facilitates quick aspiration, and provides precise volume control. Methods: Considering the mechanical anisotropy of the RWM and the hydrodynamics through a microneedle, a 31G stainless steel pipe was machined into a wedge-shaped design via electrical discharge machining. Guinea pig RWMs were penetrated in vitro, and 1 μL of perilymph was sampled and analyzed via UV-vis spectroscopy. Results: The prototype wedge-shaped needle created an oval perforation with minor and major diameters of 143 and 344 μm (n=6). Sampling took seconds, and the standard deviation of the aspirated volume was 6.8%. The protein concentration was 1.74 mg/mL. Conclusion: The prototype needle facilitated precise perforation of RWMs and rapid aspiration of cochlear fluid with precise volume control. The needle design is promising and requires testing in human cadaveric temporal bone and further optimization to become clinically viable. PMID:26888440
Fonneløp, Ane Elida; Johannessen, Helen; Egeland, Thore; Gill, Peter
2016-07-01
As the profiling systems used in forensic analyses have become more sensitive in recent years, the risk of detecting contamination in a DNA sample has increased proportionally. This requires more stringent work protocols and greater awareness to minimize the chance of contamination. Although awareness of contamination and best-practice procedures is high in forensic labs, the same requirements are not always applied by the police. In this study we investigated the risk of contamination from police staff. Environmental DNA was monitored by performing wipe tests (sampling of hot spots) at two large police units (scenes of crime departments). Additionally, the DNA profiles of the scenes of crime officers were compared to casework samples that their own unit had investigated in the period 2009-2015. Furthermore, a pilot study was carried out to assess whether DNA from the outside packaging of an exhibit could be transferred to a DNA sample. Environmental DNA was detected in various samples from hot spots. Furthermore, 16 instances of previously undetected police-staff contamination were found in casework that had been submitted between 2009 and 2015. In 6 cases the police officers with a matching DNA profile reported that they had not been involved with the case. We demonstrated that DNA from the outside packaging can be transferred to an exhibit during examination. This experience demonstrates that when implementing the new multiplex systems, it is important to ensure that 'best practice' procedures are upgraded and appropriate training is provided, so that police are aware of the increased contamination risks.
On the influence of crystal size and wavelength on native SAD phasing.
Liebschner, Dorothee; Yamada, Yusuke; Matsugaki, Naohiro; Senda, Miki; Senda, Toshiya
2016-06-01
Native SAD is an emerging phasing technique that uses the anomalous signal of native heavy atoms to obtain crystallographic phases. The method does not require specific sample preparation to add anomalous scatterers, as the light atoms contained in the native sample are used as marker atoms. The most abundant anomalous scatterer used for native SAD, which is present in almost all proteins, is sulfur. However, the absorption edge of sulfur is at low energy (2.472 keV = 5.016 Å), which makes it challenging to carry out native SAD phasing experiments as most synchrotron beamlines are optimized for shorter wavelength ranges where the anomalous signal of sulfur is weak; for longer wavelengths, which produce larger anomalous differences, the absorption of X-rays by the sample, solvent, loop and surrounding medium (e.g. air) increases tremendously. Therefore, a compromise has to be found between measuring strong anomalous signal and minimizing absorption. It was thus hypothesized that shorter wavelengths should be used for large crystals and longer wavelengths for small crystals, but no thorough experimental analyses have been reported to date. To study the influence of crystal size and wavelength, native SAD experiments were carried out at different wavelengths (1.9 and 2.7 Å with a helium cone; 3.0 and 3.3 Å with a helium chamber) using lysozyme and ferredoxin reductase crystals of various sizes. For the tested crystals, the results suggest that larger sample sizes do not have a detrimental effect on native SAD data and that long wavelengths give a clear advantage with small samples compared with short wavelengths. The resolution dependency of substructure determination was analyzed and showed that high-symmetry crystals with small unit cells require higher resolution for the successful placement of heavy atoms.
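The energy-wavelength correspondence quoted for the sulfur K edge (2.472 keV = 5.016 Å) follows from the standard photon relation λ[Å] ≈ 12.398 / E[keV]. As an illustrative sketch of that conversion (not code from the paper):

```python
# Photon energy <-> wavelength conversion, lambda[A] = h*c / E.
HC_KEV_ANGSTROM = 12.39842  # h*c expressed in keV * Angstrom

def energy_to_wavelength(e_kev: float) -> float:
    """Photon energy in keV -> wavelength in Angstrom."""
    return HC_KEV_ANGSTROM / e_kev

def wavelength_to_energy(lam_angstrom: float) -> float:
    """Wavelength in Angstrom -> photon energy in keV."""
    return HC_KEV_ANGSTROM / lam_angstrom

# Sulfur K absorption edge, as quoted in the abstract:
print(round(energy_to_wavelength(2.472), 3))  # -> 5.016
```

The same relation explains the trade-off the study examines: the data-collection wavelengths of 1.9-3.3 Å correspond to energies of roughly 6.5 down to 3.8 keV, all well above the sulfur edge, where longer wavelengths strengthen the anomalous signal but also increase absorption.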
Report #10-P-0218, September 8, 2010. With minimal exceptions, our independent sampling results at the Wheeler Pit Superfund Site were consistent with the sampling results that EPA Region 5 has obtained historically.