Construction and testing of a simple and economical soil greenhouse gas automatic sampler
Ginting, D.; Arnold, S.L.; Arnold, N.S.; Tubbs, R.S.
2007-01-01
Quantification of soil greenhouse gas emissions requires considerable sampling to account for spatial and/or temporal variation. With manual sampling, additional personnel are often not available to sample multiple sites within a narrow time interval. The objectives were to construct an automatic gas sampler and to compare the accuracy and precision of automatic versus manual sampling. The automatic sampler was tested with carbon dioxide (CO2) fluxes that mimicked the range of CO2 fluxes during a typical corn-growing season in eastern Nebraska. Gas samples were drawn from the chamber at 0, 5, and 10 min manually and with the automatic sampler. The three samples drawn with the automatic sampler were transferred to pre-vacuumed vials after 1 h; thus the samples in syringe barrels stayed connected with the increasing CO2 concentration in the chamber. The automatic sampler sustains accuracy and precision in greenhouse gas sampling while improving time efficiency and reducing labor stress. Copyright © Taylor & Francis Group, LLC.
Automatic sample Dewar for MX beam-line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charignon, T.; Tanchon, J.; Trollier, T.
2014-01-29
It is very common for crystals of large biological macromolecules to show considerable variation in the quality of their diffraction. In order to increase the number of samples that are tested for diffraction quality before any full data collections at the ESRF*, an automatic sample Dewar has been implemented. The design and performance of the Dewar are reported in this paper. The automatic sample Dewar has a 240-sample capacity with automatic loading/unloading ports. The storage Dewar can work with robots and can be integrated into a fully automatic MX** beam-line. The samples are positioned in front of the loading/unloading ports with an automatic rotating plate. A view port has been implemented for data-matrix camera reading on each sample loaded in the Dewar. The Dewar is insulated with polyurethane foam that keeps the liquid nitrogen consumption below 1.6 L/h, and the static insulation also makes vacuum equipment and maintenance unnecessary. This Dewar will be useful for increasing the number of samples tested at synchrotrons.
NASA Technical Reports Server (NTRS)
Jahnsen, Vilhelm J. (Inventor); Campen, Jr., Charles F. (Inventor)
1980-01-01
A sample processor and method for the automatic extraction of families of compounds, known as extracts, from liquid and/or homogenized solid samples are disclosed. The sample processor includes a tube support structure which supports a plurality of extraction tubes, each containing a sample from which families of compounds are to be extracted. The support structure is moveable automatically with respect to one or more extraction stations, so that as each tube is at each station a solvent system, consisting of a solvent and reagents, is introduced therein. As a result an extract is automatically extracted from the tube. The sample processor includes an arrangement for directing the different extracts from each tube to different containers, or to direct similar extracts from different tubes to the same utilization device.
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products entering the output process stream. Therefore, continuous sampling control of product completeness, based on the use of statistical methods, remains an important element in managing the quality of assembly of multi-element mass products on automatic rotor lines. A particular feature of continuous sampling control of multi-element product completeness during assembly is that the inspection is destructive, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of multi-element product completeness when assembling on automatic rotor lines requires sampling plans that ensure a minimum size of control samples. Comparison of the values of the limit of the average output defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows the possibility of providing lower limit values for the average output defect level using the ACSP-1. Also, the average sample size when using the ACSP-1 plan is less than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, involving the use of the proposed plans and methods for continuous selective control, will allow sampling control procedures to be automated and the required level of quality of the assembled products to be ensured while minimizing sample size.
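As background for the plan comparison above, the sketch below shows how the limiting average outgoing quality is computed for Dodge's classical CSP-1 plan from its standard textbook formulas. The authors' ACSP-1 plan is not described in the abstract and is not reproduced here, and the clearance number and sampling fraction in the example are arbitrary.

```python
# Sketch: average outgoing quality (AOQ) and its limit (AOQL) for Dodge's CSP-1
# continuous sampling plan. Standard textbook formulas are assumed; the ACSP-1
# plan proposed in the paper is not reproduced here.
import numpy as np

def csp1_aoq(p, i, f):
    """AOQ of CSP-1 at incoming fraction defective p, clearance number i,
    sampling fraction f (defectives found are removed from the stream)."""
    q = 1.0 - p
    u = (1.0 - q**i) / (p * q**i)   # mean units screened per 100%-inspection phase
    v = 1.0 / (f * p)               # mean units passed per sampling phase
    return p * (1.0 - f) * v / (u + v)

def csp1_aoql(i, f, grid=np.linspace(1e-4, 0.2, 2000)):
    """AOQL = worst-case AOQ over incoming quality levels."""
    return max(csp1_aoq(p, i, f) for p in grid)

if __name__ == "__main__":
    # Example: clearance number 50, inspect 10% of units during sampling phases.
    print(f"AOQL for CSP-1 (i=50, f=0.1): {csp1_aoql(50, 0.1):.4f}")
```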
Hwang, Mi-Jung; Seol, Geun Hee
2015-01-01
Heel blood sampling is a common but painful procedure for neonates. Automatic lancets have been shown to be more effective, with reduced pain and tissue damage, than manual lancets, but the effects of lancet type on cortical activation have not yet been compared. The study aimed to compare the effects of manual and automatic lancets on cerebral oxygenation and pain of heel blood sampling in 24 premature infants with respiratory distress syndrome. Effectiveness was measured by assessing numbers of pricks and squeezes and duration of heel blood sampling. Pain responses were measured using the premature infant pain profile score, heart rate, and oxygen saturation (SpO2). Regional cerebral oxygen saturation (rScO2) was measured using near-infrared spectroscopy, and cerebral fractional tissue oxygen extraction was calculated from SpO2 and rScO2. Measures of effectiveness were significantly better with automatic than with manual lancing, including fewer heel punctures (P = .009) and squeezes (P < .001) and shorter duration of heel blood sampling (P = .002). rScO2 was significantly higher (P = .013) and cerebral fractional tissue oxygen extraction after puncture significantly lower (P = .040) with automatic lancing. Premature infant pain profile scores during (P = .004) and after (P = .048) puncture were significantly lower in the automatic than in the manual lancet group. Automatic lancets for heel blood sampling in neonates with respiratory distress syndrome significantly reduced pain and enhanced cerebral oxygenation, suggesting that heel blood should be sampled routinely using an automatic lancet.
NASA Technical Reports Server (NTRS)
Kelbaugh, B. N.; Picciolo, G. L.; Chappelle, E. W.; Colburn, M. E. (Inventor)
1973-01-01
An automated apparatus is reported for sequentially assaying urine samples for the presence of bacterial adenosine triphosphate (ATP) that comprises a rotary table which carries a plurality of sample containing vials and automatically dispenses fluid reagents into the vials preparatory to injecting a light producing luciferase-luciferin mixture into the samples. The device automatically measures the light produced in each urine sample by a bioluminescence reaction of the free bacterial adenosine triphosphate with the luciferase-luciferin mixture. The light measured is proportional to the concentration of bacterial adenosine triphosphate which, in turn, is proportional to the number of bacteria present in the respective urine sample.
Comparison of the efficiency between two sampling plans for aflatoxins analysis in maize
Mallmann, Adriano Olnei; Marchioro, Alexandro; Oliveira, Maurício Schneider; Rauber, Ricardo Hummes; Dilkin, Paulo; Mallmann, Carlos Augusto
2014-01-01
Variance and performance of two sampling plans for aflatoxins quantification in maize were evaluated. Eight lots of maize were sampled using two plans: manual, using a sampling spear for kernels; and automatic, using a continuous flow to collect milled maize. Total variance and sampling, preparation, and analysis variance were determined and compared between plans through multifactor analysis of variance. Four theoretical distribution models were used to compare aflatoxins quantification distributions in eight maize lots. The acceptance and rejection probabilities for a lot under a certain aflatoxin concentration were determined using the variance and the information on the selected distribution model to build the operational characteristic (OC) curves. Sampling and total variance were lower for the automatic plan. The OC curve from the automatic plan reduced both consumer and producer risks in comparison to the manual plan. The automatic plan is more efficient than the manual one because it expresses more accurately the real aflatoxin contamination in maize. PMID:24948911
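The operational characteristic curves mentioned above relate a lot's true contamination to its probability of acceptance. A minimal Monte Carlo sketch of that relationship is given below; it assumes, purely for illustration, normally distributed analytical results around the true lot concentration, whereas the paper fits other distribution models, and the limit and variance values are hypothetical.

```python
# Sketch: operating characteristic (OC) curve from a sampling plan's total
# variance, assuming normally distributed measurement results around the true
# lot concentration (illustration only; the paper fits other distribution models).
import numpy as np

def acceptance_probability(true_conc, accept_limit, total_variance, n=100_000, seed=0):
    """P(measured result <= accept_limit) for a lot at true_conc (e.g. in µg/kg)."""
    rng = np.random.default_rng(seed)
    results = rng.normal(true_conc, np.sqrt(total_variance), size=n)
    return float(np.mean(results <= accept_limit))

if __name__ == "__main__":
    limit = 20.0        # hypothetical acceptance limit, µg/kg
    variance = 25.0     # hypothetical total variance of the plan
    for c in (5, 10, 15, 20, 25, 30):
        print(f"true = {c:>2} µg/kg  P(accept) = {acceptance_probability(c, limit, variance):.3f}")
```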
Automatic multiple applicator electrophoresis
NASA Technical Reports Server (NTRS)
Grunbaum, B. W.
1977-01-01
Easy-to-use, economical device permits electrophoresis on all known supporting media. System includes automatic multiple-sample applicator, sample holder, and electrophoresis apparatus. System has potential applicability to fields of taxonomy, immunology, and genetics. Apparatus is also used for electrofocusing.
A new method for automatic discontinuity traces sampling on rock mass 3D model
NASA Astrophysics Data System (ADS)
Umili, G.; Ferrero, A.; Einstein, H. H.
2013-02-01
A new automatic method for discontinuity traces mapping and sampling on a rock mass digital model is described in this work. The implemented procedure allows one to automatically identify discontinuity traces on a Digital Surface Model: traces are detected directly as surface breaklines, by means of the maximum and minimum principal curvature values of the vertices that constitute the model surface. Color influence and user errors, which usually characterize trace mapping on images, are eliminated. Trace sampling procedures based on circular windows and circular scanlines have also been implemented: they are used to infer trace data and to calculate values of mean trace length, expected discontinuity diameter and intensity of rock discontinuities. The method is tested on a case study: results obtained by applying the automatic procedure to the DSM of a rock face are compared to those obtained by performing a manual sampling on the orthophotograph of the same rock face.
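A simplified sketch of the vertex-selection step is given below. It assumes the maximum and minimum principal curvatures have already been estimated for every vertex of the Digital Surface Model; the threshold values, and the curvature estimation itself, are placeholders rather than the authors' implementation.

```python
# Sketch: flag DSM vertices as candidate discontinuity-trace (breakline) points
# when their principal curvatures exceed user thresholds. The curvature arrays
# k_max and k_min are assumed to be precomputed for each mesh vertex.
import numpy as np

def flag_breakline_vertices(k_max, k_min, t_max, t_min):
    """Return a boolean mask of vertices lying on convex or concave breaklines."""
    k_max = np.asarray(k_max)
    k_min = np.asarray(k_min)
    convex = k_max > t_max          # strong positive maximum curvature
    concave = k_min < -t_min        # strong negative minimum curvature
    return convex | concave

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    k_max = rng.normal(0.0, 0.5, 10_000)   # synthetic curvature values
    k_min = rng.normal(0.0, 0.5, 10_000)
    mask = flag_breakline_vertices(k_max, k_min, t_max=1.0, t_min=1.0)
    print(f"{mask.sum()} of {mask.size} vertices flagged as trace candidates")
```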
NASA Astrophysics Data System (ADS)
Pavlov, S. S.; Dmitriev, A. Yu.; Chepurchenko, I. A.; Frontasyeva, M. V.
2014-11-01
An automation system for the measurement of gamma-ray spectra of induced activity for multi-element, high-volume neutron activation analysis (NAA) was designed, developed and implemented at the IBR-2 reactor of the Frank Laboratory of Neutron Physics. The system consists of three automatic sample changers for three Canberra HPGe detector-based gamma-spectrometry systems. Each sample changer consists of a two-axis linear positioning module (M202A, DriveSet) and a disk with 45 slots for containers with samples. The automatic sample changer is controlled by a Xemo S360U controller (Systec). Positioning accuracy can reach 0.1 mm. Special software performs automatic changing of samples and measurement of gamma spectra while constantly interacting with the NAA database.
Trefz, Phillip; Rösner, Lisa; Hein, Dietmar; Schubert, Jochen K; Miekisch, Wolfram
2013-04-01
Needle trap devices (NTDs) have shown many advantages such as improved detection limits, reduced sampling time and volume, improved stability, and reproducibility if compared with other techniques used in breath analysis such as solid-phase extraction and solid-phase micro-extraction. Effects of sampling flow (2-30 ml/min) and volume (10-100 ml) were investigated in dry gas standards containing hydrocarbons, aldehydes, and aromatic compounds and in humid breath samples. NTDs contained (single-bed) polymer packing and (triple-bed) combinations of divinylbenzene/Carbopack X/Carboxen 1000. Substances were desorbed from the NTDs by means of thermal expansion and analyzed by gas chromatography-mass spectrometry. An automated CO2-controlled sampling device for direct alveolar sampling at the point-of-care was developed and tested in pilot experiments. Adsorption efficiency for small volatile organic compounds decreased and breakthrough increased when sampling was done with polymer needles from a water-saturated matrix (breath) instead from dry gas. Humidity did not affect analysis with triple-bed NTDs. These NTDs showed only small dependencies on sampling flow and low breakthrough from 1-5 %. The new sampling device was able to control crucial parameters such as sampling flow and volume. With triple-bed NTDs, substance amounts increased linearly with increasing sample volume when alveolar breath was pre-concentrated automatically. When compared with manual sampling, automatic sampling showed comparable or better results. Thorough control of sampling and adequate choice of adsorption material is mandatory for application of needle trap micro-extraction in vivo. The new CO2-controlled sampling device allows direct alveolar sampling at the point-of-care without the need of any additional sampling, storage, or pre-concentration steps.
AN ASSESSMENT OF AUTOMATIC SEWER FLOW SAMPLERS (EPA/600/2-75/065)
A brief review of the characteristics of storm and combined sewer flows is given followed by a general discussion of the purposes for and requirements of a sampling program. The desirable characteristics of automatic sampling equipment are set forth and problem areas are outlined...
Automated mass spectrometer analysis system
NASA Technical Reports Server (NTRS)
Giffin, Charles E. (Inventor); Kuppermann, Aron (Inventor); Dreyer, William J. (Inventor); Boettger, Heinz G. (Inventor)
1982-01-01
An automated mass spectrometer analysis system is disclosed, in which samples are automatically processed in a sample processor and converted into volatilizable samples, or their characteristic volatilizable derivatives. Each volatilizable sample is sequentially volatilized and analyzed in a double focusing mass spectrometer, whose output is in the form of separate ion beams all of which are simultaneously focused in a focal plane. Each ion beam is indicative of a different sample component or different fragments of one or more sample components and the beam intensity is related to the relative abundance of the sample component. The system includes an electro-optical ion detector which automatically and simultaneously converts the ion beams, first into electron beams which in turn produce a related image which is transferred to the target of a vidicon unit. The latter converts the images into electrical signals which are supplied to a data processor, whose output is a list of the components of the analyzed sample and their abundances. The system is under the control of a master control unit, which in addition to monitoring and controlling various power sources, controls the automatic operation of the system under expected and some unexpected conditions and further protects various critical parts of the system from damage due to particularly abnormal conditions.
Automated mass spectrometer analysis system
NASA Technical Reports Server (NTRS)
Boettger, Heinz G. (Inventor); Giffin, Charles E. (Inventor); Dreyer, William J. (Inventor); Kuppermann, Aron (Inventor)
1978-01-01
An automated mass spectrometer analysis system is disclosed, in which samples are automatically processed in a sample processor and converted into volatilizable samples, or their characteristic volatilizable derivatives. Each volatilizable sample is sequentially volatilized and analyzed in a double focusing mass spectrometer, whose output is in the form of separate ion beams all of which are simultaneously focused in a focal plane. Each ion beam is indicative of a different sample component or different fragments of one or more sample components and the beam intensity is related to the relative abundance of the sample component. The system includes an electro-optical ion detector which automatically and simultaneously converts the ion beams, first into electron beams which in turn produce a related image which is transferred to the target of a vidicon unit. The latter converts the images into electrical signals which are supplied to a data processor, whose output is a list of the components of the analyzed sample and their abundances. The system is under the control of a master control unit, which in addition to monitoring and controlling various power sources, controls the automatic operation of the system under expected and some unexpected conditions and further protects various critical parts of the system from damage due to particularly abnormal conditions.
Irregular and adaptive sampling for automatic geophysic measure systems
NASA Astrophysics Data System (ADS)
Avagnina, Davide; Lo Presti, Letizia; Mulassano, Paolo
2000-07-01
In this paper a sampling method, based on an irregular and adaptive strategy, is described. It can be used as an automatic guide for rovers designed to explore terrestrial and planetary environments. Starting from the hypothesis that an exploratory vehicle is equipped with a payload able to acquire measurements of quantities of interest, the method is able to detect objects of interest from the measured points and to realize adaptive sampling, while describing the uninteresting background only coarsely.
NASA Astrophysics Data System (ADS)
Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.
2018-01-01
Most of the procedures in the neutron activation analysis (NAA) process that has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s were performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient, especially the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC) that consists of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
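The control flow described (change the sample, configure the measurement, count, repeat) is sketched below in outline form. The actual software is written in LabVIEW and drives GammaVision; every function in the sketch is a hypothetical stub used only to show the loop structure.

```python
# Sketch of the control flow only: cycle through sample positions, move each
# sample into the detector, run a fixed-length measurement, then save the
# spectrum. Every function below is a hypothetical stub: the real system is
# implemented in LabVIEW and calls GammaVision for the measurement itself.
import time

N_SAMPLES = 30           # capacity stated in the abstract
COUNT_TIME_S = 3600      # one-hour counting time per sample

def move_sample_to_detector(position: int) -> None:
    """Hypothetical stub: command the changer hardware to load one sample."""
    print(f"loading sample {position}")

def acquire_spectrum(count_time_s: int, position: int) -> None:
    """Hypothetical stub: start, wait for, and save one gamma-ray measurement."""
    print(f"counting sample {position} for {count_time_s} s")
    time.sleep(0.01)     # placeholder for the real counting time

def run_batch() -> None:
    for position in range(1, N_SAMPLES + 1):
        move_sample_to_detector(position)
        acquire_spectrum(COUNT_TIME_S, position)

if __name__ == "__main__":
    run_batch()
```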
Isotopic (δ18O/δ2H) integrity of water samples collected and stored by automatic samplers
USDA-ARS?s Scientific Manuscript database
Stable water isotopes are increasingly becoming part of routine monitoring programs that utilize automatic samplers. The objectives of this study were to quantify the uncertainty in isotope signatures due to the length of sample storage (1-24 d) inside autosamplers over a range of air temperatures (...
Set Up of an Automatic Water Quality Sampling System in Irrigation Agriculture
Heinz, Emanuel; Kraft, Philipp; Buchen, Caroline; Frede, Hans-Georg; Aquino, Eugenio; Breuer, Lutz
2014-01-01
We have developed a high-resolution automatic sampling system for continuous in situ measurements of stable water isotopic composition and nitrogen solutes along with hydrological information. The system facilitates concurrent monitoring of a large number of water and nutrient fluxes (ground, surface, irrigation and rain water) in irrigated agriculture. For this purpose we couple an automatic sampling system with a Wavelength-Scanned Cavity Ring Down Spectrometry System (WS-CRDS) for stable water isotope analysis (δ2H and δ18O), a reagentless hyperspectral UV photometer (ProPS) for monitoring nitrate content and various water level sensors for hydrometric information. The automatic sampling system consists of different sampling stations equipped with pumps, a switch cabinet for valve and pump control and a computer operating the system. The complete system is operated via internet-based control software, allowing supervision from nearly anywhere. The system is currently set up at the International Rice Research Institute (Los Baños, The Philippines) in a diversified rice growing system to continuously monitor water and nutrient fluxes. Here we present the system's technical set-up and provide initial proof-of-concept with results for the isotopic composition of different water sources and nitrate values from the 2012 dry season. PMID:24366178
On the Relationship Between Automatic Attitudes and Self-Reported Sexual Assault in Men
Widman, Laura; Olson, Michael
2013-01-01
Research and theory suggest rape supportive attitudes are important predictors of sexual assault; yet, to date, rape supportive attitudes have been assessed exclusively through self-report measures that are methodologically and theoretically limited. To address these limitations, the objectives of the current project were to: (1) develop a novel implicit rape attitude assessment that captures automatic attitudes about rape and does not rely on self-reports, and (2) examine the association between automatic rape attitudes and sexual assault perpetration. We predicted that automatic rape attitudes would be a significant unique predictor of sexual assault even when self-reported rape attitudes (i.e., rape myth acceptance and hostility toward women) were controlled. We tested the generalizability of this prediction in two independent samples: a sample of undergraduate college men (n = 75, M age = 19.3 years) and a sample of men from the community (n = 50, M age = 35.9 years). We found the novel implicit rape attitude assessment was significantly associated with the frequency of sexual assault perpetration in both samples and contributed unique variance in explaining sexual assault beyond rape myth acceptance and hostility toward women. We discuss the ways in which future research on automatic rape attitudes may significantly advance measurement and theory aimed at understanding and preventing sexual assault. PMID:22618119
7 CFR 58.227 - Sampling device.
Code of Federal Regulations, 2010 CFR
2010-01-01
7 Agriculture 3, 2010-01-01. 58.227 Sampling device. If automatic sampling devices are used, they shall be constructed in such a ... The type of sampler and the sampling procedure shall be as approved by the Administrator.
7 CFR 58.227 - Sampling device.
Code of Federal Regulations, 2011 CFR
2011-01-01
7 Agriculture 3, 2011-01-01. 58.227 Sampling device. If automatic sampling devices are used, they shall be constructed in such a ... The type of sampler and the sampling procedure shall be as approved by the Administrator.
2011-01-01
Background: Bioinformatics data analysis often uses a linear mixture model that represents samples as an additive mixture of components. Properly constrained blind matrix factorization methods extract those components using mixture samples only. However, automatic selection of the extracted components to be retained for classification analysis remains an open issue. Results: The method proposed here is applied to well-studied protein and genomic datasets of ovarian, prostate and colon cancers to extract components for disease prediction. It achieves average sensitivities of 96.2% (sd = 2.7%), 97.6% (sd = 2.8%) and 90.8% (sd = 5.5%) and average specificities of 93.6% (sd = 4.1%), 99% (sd = 2.2%) and 79.4% (sd = 9.8%) in 100 independent two-fold cross-validations. Conclusions: We propose an additive mixture model of a sample for feature extraction using, in principle, sparseness constrained factorization on a sample-by-sample basis. As opposed to that, existing methods factorize the complete dataset simultaneously. The sample model is composed of a reference sample representing control and/or case (disease) groups and a test sample. Each sample is decomposed into two or more components that are selected automatically (without using label information) as control specific, case specific and not differentially expressed (neutral). The number of components is determined by cross-validation. Automatic assignment of features (m/z ratios or genes) to a particular component is based on thresholds estimated from each sample directly. Due to the locality of decomposition, the strength of the expression of each feature across the samples can vary, yet features will still be allocated to the related disease and/or control specific component. Since label information is not used in the selection process, case and control specific components can be used for classification. That is not the case with standard factorization methods. Moreover, the component selected by the proposed method as disease specific can be interpreted as a sub-mode and retained for further analysis to identify potential biomarkers. As opposed to standard matrix factorization methods, this can be achieved on a sample (experiment)-by-sample basis. Postulating one or more components with indifferent features enables their removal from disease and control specific components on a sample-by-sample basis. This yields selected components with reduced complexity and, generally, increased prediction accuracy. PMID:22208882
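A rough sketch of the per-sample decomposition idea follows, with an off-the-shelf non-negative matrix factorization standing in for the sparseness-constrained factorization used in the paper; the stacking of one reference and one test profile, the number of components, and the synthetic data are all illustrative assumptions.

```python
# Sketch: decompose one (reference, test) pair of non-negative profiles into
# components on a sample-by-sample basis. sklearn's plain NMF stands in for the
# sparseness-constrained factorization used in the paper (illustration only).
import numpy as np
from sklearn.decomposition import NMF

def decompose_pair(reference, test, n_components=2, seed=0):
    """Factorize the 2 x m matrix [reference; test] as W @ H; return the mixing
    weights W (one row per sample) and the component profiles H."""
    X = np.vstack([reference, test])                  # shape (2, m), non-negative
    model = NMF(n_components=n_components, init="nndsvda",
                max_iter=1000, random_state=seed)
    W = model.fit_transform(X)                        # (2, k) mixing weights
    H = model.components_                             # (k, m) component profiles
    return W, H

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m = 500
    reference = rng.gamma(2.0, 1.0, m)                # synthetic control profile
    test = reference + rng.gamma(1.0, 0.5, m)         # synthetic case profile
    W, H = decompose_pair(reference, test)
    print("mixing weights per sample:\n", np.round(W, 2))
```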
Identification of forensic samples by using an infrared-based automatic DNA sequencer.
Ricci, Ugo; Sani, Ilaria; Klintschar, Michael; Cerri, Nicoletta; De Ferrari, Francesco; Giovannucci Uzielli, Maria Luisa
2003-06-01
We have recently introduced a new protocol for analyzing all core loci of the Federal Bureau of Investigation's (FBI) Combined DNA Index System (CODIS) with an infrared (IR) automatic DNA sequencer (LI-COR 4200). The amplicons were labeled with forward oligonucleotide primers, covalently linked to a new infrared fluorescent molecule (IRDye 800). The alleles were displayed as familiar autoradiogram-like images with real-time detection. This protocol was employed for paternity testing, population studies, and identification of degraded forensic samples. We extensively analyzed some simulated forensic samples and mixed stains (blood, semen, saliva, bones, and fixed archival embedded tissues), comparing the results with donor samples. Sensitivity studies were also performed for the four multiplex systems. Our results show the efficiency, reliability, and accuracy of the IR system for the analysis of forensic samples. We also compared the efficiency of the multiplex protocol with ultraviolet (UV) technology. Paternity tests, undegraded DNA samples, and real forensic samples were analyzed with this approach based on IR technology and with UV-based automatic sequencers in combination with commercially-available kits. The comparability of the results with the widespread UV methods suggests that it is possible to exchange data between laboratories using the same core group of markers but different primer sets and detection methods.
Development of an automatic volcanic ash sampling apparatus for active volcanoes
NASA Astrophysics Data System (ADS)
Shimano, Taketo; Nishimura, Takeshi; Chiga, Nobuyuki; Shibasaki, Yoshinobu; Iguchi, Masato; Miki, Daisuke; Yokoo, Akihiko
2013-12-01
We develop an automatic system for the sampling of ash fall particles, to be used for continuous monitoring of magma ascent and eruptive dynamics at active volcanoes. The system consists of a sampling apparatus and cameras to monitor surface phenomena during eruptions. The Sampling Apparatus for Time Series Unmanned Monitoring of Ash (SATSUMA-I and SATSUMA-II) is less than 10 kg in weight and works automatically for more than a month with a 10-kg lead battery to obtain a total of 30 to 36 samples in one cycle of operation. The time range covered in one cycle varies from less than an hour to several months, depending on the aims of observation, allowing researchers to target minute-scale fluctuations in a single eruptive event, as well as daily to weekly trends in persistent volcanic activity. The latest version, SATSUMA-II, also enables control of sampling parameters remotely by e-mail commands. Durability of the apparatus is high: our prototypes worked for several months, in rainy and typhoon seasons, at windy and humid locations, and under strong sunlight. We have been successful in collecting ash samples emitted from Showa crater almost everyday for more than 4 years (2008-2012) at Sakurajima volcano in southwest Japan.
Method and apparatus for telemetry adaptive bandwidth compression
NASA Technical Reports Server (NTRS)
Graham, Olin L.
1987-01-01
Methods and apparatus are provided for automatic and/or manual adaptive bandwidth compression of telemetry. An adaptive sampler samples a video signal from a scanning sensor and generates a sequence of sampled fields. Each field and range rate information from the sensor are hence sequentially transmitted to and stored in a multiple and adaptive field storage means. The field storage means then, in response to an automatic or manual control signal, transfers the stored sampled field signals to a video monitor in a form for sequential or simultaneous display of a desired number of stored signal fields. The sampling ratio of the adaptive sampler, the relative proportion of available communication bandwidth allocated respectively to transmitted data and video information, and the number of fields simultaneously displayed are manually or automatically selectively adjustable in functional relationship to each other and the detected range rate. In one embodiment, when relatively little or no scene motion is detected, the control signal maximizes the sampling ratio and causes simultaneous display of all stored fields, thus maximizing resolution and bandwidth available for data transmission. When increased scene motion is detected, the control signal is adjusted accordingly to cause display of fewer fields. If greater resolution is desired, the control signal is adjusted to increase the sampling ratio.
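The trade-off described in the last two sentences can be illustrated with a short sketch: the sampling ratio and the number of displayed fields are chosen as a function of the detected range rate. The thresholds and the specific mapping below are invented for illustration and are not taken from the patent.

```python
# Sketch: choose the telemetry sampling ratio and the number of stored video
# fields to display as a function of detected scene motion (range rate).
# Thresholds and the specific mapping are hypothetical illustrations of the
# adaptive trade-off described in the patent abstract.
def select_compression(range_rate, max_fields=8):
    """Return (sampling_ratio, fields_displayed) for a given range rate."""
    r = abs(range_rate)
    if r < 0.1:            # essentially static scene: maximize resolution
        return max_fields, max_fields
    if r < 1.0:            # moderate motion: intermediate trade-off
        return max_fields // 2, max_fields // 2
    return 1, 1            # fast motion: full-rate sampling of a single field

if __name__ == "__main__":
    for rate in (0.0, 0.5, 5.0):
        ratio, fields = select_compression(rate)
        print(f"range rate {rate:4.1f} -> sampling ratio {ratio}, display {fields} field(s)")
```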
NASA Astrophysics Data System (ADS)
Paiè, Petra; Bassi, Andrea; Bragheri, Francesca; Osellame, Roberto
2017-02-01
Selective plane illumination microscopy (SPIM) is an optical sectioning technique that allows imaging of biological samples at high spatio-temporal resolution. Standard SPIM devices require dedicated set-ups, complex sample preparation and accurate system alignment, thus limiting the automation of the technique, its accessibility and throughput. We present a millimeter-scaled optofluidic device that incorporates selective plane illumination and fully automatic sample delivery and scanning. To this end an integrated cylindrical lens and a three-dimensional fluidic network were fabricated by femtosecond laser micromachining into a single glass chip. This device can upgrade any standard fluorescence microscope to a SPIM system. We used SPIM on a CHIP to automatically scan biological samples under a conventional microscope, without the need of any motorized stage: tissue spheroids expressing fluorescent proteins were flowed in the microchannel at constant speed and their sections were acquired while passing through the light sheet. We demonstrate high-throughput imaging of the entire sample volume (with a rate of 30 samples/min), segmentation and quantification in thick (100-300 μm diameter) cellular spheroids. This optofluidic device gives access to SPIM analyses to non-expert end-users, opening the way to automatic and fast screening of a high number of samples at subcellular resolution.
An automated atmospheric sampling system operating on 747 airliners
NASA Technical Reports Server (NTRS)
Perkins, P. J.; Gustafsson, U. R. C.
1976-01-01
An air sampling system that automatically measures the temporal and spatial distribution of particulate and gaseous constituents of the atmosphere is collecting data on commercial air routes covering the world. Measurements are made in the upper troposphere and lower stratosphere (6 to 12 km) of constituents related to aircraft engine emissions and other pollutants. Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This unique system includes specialized instrumentation, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituent and related flight data are tape recorded in flight for later computer processing on the ground.
40 CFR Appendix E to Part 403 - Sampling Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
... done manually or automatically, and discretely or continuously. If discrete sampling is employed, at least 12 aliquots should be composited. Discrete sampling may be flow proportioned either by varying the...
[Application of automatic photography in Schistosoma japonicum miracidium hatching experiments].
Ming-Li, Zhou; Ai-Ling, Cai; Xue-Feng, Wang
2016-05-20
To explore the value of automatic photography in the observation of results of Schistosoma japonicum miracidium hatching experiments. Some fresh S. japonicum eggs were added into cow feces, and the samples of feces were divided into a low infested experimental group and a high infested group (40 samples each group). In addition, there was a negative control group with 40 samples of cow feces without S. japonicum eggs. The conventional nylon bag S. japonicum miracidium hatching experiments were performed. The process was observed with the method of flashlight and magnifying glass combined with automatic video (automatic photography method) and, at the same time, with the naked eye observation method. The results were compared. In the low infested group, the miracidium positive detection rates were 57.5% and 85.0% by the naked eye observation method and the automatic photography method, respectively (χ2 = 11.723, P < 0.05). In the high infested group, the positive detection rates were 97.5% and 100% by the naked eye observation method and the automatic photography method, respectively (χ2 = 1.253, P > 0.05). In the two infested groups, the average positive detection rates were 77.5% and 92.5% by the naked eye observation method and the automatic photography method, respectively (χ2 = 6.894, P < 0.05). Automatic photography can effectively improve the positive detection rate in S. japonicum miracidium hatching experiments.
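The rate comparisons above are 2x2 chi-square tests of positive versus negative detections under the two observation methods. A minimal sketch of such a test is shown below with hypothetical counts; it is not intended to reproduce the chi-square values reported in the abstract.

```python
# Sketch: comparing positive detection rates of two observation methods with a
# 2x2 chi-square test. The counts below are hypothetical (40 samples per
# method) and do not reproduce the chi-square values in the paper.
from scipy.stats import chi2_contingency

# rows: observation method; columns: positive, negative
table = [[23, 17],   # naked-eye observation
         [34,  6]]   # automatic photography
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.4f}, dof = {dof}")
```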
Automation of a flocculation test for syphilis on Groupamatic equipment.
Garretta, M; Paris-Hamelin, A; Gener, J; Muller, A; Matte, C; Vaisman, A
1975-01-01
A flocculation reaction employing a cardiolipid antigen was used for syphilis screening on Groupamatic equipment in parallel with conventional screening reactions: Kolmer CF, RPCF, Kahn, Kline, and RPR. The positive samples were confirmed by FTA-200, FTA-ABS, TPI, and in some cases by TPHA. There were 5,212 known samples which had already been tested by all methods and of which 1,648 were positive, and 58,636 screened samples including 65 positives. Half of the samples in the first series were taken without anticoagulant; the remainder were collected in potassium EDTA. The percentage of false positives with the Groupamatic was about 1.4 per cent. The percentage of false negatives among positive (≥ +) samples varied from 0.18 to 1.3 per cent; on the other hand the sensitivity was less good for samples giving doubtful and/or dissociated reactions in conventional screening reactions. The specificity and sensitivity of this technique are acceptable for a blood transfusion centre. The reproducibility is excellent and the automatic reading of results accurate. Additional advantages are rapidity (340 samples processed per hour); simultaneous performance of eleven other immunohaematological reactions; no contamination between samples; automatic reading, interpretation, and print-out of results; and saving of time because samples are not filed sequentially and are automatically identified when the results are obtained. Although the importance of syphilis in blood transfusion seems small, estimates of the risk are difficult and further investigations are planned. PMID:1098731
Apparatus for microbiological sampling. [including automatic swabbing]
NASA Technical Reports Server (NTRS)
Wilkins, J. R.; Mills, S. M. (Inventor)
1974-01-01
An automatic apparatus is described for microbiologically sampling surfaces using a cotton swab, which eliminates human error. The apparatus includes a self-powered transport device, such as a motor-driven wheeled cart, which mounts a swabbing motor drive for a crank arm which supports a swab in the free end thereof. The swabbing motor is pivotably mounted, and an actuator rod movable in response to the cart traveling a predetermined distance provides lifting of the swab from the surface being sampled and reversal of the direction of travel of the cart.
Hsieh, Cheng-Huan; Meher, Anil Kumar; Chen, Yu-Chie
2013-01-01
Contactless atmospheric pressure ionization (C-API) method has been recently developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables the automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, whereas the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet on the well can be readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, the sample spray is readily formed in the proximity of the mass spectrometer applied with a high electric field. The gas phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, the well containing the rinsing solvent is alternately arranged between the sample wells. Therefore, the C-API capillary could be readily flushed between runs. No carryover problems are observed during the analyses. The sample volume required for the C-API MS analysis is minimal, with less than 1 nL of the sample solution being sufficient for analysis. The feasibility of using this setup for quantitative analysis is also demonstrated.
Fully automatic characterization and data collection from crystals of biological macromolecules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander
A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.
System automatically supplies precise analytical samples of high-pressure gases
NASA Technical Reports Server (NTRS)
Langdon, W. M.
1967-01-01
High-pressure-reducing and flow-stabilization system delivers analytical gas samples from a gas supply. The system employs parallel capillary restrictors for pressure reduction and downstream throttling valves for flow control. It is used in conjunction with a sampling valve and minimizes alterations of the sampled gas.
Sampling Error in a Particulate Mixture: An Analytical Chemistry Experiment.
ERIC Educational Resources Information Center
Kratochvil, Byron
1980-01-01
Presents an undergraduate experiment demonstrating sampling error. The sampling system selected is a mixture of potassium hydrogen phthalate and sucrose; a self-zeroing, automatically refillable buret is used to minimize the titration time of multiple samples, and a dilute back-titrant is employed to obtain high end-point precision. (CS)
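The sampling statistics behind the experiment can be summarized with the standard binomial relation for a well-mixed two-component particulate mixture (a textbook result stated here as background, assuming particles of roughly equal size and density; it is not quoted from the article):

```latex
% Relative standard deviation of the analyte content of a random sample of n
% particles drawn from a well-mixed two-component mixture in which a fraction p
% of the particles is the analyte (here, potassium hydrogen phthalate):
\[
  \sigma_r \;=\; \frac{\sqrt{n\,p\,(1-p)}}{n\,p} \;=\; \sqrt{\frac{1-p}{n\,p}} ,
\]
% so the relative sampling error falls as $1/\sqrt{n}$ and grows as the analyte
% fraction $p$ becomes small, which is the sampling-error behaviour the
% experiment is designed to demonstrate.
```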
Automatic frequency control for FM transmitter
NASA Technical Reports Server (NTRS)
Honnell, M. A. (Inventor)
1974-01-01
An automatic frequency control circuit for an FM television transmitter is described. The frequency of the transmitter is sampled during what is termed the back porch portion of the horizontal synchronizing pulse which occurs during the retrace interval, the frequency sample compared with the frequency of a reference oscillator, and a correction applied to the frequency of the transmitter during this portion of the retrace interval.
The dynamics of rupture in porous media
NASA Astrophysics Data System (ADS)
Stopiński, Wojciech; Ponomaryov, Aleksandr V.; Loś, Vladimir
1991-05-01
This paper presents a laboratory investigation of the electric resistivity parameter for samples subjected to loading in an automatic press of the “INOVA” type. The procedure of automatic quasi-continuous measurements of resistivity is briefly outlined. The distribution of mini-electrodes within the sample is described. Also shown is the manner in which reliability can be improved by increasing the repetition of resistivity measurements (every 7-16 s).
Presley, Todd K.; Jamison, Marcael T.J.
2009-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. The program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream, and to assess the effects from the H-1 storm drain on Manoa Stream. For this program, rainfall data were collected at three stations, continuous discharge data at five stations, and water-quality data at six stations, which include the five continuous discharge stations. This report summarizes rainfall, discharge, and water-quality data collected between July 1, 2008, and June 30, 2009. Within the Halawa Stream drainage area, three storms (October 25 and December 11, 2008, and February 3, 2009) were sampled during July 1, 2008, to June 30, 2009. A total of 43 environmental samples were collected during these three storms. During the storm of October 25, 2008, 31 samples were collected and analyzed individually for metals only. The other 12 samples from the other two storms were analyzed for some or all of the following analytes: total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, and zinc). Additionally, grab samples were analyzed for some or all of the following analytes: oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Some grab and composite samples were analyzed for only a partial list of these analytes, either because samples could not be delivered to the laboratory in a timely manner, or an insufficient volume of sample was collected by the automatic samplers. Two quality-assurance/quality-control samples were collected after cleaning automatic sampler lines to verify that the sampling lines were not contaminated. Four environmental samples were collected at the H-1 Storm Drain during July 1, 2008, to June 30, 2009. An oil and grease sample and a composite sample were collected during the storm on November 15, 2008, and two composite samples were collected during the January 11, 2009, storm. All samples at this site were collected using an automatic sampler. Samples were analyzed for some or all of the following analytes: total suspended solids, nutrients, oil and grease, total petroleum hydrocarbons, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). One quality-assurance/quality-control sample was collected after cleaning automatic sampler lines to verify that the sampling lines were not contaminated. During the storm of January 11, 2009, the two composite samples collected at H-1 Storm Drain were collected about three hours apart. Higher constituent concentrations were detected in the first composite sample relative to the second composite sample, although the average discharge was higher during the period when the second sample was collected.
An automated atmospheric sampling system operating on 747 airliners
NASA Technical Reports Server (NTRS)
Perkins, P.; Gustafsson, U. R. C.
1975-01-01
An air sampling system that automatically measures the temporal and spatial distribution of selected particulate and gaseous constituents of the atmosphere has been installed on a number of commercial airliners and is collecting data on commercial air routes covering the world. Measurements of constituents related to aircraft engine emissions and other pollutants are made in the upper troposphere and lower stratosphere (6 to 12 km) in support of the Global Air Sampling Program (GASP). Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This system includes specialized instrumentation for measuring carbon monoxide, ozone, water vapor, and particulates, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituents and related flight data are tape recorded in flight for later computer processing on the ground.
Rotor assembly and method for automatically processing liquids
Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.
1992-01-01
A rotor assembly for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water, includes a rotor body for rotation about an axis and including a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses.
Implementation guide for turbidity threshold sampling: principles, procedures, and analysis
Jack Lewis; Rand Eads
2009-01-01
Turbidity Threshold Sampling uses real-time turbidity and river stage information to automatically collect water quality samples for estimating suspended sediment loads. The system uses a programmable data logger in conjunction with a stage measurement device, a turbidity sensor, and a pumping sampler. Specialized software enables the user to control the sampling...
Automatic bio-sample bacteria detection system
NASA Technical Reports Server (NTRS)
Chappelle, E. W.; Colburn, M.; Kelbaugh, B. N.; Picciolo, G. L.
1971-01-01
Electromechanical device analyzes urine specimens in 15 minutes and processes one sample per minute. Instrument utilizes bioluminescent reaction between luciferase-luciferin mixture and adenosine triphosphate (ATP) to determine number of bacteria present in the sample. Device has potential application to analysis of other body fluids.
NASA Astrophysics Data System (ADS)
Hu, Li; Zhao, Nanjing; Liu, Wenqing; Meng, Deshuo; Fang, Li; Wang, Yin; Yu, Yang; Ma, Mingjun
2015-08-01
Heavy metals in water can be deposited on graphite flakes, which can be used as an enrichment method for laser-induced breakdown spectroscopy (LIBS) and is studied in this paper. The graphite samples were prepared with an automatic device, which was composed of a loading and unloading module, a quantitatively adding solution module, a rapid heating and drying module and a precise rotating module. The experimental results showed that the sample preparation methods had no significant effect on sample distribution and the LIBS signal accumulated in 20 pulses was stable and repeatable. With an increasing amount of the sample solution on the graphite flake, the peak intensity at Cu I 324.75 nm accorded with the exponential function with a correlation coefficient of 0.9963 and the background intensity remained unchanged. The limit of detection (LOD) was calculated through linear fitting of the peak intensity versus the concentration. The LOD decreased rapidly with an increasing amount of sample solution until the amount exceeded 20 mL and the correlation coefficient of exponential function fitting was 0.991. The LOD of Pb, Ni, Cd, Cr and Zn after evaporating different amounts of sample solution on the graphite flakes was measured and the variation tendency of their LOD with sample solution amounts was similar to the tendency for Cu. The experimental data and conclusions could provide a reference for automatic sample preparation and heavy metal in situ detection. Supported by National Natural Science Foundation of China (No. 60908018), National High Technology Research and Development Program of China (No. 2013AA065502) and Anhui Province Outstanding Youth Science Fund of China (No. 1108085J19)
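One common way such a detection limit is obtained from the calibration line is the 3σ/slope criterion, sketched below; the calibration data and the blank standard deviation are placeholders, and the paper's exact LOD procedure may differ.

```python
# Sketch: limit of detection from a linear calibration of LIBS peak intensity
# versus concentration, using the common 3*sigma/slope criterion. The data and
# the blank standard deviation are placeholders, not the paper's measurements.
import numpy as np

def lod_from_calibration(concentrations, intensities, blank_std):
    """Fit I = a*C + b and return LOD = 3*blank_std / slope."""
    slope = np.polyfit(concentrations, intensities, 1)[0]
    return 3.0 * blank_std / slope

if __name__ == "__main__":
    conc = np.array([0.0, 5.0, 10.0, 20.0, 50.0])        # mg/L (hypothetical)
    peak = np.array([120., 560., 1010., 1980., 4900.])   # counts (hypothetical)
    print(f"LOD = {lod_from_calibration(conc, peak, blank_std=40.0):.2f} mg/L")
```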
Fully automatic characterization and data collection from crystals of biological macromolecules.
Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W
2015-08-01
Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.
Rotor assembly and method for automatically processing liquids
Burtis, C.A.; Johnson, W.F.; Walker, W.A.
1992-12-22
A rotor assembly is described for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water. It includes a rotor body for rotation about an axis and includes a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses. 34 figs.
SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments
NASA Technical Reports Server (NTRS)
Leonard, R. F.
1977-01-01
A program for a PDP-15 computer is presented which provides for control and analysis of trace element determinations by using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires the locating of peaks in X-ray spectra, determination of intensities of peaks, identification of origins of peaks, and determination of the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.
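SAVLOC itself ran on a PDP-15 and its source is not reproduced here; as a rough modern stand-in for the peak-location step only, the sketch below applies a general-purpose peak finder to a synthetic fluorescence spectrum, with illustrative thresholds.

```python
# Sketch: locating peaks in an X-ray fluorescence spectrum with a modern,
# general-purpose peak finder. This only stands in for SAVLOC's peak-location
# step; the synthetic spectrum and thresholds are illustrative.
import numpy as np
from scipy.signal import find_peaks

def locate_peaks(spectrum, min_height, min_prominence):
    """Return channel indices of candidate fluorescence peaks."""
    peaks, _ = find_peaks(spectrum, height=min_height, prominence=min_prominence)
    return peaks

if __name__ == "__main__":
    channels = np.arange(1024)
    background = 50.0 * np.exp(-channels / 600.0)                  # smooth continuum
    lines = sum(a * np.exp(-0.5 * ((channels - c) / 3.0) ** 2)     # Gaussian peaks
                for a, c in [(400, 200), (250, 415), (120, 730)])
    rng = np.random.default_rng(2)
    spectrum = rng.poisson(background + lines).astype(float)       # counting noise
    print("peaks found at channels:", locate_peaks(spectrum, 60, 50))
```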
Development and testing of a portable wind sensitive directional air sampler
NASA Technical Reports Server (NTRS)
Deyo, J.; Toma, J.; King, R. B.
1975-01-01
A portable wind sensitive directional air sampler was developed as part of an air pollution source identification system. The system is designed to identify sources of air pollution based on the directional collection of field air samples and their analysis for TSP and trace element characteristics. Sources can be identified by analyzing the data on the basis of pattern recognition concepts. The unit, designated Air Scout, receives wind direction signals from an associated wind vane. Air samples are collected on filter slides using a standard high volume air sampler drawing air through a porting arrangement which tracks the wind direction and permits collection of discrete samples. A preset timer controls the length of time each filter is in the sampling position. At the conclusion of the sampling period a new filter is automatically moved into sampling position displacing the previous filter to a storage compartment. Thus the Air Scout may be set up at a field location, loaded with up to 12 filter slides, and left to acquire air samples automatically, according to the wind, at any timer interval desired from 1 to 30 hours.
46 CFR 161.002-2 - Types of fire-protective systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., but not be limited to, automatic fire and smoke detecting systems, manual fire alarm systems, sample extraction smoke detection systems, watchman's supervisory systems, and combinations of these systems. (b) Automatic fire detecting systems. For the purpose of this subpart, automatic fire and smoke detecting...
46 CFR 161.002-2 - Types of fire-protective systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., but not be limited to, automatic fire and smoke detecting systems, manual fire alarm systems, sample extraction smoke detection systems, watchman's supervisory systems, and combinations of these systems. (b) Automatic fire detecting systems. For the purpose of this subpart, automatic fire and smoke detecting...
NASA Technical Reports Server (NTRS)
Berdahl, B. J.; Carle, G. C.; Oyama, V. I.
1971-01-01
Analyzer operates unattended for up to 15 hours. It has an automatic sample injection system and can be programmed. All fluid-flow valve switching is accomplished pneumatically from miniature three-way solenoid pilot valves.
Urine sampling and collection system
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Reinhardt, C. G.
1971-01-01
This specification defines the performance and design requirements for the urine sampling and collection system engineering model and establishes requirements for its design, development, and test. The model shall provide conceptual verification of a system applicable to manned space flight which will automatically provide for collection, volume sensing, and sampling of urine.
[Automatic adjustment control system for DC glow discharge plasma source].
Wan, Zhen-zhen; Wang, Yong-qing; Li, Xiao-jia; Wang, Hai-zhou; Shi, Ning
2011-03-01
Three parameters are important in the DC glow discharge process: the discharge current, the discharge voltage, and the argon pressure in the discharge source. These parameters influence one another during the discharge. This paper presents an automatic control system for a DC glow discharge plasma source. The system measures and controls the discharge voltage automatically by adjusting the source pressure while the discharge current is held constant. The design concept, circuit principle, and control program of the system are described. Accuracy is improved by reducing complex operations and manual control errors. The system enhances the control accuracy of the glow discharge voltage and reduces the time needed to reach voltage stability. Voltage stability test results with the automatic control system are also provided: the accuracy with automatic control is better than 1% FS, improved from 4% FS under manual control, and the time to reach discharge voltage stability has been shortened from more than 90 s under manual control to within 30 s. Standard samples such as middle-low alloy steel and tin bronze have been tested with the system, and the precision of the concentration analysis has improved significantly; the RSDs of all test results are better than 3.5%. In the middle-low alloy steel standard sample, the RSD range of the concentration results for Ti, Co, and Mn is reduced from 3.0%-4.3% under manual control to 1.7%-2.4% under automatic control, and that for S and Mo from 5.2%-5.9% to 3.3%-3.5%. In the tin bronze standard sample, the RSD range for Sn, Zn, and Al is reduced from 2.6%-4.4% to 1.0%-2.4%, and that for Si, Ni, and Fe from 6.6%-13.9% to 2.6%-3.5%. The test data are also presented in this paper.
Presley, Todd K.; Jamison, Marcael T.J.
2010-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. The program is designed to assess the effects of highway runoff and urban runoff collected by the H-1 storm drain on the Manoa-Palolo Drainage Canal. This report summarizes rainfall, discharge, and water-quality data collected between July 1, 2009, and June 30, 2010. As part of this program, rainfall and continuous discharge data were collected at the H-1 storm drain. During the year, sampling strategy and sample processing methods were modified to improve the characterization of the effects of discharge from the storm drain on the Manoa-Palolo Drainage Canal. During July 1, 2009, to February 1, 2010, samples were collected from only the H-1 storm drain. Beginning February 2, 2010, samples were collected simultaneously from the H-1 storm drain and the Manoa-Palolo Drainage Canal at a location about 50 feet upstream of the discharge point of the H-1 storm drain. Three storms were sampled during July 1, 2009, to June 30, 2010. All samples were collected using automatic samplers. For the storm of August 12, 2009, grab samples (for oil and grease, and total petroleum hydrocarbons) and a composite sample were collected. The composite sample was analyzed for total suspended solids, nutrients, and selected dissolved and total (filtered and unfiltered) trace metals (cadmium, chromium, nickel, copper, lead, and zinc). Two storms were sampled in March 2010 at the H-1 storm drain and from the Manoa-Palolo Drainage Canal. Two samples were collected during the storm of March 4, 2010, and six samples were collected during the storm of March 8, 2010. These two storms were sampled using the modified strategy, in which discrete samples from the automatic sampler were processed and analyzed individually, rather than as a composite sample, using the simultaneously collected samples from the H-1 storm drain and from the Manoa-Palolo Drainage Canal. The discrete samples were analyzed for some or all of the following constituents: total suspended solids, nutrients, oil and grease, and selected dissolved (filtered) trace metals (cadmium, chromium, nickel, copper, lead, and zinc). Five quality-assurance/quality-control samples were analyzed during the year. These samples included one laboratory-duplicate, one field-duplicate, and one matrix-spike sample prepared and analyzed with the storm samples. In addition, two inorganic blank-water samples, one sample at the H-1 storm drain and one sample at the Manoa-Palolo Drainage Canal, were collected by running the blank water (water purified of all inorganic constituents) through the sampling and processing systems after cleaning automatic sampler lines to verify that the sampling lines were not contaminated.
NASA Astrophysics Data System (ADS)
Min, M.
2017-10-01
Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focusses the computation time on the strongest lines, while still maintaining the continuum contribution of the large number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high-accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10^5 lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
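The core of the technique can be illustrated in a few lines of Python: a Voigt-distributed frequency sample is the sum of a Gaussian and a Cauchy deviate, and depositing an equal share of the line strength per sample preserves the integrated opacity by construction. This is a minimal sketch of the idea, not the paper's recipe; the grid, line list, and sample-count rule are assumptions.

# Illustrative sketch of statistical line sampling: a Voigt-distributed frequency
# is a Gaussian plus a Cauchy deviate, and each sample deposits strength/N,
# so the integrated opacity of every line is preserved by construction.
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(-50.0, 50.0, 2001)   # spectral grid (arbitrary units)
opacity = np.zeros_like(grid)

# (center, strength, Gaussian width, Lorentzian width) -- made-up lines
lines = [(0.0, 1.0e4, 1.0, 0.5), (10.0, 1.0e1, 1.0, 0.5)]

for center, strength, sigma, gamma in lines:
    n = max(10, int(100 * np.log10(strength + 1.0)))   # assumed rule: more samples for stronger lines
    nu = center + rng.normal(0.0, sigma, n) + gamma * rng.standard_cauchy(n)
    idx = np.searchsorted(grid, nu)
    keep = (idx > 0) & (idx < grid.size)                # drop samples falling off the grid
    np.add.at(opacity, idx[keep], strength / n)         # equal share of the line strength per sample

print("deposited opacity:", opacity.sum())              # close to the summed in-range line strengths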
Erchinger, Friedemann; Engjom, Trond; Gudbrandsen, Oddrun Anita; Tjora, Erling; Gilja, Odd H; Dimcevski, Georg
2016-01-01
We have recently evaluated a short endoscopic secretin test for exocrine pancreatic function. Bicarbonate concentration in duodenal juice is an important parameter in this test. Measurement of bicarbonate by back titration, the gold standard method, is time consuming, expensive, and technically difficult, so a simplified method is warranted. We aimed to evaluate an automated spectrophotometric method in samples spanning the effective range of bicarbonate concentrations in duodenal juice. We also evaluated whether freezing the samples before analysis would affect the results. Patients routinely examined with the short endoscopic secretin test and suspected to have decreased pancreatic function for various reasons were included. Bicarbonate in duodenal juice was quantified by back titration and automatic spectrophotometry. Both fresh and thawed samples were analysed spectrophotometrically. In total, 177 samples from 71 patients were analysed. The correlation coefficient of all measurements was r = 0.98 (p < 0.001), and the correlation coefficient of fresh versus frozen samples analysed by automatic spectrophotometry (n = 25) was r = 0.96 (p < 0.001). CONCLUSIONS: The measurement of bicarbonate in fresh and thawed samples by automatic spectrophotometric analysis correlates excellently with the back titration gold standard. This is a major simplification of direct pancreas function testing and allows a wider distribution of bicarbonate testing in duodenal juice. Extreme values for bicarbonate concentration obtained with the autoanalyser method have to be interpreted with caution. Copyright © 2016 IAP and EPC. Published by Elsevier India Pvt Ltd. All rights reserved.
Tarte, Stephen R.; Schmidt, A.R.; Sullivan, Daniel J.
1992-01-01
A floating sample-collection platform is described for stream sites where the vertical or horizontal distance between the stream-sampling point and a safe location for the sampler exceeds the suction head of the sampler. The platform allows continuous water sampling over the entire storm-runoff hydrograph. The platform was developed for a site in southern Illinois.
Areeckal, Anu Shaju; Kamath, Jagannath; Zawadynski, Sophie; Kocher, Michel; S, Sumam David
2018-05-26
Osteoporosis is a bone disorder characterized by bone loss and decreased bone strength. The most widely used technique for detection of osteoporosis is the measurement of bone mineral density (BMD) using dual energy X-ray absorptiometry (DXA). But DXA scans are expensive and not widely available in low-income economies. In this paper, we propose a low-cost pre-screening tool for the detection of low bone mass, using cortical radiogrammetry of the third metacarpal bone and trabecular texture analysis of the distal radius from hand and wrist radiographs. A segmentation algorithm that automatically locates and segments the third metacarpal bone and the distal radius region of interest (ROI) is proposed. Cortical measurements such as combined cortical thickness (CCT), cortical area (CA), percent cortical area (PCA), and Barnett-Nordin index (BNI) were taken from the shaft of the third metacarpal bone. Texture analysis of the trabecular network at the distal radius was performed using features obtained from the histogram, the gray level co-occurrence matrix (GLCM), and the morphological gradient method (MGM). The significant cortical and texture features were selected using an independent sample t-test and used to train classifiers to distinguish healthy subjects from people with low bone mass. The proposed pre-screening tool was validated on two ethnic groups, an Indian sample population and a Swiss sample population. Data from 134 subjects in the Indian sample population and 65 subjects in the Swiss sample population were analysed. The proposed automatic segmentation approach shows a detection accuracy of 86% in detecting the third metacarpal bone shaft and 90% in accurately locating the distal radius ROI. Comparison of the automatic radiogrammetry to the ground truth provided by experts shows a mean absolute error of 0.04 mm for cortical width of the healthy group, 0.12 mm for cortical width of the low bone mass group, 0.22 mm for medullary width of the healthy group, and 0.26 mm for medullary width of the low bone mass group. The independent sample t-test was used to select the most discriminant features, which were used as input for training the classifiers. Pearson correlation analysis of the extracted features with DXA-BMD of the lumbar spine (DXA-LS) shows significantly high correlation values. Classifiers were trained with the most significant features in the Indian and Swiss sample data. The weighted KNN classifier shows the best test accuracy of 78% for the Indian sample data and 100% for the Swiss sample data. Hence, combined automatic radiogrammetry and texture analysis is shown to be an effective low-cost pre-screening tool for early diagnosis of osteoporosis. Copyright © 2018 Elsevier Ltd. All rights reserved.
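A minimal sketch of the texture-and-classifier stage is given below for orientation. It uses GLCM features and a distance-weighted KNN on random data; it is not the authors' pipeline, and the ROI size, feature set, and class labels are placeholders.

# Illustrative only (random data, not the authors' pipeline): GLCM texture
# features from a trabecular ROI feeding a distance-weighted KNN classifier.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)

def glcm_features(roi):
    # Contrast, homogeneity, energy and correlation of an 8-bit ROI.
    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.array([graycoprops(glcm, p).mean()
                     for p in ("contrast", "homogeneity", "energy", "correlation")])

# Random patches standing in for segmented distal-radius ROIs.
X = np.array([glcm_features(rng.integers(0, 256, (64, 64), dtype=np.uint8))
              for _ in range(60)])
y = rng.integers(0, 2, 60)                # 0 = healthy, 1 = low bone mass (random labels)

clf = KNeighborsClassifier(n_neighbors=5, weights="distance")   # "weighted KNN"
clf.fit(X[:40], y[:40])
print("held-out accuracy on random data:", clf.score(X[40:], y[40:]))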
Contamination of successive samples in portable pumping systems
Robert B. Thomas; Rand E. Eads
1983-01-01
Automatic discrete sample pumping systems used to monitor water quality should deliver to storage all materials pumped in a given cycle. If they do not, successive samples will be contaminated, a severe problem with highly variable suspended sediment concentrations in small streams. The cross-contamination characteristics of two small commonly used portable pumping...
Domínguez, Marina A; Grünhut, Marcos; Pistonesi, Marcelo F; Di Nezio, María S; Centurión, María E
2012-05-16
An automatic flow-batch system, including two borosilicate glass chambers for sample digestion and cold vapor atomic absorption spectroscopy determination of mercury in honey samples, was designed. The sample digestion was performed using a low-cost halogen lamp to reach the optimum temperature. Optimization of the digestion procedure was done using a Box-Behnken experimental design. A linear response was observed from 2.30 to 11.20 μg Hg L(-1). The relative standard deviation was 3.20% (n = 11, 6.81 μg Hg L(-1)), the sample throughput was 4 samples h(-1), and the detection limit was 0.68 μg Hg L(-1). The results obtained with the flow-batch method are in good agreement with those obtained with the reference method. The flow-batch system is simple, allows the use of both chambers simultaneously, is a promising methodology for achieving green chemistry goals, and is a good proposal for improving the quality control of honey.
Use of intumescent compounds in fire curtains
NASA Astrophysics Data System (ADS)
Nedryshkin, Oleg; Gravit, Marina; Mukhamedzhanova, Olga
2017-10-01
Automatic fire curtains are designed to divide sections of premises and structures into fire compartments for the purpose of localizing a fire, as well as to fill openings in fire barriers. If a fire occurs, a signal from a fire alarm sensor or from a fire station causes the curtain to fall automatically and confine the source of ignition. The paper presents the results of testing nine samples of fire curtains with an applied intumescent composition. Tests were conducted for 60 minutes before loss of sample integrity. The average temperature on the heated side of the sample reached 800-1000 °C. Depending on the sample, the temperature on the unheated side ranged from 70 °C to 294 °C. The best result was shown by a sample consisting of a layer of needle-punched heat-insulating material with a thermal conductivity of 0.036 W/(m·K) placed between layers of foil and treated with a water-based intumescent composition of silica material.
10 CFR 431.135 - Units to be tested.
Code of Federal Regulations, 2011 CFR
2011-01-01
... EQUIPMENT Automatic Commercial Ice Makers Test Procedures § 431.135 Units to be tested. For each basic model of automatic commercial ice maker selected for testing, a sample of sufficient size shall be selected...
Hanson, M.L.; Tabor, C.D. Jr.
1961-12-01
A mass spectrometer for analyzing the components of a gas is designed which is capable of continuous automatic operation such as analysis of samples of process gas from a continuous production system where the gas content may be changing. (AEC)
Automatic energy calibration algorithm for an RBS setup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silva, Tiago F.; Moro, Marcos V.; Added, Nemitala
2013-05-06
This work describes a computer algorithm for automatic extraction of the energy calibration parameters from a Rutherford Back-Scattering Spectroscopy (RBS) spectrum. Parameters such as the electronic gain, electronic offset, and detection resolution (FWHM) of an RBS setup are usually determined using a standard sample. In our case, the standard sample comprises a multi-elemental thin film made of a mixture of Ti-Al-Ta that is analyzed at the beginning of each run at a defined beam energy. A computer program has been developed to automatically extract the calibration parameters from the spectrum of the standard sample. The code evaluates the first derivative of the energy spectrum, locates the trailing edges of the Al, Ti, and Ta peaks, and fits a first-order polynomial for the energy-channel relation. The detection resolution is determined by fitting the convolution of a pre-calculated theoretical spectrum. To test the code, two years of data have been analyzed and the results compared with the manual calculations done previously, obtaining good agreement.
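The calibration idea lends itself to a compact sketch: locate the steepest falling edges from the first derivative and fit a first-order polynomial for the energy-channel relation. The Python code below is illustrative only; the synthetic spectrum and the edge energies are placeholders, not the kinematic values of the Ti-Al-Ta standard.

# Sketch of the calibration idea (not the authors' code): find the steepest
# falling edges from the first derivative, then fit energy = gain*channel + offset.
import numpy as np

def edge_channels(spectrum, n_edges=3, smooth=5):
    # Return one channel per falling edge, for the n_edges steepest edges.
    kernel = np.ones(smooth) / smooth
    deriv = np.gradient(np.convolve(spectrum, kernel, mode="same"))
    candidates = np.sort(np.argsort(deriv)[:n_edges * 10])   # most negative slopes
    edges = []
    for ch in candidates:                                    # merge candidates closer than 10 channels
        if not edges or ch - edges[-1] > 10:
            edges.append(ch)
    return np.array(edges[:n_edges])

# Synthetic spectrum with three step-like signals (placeholder standard sample).
channels = np.arange(1024)
spectrum = sum(h / (1.0 + np.exp((channels - c) / 3.0))      # falling edge at channel c
               for c, h in [(300, 500), (600, 300), (900, 800)])

known_edge_energies = np.array([0.9, 1.6, 2.3])              # MeV, placeholder values only
gain, offset = np.polyfit(edge_channels(spectrum), known_edge_energies, 1)
print(f"gain = {gain * 1e3:.3f} keV/channel, offset = {offset * 1e3:.1f} keV")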
Sampling theory and automated simulations for vertical sections, applied to human brain.
Cruz-Orive, L M; Gelšvartas, J; Roberts, N
2014-02-01
In recent years, there have been substantial developments in both magnetic resonance imaging techniques and automatic image analysis software. The purpose of this paper is to develop stereological image sampling theory (i.e. unbiased sampling rules) that can be used by image analysts for estimating geometric quantities such as surface area and volume, and to illustrate its implementation. The methods will ideally be applied automatically on segmented, properly sampled 2D images (although convenient manual application is always an option), and they are of wide applicability in many disciplines. In particular, the vertical sections design to estimate surface area is described in detail and applied to estimate the area of the pial surface and of the boundary between cortex and underlying white matter (i.e. subcortical surface area). For completeness, cortical volume and mean cortical thickness are also estimated. The aforementioned surfaces were triangulated in 3D with the aid of FreeSurfer software, which provided accurate surface area measures that served as gold standards. Furthermore, software was developed to produce digitized trace curves of the triangulated target surfaces automatically from virtual sections. From such traces, a new method (called the 'lambda method') is presented to estimate surface area automatically. In addition, with the new software, intersections could be counted automatically between the relevant surface traces and a cycloid test grid for the classical design. This capability, together with the aforementioned gold standard, enabled us to thoroughly check the performance and the variability of the different estimators by Monte Carlo simulations for studying the human brain. In particular, new methods are offered to split the total error variance into the orientations, sectioning, and cycloid components. The latter prediction was hitherto unavailable; one is proposed here and checked by way of simulations on a given set of digitized vertical sections with automatically superimposed cycloid grids of three different sizes. Concrete and detailed recommendations are given for implementing the methods. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
Validity Evidence of the Spanish Version of the Automatic Thoughts Questionnaire-8 in Colombia.
Ruiz, Francisco J; Suárez-Falcón, Juan C; Riaño-Hernández, Diana
2017-02-13
The Automatic Thoughts Questionnaire (ATQ) is a widely used, 30-item, 5-point Likert-type scale that measures the frequency of negative automatic thoughts as experienced by individuals suffering from depression. However, there is some controversy about the factor structure of the ATQ, and its application can be too time-consuming for survey research. Accordingly, an abbreviated, 8-item version of the ATQ has been proposed. The aim of this study was to analyze the validity evidence of the Spanish version of the ATQ-8 in Colombia. The ATQ-8 was administered to a total of 1587 participants, including a sample of undergraduates, one from the general population, and a clinical sample. The internal consistency across the different samples was good (α = .89). The one-factor model found in the original scale showed a good fit to the data (RMSEA = .083, 90% CI [.074, .092]; CFI = .96; NNFI = .95). The clinical sample's mean score on the ATQ-8 was significantly higher than the scores of the nonclinical samples. The ATQ-8 was sensitive to the effects of a 1-session acceptance and commitment therapy intervention focused on disrupting negative repetitive thinking. ATQ-8 scores were significantly related to dysfunctional schemas, emotional symptoms, mindfulness, experiential avoidance, satisfaction with life, and dysfunctional attitudes. In conclusion, the Spanish version of the ATQ-8 showed good psychometric properties in Colombia.
Automatic Bayes Factors for Testing Equality- and Inequality-Constrained Hypotheses on Variances.
Böing-Messing, Florian; Mulder, Joris
2018-05-03
In comparing characteristics of independent populations, researchers frequently expect a certain structure of the population variances. These expectations can be formulated as hypotheses with equality and/or inequality constraints on the variances. In this article, we consider the Bayes factor for testing such (in)equality-constrained hypotheses on variances. Application of Bayes factors requires specification of a prior under every hypothesis to be tested. However, specifying subjective priors for variances based on prior information is a difficult task. We therefore consider so-called automatic or default Bayes factors. These methods avoid the need for the user to specify priors by using information from the sample data. We present three automatic Bayes factors for testing variances. The first is a Bayes factor with equal priors on all variances, where the priors are specified automatically using a small share of the information in the sample data. The second is the fractional Bayes factor, where a fraction of the likelihood is used for automatic prior specification. The third is an adjustment of the fractional Bayes factor such that the parsimony of inequality-constrained hypotheses is properly taken into account. The Bayes factors are evaluated by investigating different properties such as information consistency and large sample consistency. Based on this evaluation, it is concluded that the adjusted fractional Bayes factor is generally recommendable for testing equality- and inequality-constrained hypotheses on variances.
Automatic control and detector for three-terminal resistance measurement
Fasching, George E.
1976-10-26
A device is provided for automatic control and detection in a three-terminal resistance measuring instrument. The invention is useful for the rapid measurement of the resistivity of various bulk materials with a three-terminal electrode system. The device maintains the current through the sample at a fixed level while measuring the voltage across the sample to detect the sample resistance. The three-electrode system contacts the bulk material, and the current through the sample is held constant by means of a control circuit connected to a first of the three electrodes, which works in conjunction with a feedback-controlled amplifier to null the voltage between the first electrode and a second electrode connected to the controlled amplifier output. An A.C. oscillator provides a source of sinusoidal reference voltage at the frequency at which the measurement is to be executed. Synchronous reference pulses for the synchronous detectors in the control circuit and an output detector circuit are provided by a synchronous pulse generator. The output of the controlled amplifier circuit is sampled by the output detector circuit to develop at an output terminal thereof a D.C. voltage which is proportional to the sample resistance R. The sample resistance is that of the segment of the sample between the area of the first electrode and the third electrode, which is connected to ground potential.
Automatic HTS force measurement instrument
Sanders, Scott T.; Niemann, Ralph C.
1999-01-01
A device for measuring the levitation force of a high temperature superconductor sample with respect to a reference magnet includes a receptacle for holding several high temperature superconductor samples each cooled to superconducting temperature. A rotatable carousel successively locates a selected one of the high temperature superconductor samples in registry with the reference magnet. Mechanism varies the distance between one of the high temperature superconductor samples and the reference magnet, and a sensor measures levitation force of the sample as a function of the distance between the reference magnet and the sample. A method is also disclosed.
Enhanced monitor system for water protection
Hill, David E [Knoxville, TN; Rodriquez, Jr., Miguel [Oak Ridge, TN; Greenbaum, Elias [Knoxville, TN
2009-09-22
An automatic, self-contained device for detecting toxic agents in a water supply includes an analyzer for detecting at least one toxic agent in a water sample, a means for introducing a water sample into the analyzer and discharging the water sample from the analyzer, holding means for holding a water sample for a pre-selected period of time before the water sample is introduced into the analyzer, and an electronics package that analyzes raw data from the analyzer and emits a signal indicating the presence of at least one toxic agent in the water sample.
Machine for Automatic Bacteriological Pour Plate Preparation
Sharpe, A. N.; Biggs, D. R.; Oliver, R. J.
1972-01-01
A fully automatic system for preparing poured plates for bacteriological analyses has been constructed and tested. The machine can make decimal dilutions of bacterial suspensions, dispense measured amounts into petri dishes, add molten agar, mix the dish contents, and label the dishes with sample and dilution numbers at the rate of 2,000 dishes per 8-hr day. In addition, the machine can be programmed to select different media so that plates for different types of bacteriological analysis may be made automatically from the same sample. The machine uses only the components of the media and sterile polystyrene petri dishes; requirements for all other materials, such as sterile pipettes and capped bottles of diluents and agar, are eliminated. PMID:4560475
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.
An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
Evaluation of Automatic Vehicle Location accuracy
DOT National Transportation Integrated Search
1999-01-01
This study assesses the accuracy of the Automatic Vehicle Location (AVL) data provided for the buses of the Ann Arbor Transportation Authority with Global Positioning System (GPS) technology. In a sample of eighty-nine bus trips two kinds of accuracy...
Peeters, R; Galesloot, P J B
2002-03-01
The objective of this study was to estimate the daily fat yield and fat percentage from one sampled milking per cow per test day in an automatic milking system herd, when the milking times and milk yields of all individual milkings are recorded by the automatic milking system. Multiple regression models were used to estimate the 24-h fat percentage when only one milking is sampled for components and milk yields and milking times are known for all milkings in the 24-h period before the sampled milking. In total, 10,697 cow test day records, from 595 herd tests at 91 Dutch herds milked with an automatic milking system, were used. The best model to predict 24-h fat percentage included fat percentage, protein percentage, milk yield and milking interval of the sampled milking, milk yield, and milking interval of the preceding milking, and the interaction between milking interval and the ratio of fat and protein percentage of the sampled milking. This model gave a standard deviation of the prediction error (SE) for 24-h fat percentage of 0.321 and a correlation between the predicted and actual 24-h fat percentage of 0.910. For the 24-h fat yield, we found SE = 90 g and correlation = 0.967. This precision is slightly better than that of present a.m.-p.m. testing schemes. Extra attention must be paid to correctly matching the sample jars and the milkings. Furthermore, milkings with an interval of less than 4 h must be excluded from sampling as well as milkings that are interrupted or that follow an interrupted milking. Under these restrictions (correct matching, interval of at least 4 h, and no interrupted milking), one sampled milking suffices to get a satisfactory estimate for the test-day fat yield.
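To make the modelling step concrete, the following Python sketch fits a multiple regression of the kind described, on synthetic data. Variable names, the interaction term, and all coefficients are assumptions for illustration; they are not the published model.

# Synthetic illustration of the kind of multiple regression model described;
# column names and coefficients are assumptions, not the published model.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "fat_pct": rng.normal(4.4, 0.6, n),          # sampled milking
    "protein_pct": rng.normal(3.5, 0.3, n),
    "milk_kg": rng.normal(11.0, 3.0, n),
    "interval_h": rng.uniform(4.0, 14.0, n),
    "milk_kg_prev": rng.normal(11.0, 3.0, n),    # preceding milking
    "interval_h_prev": rng.uniform(4.0, 14.0, n),
})
df["interval_x_fp_ratio"] = df["interval_h"] * df["fat_pct"] / df["protein_pct"]
# Synthetic target, loosely tied to the predictors just to make the sketch run.
df["fat24_pct"] = (0.7 * df["fat_pct"] + 0.02 * df["interval_h"]
                   + 0.01 * df["interval_x_fp_ratio"] + rng.normal(0.0, 0.3, n))

X = df.drop(columns="fat24_pct")
model = LinearRegression().fit(X, df["fat24_pct"])
residuals = df["fat24_pct"] - model.predict(X)
print("SD of the prediction error:", round(float(residuals.std()), 3))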
Components of Implicit Stigma against Mental Illness among Chinese Students
Wang, Xiaogang; Huang, Xiting; Jackson, Todd; Chen, Ruijun
2012-01-01
Although some research has examined negative automatic aspects of attitudes toward mental illness via relatively indirect measures among Western samples, it is unclear whether negative attitudes can be automatically activated in individuals from non-Western countries. This study attempted to validate results from Western samples with Chinese college students. We first examined the three-component model of implicit stigma (negative cognition, negative affect, and discriminatory tendencies) toward mental illness with the Single Category Implicit Association Test (SC-IAT). We also explored the relationship between explicit and implicit stigma among 56 Chinese university college students. In the three separate SC-IATs and the combined SC-IAT, automatic associations between mental illness and negative descriptors were stronger relative to those with positive descriptors, and the implicit effects of the cognitive and affective SC-IATs were significant. Explicit and implicit measures of stigma toward mental illness were unrelated. In our sample, women's overall attitudes toward mental illness were more negative than men's were, but no gender differences were found for explicit measures. These findings suggest that implicit stigma toward mental illness exists in Chinese students, and provide some support for the three-component model of implicit stigma toward mental illness. Future studies that focus on automatic components of stigmatization and stigma-reduction in China are warranted. PMID:23029366
DALMATIAN: An Algorithm for Automatic Cell Detection and Counting in 3D.
Shuvaev, Sergey A; Lazutkin, Alexander A; Kedrov, Alexander V; Anokhin, Konstantin V; Enikolopov, Grigori N; Koulakov, Alexei A
2017-01-01
Current 3D imaging methods, including optical projection tomography, light-sheet microscopy, block-face imaging, and serial two photon tomography enable visualization of large samples of biological tissue. Large volumes of data obtained at high resolution require development of automatic image processing techniques, such as algorithms for automatic cell detection or, more generally, point-like object detection. Current approaches to automated cell detection suffer from difficulties originating from detection of particular cell types, cell populations of different brightness, non-uniformly stained, and overlapping cells. In this study, we present a set of algorithms for robust automatic cell detection in 3D. Our algorithms are suitable for, but not limited to, whole brain regions and individual brain sections. We used watershed procedure to split regional maxima representing overlapping cells. We developed a bootstrap Gaussian fit procedure to evaluate the statistical significance of detected cells. We compared cell detection quality of our algorithm and other software using 42 samples, representing 6 staining and imaging techniques. The results provided by our algorithm matched manual expert quantification with signal-to-noise dependent confidence, including samples with cells of different brightness, non-uniformly stained, and overlapping cells for whole brain regions and individual tissue sections. Our algorithm provided the best cell detection quality among tested free and commercial software.
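The maxima-splitting step can be sketched with standard tools: find local maxima in a smoothed image and separate touching objects with a watershed. The 2D Python example below is not the DALMATIAN code; the synthetic image, smoothing, and threshold are illustrative.

# Sketch of the maxima-splitting step only (2D, not the DALMATIAN pipeline):
# local maxima in a smoothed image seed a watershed that separates touching cells.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

rng = np.random.default_rng(4)
yy, xx = np.mgrid[0:128, 0:128]
img = np.zeros((128, 128))
for cy, cx in [(40, 40), (48, 52), (90, 90)]:     # two overlapping cells plus one isolated
    img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * 6.0 ** 2))
img += 0.02 * rng.standard_normal(img.shape)      # imaging noise

smoothed = ndi.gaussian_filter(img, sigma=2)
mask = smoothed > 0.2                             # assumed foreground threshold
coords = peak_local_max(smoothed, min_distance=5, threshold_abs=0.2)
markers = np.zeros(img.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

labels = watershed(-smoothed, markers, mask=mask) # split overlapping regional maxima
print("detected cells:", labels.max())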
ERIC Educational Resources Information Center
Sideridis, Georgios D.; Simos, Panagiotis; Mouzaki, Angeliki; Stamovlasis, Dimitrios
2016-01-01
The study explored the moderating role of rapid automatized naming (RAN) in reading achievement through a cusp-catastrophe model grounded on nonlinear dynamic systems theory. Data were obtained from a community sample of 496 second through fourth graders who were followed longitudinally over 2 years and split into 2 random subsamples (validation…
Bacteria-free water for automatic washer-disinfectors: an impossible dream?
Cooke, R P; Whymant-Morris, A; Umasankar, R S; Goddard, S V
1998-05-01
The ability of a new automatic washer-disinfector system (AWDS), fitted with a water filtration system to provide bacteria-free water and so avoid the risk of mycobacterial contamination of fibreoptic bronchoscopes, was examined. Four new Astec 'MP' Safescope washer-disinfectors, with coarse and fine (0.2 micron) filters attached close to the outlet taps, were supplied with non-softened mains water. Water samples from the tank supply and outlet taps were regularly assessed for bacterial quality over a six-month period. Outlet samples were also analysed after fine filter change and purgation with peracetic acid. All bronchoalveolar lavage specimens (BALS) were stained and cultured for mycobacteria. Only 13 out of 53 outlet samples (24%) were culture-negative. There was no improvement after filter change. Residual anti-bacterial effect of peracetic acid lasted up to 48 h following AWDS purgation. No tank samples were bacteria-free. Sixty BALS were processed, two samples were culture-positive and grew M. tuberculosis and one was also smear-positive. Though mycobacterial contamination of bronchoscopes was not evident, the water filtration system was unable to reliably provide sterile rinse water.
Kudr, Jiri; Nguyen, Hoai Viet; Gumulec, Jaromir; Nejdl, Lukas; Blazkova, Iva; Ruttkay-Nedecky, Branislav; Hynek, David; Kynicky, Jindrich; Adam, Vojtech; Kizek, Rene
2015-01-01
In this study, a device for automatic electrochemical analysis was designed. A three-electrode detection system was attached to a positioning device, which enabled us to move the electrode system from one well to another of a microtitre plate. Disposable carbon tip electrodes were used for Cd(II), Cu(II), and Pb(II) ion quantification, while Zn(II) did not give a signal in this electrode configuration. In order to detect all of the mentioned heavy metals simultaneously, thin-film mercury electrodes (TFME) were fabricated by electrodeposition of mercury on the surface of the carbon tips. In comparison with the bare electrodes, the TFMEs had lower detection limits and better sensitivity. In addition to pure aqueous heavy metal solutions, the assay was also performed on mineralized rock samples, artificial blood plasma samples, and samples of chicken embryo organs treated with cadmium. An artificial neural network was created to evaluate the concentrations of the mentioned heavy metals correctly in mixture samples, and an excellent fit was observed (R2 = 0.9933). PMID:25558996
Automatic HTS force measurement instrument
Sanders, S.T.; Niemann, R.C.
1999-03-30
A device is disclosed for measuring the levitation force of a high temperature superconductor sample with respect to a reference magnet. It includes a receptacle for holding several high temperature superconductor samples, each cooled to superconducting temperature. A rotatable carousel successively locates a selected one of the high temperature superconductor samples in registry with the reference magnet. A mechanism varies the distance between one of the high temperature superconductor samples and the reference magnet, and a sensor measures the levitation force of the sample as a function of the distance between the reference magnet and the sample. A method is also disclosed. 3 figs.
The relationship between parents' and children's automatic thoughts in a college student sample.
Donnelly, Reesa; Renk, Kimberly; Sims, Valerie K; McGuire, Jack
2011-04-01
Research demonstrates the importance of early social interactions in the development of schemas and automatic thoughts. It does not appear, however, that the existing research examines intergenerational correlations in automatic thoughts. As a result, this study explores the relationship between the automatic thoughts of parents and those of their college-age children in a sample of 252 college students and their mothers and fathers. Results of this study suggest that there are significant relationships between parents' and college students' positive automatic thoughts. Different trends by gender are also noted in the relationships among variables for male and female college students with their mothers and fathers. Further, mothers' positive ATs predicted the positive ATs of their college students, with mothers' ratings of their own communication with their college students partially mediating this relationship. Finally, college students' anxiety and self-esteem are predicted significantly by their mothers' anxiety and self-esteem (respectively) as well as their own positive and negative ATs. These findings suggest the possibility that ATs play a role in the intergenerational transmission of certain domains of psychological functioning.
Zeng, Xueqiang; Luo, Gang
2017-12-01
Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state of the art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
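A stripped-down illustration of progressive sampling is shown below: candidate configurations are evaluated on growing random subsamples and weak ones are discarded early. Plain cross-validation over a small grid stands in for the paper's Bayesian optimization, and the data set and hyper-parameter grid are assumptions.

# Simplified sketch of progressive sampling for model selection; plain
# cross-validation over a small grid stands in for Bayesian optimization.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
candidates = [{"C": c} for c in (0.001, 0.01, 0.1, 1.0, 10.0, 100.0)]

rng = np.random.default_rng(0)
for size in (500, 2000, 8000):                    # progressively larger samples
    idx = rng.choice(len(X), size, replace=False)
    scores = [cross_val_score(LogisticRegression(max_iter=1000, **p),
                              X[idx], y[idx], cv=3).mean() for p in candidates]
    order = np.argsort(scores)[::-1]
    candidates = [candidates[i] for i in order[:max(1, len(order) // 2)]]   # keep the top half
    print(f"n = {size}: best candidate so far {candidates[0]}")

print("selected hyper-parameters:", candidates[0])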
NASA Astrophysics Data System (ADS)
Singla, Neeru; Srivastava, Vishal; Singh Mehta, Dalip
2018-02-01
We report the first fully automated detection of human skin burn injuries in vivo, with the goal of automatic surgical margin assessment based on optical coherence tomography (OCT) images. Our proposed automated procedure entails building a machine-learning-based classifier by extracting quantitative features from normal and burn tissue images recorded by OCT. In this study, 56 samples (28 normal, 28 burned) were imaged by OCT and eight features were extracted. A linear model classifier was trained using 34 samples and 22 samples were used to test the model. Sensitivity of 91.6% and specificity of 90% were obtained. Our results demonstrate the capability of a computer-aided technique for accurately and automatically identifying burn tissue resection margins during surgical treatment.
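For orientation, the classification stage alone can be sketched as follows: a linear model trained on 34 feature vectors and tested on 22, mirroring the reported split. The eight features are replaced by random numbers here, so the printed figures are not the paper's results.

# Classification stage only, with random stand-in features (8 per sample) and the
# reported 34/22 train/test split; the printed figures are not the paper's results.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0.0, 1.0, (28, 8)),     # 28 "normal" samples
               rng.normal(1.0, 1.0, (28, 8))])    # 28 "burned" samples
y = np.array([0] * 28 + [1] * 28)

idx = rng.permutation(56)
train, test = idx[:34], idx[34:]

clf = LogisticRegression().fit(X[train], y[train])
pred = clf.predict(X[test])
sensitivity = recall_score(y[test], pred, pos_label=1)
specificity = recall_score(y[test], pred, pos_label=0)
print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")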
Hinsmann, P; Arce, L; Ríos, A; Valcárcel, M
2000-01-07
The separation of seven pesticides by micellar electrokinetic capillary chromatography in spiked water samples is described, allowing the analysis of pesticide mixtures down to a concentration of 50 microg l(-1) in less than 13 min. Calibration, pre-concentration, elution, and injection into the sample vial were carried out automatically by a continuous flow system (CFS) coupled to a capillary electrophoresis system via a programmable arm. The whole system was electronically coupled by a micro-processor and completely controlled by a computer. A C18 solid-phase mini-column was used for the pre-concentration, allowing a 12-fold enrichment (as an average value) of the pesticides from fortified water samples. Under the optimal extraction conditions, recoveries between 90 and 114% were obtained for most of the pesticides.
Investigating the Relationship between Stable Personality Characteristics and Automatic Imitation
Butler, Emily E.; Ward, Robert; Ramsey, Richard
2015-01-01
Automatic imitation is a cornerstone of nonverbal communication that fosters rapport between interaction partners. Recent research has suggested that stable dimensions of personality are antecedents to automatic imitation, but the empirical evidence linking imitation with personality traits is restricted to a few studies with modest sample sizes. Additionally, atypical imitation has been documented in autism spectrum disorders and schizophrenia, but the mechanisms underpinning these behavioural profiles remain unclear. Using a larger sample than prior studies (N=243), the current study tested whether performance on a computer-based automatic imitation task could be predicted by personality traits associated with social behaviour (extraversion and agreeableness) and with disorders of social cognition (autistic-like and schizotypal traits). Further personality traits (narcissism and empathy) were assessed in a subsample of participants (N=57). Multiple regression analyses showed that personality measures did not predict automatic imitation. In addition, using a similar analytical approach to prior studies, no differences in imitation performance emerged when only the highest and lowest 20 participants on each trait variable were compared. These data weaken support for the view that stable personality traits are antecedents to automatic imitation and that neural mechanisms thought to support automatic imitation, such as the mirror neuron system, are dysfunctional in autism spectrum disorders or schizophrenia. In sum, the impact that personality variables have on automatic imitation is less universal than initial reports suggest. PMID:26079137
Matuszewska, Renata; Szczotko, Maciej; Krogulska, Bozena
2012-01-01
The presence of parasitic protozoa in drinking water is mostly a result of improperly maintained water treatment processes. Currently, testing for Cryptosporidium and Giardia in water is not performed in Poland as part of routine water monitoring. The aim of this study was to optimize the method for Cryptosporidium and Giardia detection in water according to the main principles of standard ISO 15553:2006, using the Filta-Max xpress automatic elution station. Preliminary tests were performed on samples contaminated with oocysts and cysts of reference strains of both parasitic protozoa. Further studies were carried out on environmental samples of surface water taken directly from water intakes (21 samples from the Vistula River and 8 samples from Zegrzynski Lake). Filtration and sample volume reduction were performed using the Filta-Max xpress automatic elution system. Next, samples were purified by immunomagnetic separation (IMS). Isolated cysts and oocysts were stained with FITC and DAPI, and then microscopic observation using an epifluorescence microscope was carried out. Recovery of parasitic protozoa in all contaminated water samples after the 9-cycle elution process averaged 60.6% for Cryptosporidium oocysts and 36.1% for Giardia cysts. Studies on the environmental surface water samples showed the presence of both parasitic protozoa. The number of detected Giardia cysts ranged from 1.0/10 L up to 4.5/10 L in samples from Zegrzynski Lake and from 1.0/10 L up to 38.9/10 L in samples from the Vistula River. Cryptosporidium oocysts were present in 50% of samples from Zegrzynski Lake and in 47.6% of samples from the Vistula River, and their number in both cases was similar, ranging from 0.5 up to 2.5 oocysts/10 L. The results show that the applied procedure is appropriate for detecting parasitic protozoa in water, but when the water contains large amounts of inorganic matter and suspended solids the test method has to be modified, for example by preparing subsamples and reducing the filtration speed. The method, with these modifications and the Filta-Max xpress system, can be useful for routine water monitoring. The detection of Cryptosporidium and Giardia in all water samples taken from the surface water intakes shows that protozoan cysts may be transferred into water intended for consumption; therefore, testing for Cryptosporidium and Giardia should be included in water monitoring.
Oliveira, Hugo M; Segundo, Marcela A; Lima, José L F C; Miró, Manuel; Cerdà, Victor
2010-05-01
In the present work, an on-line automatic renewable molecularly imprinted solid-phase extraction (MISPE) protocol for sample preparation prior to liquid chromatographic analysis is proposed for the first time. The automatic microscale procedure was based on the bead injection (BI) concept under the lab-on-valve (LOV) format, using a multisyringe burette as the propulsion unit for handling solutions and suspensions. High precision in handling the suspensions containing irregularly shaped molecularly imprinted polymer (MIP) particles was attained, enabling the use of commercial MIP as a renewable sorbent. The features of the proposed BI-LOV manifold also allowed strict control of the different steps of the extraction protocol, which are essential for promoting selective interactions in the cavities of the MIP. Using this on-line method, it was possible to extract and quantify riboflavin from different foodstuff samples in the range between 0.450 and 5.00 mg L(-1) after processing 1,000 microL of sample (infant milk, pig liver extract, and energy drink) without any prior treatment. For milk samples, the LOD and LOQ values were 0.05 and 0.17 mg L(-1), respectively. The method was successfully applied to the analysis of two certified reference materials (NIST 1846 and BCR 487) with high precision (RSD < 5.5%). Considering the downscaling and simplification of the sample preparation protocol and the simultaneous performance of the extraction and chromatographic assays, a cost-effective, enhanced-throughput (six determinations per hour) methodology for the determination of riboflavin in foodstuff samples is deployed here.
A device for automatic photoelectric control of the analytical gap for emission spectrographs
Dietrich, John A.; Cooley, Elmo F.; Curry, Kenneth J.
1977-01-01
A photoelectric device has been built that automatically controls the analytical gap between electrodes during the excitation period. The control device allows for precise control of the analytical gap during the arcing of samples, resulting in better precision of analysis.
Exposure to violent video games increases automatic aggressiveness.
Uhlmann, Eric; Swanson, Jane
2004-02-01
The effects of exposure to violent video games on automatic associations with the self were investigated in a sample of 121 students. Playing the violent video game Doom led participants to associate themselves with aggressive traits and actions on the Implicit Association Test. In addition, self-reported prior exposure to violent video games predicted automatic aggressive self-concept, above and beyond self-reported aggression. Results suggest that playing violent video games can lead to the automatic learning of aggressive self-views.
Presley, Todd K.; Jamison, Marcael T.J.; Young-Smith, Stacie T. M.
2006-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous discharge data at one station, continuous streamflow data at two stations, and water-quality data at five stations, which include the continuous discharge and streamflow stations. This report summarizes rainfall, discharge, streamflow, and water-quality data collected between July 1, 2005 and June 30, 2006. A total of 23 samples was collected over five storms during July 1, 2005 to June 30, 2006. The goal was to collect grab samples nearly simultaneously at all five stations, and flow-weighted time-composite samples at the three stations equipped with automatic samplers; however, all five storms were partially sampled owing to lack of flow at the time of sampling at some sites, or because some samples collected by the automatic sampler did not represent water from the storm. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
DOT National Transportation Integrated Search
1989-06-01
Author's abstract: A nonrandom sample of 120 disproportionately short, tall, and overweight drivers compared the comfort and convenience of the automatic safety belt systems used in seventeen automobiles. Nine vehicles had motorized shoulder belts wi...
Unsupervised Learning —A Novel Clustering Method for Rolling Bearing Faults Identification
NASA Astrophysics Data System (ADS)
Kai, Li; Bo, Luo; Tao, Ma; Xuefeng, Yang; Guangming, Wang
2017-12-01
To promptly process massive amounts of fault data and automatically provide accurate diagnosis results, numerous studies have been conducted on intelligent fault diagnosis of rolling bearings. Among these studies, supervised learning methods such as artificial neural networks, support vector machines, and decision trees are commonly used. These methods can detect rolling bearing failures effectively, but achieving good detection results often requires a large number of training samples. Based on the above, a novel clustering method is proposed in this paper. This novel method is able to find the correct number of clusters automatically. The effectiveness of the proposed method is validated using datasets from rolling element bearings. The diagnosis results show that the proposed method can accurately detect the fault types of small samples, while the diagnosis accuracy also remains relatively high for massive samples.
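The abstract does not specify the clustering algorithm itself, so the sketch below uses a generic stand-in for automatic selection of the cluster number: k-means with the silhouette score. It only illustrates the idea of finding the number of clusters automatically, on synthetic data.

# Generic stand-in for automatic selection of the number of clusters (the
# abstract does not give the authors' algorithm): k-means plus silhouette score.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.8, random_state=0)

best_k, best_score = None, -1.0
for k in range(2, 9):                             # candidate numbers of fault types
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_score = k, score

print(f"automatically selected number of clusters: {best_k} (silhouette {best_score:.2f})")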
The microbiological quality of pasteurized milk sold by automatic vending machines.
Angelidis, A S; Tsiota, S; Pexara, A; Govaris, A
2016-06-01
The microbiological quality of pasteurized milk samples (n = 39) collected during 13 weekly intervals from three automatic vending machines (AVM) in Greece was investigated. Microbiological counts (total aerobic (TAC), total psychrotrophic (TPC), Enterobacteriaceae (EC), and psychrotrophic aerobic bacterial spore counts (PABSC)) were obtained at the time of sampling and at the end of shelf-life (3 days) after storage of the samples at 4 or 8°C. TAC were found to be below the 10(7 ) CFU ml(-1) limit of pasteurized milk spoilage both during sampling as well as when milk samples were stored at either storage temperature for 3 days. Enterobacteriaceae populations were below 1 CFU ml(-1) in 69·2% of the samples tested at the time of sampling, whereas the remaining samples contained low numbers, typically less than 10 CFU ml(-1) . All samples tested negative for the presence of Listeria monocytogenes. Analogous microbiological data were also obtained by sampling and testing prepackaged, retail samples of pasteurized milk from two dairy companies in Greece (n = 26). From a microbiological standpoint, the data indicate that the AVM milk samples meet the quality standards of pasteurized milk. However, the prepackaged, retail milk samples yielded better results in terms of TAC, TPC and EC, compared to the AVM samples at the end of shelf-life. Recently, Greek dairy farmers organized in cooperatives launched the sale of pasteurized milk via AVM and this study reports on the microbiological quality of this product. The data show that AVM milk is sold at proper refrigeration temperatures and meets the quality standards of pasteurized milk throughout the manufacturer's specified shelf-life. However, based on the microbiological indicators tested, the keeping quality of the tested prepackaged, retail samples of pasteurized milk at the end of shelf-life upon storage under suboptimal refrigeration temperature (8°C) was better. © 2016 The Society for Applied Microbiology.
Automatic differential analysis of NMR experiments in complex samples.
Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André
2018-06-01
Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species. Such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but can also be used in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
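The bucketing step for a 1D spectrum can be illustrated in a few lines: integrate the spectrum into fixed-width chemical-shift bins so that samples can be compared as vectors. This is not the Plasmodesma implementation; the axis range, bucket width, and synthetic peaks are assumptions.

# Minimal 1D bucketing sketch (not the Plasmodesma implementation): integrate a
# synthetic spectrum into fixed-width chemical-shift buckets for sample comparison.
import numpy as np

rng = np.random.default_rng(6)
ppm = np.linspace(10.0, 0.0, 16384)               # 1H chemical-shift axis
spectrum = rng.normal(0.0, 0.01, ppm.size)        # baseline noise
for center, height in [(7.3, 1.0), (3.6, 0.5), (1.2, 2.0)]:   # synthetic peaks
    spectrum += height * 0.01 / ((ppm - center) ** 2 + 0.01)   # Lorentzian lines

bucket_width = 0.04                               # ppm, an assumed common choice
edges = np.arange(0.0, 10.0 + bucket_width, bucket_width)
idx = np.digitize(ppm, edges)
buckets = np.array([spectrum[idx == i].sum() for i in range(1, edges.size)])
buckets /= buckets.sum()                          # normalise so samples are comparable

print("buckets:", buckets.size, "- strongest bucket starts near",
      round(float(edges[np.argmax(buckets)]), 2), "ppm")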
Towards an automatic lab-on-valve-ion mobility spectrometric system for detection of cocaine abuse.
Cocovi-Solberg, David J; Esteve-Turrillas, Francesc A; Armenta, Sergio; de la Guardia, Miguel; Miró, Manuel
2017-08-25
A lab-on-valve miniaturized system integrating on-line disposable micro-solid phase extraction has been interfaced with ion mobility spectrometry for the accurate and sensitive determination of cocaine and ecgonine methyl ester in oral fluids. The method is based on the automatic loading of 500 μL of oral fluid, with retention of the target analytes and matrix clean-up by mixed-mode cationic/reversed-phase solid phase beads, followed by elution with 100 μL of 2-propanol containing ammonia (3% v/v), which is injected online into the IMS. The sorptive particles are automatically discarded after every individual assay, because the sorptive capacity of the sorbent material has been shown to deteriorate dramatically with reuse. The method provided limits of detection of 0.3 and 0.14 μg L⁻¹ for cocaine and ecgonine methyl ester, respectively, with relative standard deviation values from 8 to 14% and a total analysis time per sample of 7.5 min. Method trueness was evaluated by analyzing oral fluid samples spiked with cocaine at different concentration levels (1, 5 and 25 μg L⁻¹), affording relative recoveries within the range of 85±24%. Fifteen saliva samples were collected from volunteers and analysed following the proposed automatic procedure, showing a 40% cocaine occurrence with concentrations ranging from 1.3 to 97 μg L⁻¹. Field saliva samples were also analysed by reference methods based on lateral flow immunoassay and gas chromatography-mass spectrometry. The application of this procedure to the control of oral fluids of cocaine consumers represents a step forward towards the development of a point-of-care cocaine abuse sensing system. Copyright © 2017 Elsevier B.V. All rights reserved.
Sample introduction apparatus for a flow cytometer
Van den Engh, Ger
1998-01-01
A sample introduction system for a flow cytometer allows easy change of sample containers such as test tubes and facilitates use in high pressure environments. The sample container includes a cap having a pressure supply chamber and a sample container attachment cavity. A sample container may be automatically positioned into the attachment cavity so as to sealably engage the end of the sample container at its outer surface. This positioning may be accomplished through some sample introduction mechanism. To facilitate cleaning, HPLC tubing and fittings may be used in a manner which facilitates removal of the entire tubing from both the nozzle container and the sample container cap to permit its replacement to avoid contamination. The sample container support may include horizontal stops which loosely limit the movement of the sample container and thus avoid further stresses upon it.
Sample introduction system for a flow cytometer
Van den Engh, Ger
1997-01-01
A sample introduction system for a flow cytometer allows easy change of sample containers such as test tubes and facilitates use in high pressure environments. The sample container includes a cap having a pressure supply chamber and a sample container attachment cavity. A sample container may be automatically positioned into the attachment cavity so as to sealably engage the end of the sample container at its outer surface. This positioning may be accomplished through some sample introduction mechanism. To facilitate cleaning, HPLC tubing and fittings may be used in a manner which facilitates removal of the entire tubing from both the nozzle container and the sample container cap to permit its replacement to avoid contamination. The sample container support may include horizontal stops which loosely limit the movement of the sample container and thus avoid further stresses upon it.
Sample introduction apparatus for a flow cytometer
Van den Engh, G.
1998-03-10
A sample introduction system for a flow cytometer allows easy change of sample containers such as test tubes and facilitates use in high pressure environments. The sample container includes a cap having a pressure supply chamber and a sample container attachment cavity. A sample container may be automatically positioned into the attachment cavity so as to sealably engage the end of the sample container at its outer surface. This positioning may be accomplished through some sample introduction mechanism. To facilitate cleaning, HPLC tubing and fittings may be used in a manner which facilitates removal of the entire tubing from both the nozzle container and the sample container cap to permit its replacement to avoid contamination. The sample container support may include horizontal stops which loosely limit the movement of the sample container and thus avoid further stresses upon it. 3 figs.
Sample introduction system for a flow cytometer
Engh, G. van den
1997-02-11
A sample introduction system for a flow cytometer allows easy change of sample containers such as test tubes and facilitates use in high pressure environments. The sample container includes a cap having a pressure supply chamber and a sample container attachment cavity. A sample container may be automatically positioned into the attachment cavity so as to sealably engage the end of the sample container at its outer surface. This positioning may be accomplished through some sample introduction mechanism. To facilitate cleaning, HPLC tubing and fittings may be used in a manner which facilitates removal of the entire tubing from both the nozzle container and the sample container cap to permit its replacement to avoid contamination. The sample container support may include horizontal stops which loosely limit the movement of the sample container and thus avoid further stresses upon it. 3 figs.
Automatic multiple-sample applicator and electrophoresis apparatus
NASA Technical Reports Server (NTRS)
Grunbaum, B. W. (Inventor)
1977-01-01
An apparatus for performing electrophoresis and a multiple-sample applicator is described. Electrophoresis is a physical process in which electrically charged molecules and colloidal particles, upon the application of a dc current, migrate along a gel or a membrane that is wetted with an electrolyte. A multiple-sample applicator is provided which coacts with a novel tank cover to permit an operator either to depress a single button, thus causing multiple samples to be deposited on the gel or on the membrane simultaneously, or to depress one or more sample applicators separately by means of a separate button for each applicator.
NASA Astrophysics Data System (ADS)
Lelièvre, Peter G.; Grey, Melissa
2017-08-01
Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
TraceDetect's SafeGuard is designed to automatically measure total arsenic concentrations in drinking water samples (including raw water and treated water) over a range from 1 ppb to over 100 ppb. Once the operator has introduced the sample vial and selected "measure"...
Reading Ability Is Negatively Related to Stroop Interference
ERIC Educational Resources Information Center
Protopapas, Athanassios; Archonti, Anastasia; Skaloumbakas, Christos
2007-01-01
Stroop interference is often taken as evidence for reading automaticity even though young and poor readers, who presumably lack reading automaticity, present strong interference. Here the relationship between reading skills and Stroop interference was studied in a 7th-grade sample. Greater interference was observed in children diagnosed with…
An automatic gas chromatograph with a flame photometric detector that samples and analyzes hydrogen sulfide and carbonyl sulfide at 30-s intervals is described. Temperature programming was used to elute trace amounts of carbon disulfide present in each injection from a Supelpak-S...
NASA Astrophysics Data System (ADS)
Johnsen, Elin; Leknes, Siri; Wilson, Steven Ray; Lundanes, Elsa
2015-03-01
Neurons communicate via chemical signals called neurotransmitters (NTs). The numerous identified NTs can have very different physiochemical properties (solubility, charge, size etc.), so quantification of the various NT classes traditionally requires several analytical platforms/methodologies. We here report that a diverse range of NTs, e.g. peptides oxytocin and vasopressin, monoamines adrenaline and serotonin, and amino acid GABA, can be simultaneously identified/measured in small samples, using an analytical platform based on liquid chromatography and high-resolution mass spectrometry (LC-MS). The automated platform is cost-efficient as manual sample preparation steps and one-time-use equipment are kept to a minimum. Zwitter-ionic HILIC stationary phases were used for both on-line solid phase extraction (SPE) and liquid chromatography (capillary format, cLC). This approach enabled compounds from all NT classes to elute in small volumes producing sharp and symmetric signals, and allowing precise quantifications of small samples, demonstrated with whole blood (100 microliters per sample). An additional robustness-enhancing feature is automatic filtration/filter back-flushing (AFFL), allowing hundreds of samples to be analyzed without any parts needing replacement. The platform can be installed by simple modification of a conventional LC-MS system.
Improved sampling and analysis of images in corneal confocal microscopy.
Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R
2017-10-01
Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is in constant progress, there are methodological limitations in terms of sampling of images and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method of the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. Twenty-three idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined in both a manual and automatic manner. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was not a statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling method indicating the importance of clear guidelines for the image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. This study underlines the importance of improving the analysis of the CCM images in order to obtain more objective corneal nerve fibre measurements. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
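A minimal sketch of the randomized frame-sampling idea and of an area-normalized nerve density follows; the frame count, sample size, nerve length and analysed area are arbitrary illustrative assumptions, not the authors' protocol.

```python
import random

def sample_frames(n_frames, n_samples, seed=None):
    """Draw a simple random sample of frame indices without replacement."""
    rng = random.Random(seed)
    return sorted(rng.sample(range(n_frames), n_samples))

def cnfl(total_nerve_length_um, analysed_area_mm2):
    """Corneal nerve fibre length density (mm of nerve per mm^2 of analysed area)."""
    return (total_nerve_length_um / 1000.0) / analysed_area_mm2

# e.g. randomly pick 8 of 40 acquired frames, then normalise by the adjusted analysed area
print(sample_frames(n_frames=40, n_samples=8, seed=1))
print(round(cnfl(total_nerve_length_um=6200.0, analysed_area_mm2=0.40), 2), "mm/mm^2")
```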
NASA Astrophysics Data System (ADS)
Sun, Ziheng; Fang, Hui; Di, Liping; Yue, Peng
2016-09-01
Fully automatic image classification that requires no user-supplied parameter values has long been an elusive goal for remote sensing experts, who usually spend hours tuning the input parameters of classification algorithms in order to obtain the best results. With the rapid development of knowledge engineering and cyberinfrastructure, many data processing and knowledge reasoning capabilities have become accessible, shareable and interoperable online. Building on these recent improvements, this paper presents the idea of parameterless automatic classification, which only requires an image and automatically outputs a labeled vector. No parameters or operations are needed from endpoint consumers. An approach is proposed to realize the idea. It adopts an ontology database to store the experience of tuning values for classifiers. A sample database is used to record training samples of image segments. Geoprocessing Web services are used as functionality blocks to carry out the basic classification steps. Workflow technology is employed to turn the overall image classification into a fully automatic process. A Web-based prototype system named PACS (Parameterless Automatic Classification System) was implemented. A number of images were fed into the system for evaluation purposes. The results show that the approach can automatically classify remote sensing images with fairly good average accuracy. The classified results will be more accurate if the two databases are of higher quality. Once the databases accumulate as much experience and as many samples as a human expert possesses, the approach should produce results of quality similar to that obtained by an expert. Since the approach is fully automatic and parameterless, it can not only relieve remote sensing workers from heavy and time-consuming parameter tuning work, but also significantly shorten the waiting time for consumers and facilitate their engagement in image classification activities. Currently, the approach is used only on high resolution optical three-band remote sensing imagery. The feasibility of using the approach on other kinds of remote sensing images, or of involving additional bands in classification, will be studied in future work.
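The toy sketch below illustrates the parameterless pattern described above: the consumer supplies only an image, while tuned parameter values and labelled training segments come from stored databases. The dictionaries, block-based segmentation and nearest-mean labelling are stand-ins invented for illustration, not the PACS services.

```python
import numpy as np

# Stand-ins for the two knowledge stores: an "experience" record holding tuned
# parameter values, and a sample database of labelled mean-colour training segments.
EXPERIENCE_DB = {"3band_hires": {"block": 8}}          # tuned segmentation block size
SAMPLE_DB = {"water": np.array([30, 60, 120]),
             "vegetation": np.array([40, 120, 50]),
             "built-up": np.array([150, 140, 130])}

def classify(image, sensor="3band_hires"):
    """Label each block of a 3-band image without any user-supplied parameters."""
    block = EXPERIENCE_DB[sensor]["block"]             # 1. retrieve tuned parameters
    h, w, _ = image.shape
    labels = []
    for i in range(0, h - block + 1, block):           # 2. segment (here: simple blocks)
        for j in range(0, w - block + 1, block):
            mean = image[i:i + block, j:j + block].reshape(-1, 3).mean(axis=0)
            # 3. label by the nearest training sample in spectral space
            labels.append(min(SAMPLE_DB, key=lambda k: np.linalg.norm(mean - SAMPLE_DB[k])))
    return labels

rng = np.random.default_rng(0)
img = rng.integers(0, 255, size=(32, 32, 3)).astype(float)
print(classify(img)[:5])
```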
90-kilobar diamond-anvil high-pressure cell for use on an automatic diffractometer.
Schiferl, D; Jamieson, J C; Lenko, J E
1978-03-01
A gasketed diamond-anvil high-pressure cell is described which can be used on a four-circle automatic diffractometer to collect x-ray intensity data from single-crystal samples subjected to truly hydrostatic pressures of over 90 kilobars. The force generating system exerts only forces normal to the diamond faces to obtain maximum reliability. A unique design allows exceptionally large open areas for maximum x-ray access and is particularly well suited for highly absorbing materials, as the x rays are not transmitted through the sample. Studies on ruby show that high-pressure crystal structure determinations may be done rapidly, reliably, and routinely with this system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uchida, Y., E-mail: h1312101@mailg.nc-toyama.ac.jp; Takada, E.; Fujisaki, A.
Neutron and γ-ray (n-γ) discrimination with a digital signal processing system has been used to measure the neutron emission profile in magnetic confinement fusion devices. However, the sampling rate must be set low to extend the measurement time because memory storage is limited, and the resulting time jitter degrades the discrimination quality. As described in this paper, a new charge comparison method was developed. Furthermore, an automatic n-γ discrimination method was examined using a probabilistic approach. Analysis results were evaluated using the figure of merit and show that the discrimination quality was improved. Automatic discrimination was applied using the EM algorithm and the k-means algorithm.
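A hedged sketch of the two ingredients named above, a charge-comparison feature followed by unsupervised clustering, is given below; the pulse shapes, gate lengths and use of scikit-learn's KMeans are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def charge_ratio(pulse, short_gate=10, long_gate=60):
    """Charge-comparison feature: tail-to-total integral ratio of a digitized pulse."""
    total = pulse[:long_gate].sum()
    tail = pulse[short_gate:long_gate].sum()
    return tail / total if total > 0 else 0.0

# Synthetic pulses: gammas decay fast, neutrons carry a slower (larger) tail component
rng = np.random.default_rng(2)
t = np.arange(80)
def make_pulse(tau):
    return np.exp(-t / tau) + rng.normal(0, 0.01, t.size)

pulses = [make_pulse(4.0) for _ in range(200)] + [make_pulse(12.0) for _ in range(200)]
features = np.array([[charge_ratio(p)] for p in pulses])

# Automatic (unsupervised) discrimination: two clusters, no threshold set by hand
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```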
a Sensor Based Automatic Ovulation Prediction System for Dairy Cows
NASA Astrophysics Data System (ADS)
Mottram, Toby; Hart, John; Pemberton, Roy
2000-12-01
Sensor scientists have been successful in developing detectors for tiny concentrations of rare compounds, but the work is rarely applied in practice. Any but the most trivial application of sensors requires a specification that should include a sampling system, a sensor, a calibration system and a model of how the information is to be used to control the process of interest. The specification of the sensor system should ask the following questions. How will the material to be analysed be sampled? What decision can be made with the information available from a proposed sensor? This project provides a model of a systems approach to the implementation of automatic ovulation prediction in dairy cows. A healthy, well-managed dairy cow should calve every year to make the best use of forage. As most cows are inseminated artificially, it is of vital importance that cows are regularly monitored for signs of oestrus. The pressure on dairymen to manage more cows often leads to less time being available for observation of cows to detect oestrus. This, together with breeding and feeding for increased yields, has led to a reduction in reproductive performance. In the UK the typical dairy farmer could save €12,800 per year if ovulation could be predicted accurately. Research over a number of years has shown that regular analysis of milk samples with tests based on enzyme-linked immunoassay (ELISA) can map the ovulation cycle. However, these tests require the farmer to implement a manually operated sampling and analysis procedure and the technique has not been widely taken up. The best potential method of achieving 98% specificity of prediction of ovulation is to adapt biosensor techniques to emulate the ELISA tests automatically in the milking system. An automated ovulation prediction system for dairy cows is specified. The system integrates a biosensor with automatic milk sampling and a herd management database. The biosensor is a screen-printed carbon electrode system capable of measuring concentrations of progesterone in milk in the range 0.3-25 ng/ml. The system, which is operational in the laboratory, is described here and will be working on a test farm in the near future to predict the ovulation of dairy cows routinely and automatically.
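As a toy illustration of how a milk progesterone time series could be turned into an oestrus alert, the sketch below flags a drop from a sustained luteal high to a low threshold; the threshold values and decision rule are assumptions made for illustration, not the project's validated model.

```python
def flag_oestrus(progesterone_ng_ml, high=10.0, low=2.0):
    """Flag a milking as a candidate oestrus event when progesterone falls from a
    sustained luteal high to below a low threshold (threshold values are illustrative)."""
    flags = []
    seen_luteal_phase = False
    for value in progesterone_ng_ml:
        if value >= high:
            seen_luteal_phase = True
        flags.append(seen_luteal_phase and value <= low)
        if value <= low:
            seen_luteal_phase = False   # reset once the drop has been flagged
    return flags

daily_milk_p4 = [1.5, 4.0, 12.0, 18.0, 15.0, 6.0, 1.2, 0.8, 3.5]
print(flag_oestrus(daily_milk_p4))
```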
Automation of TL brick dating by ADAM-1
NASA Astrophysics Data System (ADS)
Čechák, T.; Gerndt, J.; Hiršl, P.; Jiroušek, P.; Kanaval, J.; Kubelík, M.; Musílek, L.
2001-06-01
A specially adapted machine ADAM-1 for the thermoluminescence fine grain dating of bricks was constructed in an interdisciplinary research project, undertaken by a team recruited from three faculties of the Czech Technical University in Prague. This TL-reader is able to measure and evaluate automatically numerous samples. The sample holder has 60 sample positions, which allow the irradiation and evaluation of samples taken from two locations. All procedures of alpha and beta irradiation by varying doses and the TL-signal measurement as also the age evaluation and error assessment are programmable and fully automated.
Automatic Residential/Commercial Classification of Parcels with Solar Panel Detections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; Omitaomu, Olufemi A; Kotikot, Susan
A computational method to automatically detect solar panels on rooftops to aid policy and financial assessment of solar distributed generation. The code automatically classifies parcels containing solar panels in the U.S. as residential or commercial. The code allows the user to specify an input dataset containing parcels and detected solar panels, and then uses information about the parcels and solar panels to automatically classify the rooftops as residential or commercial using machine learning techniques. The zip file containing the code includes sample input and output datasets for the Boston and DC areas.
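A minimal sketch of the residential/commercial classification step using scikit-learn is shown below; the parcel features and synthetic data are invented for illustration and do not reflect the actual dataset schema or the model used in the code described above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative features per parcel with a detected solar array:
# [parcel_area_m2, building_footprint_m2, n_panels, total_panel_area_m2]
rng = np.random.default_rng(0)
residential = np.column_stack([rng.normal(800, 200, 300), rng.normal(180, 40, 300),
                               rng.poisson(12, 300), rng.normal(20, 5, 300)])
commercial = np.column_stack([rng.normal(6000, 1500, 300), rng.normal(2500, 600, 300),
                              rng.poisson(150, 300), rng.normal(260, 60, 300)])
X = np.vstack([residential, commercial])
y = np.array([0] * 300 + [1] * 300)          # 0 = residential, 1 = commercial

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```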
NASA Technical Reports Server (NTRS)
Coggeshall, M. E.; Hoffer, R. M.
1973-01-01
Remote sensing equipment and automatic data processing techniques were employed as aids in the institution of improved forest resource management methods. On the basis of automatically calculated statistics derived from manually selected training samples, the feature selection processor of LARSYS selected, upon consideration of various groups of the four available spectral regions, a series of channel combinations whose automatic classification performances (for six cover types, including both deciduous and coniferous forest) were tested, analyzed, and further compared with automatic classification results obtained from digitized color infrared photography.
NASA Astrophysics Data System (ADS)
Wessley, G. Jims John
2017-10-01
The propagation of shock waves through any medium results in an instantaneous increase in pressure and temperature behind the shockwave. The scope of utilizing this sudden rise in pressure and temperature in new industrial, biological and commercial areas has been explored and the opportunities are tremendous. This paper presents the design and testing of a portable semi-automatic shock tube on water samples mixed with salt. The preliminary analysis shows encouraging results, as the salinity of the water samples was reduced by up to 5% when bombarded with 250 shocks generated using a pressure ratio of 2.5. Ordinary printing paper is used as the diaphragm to generate the shocks. Shocks of much higher intensity, obtained using different diaphragms, are expected to lead to a greater reduction in the salinity of the sea water and thus to the production of potable water from saline water, which is the need of the hour.
Schroder, LeRoy J.; Malo, Bernard A.; ,
1985-01-01
The purpose of the National Trends Network is to delineate the major inorganic constituents in the wet deposition in the United States. The approach chosen to monitor the Nation's wet deposition is to install approximately 150 automatic sampling devices with at least one collector in each state. Samples are collected at one week intervals, removed from collectors, and transported to an analytical laboratory for chemical analysis. The quality assurance program has divided wet deposition monitoring into 5 parts: (1) Sampling site selection, (2) sampling device, (3) sample container, (4) sample handling, and (5) laboratory analysis. Each of these five components is being examined using existing designs or new designs. Each existing or proposed sampling site is visited and a criteria audit is performed.
Optimization and automation of quantitative NMR data extraction.
Bernstein, Michael A; Sýkora, Stan; Peng, Chen; Barba, Agustín; Cobas, Carlos
2013-06-18
NMR is routinely used to quantitate chemical species. The necessary experimental procedures to acquire quantitative data are well-known, but relatively little attention has been applied to data processing and analysis. We describe here a robust expert system that can be used to automatically choose the best signals in a sample for overall concentration determination and determine analyte concentration using all accepted methods. The algorithm is based on the complete deconvolution of the spectrum which makes it tolerant of cases where signals are very close to one another and includes robust methods for the automatic classification of NMR resonances and molecule-to-spectrum multiplets assignments. With the functionality in place and optimized, it is then a relatively simple matter to apply the same workflow to data in a fully automatic way. The procedure is desirable for both its inherent performance and applicability to NMR data acquired for very large sample sets.
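Once signals have been deconvolved and assigned, the concentration arithmetic itself is simple; the sketch below shows the standard internal-standard qNMR relation with purely illustrative numbers (it is not the expert system described above).

```python
def qnmr_concentration(area_analyte, n_h_analyte,
                       area_standard, n_h_standard,
                       conc_standard_mM):
    """Analyte concentration from integrals normalised per proton, relative to an
    internal standard of known concentration (standard single-pulse qNMR relation)."""
    return conc_standard_mM * (area_analyte / n_h_analyte) / (area_standard / n_h_standard)

# Illustrative numbers: analyte signal (2 protons) vs. internal standard singlet (9 protons, 5 mM)
print(qnmr_concentration(area_analyte=3.1e4, n_h_analyte=2,
                         area_standard=6.9e4, n_h_standard=9,
                         conc_standard_mM=5.0))
```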
Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen
2017-02-21
To expedite genome/proteome analysis, we have developed a Python package called Pse-Analysis. The powerful package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluating prediction quality. All the work a user needs to do is to input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis will automatically construct an ideal predictor, followed by yielding the predicted results for the submitted query samples. All the aforementioned tedious jobs can be automatically done by the computer. Moreover, the multiprocessing technique was adopted to enhance computational speed by about six-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be directly run on Windows, Linux, and Unix.
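Pse-Analysis has its own documented interface, which should be consulted directly; purely as an illustration of the same five-step pattern (feature extraction, parameter selection, training, cross-validation, evaluation), the generic scikit-learn sketch below uses an assumed k-mer featurization on toy sequences.

```python
from itertools import product
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

def kmer_features(seq, k=2, alphabet="ACGT"):
    """Simple k-mer count vector for a sequence (assumed, illustrative featurization)."""
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    return np.array([seq.count(km) for km in kmers], dtype=float)

# Toy benchmark dataset: GC-rich (positive) vs AT-rich (negative) sequences
rng = np.random.default_rng(1)
def random_seq(weights):
    return "".join(rng.choice(list("ACGT"), size=60, p=weights))

pos = [random_seq([0.15, 0.35, 0.35, 0.15]) for _ in range(40)]
neg = [random_seq([0.35, 0.15, 0.15, 0.35]) for _ in range(40)]
X = np.array([kmer_features(s) for s in pos + neg])
y = np.array([1] * 40 + [0] * 40)

# Parameter selection, model training and cross-validation in one pass
grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": ["scale", 0.01]}, cv=5)
grid.fit(X, y)
print("best params:", grid.best_params_, "cross-validated accuracy:", round(grid.best_score_, 3))
```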
Presley, Todd K.; Jamison, Marcael T.J.; Young, Stacie T.M.
2008-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. The program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream and to assess the effects from the H-1 storm drain on Manoa Stream. For this program, rainfall data were collected at three stations, continuous discharge data at four stations, and water-quality data at six stations, which include the four continuous discharge stations. This report summarizes rainfall, discharge, and water-quality data collected between July 1, 2007, and June 30, 2008. A total of 16 environmental samples were collected over two storms during July 1, 2007, to June 30, 2008, within the Halawa Stream drainage area. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Some samples were analyzed for only a partial list of these analytes because an insufficient volume of sample was collected by the automatic samplers. Three additional quality-assurance/quality-control samples were collected concurrently with the storm samples. A total of 16 environmental samples were collected over four storms during July 1, 2007, to June 30, 2008 at the H-1 Storm Drain. All samples at this site were collected using an automatic sampler. Samples generally were analyzed for total suspended solids, nutrients, chemical oxygen demand, oil and grease, total petroleum hydrocarbons, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc), although some samples were analyzed for only a partial list of these analytes. During the storm of January 29, 2008, 10 discrete samples were collected. Varying constituent concentrations were detected for the samples collected at different times during this storm event. Two quality-assurance/quality-control samples were collected concurrently with the storm samples. Three additional quality-assurance/quality-control samples were collected during routine sampler maintenance to check the effectiveness of equipment-cleaning procedures.
40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.
Code of Federal Regulations, 2010 CFR
2010-07-01
... applicable procedures specified in American Society for Testing and Materials (ASTM) method D 4057-95(2000... applicable procedures specified in ASTM method D 4177-95(2000), entitled “Standard Practice for Automatic... applicable procedures in ASTM method D 5842-95(2000), entitled “Standard Practice for Sampling and Handling...
Automated biowaste sampling system urine subsystem operating model, part 1
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Rosen, F.
1973-01-01
The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort which was accomplished thru the detail design, fabrication, and verification testing of an operating model of the subsystem.
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2014 CFR
2014-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2013 CFR
2013-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2013 CFR
2013-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2012 CFR
2012-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2012 CFR
2012-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
Larson, I. Lauren; Chiles, Marion M.; Miller, V. Clint
1993-01-01
Disclosed herein is a radiation detector providing for the in situ automatic sampling of fluids containing substances emitting radiation, especially Cerenkov radiation. The detector permits sampling within well casings and is self-purging such that no additional provisions must be established for the storage and disposal of contaminated fluids.
Automatic counting and classification of bacterial colonies using hyperspectral imaging
USDA-ARS?s Scientific Manuscript database
Detection and counting of bacterial colonies on agar plates is a routine microbiology practice to get a rough estimate of the number of viable cells in a sample. There have been a variety of different automatic colony counting systems and software algorithms mainly based on color or gray-scale pictu...
Exposure to Violent Video Games Increases Automatic Aggressiveness
ERIC Educational Resources Information Center
Uhlmann, Eric; Swanson, Jane
2004-01-01
The effects of exposure to violent video games on automatic associations with the self were investigated in a sample of 121 students. Playing the violent video game Doom led participants to associate themselves with aggressive traits and actions on the Implicit Association Test. In addition, self-reported prior exposure to violent video games…
Investigating Prompt Difficulty in an Automatically Scored Speaking Performance Assessment
ERIC Educational Resources Information Center
Cox, Troy L.
2013-01-01
Speaking assessments for second language learners have traditionally been expensive to administer because of the cost of rating the speech samples. To reduce the cost, many researchers are investigating the potential of using automatic speech recognition (ASR) as a means to score examinee responses to open-ended prompts. This study examined the…
NASA Astrophysics Data System (ADS)
Sánchez, Clara I.; Niemeijer, Meindert; Kockelkorn, Thessa; Abràmoff, Michael D.; van Ginneken, Bram
2009-02-01
Computer-aided Diagnosis (CAD) systems for the automatic identification of abnormalities in retinal images are gaining importance in diabetic retinopathy screening programs. A huge amount of retinal images are collected during these programs and they provide a starting point for the design of machine learning algorithms. However, manual annotations of retinal images are scarce and expensive to obtain. This paper proposes a dynamic CAD system based on active learning for the automatic identification of hard exudates, cotton wool spots and drusen in retinal images. An uncertainty sampling method is applied to select samples that need to be labeled by an expert from an unlabeled set of 4000 retinal images. It reduces the number of training samples needed to obtain an optimum accuracy by dynamically selecting the most informative samples. Results show that the proposed method increases the classification accuracy compared to alternative techniques, achieving an area under the ROC curve of 0.87, 0.82 and 0.78 for the detection of hard exudates, cotton wool spots and drusen, respectively.
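A minimal sketch of pool-based uncertainty sampling, the core of the approach described above, is given below on synthetic data; the classifier choice, query budget and oracle labels are assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic two-class pool standing in for a large set of unlabeled image samples
X_pool = rng.normal(size=(4000, 10))
oracle_y = (X_pool[:, 0] + 0.5 * X_pool[:, 1] > 0).astype(int)   # "expert" labels

labeled = list(rng.choice(len(X_pool), size=20, replace=False))  # small seed set
for _ in range(10):                                              # ten query rounds
    clf = LogisticRegression(max_iter=1000).fit(X_pool[labeled], oracle_y[labeled])
    proba = clf.predict_proba(X_pool)[:, 1]
    uncertainty = -np.abs(proba - 0.5)          # probability closest to 0.5 = most uncertain
    uncertainty[labeled] = -np.inf              # never re-query already labeled samples
    labeled.append(int(np.argmax(uncertainty))) # ask the expert to label this sample

print("labels requested:", len(labeled))
print("accuracy on the full pool:", clf.score(X_pool, oracle_y))
```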
Wojtas-Niziurski, Wojciech; Meng, Yilin; Roux, Benoit; Bernèche, Simon
2013-01-01
The potential of mean force describing conformational changes of biomolecules is a central quantity that determines the function of biomolecular systems. Calculating an energy landscape of a process that depends on three or more reaction coordinates might require a lot of computational power, making some multidimensional calculations practically impossible. Here, we present an efficient automated umbrella sampling strategy for calculating multidimensional potentials of mean force. The method progressively learns by itself, through a feedback mechanism, which regions of a multidimensional space are worth exploring and automatically generates a set of umbrella sampling windows that is adapted to the system. The self-learning adaptive umbrella sampling method is first explained with illustrative examples based on simplified reduced model systems, and then applied to two non-trivial situations: the conformational equilibrium of the pentapeptide Met-enkephalin in solution and ion permeation in the KcsA potassium channel. With this method, it is demonstrated that a significantly smaller number of umbrella windows needs to be employed to characterize the free energy landscape over the most relevant regions without any loss in accuracy. PMID:23814508
Using Isotope Ratio Infrared Spectrometer to determine δ13C and δ18O of carbonate samples
NASA Astrophysics Data System (ADS)
Smajgl, Danijela; Stöbener, Nils; Mandic, Magda
2017-04-01
The isotopic composition of calcifying organisms is a key tool for reconstructing past seawater temperature and water chemistry. Stable carbon and oxygen isotopes (δ13C and δ18O) in carbonates have therefore been widely used for the reconstruction of paleoenvironments. Precise and accurate determination of the isotopic composition of carbon (13C) and oxygen (18O) from carbonate samples, with proper referencing and a suitable data evaluation algorithm, presents a challenge for scientists. Mass spectrometry was long the only widely used technique for this kind of analysis, but recent advances make laser-based spectroscopy a viable alternative. The Thermo Scientific Delta Ray Isotope Ratio Infrared Spectrometer (IRIS) analyzer with the Universal Reference Interface (URI) Connect is one of those alternatives and, with the TELEDYNE Cetac ASX-7100 autosampler, extends the traditional offerings with a system of high precision and sample throughput. To establish the precision and accuracy of measurements, and to develop an optimal sample preparation method for measurements with the Delta Ray IRIS and URI Connect, IAEA reference materials were used. Preparation is similar to a Gas Bench II method. Carbonate material is added into the vials, flushed with CO2-free synthetic air and acidified with a few droplets of 104% H3PO4. The sample amount used for analysis can be as low as 200 μg. Samples are measured after acidification and an equilibration time of one hour at 70°C. The CO2 gas generated by the reaction is flushed into the variable volume inside the URI Connect through the Nafion-based built-in water trap. For this step, carrier gas (CO2-free air) is used to flush the gas from the vial into the variable volume, which has a maximum volume of 100 ml. A small amount of the sample is then used for automatic determination of the concentration present in the variable volume. The Thermo Scientific Qtegra Software automatically adjusts any additional dilution of the sample to achieve the desired concentration (usually 400 ppm) in the analyzer. As part of the workflow, reference gas measurements are regularly made at the same concentration as the sample to allow for automatic drift and linearity correction. With the described sample preparation and measurement method, samples are measured with standard deviations of less than 0.1‰ for δ13C and δ18O, respectively, and an accuracy of <0.01‰. The system can measure up to 100 samples per day. The equivalent of about 80 µg of pure CO2 gas is needed to complete an analysis. Due to its low weight and robustness, sample analysis can be performed in the field. Applying the new technology of isotope ratio infrared spectrometers in environmental and paleoenvironmental research can extend our knowledge of complex seawater history and the CO2 cycle.
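The reported δ values follow standard delta notation, δ = (R_sample/R_standard − 1) × 1000; a tiny helper is sketched below with purely illustrative ratios (the instrument software performs this conversion together with drift and linearity corrections).

```python
def delta_permil(r_sample, r_standard):
    """Delta notation in per mil: delta = (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample ratio about 0.2% above the reference ratio corresponds to roughly +2 per mil.
# Both ratios below are illustrative numbers, not recommended reference values.
print(round(delta_permil(0.011202, 0.011180), 2))
```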
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bizyaev, D. V.; Kautsch, S. J.; Mosenkov, A. V.
We present a catalog of true edge-on disk galaxies automatically selected from the Seventh Data Release of the Sloan Digital Sky Survey (SDSS). A visual inspection of the g, r, and i images of about 15,000 galaxies allowed us to split the initial sample of edge-on galaxy candidates into 4768 (31.8% of the initial sample) genuine edge-on galaxies, 8350 (55.7%) non-edge-on galaxies, and 1865 (12.5%) edge-on galaxies not suitable for simple automatic analysis because these objects either show signs of interaction and warps, or have nearby bright stars projected onto them. We added more candidate galaxies from the RFGC, EFIGI, RC3, and Galaxy Zoo catalogs found in the SDSS footprints. Our final sample consists of 5747 genuine edge-on galaxies. We estimate the structural parameters of the stellar disks (the stellar disk thickness, radial scale length, and central surface brightness) in the galaxies by analyzing photometric profiles in each of the g, r, and i images. We also perform simplified three-dimensional modeling of the light distribution in the stellar disks of edge-on galaxies from our sample. Our large sample is intended to be used for studying scaling relations in the stellar disks and bulges and for estimating parameters of the thick disks in different types of galaxies via image stacking. In this paper, we present the sample selection procedure and a general description of the sample.
An electrochemical albumin-sensing system utilizing microfluidic technology
NASA Astrophysics Data System (ADS)
Huang, Chao-June; Lu, Chiu-Chun; Lin, Thong-Yueh; Chou, Tse-Chuan; Lee, Gwo-Bin
2007-04-01
This paper reports an integrated microfluidic chip capable of detecting the concentration of albumin in urine by using an electrochemical method in an automatic format. The integrated microfluidic chip was fabricated by using microelectromechanical system techniques. The albumin detection was conducted by using the electrochemical sensing method, in which the albumin in urine was detected by measuring the difference of peak currents between a bare reference electrode and an albumin-adsorption electrode. To perform the detection of the albumin in an automatic format, pneumatic microvalves and micropumps were integrated onto the microfluidic chip. The albumin sample and interference mixture solutions such as homovanillic acid, dopamine, norepinephrine and epinephrine were first stored in one of the three reservoirs. Then the solution comprising the albumin sample and interference solutions was transported to pass through the detection zone utilizing the pneumatic micropump. Experimental data showed that the developed system can successfully detect the concentration of the albumin in the presence of interference materials. Compared with the traditional albumin-sensing method, the integrated microfluidic chip required smaller sample amounts and performed faster detection. Additionally, the microfluidic chip integrated with pneumatic micropumps and microvalves facilitates the transportation of the samples in an automatic mode with less human intervention. The development of the integrated microfluidic albumin-sensing system may be promising for biomedical applications. Preliminary results of the current paper were presented at the 2nd International Meeting on Microsensors and Microsystems 2006 (National Cheng Kung University, Tainan, Taiwan, 15-18 January).
NASA Astrophysics Data System (ADS)
Nelke, M.; Selker, J. S.; Udell, C.
2017-12-01
Reliable automatic water samplers allow repetitive sampling of various water sources over long periods of time without requiring a researcher on site, reducing human error as well as the monetary and time costs of traveling to the field, particularly when the scale of the sample period is hours or days. The high fixed cost of buying a commercial sampler with little customizability can be a barrier to research requiring repetitive samples, such as the analysis of septic water pre- and post-treatment. DIY automatic samplers proposed in the past sacrifice maximum volume, customizability, or scope of applications, among other features, in exchange for a lower net cost. The purpose of this project was to develop a low-cost, highly customizable, robust water sampler that is capable of sampling many sources of water for various analytes. A lightweight aluminum-extrusion frame was designed and assembled, chosen for its mounting system, strength, and low cost. Water is drawn from two peristaltic pumps through silicone tubing and directed into 24 foil-lined 250mL bags using solenoid valves. A programmable Arduino Uno microcontroller connected to a circuit board communicates with a battery operated real-time clock, initiating sampling stages. Period and volume settings are programmable in-field by the user via serial commands. The OPEnSampler is an open design, allowing the user to decide what components to use and the modular theme of the frame allows fast mounting of new manufactured or 3D printed components. The 24-bag system weighs less than 10kg and the material cost is under $450. Up to 6L of sample water can be drawn at a rate of 100mL/minute in either direction. Faster flowrates are achieved by using more powerful peristaltic pumps. Future design changes could allow a greater maximum volume by filling the unused space with more containers and adding GSM communications to send real time status information.
Michael, Joseph R.; Goehner, Raymond P.; Schlienger, Max E.
2001-01-01
A method and apparatus for determining the crystalline phase and crystalline characteristics of a sample. This invention provides a method and apparatus for unambiguously identifying and determining the crystalline phase and crystalline characteristics of a sample by using an electron beam generator, such as a scanning electron microscope, to obtain a backscattered electron Kikuchi pattern of a sample, and extracting crystallographic and composition data that is matched to database information to provide a quick and automatic method to identify crystalline phases.
Potato Operation: automatic detection of potato diseases
NASA Astrophysics Data System (ADS)
Lefebvre, Marc; Zimmerman, Thierry; Baur, Charles; Guegerli, Paul; Pun, Thierry
1995-01-01
The Potato Operation is a collaborative, multidisciplinary project in the domain of destructive testing of agricultural products. It aims at automating pulp sampling of potatoes in order to detect possible viral diseases. Such viruses can decrease field productivity by a factor of up to ten. A machine composed of three conveyor belts, a vision system and a robotic arm, all controlled by a PC, has been built. Potatoes are brought one by one from a bulk to the vision system, where they are seized by a rotating holding device. The sprouts, where the viral activity is maximum, are then detected by an active vision process operating on multiple views. The 3D coordinates of the sampling point are communicated to the robot arm holding a drill. Some flesh is then sampled by the drill and deposited into an ELISA plate. After sampling, the robot arm washes the drill in order to prevent any contamination. The PC simultaneously controls these processes: the conveying of the potatoes, the vision algorithms and the sampling procedure. The master process, that is, the vision procedure, makes use of three methods to achieve sprout detection. A profile analysis first locates the sprouts as protuberances. Two frontal analyses, respectively based on fluorescence and local variance, confirm the previous detection and provide the 3D coordinates of the sampling zone.
Recent Research on the Automated Mass Measuring System
NASA Astrophysics Data System (ADS)
Yao, Hong; Ren, Xiao-Ping; Wang, Jian; Zhong, Rui-Lin; Ding, Jing-An
This paper introduces research and development on robotic mass measurement systems, including representative automatic systems, and then discusses a sub-multiple calibration scheme adopted on a fully automatic CCR10 system. An automatic robot system can perform the dissemination of the mass scale without any manual intervention, as well as fast calibration of weight samples against a reference weight. Finally, an evaluation of the expanded uncertainty is given.
Integrated crystal mounting and alignment system for high-throughput biological crystallography
Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William F.; Yegian, Derek T.; Earnest, Thomas N.; Jaklevich, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.
2007-09-25
A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.
Integrated crystal mounting and alignment system for high-throughput biological crystallography
Nordmeyer, Robert A.; Snell, Gyorgy P.; Cornell, Earl W.; Kolbe, William; Yegian, Derek; Earnest, Thomas N.; Jaklevic, Joseph M.; Cork, Carl W.; Santarsiero, Bernard D.; Stevens, Raymond C.
2005-07-19
A method and apparatus for the transportation, remote and unattended mounting, and visual alignment and monitoring of protein crystals for synchrotron generated x-ray diffraction analysis. The protein samples are maintained at liquid nitrogen temperatures at all times: during shipment, before mounting, mounting, alignment, data acquisition and following removal. The samples must additionally be stably aligned to within a few microns at a point in space. The ability to accurately perform these tasks remotely and automatically leads to a significant increase in sample throughput and reliability for high-volume protein characterization efforts. Since the protein samples are placed in a shipping-compatible layered stack of sample cassettes each holding many samples, a large number of samples can be shipped in a single cryogenic shipping container.
[Study on the automatic parameters identification of water pipe network model].
Jia, Hai-Feng; Zhao, Qi-Feng
2010-01-01
Based on an analysis of the problems in the development and application of water pipe network models, the automatic identification of model parameters is regarded as a key bottleneck for the model's application in water supply enterprises. A methodology for the automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed. The core algorithms of automatic parameter identification are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte Carlo Sampling) is used for automatic identification of parameters; the detailed technical route based on RSA and MCS is presented. A module for the automatic identification of water pipe network model parameters was developed. Finally, taking a typical water pipe network as a case, a case study on automatic identification of model parameters is conducted and satisfactory results are achieved.
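A hedged sketch of the Monte Carlo sampling step on an invented two-pipe toy model follows: candidate roughness coefficients are sampled, the sets whose simulated pressures match pseudo-SCADA observations are retained, and their spread indicates parameter sensitivity. The hydraulic model, tolerances and parameter ranges are assumptions, not the paper's module.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_pressure(c1, c2, demand=50.0):
    """Toy two-pipe head-loss model standing in for a hydraulic solver (illustrative only)."""
    return 60.0 - demand ** 1.852 * (1.0 / c1 ** 1.852 + 1.0 / c2 ** 1.852) * 0.3

# Pseudo-SCADA observations generated from assumed "true" roughness coefficients
observed = simulate_pressure(110.0, 95.0) + rng.normal(0, 0.05, size=20)

# Monte Carlo sampling of candidate roughness coefficients
candidates = rng.uniform(60.0, 150.0, size=(20000, 2))
pred = np.array([simulate_pressure(c1, c2) for c1, c2 in candidates])
rmse = np.sqrt(np.mean((pred[:, None] - observed) ** 2, axis=1))

behavioural = candidates[rmse < 0.1]          # RSA-style "behavioural" acceptance
print("accepted parameter sets:", len(behavioural))
print("behavioural means (C1, C2):", behavioural.mean(axis=0).round(1))
print("behavioural spread (C1, C2):", behavioural.std(axis=0).round(1))
```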
NASA Astrophysics Data System (ADS)
Laurent, B.; Losno, R.; Chevaillier, S.; Vincent, J.; Roullet, P.; Bon Nguyen, E.; Ouboulmane, N.; Triquet, S.; Fornier, M.; Raimbault, P.; Bergametti, G.
2015-07-01
Deposition is one of the key terms of the mineral dust cycle. However, dust deposition remains poorly constrained in transport models simulating the atmospheric dust cycle. This is mainly due to the limited number of relevant deposition measurements. This paper aims to present an automatic collector (CARAGA), specially developed to sample the total (dry and wet) atmospheric deposition of insoluble dust in remote areas. The autonomy of the CARAGA can range from 25 days to almost 1 year depending on the programmed sampling frequency (from 1 day to 2 weeks respectively). This collector is used to sample atmospheric deposition of Saharan dust on the Frioul islands in the Gulf of Lions in the Western Mediterranean. To quantify the mineral dust mass in deposition samples, a weighing and ignition protocol is applied. Almost 2 years of continuous deposition measurements performed on a weekly sampling basis on Frioul Island are presented and discussed with air mass trajectories and satellite observations of dust. Insoluble mineral deposition measured on Frioul Island was 2.45 g m-2 for February to December 2011 and 3.16 g m-2 for January to October 2012. Nine major mineral deposition events, measured during periods with significant MODIS aerosol optical depths, were associated with air masses coming from the southern Mediterranean Basin and North Africa.
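Converting a weighed insoluble residue into an areal deposition flux is simple arithmetic; the sketch below assumes an illustrative funnel diameter and sample mass, not the CARAGA specification.

```python
import math

def deposition_flux(mass_g, funnel_diameter_m):
    """Areal deposition (g m^-2) for one sampling period from the collected insoluble mass."""
    area_m2 = math.pi * (funnel_diameter_m / 2.0) ** 2
    return mass_g / area_m2

# e.g. 5 mg of insoluble residue collected on an assumed 0.20 m diameter funnel in one week
print(f"{deposition_flux(0.005, 0.20):.3f} g m-2 per week")
```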
ERIC Educational Resources Information Center
Nash, Hannah M.; Gooch, Debbie; Hulme, Charles; Mahajan, Yatin; McArthur, Genevieve; Steinmetzger, Kurt; Snowling, Margaret J.
2017-01-01
The "automatic letter-sound integration hypothesis" (Blomert, [Blomert, L., 2011]) proposes that dyslexia results from a failure to fully integrate letters and speech sounds into automated audio-visual objects. We tested this hypothesis in a sample of English-speaking children with dyslexic difficulties (N = 13) and samples of…
Automatic Method of Pause Measurement for Normal and Dysarthric Speech
ERIC Educational Resources Information Center
Rosen, Kristin; Murdoch, Bruce; Folker, Joanne; Vogel, Adam; Cahill, Louise; Delatycki, Martin; Corben, Louise
2010-01-01
This study proposes an automatic method for the detection of pauses and identification of pause types in conversational speech for the purpose of measuring the effects of Friedreich's Ataxia (FRDA) on speech. Speech samples of [approximately] 3 minutes were recorded from 13 speakers with FRDA and 18 healthy controls. Pauses were measured from the…
40 CFR 98.7 - What standardized methods are incorporated by reference into this part?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 2005) Standard Practice for Automatic Sampling of Petroleum and Petroleum Products, IBR approved for... from Railroad Cars, Barges, Trucks, or Stockpiles, IBR approved for § 98.164(b). (35) ASTM D7430-08ae1... Liquids—Automatic pipeline sampling—Second Edition 1988-12-01, IBR approved for § 98.164(b). (3) [Reserved...
ERIC Educational Resources Information Center
Wicki, Werner; Hurschler Lichtsteiner, Sibylle
2018-01-01
Although fluency and automaticity of handwriting have been recognized as important research topics for 30 years, empirical data on respective developmental courses among typically developing children as well as clinical samples have remained very limited. To fill this gap, this study investigates the development of handwriting automaticity…
Urine sampling and collection system optimization and testing
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Geating, J. A.; Koesterer, M. G.
1975-01-01
A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.
Automatic interpretation of ERTS data for forest management
NASA Technical Reports Server (NTRS)
Kirvida, L.; Johnson, G. R.
1973-01-01
Automatic stratification of forested land from ERTS-1 data provides a valuable tool for resource management. The results are useful for wood product yield estimates, recreation and wildlife management, forest inventory, and forest condition monitoring. Automatic procedures based on both multispectral and spatial features are evaluated. With five classes, training and testing on the same samples, classification accuracy of 74% was achieved using the MSS multispectral features. When adding texture computed from 8 x 8 arrays, classification accuracy of 99% was obtained.
The role of automatic control in future interplanetary spaceflight
NASA Technical Reports Server (NTRS)
Scull, J. R.; Moore, J. W.
1976-01-01
The paper reviews the guidance and automatic control techniques used in previous U.S. and Soviet lunar and planetary exploration spacecraft, and examines the objectives and requirements of potential future interplanetary missions from the viewpoint of their further demands on automatic control technology. These missions include the Venus orbital imaging radar mission, the Pioneer Mars penetrator mission, the Mars surface sample return mission, Pioneer Saturn/Uranus/Titan probe missions, the Mariner Jupiter orbiter with daughter satellite, and comet and asteroid missions.
NASA Astrophysics Data System (ADS)
Alyassin, Abdal M.
2002-05-01
3D digital mammography (3DDM) is a new technology that provides high resolution X-ray breast tomographic data. As with other tomographic medical imaging modalities, viewing a stack of tomographic images can be time consuming, especially when the images have a large matrix size, and it can be difficult to mentally reconstruct 3D breast structures from the stack. Therefore, there is a need to readily visualize the data in 3D. However, one of the issues that hinder the use of volume rendering (VR) is finding an automatic way to generate transfer functions that efficiently map the important diagnostic information in the data. We have developed a method that randomly samples the volume. Based on the mean and the standard deviation of these samples, the technique determines the lower limit and upper limit of a piecewise linear ramp transfer function. We have volume rendered several 3DDM data sets using this technique and visually compared the outcome with the result from a conventional automatic technique. The transfer function generated through the proposed technique provided superior VR images over the conventional technique. Furthermore, the improvement in the reproducibility of the transfer function correlated with the number of samples taken from the volume, at the expense of processing time.
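The abstract does not state exactly how the ramp limits are derived from the sample statistics. The sketch below assumes the lower and upper limits are placed a fixed number of standard deviations around the sample mean; k and n_samples are hypothetical tuning parameters, not values from the paper.

```python
import numpy as np

def ramp_transfer_function(volume, n_samples=10000, k=1.5, seed=0):
    """Estimate a piecewise-linear ramp opacity transfer function from
    random voxel samples (sketch; k and n_samples are assumed values)."""
    rng = np.random.default_rng(seed)
    flat = volume.ravel()
    samples = rng.choice(flat, size=min(n_samples, flat.size), replace=False)
    mean, std = samples.mean(), samples.std()
    lower, upper = mean - k * std, mean + k * std   # ramp end points

    def opacity(intensity):
        # 0 below the lower limit, 1 above the upper limit, linear in between
        return np.clip((intensity - lower) / (upper - lower), 0.0, 1.0)

    return opacity, (lower, upper)
```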
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Z; Folkerts, M; Jiang, S
Purpose: We have previously developed a GPU-OpenCL-based MC dose engine named goMC with a built-in analytical linac beam model. To move goMC towards routine clinical use, we have developed an automatic beam-commissioning method and an efficient source sampling strategy to facilitate dose calculations for real treatment plans. Methods: Our commissioning method automatically adjusts the relative weights among the sub-sources through an optimization process that minimizes the discrepancies between calculated dose and measurements. Six models built for Varian Truebeam linac photon beams (6MV, 10MV, 15MV, 18MV, 6MVFFF, 10MVFFF) were commissioned using measurement data acquired at our institution. To facilitate dose calculations for real treatment plans, we employed an inverse sampling method to efficiently incorporate MLC leaf-sequencing into source sampling. Specifically, instead of sampling source particles control-point by control-point and rejecting the particles blocked by the MLC, we assigned a control-point index to each sampled source particle according to the MLC leaf-open duration of each control point at the pixel where the particle intersects the iso-center plane. Results: Our auto-commissioning method decreased the distance-to-agreement (DTA) of depth dose at build-up regions by 36.2% on average, making it within 1 mm. Lateral profiles were better matched for all beams, with the biggest improvement found at 15MV, for which the root-mean-square difference was reduced from 1.44% to 0.50%. Maximum differences of output factors were reduced to less than 0.7% for all beams, with the largest decrease, from 1.70% to 0.37%, found at 10FFF. Our new sampling strategy was tested on a Head & Neck VMAT patient case. Achieving clinically acceptable accuracy, the new strategy could reduce the required history number by a factor of ∼2.8 for a given statistical uncertainty level and hence achieve a similar speed-up factor. Conclusion: Our studies have demonstrated the feasibility and effectiveness of our auto-commissioning approach and new efficient source sampling strategy, implying the potential of our GPU-based MC dose engine goMC for routine clinical use.
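A minimal sketch of the inverse-sampling idea described in the Methods above, assuming each pixel of the iso-center plane stores a per-control-point leaf-open duration. The array layout, function name, and CPU/NumPy formulation are illustrative assumptions, not goMC code.

```python
import numpy as np

def sample_control_point(open_durations, rng):
    """Draw a control-point index with probability proportional to the MLC
    leaf-open duration at the pixel the particle crosses (sketch only)."""
    cdf = np.cumsum(open_durations, dtype=float)
    if cdf[-1] <= 0:
        return None                      # pixel always blocked: reject particle
    cdf /= cdf[-1]
    return int(np.searchsorted(cdf, rng.random(), side="right"))

# usage: assign indices to a batch of particles crossing the same pixel
rng = np.random.default_rng(42)
durations = np.array([0.0, 0.3, 1.2, 0.5])   # hypothetical per-control-point values
indices = [sample_control_point(durations, rng) for _ in range(5)]
```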
Williams, Shannon D.; Farmer, James
2003-01-01
The U.S. Geological Survey (USGS), in cooperation with the Tennessee Department of Environment and Conservation, Division of Superfund, collected discharge, rainfall, continuous water-quality (temperature, dissolved oxygen, specific conductance, and pH), and volatile organic compound (VOC) data from three karst springs in Middle Tennessee from February 2000 to May 2001. Continuous monitoring data indicated that each spring responds differently to storms. Water quality and discharge at Wilson Spring, which is located in the Central Basin karst region of Tennessee, changed rapidly after rainfall. Water quality and discharge also varied at Cascade Spring; however, changes did not occur as frequently or as quickly as changes at Wilson Spring. Water quality and discharge at Big Spring at Rutledge Falls changed little in response to storms. Cascade Spring and Big Spring at Rutledge Falls are located in similar hydrogeologic settings on the escarpment of the Highland Rim. Nonisokinetic dip-sampling methods were used to collect VOC samples from the springs during base-flow conditions. During selected storms, automatic samplers were used to collect water samples at Cascade Spring and Wilson Spring. Water samples were collected as frequently as every 15 minutes at the beginning of a storm, and sampling intervals were gradually increased following a storm. VOC samples were analyzed using a portable gas chromatograph (GC). VOC samples were collected from Wilson, Cascade, and Big Springs during 600, 199, and 55 sampling times, respectively, from February 2000 to May 2001. Chloroform concentrations detected at Wilson Spring ranged from 0.073 to 34 mg/L (milligrams per liter). Chloroform concentrations changed during most storms; the greatest change detected was during the first storm in fall 2000, when chloroform concentrations increased from about 0.5 to about 34 mg/L. Concentrations of cis-1,2-dichloroethylene (cis-1,2-DCE) detected at Cascade Spring ranged from 0.30 to 1.8 µg/L (micrograms per liter) and gradually decreased between November 2000 and May 2001. In addition to the gradual decrease in cis-1,2-DCE concentrations, some additional decreases were detected during storms. VOC samples collected at weekly intervals from Big Spring indicated a gradual decrease in trichloroethylene (TCE) concentrations from approximately 9 to 6 µg/L between November 2000 and May 2001. Significant changes in TCE concentrations were not detected during individual storms at Big Spring. Quality-control samples included trip blanks, equipment blanks, replicates, and field-matrix spike samples. VOC concentrations measured using the portable GC were similar to concentrations in replicate samples analyzed by the USGS National Water Quality Laboratory (NWQL) with the exception of chloroform and TCE concentrations. Chloroform and TCE concentrations detected by the portable GC were consistently lower (median percent differences of −19.2 and −17.4, respectively) than NWQL results. High correlations, however, were observed between concentrations detected by the portable GC and concentrations detected by the NWQL (Pearson's r > 0.96). VOC concentrations in automatically collected samples were similar to concentrations in replicates collected using dip-sampling methods. More than 80 percent of the VOC concentrations measured in automatically collected samples were within 12 percent of concentrations in dip samples.
Progressive compressive imager
NASA Astrophysics Data System (ADS)
Evladov, Sergei; Levi, Ofer; Stern, Adrian
2012-06-01
We have designed and built a working automatic progressive sampling imaging system based on the vector sensor concept, which utilizes a unique sampling scheme of Radon projections. This sampling scheme makes it possible to progressively add information, resulting in a tradeoff between compression and the quality of reconstruction. The uniqueness of our sampling is that at any moment of the acquisition process the reconstruction can produce a reasonable version of the image. The advantage of the gradual addition of samples is seen when the sparsity rate of the object, and thus the number of measurements needed, is unknown. We have developed the iterative algorithm OSO (Ordered Sets Optimization), which employs our sampling scheme for the creation of nearly uniformly distributed sets of samples and allows the reconstruction of megapixel images. We present good quality reconstruction from data compressed at ratios of 1:20.
NASA Astrophysics Data System (ADS)
Santospirito, S. P.; Słyk, Kamil; Luo, Bin; Łopatka, Rafał; Gilmour, Oliver; Rudlin, John
2013-05-01
Detection of defects in Laser Powder Deposition (LPD) produced components has been achieved by laser thermography. An automatic in-process NDT defect detection software system has been developed for the analysis of laser thermography to automatically detect, reliably measure and then sentence defects in individual beads of LPD components. A deposition path profile definition has been introduced so that all laser powder deposition beads can be modeled, and the inspection system has been developed to automatically generate an optimized inspection plan in which sampling images follow the deposition track, and to automatically control and communicate with robot arms, the source laser and cameras to implement image acquisition. Algorithms were developed so that defect sizes can be correctly evaluated, and these have been confirmed using test samples. Individual inspection images can also be stitched together for a single bead, a layer of beads or multiple layers of beads so that defects can be mapped through the additive process. A mathematical model was built to analyze and evaluate the movement of heat throughout the inspected bead. Inspection processes were developed, and positional and temporal gradient algorithms have been used to measure the flaw sizes. Defect analysis is then performed to determine whether the defect(s) can be further classified (crack, lack of fusion, porosity), and the sentencing engine then compares the most significant defect or group of defects against the acceptance criteria, independent of human decisions. Testing on manufactured defects from the EC-funded INTRAPID project has successfully detected and correctly sentenced all samples.
Deem, J F; Manning, W H; Knack, J V; Matesich, J S
1989-09-01
A program for the automatic extraction of jitter (PAEJ) was developed for the clinical measurement of pitch perturbations using a microcomputer. The program currently includes 12 implementations of an algorithm for marking the boundary criteria for a fundamental period of vocal fold vibration. The relative sensitivity of these extraction procedures for identifying the pitch period was compared using sine waves. Data obtained to date provide information for each procedure concerning the effects of waveform peakedness and slope, sample duration in cycles, noise level of the analysis system with both direct and tape recorded input, and the influence of interpolation. Zero crossing extraction procedures provided lower jitter values regardless of sine wave frequency or sample duration. The procedures making use of positive- or negative-going zero crossings with interpolation provided the lowest measures of jitter with the sine wave stimuli. Pilot data obtained with normal-speaking adults indicated that jitter measures varied as a function of the speaker, vowel, and sample duration.
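As an illustration of the kind of zero-crossing extraction procedure described above, the following sketch marks positive-going zero crossings with linear interpolation and reports a relative jitter measure. It is not the published PAEJ implementation, and the jitter formula used here (mean absolute difference of consecutive periods relative to the mean period) is one common convention.

```python
import numpy as np

def jitter_from_zero_crossings(signal, fs):
    """Estimate relative jitter (%) from positive-going zero crossings with
    linear interpolation of the crossing instants (illustrative sketch)."""
    s = np.asarray(signal, dtype=float)
    idx = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]       # positive-going crossings
    # interpolate the exact crossing time between samples idx and idx+1
    t = (idx - s[idx] / (s[idx + 1] - s[idx])) / fs
    periods = np.diff(t)                                   # fundamental periods (s)
    if len(periods) < 2:
        return np.nan
    return 100.0 * np.mean(np.abs(np.diff(periods))) / np.mean(periods)
```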
Ophus, Colin; Rasool, Haider I.; Linck, Martin; ...
2016-11-30
We develop an automatic and objective method to measure and correct residual aberrations in atomic-resolution HRTEM complex exit waves for crystalline samples aligned along a low-index zone axis. Our method uses the approximate rotational point symmetry of a column of atoms or single atom to iteratively calculate a best-fit numerical phase plate for this symmetry condition, and does not require information about the sample thickness or precise structure. We apply our method to two experimental focal series reconstructions, imaging a β-Si3N4 wedge with O and N doping, and a single-layer graphene grain boundary. We use peak and lattice fitting to evaluate the precision of the corrected exit waves. We also apply our method to the exit wave of a Si wedge retrieved by off-axis electron holography. In all cases, the software correction of the residual aberration function improves the accuracy of the measured exit waves.
Evaluation of a depth proportional intake device for automatic pumping samplers
Rand E. Eads; Robert B. Thomas
1983-01-01
Abstract - A depth proportional intake boom for portable pumping samplers was used to collect suspended sediment samples in two coastal streams for three winters. The boom pivots on the stream bed while a float on the downstream end allows debris to depress the boom and pass without becoming trapped. This equipment modifies point sampling by maintaining the intake...
Electron paramagnetic resonance of several lunar rock samples
NASA Technical Reports Server (NTRS)
Marov, P. N.; Dubrov, Y. N.; Yermakov, A. N.
1974-01-01
The results of an investigation of lunar rock samples returned by the Luna 16 automatic station using electron paramagnetic resonance (EPR) are presented. The EPR technique makes it possible to detect paramagnetic centers and investigate their nature with high sensitivity. Regolith (finely dispersed material) and five particles from it, 0.3 mm in size and consisting mostly of olivine, were investigated with EPR.
NASA Astrophysics Data System (ADS)
Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.
2016-01-01
We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using a Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is made automatically and objectively by spectral matching comparison of the MCR-decomposed Raman spectra and the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples: 10 OSCC and 10 normal tissues from the same 10 patients, and 3 OSCC and 1 normal tissue from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how positivity is defined) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the “molecular fingerprint” of keratin.
Integration and segregation of large-scale brain networks during short-term task automatization
Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F.; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes
2016-01-01
The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes. PMID:27808095
Enhancing the Automatic Generation of Hints with Expert Seeding
ERIC Educational Resources Information Center
Stamper, John; Barnes, Tiffany; Croy, Marvin
2011-01-01
The Hint Factory is an implementation of our novel method to automatically generate hints using past student data for a logic tutor. One disadvantage of the Hint Factory is the time needed to gather enough data on new problems in order to provide hints. In this paper we describe the use of expert sample solutions to "seed" the hint generation…
Assessing Children's Home Language Environments Using Automatic Speech Recognition Technology
ERIC Educational Resources Information Center
Greenwood, Charles R.; Thiemann-Bourque, Kathy; Walker, Dale; Buzhardt, Jay; Gilkerson, Jill
2011-01-01
The purpose of this research was to replicate and extend some of the findings of Hart and Risley using automatic speech processing instead of human transcription of language samples. The long-term goal of this work is to make the current approach to speech processing possible by researchers and clinicians working on a daily basis with families and…
Selby, Edward A; Nock, Matthew K; Kranzler, Amy
2014-02-28
One of the most frequently reported, yet understudied, motivations for non-suicidal self-injury (NSSI) involves automatic positive reinforcement (APR), wherein sensations arising from NSSI reinforce and promote the behavior. The current study used experience sampling methodology with a clinical sample of self-injuring adolescents (N=30) over a 2-week period during which the adolescents reported NSSI behaviors, and rated whether an APR motivation was present, and if so whether that motivation pertained to feeling "pain," "stimulation," or "satisfaction." Over 50% of the sample reported at least one instance of NSSI for APR reasons. No significant differences were found on demographic factors or psychiatric comorbidity for those with and without an APR motivation. However, those with an APR motivation reported elevated NSSI thoughts, longer duration of those thoughts, and more NSSI behaviors. They also reported more alcohol use thoughts, alcohol use, impulsive spending, and binge eating. The most commonly reported sensation following NSSI for APR was "satisfaction." However, those endorsing feeling pain reported the most NSSI behaviors. These findings provide new information about the APR motivations for NSSI and shed light on the different sensations felt. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Becker, Holger; Schattschneider, Sebastian; Klemm, Richard; Hlawatsch, Nadine; Gärtner, Claudia
2015-03-01
The continuous monitoring of the environment for lethal pathogens is a central task in the field of biothreat detection. Typical scenarios involve air-sampling in locations such as public transport systems or large public events and a subsequent analysis of the samples by a portable instrument. Lab-on-a-chip technologies are one of the promising technological candidates for such a system. We have developed an integrated microfluidic system with automatic sampling for the detection of CBRNE-related pathogens. The chip implements a two-pronged analysis strategy: on the one hand, an immunological track using antibodies immobilized on a frit and subsequent photometric detection; on the other hand, a molecular biology approach using continuous-flow PCR with fluorescence end-point detection. The cartridge contains a two-component molded rotary valve to allow active fluid control and switching between channels. The accompanying instrument contains all elements for fluidic and valve actuation and thermal control, as well as the two detection modalities. Reagents are stored in dedicated reagent packs which are connected directly to the cartridge. With this system, we have been able to demonstrate the detection of a variety of pathogen species.
Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes
NASA Astrophysics Data System (ADS)
Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao
2010-06-01
To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.
Cognitive tasks promote automatization of postural control in young and older adults.
Potvin-Desrochers, Alexandra; Richer, Natalie; Lajoie, Yves
2017-09-01
Researchers looking at the effects of performing a concurrent cognitive task on postural control in young and older adults using traditional center-of-pressure measures and complexity measures have found discordant results. Results of experiments showing improvements of stability have suggested the use of strategies such as automatization of postural control or a stiffening strategy. This experiment used sample entropy to confirm that, in healthy young and older adults, performing a cognitive task while standing leads to improvements attributable to automaticity of sway. Twenty-one young adults and twenty-five older adults were asked to stand on a force platform while performing a cognitive task. There were four cognitive tasks: simple reaction time, go/no-go reaction time, equation, and occurrence of a digit in a number sequence. Results demonstrated decreased sway area and variability as well as increased sample entropy for both groups when performing a cognitive task. Results suggest that performing a concurrent cognitive task promotes the adoption of automatic postural control in young and older adults, as evidenced by increased postural stability and postural sway complexity. Copyright © 2017 Elsevier B.V. All rights reserved.
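Sample entropy, the complexity measure used in this study, can be computed from a center-of-pressure trace roughly as in the sketch below. The embedding dimension m = 2 and tolerance r = 0.2·SD are conventional defaults, not parameters reported in the abstract.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy of a 1-D signal (e.g., a centre-of-pressure trace).
    Minimal O(N^2) sketch; m and r = r_factor * std are assumed defaults."""
    x = np.asarray(x, dtype=float)
    n, r = len(x), r_factor * np.std(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
            count += np.sum(d <= r) - 1                           # exclude self-match
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```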
Uchida, Y.; Takada, E.; Fujisaki, A.; Isobe, M.; Shinohara, K.; Tomita, H.; Kawarabayashi, J.; Iguchi, T.
2014-01-01
Neutron and γ-ray (n-γ) discrimination with a digital signal processing system has been used to measure the neutron emission profile in magnetic confinement fusion devices. However, the sampling rate must be set low to extend the measurement time because memory storage is limited. Time jitter caused by the low sampling rate degrades the discrimination quality. As described in this paper, a new charge comparison method was developed. Furthermore, an automatic n-γ discrimination method was examined using a probabilistic approach. Analysis results were evaluated using the figure of merit. Results show that the discrimination quality was improved. Automatic discrimination was applied using the EM algorithm and the k-means algorithm. PMID:25430297
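A sketch of the charge-comparison feature (tail-to-total charge ratio) followed by an unsupervised two-cluster split, in the spirit of the k-means approach mentioned above. The split index t_split, the feature choice, and the use of scikit-learn are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def psd_features(pulses, t_split):
    """Charge-comparison features for baseline-subtracted digitized pulses:
    total integral and tail-to-total ratio (tail = samples after t_split)."""
    total = pulses.sum(axis=1)
    tail = pulses[:, t_split:].sum(axis=1)
    return np.column_stack([total, tail / total])

def discriminate(pulses, t_split=40):
    """Unsupervised n-gamma separation with k-means on the PSD features."""
    feats = psd_features(np.asarray(pulses, dtype=float), t_split)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
    # the cluster with the larger mean tail ratio is taken as the neutron branch
    neutron_cluster = np.argmax([feats[labels == k, 1].mean() for k in (0, 1)])
    return labels == neutron_cluster
```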
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolme, David S; Tokola, Ryan A; Boehnen, Chris Bensing
Automatic recognition systems are a valuable tool for identifying unknown deceased individuals. Immediately after death, fingerprint and face biometric samples are easy to collect using standard sensors and cameras and can be easily matched to ante-mortem biometric samples. Even though post-mortem fingerprints and faces have been used for decades, there are no studies that track these biometrics through the later stages of decomposition to determine the length of time the biometrics remain viable. This paper discusses a multimodal dataset of fingerprints, faces, and irises from 14 human cadavers that decomposed outdoors under natural conditions. Results include predictive models relating time and temperature, measured as Accumulated Degree Days (ADD), and season (winter, spring, summer) to the predicted probability of automatic verification using a commercial algorithm.
Currens, J.C.
1999-01-01
Analytical data for nitrate and triazines from 566 samples collected over a 3-year period at Pleasant Grove Spring, Logan County, KY, were statistically analyzed to determine the minimum data set needed to calculate meaningful yearly averages for a conduit-flow karst spring. Results indicate that a biweekly sampling schedule augmented with bihourly samples from high-flow events will provide meaningful suspended-constituent and dissolved-constituent statistics. Unless collected over an extensive period of time, daily samples may not be representative and may also be autocorrelated. All high-flow events resulting in a significant deflection of a constituent from base-line concentrations should be sampled. Either the geometric mean or the flow-weighted average of the suspended constituents should be used. If automatic samplers are used, then they may be programmed to collect storm samples as frequently as every few minutes to provide details on the arrival time of constituents of interest. However, only samples collected bihourly should be used to calculate averages. By adopting a biweekly sampling schedule augmented with high-flow samples, the need to continuously monitor discharge, or to search for and analyze existing data to develop a statistically valid monitoring plan, is lessened.
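The two summary statistics recommended above can be computed directly from paired concentration and discharge records; this is a generic sketch, not the processing code used in the study.

```python
import numpy as np

def flow_weighted_mean(concentrations, discharges):
    """Flow-weighted average concentration: sum(C_i * Q_i) / sum(Q_i)."""
    c, q = np.asarray(concentrations, float), np.asarray(discharges, float)
    return np.sum(c * q) / np.sum(q)

def geometric_mean(concentrations):
    """Geometric mean of strictly positive concentrations."""
    c = np.asarray(concentrations, float)
    return np.exp(np.mean(np.log(c)))
```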
Dielectrophoretic manipulation of particles for use in microfluidic devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belgrader, P; Bettencourt, K; Hamilton, J
1999-06-23
Amplification and hybridization of DNA are commonly used techniques to verify the presence of a specific DNA sequence in a test sample. Automatic sample handling to concentrate and purify the sample prior to amplification is desirable both from the cost standpoint and from the standpoint of reducing the possibility of sample contamination. This paper explores the use of the dielectrophoretic force to manipulate DNA, Bacillus globigii spores, and Erwinia herbicola bacteria to provide concentration and purification as part of the sample handling functions in biological monitoring equipment. It was found that, for a typical microfabricated structure with electrode gaps of 30 µm operating at 5 V, concentration of the particles is very effective.
Brown, G.E.; McLain, B.J.
1994-01-01
The analysis of natural-water samples for antimony by automated-hydride atomic absorption spectrophotometry is described. Samples are prepared for analysis by addition of potassium and hydrochloric acid followed by an autoclave digestion. After the digestion, potassium iodide and sodium borohydride are added automatically. Antimony hydride (stibine) gas is generated, then swept into a heated quartz cell for determination of antimony by atomic absorption spectrophotometry. Precision and accuracy data are presented. Results obtained on standard reference water samples agree with means established by interlaboratory studies. Spike recoveries for actual samples range from 90 to 114 percent. Replicate analyses of water samples of varying matrices give relative standard deviations from 3 to 10 percent.
Planetary protection issues for sample return missions.
DeVincenzi, D L; Klein, H P
1989-01-01
Sample return missions from a comet nucleus and the Mars surface are currently under study in the US, USSR, and by ESA. Guidance on Planetary Protection (PP) issues is needed by mission scientists and engineers for incorporation into various elements of mission design studies. Although COSPAR has promulgated international policy on PP for various classes of solar system exploration missions, the applicability of this policy to sample return missions, in particular, remains vague. In this paper, we propose a set of implementing procedures to maintain the scientific integrity of these samples. We also propose that these same procedures will automatically assure that COSPAR-derived PP guidelines are achieved. The recommendations discussed here are the first step toward development of official COSPAR implementation requirements for sample return missions.
[A quickly methodology for drug intelligence using profiling of illicit heroin samples].
Zhang, Jianxin; Chen, Cunyi
2012-07-01
The aim of the paper was to evaluate the link between two heroin seizures using a descriptive method. The system involved the derivatization and gas chromatographic separation of samples followed by fully automatic data analysis and transfer to a database. Comparisons used the squared cosine function between two chromatograms treated as vectors. The method showed good discriminatory capabilities. The probability of false positives was extremely low. In conclusion, this method proved to be efficient and reliable, and appears suitable for estimating the links between illicit heroin samples.
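The squared-cosine comparison of two chromatograms treated as vectors reduces to a few lines; the peak-area vectors below are hypothetical placeholders.

```python
import numpy as np

def squared_cosine(chrom_a, chrom_b):
    """Squared cosine between two chromatograms treated as vectors of
    aligned peak areas; 1.0 means identical relative composition."""
    a, b = np.asarray(chrom_a, float), np.asarray(chrom_b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return cos ** 2

# hypothetical peak-area vectors for two seizures
link_score = squared_cosine([1.2, 0.4, 3.1, 0.0], [1.1, 0.5, 2.9, 0.1])
```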
Burnett, Andrew D; Fan, Wenhui; Upadhya, Prashanth C; Cunningham, John E; Hargreaves, Michael D; Munshi, Tasnim; Edwards, Howell G M; Linfield, Edmund H; Davies, A Giles
2009-08-01
Terahertz frequency time-domain spectroscopy has been used to analyse a wide range of samples containing cocaine hydrochloride, heroin and ecstasy, which are common drugs-of-abuse. We investigated real-world samples seized by law enforcement agencies, together with pure drugs-of-abuse, and pure drugs-of-abuse systematically adulterated in the laboratory to emulate real-world samples. In order to investigate the feasibility of automatic spectral recognition of such illicit materials by terahertz spectroscopy, principal component analysis was employed to cluster spectra of similar compounds.
Sonication standard laboratory module
Beugelsdijk, Tony; Hollen, Robert M.; Erkkila, Tracy H.; Bronisz, Lawrence E.; Roybal, Jeffrey E.; Clark, Michael Leon
1999-01-01
A standard laboratory module for automatically producing a solution of contaminants from a soil sample. A sonication tip agitates a solution containing the soil sample in a beaker while a stepper motor rotates the sample. An aspirator tube, connected to a vacuum, draws the upper layer of solution from the beaker through a filter and into another beaker. This beaker can thereafter be removed for analysis of the solution. The standard laboratory module encloses an embedded controller providing process control, status feedback information and maintenance procedures for the equipment and operations within the standard laboratory module.
Automatic photointerpretation for plant species and stress identification (ERTS-A1)
NASA Technical Reports Server (NTRS)
Swanlund, G. D. (Principal Investigator); Kirvida, L.; Johnson, G. R.
1973-01-01
The author has identified the following significant results. Automatic stratification of forested land from ERTS-1 data provides a valuable tool for resource management. The results are useful for wood product yield estimates, recreation and wildlife management, forest inventory, and forest condition monitoring. Automatic procedures based on both multispectral and spatial features are evaluated. With five classes, training and testing on the same samples, classification accuracy of 74 percent was achieved using the MSS multispectral features. When adding texture computed from 8 x 8 arrays, classification accuracy of 90 percent was obtained.
Howell, W.D.
1957-08-20
An apparatus for automatically recording the results of counting operations on trains of electrical pulses is described. The disadvantages of prior devices utilizing the two common methods of obtaining the count rate are overcome by this apparatus; in the case of time-controlled operation, the disclosed system automatically records any information stored by the scaler but not transferred to the printer at the end of the predetermined time-controlled operations and, in the case of count-controlled operation, provision is made to prevent a weak sample from occupying the apparatus for an excessively long period of time.
[Evaluation of Medical Instruments Cleaning Effect of Fluorescence Detection Technique].
Sheng, Nan; Shen, Yue; Li, Zhen; Li, Huijuan; Zhou, Chaoqun
2016-01-01
To compare the cleaning effect of an automatic cleaning machine and manual cleaning on coupling-type surgical instruments. A total of 32 cleaned medical instruments were randomly sampled from the Putuo District medical institutions' disinfection supply center. The Hygiena System SUREII ATP meter was used to monitor ATP values, and the cleaning effect was evaluated. The surface ATP values of the instruments cleaned manually were higher than those of the instruments cleaned by the automatic cleaning machine. Coupling-type surgical instruments are cleaned more effectively by the automatic cleaning machine before disinfection, and its application is recommended.
Water sampling using a drone at Yugama crater lake, Kusatsu-Shirane volcano, Japan
NASA Astrophysics Data System (ADS)
Terada, Akihiko; Morita, Yuichi; Hashimoto, Takeshi; Mori, Toshiya; Ohba, Takeshi; Yaguchi, Muga; Kanda, Wataru
2018-04-01
Remote sampling of water from Yugama crater lake at Kusatsu-Shirane volcano, Japan, was performed using a drone. Despite the high altitude of over 2000 m above sea level, our simple method was successful in retrieving a 250 mL sample of lake water. The procedure presented here is easy to follow for any researcher who operates a drone and requires no additional special apparatus. We compare the lake water sampled by drone with that sampled by hand at a site where regular samplings have previously been carried out. Chemical concentrations and stable isotope ratios are largely consistent between the two techniques. As the drone can fly automatically with the aid of navigation by the Global Navigation Satellite System (GNSS), it is possible to repeatedly sample lake water from the same location, even when entry to Yugama crater lake is restricted due to the risk of eruption.
Ging, Patricia B.
1999-01-01
Surface-water sampling protocols of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program specify samples for most properties and constituents to be collected manually in equal-width increments across a stream channel and composited for analysis. Single-point sampling with an automated sampler (autosampler) during storms was proposed in the upper part of the South-Central Texas NAWQA study unit, raising the question of whether property and constituent concentrations from automatically collected samples differ significantly from those in samples collected manually. Statistical (Wilcoxon signed-rank test) analyses of 3 to 16 paired concentrations for each of 26 properties and constituents from water samples collected using both methods at eight sites in the upper part of the study unit indicated that there were no significant differences in concentrations for dissolved constituents, other than calcium and organic carbon.
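A paired comparison of manual and automatic sample concentrations with the Wilcoxon signed-rank test, as used in this study, can be run as in the sketch below; the concentration values shown are invented placeholders, not NAWQA data.

```python
import numpy as np
from scipy.stats import wilcoxon

# hypothetical paired concentrations (mg/L) from manual equal-width-increment
# compositing and from the single-point autosampler at the same site and storm
manual = np.array([48.0, 52.5, 40.1, 61.3, 55.7, 47.9])
auto_ = np.array([47.2, 53.0, 41.5, 60.2, 56.6, 48.4])

stat, p = wilcoxon(manual, auto_)
print(f"Wilcoxon signed-rank: W={stat:.1f}, p={p:.3f}")
# p above the chosen significance level -> no evidence the two methods differ
```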
Manufacturing Methods and Technology Project Summary Reports
1984-12-01
are used. The instrument chosen provides a convenient method of artificially aging a propellant sample while automatically analyzing for evolved oxides...and aging. Shortly after the engineering sample run, a change in REMBASS requirements eliminated the crystal high-shock requirements. This resulted...material with minimum outgassing in a precision vacuum QXFF. Minimal outgassing reduces aging in the finished unit. A fixture was also developed to
NASA Astrophysics Data System (ADS)
Jomaa, Seifeddine; Jiang, Sanyuan; Yang, Xiaoqiang; Rode, Michael
2016-04-01
It is known that good evaluation and prediction of surface water pollution is mainly limited by the monitoring strategy and by the capability of the hydrological water quality model to reproduce the internal processes. To this end, a compromise sampling frequency, which can reflect the dynamic behaviour of leached nutrient fluxes responding to changes in land use, agricultural practices and point sources, and an appropriate process-based water quality model are required. The objective of this study was to test the identification of hydrological water quality model parameters (nitrogen and phosphorus) under two different monitoring strategies: (1) a regular grab-sampling approach and (2) regular grab-sampling with additional monitoring during hydrological events using automatic samplers. First, the semi-distributed hydrological water quality model HYPE (Hydrological Predictions for the Environment) was successfully calibrated for the period 1994-1998 for discharge (NSE = 0.86), nitrate-N (lowest NSE for nitrate-N load = 0.69), particulate phosphorus and soluble phosphorus in the Selke catchment (463 km², central Germany) using the regular grab-sampling approach (biweekly to monthly for nitrogen and phosphorus concentrations). Second, the model was successfully validated for the period 1999-2010 for discharge, nitrate-N, particulate phosphorus and soluble phosphorus (lowest NSE for soluble phosphorus load = 0.54). Results showed that when additional sampling during events with a random grab-sampling approach was used (period 2011-2013), the hydrological model could reproduce only the nitrate-N and soluble phosphorus concentrations reasonably well; it could not represent the measured particulate phosphorus. This reflects the importance of suspended sediment in increasing particulate phosphorus concentrations during hydrological events. The HYPE model could reproduce the total phosphorus during the period 2011-2013 only when the sediment transport-related model parameters were re-identified, taking the automatic sampling during high-flow conditions into account.
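The Nash-Sutcliffe efficiency (NSE) quoted above for discharge and nutrient loads is one minus the ratio of residual variance to the variance of the observations; a minimal implementation:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    o, s = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((o - s) ** 2) / np.sum((o - o.mean()) ** 2)
```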
Song, Yongxin; Li, Mengqi; Pan, Xinxiang; Wang, Qi; Li, Dongqing
2015-02-01
An electrokinetic microfluidic chip is developed to detect and sort target cells by size from human blood samples. Target-cell detection is achieved by a differential resistive pulse sensor (RPS) based on the size difference between the target cell and other cells. Once a target cell is detected, the detected RPS signal will automatically actuate an electromagnetic pump built in a microchannel to push the target cell into a collecting channel. This method was applied to automatically detect and sort A549 cells and T-lymphocytes from a peripheral fingertip blood sample. The viability of A549 cells sorted in the collecting well was verified by Hoechst33342 and propidium iodide staining. The results show that as many as 100 target cells per minute can be sorted out from the sample solution and thus is particularly suitable for sorting very rare target cells, such as circulating tumor cells. The actuation of the electromagnetic valve has no influence on RPS cell detection and the consequent cell-sorting process. The viability of the collected A549 cell is not impacted by the applied electric field when the cell passes the RPS detection area. The device described in this article is simple, automatic, and label-free and has wide applications in size-based rare target cell sorting for medical diagnostics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Hoffman, Steven J; Justicz, Victoria
2016-07-01
To develop and validate a method for automatically quantifying the scientific quality and sensationalism of individual news records. After retrieving 163,433 news records mentioning the Severe Acute Respiratory Syndrome (SARS) and H1N1 pandemics, a maximum entropy model for inductive machine learning was used to identify relationships among 500 randomly sampled news records that correlated with systematic human assessments of their scientific quality and sensationalism. These relationships were then computationally applied to automatically classify 10,000 additional randomly sampled news records. The model was validated by randomly sampling 200 records and comparing human assessments of them to the computer assessments. The computer model correctly assessed the relevance of 86% of news records, the quality of 65% of records, and the sensationalism of 73% of records, as compared to human assessments. Overall, the scientific quality of SARS and H1N1 news media coverage had potentially important shortcomings, but coverage was not overly sensationalized. Coverage improved slightly between the two pandemics. Automated methods can evaluate news records faster, cheaper, and possibly better than humans. The specific procedure implemented in this study can at the very least identify subsets of news records that are far more likely to have particular scientific and discursive qualities. Copyright © 2016 Elsevier Inc. All rights reserved.
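Maximum entropy text classification of this kind is commonly implemented as multinomial logistic regression over n-gram features. The sketch below uses scikit-learn and invented placeholder records; it is not the authors' pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# hypothetical stand-in for the 500 human-coded records (text + quality label)
texts = [
    "randomized controlled trial shows vaccine reduces infection risk",
    "miracle cure will wipe out pandemic overnight, experts stunned",
    "peer reviewed study estimates transmission rate with confidence interval",
    "shocking secret virus truth they do not want you to know",
]
labels = ["high", "low", "high", "low"]

# multinomial logistic regression is the usual maximum-entropy formulation
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)
print(model.predict(["observational study reports adjusted risk estimates"]))
```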
RaPToRS Sample Delivery System
NASA Astrophysics Data System (ADS)
Henchen, Robert; Shibata, Kye; Krieger, Michael; Pogozelski, Edward; Padalino, Stephen; Glebov, Vladimir; Sangster, Craig
2010-11-01
At various labs (NIF, LLE, NRL), activated material samples are used to measure reaction properties. The Rapid Pneumatic Transport of Radioactive Samples (RaPToRS) system quickly and safely moves these radioactive samples through a closed PVC tube via airflow. The carrier travels from the reaction chamber to the control and analysis station, pneumatically braking at the outlet. A reversible multiplexer routes samples from various locations near the shot chamber to the analysis station. The multiplexer also allows users to remotely load unactivated samples without manually approaching the reaction chamber. All elements of the system (pneumatic drivers, flow control valves, optical position sensors, multiplexers, Geiger counters, and release gates at the analysis station) can be controlled manually or automatically using a custom LabVIEW interface. A prototype is currently operating at NRL in Washington, DC. Prospective facilities for RaPToRS systems include LLE and NIF.
Scott, Anna E.; Vasilescu, Dragos M.; Seal, Katherine A. D.; Keyes, Samuel D.; Mavrogordato, Mark N.; Hogg, James C.; Sinclair, Ian; Warner, Jane A.; Hackett, Tillie-Louise; Lackie, Peter M.
2015-01-01
Background: Understanding the three-dimensional (3-D) micro-architecture of lung tissue can provide insights into the pathology of lung disease. Micro computed tomography (µCT) has previously been used to elucidate lung 3D histology and morphometry in fixed samples that have been stained with contrast agents or air inflated and dried. However, non-destructive microstructural 3D imaging of formalin-fixed paraffin embedded (FFPE) tissues would facilitate retrospective analysis of extensive archives of FFPE lung samples with linked clinical data. Methods: FFPE human lung tissue samples (n = 4) were scanned using a Nikon metrology µCT scanner. Semi-automatic techniques were used to segment the 3D structure of airways and blood vessels. Airspace size (mean linear intercept, Lm) was measured on µCT images and on matched histological sections from the same FFPE samples imaged by light microscopy to validate µCT imaging. Results: The µCT imaging protocol provided contrast between tissue and paraffin in FFPE samples (15 mm × 7 mm). Resolution (voxel size 6.7 µm) in the reconstructed images was sufficient for semi-automatic image segmentation of airways and blood vessels as well as quantitative airspace analysis. The scans were also used to scout for regions of interest, enabling time-efficient preparation of conventional histological sections. The Lm measurements from µCT images were not significantly different from those from matched histological sections. Conclusion: We demonstrated how non-destructive imaging of routinely prepared FFPE samples by laboratory µCT can be used to visualize and assess the 3D morphology of the lung, including by morphometric analysis. PMID:26030902
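Mean linear intercept (Lm) can be estimated from a segmented µCT slice by dividing the total chord length in airspace by the number of air-tissue intercepts along test lines. The sketch below uses horizontal lines only and is not the morphometry code used in the study.

```python
import numpy as np

def mean_linear_intercept(air_mask, pixel_size_um):
    """Mean linear intercept (Lm) from a 2-D binary mask where True = airspace.
    Horizontal test lines only; an illustrative sketch."""
    air_mask = np.asarray(air_mask, dtype=bool)
    # air-to-tissue transitions along each row = intercepts with alveolar walls
    transitions = np.sum(air_mask[:, :-1] & ~air_mask[:, 1:])
    if transitions == 0:
        return np.nan
    air_path_length = np.sum(air_mask) * pixel_size_um   # total chord length in air
    return air_path_length / transitions
```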
Effects of local emission sources on the acidification of rainwater in an industrial city in Taiwan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung-Shin Yuan; Der-Yuan Wu
1996-12-31
This study investigated the acidification of precipitation in an industrial city on Taiwan Island. The purpose of this study is twofold. The first aim is to characterize the status of acid precipitation around the industrial city. Rainwater samples were collected by automatic rainwater samplers located at five sampling sites which covered the entire city. The second aim is to investigate the potential sources of acidic species in the acid rainwater. Further study was undertaken to ascertain the effects of local emissions as well as long-range transport on the acidification of precipitation. Investigation of acid rain on the island of Taiwan has been conducted since 1984. Most of these studies were short-term and/or large-scale investigations. Long-term sampling of acid rain in a heavily polluted region had not previously been investigated. In this investigation, Kaohsiung was selected as the city for intensive acid rain sampling since it is the largest industrial city as well as the largest harbor in Taiwan. Both dry and wet acid samples were collected daily by the automatic rainwater samplers. Major cations (H⁺, NH₄⁺, K⁺, Ca²⁺, and Mg²⁺), anions (F⁻, Cl⁻, NO₃⁻, and SO₄²⁻), and the conductivity of acid samples were measured simultaneously. Both pH value and conductivity were measured on site. During the period of investigation, the 325 collected rainwater samples showed an average pH value of 5.2, with a range of 3.1 to 6.3. This investigation revealed that emissions from local sources such as power plants, petrochemical plants, and cement plants play important roles in the acidification of rainwater in this industrial city in Taiwan.
Inexpensive portable drug detector
NASA Technical Reports Server (NTRS)
Dimeff, J.; Heimbuch, A. H.; Parker, J. A.
1977-01-01
Inexpensive, easy-to-use, self-scanning, self-calibrating, portable unit automatically graphs fluorescence spectrum of drug sample. Device also measures rate of movement through chromatographic column for forensic and medical testing.
NASA Astrophysics Data System (ADS)
Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James
2016-04-01
In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important to characterize the relationship between winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (¹⁸O, ²H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km²) in central Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly) and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom and along an elevational transect. Additionally, the analysis of snow meltwater is of importance. As the collection of snow meltwater samples in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system, which measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon ECRN-100 High Resolution Rain Gauges as standard components that allow monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are collected regularly and later analyzed for their isotopic composition in the lab. Snowmelt events as well as system status can be monitored in real time. In our presentation we describe the automatic snow lysimeter system and present initial results from field tests in winter 2015/2016 under natural conditions at an experimental field site. Fully functional deployment at a forested and an open field location in the Erlenbach subcatchment (0.7 km²) is envisaged for winter 2016/2017. Godsey, S.E., J.W. Kirchner and C.L. Tague, Effects of changes in winter snowpacks on summer low flows: case studies in the Sierra Nevada, California, USA, Hydrological Processes, 28, 5048-5064, doi: 10.1002/hyp.9943, 2014.
Parra-Sánchez, Manuel; Zakariya-Yousef Breval, Ismail; Castro Méndez, Carmen; García-Rey, Silvia; Loza Vazquez, Ana; Úbeda Iglesias, Alejandro; Macías Guerrero, Desiree; Romero Mejías, Ana; León Gil, Cristobal; Martín-Mazuelos, Estrella
2017-08-01
Testing for Candida albicans germ-tube antibodies with an IFA IgG assay (CAGTA) is used to detect invasive candidiasis. However, most suitable assays lack automation and rapid single-sample testing. The CAGTA assay was adapted to an automatic monotest system, the invasive candidiasis (CAGTA) VirClia® IgG monotest (VirClia®), a chemiluminescence assay with ready-to-use reagents that provides a rapid, objective result. The CAGTA assay was compared with the automatic VirClia® monotest assay in order to establish the diagnostic reliability, accuracy, and usefulness of this method. A prospective study with 361 samples from 179 non-neutropenic critically ill adult patients was conducted, including 21 patients with candidemia, 18 with intra-abdominal candidiasis, 84 with Candida spp. colonization, and 56 with culture-negative samples, as well as samples from ten healthy subjects. Overall agreement between the two assays (CAGTA and VirClia®) was 85.3%. These assays were compared with the gold-standard method to determine the sensitivity and specificity as well as the positive and negative predictive values. In patients with candidemia, values for the CAGTA and VirClia® assays were 76.2 versus 85.7%, 80.3 versus 75.8%, 55.2 versus 52.9%, and 91.4 versus 94.3%, respectively. The corresponding values in patients with intra-abdominal candidiasis were 61.1 versus 66.7%, 80.3 versus 75.8%, 45.8 versus 42.9%, and 88.3 versus 89.3%, respectively. No differences were found according to the species of Candida isolated in culture, except for Candida albicans and C. parapsilosis, for which VirClia® performed better than CAGTA. According to these results, the automated VirClia® assay is a reliable, rapid, and very easy to perform technique as a tool for the diagnosis of invasive candidiasis.
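The sensitivity, specificity, and positive and negative predictive values reported above follow from the standard 2x2 confusion-matrix definitions; the counts in the example are hypothetical.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from 2x2 confusion-matrix counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, ppv, npv

# hypothetical counts for a CAGTA-type assay against the reference standard
print(diagnostic_metrics(tp=18, fp=16, fn=3, tn=50))
```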
Fast and automatic thermographic material identification for the recycling process
NASA Astrophysics Data System (ADS)
Haferkamp, Heinz; Burmester, Ingo
1998-03-01
Within the framework of the future closed-loop recycling process, the automatic and economical sorting of plastics is a decisive element. The identification and sorting systems currently available are not yet suitable for sorting technical plastics, since essential requirements, such as high recognition reliability and high identification rates for the variety of technical plastics, cannot be guaranteed. Therefore the Laser Zentrum Hannover e.V., in cooperation with Hoerotron GmbH and Preussag Noell GmbH, has carried out investigations on a rapid thermographic and laser-supported material-identification system for automatic material-sorting systems. The automatic identification of different engineering plastics coming from electronic or automotive waste is possible. Identification rates of up to 10 parts per second are made possible by fast IR line scanners. The procedure is based on the following principle: within a few milliseconds a spot on the relevant sample is heated by a CO2 laser. The sample's specific chemical and physical material properties cause different temperature distributions on its surface, which are measured by a fast IR line-scan system. This 'thermal impulse response' has to be analyzed by means of a computer system. Investigations have shown that it is possible to analyze more than 18 different sorts of plastics at a frequency of 10 Hz. Crucial for the development of such a system are the rapid processing of imaging data, the minimization of interference caused by oscillating sample geometries, and handling the wide range of possible additives in the plastics in question. One possible application area is the sorting of plastics from automotive and electronic waste recycling.
Solid Phase Microextraction and Related Techniques for Drugs in Biological Samples
Moein, Mohammad Mahdi; Said, Rana; Bassyouni, Fatma
2014-01-01
In drug discovery and development, the quantification of drugs in biological samples is an important task for the determination of the physiological performance of the investigated drugs. After sampling, the next step in the analytical process is sample preparation. Because of the low concentration levels of drug in plasma and the variety of the metabolites, the selected extraction technique should be virtually exhaustive. Recent developments in sample handling techniques are directed, on the one hand, toward automation and online coupling of sample preparation units. The primary objective of this review is to present the recent developments in microextraction sample preparation methods for analysis of drugs in biological fluids. Microextraction techniques allow for less consumption of solvent, reagents, and packing materials, and small sample volumes can be used. In this review the use of solid phase microextraction (SPME), microextraction in packed sorbent (MEPS), and stir-bar sorptive extraction (SBSE) in drug analysis will be discussed. In addition, the use of new sorbents such as monoliths and molecularly imprinted polymers will be presented. PMID:24688797
Tavares, Inês M; Laan, Ellen T M; Nobre, Pedro J
2017-06-01
Cognitive-affective factors contribute to female sexual dysfunctions, defined as clinically significant difficulties in the ability to respond sexually or to experience sexual pleasure. Automatic thoughts and affect presented during sexual activity are acknowledged as maintenance factors for these difficulties. However, there is a lack of studies on the influence of these cognitive-affective dimensions regarding female orgasm. To assess the role of automatic thoughts and affect during sexual activity in predicting female orgasm occurrence and to investigate the mediator role of these variables in the relation between sexual activity and orgasm occurrence. Nine hundred twenty-six sexually active heterosexual premenopausal women reported on frequency of sexual activities and frequency of orgasm occurrence, cognitive factors, and social desirability. Participants completed the Sexual Modes Questionnaire-Automatic Thoughts Subscale, the Positive and Negative Affect Schedule, and the Socially Desirable Response Set. Multiple linear regressions and mediation analyses were performed, controlling for the effect of covariates such as social desirability, sociodemographic and medical characteristics, and relationship factors. The main outcome measurement was orgasm frequency as predicted and mediated by automatic thoughts and affect experienced during sexual activities. The presence of failure thoughts and lack of erotic thoughts during sexual activity significantly and negatively predicted female orgasm, whereas positive affect experienced during sexual activity significantly and positively predicted female orgasm. Moreover, negative automatic thoughts and positive affect during sexual activity were found to mediate the relation between sexual activity and female orgasm occurrence. These data suggest that the cognitive aspects of sexual involvement are critical to enhancing female orgasm experience and can aid the development of strategies that contemplate the central role of automatic thoughts and of positive emotions experienced during sexual activity. Data were not collected face to face, which constitutes a strength of this study, because it is known that social desirability is lower in self-administered online questionnaires compared with traditional paper-and-pencil questionnaires, particularly for more sensitive sexual issues. The fact that the sample was composed of heterosexual, premenopausal, and relatively young women demands some caution regarding generalization of the present results. The findings support the contribution of cognitive and affective factors to female orgasmic functioning. It is recommended that future research confirm these findings with other samples, particularly clinical samples of women with orgasmic difficulties. Tavares IM, Laan ETM, Nobre PJ. Cognitive-Affective Dimensions of Female Orgasm: The Role of Automatic Thoughts and Affect During Sexual Activity. J Sex Med 2017;14:818-828. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
Director, Operational Test and Evaluation FY 2004 Annual Report
2004-01-01
... Space Based Radar (SBR), Sensor Fuzed Weapon (SFW) P3I (CBU-97/B), Small Diameter Bomb (SDB), Secure Mobile Anti-Jam Reliable Tactical Terminal ... detection, identification, and sampling capability for both fixed-site and mobile operations. The system must automatically detect and identify up to ten ... staffing within the Services. SYSTEM DESCRIPTION AND MISSION: The Services envision JCAD as a hand-held device that automatically detects, identifies, and ...
Applying Independent Verification and Validation to Automatic Test Equipment
NASA Technical Reports Server (NTRS)
Calhoun, Cynthia C.
1997-01-01
This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur, or of all development and maintenance items that can be validated and verified during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle is described.
Mosaic construction, processing, and review of very large electron micrograph composites
NASA Astrophysics Data System (ADS)
Vogt, Robert C., III; Trenkle, John M.; Harmon, Laurel A.
1996-11-01
A system of programs is described for acquisition, mosaicking, cueing and interactive review of large-scale transmission electron micrograph composite images. This work was carried out as part of a final-phase clinical analysis study of a drug for the treatment of diabetic peripheral neuropathy. More than 500 nerve biopsy samples were prepared, digitally imaged, processed, and reviewed. For a given sample, typically 1000 or more 1.5-megabyte frames were acquired, for a total of between 1 and 2 gigabytes of data per sample. These frames were then automatically registered and mosaicked together into a single virtual image composite, which was subsequently used to perform automatic cueing of axons and axon clusters, as well as review and marking by qualified neuroanatomists. Statistics derived from the review process were used to evaluate the efficacy of the drug in promoting regeneration of myelinated nerve fibers. This effort demonstrates a new, entirely digital capability for doing large-scale electron micrograph studies, in which all of the relevant specimen data can be included at high magnification, as opposed to simply taking a random sample of discrete locations. It opens up the possibility of a new era in electron microscopy, one which broadens the scope of questions that this imaging modality can be used to answer.
NASA Astrophysics Data System (ADS)
Zhang, Chi; Reufer, Mathias; Gaudino, Danila; Scheffold, Frank
2017-11-01
Diffusing wave spectroscopy (DWS) can be employed as an optical rheology tool with numerous applications for studying the structure, dynamics and linear viscoelastic properties of complex fluids, foams, glasses and gels. To carry out DWS measurements, one first needs to quantify the static optical properties of the sample under investigation, i.e. the transport mean free path l* and the absorption length l_a. In the absence of absorption this can be done by comparing the diffuse optical transmission to a calibration sample whose l* is known. Performing this comparison, however, is cumbersome, time consuming, and prone to mistakes by the operator. Moreover, already weak absorption can lead to significant errors. In this paper, we demonstrate the implementation of an automated approach, based on which the DWS measurement procedure can be simplified significantly. By comparison with a comprehensive set of calibration measurements we cover the entire parameter space relating measured count rates (CR_t, CR_b) to (l*, l_a). Based on this approach we can determine l* and l_a of an unknown sample accurately, thus making the additional measurement of a calibration sample obsolete. We illustrate the use of this approach by monitoring the coarsening of a commercially available shaving foam with DWS.
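In practice the calibration set amounts to a map from measured count-rate pairs to optical properties, which can then be inverted by interpolation for an unknown sample. The sketch below illustrates that lookup with an entirely hypothetical calibration table; the numbers, units, and the simple linear interpolation are assumptions, not the authors' calibration data or procedure.

```python
# Sketch of inverting a DWS calibration map: given measured transmission and
# backscattering count rates (CR_t, CR_b), look up (l*, l_a) by interpolating
# over a pre-measured calibration grid. Values and units are assumptions.
import numpy as np
from scipy.interpolate import griddata

# Hypothetical calibration table: each row is (CR_t, CR_b, l* in mm, l_a in mm).
calibration = np.array([
    [100.0, 90.0, 0.10,  40.0],
    [140.0, 85.0, 0.15,  60.0],
    [180.0, 70.0, 0.20,  80.0],
    [150.0, 60.0, 0.18, 120.0],
    [110.0, 65.0, 0.12, 150.0],
    [200.0, 95.0, 0.25,  55.0],
])

def lookup_optical_properties(cr_t, cr_b):
    """Interpolate (l*, l_a) from measured count rates."""
    points = calibration[:, :2]          # (CR_t, CR_b) pairs
    l_star = griddata(points, calibration[:, 2], (cr_t, cr_b), method="linear")
    l_a = griddata(points, calibration[:, 3], (cr_t, cr_b), method="linear")
    return float(l_star), float(l_a)

print(lookup_optical_properties(150.0, 75.0))
```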
Hülsheger, Ute R; Lang, Jonas W B; Schewe, Anna F; Zijlstra, Fred R H
2015-03-01
We investigated the relationship between deep acting, automatic regulation and customer tips with 2 different study designs. The first study was a daily diary study using a sample of Dutch waiters and taxi-drivers and assessed the link of employees' daily self-reported levels of deep acting and automatic regulation with the amount of tips provided by customers (N = 166 measurement occasions nested in 34 persons). Whereas deep acting refers to deliberate attempts to modify felt emotions and involves conscious effort, automatic regulation refers to automated emotion regulatory processes that result in the natural experience of desired emotions and do not involve deliberate control and effort. Multilevel analyses revealed that both types of emotion regulation were positively associated with customer tips. The second study was an experimental field study using a sample of German hairdressers (N = 41). Emotion regulation in terms of both deep acting and automatic regulation was manipulated using a brief self-training intervention and daily instructions to use cognitive change and attentional deployment. Results revealed that participants in the intervention group received significantly more tips than participants in the control group. PsycINFO Database Record (c) 2015 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Leifer, R.; Sommers, K. G.; Guggenheim, S. F.; Fisenne, I.
1981-02-01
An ultra-clean, low-volume gas sampling system (CLASS), flown aboard a high-altitude aircraft (WB-57F) to provide information on stratospheric trace gases, is presented. Attention is given to the instrument design and the electronic control design. Since remote operation is mandatory on the WB-57F, a servo pressure transducer, an electrical pressure switch for automatic shutdown, and a mechanical safety relief valve were installed on the sampling manifold, as indicated on the CLASS flow chart. The electronic control system consists of hermetically sealed solid-state timers, relays, and a stepping switch for controlling the compressor pump and solenoid valves. In designing the automatic control system, vibration, shock, acceleration, extremely low temperature, and aircraft safety were important considerations. CLASS was tested on three separate occasions, and tables of analytical data from these flights are presented. Readiness capability was demonstrated when the Mount St. Helens eruption plume of May 18, 1980, was intercepted, and it was concluded that no large injection of Rn-222 entered the stratosphere or troposphere from the eruption.
G-DYN Multibody Dynamics Engine
NASA Technical Reports Server (NTRS)
Acikmese, Behcet; Blackmore, James C.; Broderick, Daniel
2011-01-01
G-DYN is a multi-body dynamic simulation software engine that automatically assembles and integrates equations of motion for arbitrarily connected multibody dynamic systems. The algorithm behind G-DYN is based on a primal-dual formulation of the dynamics that captures the position and velocity vectors (primal variables) of each body and the interaction forces (dual variables) between bodies, which are particularly useful for control and estimation analysis and synthesis. It also takes full advantage of the sparse matrix structure resulting from the system dynamics to numerically integrate the equations of motion efficiently. Furthermore, the dynamic model for each body can easily be replaced without re-deriving the overall equations of motion, and the assembly of the equations of motion is done automatically. G-DYN proved an essential software tool in the simulation of spacecraft systems used for small celestial body surface sampling, specifically in simulating touch-and-go (TAG) maneuvers of a robotic sampling system from a comet and asteroid. It is used extensively in validating mission concepts for small body sample return, such as the Comet Odyssey and Galahad New Frontiers proposals.
A methodology for the semi-automatic digital image analysis of fragmental impactites
NASA Astrophysics Data System (ADS)
Chanou, A.; Osinski, G. R.; Grieve, R. A. F.
2014-04-01
A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
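One of the listed measurements, the nearest-neighbor distance between clasts, is simple to compute once clast centroids have been extracted from the segmented scans. The sketch below assumes hypothetical centroid coordinates; in the described workflow these would come from the ImageJ segmentation, which is not reproduced here.

```python
# Sketch of one of the measurements listed above: nearest-neighbour distances
# (NnD) between clast centroids, computed here from hypothetical centroid
# coordinates standing in for the output of the ImageJ segmentation.
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical clast centroids in centimetres on a 10 x 15 cm scanned surface.
rng = np.random.default_rng(42)
centroids = rng.uniform(low=[0, 0], high=[10, 15], size=(60, 2))

tree = cKDTree(centroids)
# k=2 because the closest neighbour of each point is the point itself.
dists, _ = tree.query(centroids, k=2)
nnd = dists[:, 1]
print(f"mean NnD = {nnd.mean():.2f} cm, min = {nnd.min():.2f} cm")
```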
A Radio-Map Automatic Construction Algorithm Based on Crowdsourcing
Yu, Ning; Xiao, Chenxian; Wu, Yinfeng; Feng, Renjian
2016-01-01
Traditional radio-map-based localization methods need to sample a large number of location fingerprints offline, which requires huge amount of human and material resources. To solve the high sampling cost problem, an automatic radio-map construction algorithm based on crowdsourcing is proposed. The algorithm employs the crowd-sourced information provided by a large number of users when they are walking in the buildings as the source of location fingerprint data. Through the variation characteristics of users’ smartphone sensors, the indoor anchors (doors) are identified and their locations are regarded as reference positions of the whole radio-map. The AP-Cluster method is used to cluster the crowdsourced fingerprints to acquire the representative fingerprints. According to the reference positions and the similarity between fingerprints, the representative fingerprints are linked to their corresponding physical locations and the radio-map is generated. Experimental results demonstrate that the proposed algorithm reduces the cost of fingerprint sampling and radio-map construction and guarantees the localization accuracy. The proposed method does not require users’ explicit participation, which effectively solves the resource-consumption problem when a location fingerprint database is established. PMID:27070623
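The clustering step that extracts representative fingerprints can be sketched with an off-the-shelf affinity propagation implementation; the synthetic RSSI values, the number of access points, and the use of scikit-learn are assumptions for illustration rather than the authors' implementation.

```python
# Minimal sketch of the fingerprint-clustering step: crowdsourced RSSI
# fingerprints are grouped with affinity propagation and the cluster exemplars
# are kept as representative fingerprints. Data below are synthetic.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
# Synthetic RSSI fingerprints (rows) over 4 access points (columns), in dBm.
fingerprints = np.vstack([
    rng.normal(loc=[-40, -70, -80, -60], scale=2.0, size=(30, 4)),
    rng.normal(loc=[-75, -45, -65, -85], scale=2.0, size=(30, 4)),
    rng.normal(loc=[-60, -60, -40, -70], scale=2.0, size=(30, 4)),
])

ap = AffinityPropagation(random_state=0).fit(fingerprints)
representatives = fingerprints[ap.cluster_centers_indices_]
print(f"{len(representatives)} representative fingerprints "
      f"from {len(fingerprints)} crowdsourced samples")
```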
Furukawa, Makoto; Takagai, Yoshitaka
2016-10-04
Online solid-phase extraction (SPE) coupled with inductively coupled plasma mass spectrometry (ICPMS) is a useful tool for automatic sequential analysis. However, it cannot simultaneously quantify the analytical targets and their recovery percentages (R%) from a single injection of sample. We propose a system that simultaneously acquires both data in a single sample injection. The flow line of the online solid-phase extraction is divided into main and split flows. The split flow line (i.e., bypass line), which circumvents the SPE column, was placed on the main flow line. Under program-controlled switching of the automatic valve, the ICPMS sequentially measures the targets in a sample before and after column preconcentration and determines the target concentrations and the R% on the SPE column. This paper describes the system development and two demonstrations of its analytical significance, namely the determination of ultratrace amounts of radioactive strontium (90Sr) using a commercial Sr-trap resin and the multielement adsorbability of the SPE column. This system is applicable to other flow analyses and detectors in online solid-phase extraction.
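With bypass and post-column signals for the same injection, a generic way to express the on-column recovery is the ratio of the eluted signal to the loaded amount, corrected for the preconcentration factor. The sketch below is that generic arithmetic with invented numbers; it is not necessarily the exact expression used by the authors.

```python
# Illustrative arithmetic for the split-flow scheme: the bypass signal gives the
# target level before the SPE column, the post-column signal gives the eluted
# amount, and their ratio (corrected for preconcentration) yields the on-column
# recovery (R%). Variable names and numbers are assumptions for illustration.

def recovery_percent(signal_bypass, signal_eluate, preconcentration_factor):
    """R% = eluted signal / (bypass signal * preconcentration factor) * 100."""
    return 100.0 * signal_eluate / (signal_bypass * preconcentration_factor)

# Hypothetical ICPMS intensities (counts per second) and a 20-fold preconcentration.
print(f"R% = {recovery_percent(1.2e4, 2.2e5, 20.0):.1f}")
```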
NASA Astrophysics Data System (ADS)
Kawaguchi, S.; Takemoto, M.; Osaka, K.; Nishibori, E.; Moriyoshi, C.; Kubota, Y.; Kuroiwa, Y.; Sugimoto, K.
2017-08-01
In this study, we developed a user-friendly automatic powder diffraction measurement system for Debye-Scherrer geometry using a capillary sample at beamline BL02B2 of SPring-8. The measurement system consists of six one-dimensional solid-state (MYTHEN) detectors, a compact auto-sampler, wide-range temperature control systems, and a gas handling system. This system enables automatic measurement of the temperature dependence of the diffraction patterns for multiple samples. We introduced two measurement modes in the MYTHEN system and developed new attachments for the sample environment, such as a gas handling system. The measurement modes and the attachments offer in situ and/or time-resolved measurements in an extended temperature range between 25 and 1473 K and under various gas atmospheres and pressures. The results of commissioning and performance measurements using reference materials (NIST CeO2 674b and Si 640c), V2O3 and Ti2O3, and a nanoporous coordination polymer are presented.
System for sensing droplet formation time delay in a flow cytometer
Van den Engh, Ger; Esposito, Richard J.
1997-01-01
A droplet flow cytometer system which includes a system to optimize the droplet formation time delay, based on conditions actually experienced, includes an automatic droplet sampler which rapidly moves a plurality of containers stepwise through the droplet stream while simultaneously adjusting the droplet time delay. Through the system, sampling of the actual substance to be processed can be used to minimize the effect of substance variations on the determination of which time delay is optimal. Analysis such as cell counting and the like may be conducted manually or automatically and input to a time delay adjustment which may then act with analysis equipment to revise the time delay estimate actually applied during processing. The automatic sampler can be controlled through a microprocessor and appropriate programming to bracket an initial droplet formation time delay estimate. When maximal counts, determined through volume, weight, or other types of analysis, exist in the containers, the increment may then be reduced for a more accurate ultimate setting. This may be accomplished while actually processing the sample without interruption.
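The bracket-and-refine procedure described in the patent abstract amounts to a coarse-to-fine one-dimensional search over candidate delays. The sketch below illustrates that idea; the container-count model, the step sizes, and the number of passes are invented for the example and do not come from the patent.

```python
# Sketch of the bracketing strategy: containers collect droplets at delays
# bracketing the current estimate, the delay whose container shows the highest
# recovered count is kept, and the bracket step is reduced for the next pass.
# count_in_container is a stand-in for the actual volume/weight analysis.
import numpy as np

TRUE_DELAY = 123.7   # droplet-formation delay (in drop periods), unknown in practice

def count_in_container(delay):
    """Hypothetical sorted-particle count as a function of the applied delay."""
    return 1000.0 * np.exp(-0.5 * ((delay - TRUE_DELAY) / 0.8) ** 2)

def optimize_delay(estimate, step=4.0, passes=3, n_containers=7):
    for _ in range(passes):
        delays = estimate + step * (np.arange(n_containers) - n_containers // 2)
        counts = [count_in_container(d) for d in delays]
        estimate = delays[int(np.argmax(counts))]
        step /= 4.0                      # tighten the bracket each pass
    return estimate

print(f"optimal delay estimate: {optimize_delay(120.0):.2f}")
```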
NASA Astrophysics Data System (ADS)
Romagnan, Jean Baptiste; Aldamman, Lama; Gasparini, Stéphane; Nival, Paul; Aubert, Anaïs; Jamet, Jean Louis; Stemmann, Lars
2016-10-01
The present work aims to show that high-throughput imaging systems can be useful to estimate mesozooplankton community size and taxonomic descriptors that can be the base for consistent large-scale monitoring of plankton communities. Such monitoring is required by the European Marine Strategy Framework Directive (MSFD) in order to ensure the Good Environmental Status (GES) of European coastal and offshore marine ecosystems. Time- and cost-effective automatic techniques are of high interest in this context. An imaging-based protocol has been applied to a high-frequency time series (on average every second day between April 2003 and April 2004) of zooplankton obtained at a coastal site in the NW Mediterranean Sea, Villefranche Bay. One hundred eighty-four net-collected mesozooplankton samples were analysed with a ZooScan and an associated semi-automatic classification technique. The constitution of a learning set designed to maximize copepod identification, with more than 10,000 objects, enabled the automatic sorting of copepods with an accuracy of 91% (true positives) and a contamination of 14% (false positives). Twenty-seven samples were then chosen from the total copepod time series for detailed visual sorting of copepods after automatic identification. This method enabled the description of the dynamics of two well-known copepod species, Centropages typicus and Temora stylifera, and 7 other taxonomically broader copepod groups, in terms of size, biovolume and abundance-size distributions (size spectra). Also, total copepod size spectra underwent significant changes during the sampling period. These changes could be partially related to changes in the copepod assemblage taxonomic composition and size distributions. This study shows that the use of high-throughput imaging systems is of great interest to extract relevant coarse (i.e. total abundance, size structure) and detailed (i.e. selected species dynamics) descriptors of zooplankton dynamics. Innovative zooplankton analyses are therefore proposed and open the way for further development of zooplankton community indicators of changes.
Bayly, John G.; Booth, Ronald J.
1977-01-01
An apparatus for monitoring the concentration of a vapor, such as heavy water, having at least one narrow bandwidth in its absorption spectrum, in a sample gas such as air. The air is drawn into a chamber in which the vapor content is measured by means of its radiation absorption spectrum. High sensitivity is obtained by modulating the wavelength at a relatively high frequency without changing its optical path, while high stability against zero drift is obtained by the low frequency interchange of the sample gas to be monitored and of a reference sample. The variable HDO background due to natural humidity is automatically corrected.
NASA Astrophysics Data System (ADS)
Chepigin, A.; Leonte, M.; Colombo, F.; Kessler, J. D.
2014-12-01
Dissolved methane, ethane, propane, and butane concentrations in natural waters are traditionally measured using a headspace equilibration technique and a gas chromatograph with flame ionization detector (GC-FID). While a relatively simple technique, headspace equilibration suffers from slow equilibration times and loss of sensitivity due to concentration dilution with the pure-gas headspace. Here we present a newly developed pre-concentration system and auto-analyzer for use with a GC-FID. This system decreases the time required for each analysis by eliminating the headspace equilibration time, increases the sensitivity and precision with a rapid pre-concentration step, and minimizes operator time with an autoanalyzer. In this method, samples are collected from Niskin bottles in newly developed 1 L plastic sample bags rather than glass vials. Immediately following sample collection, the sample bags are placed in an incubator and individually connected to a multiport sampling valve. Water is pumped automatically from the desired sample bag through a small (6.5 mL) Liqui-Cel® membrane contactor where the dissolved gas is vacuum extracted and directly flushed into the GC sample loop. The gases of interest are preferentially extracted with the Liqui-Cel and thus a natural pre-concentration effect is obtained. Daily method calibration is achieved in the field with a five-point calibration curve that is created by analyzing gas standard-spiked water stored in 5 L gas-impermeable bags. Our system has been shown to substantially pre-concentrate the dissolved gases of interest and produce a highly linear response of peak areas to dissolved gas concentration. The system retains the high accuracy, precision, and wide range of measurable concentrations of the headspace equilibration method while simultaneously increasing the sensitivity due to the pre-concentration step. The time and labor involved in the headspace equilibration method are eliminated and replaced with the immediate and automatic analysis of a maximum of 13 sequential samples. The elapsed time between sample collection and analysis is reduced from approximately 12 hrs to < 10 min, enabling dynamic and highly resolved sampling plans.
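The daily five-point calibration reduces to fitting a straight line of peak area against standard concentration and inverting it for unknowns. The sketch below shows that fit with numpy; the standard concentrations, peak areas, and units are invented placeholders rather than field data.

```python
# Sketch of the daily five-point field calibration: peak areas from
# standard-spiked water are fit to a straight line and used to convert sample
# peak areas into dissolved-gas concentrations. Numbers are illustrative.
import numpy as np

# Hypothetical methane standards (nmol/L) and their GC-FID peak areas.
std_conc = np.array([0.0, 5.0, 10.0, 50.0, 100.0])
peak_area = np.array([20.0, 510.0, 1030.0, 5050.0, 10010.0])

slope, intercept = np.polyfit(std_conc, peak_area, deg=1)
r = np.corrcoef(std_conc, peak_area)[0, 1]

def to_concentration(area):
    """Invert the calibration line for an unknown sample."""
    return (area - intercept) / slope

print(f"slope={slope:.1f}, intercept={intercept:.1f}, r^2={r**2:.4f}")
print(f"sample at area 2600 -> {to_concentration(2600.0):.1f} nmol/L")
```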
NASA Astrophysics Data System (ADS)
Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti
2015-04-01
Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration. The samples were recorded manually in a logbook and given ID numbers. Then all samples, standards, SRMs and blanks were recorded on the irradiation vials and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. Sample registration software was developed as part of the IAEA CRP project 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective method to replace the redundant manual data entries that would otherwise need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
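The registration software itself is written in LabVIEW; purely to illustrate the batch-coding idea, the Python sketch below assigns sequential codes to every sample, standard, SRM and blank in a batch and writes a printable form. The code format, file name, and fields are invented assumptions, not the actual software's conventions.

```python
# Python sketch of the batch-coding idea behind the LabVIEW registration tool:
# every item in a batch receives an automatically generated ID that can be
# printed on forms and passed to the analysis program. Format is hypothetical.
import csv
from datetime import date

def register_batch(batch_no, items, outfile="registration_form.csv"):
    """Assign sequential codes to a batch and write a printable form."""
    today = date.today().strftime("%Y%m%d")
    records = []
    for seq, (name, kind) in enumerate(items, start=1):
        code = f"NAA-{today}-B{batch_no:02d}-{seq:03d}"
        records.append({"code": code, "name": name, "type": kind})
    with open(outfile, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["code", "name", "type"])
        writer.writeheader()
        writer.writerows(records)
    return records

batch = [("soil A", "sample"), ("soil B", "sample"),
         ("IAEA-Soil-7", "SRM"), ("empty vial", "blank")]
for rec in register_batch(1, batch):
    print(rec["code"], rec["name"], rec["type"])
```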
Plage, Bernd; Berg, Anna-Dolores; Luhn, Steven
2008-05-20
The differentiation of 25 automotive clear coats was evaluated using pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS). The samples were selected from eight different groups of samples which differ slightly in their infrared spectra. Most of the samples could be differentiated by visual inspection of the pyrograms. As an objective means of evaluation, new software based on the comparison of chromatograms was tested for automatic classification, considering retention times as well as mass spectra. The database was formed from the triplicate results of the set of 25 samples. Normally a replicate measurement of a sample yields the best fit by library search. In addition, for most groups, classifications with moderate fits are obtained for samples belonging to the same group. Some samples are completely rearranged, forming new groups of similar samples containing five samples from three different IR groups and four samples from three other groups, respectively. Furthermore, detailed visual recognition of individual pyrolysis products allows subgrouping. Therefore, most samples can be differentiated from each other by Py-GC/MS. The exceptions were three sample groups containing two samples each, which could not be differentiated from each other either by library search or by recognition of minor individual pyrolysis products.
Samusik, Nikolay; Wang, Xiaowei; Guan, Leying; Nolan, Garry P.
2017-01-01
Mass cytometry (CyTOF) has greatly expanded the capability of cytometry. It is now easy to generate multiple CyTOF samples in a single study, with each sample containing single-cell measurements of 50 markers for hundreds of thousands of cells or more. Current methods do not adequately address the issues concerning combining multiple samples for subpopulation discovery, and these issues can be quickly and dramatically amplified with an increasing number of samples. To overcome this limitation, we developed Partition-Assisted Clustering and Multiple Alignments of Networks (PAC-MAN) for the fast automatic identification of cell populations in CyTOF data, closely matching that of expert manual discovery, and for alignments between subpopulations across samples to define dataset-level cellular states. PAC-MAN is computationally efficient, allowing the management of very large CyTOF datasets, which are increasingly common in clinical studies and cancer studies that monitor various tissue samples for each subject. PMID:29281633
Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.
2016-01-01
Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827
Novel Automatic Electrochemical-Mechanical Polishing (ECMP) of Metals for Scanning Electron Microscopy
2010-03-23
... XPS binding energy (eV) versus sputtering time (s) results for the Ti 2p peaks for the titanium samples ... improved the IQ values. Conclusions: The electrochemical-mechanical polishing system (ECMP) removed material from titanium and nickel alloys at a ...
NASA Astrophysics Data System (ADS)
Bhattacharjee, Sudipta; Deb, Debasis
2016-07-01
Digital image correlation (DIC) is a technique developed for monitoring surface deformation/displacement of an object under loading conditions. This method is further refined to make it capable of handling discontinuities on the surface of the sample. A damage zone refers to a surface area that has fractured and opened in the course of loading. In this study, an algorithm is presented to automatically detect multiple damage zones in the deformed image. The algorithm identifies the pixels located inside these zones and eliminates them from the FEM-DIC processes. The proposed algorithm is successfully implemented on several damaged samples to estimate the displacement fields of an object under loading conditions. This study shows that the displacement fields represent the damage conditions reasonably well as compared to the regular FEM-DIC technique that does not consider the damage zones.
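The exclusion step can be pictured as building a pixel mask from some per-pixel quality or correlation measure and dropping the masked pixels from the FEM-DIC solution. The sketch below is only a stand-in for that idea: the quality map, threshold, and masking criterion are assumptions and not the detection criterion developed in the paper.

```python
# Sketch of the damage-zone exclusion idea: pixels whose local correlation (or
# texture quality) falls below a threshold are flagged as part of a damage zone
# and excluded from the FEM-DIC displacement computation. Data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
quality = rng.uniform(0.7, 1.0, size=(100, 100))              # synthetic correlation map
quality[40:60, 45:70] = rng.uniform(0.0, 0.3, size=(20, 25))  # an opened crack

damage_mask = quality < 0.5          # pixels inside damage zones
active_pixels = np.argwhere(~damage_mask)
print(f"{damage_mask.sum()} pixels excluded from FEM-DIC, "
      f"{len(active_pixels)} retained")
```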
Pulse-Echo Ultrasonic Imaging Method for Eliminating Sample Thickness Variation Effects
NASA Technical Reports Server (NTRS)
Roth, Don J. (Inventor)
1997-01-01
A pulse-echo immersion method for ultrasonic evaluation of a material, which accounts for and eliminates nonlevelness in the equipment set-up and sample thickness variation effects, employs a single transducer and automatic scanning and digital imaging to obtain an image of a property of the material, such as pore fraction. The nonlevelness and thickness variation effects are accounted for by pre-scan adjustments of the time window to ensure that the echoes received at each scan point are gated in the center of the window. This information is input into the scan file so that, during the automatic scanning for the material evaluation, each received echo is centered in its time window. A cross-correlation function calculates the velocity at each scan point, which is then proportionalized to a color or grey scale and displayed on a video screen.
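The velocity computation at each scan point rests on finding the delay between successive echoes by cross-correlation and converting it to velocity through the sample thickness. The sketch below reproduces that generic step on synthetic echoes; the sampling rate, thickness, pulse shape, and the exact velocity formula used here are assumptions for illustration.

```python
# Sketch of the cross-correlation step: the delay between two successive
# back-surface echoes is found from the peak of their cross-correlation, and
# velocity follows from the (assumed known) sample thickness. Signals synthetic.
import numpy as np

fs = 100e6                      # sampling rate, Hz (assumed)
thickness = 5.0e-3              # sample thickness, m (assumed)
t = np.arange(0, 20e-6, 1 / fs)

def echo(t0):
    """A damped 5 MHz tone burst centred at t0 (synthetic echo)."""
    return np.exp(-((t - t0) / 0.3e-6) ** 2) * np.sin(2 * np.pi * 5e6 * (t - t0))

true_velocity = 6000.0                         # m/s, e.g. a ceramic
dt_true = 2 * thickness / true_velocity        # round-trip time between echoes
echo1, echo2 = echo(5e-6), echo(5e-6 + dt_true)

xcorr = np.correlate(echo2, echo1, mode="full")
lag = np.argmax(xcorr) - (len(echo1) - 1)      # delay in samples
velocity = 2 * thickness / (lag / fs)
print(f"estimated velocity: {velocity:.0f} m/s")
```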
Chen, Wen; Zhong, Guanping; Zhou, Zaide; Wu, Peng; Hou, Xiandeng
2005-10-01
A simple spectrophotometric system, based on a prolonged pseudo-liquid drop device as an optical cell and a handheld charge-coupled device (CCD) as a detector, was constructed for automatic liquid-liquid extraction and spectrophotometric speciation of trace Cr(VI) and Cr(III) in water samples. A tungsten halogen lamp was used as the light source, and a laboratory-constructed T-tube with two open ends was used to form the prolonged pseudo-liquid drop inside the tube. In a perchloric acid medium, Cr(VI) reacted with 1,5-diphenylcarbazide (DPC); the complex formed was automatically extracted into n-pentanol, with a preconcentration ratio of about 5. The organic phase with the extracted chromium complex was then pumped through the optical cell for absorbance measurement at 548 nm. Under optimal conditions, the calibration curve was linear in the range of 7.5-350 μg L(-1), with a correlation coefficient of 0.9993. The limit of detection (3σ) was 7.5 μg L(-1). The fact that Cr(III) does not react with DPC, but can be oxidized to Cr(VI) prior to determination, is the basis of the speciation analysis. The proposed speciation analysis was sensitive, yet simple, labor-effective, and cost-effective. It has been preliminarily applied for the speciation of Cr(VI) and Cr(III) in spiked river and tap water samples. It can also be used for other automatic liquid-liquid extraction-spectrophotometric determinations.
Comparison of filters for concentrating microbial indicators and pathogens in lake-water samples
Francy, Donna S.; Stelzer, Erin A.; Brady, Amie M.G.; Huitger, Carrie; Bushon, Rebecca N.; Ip, Hon S.; Ware, Michael W.; Villegas, Eric N.; Gallardo, Vincent; Lindquist, H.D. Alan
2013-01-01
Bacterial indicators are used to indicate increased health risk from pathogens and to make beach closure and advisory decisions; however, beaches are seldom monitored for the pathogens themselves. Studies of sources and types of pathogens at beaches are needed to improve estimates of swimming-associated health risks. It would be advantageous and cost-effective, especially for studies conducted on a regional scale, to use a method that can simultaneously filter and concentrate all classes of pathogens from the large volumes of water needed to detect pathogens. In seven recovery experiments, stock cultures of viruses and protozoa were seeded into 10-liter lake water samples, and concentrations of naturally occurring bacterial indicators were used to determine recoveries. For the five filtration methods tested, the highest median recoveries were as follows: glass wool for adenovirus (4.7%); NanoCeram for enterovirus (14.5%) and MS2 coliphage (84%); continuous-flow centrifugation (CFC) plus Virocap (CFC+ViroCap) for Escherichia coli (68.3%) and Cryptosporidium (54%); automatic ultrafiltration (UF) for norovirus GII (2.4%); and dead-end UF for Enterococcus faecalis (80.5%), avian influenza virus (0.02%), and Giardia (57%). In evaluating filter performance in terms of both recovery and variability, the automatic UF resulted in the highest recovery while maintaining low variability for all nine microorganisms. The automatic UF was used to demonstrate that filtration can be scaled up to field deployment and the collection of 200-liter lake water samples.
Sample Selection for Training Cascade Detectors.
Vállez, Noelia; Deniz, Oscar; Bueno, Gloria
2015-01-01
Automatic detection systems usually require large and representative training datasets in order to obtain good detection and false positive rates. Training datasets are typically such that the positive set has few samples and/or the negative set must represent anything except the object of interest. In this respect, the negative set typically contains orders of magnitude more images than the positive set. However, imbalanced training databases lead to biased classifiers. In this paper, we focus our attention on a negative sample selection method to properly balance the training data for cascade detectors. The method is based on the selection of the most informative false positive samples generated in one stage to feed the next stage. The results show that the proposed cascade detector with sample selection obtains on average better partial AUC and smaller standard deviation than the other compared cascade detectors.
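The stage-to-stage selection of informative false positives is commonly implemented as hard negative mining: score a large negative pool with the current stage and keep the highest-scoring negatives for the next one. The sketch below illustrates that loop for a single hand-off with synthetic features and a logistic-regression stage, both of which are assumptions rather than the detectors compared in the paper.

```python
# Sketch of the negative-selection idea: after training one cascade stage, score
# a large pool of negatives and keep only the most informative ones, i.e. the
# false positives the current stage is most confident about, for the next stage.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X_pos = rng.normal(1.0, 1.0, size=(200, 16))        # object of interest
X_neg_pool = rng.normal(0.0, 1.0, size=(20000, 16)) # huge "anything else" pool

# Stage 1: train on positives plus a small random negative subset.
neg0 = X_neg_pool[rng.choice(len(X_neg_pool), 200, replace=False)]
X1 = np.vstack([X_pos, neg0])
y1 = np.r_[np.ones(len(X_pos)), np.zeros(len(neg0))]
stage1 = LogisticRegression(max_iter=1000).fit(X1, y1)

# Select the hardest false positives from the pool for the next stage.
scores = stage1.predict_proba(X_neg_pool)[:, 1]
hard_idx = np.argsort(scores)[-200:]                # highest-scoring negatives
hard_negatives = X_neg_pool[hard_idx]
print("mean score of selected negatives:", scores[hard_idx].mean())
```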
Chango, Gabriela; Palacio, Edwin; Cerdà, Víctor
2018-08-15
A simple potentiometric chip-based multipumping flow system (MPFS) has been developed for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples. The proposed system was built around a poly(methyl methacrylate) microfluidic chip, combining the advantages of flow techniques with potentiometric detection. For this purpose, an automatic system was designed and constructed, optimizing the variables involved in the process, such as pH, ionic strength, stirring, and sample volume. The system was applied successfully to water samples, providing a versatile method with an analysis frequency of 12 samples per hour. Good correlation between the chloride and fluoride concentrations measured with the ISEs and by ion chromatography suggests satisfactory reliability of the system. Copyright © 2018 Elsevier B.V. All rights reserved.
Sample collection system for gel electrophoresis
Olivares, Jose A.; Stark, Peter C.; Dunbar, John M.; Hill, Karen K.; Kuske, Cheryl R.; Roybal, Gustavo
2004-09-21
An automatic sample collection system for use with an electrophoretic slab gel system is presented. The collection system can be used with a slab gel having one or more lanes. A detector is used to detect particle bands on the slab gel within a detection zone. Such detectors may use a laser to excite fluorescently labeled particles. The fluorescent light emitted from the excited particles is transmitted to low-level light detection electronics. Upon the detection of a particle of interest within the detection zone, a syringe pump is activated, sending a stream of buffer solution across the lane of the slab gel. The buffer solution collects the sample of interest and carries it through a collection port into a sample collection vial.
Homayounfar, Kia; Meis, Johanna; Jung, Klaus; Klosterhalfen, Bernd; Sprenger, Thilo; Conradi, Lena-Christin; Langer, Claus; Becker, Heinz
2012-02-23
Ultrasonic scalpel (UC) and monopolar electrocautery (ME) are common tools for soft tissue dissection. However, morphological data on the related tissue alteration are discordant. We developed an automatic device for standardized sample excision and compared quality and depth of morphological changes caused by UC and ME in a pig model. 100 tissue samples (5 × 3 cm) of the abdominal wall were excised in 16 pigs. Excisions were randomly performed manually or by using the self-constructed automatic device at standard power levels (60 W cutting in ME, level 5 in UC) for abdominal surgery. Quality of tissue alteration and depth of coagulation necrosis were examined histopathologically. Device (UC vs. ME) and mode (manually vs. automatic) effects were studied by two-way analysis of variance at a significance level of 5%. At the investigated power level settings UC and ME induced qualitatively similar coagulation necroses. Mean depth of necrosis was 450.4 ± 457.8 μm for manual UC and 553.5 ± 326.9 μm for automatic UC versus 149.0 ± 74.3 μm for manual ME and 257.6 ± 119.4 μm for automatic ME. Coagulation necrosis was significantly deeper (p < 0.01) when UC was used compared to ME. The mode of excision (manual versus automatic) did not influence the depth of necrosis (p = 0.85). There was no significant interaction between dissection tool and mode of excision (p = 0.93). Thermal injury caused by UC and ME results in qualitatively similar coagulation necrosis. The depth of necrosis is significantly greater in UC compared to ME at investigated standard power levels.
Ramos, Inês I; Magalhães, Luís M; Barreiros, Luisa; Reis, Salette; Lima, José L F C; Segundo, Marcela A
2018-01-01
Immunoglobulin G (IgG) represents the major fraction of antibodies in healthy adult human serum, and deviations from physiological levels are a generic marker of disease corresponding to different pathologies. Therefore, screening methods for IgG evaluation are a valuable aid to diagnostics. The present work proposes a rapid, automatic, and miniaturized method based on UV-vis micro-bead injection spectroscopy (μ-BIS) for the real-time determination of human serum IgG with label-free detection. Relying on the attachment of IgG to rec-protein G immobilized on Sepharose 4B, a bioaffinity column is automatically assembled, where IgG is selectively retained and determined by on-column optical density measurement. A "dilution-and-shoot" approach (50 to 200 times) was implemented without further sample treatment because interferences were flushed out of the column upon sample loading, with minimization of carryover and cross-contamination by automatically discarding the sorbent (0.2 mg) after each determination. No interference from human serum albumin at 60 mg mL(-1) in undiluted sample was found. The method allowed IgG determination in the range 100-300 μg mL(-1) (corresponding to 5.0-60 mg mL(-1) in undiluted samples), with a detection limit of 33 μg mL(-1) (1.7 mg mL(-1) for samples, dilution factor of 50). RSD values were < 9.4 and < 11.7% for intra- and inter-assay precision, respectively, while recovery values for human serum spiked with IgG at high pathological levels were 97.8-101.4%. Comparison to a commercial ELISA kit showed no significant difference for the tested samples (n = 8). Moreover, time-to-result decreased from several hours to < 5 min and analysis cost decreased 10 times, showing the potential of the proposed approach as a point-of-care method. Graphical abstract: Micro-bead injection spectroscopy method for real-time, automated, and label-free determination of total human serum immunoglobulin G (IgG). The method was designed for lab-on-valve (LOV) platforms using a miniaturized protein G bioaffinity separative approach. IgG is separated from serum matrix components and quantified with low non-specific binding in less than 5 min.
A comparison of fitness-case sampling methods for genetic programming
NASA Astrophysics Data System (ADS)
Martínez, Yuliana; Naredo, Enrique; Trujillo, Leonardo; Legrand, Pierrick; López, Uriel
2017-11-01
Genetic programming (GP) is an evolutionary computation paradigm for automatic program induction. GP has produced impressive results but it still needs to overcome some practical limitations, particularly its high computational cost, overfitting and excessive code growth. Recently, many researchers have proposed fitness-case sampling methods to overcome some of these problems, with mixed results in several limited tests. This paper presents an extensive comparative study of four fitness-case sampling methods, namely: Interleaved Sampling, Random Interleaved Sampling, Lexicase Selection and Keep-Worst Interleaved Sampling. The algorithms are compared on 11 symbolic regression problems and 11 supervised classification problems, using 10 synthetic benchmarks and 12 real-world data-sets. They are evaluated based on test performance, overfitting and average program size, comparing them with a standard GP search. Comparisons are carried out using non-parametric multigroup tests and post hoc pairwise statistical tests. The experimental results suggest that fitness-case sampling methods are particularly useful for difficult real-world symbolic regression problems, improving performance, reducing overfitting and limiting code growth. On the other hand, it seems that fitness-case sampling cannot improve upon GP performance when considering supervised binary classification.
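Of the four compared methods, Lexicase Selection has a particularly compact description: filter the candidate parents through the fitness cases in random order, keeping only those elite on each case. The sketch below is a generic implementation of that published selection scheme; the toy population and error values are invented.

```python
# Minimal sketch of Lexicase Selection, one of the four fitness-case sampling
# methods compared above: parents are chosen by filtering the population through
# fitness cases in random order, keeping only individuals that are elite on each
# case in turn. Error values here are synthetic.
import random

def lexicase_select(population, errors, rng=random):
    """population: list of ids; errors[i][j]: error of individual i on case j."""
    candidates = list(range(len(population)))
    cases = list(range(len(errors[0])))
    rng.shuffle(cases)
    for case in cases:
        best = min(errors[i][case] for i in candidates)
        candidates = [i for i in candidates if errors[i][case] == best]
        if len(candidates) == 1:
            break
    return population[rng.choice(candidates)]

pop = ["prog_a", "prog_b", "prog_c", "prog_d"]
errs = [[0.0, 2.0, 1.0], [1.0, 0.0, 0.0], [0.0, 0.0, 3.0], [2.0, 1.0, 0.5]]
print(lexicase_select(pop, errs))
```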
Geraghty, John P; Grogan, Garry; Ebert, Martin A
2013-04-30
This study investigates the variation in segmentation of several pelvic anatomical structures on computed tomography (CT) between multiple observers and a commercial automatic segmentation method, in the context of quality assurance and evaluation during a multicentre clinical trial. CT scans of two prostate cancer patients ('benchmarking cases'), one high risk (HR) and one intermediate risk (IR), were sent to multiple radiotherapy centres for segmentation of prostate, rectum and bladder structures according to the TROG 03.04 "RADAR" trial protocol definitions. The same structures were automatically segmented using iPlan software for the same two patients, allowing structures defined by automatic segmentation to be quantitatively compared with those defined by multiple observers. A sample of twenty trial patient datasets were also used to automatically generate anatomical structures for quantitative comparison with structures defined by individual observers for the same datasets. There was considerable agreement amongst all observers and automatic segmentation of the benchmarking cases for bladder (mean spatial variations < 0.4 cm across the majority of image slices). Although there was some variation in interpretation of the superior-inferior (cranio-caudal) extent of rectum, human-observer contours were typically within a mean 0.6 cm of automatically-defined contours. Prostate structures were more consistent for the HR case than the IR case with all human observers segmenting a prostate with considerably more volume (mean +113.3%) than that automatically segmented. Similar results were seen across the twenty sample datasets, with disagreement between iPlan and observers dominant at the prostatic apex and superior part of the rectum, which is consistent with observations made during quality assurance reviews during the trial. This study has demonstrated quantitative analysis for comparison of multi-observer segmentation studies. For automatic segmentation algorithms based on image-registration as in iPlan, it is apparent that agreement between observer and automatic segmentation will be a function of patient-specific image characteristics, particularly for anatomy with poor contrast definition. For this reason, it is suggested that automatic registration based on transformation of a single reference dataset adds a significant systematic bias to the resulting volumes and their use in the context of a multicentre trial should be carefully considered.
An Automatic System for Global Monitoring of ELF and VLF Radio Noise Phenomena.
1985-06-01
... second low-jitter synchronization signal is also provided for precise triggering of analog-to-digital conversion samples. Both the clock and the ... building in 1985 are two riometers (30 MHz and 51.4 MHz), a 3-axis fluxgate magnetometer, a 3-axis micropulsation magnetometer, an all-sky camera, and ... of these filters is continuously sampled by a computerized recording system, and statistical averages are computed on-site and recorded on digital tape.
Automatic recognition and analysis of synapses. [in brain tissue
NASA Technical Reports Server (NTRS)
Ungerleider, J. A.; Ledley, R. S.; Bloom, F. E.
1976-01-01
An automatic system for recognizing synaptic junctions would allow analysis of large samples of tissue for the possible classification of specific well-defined sets of synapses based upon structural morphometric indices. In this paper the three steps of our system are described: (1) cytochemical tissue preparation to allow easy recognition of the synaptic junctions; (2) transmitting the tissue information to a computer; and (3) analyzing each field to recognize the synapses and make measurements on them.
The application of charge-coupled device processors in automatic-control systems
NASA Technical Reports Server (NTRS)
Mcvey, E. S.; Parrish, E. A., Jr.
1977-01-01
The application of charge-coupled device (CCD) processors to automatic-control systems is suggested. CCD processors are a new form of semiconductor component with the unique ability to process sampled signals on an analog basis. Specific implementations of controllers are suggested for linear time-invariant, time-varying, and nonlinear systems. Typical processing time should be only a few microseconds. This form of technology may become competitive with microprocessors and minicomputers in addition to supplementing them.
Automatic microscopy for mitotic cell location.
NASA Technical Reports Server (NTRS)
Herron, J.; Ranshaw, R.; Castle, J.; Wald, N.
1972-01-01
Advances are reported in the development of an automatic microscope with which to locate hematologic or other cells in mitosis for subsequent chromosome analysis. The system under development is designed to perform the functions of: slide scanning to locate metaphase cells; conversion of images of selected cells into binary form; and on-line computer analysis of the digitized image for significant cytogenetic data. Cell detection criteria are evaluated using a test sample of 100 mitotic cells and 100 artifacts.
Automated Proposition Density Analysis for Discourse in Aphasia.
Fromm, Davida; Greenhouse, Joel; Hou, Kaiyue; Russell, G Austin; Cai, Xizhen; Forbes, Margaret; Holland, Audrey; MacWhinney, Brian
2016-10-01
This study evaluates how proposition density can differentiate between persons with aphasia (PWA) and individuals in a control group, as well as among subtypes of aphasia, on the basis of procedural discourse and personal narratives collected from large samples of participants. Participants were 195 PWA and 168 individuals in a control group from the AphasiaBank database. PWA represented 6 aphasia types on the basis of the Western Aphasia Battery-Revised (Kertesz, 2006). Narrative samples were stroke stories for PWA and illness or injury stories for individuals in the control group. Procedural samples were from the peanut-butter-and-jelly-sandwich task. Language samples were transcribed using Codes for the Human Analysis of Transcripts (MacWhinney, 2000) and analyzed using Computerized Language Analysis (MacWhinney, 2000), which automatically computes proposition density (PD) using rules developed for automatic PD measurement by the Computerized Propositional Idea Density Rater program (Brown, Snodgrass, & Covington, 2007; Covington, 2007). Participants in the control group scored significantly higher than PWA on both tasks. PD scores were significantly different among the aphasia types for both tasks. Pairwise comparisons for both discourse tasks revealed that PD scores for the Broca's group were significantly lower than those for all groups except Transcortical Motor. No significant quadratic or linear association between PD and severity was found. Proposition density is differentially sensitive to aphasia type and most clearly differentiates individuals with Broca's aphasia from the other groups.
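The idea behind automatic PD scoring can be conveyed with a much-simplified rule: count words whose part-of-speech class typically contributes a proposition (verbs, adjectives, adverbs, prepositions, conjunctions) and divide by the total word count. The sketch below applies that rule to a hand-tagged toy utterance; the real CPIDR/CLAN programs apply a far richer rule set, so this is only an approximation of the concept.

```python
# Rough sketch of a proposition-density (PD) computation: propositions are
# approximated as verbs, adjectives, adverbs, prepositions and conjunctions in
# a part-of-speech-tagged sample, and PD is their count divided by the number
# of words. The tagged example below is hand-labelled for illustration.
PROPOSITION_PREFIXES = ("VB", "JJ", "RB", "IN", "CC")   # Penn Treebank tag classes

def proposition_density(tagged_tokens):
    words = [(w, t) for w, t in tagged_tokens if w.isalpha()]
    props = sum(t.startswith(PROPOSITION_PREFIXES) for _, t in words)
    return props / len(words) if words else 0.0

sample = [("I", "PRP"), ("spread", "VBD"), ("peanut", "NN"), ("butter", "NN"),
          ("on", "IN"), ("the", "DT"), ("bread", "NN"), ("and", "CC"),
          ("ate", "VBD"), ("it", "PRP"), ("quickly", "RB")]
print(f"PD = {proposition_density(sample):.2f}")
```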
NASA Astrophysics Data System (ADS)
Huang, Shih-Chiang; Lee, Gwo-Bin; Chien, Fan-Ching; Chen, Shean-Jen; Chen, Wen-Janq; Yang, Ming-Chang
2006-07-01
This paper presents a novel microfluidic system with integrated molecular imprinting polymer (MIP) films designed for surface plasmon resonance (SPR) biosensing of multiple nanoscale biomolecules. The innovative microfluidic chip uses pneumatic microvalves and micropumps to transport a precise amount of the biosample through multiple microchannels to sensing regions containing the locally spin-coated MIP films. The signals of SPR biosensing are basically proportional to the number of molecules adsorbed on the MIP films. Hence, a precise control of flow rates inside microchannels is important to determine the adsorption amount of the molecules in the SPR/MIP chips. The integration of micropumps and microvalves can automate the sample introduction process and precisely control the amount of the sample injection to the microfluidic system. The proposed biochip enables the label-free biosensing of biomolecules in an automatic format, and provides a highly sensitive, highly specific and high-throughput detection performance. Three samples, i.e. progesterone, cholesterol and testosterone, are successfully detected using the developed system. The experimental results show that the proposed SPR/MIP microfluidic chip provides a comparable sensitivity to that of large-scale SPR techniques, but with reduced sample consumption and an automatic format. As such, the developed biochip has significant potential for a wide variety of nanoscale biosensing applications. The preliminary results of the current paper were presented at Transducers 2005, Seoul, Korea, 5-9 June 2005.
[Comparison of MPure-12 Automatic Nucleic Acid Purification and Chelex-100 Method].
Shen, X; Li, M; Wang, Y L; Chen, Y L; Lin, Y; Zhao, Z M; Que, T Z
2017-04-01
The aim was to explore the forensic application value of MPure-12 automatic nucleic acid purification (MPure-12 method) for DNA extraction by extracting and typing DNA from bloodstains and various kinds of biological samples with different DNA contents. Nine types of biological samples, such as bloodstains, semen stains, and saliva, were collected. DNA was extracted using the MPure-12 method and the Chelex-100 method, followed by PCR amplification and electrophoresis to obtain STR profiles. Samples such as hair roots, chutty, cigarette butts, muscular tissue, saliva stains, bloodstains and semen stains were typed successfully by the MPure-12 method. Partial alleles were lacking in the saliva samples, and the genotyping of contact swabs was unsatisfactory. Additionally, all of the bloodstains (20 μL, 15 μL, 10 μL, 5 μL, 1 μL) showed good typing results using the Chelex-100 method, but loss of alleles occurred for the 1 μL blood volume with the MPure-12 method. The MPure-12 method is suitable for DNA extraction from blood samples above a certain concentration; the Chelex-100 method may be better for the extraction of trace blood samples. The instrument used for nucleic acid extraction has the advantages of simple operation, rapidity, high extraction efficiency, a high rate of reportable STR profiles and a lower risk of man-made contamination. Copyright© by the Editorial Department of Journal of Forensic Medicine
Automatic detection of spiculation of pulmonary nodules in computed tomography images
NASA Astrophysics Data System (ADS)
Ciompi, F.; Jacobs, C.; Scholten, E. T.; van Riel, S. J.; W. Wille, M. M.; Prokop, M.; van Ginneken, B.
2015-03-01
We present a fully automatic method for the assessment of spiculation of pulmonary nodules in low-dose Computed Tomography (CT) images. Spiculation is considered one of the indicators of nodule malignancy and an important feature to assess in order to decide on a patient-tailored follow-up procedure. For this reason, a lung cancer screening scenario would benefit from the presence of a fully automatic system for the assessment of spiculation. The presented framework relies on the fact that spiculated nodules mainly differ from non-spiculated ones in their morphology. In order to discriminate the two categories, information on morphology is captured by sampling intensity profiles along circular patterns on spherical surfaces centered on the nodule, in a multi-scale fashion. Each intensity profile is interpreted as a periodic signal, to which the Fourier transform is applied, obtaining a spectrum. A library of spectra is created by clustering the data via unsupervised learning. The centroids of the clusters are used to label back each spectrum in the sampling pattern. A compact descriptor encoding the nodule morphology is obtained as the histogram of labels along all the spherical surfaces and used to classify spiculated nodules via supervised learning. We tested our approach on a set of nodules from the Danish Lung Cancer Screening Trial (DLCST) dataset. Our results show that the proposed method outperforms other 3-D descriptors of morphology in the automatic assessment of spiculation.
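A rough Python sketch of the descriptor pipeline is given below: circular intensity profiles are sampled around the nodule, each profile is converted to a Fourier magnitude spectrum, the spectra are clustered, and the nodule is summarised as a label histogram. Radii, offsets and cluster counts are invented, and for brevity the spectral codebook is fit on one nodule's own profiles, whereas the paper learns it over the whole dataset.

import numpy as np
from scipy.ndimage import map_coordinates
from sklearn.cluster import KMeans

def circle_profile(volume, center, radius, z_offset, n_points=64):
    """Sample one circular intensity profile on a sphere slice (voxel units)."""
    theta = np.linspace(0.0, 2 * np.pi, n_points, endpoint=False)
    zc, yc, xc = center
    coords = np.vstack([np.full(n_points, zc + z_offset),
                        yc + radius * np.sin(theta),
                        xc + radius * np.cos(theta)])
    return map_coordinates(volume, coords, order=1, mode="nearest")

def nodule_descriptor(volume, center, radii=(3, 5, 8), z_offsets=(-2, 0, 2),
                      n_clusters=8, random_state=0):
    profiles = [circle_profile(volume, center, r, dz)
                for r in radii for dz in z_offsets]
    spectra = np.abs(np.fft.rfft(np.asarray(profiles), axis=1))   # profiles as periodic signals
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=random_state).fit_predict(spectra)
    hist, _ = np.histogram(labels, bins=np.arange(n_clusters + 1))
    return hist / hist.sum()                                      # compact morphology descriptor

# Toy usage on a synthetic volume:
vol = np.random.rand(32, 32, 32)
print(nodule_descriptor(vol, center=(16, 16, 16)))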
Water vapor measurement system in global atmospheric sampling program, appendix
NASA Technical Reports Server (NTRS)
Englund, D. R.; Dudzinski, T. J.
1982-01-01
The water vapor measurement system used in the NASA Global Atmospheric Sampling Program (GASP) is described. The system used a modified version of a commercially available dew/frostpoint hygrometer with a thermoelectrically cooled mirror sensor. The modifications extended the range of the hygrometer to enable air sample measurements with frostpoint temperatures down to -80 °C at altitudes of 6 to 13 km. Other modifications were made to permit automatic, unattended operation in an aircraft environment. This report describes the hygrometer, its integration with the GASP system, its calibration, and operational aspects including measurement errors. The estimated uncertainty of the dew/frostpoint measurements was ±1.7 °C.
Instrumentation for a dry-pond detention study
Pope, L.M.; Jennings, M.E.; Thibodeaux, K.G.
1988-01-01
A 12.3-acre, fully urbanized, residential land-use catchment was instrumented by the U.S. Geological Survey in Topeka, Kansas. Hydraulic instrumentation for flow measurement includes two types of flumes, a pipe-insert flume and a culvert-inlet (manhole) flume. Samples of rainfall and runoff for water-quality analyses were collected by automatic, 3-liter, 24-sample-capacity water samplers controlled by multichannel data loggers. Ancillary equipment included a rain gage and a wet/dry atmospheric-deposition sampler. Nineteen stormwater runoff events were monitored at the site using the instrumentation system. The system has a high reliability of data capture and permits an accurate determination of storm-water loads.
Dynamic multistation photometer
Bauer, Martin L.; Johnson, Wayne F.; Lakomy, Dale G.
1977-01-01
A portable fast analyzer is provided that uses a magnetic clutch/brake to rapidly accelerate the analyzer rotor, and employs a microprocessor for automatic analyzer operation. The rotor is held stationary while the drive motor is run up to speed. When it is desired to mix the sample(s) and reagent(s), the brake is deenergized and the clutch is energized, whereupon the rotor is very rapidly accelerated to the running speed. The parallel-path rotor that is used allows the samples and reagents to be mixed the moment they are spun out into the rotor cuvettes, and data acquisition begins immediately. The analyzer will thus have special utility for fast reactions.
[The actual possibilities of robotic microscopy in analysis automation and laboratory telemedicine].
Medovyĭ, V S; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Balugian, R Sh
2012-10-01
The article discusses the possibilities of automated microscopy complexes manufactured by Cellavision and MEKOS for performing medical analyses of blood films and other biomaterials. The joint work of the complex and the physician, in a regimen of automatic slide loading, screening, sampling, and sorting of cell types with simple morphology, followed by visual sorting of the sub-sample with complex morphology, provides a significant increase in method sensitivity, a reduction in workload, and improved working conditions for the physician. The included information technologies, virtual slides and laboratory telemedicine make it possible to build representative samples of rare cell types and pathologies, advancing both automation methods and medical research aims.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santamaria, L.; Siller, H. R.; Garcia-Ortiz, C. E., E-mail: cegarcia@cicese.mx
In this work, we present an alternative optical method to determine the probe-sample separation distance in a scanning near-field optical microscope. The experimental method is based on a Lloyd's mirror interferometer and offers a measurement precision deviation of ∼100 nm using digital image processing and numerical analysis. The technique can also be strategically combined with the characterization of piezoelectric actuators and stability evaluation of the optical system. It also opens the possibility of developing an automatic approach control system valid for probe-sample distances from 5 to 500 μm.
Evaluation of a new automated microscopy urine sediment analyser - sediMAX conTRUST®.
Bogaert, Laura; Peeters, Bart; Billen, Jaak
2017-04-01
This study evaluated the performance of the stand-alone sediMAX conTRUST (77Elektronika, Budapest, Hungary) analyser as an alternative to microscopic analysis of urine. The validation included a precision, carry-over, categorical correlation and diagnostic performance study with manual phase-contrast microscopy as the reference method. A total of 260 routine urine samples were assessed. The within-run precision was much better at higher concentrations than at very low concentrations. The precision met our predefined limits for all elements at the different concentrations, with the exception of the lowest RBC count, the WBC count, and the pathological cast and crystal counts. There was no sample carry-over. The analyser showed good categorical agreement with manual microscopy for RBC and WBC counts, moderate agreement for yeast cells, crystals and squamous epithelial cells, and poor agreement for non-squamous epithelial cells, bacteria and casts. Diagnostic performance was satisfactory only for RBC, WBC and yeast cells. The number of false negative results was acceptable (≤4%) for all elements after connecting the sediMAX conTRUST with an automatic strip reader (AutionMAX) and after implementation of review rules. We conclude that the sediMAX conTRUST should be used as a screening tool, in combination with an automatic strip reader, for the identification of normal samples. For this purpose, adequate review rules should be defined. Manual microscopy is still required for 'flagged' pathological samples. Despite the poor analytical performance on pathological samples, the images on the screen can be used for interpretation without the microscope and can be stored as PDF documents for archiving the results.
Hu, Meng; Krauss, Martin; Brack, Werner; Schulze, Tobias
2016-11-01
Liquid chromatography-high resolution mass spectrometry (LC-HRMS) is a well-established technique for nontarget screening of contaminants in complex environmental samples. Automatic peak detection is essential, but its performance has only rarely been assessed and optimized so far. With the aim of filling this gap, we used pristine water extracts spiked with 78 contaminants as a test case to evaluate and optimize chromatogram and spectral data processing. To assess whether data acquisition strategies have a significant impact on peak detection, three values of MS cycle time (CT) of an LTQ Orbitrap instrument were tested. Furthermore, the key parameter settings of the data processing software MZmine 2 were optimized to detect the maximum number of target peaks from the samples by the design of experiments (DoE) approach and compared to a manual evaluation. The results indicate that short CT significantly improves the quality of automatic peak detection, which means that full scan acquisition without additional MS2 experiments is suggested for nontarget screening. MZmine 2 detected 75-100% of the peaks compared to manual peak detection at an intensity level of 10^5 in a validation dataset on both spiked and real water samples under optimal parameter settings. Finally, we provide an optimization workflow of MZmine 2 for LC-HRMS data processing that is applicable to environmental samples for nontarget screening. The results also show that the DoE approach is useful and effort-saving for optimizing data processing parameters.
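The parameter-optimization step can be pictured as scoring each setting of the peak-picking parameters by the fraction of spiked targets it recovers, as in the illustrative Python sketch below (a plain full-factorial grid rather than a formal DoE design; detect_peaks is a hypothetical stand-in, not MZmine 2's API).

import itertools

def recovery(detected, targets, mz_tol=0.005, rt_tol=0.2):
    """Fraction of spiked (m/z, RT) targets matched by at least one detected peak."""
    hits = 0
    for mz_t, rt_t in targets:
        if any(abs(mz - mz_t) <= mz_tol and abs(rt - rt_t) <= rt_tol
               for mz, rt in detected):
            hits += 1
    return hits / len(targets)

def optimize(detect_peaks, raw_data, targets, grid):
    """grid: dict of parameter name -> list of levels (full factorial)."""
    names = list(grid)
    best = None
    for levels in itertools.product(*(grid[n] for n in names)):
        params = dict(zip(names, levels))
        score = recovery(detect_peaks(raw_data, **params), targets)
        if best is None or score > best[0]:
            best = (score, params)
    return best  # (best recovery, best parameter setting)

# Dummy usage with a fake detector that ignores the data:
fake = lambda data, noise_level: [(150.0, 5.0), (200.0, 7.5)]
print(optimize(fake, None, targets=[(150.001, 5.1), (300.0, 2.0)],
               grid={"noise_level": [1e4, 1e5]}))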
NASA Astrophysics Data System (ADS)
Wang, Zhihua; Yang, Xiaomei; Lu, Chen; Yang, Fengshuo
2018-07-01
Automatic updating of land use/cover change (LUCC) databases using high spatial resolution images (HSRI) is important for environmental monitoring and policy making, especially for coastal areas, which connect the land and coast and tend to change frequently. Many object-based change detection methods have been proposed, especially methods combining historical LUCC with HSRI. However, the scale parameter(s) used to segment the serial temporal images, which directly determine the average object size, are hard to choose without expert intervention. In addition, the samples transferred from historical LUCC also need expert intervention to avoid insufficient or incorrect samples. With respect to choosing the scale parameter(s), a Scale Self-Adapting Segmentation (SSAS) approach, based on exponential sampling of a scale parameter and location of the local maximum of a weighted local variance, was proposed to address the scale selection problem when segmenting images constrained by LUCC for change detection. With respect to transferring samples, Knowledge Transfer (KT), in which a classifier trained on historical images with LUCC is applied to the classification of updated images, was also proposed. Comparison experiments were conducted in a coastal area of Zhujiang, China, using SPOT 5 images acquired in 2005 and 2010. The results reveal that (1) SSAS can segment images more effectively without the intervention of experts, and (2) KT can also reach the maximum accuracy of sample transfer without expert intervention. The strategy SSAS + KT would be a good choice if the temporal historical image and LUCC match, and the historical image and updated image are obtained from the same source.
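The scale-selection idea can be sketched as follows (Python); Felzenszwalb segmentation from scikit-image is used here only as a convenient stand-in segmenter, and the weighted-local-variance scoring and exponential scale sampling follow the description above with invented parameter values.

import numpy as np
from skimage.segmentation import felzenszwalb

def weighted_local_variance(image, labels):
    """Per-object variance weighted by object area, averaged over the image."""
    score, total_area = 0.0, 0
    for lab in np.unique(labels):
        region = image[labels == lab]
        score += region.size * region.var()
        total_area += region.size
    return score / total_area

def select_scale(image, base=10.0, ratio=2.0, n_scales=8):
    scales = base * ratio ** np.arange(n_scales)        # exponential sampling of the scale parameter
    wlv = [weighted_local_variance(image, felzenszwalb(image, scale=s))
           for s in scales]
    for i in range(1, n_scales - 1):                    # first local maximum of the WLV curve
        if wlv[i] >= wlv[i - 1] and wlv[i] >= wlv[i + 1]:
            return scales[i]
    return scales[int(np.argmax(wlv))]                  # fall back to the global maximum

# Toy usage on a random image:
img = np.random.rand(64, 64)
print(select_scale(img))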
Jia, Yu; Ehlert, Ludwig; Wahlskog, Cecilia; Lundberg, Angela; Maurice, Christian
2017-12-05
Monitoring pollutants in stormwater discharge in cold climates is challenging. An environmental survey was performed by sampling the stormwater from Luleå Airport, Northern Sweden, during the period 2010-2013, when urea was used as a main component of aircraft deicing/anti-icing fluids (ADAFs). The stormwater collected from the runway was led through an oil trap to an infiltration pond to store excess water during precipitation periods and enhance infiltration and water treatment. Due to insufficient capacity, an emergency spillway was established and equipped with a flow meter and an automatic sampler. This study proposes a program for effective monitoring of pollutant discharge with a minimum number of sampling occasions when use of automatic samplers is not possible. The results showed that 90% of nitrogen discharge occurs during late autumn before the water pipes freeze and during snow melting, regardless of the precipitation during the remaining months when the pollutant discharge was negligible. The concentrations of other constituents in the discharge were generally low compared to guideline values. The best data quality was obtained using flow controlled sampling. Intensive time-controlled sampling during late autumn (few weeks) and snow melting (2 weeks) would be sufficient for necessary information. The flow meters installed at the rectangular notch appeared to be difficult to calibrate and gave contradictory results. Overall, the spillway was dry, as water infiltrated into the pond, and stagnant water close to the edge might be registered as flow. Water level monitoring revealed that the infiltration capacity gradually decreased with time.
Communication: Multiple atomistic force fields in a single enhanced sampling simulation
NASA Astrophysics Data System (ADS)
Hoang Viet, Man; Derreumaux, Philippe; Nguyen, Phuong H.
2015-07-01
The main concerns of biomolecular dynamics simulations are the convergence of the conformational sampling and the dependence of the results on the force fields. While the first issue can be addressed by employing enhanced sampling techniques such as simulated tempering or replica exchange molecular dynamics, repeating these simulations with different force fields is very time consuming. Here, we propose an automatic method that includes different force fields in a single advanced sampling simulation. Conformational sampling using three all-atom force fields is enhanced by simulated tempering and, by formulating the weight parameters of the simulated tempering method in terms of the energy fluctuations, the system is able to perform a random walk in both temperature and force-field spaces. The method is first demonstrated on a 1D system and then validated by the folding of the 10-residue chignolin peptide in explicit water.
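One standard way to set such weights is the trapezoidal estimate in which the weight increment between neighbouring temperatures equals the inverse-temperature difference times the mean of the two average potential energies; the Python snippet below is a sketch under assumed units, and the paper's exact expressions may differ.

import numpy as np

kB = 0.0019872041  # Boltzmann constant in kcal/(mol*K); the unit choice is an assumption

def tempering_weights(temperatures_K, mean_energies):
    """Simulated-tempering weights from average potential energies at each temperature."""
    beta = 1.0 / (kB * np.asarray(temperatures_K, dtype=float))
    E = np.asarray(mean_energies, dtype=float)
    w = np.zeros(len(beta))
    for m in range(len(beta) - 1):
        # Trapezoidal increment so that visits to the temperatures become roughly uniform.
        w[m + 1] = w[m] + (beta[m + 1] - beta[m]) * 0.5 * (E[m] + E[m + 1])
    return w

# Invented example values:
print(tempering_weights([300, 320, 340], [-1500.0, -1460.0, -1420.0]))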
Rugged large volume injection for sensitive capillary LC-MS environmental monitoring
NASA Astrophysics Data System (ADS)
Roberg-Larsen, Hanne; Abele, Silvija; Demir, Deniz; Dzabijeva, Diana; Amundsen, Sunniva F.; Wilson, Steven R.; Bartkevics, Vadims; Lundanes, Elsa
2017-08-01
A rugged and high-throughput capillary column (cLC) LC-MS switching platform using large volume injection and on-line automatic filtration and filter back-flush (AFFL) solid phase extraction (SPE) for analysis of environmental water samples with minimal sample preparation is presented. Although narrow columns and on-line sample preparation are used in the platform, high ruggedness is achieved; e.g., injection of 100 non-filtered water samples did not result in a pressure rise or clogging of the SPE/capillary columns (inner diameter 300 µm). In addition, satisfactory retention time stability and chromatographic resolution were also features of the system. The potential of the platform for environmental water samples was demonstrated with various pharmaceutical products, which had detection limits (LOD) in the 0.05-12.5 ng/L range. Between-day and within-day repeatability of selected analytes was < 20% RSD.
Automatic liquid handling for life science: a critical review of the current state of the art.
Kong, Fanwei; Yuan, Liang; Zheng, Yuan F; Chen, Weidong
2012-06-01
Liquid handling plays a pivotal role in life science laboratories. In experiments such as gene sequencing, protein crystallization, antibody testing, and drug screening, liquid biosamples frequently must be transferred between containers of varying sizes and/or dispensed onto substrates of varying types. The sample volumes are usually small, at the micro- or nanoliter level, and the number of transferred samples can be huge when investigating large-scope combinatorial conditions. Under these conditions, liquid handling by hand is tedious, time-consuming, and impractical. Consequently, there is a strong demand for automated liquid-handling methods such as sensor-integrated robotic systems. In this article, we survey the current state of the art in automatic liquid handling, including technologies developed by both industry and research institutions. We focus on methods for dealing with small volumes at high throughput and point out challenges for future advancements.
Automatic detection of spermatozoa for laser capture microdissection.
Vandewoestyne, Mado; Van Hoofstat, David; Van Nieuwerburgh, Filip; Deforce, Dieter
2009-03-01
In sexual assault crimes, differential extraction of spermatozoa from vaginal swab smears is often ineffective, especially when only a few spermatozoa are present in an overwhelming amount of epithelial cells. Laser capture microdissection (LCM) enables the precise separation of spermatozoa and epithelial cells. However, standard sperm-staining techniques are non-specific and rely on sperm morphology for identification. Moreover, manual screening of the microscope slides is time-consuming and labor-intensive. Here, we describe an automated screening method to detect spermatozoa stained with Sperm HY-LITER. Different ratios of spermatozoa and epithelial cells were used to assess the automatic detection method. In addition, real postcoital samples were also screened. Detected spermatozoa were isolated using LCM and DNA analysis was performed. Robust DNA profiles without allelic dropout could be obtained from as little as 30 spermatozoa recovered from postcoital samples, showing that the staining had no significant influence on DNA recovery.
Automatic measurements and computations for radiochemical analyses
Rosholt, J.N.; Dooley, J.R.
1960-01-01
In natural radioactive sources the most important radioactive daughter products useful for geochemical studies are protactinium-231, the alpha-emitting thorium isotopes, and the radium isotopes. To resolve the abundances of these thorium and radium isotopes by their characteristic decay and growth patterns, a large number of repeated alpha activity measurements on the two chemically separated elements were made over extended periods of time. Alpha scintillation counting with automatic measurements and sample changing is used to obtain the basic count data. Generation of the required theoretical decay and growth functions, varying with time, and the least squares solution of the overdetermined simultaneous count rate equations are done with a digital computer. Examples of the complex count rate equations which may be solved and results of a natural sample containing four α-emitting isotopes of thorium are illustrated. These methods facilitate the determination of the radioactive sources on the large scale required for many geochemical investigations.
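The least-squares step can be illustrated with a minimal Python sketch: count rates measured at several times are modelled as a linear combination of known decay functions and the component abundances are recovered with numpy's least-squares solver. The two-component half-lives and data below are invented for illustration and ignore daughter ingrowth.

import numpy as np

def solve_abundances(times_h, count_rates, decay_constants_per_h):
    # Design matrix: column j holds exp(-lambda_j * t) evaluated at each measurement time.
    A = np.exp(-np.outer(times_h, decay_constants_per_h))
    x, residuals, rank, _ = np.linalg.lstsq(A, count_rates, rcond=None)
    return x  # initial count rate of each component

t = np.array([0.0, 12.0, 24.0, 48.0, 96.0, 168.0])
lam = np.array([np.log(2) / 25.5, np.log(2) / 87.5])   # hypothetical half-lives in hours
true = np.array([40.0, 15.0])
rates = np.exp(-np.outer(t, lam)) @ true + np.random.default_rng(0).normal(0, 0.3, t.size)
print(solve_abundances(t, rates, lam))                  # recovers approximately [40, 15]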
Pulse-echo ultrasonic imaging method for eliminating sample thickness variation effects
NASA Technical Reports Server (NTRS)
Roth, Don J. (Inventor)
1995-01-01
A pulse-echo, immersion method for ultrasonic evaluation of a material is discussed. It accounts for and eliminates nonlevelness in the equipment set-up and sample thickness variation effects, and employs a single transducer, automatic scanning and digital imaging to obtain an image of a property of the material, such as pore fraction. The nonlevelness and thickness variation effects are accounted for by pre-scan adjustments of the time window to ensure that the echoes received at each scan point are gated in the center of the window. This information is input into the scan file so that, during the automatic scanning for the material evaluation, each received echo is centered in its time window. A cross-correlation function calculates the velocity at each scan point, which is then proportionalized to a color or grey scale and displayed on a video screen.
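The cross-correlation step can be sketched in a few lines of Python: the lag at the correlation peak between a front-surface reference echo and the back-surface echo gives the round-trip time, which converts to velocity using the (assumed known) thickness. The waveform, sampling rate and thickness below are synthetic.

import numpy as np

def echo_delay_s(ref, sig, fs_hz):
    """Delay of `sig` relative to `ref` estimated from the cross-correlation peak."""
    xcorr = np.correlate(sig, ref, mode="full")
    lag = np.argmax(xcorr) - (len(ref) - 1)
    return lag / fs_hz

fs = 100e6                                   # 100 MHz digitiser (assumed)
t = np.arange(2048) / fs
pulse = np.exp(-((t - 2e-6) / 0.1e-6) ** 2) * np.sin(2 * np.pi * 10e6 * t)
round_trip = 1.0e-6                          # back-surface echo arrives 1 microsecond later
echo = np.exp(-((t - 2e-6 - round_trip) / 0.1e-6) ** 2) * np.sin(2 * np.pi * 10e6 * (t - round_trip))
thickness_m = 3e-3
velocity = 2 * thickness_m / echo_delay_s(pulse, echo, fs)
print(f"{velocity:.0f} m/s")                 # ~6000 m/s for this synthetic case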
Wang, Qinghua; Ri, Shien; Tsuda, Hiroshi; Kodera, Masako; Suguro, Kyoichi; Miyashita, Naoto
2017-09-19
Quantitative detection of defects in atomic structures is of great significance to evaluating product quality and exploring quality improvement process. In this study, a Fourier transform filtered sampling Moire technique was proposed to visualize and detect defects in atomic arrays in a large field of view. Defect distributions, defect numbers and defect densities could be visually and quantitatively determined from a single atomic structure image at low cost. The effectiveness of the proposed technique was verified from numerical simulations. As an application, the dislocation distributions in a GaN/AlGaN atomic structure in two directions were magnified and displayed in Moire phase maps, and defect locations and densities were detected automatically. The proposed technique is able to provide valuable references to material scientists and engineers by checking the effect of various treatments for defect reduction. © 2017 IOP Publishing Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd
Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the procedures established were carried out manually, including sample registration. The samples were recorded manually in a logbook and given ID numbers. Then all samples, standards, SRMs and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective way to replace redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.
Unanticipated error in HbA(1c) measurement on the HLC-723 G7 analyzer.
van den Ouweland, Johannes M W; de Keijzer, Marinus H; van Daal, Henny
2010-04-01
Investigation of falsely elevated HbA(1c) measurements on the HLC-723 G7 analyser. Comparison of HbA(1c) in blood samples that were diluted either in hemolysis reagent or in water. HbA(1c) results became falsely elevated when samples were diluted in hemolysis reagent, but not in water. QC procedures failed to detect this error because calibrator and QC samples were manually diluted in water, according to the manufacturer's instructions, whereas patient samples were automatically diluted using hemolysing reagent. After replacement of the instrument's sample loop and rotor seal, comparable HbA(1c) results were obtained, irrespective of dilution with hemolysing reagent or water. This case illustrates the importance of treating calibrator and QC materials similarly to routine patient samples in order to prevent unnoticed drift in patient HbA(1c) results. Copyright 2010 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Instruction manual, optical effects module electronic controller and processor, model OEMCP
NASA Technical Reports Server (NTRS)
1975-01-01
The OEM-1 electronic module is discussed; it is comprised of four subsystems: the signal processing and display; the stepper motor controls; the chopper controls; and the dc-dc invertor. The OEM-1 module controls the sample wheel so that the relative transmittance of the samples can be compared to the clear aperture position. The 3-1/2 digit digital voltmeter displays the clear aperture signal level as well as the ratio of the remaining sample positions relative to the clear aperture position. The sample wheel position is decoded so that the signals and ratios can be correlated to the data. The OEM is automatically reset to the I_0 position on initial turn-on and can be reset to the '0' position by actuating a front panel switch. The sample wheel can be interrupted to change samples or induce a longer integration time if desired by a front panel command. Integration times from 1 - 50 seconds are provided at the front panel, and BCD data for external interfacing is provided.
NASA Astrophysics Data System (ADS)
Raza, Shan-e.-Ahmed; Marjan, M. Q.; Arif, Muhammad; Butt, Farhana; Sultan, Faisal; Rajpoot, Nasir M.
2015-03-01
One of the main factors for high workload in pulmonary pathology in developing countries is the relatively large proportion of tuberculosis (TB) cases which can be detected with high throughput using automated approaches. TB is caused by Mycobacterium tuberculosis, which appears as thin, rod-shaped acid-fast bacillus (AFB) in Ziehl-Neelsen (ZN) stained sputum smear samples. In this paper, we present an algorithm for automatic detection of AFB in digitized images of ZN stained sputum smear samples under a light microscope. A key component of the proposed algorithm is the enhancement of raw input image using a novel anisotropic tubular filter (ATF) which suppresses the background noise while simultaneously enhancing strong anisotropic features of AFBs present in the image. The resulting image is then segmented using color features and candidate AFBs are identified. Finally, a support vector machine classifier using morphological features from candidate AFBs decides whether a given image is AFB positive or not. We demonstrate the effectiveness of the proposed ATF method with two different feature sets by showing that the proposed image analysis pipeline results in higher accuracy and F1-score than the same pipeline with standard median filtering for image enhancement.
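A high-level Python sketch of such a screening pipeline is shown below, with scikit-image's Frangi ridge filter standing in for the paper's anisotropic tubular filter, a simple saturation threshold for the colour segmentation, and an SVM over basic morphological features; all thresholds are placeholders, and the classifier is assumed to have been trained beforehand on labelled candidate objects.

import numpy as np
from skimage.color import rgb2gray, rgb2hsv
from skimage.filters import frangi
from skimage.measure import label, regionprops
from sklearn.svm import SVC

def candidate_features(rgb_image):
    """Morphological features of candidate rod-like, stained objects."""
    enhanced = frangi(rgb2gray(rgb_image))            # stand-in for the anisotropic tubular filter
    hsv = rgb2hsv(rgb_image)
    # Candidate mask: strong tubular response AND sufficiently saturated (stained) pixels.
    mask = (enhanced > enhanced.mean() + 2 * enhanced.std()) & (hsv[..., 1] > 0.3)
    feats = []
    for region in regionprops(label(mask)):
        feats.append([region.area, region.eccentricity,
                      region.major_axis_length, region.minor_axis_length])
    return np.asarray(feats) if feats else np.empty((0, 4))

def image_is_afb_positive(rgb_image, trained_svm: SVC):
    feats = candidate_features(rgb_image)
    if len(feats) == 0:
        return False
    return bool(trained_svm.predict(feats).any())     # any candidate classified as AFB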
Automatic calibration and control system for a combined oxygen and combustibles analyzer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woolbert, G.D.; Jewett, S.Y.; Robertson, J.W. Jr.
1989-08-01
This patent describes an automatic, periodically calibrating system for continuous output of calibrated signals from a combined oxygen and combustibles analyzer. It comprises: a combined oxygen and combustibles analyzer for sensing a level of oxygen and a level of combustibles in a volatile atmosphere and for producing a first sample signal indicative of the oxygen level and a second sample signal indicative of the combustibles level; means for introducing zero and span calibration test gases into the analyzer; and means for periodically calibrating the analyzer. The latter includes: a data control unit; a timer unit; a mechanical unit; means for calculating zero and span values for oxygen and combustibles; means for comparing the calculated zero and span values for oxygen and combustibles to the preset alarm limits for oxygen and combustibles; means for activating an operator alarm; means for calculating oxygen and combustibles drift adjustments; a memory unit; and means for applying the oxygen and combustibles drift adjustments concurrently to the first and second sample signals, according to a predetermined mathematical relationship, to obtain calibrated output signals indicative of the oxygen and combustibles levels in the volatile atmosphere.
Simple automatic strategy for background drift correction in chromatographic data analysis.
Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin
2016-06-03
Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
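The strategy summarised above can be sketched in Python as follows: local minima form the initial baseline vector, anchor points sitting on peaks are iteratively recognised as outliers and pulled back to a smoothed baseline estimate, and the converged baseline is expanded over the whole chromatogram by linear interpolation. The smoother, tolerances and iteration limits are assumptions, not the paper's exact choices.

import numpy as np

def _moving_average(v, w=5):
    return np.convolve(v, np.ones(w) / w, mode="same")

def correct_drift(signal, max_iter=100):
    signal = np.asarray(signal, dtype=float)
    x = np.arange(len(signal))
    # 1. Local minima (plus the end points) form the initial baseline vector.
    is_min = (signal[1:-1] <= signal[:-2]) & (signal[1:-1] <= signal[2:])
    idx = np.concatenate(([0], np.where(is_min)[0] + 1, [len(signal) - 1]))
    base = signal[idx].copy()
    # 2. Iteratively recognise anchors that belong to peaks and pull them down.
    for _ in range(max_iter):
        smooth = _moving_average(base)
        resid = base - smooth
        outliers = resid > resid.mean() + 2.0 * resid.std()
        if not outliers.any():
            break
        base[outliers] = smooth[outliers]
    # 3. Expand the optimized baseline to every point and subtract it.
    baseline = np.interp(x, idx, base)
    return signal - baseline, baseline

# Toy chromatogram: two peaks on a linear drift.
t = np.linspace(0, 10, 500)
chrom = np.exp(-((t - 3) / 0.1) ** 2) + 0.5 * np.exp(-((t - 7) / 0.15) ** 2) + 0.05 * t
corrected, baseline = correct_drift(chrom)
print(round(corrected.max(), 2))   # close to the undrifted peak height of 1.0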
Cascaded deep decision networks for classification of endoscopic images
NASA Astrophysics Data System (ADS)
Murthy, Venkatesh N.; Singh, Vivek; Sun, Shanhui; Bhattacharya, Subhabrata; Chen, Terrence; Comaniciu, Dorin
2017-02-01
Both traditional and wireless capsule endoscopes can generate tens of thousands of images for each patient. It is desirable to have the majority of irrelevant images filtered out by automatic algorithms during an offline review process or to have automatic indication for highly suspicious areas during an online guidance. This also applies to the newly invented endomicroscopy, where online indication of tumor classification plays a significant role. Image classification is a standard pattern recognition problem and is well studied in the literature. However, performance on the challenging endoscopic images still has room for improvement. In this paper, we present a novel Cascaded Deep Decision Network (CDDN) to improve image classification performance over standard Deep neural network based methods. During the learning phase, CDDN automatically builds a network which discards samples that are classified with high confidence scores by a previously trained network and concentrates only on the challenging samples which would be handled by the subsequent expert shallow networks. We validate CDDN using two different types of endoscopic imaging, which includes a polyp classification dataset and a tumor classification dataset. From both datasets we show that CDDN can outperform other methods by about 10%. In addition, CDDN can also be applied to other image classification problems.
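The cascade logic at test time can be sketched framework-agnostically (Python/NumPy): each stage keeps the predictions it is confident about and forwards the remaining hard samples to the next, more specialised network. The models are assumed to expose a predict_proba-style interface, and the thresholds are placeholders.

import numpy as np

def cascade_predict(stages, X):
    """stages: list of (model, confidence_threshold); X: array of shape (n_samples, n_features)."""
    n = len(X)
    labels = np.full(n, -1, dtype=int)          # -1 = not yet decided
    pending = np.arange(n)
    for model, threshold in stages:
        if len(pending) == 0:
            break
        proba = model.predict_proba(X[pending])
        conf = proba.max(axis=1)
        decided = conf >= threshold
        labels[pending[decided]] = proba[decided].argmax(axis=1)
        pending = pending[~decided]             # only hard samples move to the next stage
    if len(pending):                            # the last stage decides any leftovers regardless
        proba = stages[-1][0].predict_proba(X[pending])
        labels[pending] = proba.argmax(axis=1)
    return labels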
NASA Astrophysics Data System (ADS)
Miao, Zelang
2017-04-01
Currently, urban dwellers comprise more than half of the world's population and this percentage is still dramatically increasing. The explosive urban growth over the next two decades will have a profound long-term impact on people as well as the environment. Accurate and up-to-date delineation of urban settlements plays a fundamental role in defining planning strategies and in supporting sustainable development of urban settlements. In order to provide adequate data about urban extents and land covers, classifying satellite data has become a common practice, usually with sufficiently accurate results. Indeed, a number of supervised learning methods have proven effective in urban area classification, but they usually depend on a large number of training samples, whose collection is a time-consuming and labor-intensive task. This issue becomes particularly serious when classifying large areas at the regional/global level. As an alternative to manual ground truth collection, in this work we use geo-referenced social media data. Cities and densely populated areas are extremely fertile ground for the production of individual geo-referenced data (such as GPS and social network data). Training samples derived from geo-referenced social media have several advantages: they are easy to collect; they are usually freely exploitable; and, finally, data from social media are spatially available in many locations, and without doubt in most urban areas around the world. Despite these advantages, the selection of training samples from social media faces two challenges: 1) there are many duplicated points; and 2) a method is required to automatically label them as "urban/non-urban". The objective of this research is to validate automatic sample selection from geo-referenced social media and its applicability to one-class classification for urban extent mapping from satellite images. The findings in this study shed new light on social media applications in the field of remote sensing.
Sako, Alysson V F; Dolzan, Maressa D; Micke, Gustavo Amadeu
2015-09-01
This paper describes a fast and sensitive method for the determination of methyl, ethyl, propyl, and butylparaben in hair samples by capillary electrophoresis using an automatic reverse electrode polarity stacking mode. In the proposed method, solutions are injected using the flush command of the analysis software (940 mbar) and the polarity switching is carried out automatically immediately after the sample injection. The advantages compared with conventional stacking methods are the increased analytical frequency, repeatability, and inter-day precision. All analyses were performed in a fused silica capillary (50 cm, 41.5 cm effective length, 50 μm i.d.), and the background electrolyte was composed of 20 mmol/L sodium tetraborate containing 10% methanol, pH 9.3. For the reverse polarity step, -25 kV was applied for 35 s, followed by application of +30 kV for the electrophoretic run. The temperature was set at 20 °C, and all analytes were monitored at 297 nm. The method showed acceptable linearity (r² > 0.997) in the studied range of 0.1-5.0 mg/L, limits of detection below 0.017 mg/L, and inter-day, intra-day, and instrumental precision better than 6.2, 3.6, and 4.6%, respectively. Considering that parabens are widely used as preservatives in many products, and the reported possibility of damage to hair and to human health caused by these compounds, the proposed method was applied to evaluate the adsorption of parabens in hair samples. The results indicate that there is greater adsorption of methylparaben compared to the other parabens tested, and also that dyed hair had a greater adsorption capacity for parabens than natural hair.
Iancu, I; Bodner, E; Joubran, S; Ben Zion, I; Ram, E
2015-05-01
Social Anxiety Disorder (SAD) has been repeatedly shown to be very prevalent in the Western society and is characterized by low self-esteem, pessimism, procrastination and also perfectionism. Very few studies on SAD have been done in the Middle East or in Arab countries, and no study tackled the relationship between social anxiety symptoms and perfectionism in non-Western samples. We examined social anxiety symptoms and perfectionism in a group of 132 Israeli Jewish (IJ) and Israeli Arab (IA) students. Subjects completed the Liebowitz Social Anxiety Scale (LSAS), the Multidimensional Perfectionism Scale (MPS), the Negative Automatic Thoughts Questionnaire (ATQ-N), the Positive Automatic Thoughts Questionnaire (ATQ-P) and a socio-demographic questionnaire. The rate of SAD in our sample according to a LSAS score of 60 or more was 17.2% (IJ=13.8%, IA=19%, ns). The correlation between perfectionism and the LSAS was high in both groups, and in particular in the IJ group. The IA group had higher scores of social avoidance, of ATQ-P and of two of the MPS subscales: parental expectations and parental criticism. Concern over mistakes and negative automatic thoughts positively predicted social fear in the IJ group, whereas in the IA group being female, religious and less educated positively predicted social fear. Negative automatic thoughts and age positively predicted social avoidance in the IJ group. In general, the IJ and IA subjects showed higher social anxiety, higher ATQ-N scores and lower parental expectations as compared with non-clinical US samples. Social anxiety symptoms and perfectionism are prevalent in Arab and Jewish students in Israel and seem to be closely related. Further studies among non-western minority groups may detect cultural influences on social anxiety and might add to the growing body of knowledge on this intriguing condition. Copyright © 2014 Elsevier Inc. All rights reserved.
Digital movie-based on automatic titrations.
Lima, Ricardo Alexandre C; Almeida, Luciano F; Lyra, Wellington S; Siqueira, Lucas A; Gaião, Edvaldo N; Paiva Junior, Sérgio S L; Lima, Rafaela L F C
2016-01-15
This study proposes the use of digital movies (DMs) in a flow-batch analyzer (FBA) to perform automatic, fast and accurate titrations. The term used for this process is "Digital movie-based on automatic titrations" (DMB-AT). A webcam records the DM during the addition of the titrant to the mixing chamber (MC). While the DM is recorded, it is decompiled into frames ordered sequentially at a constant rate of 26 frames per second (FPS). The first frame is used as a reference to define the region of interest (ROI) of 28×13 pixels and the R, G and B values, which are used to calculate the Hue (H) values for each frame. The Pearson's correlation coefficient (r) is calculated between the H values of the initial frame and each subsequent frame. The titration curves are plotted in real time using the r values and the opening time of the titrant valve. The end point is estimated by the second derivative method. Software written in the C language manages all analytical steps and data treatment in real time. The feasibility of the method was demonstrated by application to acid/base test samples and edible oils. Results were compared with classical titration and did not present statistically significant differences when the paired t-test at the 95% confidence level was applied. The proposed method is able to process about 117-128 samples per hour for the test and edible oil samples, respectively, and its precision was confirmed by overall relative standard deviation (RSD) values, always less than 1.0%. Copyright © 2015 Elsevier B.V. All rights reserved.
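The frame-processing chain can be sketched in Python as below: the ROI pixels of each frame are converted from RGB to Hue, the Pearson r between the first frame's hue values and the current frame's is computed, and the end point is taken at the extremum of the second derivative of the r-versus-time curve. ROI placement, frame handling and the end-point rule details are simplified assumptions.

import colorsys
import numpy as np

def roi_hues(frame, top, left, h=13, w=28):
    """Per-pixel hue values of the ROI of an RGB frame, scaled to [0, 1]."""
    roi = frame[top:top + h, left:left + w, :3].astype(float) / 255.0
    return np.array([colorsys.rgb_to_hsv(*px)[0] for px in roi.reshape(-1, 3)])

def titration_curve(frames, top, left, fps=26.0):
    """Pearson r between the reference (first) frame's hues and each frame's hues."""
    ref = roi_hues(frames[0], top, left)
    times, r_values = [], []
    for k, frame in enumerate(frames):
        cur = roi_hues(frame, top, left)
        r_values.append(np.corrcoef(ref, cur)[0, 1])
        times.append(k / fps)
    return np.array(times), np.array(r_values)

def end_point(times, r_values):
    """End-point time at the extremum of the second derivative of the r curve."""
    d2 = np.gradient(np.gradient(r_values, times), times)
    return times[int(np.argmax(np.abs(d2)))]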
Scherer, Sebastian; Kowal, Julia; Chami, Mohamed; Dandey, Venkata; Arheit, Marcel; Ringler, Philippe; Stahlberg, Henning
2014-05-01
The introduction of direct electron detectors (DED) to cryo-electron microscopy has tremendously increased the signal-to-noise ratio (SNR) and quality of the recorded images. We discuss the optimal use of DEDs for cryo-electron crystallography, introduce a new automatic image processing pipeline, and demonstrate the vast improvement in the resolution achieved by the use of both together, especially for highly tilted samples. The new processing pipeline (now included in the software package 2dx) exploits the high SNR and frame readout frequency of DEDs to automatically correct for beam-induced sample movement, and reliably processes individual crystal images without human interaction as data are being acquired. A new graphical user interface (GUI) condenses all information required for quality assessment in one window, allowing the imaging conditions to be verified and adjusted during the data collection session. With this new pipeline an automatically generated unit cell projection map of each recorded 2D crystal is available less than 5 min after the image was recorded. The entire processing procedure yielded a three-dimensional reconstruction of the 2D-crystallized ion-channel membrane protein MloK1 with a much-improved resolution of 5Å in-plane and 7Å in the z-direction, within 2 days of data acquisition and simultaneous processing. The results obtained are superior to those delivered by conventional photographic film-based methodology of the same sample, and demonstrate the importance of drift-correction. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Automatic cortical thickness analysis on rodent brain
NASA Astrophysics Data System (ADS)
Lee, Joohwi; Ehlers, Cindy; Crews, Fulton; Niethammer, Marc; Budin, Francois; Paniagua, Beatriz; Sulik, Kathy; Johns, Josephine; Styner, Martin; Oguz, Ipek
2011-03-01
Localized difference in the cortex is one of the most useful morphometric traits in human and animal brain studies. There are many tools and methods already developed to automatically measure and analyze cortical thickness for the human brain. However, these tools cannot be directly applied to rodent brains due to the different scales; even adult rodent brains are 50 to 100 times smaller than those of humans. This paper describes an algorithm for automatically measuring the cortical thickness of mouse and rat brains. The algorithm consists of three steps: segmentation, thickness measurement, and statistical analysis among experimental groups. The segmentation step provides the neocortex separation from other brain structures and thus is a preprocessing step for the thickness measurement. In the thickness measurement step, the thickness is computed by solving a Laplacian PDE and a transport equation. The Laplacian PDE first creates streamlines as an analogy of cortical columns; the transport equation computes the length of the streamlines. The result is stored as a thickness map over the neocortex surface. For the statistical analysis, it is important to sample thickness at corresponding points. This is achieved by the particle correspondence algorithm which minimizes entropy between dynamically moving sample points called particles. Since the computational cost of the correspondence algorithm may limit the number of corresponding points, we use thin-plate spline based interpolation to increase the number of corresponding sample points. As a driving application, we measured the thickness difference to assess the effects of adolescent intermittent ethanol exposure that persist into adulthood and performed a t-test between the control and exposed rat groups. We found significantly differing regions in both hemispheres.
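A 2-D toy sketch of the Laplace-based thickness measurement is given below: the potential is relaxed between the inner and outer cortical boundaries, and thickness at a point is the length of the streamline traced along the normalised gradient in both directions (the paper solves an additional transport PDE instead of explicit tracing). Grid geometry, step sizes and iteration counts are assumptions.

import numpy as np

def solve_laplace(cortex, inner, outer, n_iter=2000):
    """Jacobi relaxation of u with u=0 on the inner boundary and u=1 on the outer boundary."""
    u = np.where(outer, 1.0, 0.0)
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(cortex, avg, u)          # relax only inside the cortical ribbon
        u[inner], u[outer] = 0.0, 1.0         # re-impose boundary conditions
    return u

def streamline_length(u, cortex, start, direction, step=0.2, max_steps=5000):
    gy, gx = np.gradient(u)
    y, x = float(start[0]), float(start[1])
    length = 0.0
    for _ in range(max_steps):
        iy, ix = int(round(y)), int(round(x))
        inside = 0 <= iy < u.shape[0] and 0 <= ix < u.shape[1]
        if not inside or not cortex[iy, ix]:
            break
        g = np.array([gy[iy, ix], gx[iy, ix]])
        norm = np.linalg.norm(g)
        if norm < 1e-12:
            break
        y += direction * step * g[0] / norm
        x += direction * step * g[1] / norm
        length += step
    return length

def thickness_at(u, cortex, point):
    """Streamline length toward the outer boundary plus toward the inner one."""
    return (streamline_length(u, cortex, point, +1.0) +
            streamline_length(u, cortex, point, -1.0))

# Synthetic annular "cortex" of width ~10 pixels.
yy, xx = np.mgrid[0:80, 0:80]
r = np.hypot(yy - 40, xx - 40)
inner, outer = r <= 15, r >= 25
cortex = ~inner & ~outer
u = solve_laplace(cortex, inner, outer)
print(round(thickness_at(u, cortex, (40, 60)), 1))   # roughly 10 pixels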
ATMAD: robust image analysis for Automatic Tissue MicroArray De-arraying.
Nguyen, Hoai Nam; Paveau, Vincent; Cauchois, Cyril; Kervrann, Charles
2018-04-19
Over the last two decades, an innovative technology called Tissue Microarray (TMA), which combines multi-tissue and DNA microarray concepts, has been widely used in the field of histology. It consists of a collection of several (up to 1000 or more) tissue samples that are assembled onto a single support - typically a glass slide - according to a design grid (array) layout, in order to allow multiplex analysis by treating numerous samples under identical and standardized conditions. However, during the TMA manufacturing process, the sample positions can be highly distorted from the design grid due to the imprecision when assembling tissue samples and the deformation of the embedding waxes. Consequently, these distortions may lead to severe errors of (histological) assay results when the sample identities are mismatched between the design and its manufactured output. The development of a robust method for de-arraying TMA, which localizes and matches TMA samples with their design grid, is therefore crucial to overcome the bottleneck of this prominent technology. In this paper, we propose an Automatic, fast and robust TMA De-arraying (ATMAD) approach dedicated to images acquired with brightfield and fluorescence microscopes (or scanners). First, tissue samples are localized in the large image by applying a locally adaptive thresholding on the isotropic wavelet transform of the input TMA image. To reduce false detections, a parametric shape model is considered for segmenting ellipse-shaped objects at each detected position. Segmented objects that do not meet the size and the roundness criteria are discarded from the list of tissue samples before being matched with the design grid. Sample matching is performed by estimating the TMA grid deformation under the thin-plate model. Finally, thanks to the estimated deformation, the true tissue samples that were preliminarily rejected in the early image processing step are recognized by running a second segmentation step. We developed a novel de-arraying approach for TMA analysis. By combining wavelet-based detection, active contour segmentation, and thin-plate spline interpolation, our approach is able to handle TMA images with high dynamic range, poor signal-to-noise ratio, complex background and non-linear deformation of the TMA grid. In addition, the deformation estimation produces quantitative information to assess the manufacturing quality of TMAs.
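The grid-matching step can be sketched with SciPy's thin-plate-spline RBF interpolator: a smooth deformation is estimated from a few already-paired anchor positions, every design position is mapped into image coordinates, and the nearest detection within a tolerance is assigned to it. The pairing strategy and tolerance below are simplifications, not the ATMAD implementation itself.

import numpy as np
from scipy.interpolate import RBFInterpolator

def fit_deformation(design_anchor_xy, image_anchor_xy):
    """Thin-plate-spline map from design-grid coordinates to image coordinates."""
    return RBFInterpolator(np.asarray(design_anchor_xy),
                           np.asarray(image_anchor_xy),
                           kernel="thin_plate_spline")

def match_grid(design_xy, detected_xy, deformation, max_dist=40.0):
    """Assign each design position the nearest detection of its predicted image location."""
    predicted = deformation(np.asarray(design_xy))
    detected = np.asarray(detected_xy)
    matches = []
    for p in predicted:
        d = np.linalg.norm(detected - p, axis=1)
        j = int(np.argmin(d))
        matches.append(j if d[j] <= max_dist else None)   # None = missing or rejected core
    return predicted, matches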
Lassahn, Gordon D.; Lancaster, Gregory D.; Apel, William A.; Thompson, Vicki S.
2013-01-08
Image portion identification methods, image parsing methods, image parsing systems, and articles of manufacture are described. According to one embodiment, an image portion identification method includes accessing data regarding an image depicting a plurality of biological substrates corresponding to at least one biological sample and indicating presence of at least one biological indicator within the biological sample and, using processing circuitry, automatically identifying a portion of the image depicting one of the biological substrates but not others of the biological substrates.
Electro-Chemical-Mechanical, Low Stress, Automatic Polishing (ECMP) Device (Preprint)
2010-01-01
[Only fragmentary abstract text is available for this entry: surface preparation steps are critical to the imaging of ceramic and hybrid samples; the Ti 2p 3/2 XPS peak is initially observed at 458.4 eV, indicating the presence of titanium; and average image-quality (IQ) values for the processed titanium samples were above acceptable limits (higher than 2000).]
Compact, Non-Pneumatic Rock-Powder Samplers
NASA Technical Reports Server (NTRS)
Sherrit, Stewart; Bar-Cohen, Yoseph; Badescu, Mircea; Bao, Xiaoqi; Chang, Zensheu; Jones, Christopher; Aldrich, Jack
2008-01-01
Tool bits that automatically collect powdered rock, permafrost, or other hard material generated in repeated hammering action have been invented. The present invention pertains to the special case in which it is desired to collect samples in powder form for analysis by x-ray diffraction and possibly other techniques. The present invention eliminates the need for both the mechanical collection equipment and the crushing chamber and the pneumatic collection equipment of prior approaches, so that it becomes possible to make the overall sample-acquisition apparatus more compact.
NASA Technical Reports Server (NTRS)
Gialdini, M.; Titus, S. J.; Nichols, J. D.; Thomas, R.
1975-01-01
An approach to information acquisition is discussed in the context of meeting user-specified needs in a cost-effective, timely manner through the use of remote sensing data, ground data, and multistage sampling techniques. The roles of both LANDSAT imagery and Skylab photography are discussed as first stages of three separate multistage timber inventory systems and results are given for each system. Emphasis is placed on accuracy and meeting user needs.
A system for programming experiments and for recording and analyzing data automatically
Herrick, Robert M.; Denelsbeck, John S.
1963-01-01
A system designed for use in complex operant conditioning experiments is described. Some of its key features are: (a) plugboards that permit the experimenter to change either from one program to another or from one analysis to another in less than a minute, (b) time-sharing of permanently-wired, electronic logic components, (c) recordings suitable for automatic analyses. Included are flow diagrams of the system and sample logic diagrams for programming experiments and for analyzing data.
Implementation of a microcontroller-based semi-automatic coagulator.
Chan, K; Kirumira, A; Elkateeb, A
2001-01-01
The coagulator is an instrument used in hospitals to detect clot formation as a function of time. Generally, these coagulators are very expensive and therefore not affordable for doctors' offices and small clinics. The objective of this project is to design and implement a low-cost semi-automatic coagulator (SAC) prototype. The SAC is capable of assaying up to 12 samples and can perform the following tests: prothrombin time (PT), activated partial thromboplastin time (APTT), and PT/APTT combination. The prototype has been tested successfully.
NASA Astrophysics Data System (ADS)
Larin, A. B.; Kolegov, A. V.
2012-10-01
Results of industrial tests of the new method used for the automatic chemical control of the quality of boiler water of a drum-type power boiler (P_d = 13.8 MPa) are described. The possibility of using an H-cationite column for measuring the electric conductivity of an H-cationized sample of boiler water over a long period of time is shown.
Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J
2001-08-01
The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that contrary to prediction strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.
Automatic cross-sectioning and monitoring system locates defects in electronic devices
NASA Technical Reports Server (NTRS)
Jacobs, G.; Slaughter, B.
1971-01-01
System consists of motorized grinding and lapping apparatus, sample holder, and electronic control circuit. Low power microscope examines device to pinpoint location of circuit defect, and monitor displays output signal when defect is located exactly.
NASA Astrophysics Data System (ADS)
Assoumani, Azziz; Margoum, Christelle; Guillemain, Céline; Coquery, Marina
2014-05-01
The monitoring of water bodies regarding organic contaminants, and the determination of reliable estimates of concentrations, are challenging issues, in particular for the implementation of the Water Framework Directive. Several strategies can be applied to collect water samples for the determination of their contamination level. Grab sampling is fast, easy, and requires few logistical and analytical resources in the case of low-frequency sampling campaigns. However, this technique lacks representativeness for streams with high variations of contaminant concentrations, such as pesticides in rivers located in small agricultural watersheds. Increasing the representativeness of this sampling strategy implies greater logistical needs and higher analytical costs. Average automated sampling is therefore a solution as it allows, in a single analysis, the determination of more accurate and more relevant estimates of concentrations. Two types of automatic sampling can be performed: time-related sampling allows the assessment of average concentrations, whereas flow-dependent sampling leads to average flux concentrations. However, the purchase and the maintenance of automatic samplers are quite expensive. Passive sampling has recently been developed as an alternative to grab or average automated sampling, to obtain, at lower cost, more realistic estimates of the average concentrations of contaminants in streams. These devices allow the passive accumulation of contaminants from large volumes of water, resulting in ultratrace level detection and smoothed integrative sampling over periods ranging from days to weeks. They allow the determination of time-weighted average (TWA) concentrations of the dissolved fraction of target contaminants, but they need to be calibrated in controlled conditions prior to field applications. In other words, the kinetics of the uptake of the target contaminants into the sampler must be studied in order to determine the corresponding sampling rate constants (Rs). Each constant links the mass of a target contaminant accumulated in the sampler to its concentration in water. At the end of the field application, the Rs are used to calculate the TWA concentration of each target contaminant from the final mass of the contaminants accumulated in the sampler. Stir Bar Sorptive Extraction (SBSE) is a solvent-free sample preparation technique dedicated to the analysis of moderately hydrophobic to hydrophobic compounds in liquid and gas samples. It is composed of a magnet enclosed in a glass tube coated with a thick film of polydimethylsiloxane (PDMS). We recently developed the in situ application of SBSE as a passive sampling technique (herein named "Passive SBSE") for the monitoring of agricultural pesticides. The aim of this study is to perform the calibration of the passive SBSE in the laboratory, and to apply and compare this technique to active sampling strategies for the monitoring of 16 relatively hydrophobic to hydrophobic pesticides in streams, during two 1-month sampling campaigns. Time-weighted average (TWA) concentrations of the target pesticides obtained from passive SBSE were compared to the target pesticide concentrations of grab samples, and of time-related and flow-dependent samples of the streams. The results showed that passive SBSE is an efficient alternative to conventional active sampling strategies.
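The concentration calculation behind passive sampling is a one-line formula; the hedged Python example below uses invented numbers purely to illustrate C_TWA = m / (Rs × t).

def twa_concentration_ng_per_L(mass_ng, rs_L_per_day, days):
    """C_TWA = m / (Rs * t): accumulated mass over sampling rate times deployment time."""
    return mass_ng / (rs_L_per_day * days)

# e.g. 25 ng accumulated, Rs = 0.05 L/day, 14-day deployment (illustrative values only):
print(twa_concentration_ng_per_L(25.0, 0.05, 14.0))   # about 35.7 ng/L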
Application of Magnetic Nanoparticles in Pretreatment Device for POPs Analysis in Water
NASA Astrophysics Data System (ADS)
Chu, Dongzhi; Kong, Xiangfeng; Wu, Bingwei; Fan, Pingping; Cao, Xuan; Zhang, Ting
2018-01-01
In order to reduce the processing time and labour required for POPs pretreatment, and to solve the problem of extraction columns becoming easily clogged, this paper proposes a new extraction and enrichment technology based on magnetic nanoparticles. The automatic pretreatment system consists of an automatic sampling unit, an extraction and enrichment unit, and an elution and enrichment unit. The paper briefly introduces the preparation of the magnetic nanoparticles and describes in detail the structure and control system of the automatic pretreatment system. Mass-recovery experiments showed that the system is capable of POPs analysis preprocessing and that the recovery rate of the magnetic nanoparticles was over 70%. In conclusion, the authors propose three optimization recommendations.
Automatic patient dose registry and clinical audit on line for mammography.
Ten, J I; Vano, E; Sánchez, R; Fernandez-Soto, J M
2015-07-01
The use of automatic registry systems for patient dose in digital mammography allows clinical audit and patient dose analysis of the whole sample of individual mammography exposures while fulfilling the requirements of the European Directives and other international recommendations. Further parameters associated with radiation exposure (tube voltage, X-ray tube output and HVL values for different kVp and target/filter combinations, breast compression, etc.) should be periodically verified and used to evaluate patient doses. This study presents an experience in routine clinical practice for mammography using automatic systems. © The Author 2015. Published by Oxford University Press. All rights reserved.
High-throughput microcoil NMR of compound libraries using zero-dispersion segmented flow analysis.
Kautz, Roger A; Goetzinger, Wolfgang K; Karger, Barry L
2005-01-01
An automated system for loading samples into a microcoil NMR probe has been developed using segmented flow analysis. This approach enhanced 2-fold the throughput of the published direct injection and flow injection methods, improved sample utilization 3-fold, and was applicable to high-field NMR facilities with long transfer lines between the sample handler and NMR magnet. Sample volumes of 2 microL (10-30 mM, approximately 10 microg) were drawn from a 96-well microtiter plate by a sample handler, then pumped to a 0.5-microL microcoil NMR probe as a queue of closely spaced "plugs" separated by an immiscible fluorocarbon fluid. Individual sample plugs were detected by their NMR signal and automatically positioned for stopped-flow data acquisition. The sample in the NMR coil could be changed within 35 s by advancing the queue. The fluorocarbon liquid wetted the wall of the Teflon transfer line, preventing the DMSO samples from contacting the capillary wall and thus reducing sample losses to below 5% after passage through the 3-m transfer line. With a wash plug of solvent between samples, sample-to-sample carryover was <1%. Significantly, the samples did not disperse into the carrier liquid during loading or during acquisitions of several days for trace analysis. For automated high-throughput analysis using a 16-second acquisition time, spectra were recorded at a rate of 1.5 min/sample and total deuterated solvent consumption was <0.5 mL (1 US dollar) per 96-well plate.
Sources of error in estimating truck traffic from automatic vehicle classification data
DOT National Transportation Integrated Search
1998-10-01
Truck annual average daily traffic estimation errors resulting from sample classification counts are computed in this paper under two scenarios. One scenario investigates an improper factoring procedure that may be used by highway agencies. The study...
1995-09-15
The Large Isothermal Furnace (LIF) was flown on a mission in cooperation with the National Space Development Agency (NASDA) of Japan. LIF is a vacuum-heating furnace designed to heat large samples uniformly. The furnace consists of a sample container and heating element surrounded by a vacuum chamber. A crewmember will insert a sample cartridge into the furnace. The furnace will be activated and operations will be controlled automatically by a computer in response to an experiment number entered on the control panel. At the end of operations, helium will be discharged into the furnace, allowing cooling to start. Cooling will occur through the use of a water jacket while rapid cooling of samples can be accomplished through a controlled flow of helium. Data from experiments will help scientists better understand this important process which is vital to the production of high-quality semiconductor crystals.
Storm-water data for Bear Creek basin, Jackson County, Oregon 1977-78
Wittenberg, Loren A.
1978-01-01
Storm-water-quality samples were collected from four subbasins in the Bear Creek basin in southern Oregon. These subbasins vary in drainage size, channel slope, effective impervious area, and land use. Automatic water-quality samplers and precipitation and discharge gages were set up in each of the four subbasins. During the period October 1977 through May 1978, 19 sets of samples, including two base-flow samples, were collected. Fecal coliform bacteria colonies per 100-milliliter sample ranged from less than 1,000 to more than 1,000,000. Suspended-sediment concentrations ranged from less than 1 to more than 2,300 milligrams per liter. One subbasin consisting of downtown businesses and streets with heavy vehicular traffic was monitored for lead. Total lead values ranging from 100 to 1,900 micrograms per liter were measured during one storm event.
OpenMSI Arrayed Analysis Tools v2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
BOWEN, BENJAMIN; RUEBEL, OLIVER; DE ROND, TRISTAN
2017-02-07
Mass spectrometry imaging (MSI) enables high-resolution spatial mapping of biomolecules in samples and is a valuable tool for the analysis of tissues from plants and animals, microbial interactions, high-throughput screening, drug metabolism, and a host of other applications. This is accomplished by desorbing molecules from the surface at spatially defined locations, using a laser or ion beam. These ions are analyzed by a mass spectrometer and collected into an MSI 'image', a dataset containing unique mass spectra from the sampled spatial locations. MSI is used in a diverse and increasing number of biological applications. The OpenMSI Arrayed Analysis Tool (OMAAT) is a new software method that addresses the challenges of analyzing spatially defined samples in large MSI datasets by providing support for automatic sample position optimization and ion selection.
Sampling probe for microarray read out using electrospray mass spectrometry
Van Berkel, Gary J.
2004-10-12
An automated electrospray based sampling system and method for analysis obtains samples from surface array spots having analytes. The system includes at least one probe, the probe including an inlet for flowing at least one eluting solvent to respective ones of a plurality of spots and an outlet for directing the analyte away from the spots. An automatic positioning system is provided for translating the probe relative to the spots to permit sampling of any spot. An electrospray ion source having an input fluidicly connected to the probe receives the analyte and generates ions from the analyte. The ion source provides the generated ions to a structure for analysis to identify the analyte, preferably being a mass spectrometer. The probe can be a surface contact probe, where the probe forms an enclosing seal along the periphery of the array spot surface.
Forest inventory using multistage sampling with probability proportional to size. [Brazil
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Lee, D. C. L.; Hernandezfilho, P.; Shimabukuro, Y. E.; Deassis, O. R.; Demedeiros, J. S.
1984-01-01
A multistage sampling technique, with probability proportional to size, for forest volume inventory using remote sensing data is developed and evaluated. The study area is located in southeastern Brazil. The LANDSAT 4 digital data of the study area are used in the first stage for automatic classification of reforested areas. Four classes of pine and eucalypt with different tree volumes are classified utilizing a maximum likelihood classification algorithm. Color infrared aerial photographs are utilized in the second stage of sampling. In the third stage (ground level) the timber volume of each class is determined. The total timber volume of each class is expanded through a statistical procedure taking into account all three stages of sampling. This procedure results in an accurate timber volume estimate with a smaller number of aerial photographs and reduced time in field work.
Automatic initialization for 3D bone registration
NASA Astrophysics Data System (ADS)
Foroughi, Pezhman; Taylor, Russell H.; Fichtinger, Gabor
2008-03-01
In image-guided bone surgery, sample points collected from the surface of the bone are registered to the preoperative CT model using well-known registration methods such as Iterative Closest Point (ICP). These techniques are generally very sensitive to the initial alignment of the datasets. Poor initialization significantly increases the chances of getting trapped in local minima. In order to reduce the risk of local minima, the registration is manually initialized by locating the sample points close to the corresponding points on the CT model. In this paper, we present an automatic initialization method that aligns the sample points collected from the surface of the pelvis with the CT model of the pelvis. The main idea is to exploit a mean shape of the pelvis created from a large number of CT scans as prior knowledge to guide the initial alignment. The mean shape is constant for all registrations and facilitates the inclusion of application-specific information into the registration process. The CT model is first aligned with the mean shape using the bilateral symmetry of the pelvis and the similarity of multiple projections. The surface points collected using ultrasound are then aligned with the pelvis mean shape. This will, in turn, lead to initial alignment of the sample points with the CT model. The experiments using a dry pelvis and two cadavers show that the method can align the randomly displaced datasets close enough for successful registration. The standard ICP has been used for final registration of datasets.
Exploring geo-tagged photos for land cover validation with deep learning
NASA Astrophysics Data System (ADS)
Xing, Hanfa; Meng, Yuan; Wang, Zixuan; Fan, Kaixuan; Hou, Dongyang
2018-07-01
Land cover validation plays an important role in the process of generating and distributing land cover thematic maps, and is usually implemented through costly sample interpretation with remotely sensed images or field surveys. With the increasing availability of geo-tagged landscape photos, automatic photo recognition methodologies, e.g., deep learning, can be effectively utilised for land cover applications. However, they have hardly been utilised in validation processes, as challenges remain in sample selection and classification for highly heterogeneous photos. This study proposes an approach that employs geo-tagged photos for land cover validation by using deep learning technology. The approach first identifies photos automatically based on the VGG-16 network. Then, samples for validation are selected and further classified by considering photo distribution and classification probabilities. The implementation was conducted for the validation of the GlobeLand30 land cover product in a heterogeneous area, western California. Experimental results showed promise for land cover validation: GlobeLand30 showed an overall accuracy of 83.80% with classified samples, which was close to the validation result of 80.45% based on visual interpretation. Additionally, the performance of deep learning based on ResNet-50 and AlexNet was also quantified, revealing no substantial differences in the final validation results. The proposed approach ensures geo-tagged photo quality and supports the sample classification strategy by considering photo distribution, with an accuracy improvement from 72.07% to 79.33% compared with considering only the single nearest photo. Consequently, the presented approach demonstrates the feasibility of deep learning technology for identifying land cover information in geo-tagged photos, and has great potential to support and improve the efficiency of land cover validation.
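The photo-recognition step described above builds on a standard pretrained VGG-16. As a rough sketch of that building block only (not the authors' trained network, training data, or land cover class mapping), classifying a single geo-tagged photo with torchvision might look like the following; the file name and the use of ImageNet weights are assumptions for illustration.

```python
# Minimal sketch: score one geo-tagged landscape photo with a pretrained VGG-16.
# Mapping predicted labels to land cover classes is deliberately left out.
import torch
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.eval()

def classify_photo(path):
    """Return softmax class probabilities for one photo."""
    img = Image.open(path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)        # shape (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return torch.softmax(logits, dim=1).squeeze(0)

probs = classify_photo("photo_0001.jpg")        # hypothetical file name
print("top-1 class index:", int(probs.argmax()), "p =", float(probs.max()))
```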
NASA Astrophysics Data System (ADS)
Pressl, B.; Laiho, K.; Chen, H.; Günthner, T.; Schlager, A.; Auchter, S.; Suchomel, H.; Kamp, M.; Höfling, S.; Schneider, C.; Weihs, G.
2018-04-01
Semiconductor alloys of aluminum gallium arsenide (AlGaAs) exhibit strong second-order optical nonlinearities. This makes them prime candidates for the integration of devices for classical nonlinear optical frequency conversion or photon-pair production, for example, through the parametric down-conversion (PDC) process. Within this material system, Bragg-reflection waveguides (BRW) are a promising platform, but the specifics of the fabrication process and the peculiar optical properties of the alloys require careful engineering. Previously, BRW samples have been mostly derived analytically from design equations using a fixed set of aluminum concentrations. This approach limits the variety and flexibility of the device design. Here, we present a comprehensive guide to the design and analysis of advanced BRW samples and show how to automatize these tasks. Then, nonlinear optimization techniques are employed to tailor the BRW epitaxial structure towards a specific design goal. As a demonstration of our approach, we search for the optimal effective nonlinearity and mode overlap which indicate an improved conversion efficiency or PDC pair production rate. However, the methodology itself is much more versatile as any parameter related to the optical properties of the waveguide, for example the phasematching wavelength or modal dispersion, may be incorporated as design goals. Further, we use the developed tools to gain a reliable insight in the fabrication tolerances and challenges of real-world sample imperfections. One such example is the common thickness gradient along the wafer, which strongly influences the photon-pair rate and spectral properties of the PDC process. Detailed models and a better understanding of the optical properties of a realistic BRW structure are not only useful for investigating current samples, but also provide important feedback for the design and fabrication of potential future turn-key devices.
Borges, Chad R
2007-07-01
A chemometrics-based data analysis concept has been developed as a substitute for manual inspection of extracted ion chromatograms (XICs), which facilitates rapid, analyst-mediated interpretation of GC- and LC/MS(n) data sets from samples undergoing qualitative batchwise screening for prespecified sets of analytes. Automatic preparation of data into two-dimensional row space-derived scatter plots (row space plots) eliminates the need to manually interpret hundreds to thousands of XICs per batch of samples while keeping all interpretation of raw data directly in the hands of the analyst-saving great quantities of human time without loss of integrity in the data analysis process. For a given analyte, two analyte-specific variables are automatically collected by a computer algorithm and placed into a data matrix (i.e., placed into row space): the first variable is the ion abundance corresponding to scan number x and analyte-specific m/z value y, and the second variable is the ion abundance corresponding to scan number x and analyte-specific m/z value z (a second ion). These two variables serve as the two axes of the aforementioned row space plots. In order to collect appropriate scan number (retention time) information, it is necessary to analyze, as part of every batch, a sample containing a mixture of all analytes to be tested. When pure standard materials of tested analytes are unavailable, but representative ion m/z values are known and retention time can be approximated, data are evaluated based on two-dimensional scores plots from principal component analysis of small time range(s) of mass spectral data. The time-saving efficiency of this concept is directly proportional to the percentage of negative samples and to the total number of samples processed simultaneously.
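As a minimal sketch of how such a row space plot could be assembled, assuming each sample's raw data has already been read into an array of ion abundances indexed by scan number and m/z channel (the scan number and the two m/z channels below are hypothetical placeholders, not values from the study):

```python
# Sketch of a "row space plot" for one analyte across a batch of samples.
# Each mock sample is a 2-D array abundance[scan, mz_channel].
import numpy as np
import matplotlib.pyplot as plt

def row_space_point(abundance, scan, mz_y, mz_z):
    """Return (abundance of ion y, abundance of ion z) at the analyte's scan."""
    return abundance[scan, mz_y], abundance[scan, mz_z]

rng = np.random.default_rng(0)
batch = [rng.random((500, 300)) for _ in range(40)]   # 40 synthetic samples

scan, mz_y, mz_z = 212, 58, 91                        # illustrative analyte settings
points = np.array([row_space_point(s, scan, mz_y, mz_z) for s in batch])

plt.scatter(points[:, 0], points[:, 1])
plt.xlabel("abundance, characteristic ion y")
plt.ylabel("abundance, characteristic ion z")
plt.title("Row space plot (one analyte, one batch)")
plt.show()
```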
Lv, Shidong; Wu, Yuanshuang; Zhou, Jiangsheng; Lian, Ming; Li, Changwen; Xu, Yongquan; Liu, Shunhang; Wang, Chao; Meng, Qingxiong
2014-01-01
The quality of tea is presently evaluated by the sensory assessment of professional tea tasters; however, this approach is both inconsistent and inaccurate. A more standardized and efficient method is urgently needed to objectively evaluate tea quality. In this study, the chemical fingerprints of 7 different Dayi Pu-erh tea brands and 3 different Ya'an tea brands on the market were analyzed using fully automatic headspace solid-phase microextraction (HS-SPME) combined with gas chromatography-mass spectrometry (GC–MS). A total of 78 volatiles were separated, among which 75 were identified by GC–MS in the seven Dayi Pu-erh teas; the major chemical components included methoxyphenolic compounds, hydrocarbons, and alcohol compounds, such as 1,2,3-trimethoxybenzene, 1,2,4-trimethoxybenzene, 2,6,10,14-tetramethyl-pentadecane, linalool and its oxides, α-terpineol, and phytol. The overlapping ratio of peaks (ORP) of the chromatogram in the seven Dayi Pu-erh tea samples was greater than 89.55%, whereas the ORP of the Ya'an tea samples was less than 79.10%. The similarities and differences of the Dayi Pu-erh tea samples were also characterized using correlation coefficient similarity and principal component analysis (PCA). The results showed that the correlation coefficient of similarity of the seven Dayi Pu-erh tea samples was greater than 0.820 and that the samples gathered in a specific area of the PCA space, indicating that samples from different brands were basically the same despite some slight differences in chemical indexes. These results showed that the GC-MS fingerprint combined with the PCA approach can be used as an effective tool for the quality assessment and control of Pu-erh tea. PMID:25551231
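As an illustration of the correlation-coefficient similarity measure mentioned above, a minimal sketch on synthetic chromatographic fingerprints (resampled onto a common time axis; not the tea data from the study) could be:

```python
# Sketch: Pearson correlation similarity between chromatographic fingerprints.
import numpy as np

rng = np.random.default_rng(1)
reference = rng.random(2000)                               # mock reference fingerprint
samples = [reference + 0.05 * rng.standard_normal(2000) for _ in range(7)]

for i, s in enumerate(samples, start=1):
    r = np.corrcoef(reference, s)[0, 1]                    # similarity coefficient
    print(f"sample {i}: similarity r = {r:.3f}")
```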
Potthast, Nadine; Neuner, Frank; Catani, Claudia
2017-01-03
A growing body of research attempts to clarify the underlying mechanisms of the association between emotional maltreatment and alcohol dependence (AD). In a preceding study, we found considerable support for a specific priming effect in subjects with AD and emotional abuse experiences receiving alcohol rehabilitation treatment. We concluded that maltreatment related cues can automatically activate an associative memory network comprising cues eliciting craving as well as alcohol-related responses. Generalizability of the results to other treatment settings remains unclear because of considerable differences in German treatment settings as well as insufficiently clarified influences of selection effects. As replication studies in other settings are necessary, the current study aimed to replicate the specific priming effect in a qualified detoxification sample. 22 AD subjects (n = 10 with emotional abuse vs. n = 12 without emotional abuse) participated in a priming experiment. Comparison data from 34 healthy control subjects were derived from the prior study. Contrary to our hypothesis, we did not find a specific priming effect. We could not replicate the result of an automatic network activation by maltreatment related words in a sample of subjects with AD and emotional abuse experiences receiving qualified detoxification treatment. This discrepancy might be attributed to reasons related to treatment settings as well as to methodological limitations. Future work is required to determine the generalizability of the specific priming effect before valid conclusions regarding automatic activation can be drawn.
OMNY PIN—A versatile sample holder for tomographic measurements at room and cryogenic temperatures
NASA Astrophysics Data System (ADS)
Holler, M.; Raabe, J.; Wepf, R.; Shahmoradian, S. H.; Diaz, A.; Sarafimov, B.; Lachat, T.; Walther, H.; Vitins, M.
2017-11-01
Nowadays ptychographic tomography in the hard x-ray regime, i.e., at energies above about 2 keV, is a well-established measurement technique. At the Paul Scherrer Institut, currently two instruments are available: one is measuring at room temperature and atmospheric pressure, and the other, the so-called OMNY (tOMography Nano crYo) instrument, is operating at ultra-high vacuum and offering cryogenic sample temperatures down to 10 K. In this manuscript, we present the sample mounts that were developed for these instruments. Aside from excellent mechanical stability and thermal conductivity, they also offer highly reproducible mounting. Various types were developed for different kinds of samples and are presented in detail, including examples of how specimens can be mounted on these holders. We also show the first hard x-ray ptychographic tomography measurements of high-pressure frozen biological samples, in the present case Chlamydomonas cells, the related sample pins and preparation steps. For completeness, we present accessories such as transportation containers for both room temperature and cryogenic samples and a gripper mechanism for automatic sample changing. The sample mounts are not limited to x-ray tomography or hard x-ray energies, and we believe that they can be very useful for other instrumentation projects.
Effects of urbanization on stream water quality in the city of Atlanta, Georgia, USA
Peters, N.E.
2009-01-01
A long-term stream water quality monitoring network was established in the city of Atlanta, Georgia during 2003 to assess baseline water quality conditions and the effects of urbanization on stream water quality. Routine hydrologically based manual stream sampling, including several concurrent manual point and equal width increment sampling, was conducted approximately 12 times annually at 21 stations, with drainage areas ranging from 3.7 to 232 km2. Eleven of the stations are real-time (RT) stations having continuous measures of stream stage/ discharge, pH, dissolved oxygen, specific conductance, water temperature and turbidity, and automatic samplers for stormwater collection. Samples were analyzed for field parameters, and a broad suite of water quality and sediment-related constituents. Field parameters and concentrations of major ions, metals, nutrient species and coliform bacteria among stations were evaluated and with respect to watershed characteristics and plausible sources from 2003 through September 2007. Most constituent concentrations are much higher than nearby reference streams. Concentrations are statistically different among stations for several constituents, despite high variability both within and among stations. Routine manual sampling, automatic sampling during stormflows and RT water quality monitoring provided sufficient information about urban stream water quality variability to evaluate causes of water quality differences among streams. Fecal coliform bacteria concentrations of most samples exceeded Georgia's water quality standard for any water-usage class. High chloride concentrations occur at three stations and are hypothesized to be associated with discharges of chlorinated combined sewer overflows, drainage of swimming pool(s) and dissolution and transport during rainstorms of CaCl2, a deicing salt applied to roads during winter storms. One stream was affected by dissolution and transport of ammonium alum [NH4Al(SO4)2] from an alum-manufacturing plant; streamwater has low pH (<5), low alkalinity and high metals concentrations. Several trace metals exceed acute and chronic water quality standards and high concentrations are attributed to washoff from impervious surfaces.
Code of Federal Regulations, 2010 CFR
2010-07-01
... substantial deviations from the design specifications of the sampler specified for reference methods in... general requirements as an ISO 9001-registered facility for the design and manufacture of designated... capable of automatically collecting a series of sequential samples. NO means nitrogen oxide. NO 2 means...
Reference guide for the soil compactor analyzer.
DOT National Transportation Integrated Search
2009-07-01
The Soil Compactor Analyzer (SCA) attaches to the automatic tamper used for Test Methods Tex-113-E and 114-E and uses rapid sampling of the hammer displacement to measure impact velocity. With the known mass of the hammer and the determined velocity,...
Shore, Sabrina; Henderson, Jordana M; Lebedev, Alexandre; Salcedo, Michelle P; Zon, Gerald; McCaffrey, Anton P; Paul, Natasha; Hogrefe, Richard I
2016-01-01
For most sample types, the automation of RNA and DNA sample preparation workflows enables high throughput next-generation sequencing (NGS) library preparation. Greater adoption of small RNA (sRNA) sequencing has been hindered by high sample input requirements and inherent ligation side products formed during library preparation. These side products, known as adapter dimer, are very similar in size to the tagged library. Most sRNA library preparation strategies thus employ a gel purification step to isolate tagged library from adapter dimer contaminants. At very low sample inputs, adapter dimer side products dominate the reaction and limit the sensitivity of this technique. Here we address the need for improved specificity of sRNA library preparation workflows with a novel library preparation approach that uses modified adapters to suppress adapter dimer formation. This workflow allows for lower sample inputs and elimination of the gel purification step, which in turn allows for an automatable sRNA library preparation protocol.
Extended Phase-Space Methods for Enhanced Sampling in Molecular Simulations: A Review.
Fujisaki, Hiroshi; Moritsugu, Kei; Matsunaga, Yasuhiro; Morishita, Tetsuya; Maragliano, Luca
2015-01-01
Molecular Dynamics simulations are a powerful approach to study biomolecular conformational changes or protein-ligand, protein-protein, and protein-DNA/RNA interactions. Straightforward applications, however, are often hampered by incomplete sampling, since in a typical simulated trajectory the system will spend most of its time trapped by high energy barriers in restricted regions of the configuration space. Over the years, several techniques have been designed to overcome this problem and enhance space sampling. Here, we review a class of methods that rely on the idea of extending the set of dynamical variables of the system by adding extra ones associated to functions describing the process under study. In particular, we illustrate the Temperature Accelerated Molecular Dynamics (TAMD), Logarithmic Mean Force Dynamics (LogMFD), and Multiscale Enhanced Sampling (MSES) algorithms. We also discuss combinations with techniques for searching reaction paths. We show the advantages of this approach and how it allows important regions of the free-energy landscape to be sampled quickly via automatic exploration.
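For concreteness, the extended-variable idea behind TAMD can be summarized by the standard formulation found in the literature (the notation below is ours, not necessarily the review's): each collective variable θ_j(x) is tethered to an auxiliary variable z_j by a harmonic spring, and the z variables are evolved at an elevated temperature to drive exploration.

```latex
% Extended potential and overdamped dynamics of the auxiliary variables z,
% evolved at a temperature \bar{T} \ge T (standard TAMD formulation).
\begin{equation}
  U_\kappa(x, z) \;=\; V(x) \;+\; \sum_{j=1}^{M} \frac{\kappa}{2}\,
  \bigl(\theta_j(x) - z_j\bigr)^2 ,
  \qquad
  \bar{\gamma}\,\dot{z}_j \;=\; \kappa\bigl(\theta_j(x) - z_j\bigr)
  \;+\; \sqrt{2\,\bar{\gamma}\,k_B \bar{T}}\;\eta_j(t) .
\end{equation}
```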
A method for feature selection of APT samples based on entropy
NASA Astrophysics Data System (ADS)
Du, Zhenyu; Li, Yihong; Hu, Jinsong
2018-05-01
By studying known APT attack events in depth, this paper proposes a feature selection method for APT samples and a logic-expression generation algorithm, IOCG (Indicator of Compromise Generate). The algorithm automatically generates machine-readable IOCs (Indicators of Compromise), addressing the limitations of existing IOCs, whose logical relationships are fixed, whose number of logical items cannot change, which are large in scale, and which cannot be generated directly from samples. At the same time, it reduces redundant and useless processing time for APT samples, improves the sharing rate of analysis information, and supports an active response to a complex and volatile APT attack situation. The samples were divided into an experimental set and a training set, and the algorithm was then used to generate logical expressions for the training set with the IOC_Aware plug-in; the generated expressions were compared against the detection results. The experimental results show that the algorithm is effective and can improve detection.
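The IOCG algorithm itself is not specified in enough detail here to reproduce. As a generic sketch of the kind of entropy-based feature scoring the title suggests (binary features over labeled samples, keeping the features with the highest information gain), one might write the following; the feature matrix and labels are synthetic placeholders.

```python
# Sketch: entropy-based scoring of binary features across labeled APT samples.
# Features with high information gain would be retained for IOC generation.
import numpy as np

def entropy(labels):
    """Shannon entropy H = -sum p*log2(p) over the label distribution."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    """H(labels) - H(labels | binary feature)."""
    gain = entropy(labels)
    for value in (0, 1):
        mask = feature == value
        if mask.any():
            gain -= mask.mean() * entropy(labels[mask])
    return gain

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 10))    # 200 samples x 10 binary features
y = rng.integers(0, 2, size=200)          # 0 = benign family, 1 = APT family

scores = [information_gain(X[:, j], y) for j in range(X.shape[1])]
keep = np.argsort(scores)[::-1][:3]       # keep the 3 most informative features
print("selected feature indices:", keep.tolist())
```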
Estimated content percentages of volatile liquids and fat extractables in ready-to-eat foods.
Daft, J L; Cline, J K; Palmer, R E; Sisk, R L; Griffitt, K R
1996-01-01
Content percentages of volatile liquids and fat extractables in 340 samples of ready-to-eat foods were determined gravimetrically. Volatile liquids were determined by drying samples in a microwave oven with a self-contained balance; results were printed out automatically. Fat extractables were extracted from the samples with mixed ethers; extracts were dried and weighed manually. The samples, 191 nonfat and 149 fatty (containing ca 2% or more fat) foods, represent about 5000 different food items and include infant and toddler, ethnic, fast, and imported items. Samples were initially prepared for screening of essential and toxic elements and chemical contamination by chopping and mixing into homogeneous composites. Content determinations were then made on separate portions from each composite. Content results were put into a database for evaluation. Overall, mean results from both determinations agree with published data for moisture and fat contents of similar food items. Coefficients of variation, however, were lower for determination of volatile liquids than for that of fat extractables.
Sobolevsky, Tim; Rodchenkov, Grigory
2010-01-01
Sulbutiamine (isobutyryl thiamine disulfide) is a lipophilic derivative of thiamine used for the treatment of asthenia and other related pathological conditions. It is available over-the-counter in several countries either as a component of nutritional supplements or as a pharmaceutical preparation. The presence of sulbutiamine in urinary doping control samples was monitored to evaluate the relevance of its use in sports. As one of the sulbutiamine metabolites has very close retention time and the same characteristic ion (m/z 194) as the main boldenone metabolite, the raw data files generated from the screening for anabolic steroids were automatically reprocessed to identify the samples containing sulbutiamine. It was found that of ca. 16 000 samples analyzed in the Russian laboratory during 2009, about 100 samples contained sulbutiamine. It is important to note that most of these samples were collected in-competition, and sulbutiamine concentration was estimated to be greater than 500 ng/ml. This may indicate that sulbutiamine was intentionally administered for its ergogenic and mild stimulating properties. Copyright © 2010 John Wiley & Sons, Ltd.
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
Automated Classification and Analysis of Non-metallic Inclusion Data Sets
NASA Astrophysics Data System (ADS)
Abdulsalam, Mohammad; Zhang, Tongsheng; Tan, Jia; Webler, Bryan A.
2018-05-01
The aim of this study is to utilize principal component analysis (PCA), clustering methods, and correlation analysis to condense and examine large, multivariate data sets produced from automated analysis of non-metallic inclusions. Non-metallic inclusions play a major role in defining the properties of steel and their examination has been greatly aided by automated analysis in scanning electron microscopes equipped with energy dispersive X-ray spectroscopy. The methods were applied to analyze inclusions on two sets of samples: two laboratory-scale samples and four industrial samples from near-finished 4140 alloy steel components with varying machinability. The laboratory samples had well-defined inclusion chemistries, composed of MgO-Al2O3-CaO, spinel (MgO-Al2O3), and calcium aluminate inclusions. The industrial samples contained MnS inclusions as well as (Ca,Mn)S + calcium aluminate oxide inclusions. PCA could be used to reduce inclusion chemistry variables to a 2D plot, which revealed inclusion chemistry groupings in the samples. Clustering methods were used to automatically classify inclusion chemistry measurements into groups, i.e., no user-defined rules were required.
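A minimal sketch of the PCA-plus-clustering step described above, using scikit-learn on a synthetic inclusion-chemistry table (the measured 4140 data set and the paper's exact preprocessing are not reproduced here):

```python
# Sketch: condense multivariate inclusion chemistries to 2-D PCA scores and
# group them automatically with k-means. Compositions below are mock data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
# columns: illustrative normalized MgO, Al2O3, CaO, MnS fractions per inclusion
X = np.vstack([
    rng.normal([0.4, 0.5, 0.1, 0.0], 0.03, size=(100, 4)),   # spinel-like group
    rng.normal([0.1, 0.4, 0.5, 0.0], 0.03, size=(100, 4)),   # calcium-aluminate-like
    rng.normal([0.0, 0.0, 0.1, 0.9], 0.03, size=(100, 4)),   # MnS-like
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("inclusions per cluster:", np.bincount(labels))
```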
Clerkin, Elise M; Teachman, Bethany A
2009-08-01
The current study tests cognitive-behavioral models of body dysmorphic disorder (BDD) by examining the relationship between cognitive biases and correlates of mirror gazing. To provide a more comprehensive picture, we investigated both relatively strategic (i.e., available for conscious introspection) and automatic (i.e., outside conscious control) measures of cognitive biases in a sample with either high (n = 32) or low (n = 31) BDD symptoms. Specifically, we examined the extent that (1) explicit interpretations tied to appearance, as well as (2) automatic associations and (3) strategic evaluations of the importance of attractiveness predict anxiety and avoidance associated with mirror gazing. Results indicated that interpretations tied to appearance uniquely predicted self-reported desire to avoid, whereas strategic evaluations of appearance uniquely predicted peak anxiety associated with mirror gazing, and automatic appearance associations uniquely predicted behavioral avoidance. These results offer considerable support for cognitive models of BDD, and suggest a dissociation between automatic and strategic measures.
Li, Qiang; Ou, Xi C; Pang, Yu; Xia, Hui; Huang, Hai R; Zhao, Bing; Wang, Sheng F; Zhao, Yan L
2017-07-01
The MiniLab tuberculosis (ML TB) assay is a new automatic diagnostic tool for the diagnosis of multidrug-resistant tuberculosis (MDR-TB). This study was conducted to evaluate the performance of the assay. Sputum samples were collected from 224 tuberculosis suspects seeking medical care at Beijing Chest Hospital. The sputum samples were used directly for smear and the ML TB test. The remaining sputum was used for Xpert MTB/RIF, Bactec MGIT culture and drug susceptibility testing (DST). All discrepancies between the results from DST, molecular and phenotypic methods were confirmed by DNA sequencing. The sensitivity and specificity of the ML TB test for detecting MTBC in TB suspects were 95.1% and 88.9%, respectively. The sensitivity for smear-negative TB suspects was 64.3%. For detection of RIF resistance, the sensitivity and specificity of the ML TB test were 89.2% and 95.7%, respectively. For detection of INH resistance, the sensitivity and specificity of the ML TB test were 78.3% and 98.1%, respectively. The ML TB test showed performance similar to Xpert MTB/RIF for detection of MTBC and RIF resistance. In addition, ML TB also had good performance for INH resistance detection. Copyright © 2017. Published by Elsevier Ltd.
Wang, Yi; Luo, Jie; Chen, Hengwu; He, Qiaohong; Gan, Nin; Li, Tianhua
2008-09-12
A novel chip-based flow injection analysis (FIA) system has been developed for automatic, rapid and selective determination of dopamine (DA) in the presence of ascorbic acid (AA). The system is composed of a polycarbonate (PC) microfluidic chip with an electrochemical detector (ED), a gravity pump, and an automatic sample loading and injection unit. The selectivity of the ED was improved by modification of the gold working microelectrode, which was fabricated on the PC chip by UV-directed electroless gold plating, with a self-assembled monolayer (SAM) of 3-mercaptopropionic acid (MPA). Postplating treatment methods for cleaning the surface of electroless gold microelectrodes were investigated to ensure the formation of high quality SAMs. The effects of detection potential, flow rate, and sampling volume on the performance of the chip-based FIA system were studied. Under optimum conditions, a detection limit of 74 nmol L(-1) for DA was achieved at the sample throughput rate of 180 h(-1). A RSD of 0.9% for peak heights was observed for 19 runs of a 100 micromol L(-1) DA solution. Interference-free determination of DA could be conducted if the concentration ratio of AA-DA was no more than 10.
Concept for tremor compensation for a handheld OCT-laryngoscope
NASA Astrophysics Data System (ADS)
Donner, Sabine; Deutsch, Stefanie; Bleeker, Sebastian; Ripken, Tammo; Krüger, Alexander
2013-06-01
Optical coherence tomography (OCT) is a non-invasive imaging technique which can create optical tissue sections, enabling diagnosis of vocal cord tissue. To take full advantage of the non-contact imaging technique, OCT was adapted to an indirect laryngoscope to work on awake patients. Using OCT in a handheld diagnostic device raises the challenges of rapid working-distance adjustment and tracking of axial motion. The optical focus of the endoscopic sample arm and the reference-arm length can be adjusted over a range of 40 mm to 90 mm. Automatic working-distance adjustment is based on image analysis of OCT B-scans, which identifies off-depth images as well as position errors. The movable focal plane and reference plane are used to adjust the working distance to match the sample depth and stabilise the sample in the desired axial position of the OCT scans. The autofocus adjusts the working distance within a maximum of 2.7 seconds for the maximum initial displacement of 40 mm. The amplitude of hand tremor during 60 s of handheld scanning was reduced to 50%, and it was shown that the image stabilisation keeps the position error below 0.5 mm. Fast automatic working-distance adjustment is crucial to minimise the duration of the diagnostic procedure. The image stabilisation compensates for relative axial movements during handheld scanning.
Zheng, Yue; Zhang, Chunxi; Li, Lijing; Song, Lailiang; Chen, Wen
2016-06-10
For a fiber-optic gyroscope (FOG) using electronic dithers to suppress the dead zone, without a fixed loop gain, the deterministic compensation for the dither signals in the control loop of the FOG cannot remain accurate, resulting in the dither residuals in the FOG rotation rate output and the navigation errors in the inertial navigation system. An all-digital automatic-gain-control method for stabilizing the loop gain of the FOG is proposed. By using a perturbation square wave to measure the loop gain of the FOG and adding an automatic gain control loop in the conventional control loop of the FOG, we successfully obtain the actual loop gain and make the loop gain converge to the reference value. The experimental results show that in the case of 20% variation in the loop gain, the dither residuals are successfully eliminated and the standard deviation of the FOG sampling outputs is decreased from 2.00 deg/h to 0.62 deg/h (sampling period 2.5 ms, 10 points smoothing). With this method, the loop gain of the FOG can be stabilized over the operation temperature range and in the long-time application, which provides a solid foundation for the engineering applications of the high-precision FOG.
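As a toy illustration of the idea described above (measure the loop gain from the response to a known injected perturbation, then drive it toward a reference value with an integral update), the following sketch converges a digital correction factor; the numbers and loop structure are illustrative assumptions, not the paper's implementation.

```python
# Sketch: digital automatic gain control. A known perturbation square wave is
# injected, its demodulated response gives the current loop gain, and an
# integral controller steers a digital gain word so that loop gain -> reference.
true_analog_gain = 1.2      # unknown drift the AGC must compensate (e.g. +20%)
reference_gain = 1.0        # desired overall loop gain
digital_gain = 1.0          # correction factor updated by the AGC
ki = 0.2                    # integral-control coefficient
perturbation = 0.01         # amplitude of the injected square wave

for step in range(50):
    measured = perturbation * true_analog_gain * digital_gain  # demodulated response
    loop_gain = measured / perturbation                        # estimated loop gain
    digital_gain += ki * (reference_gain - loop_gain)          # integral update

print(f"estimated loop gain after convergence: {loop_gain:.4f}")
```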
Solar exposure of sunglasses: aging test display
NASA Astrophysics Data System (ADS)
Gomes, L. M.; Masili, M.; Momesso, G. A.; Silva, F. M.; Ventura, L.
2018-02-01
In previous studies conducted in our lab, we have been investigating aging effects on sunglasses. Some preliminary results have indicated changes in the UV protection of the lenses. Therefore, besides irradiating the samples with a proper sun simulator, we have also been concerned with exposing the sunglasses to natural sun for further investigation and comparison. Thus, this project aims to expose the lenses for 24 months using an automatic solar exposure station, which consists of a series of 5 panels housing 60 lenses arranged vertically relative to the ground, which will be irradiated by the sun from sunrise until sunset. A box structure moves along a rail, driven by a motor, and the lenses are then exposed. Humidity, rain, temperature, dust and UV index sensors, as well as a video camera, are part of the system. The exposure time and UV index will be recorded, and automatic opening or closing of the box system may also be controlled by a PC using a webserver. The system was tested in working conditions, i.e. exposed to the weather and automatically controlled, for five months to certify that the samples could be exposed without being damaged. The next step of the research is to start the exposure cycles and to measure the expected transmittance variations after each cycle.
Automatic reconstruction of the muscle architecture from the superficial layer fibres data.
Kohout, Josef; Cholt, David
2017-10-01
Physiological cross-sectional area (PCSA) of a muscle plays a significant role in determining the force contribution of muscle fascicles to skeletal movement. This parameter is typically calculated from the lengths of muscle fibres selectively sampled from the superficial layer of the muscle. However, recent studies have found that the length of fibres in the superficial layer often differs significantly (p < 0.05) from the length of fibres in the deep layer. As a result, PCSA estimation is inaccurate. In this paper, we propose a method to automatically reconstruct fibres in the whole volume of a muscle from those selectively sampled on the superficial layer. The method performs a centripetal Catmull-Rom interpolation of the input fibres within the volume of a muscle represented by its 3D surface model, automatically distributing the fibres among multiple heads of the muscle and shortening the deep fibres to support large attachment areas with extremely acute angles. Our C++ implementation runs in a couple of seconds on commodity hardware providing realistic results for both artificial and real data sets we tested. The fibres produced by the method can be used directly to determine the personalised mechanical muscle functioning. Our implementation is publicly available for the researchers at https://mi.kiv.zcu.cz/. Copyright © 2017 Elsevier B.V. All rights reserved.
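The centripetal Catmull-Rom interpolation at the core of the method is a standard construction. A compact sketch of one spline segment between control points P1 and P2 is given below (the muscle-specific head assignment and deep-fibre shortening from the paper are not reproduced); the control points are mock fibre samples.

```python
# Centripetal Catmull-Rom segment between P1 and P2 (Barry-Goldman recursion).
# Assumes the four control points are distinct; alpha = 0.5 selects the
# centripetal parameterization.
import numpy as np

def catmull_rom_segment(P0, P1, P2, P3, n=20, alpha=0.5):
    def next_t(t, A, B):
        return t + np.linalg.norm(B - A) ** alpha
    t0 = 0.0
    t1 = next_t(t0, P0, P1)
    t2 = next_t(t1, P1, P2)
    t3 = next_t(t2, P2, P3)
    t = np.linspace(t1, t2, n).reshape(-1, 1)
    A1 = (t1 - t) / (t1 - t0) * P0 + (t - t0) / (t1 - t0) * P1
    A2 = (t2 - t) / (t2 - t1) * P1 + (t - t1) / (t2 - t1) * P2
    A3 = (t3 - t) / (t3 - t2) * P2 + (t - t2) / (t3 - t2) * P3
    B1 = (t2 - t) / (t2 - t0) * A1 + (t - t0) / (t2 - t0) * A2
    B2 = (t3 - t) / (t3 - t1) * A2 + (t - t1) / (t3 - t1) * A3
    return (t2 - t) / (t2 - t1) * B1 + (t - t1) / (t2 - t1) * B2

pts = [np.array(p, dtype=float) for p in
       [[0, 0, 0], [1, 0.5, 0.2], [2, 0.4, 0.6], [3, 1.0, 0.9]]]  # mock fibre points
curve = catmull_rom_segment(*pts)   # 20 interpolated points between pts[1] and pts[2]
print(curve[0], curve[-1])          # endpoints coincide with P1 and P2
```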
Automated Drug Identification for Urban Hospitals
NASA Technical Reports Server (NTRS)
Shirley, Donna L.
1971-01-01
Many urban hospitals are becoming overloaded with drug abuse cases requiring chemical analysis for identification of drugs. In this paper, the requirements for chemical analysis of body fluids for drugs are determined and a system model for automated drug analysis is selected. The system as modeled, would perform chemical preparation of samples, gas-liquid chromatographic separation of drugs in the chemically prepared samples, infrared spectrophotometric analysis of the drugs, and would utilize automatic data processing and control for drug identification. Requirements of cost, maintainability, reliability, flexibility, and operability are considered.
Digital Baseband Architecture For Transponder
NASA Technical Reports Server (NTRS)
Nguyen, Tien M.; Yeh, Hen-Geul
1995-01-01
Proposed advanced transponder for long-distance radio communication system with turnaround ranging contains carrier-signal-tracking loop including baseband digital "front end." For reduced cost, transponder includes analog intermediate-frequency (IF) section and analog automatic gain control (AGC) loop at first of two IF mixers. However, second IF mixer redesigned to ease digitization of baseband functions. To conserve power and provide for simpler and smaller transponder hardware, baseband digital signal-processing circuits designed to implement undersampling scheme. Furthermore, sampling scheme and sampling frequency chosen so redesign involves minimum modification of command-detector unit (CDU).
Kit for the rapid preparation of 99mTc red blood cells
Richards, Powell; Smith, Terry D.
1976-01-01
A method and sample kit for the preparation of 99mTc-labeled red blood cells in a closed, sterile system. A partially evacuated tube, containing a freeze-dried stannous citrate formulation with heparin as an anticoagulant, allows whole blood to be automatically drawn from the patient. The radioisotope is added at the end of the labeling sequence to minimize operator exposure. Consistent 97% yields in 20 minutes are obtained with small blood samples. Freeze-dried kits have remained stable after five months.
Applications of luminescent systems to infectious disease methodology
NASA Technical Reports Server (NTRS)
Picciolo, G. L.; Chappelle, E. W.; Deming, J. W.; Mcgarry, M. A.; Nibley, D. A.; Okrend, H.; Thomas, R. R.
1976-01-01
The characterization of a clinical sample by a simple, fast, accurate, automatable analytical measurement is important in the management of infectious disease. Luminescence assays offer methods rich with options for these measurements. The instrumentation is common to each assay, and the investment is reasonable. Three general procedures were developed to varying degrees of completeness which measure bacterial levels by measuring their ATP, FMN and iron porphyrins. Bacteriuria detection and antibiograms can be determined within half a day. The characterization of the sample for its soluble ATP, FMN or porphyrins was also performed.
Cassette bacteria detection system. [for monitoring the sterility of regenerated water in spacecraft
NASA Technical Reports Server (NTRS)
1974-01-01
The design, fabrication, and testing of an automatic bacteria detection system, with a zero-g capability, based on the filter-capable approach, and intended for monitoring the sterility of regenerated water in spacecraft is discussed. The principle of detection is based on measuring the increase in chemiluminescence produced by the action of bacterial porphyrins on a luminol-hydrogen peroxide mixture. Viable organisms are detected by comparing the signal of an incubated water sample with an unincubated control. High signals for the incubated water sample indicate the presence of viable organisms.
NASA Technical Reports Server (NTRS)
Nalepka, R. F. (Principal Investigator); Kauth, R. J.; Thomas, G. S.
1976-01-01
The author has identified the following significant results. A conceptual man machine system framework was created for a large scale agricultural remote sensing system. The system is based on and can grow out of the local recognition mode of LACIE, through a gradual transition wherein computer support functions supplement and replace AI functions. Local proportion estimation functions are broken into two broad classes: (1) organization of the data within the sample segment; and (2) identification of the fields or groups of fields in the sample segment.
Automatic photometric titrations of calcium and magnesium in carbonate rocks
Shapiro, L.; Brannock, W.W.
1955-01-01
Rapid nonsubjective methods have been developed for the determination of calcium and magnesium in carbonate rocks. From a single solution of the sample, calcium is titrated directly, and magnesium is titrated after a rapid removal of R2O3 and precipitation of calcium as the tungstate. A concentrated and a dilute solution of disodium ethylenediamine tetraacetate are used as titrants. The concentrated solution is added almost to the end point, then the weak solution is added in an automatic titrator to determine the end point precisely.
Development and evaluation of an automatic labeling technique for spring small grains
NASA Technical Reports Server (NTRS)
Crist, E. P.; Malila, W. A. (Principal Investigator)
1981-01-01
A labeling technique is described which seeks to associate a sampling entity with a particular crop or crop group based on similarity of growing season and temporal-spectral patterns of development. Human analysts provide contextual information, after which labeling decisions are made automatically. Results of a test of the technique on a large, multi-year data set are reported. Grain labeling accuracies are similar to those achieved by human analysis techniques, while non-grain accuracies are lower. Recommendations for improvements and implications of the test results are discussed.
Young, Stacie T.M.; Ball, Marcael T.J.
2003-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two sites, continuous streamflow data at three sites, and water-quality data at five sites, which include the three streamflow sites. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2002 and June 30, 2003. A total of 28 samples were collected over five storms during July 1, 2002 to June 30, 2003. For two of the five storms, five grab samples and three flow-weighted time-composite samples were collected. Grab samples were collected nearly simultaneously at all five sites, and flow-weighted time-composite samples were collected at the three sites equipped with automatic samplers. The other three storms were partially sampled, where only flow-weighted time-composite samples were collected and/or not all stations were sampled. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, copper, lead, and zinc). Grab samples were additionally analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and ensure proper cleaning of equipment.
Young, Stacie T.M.; Ball, Marcael T.J.
2004-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two sites, continuous streamflow data at three sites, and water-quality data at five sites, which include the three streamflow sites. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2003 and June 30, 2004. A total of 30 samples was collected over four storms during July 1, 2003 to June 30, 2004. In general, an attempt was made to collect grab samples nearly simultaneously at all five sites, and flow-weighted time-composite samples were collected at the three sites equipped with automatic samplers. However, all four storms were partially sampled because either not all stations were sampled or only grab samples were collected. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, copper, lead, and zinc). Grab samples were additionally analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples, collected during storms and during routine maintenance, were also collected to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-06-01
This Sampling and Analysis Plan addresses surface water monitoring, sampling, and analysis activities that will be conducted in support of the Environmental Monitoring Plan for Waste Area Grouping (WAG) 6. WAG 6 is a shallow-burial land disposal facility for low-level radioactive waste at the Oak Ridge National Laboratory, a research facility owned by the US Department of Energy and managed by Martin Marietta Energy Systems, Inc. Surface water monitoring will be conducted at nine sites within WAG 6. Activities to be conducted will include the installation, inspection, and maintenance of automatic flow-monitoring and sampling equipment and manual collection of various water and sediment samples. The samples will be analyzed for various organic, inorganic, and radiological parameters. The information derived from the surface water monitoring, sampling, and analysis will aid in evaluating risk associated with contaminants migrating off-WAG, and will be used in calculations to establish relationships between contaminant concentration (C) and flow (Q). The C-Q relationship will be used in calculating the cumulative risk associated with the off-WAG migration of contaminants.
Payne, G.A.
1983-01-01
Streamflow and suspended-sediment-transport data were collected in Garvin Brook watershed in Winona County, southeastern Minnesota, during 1982. The data collection was part of a study to determine the effectiveness of agricultural best-management practices designed to improve rural water quality. The study is part of a Rural Clean Water Program demonstration project undertaken by the U.S. Department of Agriculture. Continuous streamflow data were collected at three gaging stations during March through September 1982. Suspended-sediment samples were collected at two of the gaging stations. Samples were collected manually at weekly intervals. During periods of rapidly changing stage, samples were collected at 30-minute to 12-hour intervals by stage-activated automatic samplers. The samples were analyzed for suspended-sediment concentration and particle-size distribution. Particle-size distributions were also determined for one set of bed-material samples collected at each sediment-sampling site. The streamflow and suspended-sediment-concentration data were used to compute records of mean-daily flow, mean-daily suspended-sediment concentration, and daily suspended-sediment discharge. The daily records are documented and results of analyses for particle-size distribution and of vertical sampling in the stream cross sections are given.
Pyne, Saumyadipta; Lee, Sharon X; Wang, Kui; Irish, Jonathan; Tamayo, Pablo; Nazaire, Marc-Danie; Duong, Tarn; Ng, Shu-Kay; Hafler, David; Levy, Ronald; Nolan, Garry P; Mesirov, Jill; McLachlan, Geoffrey J
2014-01-01
In biomedical applications, an experimenter encounters different potential sources of variation in data such as individual samples, multiple experimental conditions, and multivariate responses of a panel of markers such as from a signaling network. In multiparametric cytometry, which is often used for analyzing patient samples, such issues are critical. While computational methods can identify cell populations in individual samples, without the ability to automatically match them across samples, it is difficult to compare and characterize the populations in typical experiments, such as those responding to various stimulations or distinctive of particular patients or time-points, especially when there are many samples. Joint Clustering and Matching (JCM) is a multi-level framework for simultaneous modeling and registration of populations across a cohort. JCM models every population with a robust multivariate probability distribution. Simultaneously, JCM fits a random-effects model to construct an overall batch template--used for registering populations across samples, and classifying new samples. By tackling systems-level variation, JCM supports practical biomedical applications involving large cohorts. Software for fitting the JCM models have been implemented in an R package EMMIX-JCM, available from http://www.maths.uq.edu.au/~gjm/mix_soft/EMMIX-JCM/.
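JCM itself is distributed as the EMMIX-JCM R package noted above. As a rough Python illustration of the underlying cluster-then-register idea only (fit a mixture per sample, then match each sample's populations to a template), and not of the JCM model itself, one could write:

```python
# Rough illustration of "cluster each sample, then match populations across
# samples". Synthetic data; this is not the JCM random-effects model.
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.spatial.distance import cdist
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(3)

def mock_sample(shift):
    """Two cell populations per sample, slightly shifted between samples."""
    a = rng.normal([1.0, 1.0], 0.2, size=(300, 2)) + shift
    b = rng.normal([4.0, 3.0], 0.3, size=(300, 2)) + shift
    return np.vstack([a, b])

samples = [mock_sample(shift) for shift in (0.0, 0.15, -0.1)]
means = [GaussianMixture(n_components=2, random_state=0).fit(s).means_
         for s in samples]

template = means[0]
for i, m in enumerate(means[1:], start=1):
    row, col = linear_sum_assignment(cdist(template, m))   # optimal matching
    print(f"sample {i}: template population {row.tolist()} -> local {col.tolist()}")
```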
A new spectroscopic calibration to determine Teff and [Fe/H] of FGK dwarfs and giants
NASA Astrophysics Data System (ADS)
Teixeira, G. D. C.; Sousa, S. G.; Tsantaki, M.; Monteiro, M. J. P. F. G.; Santos, N. C.; Israelian, G.
2017-10-01
We present a new spectroscopic calibration for a fast estimate of Teff and [Fe/H] for FGK dwarfs and GK giant stars. We used spectra from a joint sample of 708 stars, composed of 451 FGK dwarfs and 257 GK-giant stars with homogeneously determined spectroscopic stellar parameters. We have derived 322 EW line-ratios and 100 FeI lines that can be used to compute Teff and [Fe/H], respectively. We show that these calibrations are effective for FGK dwarfs and GK-giant stars in the following ranges: 4500 K < Teff < 6500 K, 2.5 < log g < 4.9 dex, and -0.8 < [Fe/H] < 0.5 dex. The new calibration has a standard deviation of 74 K for Teff and 0.07 dex for [Fe/H]. We use four independent samples of stars to test and verify the new calibration, a sample of giant stars, a sample composed of Gaia FGK benchmark stars, a sample of GK-giant stars from the DR1 of the Gaia-ESO survey, and a sample of FGK-dwarf stars. We present a new computer code, GeTCal, for automatically producing new calibration files based on any new sample of stars.
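As an illustration of the kind of per-ratio calibration that a tool like GeTCal automates (the actual line lists and functional forms are not reproduced here), fitting one EW line ratio against Teff with a low-order polynomial on synthetic data could look like:

```python
# Sketch: fit one line-ratio calibration Teff = f(ratio) with a quadratic.
# Synthetic reference temperatures and a mock EW line ratio are used.
import numpy as np

rng = np.random.default_rng(4)
teff = rng.uniform(4500, 6500, size=200)                 # reference temperatures [K]
ratio = 3.0 - 4e-4 * teff + rng.normal(0, 0.02, 200)     # mock EW line ratio

coeffs = np.polyfit(ratio, teff, deg=2)                  # calibration coefficients
predict = np.poly1d(coeffs)
print("scatter of the fit [K]:", np.std(teff - predict(ratio)))
```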
Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena
2015-04-01
To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were 10 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability in terms of intra-class correlation between the video and telephone acquired samples on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.
Zhai, Peng; Yang, Longshu; Guo, Xiao; Wang, Zhe; Guo, Jiangtao; Wang, Xiaoqi; Zhu, Huaiqiu
2017-10-02
During the past decade, the development of high-throughput nucleic acid sequencing and mass spectrometry analysis techniques has enabled the characterization of microbial communities through metagenomics, metatranscriptomics, metaproteomics and metabolomics data. To reveal the diversity of microbial communities and the interactions between living conditions and microbes, it is necessary to introduce comparative analysis based upon the integration of all four types of data mentioned above. Comparative meta-omics, especially comparative metagenomics, has been established as a routine process to highlight the significant differences in taxon composition and functional gene abundance among microbiota samples. Meanwhile, biologists are increasingly concerned about the correlations between meta-omics features and environmental factors, which may further decipher the adaptation strategy of a microbial community. We developed a graphical comprehensive analysis software package named MetaComp, comprising a series of statistical analysis approaches with visualized results for metagenomics and other meta-omics data comparison. This software is capable of reading files generated by a variety of upstream programs. After data loading, analyses such as multivariate statistics; hypothesis testing for two-sample, multi-sample and two-group sample designs; and a novel function, regression analysis against environmental factors, are offered. Here, regression analysis regards meta-omic features as independent variables and environmental factors as dependent variables. Moreover, MetaComp is capable of automatically choosing an appropriate two-group sample test based upon the traits of the input abundance profiles. We further evaluate the performance of this choice, and exhibit applications for metagenomics, metaproteomics and metabolomics samples. MetaComp, an integrative software package applicable to all meta-omics data, originally distills the influence of the living environment on the microbial community by regression analysis. Moreover, since the automatically chosen two-group sample test is verified to perform well, MetaComp is friendly to users without adequate statistical training. These improvements aim to overcome the new challenges that all meta-omics data face in the big-data era. MetaComp is available at: http://cqb.pku.edu.cn/ZhuLab/MetaComp/ and https://github.com/pzhaipku/MetaComp/ .
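A hedged sketch of how an "automatically chosen" two-group test can work is shown below; MetaComp's actual decision rule is not reproduced, and the normality and variance checks used here are illustrative assumptions.

```python
# Illustrative sketch of automatically choosing a two-group test (not MetaComp's
# exact rule): use a parametric test when both groups look normal, otherwise fall
# back to a rank-based test.
from scipy import stats

def auto_two_group_test(a, b, alpha=0.05):
    normal = stats.shapiro(a)[1] > alpha and stats.shapiro(b)[1] > alpha
    if normal:
        equal_var = stats.levene(a, b)[1] > alpha      # Welch correction if variances differ
        return "t-test", stats.ttest_ind(a, b, equal_var=equal_var)[1]
    return "Mann-Whitney U", stats.mannwhitneyu(a, b, alternative="two-sided")[1]

# returns the name of the chosen test and its p-value for one feature's abundances
```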
van der Kloet, Frans M; Hendriks, Margriet; Hankemeier, Thomas; Reijmers, Theo
2013-11-01
Because of its high sensitivity and specificity, hyphenated mass spectrometry has become the predominant method to detect and quantify metabolites present in bio-samples relevant to a wide range of life science studies. In contrast to targeted methods that are dedicated to specific features, global profiling acquisition methods allow new, unspecific metabolites to be analyzed. The challenge with these so-called untargeted methods is the proper and automated extraction and integration of features that could be of relevance. We propose a new algorithm that enables untargeted integration of samples that are measured with high resolution liquid chromatography-mass spectrometry (LC-MS). In contrast to other approaches, limited user interaction is needed, allowing less experienced users to integrate their data as well. The large number of single features found within a sample is combined into a smaller list of compound-related, grouped feature-sets representative of that sample. These feature-sets allow for easier interpretation and identification and, as importantly, easier matching over samples. We show that the automatically obtained integration results for a set of known target metabolites match those generated with vendor software, but that at least 10 times more feature-sets are extracted as well. We demonstrate our approach using high resolution LC-MS data acquired for 128 samples on a lipidomics platform. The data were also processed in a targeted manner (with a combination of automatic and manual integration) using vendor software for a set of 174 targets. As our untargeted extraction procedure is run per sample and per mass trace, its implementation is scalable. Because of the generic approach, we envision that this data extraction method will be used in targeted as well as untargeted analysis of many different kinds of TOF-MS data, and even CE-MS, GC-MS or MRM data. The Matlab package is available for download on request, and efforts are directed toward a user-friendly Windows executable. Copyright © 2013 Elsevier B.V. All rights reserved.
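One plausible way to bundle single features into compound-related feature-sets is sketched below; it is not the authors' algorithm, and the retention-time tolerance and correlation threshold are assumed values.

```python
# Sketch of grouping co-eluting LC-MS features into compound-related feature-sets
# (not the authors' algorithm): features whose chromatographic apexes are close in
# retention time and whose intensity profiles correlate are bundled together.
import numpy as np

def group_features(apex_rt, profiles, rt_tol=2.0, min_corr=0.9):
    """apex_rt: (n,) apex retention times [s]; profiles: (n, t) intensity traces."""
    order = np.argsort(apex_rt)
    groups, current = [], [order[0]]
    for i in order[1:]:
        ref = current[0]
        close = abs(apex_rt[i] - apex_rt[ref]) <= rt_tol
        corr = np.corrcoef(profiles[i], profiles[ref])[0, 1]
        if close and corr >= min_corr:
            current.append(i)
        else:
            groups.append(current)
            current = [i]
    groups.append(current)
    return groups  # each group is a candidate compound-related feature-set
```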
The Digital Sample: Metadata, Unique Identification, and Links to Data and Publications
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Vinayagamoorthy, S.; Djapic, B.; Klump, J.
2006-12-01
A significant part of digital data in the Geosciences refers to physical samples of Earth materials, from igneous rocks to sediment cores to water or gas samples. The application and long-term utility of these sample-based data in research is critically dependent on (a) the availability of information (metadata) about the samples, such as geographical location and time of sampling, or sampling method, (b) links between the different data types available for individual samples that are dispersed in the literature and in digital data repositories, and (c) access to the samples themselves. Major obstacles to achieving this include incomplete documentation of samples in publications, the use of ambiguous sample names, and the lack of a central catalog that makes it possible to find a sample's archiving location. The International Geo Sample Number IGSN, managed by the System for Earth Sample Registration SESAR, provides solutions to these problems. The IGSN is a unique persistent identifier for samples and other GeoObjects that can be obtained by submitting sample metadata to SESAR (www.geosamples.org). If data in a publication are referenced to an IGSN (rather than an ambiguous sample name), sample metadata can readily be extracted from the SESAR database, which is evolving into a Global Sample Catalog that also makes it possible to locate the owner or curator of the sample. Use of the IGSN in digital data systems allows linkages to be built between distributed data. SESAR is contributing to the development of sample metadata standards. SESAR will integrate the IGSN into persistent, resolvable identifiers based on the handle.net service to advance direct linkages between the digital representation of samples in SESAR (sample profiles) and their related data in the literature and in web-accessible digital data repositories. Technologies outlined by Klump et al. (this session), such as the automatic creation of ontologies by text-mining applications, will be explored for harvesting identifiers of publications and datasets that contain information about a specific sample in order to establish comprehensive data profiles for samples.
Papp, Gergely; Felisaz, Franck; Sorez, Clement; Lopez-Marrero, Marcos; Janocha, Robert; Manjasetty, Babu; Gobbo, Alexandre; Belrhali, Hassan; Bowler, Matthew W; Cipriani, Florent
2017-10-01
Automated sample changers are now standard equipment for modern macromolecular crystallography synchrotron beamlines. Nevertheless, most are only compatible with a single type of sample holder and puck. Recent work aimed at reducing sample-handling efforts and crystal-alignment times at beamlines has resulted in a new generation of compact and precise sample holders for cryocrystallography: miniSPINE and NewPin [see the companion paper by Papp et al. (2017, Acta Cryst., D73, 829-840)]. With full data collection now possible within seconds at most advanced beamlines, and future fourth-generation synchrotron sources promising to extract data in a few tens of milliseconds, the time taken to mount and centre a sample is rate-limiting. In this context, a versatile and fast sample changer, FlexED8, has been developed that is compatible with the highly successful SPINE sample holder and with the miniSPINE and NewPin sample holders. Based on a six-axis industrial robot, FlexED8 is equipped with a tool changer and includes a novel open sample-storage dewar with a built-in ice-filtering system. With seven versatile puck slots, it can hold up to 112 SPINE sample holders in uni-pucks, or 252 miniSPINE or NewPin sample holders, with 36 samples per puck. Additionally, a double gripper, compatible with the SPINE sample holders and uni-pucks, allows a reduction in the sample-exchange time from 40 s, the typical time with a standard single gripper, to less than 5 s. Computer vision-based sample-transfer monitoring, sophisticated error handling and automatic error-recovery procedures ensure high reliability. The FlexED8 sample changer has been successfully tested under real conditions on a beamline.
Nara, Osamu
2011-01-24
I describe an interchangeable twin-vessel (J, N) automatic glass recrystallizer that eliminates the time-consuming recovery and recycling of crystals for repeated recrystallization. The sample goes into the dissolution vessel J containing a magnetic stir-bar K; J is clamped to the upper joint H of the recrystallizer body D. The empty crystallization vessel N is clamped to the lower joint M. Pure solvent is delivered to the dissolution vessel and the crystallization vessel via the head of the condenser A. The crystallization vessel is heated (P). The dissolution reservoir is stirred and heated by the solvent vapor (F). Continuous outflow of filtrate E out of J keeps N at a stable boiling temperature. This results in efficient dissolution, evaporation and separation of pure crystals Q. Pure solvent in the dissolution reservoir is recovered by suction. The empty dissolution and crystallization vessels are detached. The stirrer magnet is transferred to the crystallization vessel, and the roles of the vessels are then reversed. After evacuating the mother liquor out of the upper twin vessel, the apparatus is ready for the next automatic recrystallization once the twin vessels are refilled with pure solvent. I show successive automatic recrystallization of acetaminophen from diethyl ether, obtaining, after 8× automatic recrystallization at 96% yield per stage, acetaminophen with higher melting temperatures than the USP and JP reference standards. I also demonstrate a novel approach to the determination of absolute purity by combining successive automatic recrystallization with differential scanning calorimetry (DSC) measurements requiring no reference standards. This involves the measurement of the criterial melting temperature T(0) corresponding to the 100% pure material and a quantitative ΔT in DSC based on the van't Hoff law of melting point depression. The purities of the six commercial acetaminophen samples and reference standards and of the eight-times recrystallized product were evaluated as 98.8 mol%, 97.9 mol%, 99.1 mol%, 98.3 mol%, 98.4 mol%, 98.5 mol% and 99.3 mol%, respectively. Copyright © 2010 Elsevier B.V. All rights reserved.
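The van't Hoff purity estimate referred to above can be sketched as follows; all numerical values are placeholders rather than data from the paper, and the simple linearized form is an assumption.

```python
# Hedged sketch of a van't Hoff purity estimate from DSC melting-point depression:
# x2 = dT * dHf / (R * T0**2), with dT = T0 - Tm and x2 the impurity mole fraction.
# All numbers below are assumed placeholders, not values from the paper.
R = 8.314           # J mol^-1 K^-1
T0 = 443.6          # K, criterial melting temperature of 100% pure material (assumed)
dHf = 27.1e3        # J mol^-1, enthalpy of fusion (assumed)
Tm = 443.3          # K, observed DSC melting temperature of the sample (assumed)

x2 = (T0 - Tm) * dHf / (R * T0**2)          # impurity mole fraction
purity_mol_percent = 100.0 * (1.0 - x2)
print(f"estimated purity: {purity_mol_percent:.2f} mol%")
```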
Estimating Mutual Information for High-to-Low Calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michaud, Isaac James; Williams, Brian J.; Weaver, Brian Phillip
The presentation shows that KSG 2 is superior to KSG 1 because it scales locally and automatically; that KSG estimators are limited to a maximum MI determined by the sample size; that LNC extends the capability of KSG without onerous assumptions; and that iLNC allows LNC to estimate information gain.
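For orientation, a plain KSG (algorithm 1) k-nearest-neighbour mutual-information estimator is sketched below; the LNC and iLNC corrections discussed in the presentation are not included here.

```python
# Illustrative KSG (algorithm 1) k-nearest-neighbour mutual-information estimator.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # distance to the k-th neighbour in the joint space (Chebyshev norm, self excluded)
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    # count marginal neighbours strictly inside that radius (minus the point itself)
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))  # nats
```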
Observing metal-poor stars with X-Shooter
NASA Astrophysics Data System (ADS)
Caffau, E.; Bonifacio, P.; Sbordone, L.; Monaco, L.; François, P.
Extremely metal-poor (EMP) stars hold in their atmospheres the fossil record of the chemical composition of the early phases of Galactic evolution. The chemical analysis of such objects provides important constraints on these early phases. EMP stars are very rare objects; to dig them out, large amounts of data have to be considered. With an automatic procedure, we analysed objects with the colours of turn-off stars from the Sloan Digital Sky Survey to select a sample of good candidate EMP stars. During the French-Italian GTO of the spectrograph X-Shooter, we observed a sample of these candidates. We could confirm the low metallicity of our sample of stars, and we succeeded in finding a record metal-poor star.
Hyperspectral imaging with wavelet transform for classification of colon tissue biopsy samples
NASA Astrophysics Data System (ADS)
Masood, Khalid
2008-08-01
Automatic classification of medical images is a part of our computerised medical imaging programme to support pathologists in their diagnosis. Hyperspectral data have found applications in medical imagery, and their usage is increasing significantly in the biopsy analysis of medical images. In this paper, we present a histopathological analysis for the classification of colon biopsy samples into benign and malignant classes. The proposed study is based on a comparison between 3D spectral/spatial analysis and 2D spatial analysis. Wavelet textural features are used in both approaches for the classification of colon biopsy samples. Experimental results indicate that the incorporation of wavelet textural features with a support vector machine, in the 2D spatial analysis, achieves the best classification accuracy.
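A sketch in the spirit of the 2D spatial analysis described above is given below: wavelet sub-band texture features feeding a support vector machine. The wavelet family, decomposition level, and energy features are assumptions, not the paper's settings.

```python
# Sketch of 2-D wavelet texture features feeding an SVM (benign vs malignant);
# the wavelet family, level and energy features are assumptions.
import numpy as np
import pywt
from sklearn.svm import SVC

def wavelet_energy_features(image, wavelet="db2", level=2):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    feats = [np.mean(np.abs(coeffs[0]))]                 # approximation energy
    for cH, cV, cD in coeffs[1:]:
        feats += [np.mean(np.abs(c)) for c in (cH, cV, cD)]  # detail sub-band energies
    return np.array(feats)

# usage idea (arrays are hypothetical):
# X = np.array([wavelet_energy_features(img) for img in biopsy_images])
# clf = SVC(kernel="rbf").fit(X, labels)   # labels: 0 = benign, 1 = malignant
```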
Sampling the sound field in auditoria using large natural-scale array measurements.
Witew, Ingo B; Vorländer, Michael; Xiang, Ning
2017-03-01
Suitable data for spatial wave field analyses in concert halls need to satisfy the sampling theorem and hence require densely spaced measurement positions over extended regions. The described measurement apparatus is capable of automatically sampling the sound field in auditoria over a surface of 5.30 m × 8.00 m at any specified resolution. In addition to discussing design features, a case study based on measured impulse responses is presented. The experimental data allow wave field animations demonstrating how sound propagating at grazing incidence over theater seating is scattered from rows of chairs (seat-dip effect). The visualized data of reflections and scattering from an auditorium's boundaries give insights and opportunities for advanced analyses.
Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey
2014-01-01
With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Exploiting these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.
NASA Astrophysics Data System (ADS)
Ge, Xuming
2017-08-01
The coarse registration of point clouds from urban building scenes has become a key topic in applications of terrestrial laser scanning technology. Sampling-based algorithms in the random sample consensus (RANSAC) model have emerged as mainstream solutions to address coarse registration problems. In this paper, we propose a novel combined solution to automatically align two markerless point clouds from building scenes. Firstly, the method segments non-ground points from ground points. Secondly, the proposed method detects feature points from each cross section and then obtains semantic keypoints by connecting feature points with specific rules. Finally, the detected semantic keypoints from two point clouds act as inputs to a modified 4PCS algorithm. Examples are presented and the results compared with those of K-4PCS to demonstrate the main contributions of the proposed method, which are the extension of the original 4PCS to handle heavy datasets and the use of semantic keypoints to improve K-4PCS in relation to registration accuracy and computational efficiency.
Sorting Olive Batches for the Milling Process Using Image Processing
Puerto, Daniel Aguilera; Martínez Gila, Diego Manuel; Gámez García, Javier; Gómez Ortega, Juan
2015-01-01
The quality of virgin olive oil obtained in the milling process is directly bound to the characteristics of the olives. Hence, the correct classification of the different incoming olive batches is crucial to reach the maximum quality of the oil. The aim of this work is to provide an automatic inspection system, based on computer vision, and to classify automatically different batches of olives entering the milling process. The classification is based on the differentiation between ground and tree olives. For this purpose, three different species have been studied (Picudo, Picual and Hojiblanco). The samples have been obtained by picking the olives directly from the tree or from the ground. The feature vector of the samples has been obtained on the basis of the olive image histograms. Moreover, different image preprocessing has been employed, and two classification techniques have been used: these are discriminant analysis and neural networks. The proposed methodology has been validated successfully, obtaining good classification results. PMID:26147729
Houet, Thomas; Pigeon, Grégoire
2011-01-01
Facing the concern of the population for its environment and for climate change, city planners are now considering the urban climate in their planning choices. The use of climatic maps, such as Urban Climate Zones (UCZ), is adapted to this kind of application. The objective of this paper is to demonstrate that the UCZ classification, integrated in the World Meteorological Organization guidelines, first, can be automatically determined for sample areas and, second, is meaningful with respect to climatic variables. The analysis presented is applied to the Toulouse urban area (France). Results show, first, that UCZs differentiate according to air and surface temperature. It has been possible to determine the membership of sample areas in a UCZ using landscape descriptors automatically computed with GIS and remote-sensed data. The analysis also emphasizes that the climatic behavior and magnitude of UCZs may vary from winter to summer. Finally, we discuss the influence of climate data and the scale of observation on UCZ mapping and climate characterization. Copyright © 2011 Elsevier Ltd. All rights reserved.
Prevalence and functions of non-suicidal self-injury in Spanish adolescents.
Calvete, Esther; Orue, Izaskun; Aizpuru, Leire; Brotherton, Hardin
2015-01-01
This study examined the prevalence, characteristics and functions of Non-suicidal Self-injury (NSSI) among Spanish adolescents. The sample consisted of 1,864 adolescents aged between 12 and 19 years (Mean Age = 15.32, SD = 1.97, 51.45% girls). The participants completed a modified version of the self-report scale Functional Assessment of Self-Mutilation (FASM; Lloyd, Kelley, & Hope, 1997) to assess rates and methods of NSSI used during the last 12 months. They also indicated the functions of NSSI. NSSI behaviors are common among Spanish adolescents. More than half of the sample showed such behavior in the past year, and 32.2% had carried out severe NSSI behaviors. The functions of NSSI were examined by using confirmatory factor analyses. Results supported a hierarchical model consisting of two second-order factors: automatic reinforcement, which explained both positive and negative automatic reinforcement, and social reinforcement, which explained both positive and negative social reinforcement. These dimensions are critical to understand the factors that maintain NSSI behavior and have implications for treatments.
NASA Astrophysics Data System (ADS)
Dutt, R. N.; Meena, D. K.; Kar, S.; Soni, V.; Nadaf, A.; Das, A.; Singh, F.; Datta, T. S.
2017-02-01
A system has been developed for carrying out automatic measurements of various electrical transport characteristics, and of their dependence on magnetic field, for samples mounted on the sample holder of a Variable Temperature Insert (VTI) of the Cryogen Free Superconducting Magnet System (CFMS). The control and characterization system is capable of monitoring, online plotting and history logging, in real time, of cryogenic temperatures with the silicon (Si) diode and zirconium oxy-nitride sensors installed inside the magnet facility. Electrical transport property measurements have been automated by implementing current-reversal resistance measurements and automatic temperature set-point ramping, with the parameters of interest available in real time as well as for later analysis. The Graphical User Interface (GUI) based system is user friendly to facilitate operations. Ingenious electronics for reading the zirconium oxy-nitride temperature sensors have been used. The price-to-performance ratio has been optimized by combining in-house-developed measurement techniques with specialized commercial cryogenic measurement and control equipment.
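The current-reversal measurement mentioned above can be illustrated with a one-line calculation: averaging readings taken at +I and -I cancels thermoelectric voltage offsets. The numbers in the usage comment are made up.

```python
# Minimal sketch of a current-reversal resistance measurement: thermoelectric
# offsets add the same voltage at +I and -I, so they cancel in the difference.
def current_reversal_resistance(v_plus, v_minus, current):
    """v_plus, v_minus: voltages measured at +I and -I [V]; current: |I| [A]."""
    return (v_plus - v_minus) / (2.0 * current)  # ohms

# e.g. current_reversal_resistance(1.23e-3, -1.19e-3, 10e-6) -> ~121 ohm (made-up values)
```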
NASA Technical Reports Server (NTRS)
Sherman, W. L.
1975-01-01
The effects of steady wind, turbulence, data sample rate, and control-actuator natural frequency on the response of a possible automatic landing system were investigated in a nonstatistical study. The results indicate that the system, which interfaces with the microwave landing system, functions well in winds and turbulence as long as the guidance law contains proper compensation for wind. The system response was satisfactory down to five data samples per second, which makes the system compatible with the microwave landing system. No adverse effects were observed when actuator natural frequency was lowered. For limiting cases, those cases where the roll angle goes to zero just as the airplane touches down, the basic method for computing the turn-algorithm gains proved unsatisfactory and unacceptable landings resulted. Revised computation methods gave turn-algorithm gains that resulted in acceptable landings. The gains provided by the new method also improved the touchdown conditions for acceptable landings over those obtained when the gains were determined by the old method.
Knowing what to expect, forecasting monthly emergency department visits: A time-series analysis.
Bergs, Jochen; Heerinckx, Philipe; Verelst, Sandra
2014-04-01
To evaluate an automatic forecasting algorithm in order to predict the number of monthly emergency department (ED) visits one year ahead. We collected retrospective data on the number of monthly visiting patients for a 6-year period (2005-2011) from 4 Belgian hospitals. We used an automated exponential smoothing approach to predict monthly visits during the year 2011 based on the first 5 years of the dataset. Several in-sample and post-sample forecasting accuracy measures were calculated. The automatic forecasting algorithm was able to predict monthly visits with a mean absolute percentage error ranging from 2.64% to 4.8%, indicating an accurate prediction. The mean absolute scaled error ranged from 0.53 to 0.68, indicating that, on average, the forecast was better than the in-sample one-step forecasts from the naïve method. The applied automated exponential smoothing approach provided useful predictions of the number of monthly visits a year in advance. Copyright © 2013 Elsevier Ltd. All rights reserved.
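A sketch of this kind of automated exponential-smoothing forecast, together with the two accuracy measures reported above, is shown below; the statsmodels model specification is an assumption, not the algorithm used in the study.

```python
# Sketch of a seasonal exponential-smoothing forecast with MAPE and MASE scoring;
# the additive trend/seasonality specification is an assumption for illustration.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def forecast_and_score(train, test, seasonal_periods=12):
    fit = ExponentialSmoothing(train, trend="add", seasonal="add",
                               seasonal_periods=seasonal_periods).fit()
    pred = fit.forecast(len(test))
    mape = 100.0 * np.mean(np.abs((test - pred) / test))          # mean absolute percentage error
    mase = np.mean(np.abs(test - pred)) / np.mean(np.abs(np.diff(train)))  # vs naive one-step
    return pred, mape, mase
```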
Comparison of daily and weekly precipitation sampling efficiencies using automatic collectors
Schroder, L.J.; Linthurst, R.A.; Ellson, J.E.; Vozzo, S.F.
1985-01-01
Precipitation samples were collected for approximately 90 daily and 50 weekly sampling periods at Finley Farm, near Raleigh, North Carolina, from August 1981 through October 1982. Ten wet-deposition samplers (AEROCHEM METRICS MODEL 301) were used; 4 samplers were operated for daily sampling, and 6 samplers were operated for weekly sampling periods. This design was used to determine whether: (1) collection efficiencies of precipitation are affected by small distances between the Universal (Belfort) precipitation gage and collector; (2) measurable evaporation loss occurs; and (3) pH and specific conductance of precipitation vary significantly within small distances. Average collection efficiencies were 97% for weekly sampling periods compared with the rain gage. Collection efficiencies were examined by season and precipitation volume. Neither factor significantly affected collection efficiency. No evaporation loss was found by comparing daily sampling to weekly sampling at the collection site, which is classified as having a subtropical climate. Correlation coefficients for pH and specific conductance of daily samples and weekly samples ranged from 0.83 to 0.99.
Automated solid-phase extraction workstations combined with quantitative bioanalytical LC/MS.
Huang, N H; Kagel, J R; Rossi, D T
1999-03-01
An automated solid-phase extraction workstation was used to develop, characterize and validate an LC/MS/MS method for quantifying a novel lipid-regulating drug in dog plasma. Method development was facilitated by workstation functions that allowed wash solvents of varying organic composition to be mixed and tested automatically. Precision estimates for this approach were within 9.8% relative standard deviation (RSD) across the calibration range. Accuracy for replicate determinations of quality controls was between -7.2 and +6.2% relative error (RE) over 5-1,000 ng/ml. Recoveries were evaluated for a wide variety of wash solvents, elution solvents and sorbents. Optimized recoveries were generally > 95%. A sample throughput benchmark for the method was approximately 8 min per sample. Because of parallel sample processing, 100 samples were extracted in less than 120 min. The approach has proven useful with LC/MS/MS using a multiple reaction monitoring (MRM) approach.
Study on a pattern classification method of soil quality based on simplified learning sample dataset
Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.
2011-01-01
Based on the massive amount of soil information involved in current soil quality grade evaluation, this paper constructs an intelligent classification approach for soil quality grade that relies on classical sampling techniques and a disordered multiclass logistic regression model. As a case study, the learning sample capacity was determined under a given confidence level and estimation accuracy, and a c-means algorithm was used to automatically extract a simplified learning sample dataset from the cultivated soil quality grade evaluation database of the study area, Longchuan county in Guangdong province; a disordered logistic classifier model was then built, and the calculation and analysis steps of intelligent soil quality grade classification were given. The results indicate that the soil quality grade can be effectively learned and predicted from the extracted simplified dataset through this method, which changes the traditional method of soil quality grade evaluation. © 2011 IEEE.
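A simplified sketch of the workflow described above (representative-sample extraction by clustering, then a multinomial logistic classifier) follows; the use of scikit-learn's KMeans in place of c-means and the parameter values are illustrative assumptions.

```python
# Sketch: shrink the training set to cluster representatives, then fit a
# multinomial logistic classifier for soil quality grades. KMeans stands in for
# the paper's c-means step and all parameters are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def simplified_training_set(X, y, n_representatives=200, seed=0):
    km = KMeans(n_clusters=n_representatives, random_state=seed, n_init=10).fit(X)
    # keep the sample closest to each cluster centre as the simplified dataset
    idx = [int(np.argmin(np.linalg.norm(X - c, axis=1))) for c in km.cluster_centers_]
    return X[idx], y[idx]

# usage idea (arrays are hypothetical):
# Xs, ys = simplified_training_set(X_all, grade_all)
# clf = LogisticRegression(multi_class="multinomial", max_iter=1000).fit(Xs, ys)
```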
NASA Technical Reports Server (NTRS)
Wittmann, A.; Willay, G.
1986-01-01
For the rapid preparation of solutions intended for analysis by inductively coupled plasma emission spectrometry or atomic absorption spectrometry, an automatic device called Plasmasol was developed. This apparatus uses the nonwettability of glassy carbon to fuse the sample in an appropriate flux. The sample-flux mixture is placed in a composite crucible, heated at high temperature, swirled until full dissolution is achieved, and then poured into a water-filled beaker. After acid addition, dissolution of the melt, and filling to the mark, the solution is ready for analysis. The analytical results obtained, either for oxide samples or for prereduced iron ores, show that the solutions prepared with this device are indistinguishable from those obtained by manual dissolutions done by acid digestion or by high-temperature fusion. Preparation reproducibility and analytical tests illustrate the performance of Plasmasol.
Evaluation of complex gonioapparent samples using a bidirectional spectrometer.
Rogelj, Nina; Penttinen, Niko; Gunde, Marta Klanjšek
2015-08-24
Many applications use gonioapparent targets whose appearance depends on the irradiation and viewing angles; the strongest effects are produced by light diffraction. These targets, optically variable devices (OVDs), are used in both security and authentication applications. This study introduces a bidirectional spectrometer that enables the analysis of samples with the most complex angular and spectral properties. In our work, the spectrometer is evaluated with samples having very different types of reflection in terms of spectral and angular distributions. Furthermore, an OVD containing several different grating patches is evaluated. The device uses automatically adjusting exposure times to provide maximum signal dynamics and is capable of steps as small as 0.01°. However, even 2° steps for the detector movement showed that this device is more than capable of characterizing the most complex reflecting surfaces. This study presents sRGB visualizations, a discussion of bidirectional reflection, and accurate grating period calculations for all of the grating samples used.
Ryder, Alan G
2002-03-01
Eighty-five solid samples consisting of illegal narcotics diluted with several different materials were analyzed by near-infrared (785 nm excitation) Raman spectroscopy. Principal Component Analysis (PCA) was employed to classify the samples according to narcotic type. The best sample discrimination was obtained by using the first derivative of the Raman spectra. Furthermore, restricting the spectral variables for PCA to 2 or 3% of the original spectral data according to the most intense peaks in the Raman spectrum of the pure narcotic resulted in a rapid discrimination method for classifying samples according to narcotic type. This method allows for the easy discrimination between cocaine, heroin, and MDMA mixtures even when the Raman spectra are complex or very similar. This approach of restricting the spectral variables also decreases the computational time by a factor of 30 (compared to the complete spectrum), making the methodology attractive for rapid automatic classification and identification of suspect materials.
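The preprocessing and projection chain described above can be sketched as follows; the Savitzky-Golay window and polynomial order are assumptions, not the paper's settings.

```python
# Sketch of first-derivative Raman preprocessing followed by PCA; window length
# and polynomial order are assumed values, not the paper's parameters.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

def pca_scores(spectra, n_components=3):
    """spectra: (n_samples, n_wavenumbers) array of Raman intensities."""
    deriv = savgol_filter(spectra, window_length=9, polyorder=2, deriv=1, axis=1)
    deriv = (deriv - deriv.mean(axis=0)) / (deriv.std(axis=0) + 1e-12)  # column scaling
    return PCA(n_components=n_components).fit_transform(deriv)

# the resulting score plots can then be inspected for clustering by narcotic type
```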
An investigation of potential applications of OP-SAPS: Operational sampled analog processors
NASA Technical Reports Server (NTRS)
Parrish, E. A.; Mcvey, E. S.
1976-01-01
The impact of charge-coupled device (CCD) processors on future instrumentation was investigated. The CCD devices studied process sampled analog data and are referred to as OP-SAPS - operational sampled analog processors. Preliminary studies into various architectural configurations for systems composed of OP-SAPS show that they have potential in such diverse applications as pattern recognition and automatic control. It appears probable that OP-SAPS may be used to construct computing structures which can serve as special peripherals to large-scale computer complexes used in real time flight simulation. The research was limited to the following benchmark programs: (1) face recognition, (2) voice command and control, (3) terrain classification, and (4) terrain identification. A small amount of effort was spent on examining a method by which OP-SAPS may be used to decrease the limiting ground sampling distance encountered in remote sensing from satellites.
Sampling procedures for throughfall monitoring: A simulation study
NASA Astrophysics Data System (ADS)
Zimmermann, Beate; Zimmermann, Alexander; Lark, Richard Murray; Elsenbeer, Helmut
2010-01-01
What is the most appropriate sampling scheme to estimate event-based average throughfall? A satisfactory answer to this seemingly simple question has yet to be found, a failure which we attribute to previous efforts' dependence on empirical studies. Here we try to answer this question by simulating stochastic throughfall fields based on parameters for statistical models of large monitoring data sets. We subsequently sampled these fields with different sampling designs and variable sample supports. We evaluated the performance of a particular sampling scheme with respect to the uncertainty of possible estimated means of throughfall volumes. Even for a relative error limit of 20%, an impractically large number of small, funnel-type collectors would be required to estimate mean throughfall, particularly for small events. While stratification of the target area is not superior to simple random sampling, cluster random sampling involves the risk of being less efficient. A larger sample support, e.g., the use of trough-type collectors, considerably reduces the necessary sample sizes and eliminates the sensitivity of the mean to outliers. Since the gain in time associated with the manual handling of troughs versus funnels depends on the local precipitation regime, the employment of automatically recording clusters of long troughs emerges as the most promising sampling scheme. Even so, a relative error of less than 5% appears out of reach for throughfall under heterogeneous canopies. We therefore suspect a considerable uncertainty of input parameters for interception models derived from measured throughfall, in particular, for those requiring data of small throughfall events.
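A back-of-the-envelope sample-size formula consistent with this kind of error analysis is shown below (simple random sampling, approximate normality assumed); the numbers in the usage comment are purely illustrative.

```python
# Required number of collectors for a target relative error of the mean under
# simple random sampling: n = (z * CV / e)**2, where CV is the coefficient of
# variation of throughfall volumes and e the accepted relative error.
import numpy as np

def required_collectors(cv, rel_error, z=1.96):
    """Collectors needed for the given relative error at ~95% confidence."""
    return int(np.ceil((z * cv / rel_error) ** 2))

# e.g. an assumed CV of 0.5 with a 20% error limit already calls for ~25 collectors:
# required_collectors(0.5, 0.20) -> 25
```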
Park, Sang Hyuk; Park, Chan-Jeoung; Kim, Mi-Jeong; Choi, Mi-Ok; Han, Min-Young; Cho, Young-Uk; Jang, Seongsoo
2014-12-01
We developed and validated an interinstrument comparison method for automatic hematology analyzers based on a 99th percentile coefficient of variation (CV) cutoff of daily means, and validated it in both patient samples and quality control (QC) materials. A total of 120 patient samples were obtained over 6 months. Data from the first 3 months were used to determine the 99th percentile CV cutoff values, and data obtained in the last 3 months were used to calculate acceptable ranges and rejection rates. Identical analyses were also performed using QC materials. Two-instrument comparisons were also performed, and the most appropriate allowable total error (ATE) values were determined. The rejection rates based on the 99th percentile cutoff values were within 10.00% and 9.30% for the patient samples and QC materials, respectively. The acceptable ranges of QC materials based on the currently used method were wider than those calculated from the 99th percentile CV cutoff values for most items. In the two-instrument comparisons, 34.8% of all comparisons failed, and 87.0% of the failed comparisons became acceptable when 4 SD was applied as the ATE value instead of 3 SD. The 99th percentile CV cutoff value-derived daily acceptable ranges can be used as a real-time interinstrument comparison method in both patient samples and QC materials. Applying 4 SD as the ATE value can significantly reduce unnecessary follow-up recalibration in the leukocyte differential counts, reticulocytes, and mean corpuscular volume. Copyright© by the American Society for Clinical Pathology.
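One plausible reading of the cutoff construction is sketched below: compute a between-instrument CV for each day's means and take the 99th percentile as the acceptance limit. The paper's exact construction and acceptance-range formula may differ.

```python
# Sketch of a 99th-percentile CV cutoff derived from daily means; one plausible
# reading of the method only, not the paper's exact procedure.
import numpy as np

def cv_cutoff(daily_means_by_instrument):
    """daily_means_by_instrument: array of shape (n_days, n_instruments)."""
    m = np.asarray(daily_means_by_instrument, float)
    daily_cv = 100.0 * m.std(axis=1, ddof=1) / m.mean(axis=1)   # % CV per day
    return np.percentile(daily_cv, 99)                          # acceptance cutoff

# comparisons whose between-instrument CV exceeds this cutoff would be flagged/rejected
```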
Usha, M; Geetha, Y V; Darshan, Y S
2017-03-01
The field of music is increasingly gaining scope and attracting researchers from varied fields in terms of improving the art of voice modulation in singing. There has been a lot of competition, and young budding singers are emerging with ever more talent. This study aimed to develop software to differentiate a prepubertal voice as that of a singer or a non-singer using the singing power ratio (SPR), an objective measure of resonant voice quality. Recordings of singing and phonation were obtained from 30 singer and 30 non-singer girls (8-10 years). Three professional singers perceptually evaluated all samples using a rating scale and categorized them as singers or non-singers. Using Matlab, a program was developed to automatically calculate the SPR of a given sample and classify it into either of the two groups based on normative SPR values developed manually. A positive correlation was found between the perceptual and manual ratings and the objective SPR values for both phonation and singing. The software could automatically give the SPR values for the samples that are fed in and could further classify them as singer or non-singer. Researchers need not depend on professional singers or musicians for the judgment of voice for research purposes. This software uses an objective tool, which serves as an instrument to judge singing talent using singing and phonation samples of children. It can also be used as a first line of judgment in any singing audition process, which could ease the work of professionals. Copyright © 2017 The Voice Foundation. All rights reserved.
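A sketch of a singing power ratio computation is given below, using the usual band definition (strongest spectral peak in the 0-2 kHz band versus the 2-4 kHz band, expressed in dB); the windowing details are assumptions and this is not the study's Matlab code.

```python
# Sketch of a singing power ratio (SPR) computation: dB difference between the
# strongest spectral peak in 0-2 kHz and the strongest peak in 2-4 kHz.
# Hann windowing and the 50 Hz lower bound (to skip DC) are assumptions.
import numpy as np

def singing_power_ratio(signal, fs):
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    low = spectrum[(freqs >= 50) & (freqs < 2000)].max()
    high = spectrum[(freqs >= 2000) & (freqs <= 4000)].max()
    return 20.0 * np.log10(low / high)   # dB; smaller values = more 2-4 kHz energy
```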
NASA Astrophysics Data System (ADS)
Ometto, Giovanni; Calivá, Francesco; Al-Diri, Bashir; Bek, Toke; Hunter, Andrew
2016-03-01
Automatic, quick and reliable identification of retinal landmarks from fundus photography is key for measurements used in the research, diagnosis, screening and treatment of common diseases affecting the eyes. This study presents a fast method for the detection of the centre of mass of the vascular arcades, the optic nerve head (ONH) and the fovea, which are used in the definition of five clinically relevant areas employed in screening programmes for diabetic retinopathy (DR). Thirty-eight fundus photographs showing 7203 DR lesions were analysed to find the landmarks manually by two retina experts and automatically by the proposed method. The automatic identification of the ONH and fovea was performed using template matching based on normalised cross-correlation. The centre of mass of the arcades was obtained by fitting an ellipse to sample coordinates of the main vessels. The coordinates were obtained by processing the image with Hessian filtering followed by shape analyses, and finally sampling the results. The regions obtained manually and automatically were used to count the retinal lesions falling within them, and to evaluate the method. 92.7% of the lesions fell within the same regions based on the landmarks selected by the two experts. 91.7% and 89.0% were counted in the same areas identified by the method and by the first and second expert, respectively. The inter-repeatability of the proposed method and the experts is comparable, while the 100% intra-repeatability makes the algorithm a valuable tool for tasks such as real-time analyses, the analysis of large datasets, and the analysis of intra-patient variability.
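The template-matching step named above can be sketched as follows; template construction and any post-checks used by the authors are not reproduced.

```python
# Sketch of landmark localisation by normalised cross-correlation template matching.
import numpy as np
from skimage.feature import match_template

def locate_landmark(fundus_gray, template):
    """Return (row, col) of the best template match in a grayscale fundus image."""
    ncc = match_template(fundus_gray, template, pad_input=True)
    return np.unravel_index(np.argmax(ncc), ncc.shape)

# e.g. run once with an ONH template and once with a fovea template (both assumed
# to be provided), then derive the screening regions from the two locations.
```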
Automated Electroglottographic Inflection Events Detection. A Pilot Study.
Codino, Juliana; Torres, María Eugenia; Rubin, Adam; Jackson-Menaldi, Cristina
2016-11-01
Vocal-fold vibration can be analyzed in a noninvasive way by registering impedance changes within the glottis, through electroglottography. The morphology of the electroglottographic (EGG) signal is related to different vibratory patterns. In the literature, a characteristic knee in the descending portion of the signal has been reported. Some EGG signals do not exhibit this particular knee and have other types of events (inflection events) throughout the ascending and/or descending portion of the vibratory cycle. The goal of this work is to propose an automatic method to identify and classify these events. A computational algorithm was developed based on the mathematical properties of the EGG signal, which detects and reports events throughout the contact phase. Retrospective analysis of EGG signals obtained during routine voice evaluation of adult individuals with a variety of voice disorders was performed using the algorithm as well as human raters. Two judges, both experts in clinical voice analysis, and three general speech pathologists performed manual and visual evaluation of the sample set. The results obtained by the automatic method were compared with those of the human raters. Statistical analysis revealed a significant level of agreement. This automatic tool could allow professionals in the clinical setting to obtain an automatic quantitative and qualitative report of such events present in a voice sample, without having to manually analyze the whole EGG signal. In addition, it might provide the speech pathologist with more information that would complement the standard voice evaluation. It could also be a valuable tool in voice research. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
SoilJ - An ImageJ plugin for semi-automatized image-processing of 3-D X-ray images of soil columns
NASA Astrophysics Data System (ADS)
Koestel, John
2016-04-01
3-D X-ray imaging is a formidable tool for quantifying soil structural properties, which are known to be extremely diverse. This diversity necessitates the collection of large sample sizes to adequately represent the spatial variability of soil structure at a specific sampling site. One important bottleneck of using X-ray imaging is, however, the large amount of time required by a trained specialist to process the image data, which makes it difficult to process larger numbers of samples. The software SoilJ aims at removing this bottleneck by automating most of the image-processing steps needed to analyze image data of cylindrical soil columns. SoilJ is a plugin of the free Java-based image-processing software ImageJ. The plugin is designed to automatically process all images located within a designated folder. In a first step, SoilJ recognizes the outlines of the soil column, upon which the column is rotated to an upright position and placed in the center of the canvas. Excess canvas is removed from the images. Then, SoilJ samples the grey values of the column material as well as of the surrounding air in the Z-direction. Assuming that the column material (mostly PVC or aluminium) exhibits a spatially constant density, these grey values serve as a proxy for the image illumination at a specific Z-coordinate. Together with the grey values of the air, they are used to correct image illumination fluctuations, which often occur along the axis of rotation during image acquisition. SoilJ also includes an algorithm for beam-hardening artefact removal and extended image segmentation options. Finally, SoilJ integrates the morphology analysis plugins of BoneJ (Doube et al., 2010. BoneJ: Free and extensible bone image analysis in ImageJ. Bone 47: 1076-1079) and provides an ASCII file summarizing these measures for each investigated soil column. In the future it is planned to integrate SoilJ into FIJI, the maintained and updated edition of ImageJ with selected plugins.
Han, Guanghui; Liu, Xiabi; Zheng, Guangyuan; Wang, Murong; Huang, Shan
2018-06-06
Ground-glass opacity (GGO) is a common CT imaging sign on high-resolution CT, and such a lesion is more likely to be malignant than a common solid lung nodule. The automatic recognition of GGO CT imaging signs is of great importance for the early diagnosis and possible cure of lung cancers. Existing GGO recognition methods employ traditional low-level features, and system performance is improving only slowly. Considering the high performance of CNN models in the computer vision field, we propose an automatic recognition method for 3D GGO CT imaging signs through the fusion of hybrid resampling and layer-wise fine-tuned CNN models. Our hybrid resampling is performed over multiple views and multiple receptive fields, which reduces the risk of missing small or large GGOs by adopting representative sampling panels and processing GGOs at multiple scales simultaneously. The layer-wise fine-tuning strategy has the ability to obtain the optimal fine-tuned model. The multi-CNN-model fusion strategy obtains better performance than any single trained model. We evaluated our method on the GGO nodule samples in the publicly available LIDC-IDRI dataset of chest CT scans. The experimental results show that our method yields excellent results with 96.64% sensitivity, 71.43% specificity, and an F1 score of 0.83. Our method is a promising approach for applying deep learning to the computer-aided analysis of specific CT imaging signs with insufficient labeled images.
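An illustrative layer-wise fine-tuning loop is sketched below; the authors' CNN architecture, hybrid-resampling inputs and fusion step are not reproduced, and the backbone, layer grouping, and the user-supplied train_fn/eval_fn callables are assumptions.

```python
# Illustrative layer-wise fine-tuning: unfreeze one more layer group at a time,
# train, evaluate, and keep the best variant. ResNet-18 and the train_fn/eval_fn
# callables are stand-ins, not the authors' setup.
import copy
import torch
from torchvision import models

def layerwise_finetune(train_fn, eval_fn, n_classes=2):
    best_score, best_model = -1.0, None
    for n_unfrozen in (1, 2, 3, 4):                     # how many top layer groups to tune
        model = models.resnet18(weights="IMAGENET1K_V1")
        model.fc = torch.nn.Linear(model.fc.in_features, n_classes)
        groups = [model.fc, model.layer4, model.layer3, model.layer2]
        for p in model.parameters():
            p.requires_grad = False
        for g in groups[:n_unfrozen]:                   # unfreeze the last n groups
            for p in g.parameters():
                p.requires_grad = True
        train_fn(model)                                 # user-supplied training loop
        score = eval_fn(model)                          # e.g. validation F1
        if score > best_score:
            best_score, best_model = score, copy.deepcopy(model)
    return best_model
```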
An integrated and accessible sample data library for Mars sample return science
NASA Astrophysics Data System (ADS)
Tuite, M. L., Jr.; Williford, K. H.
2015-12-01
Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL) for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS) in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational data base services, and a virtual web server. The data structure is sample-centered with a shared registry for assigning unique identifiers to all samples including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.
Convair F-106B Delta Dart with Air Sampling Equipment
1974-04-21
The National Aeronautics and Space Administration (NASA) Lewis Research Center's Convair F-106B Delta Dart equipped with air sampling equipment in the mid-1970s. NASA Lewis created and managed the Global Air Sampling Program (GASP) in 1972 in partnership with several airline companies. NASA researchers used the airliners' Boeing 747 aircraft to gather air samples to determine the amount of pollution present in the stratosphere. Private companies developed the air sampling equipment for the GASP program, and Lewis created a particle collector. The collector was flight tested on NASA Lewis' F-106B in the summer of 1973. The sampling equipment was automatically operated once the proper altitude was achieved. The sampling instruments collected dust particles in the air so their chemical composition could be analyzed. The equipment analyzed one second's worth of data at a time. The researchers also monitored carbon monoxide, ozone, and water vapor. The 747 flights began in December 1974 and soon included four airlines flying routes all over the globe. The F-106B augmented the airline data with sampling of its own, seen here. It gathered samples throughout this period from locations such as New Mexico, Texas, Michigan, and Ohio. In July 1977 the F-106B flew eight GASP flights in nine days over Alaska to supplement the earlier data gathered by the airlines.
Real-time automatic registration in optical surgical navigation
NASA Astrophysics Data System (ADS)
Lin, Qinyong; Yang, Rongqian; Cai, Ken; Si, Xuan; Chen, Xiuwen; Wu, Xiaoming
2016-05-01
An image-guided surgical navigation system requires the improvement of the patient-to-image registration time to enhance the convenience of the registration procedure. A critical step in achieving this aim is performing a fully automatic patient-to-image registration. This study reports on a design of custom fiducial markers and the performance of a real-time automatic patient-to-image registration method using these markers on the basis of an optical tracking system for rigid anatomy. The custom fiducial markers are designed to be automatically localized in both patient and image spaces. An automatic localization method is performed by registering a point cloud sampled from the three dimensional (3D) pedestal model surface of a fiducial marker to each pedestal of fiducial markers searched in image space. A head phantom is constructed to estimate the performance of the real-time automatic registration method under four fiducial configurations. The head phantom experimental results demonstrate that the real-time automatic registration method is more convenient, rapid, and accurate than the manual method. The time required for each registration is approximately 0.1 s. The automatic localization method precisely localizes the fiducial markers in image space. The averaged target registration error for the four configurations is approximately 0.7 mm. The automatic registration performance is independent of the positions relative to the tracking system and the movement of the patient during the operation.
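The registration step itself can be illustrated with a standard rigid (Kabsch/SVD) fit between corresponding fiducial positions, plus a target registration error check; this is a generic sketch, not the system's implementation, and the automatic fiducial localisation is not shown.

```python
# Generic rigid point-to-point registration between corresponding fiducials
# (Kabsch/SVD) and a target registration error check.
import numpy as np

def rigid_register(patient_pts, image_pts):
    """Both arrays are (n, 3) with row-wise corresponding fiducials."""
    pc, ic = patient_pts.mean(0), image_pts.mean(0)
    H = (patient_pts - pc).T @ (image_pts - ic)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1, 1, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = ic - R @ pc
    return R, t

def target_registration_error(R, t, patient_targets, image_targets):
    mapped = patient_targets @ R.T + t
    return np.linalg.norm(mapped - image_targets, axis=1).mean()  # e.g. in mm
```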
Naming Speed in Dyslexia and Dyscalculia
ERIC Educational Resources Information Center
Willburger, Edith; Fussenegger, Barbara; Moll, Kristina; Wood, Guilherme; Landerl, Karin
2008-01-01
In four carefully selected samples of 8- to 10-year old children with dyslexia (but age adequate arithmetic skills), dyscalculia (but age adequate reading skills), dyslexia/dyscalculia and controls a domain-general deficit in rapid automatized naming (RAN) was found for both dyslexia groups. Dyscalculic children exhibited a domain-specific deficit…
Influence of sampling time on carbon dioxide and methane emissions by grazing cattle
USDA-ARS?s Scientific Manuscript database
A need to respond to global climate change has focused great attention towards greenhouse gases produced by domestic ruminants and gas emission mitigation. Respiration chambers have long been the preferred method to measure CO2 and CH4 emission by cattle. With quickly advancing technology, automat...
Young, Stacie T.M.; Ball, Marcael T.J.
2005-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous streamflow data at two stations, and water-quality data at five stations, which include the two continuous streamflow stations. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2004 and June 30, 2005. A total of 15 samples was collected over three storms during July 1, 2004 to June 30, 2005. In general, an attempt was made to collect grab samples nearly simultaneously at all five stations and flow-weighted time-composite samples at the three stations equipped with automatic samplers. However, all three storms were partially sampled because either not all stations were sampled or not all composite samples were collected. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Chromium and nickel were added to the analysis starting October 1, 2004. Grab samples were additionally analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
Gray, John R.; Fisk, Gregory G.
1992-01-01
From July 1988 through September 1991, radionuclide and suspended-sediment transport were monitored in ephemeral streams in the semiarid Little Colorado River basin of Arizona and New Mexico, USA, where in-stream gross-alpha plus gross-beta activities have exceeded Arizona's Maximum Allowable Limit through releases from natural weathering processes and from uranium-mining operations in the Church Rock Mining District, Grants Mineral Belt, New Mexico. Water samples were collected at a network of nine continuous-record streamgauges equipped with microprocessor-based satellite telemetry and automatic water-sampling systems, and six partial-record streamgauges equipped with passive water samplers. Analytical results from these samples were used to calculate transport of selected suspended and dissolved radionuclides in the uranium-238 and thorium-232 decay series.
Satoh, K; Noguchi, M; Higuchi, H; Kitamura, K
1984-12-01
Liquid scintillation counting of alpha rays with pulse shape discrimination was applied to the analysis of 226Ra and 239+240Pu in environmental samples and of alpha-emitters in/on a filter paper. The instrument used in this study was either a specially designed detector or a commercial liquid scintillation counter with an automatic sample changer, both of which were connected to the pulse shape discrimination circuit. The background counting rate in the alpha energy region of 5-7 MeV was 0.01 or 0.04 cpm/MeV, respectively. The figure of merit indicating the resolving power for alpha- and beta-particles in the time spectrum was found to be 5.7 for the commercial liquid scintillation counter.
Convolutional neural network using generated data for SAR ATR with limited samples
NASA Astrophysics Data System (ADS)
Cong, Longjian; Gao, Lei; Zhang, Hui; Sun, Peng
2018-03-01
Because it can operate in all weather conditions and at all times of day, Synthetic Aperture Radar (SAR) has become a hot research topic in remote sensing. Despite all the well-known advantages of SAR, it is hard to extract features because of its unique imaging methodology, and this challenge attracts the research interest of traditional Automatic Target Recognition (ATR) methods. With the development of deep learning technologies, convolutional neural networks (CNNs) give us another way to detect and recognize targets when a huge number of samples is available, but this premise often does not hold when it comes to monitoring a specific type of ship. In this paper, we propose a method to enhance the performance of Faster R-CNN with limited samples to detect and recognize ships in SAR images.
Restrepo, John F; Garcia-Sucerquia, Jorge
2012-02-15
We present an automatic procedure for 3D tracking of micrometer-sized particles with high-NA digital lensless holographic microscopy. The method uses a two-feature approach to search for the best focal planes and to distinguish particles from artifacts or other elements in the reconstructed stream of holograms. A set of reconstructed images is axially projected onto a single image. From the projected image, the centers of mass of all the reconstructed elements are identified. Starting from the centers of mass, the morphology of the profile of the maximum intensity along the reconstruction direction allows particles to be distinguished from other elements. The method is tested with modeled holograms and applied to automatically track micrometer-sized bubbles in a 4 mm3 sample of soda.
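The projection and centre-of-mass step described above can be sketched as follows; the profile-based discrimination of particles from artifacts is not reproduced, and the threshold quantile is an assumed value.

```python
# Sketch: collapse the reconstructed planes into one axial projection and locate
# candidate particle centres; the 0.99 threshold quantile is an assumption.
import numpy as np
from scipy import ndimage

def candidate_centres(reconstruction_stack, threshold_quantile=0.99):
    """reconstruction_stack: (n_planes, rows, cols) reconstructed intensities."""
    projection = reconstruction_stack.max(axis=0)          # axial projection
    mask = projection > np.quantile(projection, threshold_quantile)
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(projection, labels, range(1, n + 1))
```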
A Peltier-based freeze-thaw device for meteorite disaggregation
NASA Astrophysics Data System (ADS)
Ogliore, R. C.
2018-02-01
A Peltier-based freeze-thaw device for the disaggregation of meteorite or other rock samples is described. Meteorite samples are kept in six water-filled cavities inside a thin-walled Al block. This block is held between two Peltier coolers that are automatically cycled between cooling and warming. One cycle takes approximately 20 min. The device can run unattended for months, allowing for ˜10 000 freeze-thaw cycles that will disaggregate meteorites even with relatively low porosity. This device was used to disaggregate ordinary and carbonaceous chondrite regolith breccia meteorites to search for micrometeoroid impact craters.
Commutated automatic gain control system
NASA Technical Reports Server (NTRS)
Yost, S. R.
1982-01-01
A commutated automatic gain control (AGC) system was designed and built for a prototype Loran C receiver. The receiver uses a microcomputer to control a memory-aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The circuit designed for the AGC is described, and bench and flight test results are presented. The AGC circuit samples the signal starting 40 microseconds after a zero crossing determined by the software lock pulse, which is ultimately generated by a 30-microsecond delay-and-add network in the envelope detector of the receiver front end.
Detecting brain tumor in pathological slides using hyperspectral imaging
Ortega, Samuel; Fabelo, Himar; Camacho, Rafael; de la Luz Plaza, María; Callicó, Gustavo M.; Sarmiento, Roberto
2018-01-01
Hyperspectral imaging (HSI) is an emerging technology for medical diagnosis. This research work presents a proof-of-concept on the use of HSI data to automatically detect human brain tumor tissue in pathological slides. The samples, consisting of hyperspectral cubes collected from 400 nm to 1000 nm, were acquired from ten different patients diagnosed with high-grade glioma. Based on the diagnosis provided by pathologists, a spectral library of normal and tumor tissues was created and processed using three different supervised classification algorithms. Results prove that HSI is a suitable technique to automatically detect high-grade tumors from pathological slides. PMID:29552415
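The abstract does not name the three supervised classifiers; a generic scikit-learn sketch of the classification step, training on a labeled spectral library (normal vs. tumor spectra), is shown below. The synthetic spectra stand in for real HSI data, and the SVM is only one plausible choice of classifier.

```python
# Hedged sketch: supervised classification of labeled spectra from a spectral
# library (normal vs tumor). The synthetic spectra stand in for real HSI data,
# and SVM is only one of many classifiers the authors could have used.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_bands = 128                                    # e.g. 400-1000 nm sampled in 128 bands
normal = rng.normal(0.4, 0.05, size=(200, n_bands))
tumor = rng.normal(0.5, 0.05, size=(200, n_bands))
X = np.vstack([normal, tumor])
y = np.array([0] * 200 + [1] * 200)              # 0 = normal, 1 = tumor

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```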
Users manual for the Variable dimension Automatic Synthesis Program (VASP)
NASA Technical Reports Server (NTRS)
White, J. S.; Lee, H. Q.
1971-01-01
A dictionary and some example problems for the Variable dimension Automatic Synthesis Program (VASP) are presented. The dictionary contains a description of each subroutine and instructions on its use. The example problems give the user a better perspective on the use of VASP for solving problems in modern control theory. These example problems include dynamic response, optimal control gain, solution of the sampled-data matrix Riccati equation, matrix decomposition, and the pseudoinverse of a matrix. Listings of all subroutines are also included. The VASP program has been adapted to run in conversational mode on the Ames 360/67 computer.
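For orientation, the sampled-data matrix Riccati solution and the matrix pseudoinverse mentioned above correspond to operations that SciPy and NumPy now expose directly; a hedged sketch follows. The discretized double-integrator system is an invented example, not one of the VASP problems.

```python
# Hedged sketch: solving the sampled-data (discrete-time) matrix Riccati
# equation and a matrix pseudoinverse with SciPy/NumPy rather than VASP.
# The discretized double-integrator system matrices are illustrative.
import numpy as np
from scipy.linalg import solve_discrete_are

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])    # discretized double integrator
B = np.array([[0.5 * dt**2], [dt]])
Q = np.eye(2)                            # state cost
R = np.array([[1.0]])                    # control cost

P = solve_discrete_are(A, B, Q, R)       # steady-state Riccati solution
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # optimal feedback gain
print("Riccati solution P:\n", P)
print("optimal gain K:", K)
print("pseudoinverse of B:", np.linalg.pinv(B))
```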
Improving sediment transport measurements in the Erlenbach stream using a moving basket system
NASA Astrophysics Data System (ADS)
Rickenmann, Dieter; Turowski, Jens; Hegglin, Ramon; Fritschi, Bruno
2010-05-01
In the Erlenbach stream, a prealpine torrent in Switzerland, sediment transport has been monitored for more than 25 years. Sediment-transporting flood events in the Erlenbach are typically of short duration with a rapid rise of discharge during summer thunderstorms, thus hampering on-site measurements. On average there are more than 20 bedload transport events per year. Near the confluence with the main valley river, there is a stream gauging station and a sediment retention basin with a capacity of about 2,000 m3. The basin is surveyed at regular intervals and after large flood events. In addition, sediment transport has been continuously monitored with a piezoelectric bedload impact sensor (PBIS) array since 1986. The sensor array is mounted flush with the surface of a check dam immediately upstream of the retention basin. The PBIS system was developed to continuously measure the intensity of bedload transport and its relation to stream discharge. To standardize the sensors, the piezoelectric crystals were replaced by geophones in 2000. The geophone measuring system has also been employed at a number of other streams. In 2008, the measuring system in the Erlenbach stream was enhanced with an automatic system to obtain bedload samples. Movable, slot-type cubic metal baskets are mounted on a rail at the downstream wall of the large check dam above the retention basin. The metal baskets can be moved automatically and individually into the flow according to flow and bedload transport conditions (i.e. geophone recordings). The basket is stopped at the centerline of the approach flow channel of the overflow section to obtain a sediment sample during a limited time interval. The wire mesh of the basket has a spacing of 10 mm to sample all sediment particles coarser than this size (which is about the limiting grain size detected by the geophones). The weight increase due to the collected sediment is measured by weighing cells located in the basket supporting structure, and this information is used in combination with the geophone recordings to determine when to move a basket laterally away from the flow. The upgraded measuring system makes it possible to: (i) obtain bedload samples over short sampling periods; (ii) measure the grain size distribution of the transported material and its variation over time and with discharge; (iii) obtain direct bedload measurements that can be used to improve the understanding of the geophone signal; and (iv) improve the geophone calibration for the Erlenbach stream. We introduce the new measuring installations, discuss our experience from the first successful automatic sampling operations in summer 2009, and present first results.
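The decision logic for moving the baskets is described only qualitatively above; a hedged sketch of that logic (insert a basket when geophone activity indicates bedload transport, retract it when the weighing cells report that the basket is full) could look like the following, with all thresholds and the hardware interface invented for illustration.

```python
# Hedged sketch of the basket control logic: insert a basket when geophone
# activity indicates bedload transport, retract it when the weighing cells show
# it is full. Thresholds and the hardware call are illustrative placeholders.
from dataclasses import dataclass


@dataclass
class BasketController:
    geophone_threshold: float = 50.0    # impulses per minute (illustrative)
    full_weight_kg: float = 200.0       # retract when this load is reached
    in_flow: bool = False

    def move_basket(self, into_flow: bool) -> None:
        # Placeholder for the real actuator command on the rail system.
        print("basket ->", "flow centerline" if into_flow else "parked")
        self.in_flow = into_flow

    def update(self, geophone_rate: float, basket_weight_kg: float) -> None:
        if not self.in_flow and geophone_rate > self.geophone_threshold:
            self.move_basket(True)      # transport detected: start sampling
        elif self.in_flow and basket_weight_kg >= self.full_weight_kg:
            self.move_basket(False)     # basket full: retract and weigh sample


ctrl = BasketController()
for rate, weight in [(10, 0), (80, 0), (120, 150), (90, 210)]:
    ctrl.update(rate, weight)
```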
Sun, Yanqing; Sun, Liuquan; Zhou, Jie
2013-07-01
This paper studies the generalized semiparametric regression model for longitudinal data where the covariate effects are constant for some covariates and time-varying for others. Different link functions can be used to allow more flexible modelling of longitudinal data. The nonparametric components of the model are estimated using a local linear estimating equation, and the parametric components are estimated through a profile estimating function. The method automatically adjusts for heterogeneity of sampling times, allowing the sampling strategy to depend on the past sampling history as well as possibly time-dependent covariates without specifically modelling such dependence. A K-fold cross-validation bandwidth selection is proposed as a working tool for locating an appropriate bandwidth. A criterion for selecting the link function is proposed to provide a better fit to the data. Large sample properties of the proposed estimators are investigated. Large sample pointwise and simultaneous confidence intervals for the regression coefficients are constructed. Formal hypothesis testing procedures are proposed to check for the covariate effects and whether the effects are time-varying. A simulation study is conducted to examine the finite sample performance of the proposed estimation and hypothesis testing procedures. The methods are illustrated with a data example.
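The K-fold cross-validation bandwidth selection idea can be illustrated with a generic local linear smoother on synthetic data; this is a hedged, simplified sketch and not the authors' profile estimating equations or their longitudinal data structure.

```python
# Hedged sketch of K-fold cross-validation bandwidth selection for a local
# linear smoother on synthetic data (not the paper's estimating equations).
import numpy as np


def local_linear(x0, x, y, h):
    """Local linear estimate of E[y | x = x0] with a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    sw = np.sqrt(w)                                   # weighted least squares via sqrt weights
    beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta[0]                                    # intercept = fitted value at x0


def cv_bandwidth(x, y, bandwidths, k=5, seed=0):
    rng = np.random.default_rng(seed)
    folds = rng.integers(0, k, size=len(x))
    scores = []
    for h in bandwidths:
        sse = 0.0
        for fold in range(k):
            tr, te = folds != fold, folds == fold
            preds = np.array([local_linear(x0, x[tr], y[tr], h) for x0 in x[te]])
            sse += np.sum((y[te] - preds) ** 2)
        scores.append(sse)
    return bandwidths[int(np.argmin(scores))]


rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 300)
y = np.sin(x) + rng.normal(0, 0.3, size=x.size)
print("selected bandwidth:",
      cv_bandwidth(x, y, bandwidths=np.array([0.1, 0.3, 0.5, 1.0, 2.0])))
```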
NASA Astrophysics Data System (ADS)
Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino
2013-12-01
Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.
Unsupervised active learning based on hierarchical graph-theoretic clustering.
Hu, Weiming; Hu, Wei; Xie, Nianhua; Maybank, Steve
2009-10-01
Most existing active learning approaches are supervised. Supervised active learning has the following problems: inefficiency in dealing with the semantic gap between the distribution of samples in the feature space and their labels, lack of ability in selecting new samples that belong to new categories that have not yet appeared in the training samples, and lack of adaptability to changes in the semantic interpretation of sample categories. To tackle these problems, we propose an unsupervised active learning framework based on hierarchical graph-theoretic clustering. In the framework, two promising graph-theoretic clustering algorithms, namely, dominant-set clustering and spectral clustering, are combined in a hierarchical fashion. Our framework has some advantages, such as ease of implementation, flexibility in architecture, and adaptability to changes in the labeling. Evaluations on data sets for network intrusion detection, image classification, and video classification have demonstrated that our active learning framework can effectively reduce the workload of manual classification while maintaining a high accuracy of automatic classification. It is shown that, overall, our framework outperforms the support-vector-machine-based supervised active learning, particularly in terms of dealing much more efficiently with new samples whose categories have not yet appeared in the training samples.
Young, Stacie T.M.; Jamison, Marcael T.J.
2007-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous streamflow data at three stations, and water-quality data at five stations, which include the two continuous streamflow stations. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2006 and June 30, 2007. A total of 13 samples was collected over two storms during July 1, 2006 to June 30, 2007. The goal was to collect grab samples nearly simultaneously at all five stations and flow-weighted time-composite samples at the three stations equipped with automatic samplers. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
Charles, Isabel; Sinclair, Ian; Addison, Daniel H
2014-04-01
A new approach to the storage, processing, and interrogation of the quality data for screening samples has improved analytical throughput and confidence and enhanced the opportunities for learning from the accumulating records. The approach has entailed the design, development, and implementation of a database-oriented system, capturing information from the liquid chromatography-mass spectrometry capabilities used for assessing the integrity of samples in AstraZeneca's screening collection. A Web application has been developed to enable the visualization and interactive annotation of the analytical data, monitor the current sample queue, and report the throughput rate. Sample purity and identity are certified automatically on the chromatographic peaks of interest if predetermined thresholds are reached on key parameters. Using information extracted in parallel from the compound registration and container inventory databases, the chromatographic and spectroscopic profiles for each vessel are linked to the sample structures and storage histories. A search engine facilitates the direct comparison of results for multiple vessels of the same or similar compounds, for single vessels analyzed at different time points, or for vessels related by their origin or process flow. Access to this network of information has provided a deeper understanding of the multiple factors contributing to sample quality assurance.
Liu, Liwei; Zheng, Huaili; Xu, Bincheng; Xiao, Lang; Chigan, Yong; Zhangluo, Yilan
2018-03-01
In this paper, a procedure for in-situ pre-concentration in the graphite furnace by repeated sampling and pyrolysis is proposed for the determination of ultra-trace thallium in drinking water by graphite furnace atomic absorption spectrometry (GF-AAS). Without any other laborious enrichment processes that routinely result in analyte loss and contamination, thallium was directly and automatically concentrated in the graphite furnace and subsequently subjected to analysis. The effects of several key factors, such as the temperature for pyrolysis and atomization, the chemical modifier, and the number of repeated sampling cycles, were investigated. Under the optimized conditions, a limit of detection of 0.01 µg L-1 was obtained, which meets the requirement for thallium determination in drinking water set by the Chinese standard GB 5749-2006. Successful analysis of thallium in certified water samples and drinking water samples was demonstrated, with analytical results in good agreement with the certified values and with those obtained by inductively coupled plasma mass spectrometry (ICP-MS), respectively. Routine spike-recovery tests with randomly selected drinking water samples showed satisfactory results of 80-96%. The proposed method is simple and sensitive for screening of ultra-trace thallium in drinking water samples. Copyright © 2017. Published by Elsevier B.V.
Beraud, L; Gervasoni, K; Freydiere, A M; Descours, G; Ranc, A G; Vandenesch, F; Lina, G; Gaia, V; Jarraud, S
2015-09-01
The Sofia Legionella Fluorescence Immunoassay (FIA; Quidel) is a recently introduced rapid immunochromatographic diagnostic test for Legionnaires' disease using immunofluorescence technology designed to enhance its sensitivity. The aim of this study was to evaluate its performance for the detection of urinary antigens for Legionella pneumophila serogroup 1 in two National Reference Centers for Legionella. The sensitivity and specificity of the Sofia Legionella FIA test were determined in concentrated and nonconcentrated urine samples, before and after boiling, in comparison with the BinaxNOW® Legionella Urinary Antigen Card (UAC; Alere). Compared with BinaxNOW® Legionella UAC, the sensitivity of the Sofia Legionella test was slightly higher in nonconcentrated urine samples and was identical in concentrated urine samples. The specificity of the Sofia Legionella FIA test was highly reduced by the concentration of urine samples. In nonconcentrated samples, a lack of specificity was observed in 2.3 % of samples, all of them resolved by heat treatment. The Sofia Legionella FIA is a sensitive test for detecting Legionella urinary antigens with no previous urine concentration. However, all positive samples have to be re-tested after boiling to reach a high specificity. The reading is automatized on the Sofia analyzer, which can be connected to laboratory information systems, facilitating accurate and rapid reporting of results.
Meneses, Anderson Alvarenga de Moura; Palheta, Dayara Bastos; Pinheiro, Christiano Jorge Gomes; Barroso, Regina Cely Rodrigues
2018-03-01
X-ray Synchrotron Radiation Micro-Computed Tomography (SR-µCT) allows better visualization in three dimensions with higher spatial resolution, contributing to the discovery of aspects that could not be observed through conventional radiography. The automatic segmentation of SR-µCT scans is highly valuable due to its innumerable applications in the geological sciences, especially for the morphology, typology, and characterization of rocks. For a great number of µCT scan slices, a manual segmentation process would be impractical, both for the time required and for the accuracy of the results. Aiming at the automatic segmentation of SR-µCT geological sample images, we applied and compared Energy Minimization via Graph Cuts (GC) algorithms and Artificial Neural Networks (ANNs), as well as the well-known K-means and Fuzzy C-Means algorithms. The Dice Similarity Coefficient (DSC), Sensitivity, and Precision were the metrics used for comparison. Kruskal-Wallis and Dunn's tests were applied, and the best methods were the GC algorithms and ANNs (with Levenberg-Marquardt and Bayesian Regularization). For those algorithms, an approximate Dice Similarity Coefficient of 95% was achieved. Our results confirm that those algorithms can be used for segmentation and subsequent quantification of the porosity of an igneous rock sample SR-µCT scan. Copyright © 2017 Elsevier Ltd. All rights reserved.
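A hedged sketch of the evaluation step on synthetic data: segment a grayscale slice with K-means (one of the compared methods) and score it against a reference mask with the Dice Similarity Coefficient. The real comparison also covered graph cuts and neural networks, which are not shown here.

```python
# Hedged sketch: K-means segmentation of a synthetic grayscale slice and Dice
# Similarity Coefficient evaluation against a reference mask.
import numpy as np
from sklearn.cluster import KMeans


def dice(a: np.ndarray, b: np.ndarray) -> float:
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())


rng = np.random.default_rng(0)
truth = np.zeros((80, 80), dtype=bool)
truth[20:60, 25:55] = True                                  # synthetic "phase" region
image = np.where(truth, 0.8, 0.2) + rng.normal(0, 0.05, truth.shape)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(image.reshape(-1, 1))
labels = labels.reshape(image.shape)
# Identify which cluster corresponds to the bright phase.
bright = labels == np.argmax([image[labels == k].mean() for k in (0, 1)])
print(f"Dice similarity: {dice(bright, truth):.3f}")
```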
NASA Technical Reports Server (NTRS)
Shiffman, Smadar
2004-01-01
Automated cloud detection and tracking is an important step in assessing global climate change via remote sensing. Cloud masks, which indicate whether individual pixels depict clouds, are included in many of the data products that are based on data acquired on board earth satellites. Many cloud-mask algorithms have the form of decision trees, which employ sequential tests that scientists designed based on empirical astrophysics studies and astrophysics simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In this study we explored the potential benefits of automatically learned decision trees for detecting clouds from images acquired using the Advanced Very High Resolution Radiometer (AVHRR) instrument on board the NOAA-14 weather satellite of the National Oceanic and Atmospheric Administration. We constructed three decision trees for a sample of 8-km daily AVHRR data from 2000 using a decision-tree learning procedure provided within MATLAB(R), and compared the accuracy of the decision trees to the accuracy of the cloud mask. We used ground observations collected by the National Aeronautics and Space Administration Clouds and the Earth's Radiant Energy System S'COOL project as the gold standard. For the sample data, the accuracy of the automatically learned decision trees was greater than the accuracy of the cloud masks included in the AVHRR data product.
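The decision-tree learning step can be illustrated in scikit-learn rather than MATLAB; in this hedged sketch the synthetic "channel" features and the toy labeling rule stand in for AVHRR radiances and the ground-truth cloud observations.

```python
# Hedged sketch of learning a cloud-mask decision tree from labeled pixels,
# using scikit-learn instead of the MATLAB routine mentioned above. Synthetic
# "channel" features stand in for AVHRR measurements.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 5000
channels = rng.normal(size=(n, 5))                                   # 5 AVHRR-like channels
labels = (channels[:, 0] + 0.5 * channels[:, 3] > 0.2).astype(int)   # 1 = cloud (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(channels, labels, random_state=0)
tree = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
print("accuracy vs toy ground truth:", tree.score(X_te, y_te))
```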
Automatic detection of apical roots in oral radiographs
NASA Astrophysics Data System (ADS)
Wu, Yi; Xie, Fangfang; Yang, Jie; Cheng, Erkang; Megalooikonomou, Vasileios; Ling, Haibin
2012-03-01
The apical root regions play an important role in the analysis and diagnosis of many oral diseases. Automatic detection of such regions is consequently the first step toward computer-aided diagnosis of these diseases. In this paper we propose an automatic method for periapical root region detection using state-of-the-art machine learning approaches. Specifically, we have adapted the AdaBoost classifier for apical root detection. One challenge in this task is the lack of training cases, especially diseased ones. To handle this problem, we boost the training set by including additional root regions that are close to the annotated ones and decompose the original images to randomly generate negative samples. Based on these training samples, the AdaBoost algorithm in combination with Haar wavelets is used to train an apical root detector. The learned detector usually generates a large number of true and false positives. In order to reduce the number of false positives, a confidence score for each candidate detection is calculated for further purification. We first merge tightly overlapping candidate regions and then use the confidence scores from the AdaBoost detector to eliminate false positives. The proposed method is evaluated on a dataset containing 39 annotated digitized oral X-ray images from 21 patients. The experimental results show that our approach can achieve promising detection accuracy.
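The detection recipe (augment positives with nearby copies, sample random negatives, train AdaBoost, keep only detections whose confidence clears a threshold) can be sketched as follows. Raw pixel features stand in for the Haar wavelet features used in the paper, and all data and thresholds are invented for illustration.

```python
# Hedged sketch: augmented positive patches + random negatives -> AdaBoost
# classifier -> confidence-based purification of candidate detections.
# Raw pixel features stand in for Haar wavelet features; data are synthetic.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(0)
PATCH = 16


def random_patches(n):
    # Negative samples: random patches decomposed from the background.
    return rng.normal(0.0, 1.0, size=(n, PATCH * PATCH))


def positive_patches(n):
    # Toy positives: patches with a bright central blob ("jittered" copies).
    base = rng.normal(0.0, 1.0, size=(n, PATCH, PATCH))
    base[:, 6:10, 6:10] += 3.0
    return base.reshape(n, -1)


X = np.vstack([positive_patches(300), random_patches(300)])
y = np.array([1] * 300 + [0] * 300)
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X, y)

# Confidence-based purification: keep candidate windows whose score is high.
candidates = np.vstack([positive_patches(5), random_patches(5)])
scores = clf.decision_function(candidates)
keep = scores > 0.2                      # illustrative confidence threshold
print("kept", int(keep.sum()), "of", len(candidates), "candidate windows")
```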
Improving labeling efficiency in automatic quality control of MRSI data.
Pedrosa de Barros, Nuno; McKinley, Richard; Wiest, Roland; Slotboom, Johannes
2017-12-01
The aim was to improve the efficiency of the labeling task in automatic quality control of MR spectroscopy imaging data. 28,432 short and long echo time (TE) spectra (1.5 tesla; point-resolved spectroscopy (PRESS); repetition time (TR) = 1,500 ms) from 18 different brain tumor patients were labeled by two experts as either accept or reject, depending on their quality. For each spectrum, 47 signal features were extracted. The data were then used to run several simulations and test an active learning approach using uncertainty sampling. The performance of the classifiers was evaluated as a function of the number of patients in the training set, the number of spectra in the training set, and a parameter α used to control the level of classification uncertainty required for a new spectrum to be selected for labeling. The results showed that the proposed strategy allows reductions of up to 72.97% for short TE and 62.09% for long TE in the amount of data that needs to be labeled, without significant impact on classification accuracy. Further reductions are possible with a significant but minimal impact on performance. Active learning using uncertainty sampling is an effective way to increase the labeling efficiency for training automatic quality control classifiers. Magn Reson Med 78:2399-2405, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
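The uncertainty sampling loop can be sketched generically: at each round, only spectra whose predicted class probability is close to 0.5 (within a band controlled by a parameter analogous to the study's α) are sent to the expert for labeling. The features, classifier, and thresholds below are stand-ins, not the study's pipeline.

```python
# Hedged sketch of uncertainty sampling: label only spectra whose predicted
# probability is close to 0.5. Synthetic features stand in for the 47 signal
# features; the classifier and thresholds are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 47))
w = rng.normal(size=47)
y = (X @ w + rng.normal(0, 2.0, size=2000) > 0).astype(int)   # accept/reject labels

labeled = list(range(50))                    # small seed set labeled by experts
pool = [i for i in range(2000) if i not in labeled]
alpha = 0.15                                 # request labels when |p - 0.5| < alpha

for round_ in range(5):
    clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])[:, 1]
    uncertain = [i for i, p in zip(pool, proba) if abs(p - 0.5) < alpha]
    picked = uncertain[:50]                  # "expert" labels only the uncertain ones
    labeled.extend(picked)
    pool = [i for i in pool if i not in picked]
    print(f"round {round_}: labeled {len(labeled)} spectra total")
```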
Darda, Kohinoor M; Butler, Emily E; Ramsey, Richard
2018-06-01
Humans show an involuntary tendency to copy other people's actions. Although automatic imitation builds rapport and affiliation between individuals, we do not copy actions indiscriminately. Instead, copying behaviors are guided by a selection mechanism, which inhibits some actions and prioritizes others. To date, the neural underpinnings of the inhibition of automatic imitation and differences between the sexes in imitation control are not well understood. Previous studies involved small sample sizes and low statistical power, which produced mixed findings regarding the involvement of domain-general and domain-specific neural architectures. Here, we used data from Experiment 1 ( N = 28) to perform a power analysis to determine the sample size required for Experiment 2 ( N = 50; 80% power). Using independent functional localizers and an analysis pipeline that bolsters sensitivity, during imitation control we show clear engagement of the multiple-demand network (domain-general), but no sensitivity in the theory-of-mind network (domain-specific). Weaker effects were observed with regard to sex differences, suggesting that there are more similarities than differences between the sexes in terms of the neural systems engaged during imitation control. In summary, neurocognitive models of imitation require revision to reflect that the inhibition of imitation relies to a greater extent on a domain-general selection system rather than a domain-specific system that supports social cognition.
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
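The segmentation idea (describe each superpixel by simple features and classify it with a Random Forest) can be sketched with scikit-image and scikit-learn; the synthetic image and the "greenness" labels below are illustrative stand-ins for the real training data.

```python
# Hedged sketch of superpixel-based plant segmentation: SLIC superpixels,
# mean-colour features per superpixel, Random Forest classification.
# The synthetic image and toy labels are illustrative only.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
img = rng.uniform(0.3, 0.5, size=(128, 128, 3))       # soil-like background
img[40:90, 40:90, 1] += 0.4                           # "plant": greener block
img = np.clip(img, 0, 1)

segments = slic(img, n_segments=200, compactness=10)
seg_ids = np.unique(segments)
features = np.array([img[segments == s].mean(axis=0) for s in seg_ids])   # mean RGB
labels = (features[:, 1] - features[:, [0, 2]].mean(axis=1) > 0.15).astype(int)  # toy labels

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)
plant_mask = np.isin(segments, seg_ids[clf.predict(features) == 1])
print("estimated plant area (pixels):", int(plant_mask.sum()))
```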
Model-based monitoring of stormwater runoff quality.
Birch, Heidi; Vezzaro, Luca; Mikkelsen, Peter Steen
2013-01-01
Monitoring of micropollutants (MP) in stormwater is essential to evaluate the impacts of stormwater on the receiving aquatic environment. The aim of this study was to investigate how different strategies for monitoring of stormwater quality (combining a model with field sampling) affect the information obtained about MP discharged from the monitored system. A dynamic stormwater quality model was calibrated using MP data collected by automatic volume-proportional sampling and passive sampling in a storm drainage system on the outskirts of Copenhagen (Denmark) and a 10-year rain series was used to find annual average (AA) and maximum event mean concentrations. Use of this model reduced the uncertainty of predicted AA concentrations compared to a simple stochastic method based solely on data. The predicted AA concentration, obtained by using passive sampler measurements (1 month installation) for calibration of the model, resulted in the same predicted level but with narrower model prediction bounds than by using volume-proportional samples for calibration. This shows that passive sampling allows for a better exploitation of the resources allocated for stormwater quality monitoring.
Ogden, Samantha J; Horton, Jeffrey K; Stubbs, Simon L; Tatnell, Peter J
2015-01-01
The 1.2 mm Electric Coring Tool (e-Core™) was developed to increase the throughput of FTA(™) sample collection cards used during forensic workflows and is similar to a 1.2 mm Harris manual micro-punch for sampling dried blood spots. Direct short tandem repeat (STR) DNA profiling was used to compare samples taken by the e-Core tool with those taken by the manual micro-punch. The performance of the e-Core device was evaluated using a commercially available PowerPlex™ 18D STR System. In addition, an analysis was performed that investigated the potential carryover of DNA via the e-Core punch from one FTA disc to another. This contamination study was carried out using Applied Biosystems AmpflSTR™ Identifiler™ Direct PCR Amplification kits. The e-Core instrument does not contaminate FTA discs when a cleaning punch is used following excision of discs containing samples and generates STR profiles that are comparable to those generated by the manual micro-punch. © 2014 American Academy of Forensic Sciences.
Cappelli, Christopher; Ames, Susan; Shono, Yusuke; Dust, Mark; Stacy, Alan
2017-09-01
This study used a dual-process model of cognition in order to investigate the possible influence of automatic and deliberative processes on lifetime alcohol use in a sample of drug offenders. The objective was to determine if automatic/implicit associations in memory can exert an influence over an individual's alcohol use and if decision-making ability could potentially modify the influence of these associations. 168 participants completed a battery of cognitive tests measuring implicit alcohol associations in memory (verb generation) as well as their affective decision-making ability (Iowa Gambling Task). Structural equation modeling procedures were used to test the relationship between implicit associations, decision-making, and lifetime alcohol use. Results revealed that among participants with lower levels of decision-making, implicit alcohol associations more strongly predicted higher lifetime alcohol use. These findings provide further support for the interaction between a specific decision function and its influence over automatic processes in regulating alcohol use behavior in a risky population. Understanding the interaction between automatic associations and decision processes may aid in developing more effective intervention components.
Collender, Mark A; Doherty, Kevin A J; Stanton, Kenneth T
2017-01-01
Following a shooting incident where a vehicle is used to convey the culprits to and from the scene, both the getaway car and the firearm are often deliberately burned in an attempt to destroy any forensic evidence which may be subsequently recovered. Here we investigate the factors that influence the ability to make toolmark identifications on ammunition discharged from pistols recovered from such car fires. This work was carried out by conducting a number of controlled furnace tests in conjunction with real car fire tests in which three 9mm semi-automatic pistols were burned. Comparisons between pre-burn and post burn test fired ammunition discharged from these pistols were then performed to establish if identifications were still possible. The surfaces of the furnace heated samples and car fire samples were examined following heating/burning to establish what factors had influenced their surface morphology. The primary influence on the surfaces of the furnace heated and car fire samples was the formation of oxide layers. The car fire samples were altered to a greater extent than the furnace heated samples. Identifications were still possible between pre- and post-burn discharged cartridge cases, but this was not the case for the discharged bullets. It is suggested that the reason for this is a difference between the types of firearms discharge-generated toolmarks impressed onto the base of cartridge cases compared to those striated along the surfaces of bullets. It was also found that the temperatures recorded in the front foot wells were considerably less than those recorded on top of the rear seats during the car fires. These factors should be assessed by forensic firearms examiners when performing casework involving pistols recovered from car fires. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
A practical guideline for intracranial volume estimation in patients with Alzheimer's disease
2015-01-01
Background Intracranial volume (ICV) is an important normalization measure used in morphometric analyses to correct for head size in studies of Alzheimer Disease (AD). Inaccurate ICV estimation could introduce bias in the outcome. The current study provides a decision aid in defining protocols for ICV estimation in patients with Alzheimer disease in terms of sampling frequencies that can be optimally used on the volumetric MRI data, and the type of software most suitable for use in estimating the ICV measure. Methods Two groups of 22 subjects are considered, including adult controls (AC) and patients with Alzheimer Disease (AD). Reference measurements were calculated for each subject by manually tracing intracranial cavity by the means of visual inspection. The reliability of reference measurements were assured through intra- and inter- variation analyses. Three publicly well-known software packages (Freesurfer, FSL, and SPM) were examined in their ability to automatically estimate ICV across the groups. Results Analysis of the results supported the significant effect of estimation method, gender, cognitive condition of the subject and the interaction among method and cognitive condition factors in the measured ICV. Results on sub-sampling studies with a 95% confidence showed that in order to keep the accuracy of the interleaved slice sampling protocol above 99%, the sampling period cannot exceed 20 millimeters for AC and 15 millimeters for AD. Freesurfer showed promising estimates for both adult groups. However SPM showed more consistency in its ICV estimation over the different phases of the study. Conclusions This study emphasized the importance in selecting the appropriate protocol, the choice of the sampling period in the manual estimation of ICV and selection of suitable software for the automated estimation of ICV. The current study serves as an initial framework for establishing an appropriate protocol in both manual and automatic ICV estimations with different subject populations. PMID:25953026
Pharmaceutical dust exposure at pharmacies using automatic dispensing machines: a preliminary study.
Fent, Kenneth W; Durgam, Srinivas; Mueller, Charles
2014-01-01
Automatic dispensing machines (ADMs) used in pharmacies concentrate and dispense large volumes of pharmaceuticals, including uncoated tablets that can shed dust. We evaluated 43 employees' exposures to pharmaceutical dust at three pharmacies where ADMs were used. We used an optical particle counter to identify tasks that generated pharmaceutical dust. We collected 72 inhalable dust air samples in or near the employees' breathing zones. In addition to gravimetric analysis, our contract laboratory used internal methods involving liquid chromatography to analyze these samples for active pharmaceutical ingredients (APIs) and/or lactose, an inactive filler in tablets. We had to choose samples for these additional analyses because many methods used different extraction solvents. We selected 57 samples for analysis of lactose. We used real-time particle monitoring results, observations, and information from employees on the dustiness of pharmaceuticals to select 28 samples (including 13 samples that were analyzed for lactose) for analysis of specific APIs. Pharmaceutical dust was generated during a variety of tasks like emptying and refilling of ADM canisters. Using compressed air to clean canisters and manual count machines produced the overall highest peak number concentrations (19,000-580,000 particles/L) of smallest particles (count median aerodynamic diameter ≤ 2 μm). Employees who refilled, cleaned, or repaired ADM canisters, or hand filled prescriptions were exposed to higher median air concentrations of lactose (5.0-12 μg/m(3)) than employees who did other jobs (0.04-1.3 μg/m(3)), such as administrative/office work, labeling/packaging, and verifying prescriptions. We detected 10 APIs in air, including lisinopril, a drug prescribed for high blood pressure, levothyroxine, a drug prescribed for hypothyroidism, and methotrexate, a hazardous drug prescribed for cancer and other disorders. Three air concentrations of lisinopril (1.8-2.7 μg/m(3)) exceeded the lower bound of the manufacturer's hazard control band (1-10 μg/m(3)). All other API air concentrations were below applicable occupational exposure limits. Our findings indicate that some pharmacy employees are exposed to multiple APIs and that measures are needed to control those exposures.
Kim, Hyoungrae; Jang, Cheongyun; Yadav, Dharmendra K; Kim, Mi-Hyun
2017-03-23
The accuracy of any 3D-QSAR, pharmacophore, or 3D-similarity-based chemometric target fishing model is highly dependent on a reasonable sample of active conformations. A number of diverse conformational sampling algorithms exist that exhaustively generate enough conformers; however, model-building methods rely on an explicit number of common conformers. In this work, we attempted to develop clustering algorithms that can automatically find a reasonable number of representative conformer ensembles from an asymmetric dissimilarity matrix generated with the OpenEye toolkit. RMSD was the key descriptor (variable): each column of the N × N matrix was treated as one of N variables describing the relationship (network) between a conformer (in a row) and the other N conformers. This approach was used to evaluate the performance of well-known clustering algorithms by comparing them in terms of generating representative conformer ensembles, and to test them over different matrix transformation functions with respect to stability. In the network, the representative conformer group could be resampled by four kinds of algorithms with implicit parameters. The directed dissimilarity matrix becomes the only input to the clustering algorithms. The Dunn index, Davies-Bouldin index, eta-squared values, and omega-squared values were used to evaluate the clustering algorithms with respect to compactness and explanatory power. The evaluation also includes the reduction (abstraction) rate of the data, the correlation between the sizes of the population and the samples, the computational complexity, and the memory usage. Every algorithm could find representative conformers automatically without any user intervention, and they reduced the data to 14-19% of the original size within at most 1.13 s per sample. The clustering methods are simple and practical, as they are fast and do not require any explicit parameters. RCDTC presented the maximum Dunn and omega-squared values of the four algorithms, in addition to a consistent reduction rate between the population size and the sample size. The performance of the clustering algorithms was consistent over different transformation functions. Moreover, the clustering method can also be applied to molecular dynamics sampling simulation results.
Gonçalves, João L; Alves, Vera L; Rodrigues, Fátima P; Figueira, José A; Câmara, José S
2013-08-23
In this work a highly selective and sensitive analytical procedure based on the semi-automatic microextraction by packed sorbents (MEPS) technique, using a new digitally controlled syringe (eVol®) combined with ultra-high pressure liquid chromatography (UHPLC), is proposed to determine the prenylated chalcone derived from the hop (Humulus lupulus L.), xanthohumol (XN), and its isomeric flavanone isoxanthohumol (IXN) in beers. Extraction and UHPLC parameters were accurately optimized to achieve the highest recoveries and to enhance the analytical characteristics of the method. Important parameters affecting MEPS performance, namely the type of sorbent material (C2, C8, C18, SIL, and M1), elution solvent system, number of extraction cycles (extract-discard), sample volume, elution volume, and sample pH, were evaluated. The optimal experimental conditions involve loading 500 μL of sample through a C18 sorbent in a MEPS syringe placed in the semi-automatic eVol® syringe, followed by elution using 250 μL of acetonitrile (ACN) in a 10-cycle extraction (about 5 min for the entire sample preparation step). The obtained extract is directly analyzed in the UHPLC system using a binary mobile phase composed of aqueous 0.1% formic acid (eluent A) and ACN (eluent B) in gradient elution mode (10 min total analysis). Under optimized conditions good results were obtained in terms of linearity within the established concentration range, with correlation coefficient (R) values higher than 0.986 and a residual deviation for each calibration point below 12%. The limits of detection (LOD) and quantification (LOQ) obtained were 0.4 ng mL(-1) and 1.0 ng mL(-1) for IXN, and 0.9 ng mL(-1) and 3.0 ng mL(-1) for XN, respectively. Precision was lower than 4.6% for IXN and 8.4% for XN. Typical recoveries ranged between 67.1% and 99.3% for IXN and between 74.2% and 99.9% for XN, with relative standard deviations (%RSD) no larger than 8%. Application of the proposed analytical procedure to commercial beers revealed the presence of both target prenylchalcones in all samples, IXN being the most abundant with concentrations between 0.126 and 0.200 μg mL(-1). Copyright © 2013 Elsevier B.V. All rights reserved.
Himmel, Wolfgang; Reincke, Ulrich; Michelmann, Hans Wilhelm
2009-07-22
Both healthy and sick people increasingly use electronic media to obtain medical information and advice. For example, Internet users may send requests to Web-based expert forums, or so-called "ask the doctor" services. The aim was to automatically classify lay requests to an Internet medical expert forum using a combination of different text-mining strategies. We first manually classified a sample of 988 requests directed to an involuntary childlessness forum on the German website "Rund ums Baby" ("Everything about Babies") into one or more of 38 categories belonging to two dimensions ("subject matter" and "expectations"). After creating start and synonym lists, we calculated the average Cramer's V statistic for the association of each word with each category. We also used principal component analysis and singular value decomposition as further text-mining strategies. With these measures we trained regression models and, on the basis of the best regression models, determined for each request the probability of belonging to each of the 38 categories, with a cutoff of 50%. Recall and precision on a test sample were calculated as measures of quality for the automatic classification. According to the manual classification of 988 documents, 102 (10%) documents fell into the category "in vitro fertilization (IVF)," 81 (8%) into the category "ovulation," 79 (8%) into "cycle," and 57 (6%) into "semen analysis." These were the four most frequent categories in the subject matter dimension (consisting of 32 categories). The expectation dimension comprised six categories; we classified 533 documents (54%) as "general information" and 351 (36%) as a wish for "treatment recommendations." The generation of indicator variables based on the chi-square analysis and Cramer's V proved to be the best approach for automatic classification in about half of the categories. In combination with the two other approaches, 100% precision and 100% recall were realized in 18 (47%) of the 38 categories in the test sample. For 35 (92%) categories, precision and recall were better than 80%. For some categories, the input variables (i.e., "words") also included variables from other categories, most often with a negative sign. For example, the absence of words predictive for "menstruation" was a strong indicator for the category "pregnancy test." Our approach suggests a way of automatically classifying and analyzing unstructured information in Internet expert forums. The technique can perform a preliminary categorization of new requests and help Internet medical experts to better handle the mass of information and to give professional feedback.
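The word-category association measure can be illustrated with Cramer's V computed from the 2x2 contingency table of "word present" versus "document in category". The tiny corpus and category labels in this sketch are invented for illustration, not drawn from the study.

```python
# Hedged sketch: Cramer's V for the association between a word's presence and a
# document category, from a 2x2 contingency table. Corpus and labels are toys.
import numpy as np
from scipy.stats import chi2_contingency

docs = [
    "ivf cycle question", "semen analysis result", "ovulation timing",
    "ivf success rates", "cycle length worry", "semen test values",
]
categories = [1, 0, 0, 1, 0, 0]          # 1 = "IVF" category (toy labels)


def cramers_v(word: str, docs, categories) -> float:
    present = np.array([word in d.split() for d in docs], dtype=int)
    cats = np.array(categories)
    table = np.array([[np.sum((present == i) & (cats == j)) for j in (0, 1)]
                      for i in (0, 1)])
    chi2 = chi2_contingency(table, correction=False)[0]
    n = table.sum()
    k = min(table.shape) - 1
    return float(np.sqrt(chi2 / (n * k)))


for word in ("ivf", "cycle", "semen"):
    print(word, round(cramers_v(word, docs, categories), 3))
```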
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Fei; Maslov, Sergei; Yoo, Shinjae
Here, transcriptome datasets from thousands of samples of the model plant Arabidopsis thaliana have been collectively generated by multiple individual labs. Although integration and meta-analysis of these samples has become routine in the plant research community, it is often hampered by the lack of metadata or differences in annotation styles by different labs. In this study, we carefully selected and integrated 6,057 Arabidopsis microarray expression samples from 304 experiments deposited to NCBI GEO. Metadata such as tissue type, growth condition, and developmental stage were manually curated for each sample. We then studied the global expression landscape of the integrated dataset and found that samples of the same tissue tend to be more similar to each other than to samples of other tissues, even in different growth conditions or developmental stages. Root has the most distinct transcriptome compared to aerial tissues, but the transcriptome of cultured root is more similar to those of aerial tissues as the former samples lost their cellular identity. Using a simple computational classification method, we showed that the tissue type of a sample can be successfully predicted based on its expression profile, opening the door for automatic metadata extraction and facilitating re-use of plant transcriptome data. As a proof of principle we applied our automated annotation pipeline to 708 RNA-seq samples from public repositories and verified the accuracy of our predictions with samples' metadata provided by authors.
Yamada, Yuki; Ninomiya, Satoshi; Hiraoka, Kenzo; Chen, Lee Chuin
2016-01-01
We report on combining a self-aspirated sampling probe and an ESI source using a single metal capillary which is electrically grounded and safe for use by the operator. To generate an electrospray, a negative H.V. is applied to the counter electrode of the ESI emitter to operate in positive ion mode. The sampling/ESI capillary is enclosed within another concentric capillary similar to the arrangement for a standard pneumatically assisted ESI source. The suction of the liquid sample is due to the Venturi effect created by the high-velocity gas flow near the ESI tip. In addition to serving as the mechanism for suction, the high-velocity gas flow also assists in the nebulization of charged droplets, thus producing a stable ion signal. Even though the potential of the ion source counter electrode is more negative than the mass spectrometer in the positive ion mode, the electric field effect is not significant if the ion source and the mass spectrometer are separated by a sufficient distance. Ion transmission is achieved by the viscous flow of the carrier gas. Using the present arrangement, the user can hold the ion source in a bare hand and the ion signal appears almost immediately when the sampling capillary is brought into contact with the liquid sample. The automated analysis of multiple samples can also be achieved by using motorized sample stage and an automated ion source holder. PMID:28616373
Zonta, Marco Antonio; Velame, Fernanda; Gema, Samara; Filassi, Jose Roberto; Longatto-Filho, Adhemar
2014-01-01
Background Breast cancer is the second cause of death in women worldwide. The spontaneous breast nipple discharge may contain cells that can be analyzed for malignancy. Halo® Mamo Cyto Test (HMCT) was recently developed as an automated system indicated to aspirate cells from the breast ducts. The objective of this study was to standardize the methodology of sampling and sample preparation of nipple discharge obtained by the automated method Halo breast test and perform cytological evaluation in samples preserved in liquid medium (SurePath™). Methods We analyzed 564 nipple fluid samples, from women between 20 and 85 years old, without history of breast disease and neoplasia, no pregnancy, and without gynecologic medical history, collected by HMCT method and preserved in two different vials with solutions for transport. Results From 306 nipple fluid samples from method 1, 199 (65%) were classified as unsatisfactory (class 0), 104 (34%) samples were classified as benign findings (class II), and three (1%) were classified as undetermined to neoplastic cells (class III). From 258 samples analyzed in method 2, 127 (49%) were classified as class 0, 124 (48%) were classified as class II, and seven (2%) were classified as class III. Conclusion Our study suggests an improvement in the quality and quantity of cellular samples when the association of the two methodologies is performed, Halo breast test and the method in liquid medium. PMID:29147397
Time-of-flight radio location system
McEwan, T.E.
1996-04-23
A bi-static radar configuration measures the direct time-of-flight of a transmitted RF pulse and is capable of measuring this time-of-flight with a jitter on the order of about one pico-second, or about 0.01 inch of free space distance for an electromagnetic pulse over a range of about one to ten feet. A transmitter transmits a sequence of electromagnetic pulses in response to a transmit timing signal, and a receiver samples the sequence of electromagnetic pulses with controlled timing in response to a receive timing signal, and generates a sample signal in response to the samples. A timing circuit supplies the transmit timing signal to the transmitter and supplies the receive timing signal to the receiver. The receive timing signal causes the receiver to sample the sequence of electromagnetic pulses such that the time between transmission of pulses in the sequence and sampling by the receiver sweeps over a range of delays. The receive timing signal sweeps over the range of delays in a sweep cycle such that pulses in the sequence are sampled at the pulse repetition rate, and with different delays in the range of delays to produce a sample signal representing magnitude of a received pulse in equivalent time. Automatic gain control circuitry in the receiver controls the magnitude of the equivalent time sample signal. A signal processor analyzes the sample signal to indicate the time-of-flight of the electromagnetic pulses in the sequence. 7 figs.
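The equivalent-time sampling idea described above (sample a repetitive pulse once per repetition, each time with a slightly longer delay, so that the slow sequence of samples reconstructs the fast waveform) can be illustrated numerically; the pulse shape, sweep range, and time of flight in this sketch are invented, not the patent's values.

```python
# Hedged sketch of equivalent-time sampling: a repetitive pulse is sampled once
# per repetition with a delay that sweeps across the range of interest, so the
# slow sample sequence reconstructs the fast waveform. Numbers are illustrative.
import numpy as np


def received_pulse(t, t_flight=3.2e-9, width=0.5e-9):
    """Toy received echo: a Gaussian pulse delayed by the time of flight."""
    return np.exp(-0.5 * ((t - t_flight) / width) ** 2)


delays = np.linspace(0, 10e-9, 400)        # receive-timing sweep over 0-10 ns
samples = np.array([received_pulse(d) for d in delays])   # one sample per pulse

tof_estimate = delays[int(np.argmax(samples))]
print(f"estimated time of flight: {tof_estimate * 1e9:.2f} ns")
```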
Maintaining and Enhancing Diversity of Sampled Protein Conformations in Robotics-Inspired Methods.
Abella, Jayvee R; Moll, Mark; Kavraki, Lydia E
2018-01-01
The ability to efficiently sample structurally diverse protein conformations allows one to gain a high-level view of a protein's energy landscape. Algorithms from robot motion planning have been used for conformational sampling, and several of these algorithms promote diversity by keeping track of "coverage" in conformational space based on the local sampling density. However, large proteins present special challenges. In particular, larger systems require running many concurrent instances of these algorithms, but these algorithms can quickly become memory intensive because they typically keep previously sampled conformations in memory to maintain coverage estimates. In addition, robotics-inspired algorithms depend on defining useful perturbation strategies for exploring the conformational space, which is a difficult task for large proteins because such systems are typically more constrained and exhibit complex motions. In this article, we introduce two methodologies for maintaining and enhancing diversity in robotics-inspired conformational sampling. The first method addresses algorithms based on coverage estimates and leverages the use of a low-dimensional projection to define a global coverage grid that maintains coverage across concurrent runs of sampling. The second method is an automatic definition of a perturbation strategy through readily available flexibility information derived from B-factors, secondary structure, and rigidity analysis. Our results show a significant increase in the diversity of the conformations sampled for proteins consisting of up to 500 residues when applied to a specific robotics-inspired algorithm for conformational sampling. The methodologies presented in this article may be vital components for the scalability of robotics-inspired approaches.
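The shared coverage bookkeeping described above can be sketched as a fixed grid over a low-dimensional projection: each sampled conformation is binned, and a conformation counts as novel only if its cell has not been visited by any concurrent run. In this hedged sketch, random 2-D points stand in for projected conformations, and the grid bounds and resolution are illustrative.

```python
# Hedged sketch of a shared coverage grid over a low-dimensional projection:
# bin each projected conformation and flag it as novel if its cell is unvisited.
# Random 2-D points stand in for projected conformations.
import numpy as np


class CoverageGrid:
    def __init__(self, lower, upper, bins):
        self.lower = np.asarray(lower, dtype=float)
        self.upper = np.asarray(upper, dtype=float)
        self.bins = bins
        self.visited = set()          # would be shared across concurrent runs

    def cell(self, point):
        frac = (np.asarray(point) - self.lower) / (self.upper - self.lower)
        idx = np.clip((frac * self.bins).astype(int), 0, self.bins - 1)
        return tuple(idx)

    def record(self, point) -> bool:
        """Return True if the point lands in a previously unvisited cell."""
        c = self.cell(point)
        novel = c not in self.visited
        self.visited.add(c)
        return novel


rng = np.random.default_rng(0)
grid = CoverageGrid(lower=[-3, -3], upper=[3, 3], bins=30)
projected = rng.normal(size=(1000, 2))       # stand-in for projected conformations
novel = sum(grid.record(p) for p in projected)
print(f"{novel} of {len(projected)} samples opened new coverage cells")
```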
Fast automated online xylanase activity assay using HPAEC-PAD.
Cürten, Christin; Anders, Nico; Juchem, Niels; Ihling, Nina; Volkenborn, Kristina; Knapp, Andreas; Jaeger, Karl-Erich; Büchs, Jochen; Spiess, Antje C
2018-01-01
In contrast to biochemical reactions, which are often carried out under automatic control and maintained overnight, the automation of chemical analysis is usually neglected. Samples are either analyzed in a rudimentary fashion using in situ techniques, or aliquots are withdrawn and stored to facilitate more precise offline measurements, which can result in sampling and storage errors. Therefore, in this study, we implemented automated reaction control, sampling, and analysis. As an example, the activities of xylanases on xylotetraose and soluble xylan were examined using high-performance anion exchange chromatography with pulsed amperometric detection (HPAEC-PAD). The reaction was performed in HPLC vials inside a temperature-controlled Dionex™ AS-AP autosampler. It was started automatically when the autosampler pipetted substrate and enzyme solution into the reaction vial. Afterwards, samples from the reaction vial were injected repeatedly for 60 min onto a CarboPac™ PA100 column for analysis. Due to the rapidity of the reaction, the analytical method and the gradient elution of 200 mM sodium hydroxide solution and 100 mM sodium hydroxide with 500 mM sodium acetate were adapted to allow for an overall separation time of 13 min and a detection limit of 0.35-1.83 mg/L (depending on the xylooligomer). This analytical method was applied to measure the soluble short-chain products (xylose, xylobiose, xylotriose, xylotetraose, xylopentaose, and longer xylooligomers) that arise during enzymatic hydrolysis. Based on that, the activities of three endoxylanases (EX) were determined as 294 U/mg for EX from Aspergillus niger, 1.69 U/mg for EX from Bacillus stearothermophilus, and 0.36 U/mg for EX from Bacillus subtilis. Graphical abstract Xylanase activity assay automation.
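As an illustration of how specific activities such as those reported above are typically derived from a time course of product formation, the sketch below fits the initial rate of product release and converts it to U/mg (1 U taken as 1 µmol of product per minute). The concentration readings, reaction volume, and enzyme amount are hypothetical and not taken from the study.

```python
import numpy as np

# Hypothetical HPAEC-PAD readings: xylose-equivalent product (mg/L) over time (min)
time_min = np.array([0.0, 13.0, 26.0, 39.0, 52.0])
product_mg_per_L = np.array([0.0, 4.1, 8.3, 12.2, 16.5])

MW_XYLOSE = 150.13           # g/mol; 1 g/mol == 1 ug/umol
REACTION_VOLUME_ML = 1.0     # assumed reaction volume in the HPLC vial
ENZYME_MG = 0.005            # assumed amount of enzyme in the vial

# Initial rate from a linear fit of concentration versus time
slope_mg_per_L_min, _ = np.polyfit(time_min, product_mg_per_L, 1)

# mg/L equals ug/mL, so dividing by the molar mass gives umol/(mL*min)
rate_umol_per_mL_min = slope_mg_per_L_min / MW_XYLOSE

# 1 U = 1 umol of product released per minute
activity_U_per_mg = rate_umol_per_mL_min * REACTION_VOLUME_ML / ENZYME_MG
print(f"specific activity: {activity_U_per_mg:.2f} U/mg")
```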
Factors affecting volume calculation with single photon emission tomography (SPECT) method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T.H.; Lee, K.H.; Chen, D.C.P.
1985-05-01
Several factors may influence the calculation of absolute volumes (VL) from SPECT images. The effect of these factors must be established to optimize the technique. The authors investigated the following influences on the VL calculations: % of background (BG) subtraction, reconstruction filters, sample activity, angular sampling, and edge detection methods. Transaxial images of a liver-trunk phantom filled with Tc-99m from 1 to 3 µCi/cc were obtained in a 64x64 matrix with a Siemens Rota Camera and MDS computer. Different reconstruction filters, including Hanning 20, 32, 64 and Butterworth 20, 32, were used. Angular samplings were performed in 3- and 6-degree increments. ROIs were drawn manually and with an automatic edge detection program around the image after BG subtraction. VLs were calculated by multiplying the number of pixels within the ROI by the slice thickness and the x- and y-calibrations of each pixel. One or 2 pixels per slice thickness were applied in the calculation. An inverse correlation was found between the calculated VL and the % of BG subtraction (r=0.99 for 1, 2, and 3 µCi/cc activities). Based on the authors' linear regression analysis, the correct liver VL was measured with about 53% BG subtraction. The reconstruction filters, slice thickness, and angular sampling had only minor effects on the calculated phantom volumes. Detection of the ROI automatically by the computer was not as accurate as the manual method. The authors conclude that the % of BG subtraction appears to be the most important factor affecting the VL calculation. With good quality control and appropriate reconstruction factors, correct VL calculations can be achieved with SPECT.
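A minimal sketch of the VL recipe described above (ROI pixel count times slice thickness times pixel calibrations, summed over slices) is given below. The background handling is simplified to a threshold at a fraction of the slice maximum as a stand-in for the "% of BG subtraction", and the slice data, pixel sizes, and fraction are hypothetical.

```python
import numpy as np

def slice_volume(counts, bg_fraction, pixel_x_cm, pixel_y_cm, slice_thickness_cm):
    """Volume contribution of one transaxial slice.

    counts: 2-D array of reconstructed counts for the slice (e.g. 64x64).
    bg_fraction: background cut expressed as a fraction of the slice maximum
                 (a simplified stand-in for the % of BG subtraction, e.g. 0.53).
    """
    threshold = bg_fraction * counts.max()
    roi_pixels = np.count_nonzero(counts > threshold)   # pixels kept inside the ROI
    return roi_pixels * pixel_x_cm * pixel_y_cm * slice_thickness_cm

# Hypothetical study: synthetic slices stand in for reconstructed phantom data
rng = np.random.default_rng(0)
slices = [rng.poisson(50, size=(64, 64)).astype(float) for _ in range(32)]
vl_cc = sum(slice_volume(s, 0.53, 0.6, 0.6, 0.6) for s in slices)
print(f"estimated volume: {vl_cc:.0f} cm^3")
```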
NASA Astrophysics Data System (ADS)
Manjili, Mohsen Hajipour; Halali, Mohammad
2018-02-01
Samples of INCONEL 718 were levitated and melted in a slag by the application of an electromagnetic field. The effects of temperature, time, and slag composition on the inclusion content of the samples were studied thoroughly. Samples were compared with the original alloy to study the effect of the process on inclusions. Size, shape, and chemical composition of the remaining non-metallic inclusions were investigated. The samples were prepared according to the Standard Guide for Preparing and Evaluating Specimens for Automatic Inclusion Assessment of Steel (ASTM E 768-99), and the results were reported according to the Standard Test Methods for Determining the Inclusion Content of Steel (ASTM E 45-97). Results indicated that increasing temperature and processing time produced a greater level of cleanliness, and the number and size of the remaining inclusions decreased significantly. It was also observed that increasing the calcium fluoride content of the slag helped reduce inclusion content.
Design of a portable electronic nose for real-fake detection of liquors
NASA Astrophysics Data System (ADS)
Qi, Pei-Feng; Zeng, Ming; Li, Zhi-Hua; Sun, Biao; Meng, Qing-Hao
2017-09-01
Portability is a major issue that influences the practical application of electronic noses (e-noses). For liquor detection, an e-nose must preprocess the liquid samples (e.g., using evaporation and thermal desorption), which makes a portable design even more difficult. To realize convenient and rapid detection of liquors, we designed a portable e-nose platform that consists of hardware and software systems. The hardware system contains an evaporation/sampling module, a reaction module, a control/data acquisition and analysis module, and a power module. The software system provides a user-friendly interface and can achieve automatic sampling and data processing. This e-nose platform has been applied to the real-fake recognition of Chinese liquors. Through parameter optimization of a one-class support vector machine classifier, the error rate on negative samples is greatly reduced, and the overall recognition accuracy is improved. The results validated the feasibility of the designed portable e-nose platform.
Yu, Yong-Jie; Xia, Qiao-Ling; Wang, Sheng; Wang, Bing; Xie, Fu-Wei; Zhang, Xiao-Bing; Ma, Yun-Ming; Wu, Hai-Long
2014-09-12
Peak detection and background drift correction (BDC) are the key stages in using chemometric methods to analyze chromatographic fingerprints of complex samples. This study developed a novel chemometric strategy for simultaneous automatic chromatographic peak detection and BDC. A robust statistical method was used to estimate the instrumental noise level, coupled with the first-order derivative of the chromatographic signal, to automatically extract chromatographic peaks from the data. A local curve-fitting strategy was then employed for BDC. Simulated and real liquid chromatographic data with various kinds of background drift and degrees of peak overlap were designed to verify the performance of the proposed strategy. The underlying chromatographic peaks can be automatically detected and reasonably integrated by this strategy, and background-corrected chromatograms can be precisely obtained. The proposed method was used to analyze a complex gas chromatography dataset that monitored quality changes in plant extracts during a storage procedure. Copyright © 2014 Elsevier B.V. All rights reserved.
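The general idea of combining a robust noise estimate, a first-derivative peak test, and a baseline fit through non-peak points can be sketched as below. This is a simplified illustration, not the authors' exact algorithm: the noise level is taken from the median absolute deviation of the derivative, and the "local curve fitting" is approximated by a single low-order polynomial; the chromatogram is synthetic.

```python
import numpy as np

def detect_peaks_and_baseline(signal, k=5.0):
    """Derivative-based peak flagging with a robust noise estimate, followed by
    a background (drift) fit through the non-peak points. Simplified sketch."""
    d = np.diff(signal)
    # Robust noise level of the derivative via the median absolute deviation
    noise = 1.4826 * np.median(np.abs(d - np.median(d)))
    # Points whose local slope magnitude exceeds k * noise are treated as peak regions
    in_peak = np.zeros_like(signal, dtype=bool)
    in_peak[1:] |= np.abs(d) > k * noise
    in_peak[:-1] |= np.abs(d) > k * noise
    # Fit the drifting background only through non-peak points
    x = np.arange(signal.size)
    coeffs = np.polyfit(x[~in_peak], signal[~in_peak], deg=3)
    baseline = np.polyval(coeffs, x)
    return in_peak, signal - baseline

# Hypothetical chromatogram: two Gaussian peaks on a curved, drifting background
x = np.linspace(0, 30, 3000)
chrom = (np.exp(-((x - 8) / 0.2) ** 2) + 0.5 * np.exp(-((x - 17) / 0.3) ** 2)
         + 0.02 * x + 0.001 * x ** 2
         + np.random.default_rng(1).normal(0, 0.003, x.size))
peaks, corrected = detect_peaks_and_baseline(chrom)
```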
Eccles, B A; Klevecz, R R
1986-06-01
Mitotic frequency in a synchronous culture of mammalian cells was determined fully automatically and in real time using low-intensity phase-contrast microscopy and a Newvicon video camera connected to an EyeCom III image processor. Image samples, at a frequency of one per minute for 50 hours, were analyzed by first extracting the high-frequency picture components, then thresholding and probing for annular objects indicative of putative mitotic cells. Both the extraction of high-frequency components and the recognition of rings of varying radii and discontinuities employed novel algorithms. Spatial and temporal relationships between annuli were examined to discern the occurrences of mitoses, and such events were recorded in a computer data file. At present, the automatic analysis is suited for random cell proliferation rate measurements or cell cycle studies. The automatic identification of mitotic cells as described here provides a measure of the average proliferative activity of the cell population as a whole and eliminates more than eight hours of manual review per time-lapse video recording.
Measuring micro-organism gas production
NASA Technical Reports Server (NTRS)
Wilkins, J. R.; Pearson, A. O.; Mills, S. M.
1973-01-01
A transducer that senses pressure buildup is easy to assemble and use, and the rate of gas produced can be measured automatically and accurately. The method can be used in research, in clinical laboratories, and for environmental pollution studies because of its ability to rapidly detect and quantify the number of gas-producing microorganisms in water, beverages, and clinical samples.
The Impact of Facial Emotional Expressions on Behavioral Tendencies in Women and Men
ERIC Educational Resources Information Center
Seidel, Eva-Maria; Habel, Ute; Kirschner, Michaela; Gur, Ruben C.; Derntl, Birgit
2010-01-01
Emotional faces communicate both the emotional state and behavioral intentions of an individual. They also activate behavioral tendencies in the perceiver, namely approach or avoidance. Here, we compared more automatic motor to more conscious rating responses to happy, sad, angry, and disgusted faces in a healthy student sample. Happiness was…
Visual Search by Children with and without ADHD
ERIC Educational Resources Information Center
Mullane, Jennifer C.; Klein, Raymond M.
2008-01-01
Objective: To summarize the literature that has employed visual search tasks to assess automatic and effortful selective visual attention in children with and without ADHD. Method: Seven studies with a combined sample of 180 children with ADHD (M age = 10.9) and 193 normally developing children (M age = 10.8) are located. Results: Using a…
46 CFR 161.002-2 - Types of fire-protective systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., but not be limited to, automatic fire and smoke detecting systems, manual fire alarm systems, sample... unit, fire detectors, smoke detectors, and audible and visual alarms distinct in both respects from the alarms of any other system not indicating fire. (c) Manual fire alarm systems. For the purpose of this...
A New Method for Measuring Text Similarity in Learning Management Systems Using WordNet
ERIC Educational Resources Information Center
Alkhatib, Bassel; Alnahhas, Ammar; Albadawi, Firas
2014-01-01
As text sources are getting broader, measuring text similarity is becoming more compelling. Automatic text classification, search engines and auto answering systems are samples of applications that rely on text similarity. Learning management systems (LMS) are becoming more important since electronic media is getting more publicly available. As…
Controlling suspended samplers by programmable calculator and interface circuitry
Rand E. Eads; Mark R. Boolootian
1985-01-01
A programmable calculator connected to an interface circuit can control automatic samplers and record streamflow data. The circuit converts a voltage representing water stage to a digital signal. The sampling program logs streamflow data when there is a predefined deviation from a linear trend in the water elevation. The calculator estimates suspended sediment...
Controlling suspended sediment samplers by programmable calculator and interface circuitry
Rand E. Eads; Mark R. Boolootian
1985-01-01
A programmable calculator connected to an interface circuit can control automatic samplers and record streamflow data. The circuit converts a voltage representing water stage to a digital signal. The sampling program logs streamflow data when there is a predefined deviation from a linear trend in the water elevation. The calculator estimates suspended sediment...
Predictors of Early versus Later Spelling Development in Danish
ERIC Educational Resources Information Center
Nielsen, Anne-Mette Veber; Juul, Holger
2016-01-01
The present study examined phoneme awareness, phonological short term memory, letter knowledge, rapid automatized naming (RAN), and visual-verbal paired associate learning (PAL) as longitudinal predictors of spelling skills in an early phase (Grade 2) and a later phase (Grade 5) of development in a sample of 140 children learning to spell in the…
The Suitability of Cloud-Based Speech Recognition Engines for Language Learning
ERIC Educational Resources Information Center
Daniels, Paul; Iwago, Koji
2017-01-01
As online automatic speech recognition (ASR) engines become more accurate and more widely implemented with call software, it becomes important to evaluate the effectiveness and the accuracy of these recognition engines using authentic speech samples. This study investigates two of the most prominent cloud-based speech recognition engines--Apple's…
Diagnostics for the Analysis of Surface Chemistry Effects on Composite Energetic Material Reactions
2015-10-30
integration time) and a NETZSCH STA 449 Jupiter that will allow for consistency and efficiency with its automatic 20 sample changer. (2) Together these...Purchase of the NETZSCH STA 449 Jupiter (DSC-TGA) to resolve reaction kinetics under equilibrium conditions. Images of this instrumentation are included in
Gil-Moltó, J; Varea, M; Galindo, N; Crespo, J
2009-02-27
The application of the thermal desorption (TD) method coupled with gas chromatography-mass spectrometry (GC-MS) to the analysis of aerosol organics has been the focus of many studies in recent years. This technique overcomes the main drawbacks of the solvent extraction approach such as the use of large amounts of toxic organic solvents and long and laborious extraction processes. In this work, the application of an automatic TD-GC-MS instrument for the determination of particle-bound polycyclic aromatic hydrocarbons (PAHs) is evaluated. This device offers the advantage of allowing the analysis of either gaseous or particulate organics without any modification. Once the thermal desorption conditions for PAH extraction were optimised, the method was verified on NIST standard reference material (SRM) 1649a urban dust, showing good linearity, reproducibility and accuracy for all target PAHs. The method has been applied to PM10 and PM2.5 samples collected on quartz fibre filters with low volume samplers, demonstrating its capability to quantify PAHs when only a small amount of sample is available.
Automatic Censoring CFAR Detector Based on Ordered Data Difference for Low-Flying Helicopter Safety
Jiang, Wen; Huang, Yulin; Yang, Jianyu
2016-01-01
Being equipped with a millimeter-wave radar allows a low-flying helicopter to sense the surroundings in real time, which significantly increases its safety. However, nonhomogeneous clutter environments, such as a multiple target situation and a clutter edge environment, can dramatically affect the radar signal detection performance. In order to improve the radar signal detection performance in nonhomogeneous clutter environments, this paper proposes a new automatic censored cell averaging CFAR detector. The proposed CFAR detector does not require any prior information about the background environment and uses a hypothesis test on the first-order difference (FOD) of the ordered data to reject unwanted samples in the reference window. After censoring the unwanted ranked cells, the remaining samples are combined to form an estimate of the background power level, thereby improving radar signal detection performance. The simulation results show that the FOD-CFAR detector provides low-loss CFAR performance in a homogeneous environment and also performs robustly in nonhomogeneous environments. Furthermore, measured results from a low-flying helicopter validate the basic performance of the proposed method. PMID:27399714
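The censoring-then-averaging idea can be illustrated with the sketch below. It is not the paper's exact hypothesis test: the jump detection on the ordered-data differences is replaced by a simple comparison against a running mean, followed by standard cell-averaging CFAR for a square-law (exponential) background; the reference window values and parameters are hypothetical.

```python
import numpy as np

def fod_censored_ca_cfar(reference_cells, cut_power, censor_factor=3.0, pfa=1e-4):
    """Automatic censoring via ordered-data differences, then CA-CFAR (sketch)."""
    ranked = np.sort(reference_cells)
    diffs = np.diff(ranked)
    keep = len(ranked)
    for i, d in enumerate(diffs, start=1):
        # A large jump in the ordered data suggests interfering targets or a clutter edge
        if d > censor_factor * ranked[:i].mean():
            keep = i
            break
    clean = ranked[:keep]                      # censored reference window
    noise_power = clean.mean()
    # CA-CFAR threshold multiplier for an exponential (square-law) background
    alpha = keep * (pfa ** (-1.0 / keep) - 1.0)
    return cut_power > alpha * noise_power     # True -> declare a detection

# Hypothetical range profile: exponential clutter plus two interfering targets
rng = np.random.default_rng(3)
ref = rng.exponential(1.0, size=24)
ref[5] += 30.0
ref[11] += 25.0
print(fod_censored_ca_cfar(ref, cut_power=20.0))
```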
Detection of microbial concentration in ice-cream using the impedance technique.
Grossi, M; Lanzoni, M; Pompei, A; Lazzarini, R; Matteuzzi, D; Riccò, B
2008-06-15
The detection of microbial concentration, essential for safe and high-quality food products, is traditionally performed with the plate count technique, which is reliable but also slow and not easily realized in automated form, as required for direct use in industrial machines. For this purpose, the method based on impedance measurements represents an attractive alternative, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can easily be realized in automated form. In this paper such a method has been experimentally studied in the case of ice-cream products. In particular, all main ice-cream compositions of real interest have been considered, and no nutrient medium was used to dilute the samples. A measurement set-up was realized using benchtop instruments for impedance measurements on samples whose bacteria concentration was independently measured by means of standard plate counts. The obtained results clearly indicate that impedance measurement represents a feasible and reliable technique to detect total microbial concentration in ice cream, suitable to be implemented as an embedded system for industrial machines.
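Impedance methods of this kind are commonly related to plate counts through a linear calibration between the impedance detection time and the logarithm of the initial concentration. The sketch below shows that calibration step; the paired detection times and plate counts are hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical calibration data: impedance detection time (h) paired with
# plate-count concentrations (CFU/mL) measured on the same ice-cream samples
detection_time_h = np.array([4.2, 5.6, 7.1, 8.5, 9.9])
plate_count_cfu = np.array([1e6, 1e5, 1e4, 1e3, 1e2])

# Linear calibration: log10(CFU/mL) versus detection time
slope, intercept = np.polyfit(detection_time_h, np.log10(plate_count_cfu), 1)

def estimate_cfu(t_detect_h):
    """Predict microbial concentration from a measured detection time."""
    return 10 ** (slope * t_detect_h + intercept)

print(f"{estimate_cfu(6.0):.2e} CFU/mL")
```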
González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio
2015-03-01
A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
Sørbye, Sveinung Wergeland; Pedersen, Mette Kristin; Ekeberg, Bente; Williams, Merete E Johansen; Sauer, Torill; Chen, Ying
2017-01-01
The Norwegian Cervical Cancer Screening Program recommends screening every 3 years for women between 25 and 69 years of age. There is a large difference in the percentage of unsatisfactory samples between laboratories that use different brands of liquid-based cytology. We wished to examine whether inadequate ThinPrep samples could be made satisfactory by processing them with the SurePath protocol. A total of 187 inadequate ThinPrep specimens from the Department of Clinical Pathology at University Hospital of North Norway were sent to Akershus University Hospital for conversion to SurePath medium. Ninety-one (48.7%) were processed through the automated "gynecologic" application for cervical cytology samples, and 96 (51.3%) were processed with the "nongynecological" automatic program. Out of 187 samples that had been unsatisfactory by ThinPrep, 93 (49.7%) were satisfactory after being converted to SurePath. The rate of satisfactory cytology was 36.6% and 62.5% for samples run through the "gynecology" program and "nongynecology" program, respectively. Of the 93 samples that became satisfactory after conversion from ThinPrep to SurePath, 80 (86.0%) were screened as normal, while 13 samples (14.0%) were given an abnormal diagnosis, which included 5 atypical squamous cells of undetermined significance, 5 low-grade squamous intraepithelial lesions, 2 atypical glandular cells not otherwise specified, and 1 atypical squamous cells cannot exclude high-grade squamous intraepithelial lesion. A total of 2.1% (4/187) of the women received a diagnosis of cervical intraepithelial neoplasia 2 or higher at later follow-up. Converting cytology samples from ThinPrep to SurePath processing can reduce the number of unsatisfactory samples. The samples should be run through the "nongynecology" program to ensure an adequate number of cells.
Bakshi, Sonal R; Shukla, Shilin N; Shah, Pankaj M
2009-01-01
We developed a Microsoft Access-based laboratory management system to facilitate database management of leukemia patients referred for cytogenetic tests with regard to karyotyping and fluorescence in situ hybridization (FISH). The database is custom-made for entry of patient data, clinical details, sample details, and cytogenetic test results, and for data mining in various ongoing research areas. A number of clinical research laboratory-related tasks are carried out faster using specific "queries." The tasks include tracking the clinical progression of a particular patient over multiple visits, treatment response, morphological and cytogenetic response, survival time, automatic grouping of patients by inclusion criteria in a research project, tracking various sample-processing steps, turn-around time, and revenue generated. Since 2005 we have collected over 5,000 samples. The database is easily updated and is being adapted for various data maintenance and mining needs.
Automatic electrochemical ambient air monitor for chloride and chlorine
Mueller, Theodore R.
1976-07-13
An electrochemical monitoring system has been provided for determining chloride and chlorine in air at levels from about 10 to 1000 parts per billion. The chloride is determined by oxidation to chlorine followed by reduction to chloride in a closed system. Chlorine is determined by direct reduction at a platinum electrode in a 6 M H2SO4 electrolyte. A fully automated system is utilized to (1) acquire and store a value corresponding to impurities in the electrolyte, (2) subtract this value from that obtained in the presence of air, (3) coulometrically generate a standard sample of chlorine mixed with the air sample and determine it as chlorine and/or chloride, and (4) calculate, display, and store for permanent record the ratio of the signal obtained from the air sample to that obtained with the standard.
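The blank-subtraction and standard-ratio arithmetic described above can be written out as a short sketch. The function name and numeric readings are hypothetical, and the standard-addition bookkeeping (the standard is generated into the same air sample) is deliberately simplified.

```python
def chlorine_ppb(sample_signal, blank_signal, spiked_signal, standard_ppb):
    """Blank-correct the air-sample reading and scale it by the extra signal
    produced when a known coulometric chlorine standard is mixed into the air
    sample (a simplified standard-addition calculation)."""
    corrected_sample = sample_signal - blank_signal        # step (2): blank subtraction
    standard_response = spiked_signal - sample_signal      # signal due to the standard alone
    return standard_ppb * corrected_sample / standard_response

# Example with hypothetical electrode currents (arbitrary units)
print(chlorine_ppb(sample_signal=0.84, blank_signal=0.12,
                   spiked_signal=1.56, standard_ppb=100.0))
```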
Organics in water contamination analyzer, phase 1
NASA Technical Reports Server (NTRS)
1986-01-01
The requirements that would result in identifying the components of an automatic analytical system for the analysis of specific organic compounds in the space station potable water supply are defined. The gas chromatographic system for such an analysis is limited to commercially available, off-the-shelf hardware and includes the sample inlet, an ionization detector, capillary columns, as well as computerized compound identification. The sampling system will be a special variation of the purge-and-trap Tenax mode using six-port valves and a 500 microliter water sample. The capillary columns used for separating the contaminants will be bonded-phase fused silica with a silicone stationary phase. Two detectors can be used, photoionization and far ultraviolet, since they are sensitive and compatible with capillary columns. A computer system evaluation and a program for compound identification based on the retention index are presented.
Fang, Ning; Sun, Wei
2015-04-21
A method, apparatus, and system for improved VA-TIRFM microscopy. The method comprises automatically controlled calibration of one or more laser sources by precise control of the presentation of each laser relative to a sample, in small incremental changes of incident angle over a range of critical TIR angles. The calibration then allows precise scanning of the sample at any of those calibrated angles for higher and more accurate resolution, and better reconstruction of the scans for super-resolution reconstruction of the sample. Optionally, the system can be controlled to set incident angles of the excitation laser at sub-critical angles for pseudo-TIRFM. Optionally, both above-critical-angle and sub-critical-angle measurements can be accomplished with the same system.
Testing of a scanning adiabatic calorimeter with Joule effect heating of the sample
NASA Astrophysics Data System (ADS)
Barreiro-Rodríguez, G.; Yáñez-Limón, J. M.; Contreras-Servin, C. A.; Herrera-Gomez, A.
2008-01-01
We evaluated a scanning adiabatic resistive calorimeter (SARC) developed to measure the specific enthalpy of viscous and gel-type materials. The sample is heated using the Joule effect. The cell consists of a cylindrical jacket and two pistons, and the sample is contained inside the jacket between the two pistons. The upper piston can slide to allow for thermal expansion and to keep the pressure constant. The pistons also function as electrodes for the sample. While the sample is heated through the Joule effect, the electrodes and the jacket are independently heated to the same temperature as the sample under automatic control. This minimizes heat transport between the sample and its surroundings. The energy is supplied to the sample by applying an ac voltage in the kilohertz range to the electrodes, establishing a current in the sample and inducing electrical dissipation. This energy can be measured with sufficient accuracy to determine the heat capacity. The apparatus also allows quantification of the thermal conductivity by reproducing the evolution of the temperature as heat is introduced to only one of the pistons. To this end, the system was modeled using finite element calculations. This dual capability proved to be very valuable for corrections in the determination of the specific enthalpy. The performance of the SARC was evaluated by comparing the heat capacity results to those obtained by differential scanning calorimetry measurements using a commercial apparatus. The analyzed samples were zeolite, bauxite, hematite, bentonite, rice flour, corn flour, and potato starch.
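The underlying heat-capacity calculation reduces to dividing the Joule energy dissipated in the sample by the product of mass and temperature rise. The sketch below shows that arithmetic under stated assumptions (adiabatic conditions, a purely resistive sample); all numbers are hypothetical.

```python
def specific_heat(voltage_rms, current_rms, duration_s, mass_g, delta_T_K):
    """c_p = Q / (m * dT), with Q the Joule energy dissipated in the sample.
    Assumes adiabatic conditions (the guard heaters track the sample) and a
    purely resistive sample, so Q = V_rms * I_rms * t."""
    q_joule = voltage_rms * current_rms * duration_s
    return q_joule / (mass_g * delta_T_K)     # J g^-1 K^-1

# Hypothetical run: 12 V rms and 0.35 A rms applied for 600 s to a 40 g sample
# whose temperature rises by 15 K
print(f"{specific_heat(12.0, 0.35, 600.0, 40.0, 15.0):.2f} J/(g*K)")
```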
X-shooter Finds an Extremely Primitive Star
NASA Astrophysics Data System (ADS)
Caffau, E.; Bonifacio, P.; François, P.; Sbordone, L.; Monaco, L.; Spite, M.; Spite, F.; Ludwig, H.-G.; Cayrel, R.; Zaggia, S.; Hammer, F.; Randich, S.; Molaro, P.; Hill, V.
2011-12-01
Low-mass extremely metal-poor (EMP) stars hold the fossil record of the chemical composition of the early phases of the Universe in their atmospheres. Chemical analysis of such objects provides important constraints on these early phases. EMP stars are rather rare objects: to dig them out, large amounts of data have to be considered. We have analysed stars from the Sloan Digital Sky Survey using an automatic procedure and selected a sample of good candidate EMP stars, which we observed with the spectrographs X-shooter and UVES. We could confirm the low metallicity of our sample of stars, and we succeeded in finding a record metal-poor star.
Three-level sampler having automated thresholds
NASA Technical Reports Server (NTRS)
Jurgens, R. F.
1976-01-01
A three-level sampler is described that has its thresholds controlled automatically so as to track changes in the statistics of the random process being sampled. In particular, the mean value is removed and the ratio of the standard deviation of the random process to the threshold is maintained constant. The system is configured in such a manner that slow drifts in the level comparators and digital-to-analog converters are also removed. The ratio of the standard deviation to threshold level may be chosen within the constraints of the ratios of two integers N and M. These may be chosen to minimize the quantizing noise of the sampled process.
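The threshold-tracking behaviour described above can be illustrated with a software analogue: remove a running estimate of the mean and quantize to three levels against a threshold held at a fixed fraction of the running standard deviation. The fixed `ratio` parameter here is only a stand-in for the hardware N/M ratio constraint, and the input signal is synthetic.

```python
import numpy as np

def three_level_sample(x, ratio=0.6, alpha=0.01):
    """Quantize x to -1/0/+1 with thresholds that track the signal statistics:
    the running mean is removed and the threshold is held at `ratio` times the
    running standard deviation (a simplified software analogue)."""
    mean, var = 0.0, 1.0
    out = np.empty(len(x), dtype=int)
    for i, v in enumerate(x):
        mean = (1 - alpha) * mean + alpha * v              # exponential running mean
        var = (1 - alpha) * var + alpha * (v - mean) ** 2  # exponential running variance
        thr = ratio * np.sqrt(var)
        centered = v - mean
        out[i] = 0 if abs(centered) < thr else int(np.sign(centered))
    return out

# Hypothetical noisy input with a slow drift in level and amplitude
rng = np.random.default_rng(7)
t = np.arange(20000)
x = 0.5 * np.sin(t / 4000) + (1 + t / 20000) * rng.normal(size=t.size)
print(np.bincount(three_level_sample(x) + 1))   # counts of -1, 0, +1 samples
```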
Extracting knowledge from the World Wide Web
Henzinger, Monika; Lawrence, Steve
2004-01-01
The World Wide Web provides an unprecedented opportunity to automatically analyze a large sample of interests and activity in the world. We discuss methods for extracting knowledge from the web by randomly sampling and analyzing hosts and pages, and by analyzing the link structure of the web and how links accumulate over time. A variety of interesting and valuable information can be extracted, such as the distribution of web pages over domains, the distribution of interest in different areas, communities related to different topics, the nature of competition in different categories of sites, and the degree of communication between different communities or countries. PMID:14745041
General purpose rocket furnace
NASA Technical Reports Server (NTRS)
Aldrich, B. R.; Whitt, W. D. (Inventor)
1979-01-01
A multipurpose furnace for space vehicles used for material processing experiments in an outer space environment is described. The furnace contains three separate cavities designed to process samples of the widest possible range of materials and thermal requirements. Each cavity contains three heating elements capable of independent function under the direction of an automatic and programmable control system. A heat removal mechanism is also provided for each cavity, which operates in conjunction with the control system to establish an isothermally heated cavity or a wide range of thermal gradients and cool-down rates. A monitoring system compatible with the rocket telemetry provides furnace performance and sample growth rate data throughout the processing cycle.
Design and field results of a walk-through EDS
NASA Astrophysics Data System (ADS)
Wendel, Gregory J.; Bromberg, Edward E.; Durfee, Memorie K.; Curby, William A.
1997-01-01
A walk-through portal sampling module which incorporates active sampling has been developed. The module uses opposing wands which actively brush the subject's exterior clothing to disturb explosive traces. These traces are entrained in an air stream and transported to a high-speed GC-chemiluminescence explosives detection system. This combination provides automatic screening of passengers at rates of 10 per minute. The system exhibits sensitivity and selectivity that equal or better those available from commercially available manual equipment. The system has been developed for deployment at border crossings, airports, and other security screening points. Detailed results of laboratory tests and airport field trials are reviewed.
NASA Technical Reports Server (NTRS)
Kaulen, D. R.; Bulatova, T. I.; Fridenshteyn, A. Y.; Skvortsova, Y. B.
1974-01-01
Lunar surface material was studied for its content of viable microorganisms (aerobic and anaerobic, fungi, and viruses); the effect of the lunar surface material on the growth of microorganisms and its interaction with somatic cells of mammals was also observed. No viable microorganisms were detected; the samples exhibited neither stimulatory nor inhibitory action on the growth of microorganisms, and also showed no cytopathogenic action on tissue cultures. A suspension of lunar surface material particles was not toxic when parenterally administered to certain laboratory animals. The particles were subjected to intense phagocytosis by connective tissue cells in vivo and in vitro.
Waste Management System overview for future spacecraft.
NASA Technical Reports Server (NTRS)
Ingelfinger, A. L.; Murray, R. W.
1973-01-01
Waste Management Systems (WMS) for post Apollo spacecraft will be significantly more sophisticated and earthlike in user procedures. Some of the features of the advanced WMS will be accommodation of both males and females, automatic operation, either tissue wipe or anal wash, measurement and sampling of urine, feces and vomitus for medical analysis, water recovery, and solids disposal. This paper presents an overview of the major problems of and approaches to waste management for future spacecraft. Some of the processes discussed are liquid/gas separation, the Dry-John, the Hydro-John, automated sampling, vapor compression distillation, vacuum distillation-catalytic oxidation, incineration, and the integration of the above into complete systems.
Caboche, Ségolène; Even, Gaël; Loywick, Alexandre; Audebert, Christophe; Hot, David
2017-12-19
The increase in available sequence data has advanced the field of microbiology; however, making sense of these data without bioinformatics skills is still problematic. We describe MICRA, an automatic pipeline, available as a web interface, for microbial identification and characterization through reads analysis. MICRA uses iterative mapping against reference genomes to identify genes and variations. Additional modules allow prediction of antibiotic susceptibility and resistance and comparing the results of several samples. MICRA is fast, producing few false-positive annotations and variant calls compared to current methods, making it a tool of great interest for fully exploiting sequencing data.
Time-of-flight radio location system
McEwan, T.E.
1997-08-26
A bi-static radar configuration measures the direct time-of-flight of a transmitted RF pulse and is capable of measuring this time-of-flight with a jitter on the order of about one pico-second, or about 0.01 inch of free space distance for an electromagnetic pulse over a range of about one to ten feet. A transmitter transmits a sequence of electromagnetic pulses in response to a transmit timing signal, and a receiver samples the sequence of electromagnetic pulses with controlled timing in response to a receive timing signal, and generates a sample signal in response to the samples. A timing circuit supplies the transmit timing signal to the transmitter and supplies the receive timing signal to the receiver. The receive timing signal causes the receiver to sample the sequence of electromagnetic pulses such that the time between transmission of pulses in the sequence and sampling by the receiver sweeps over a range of delays. The receive timing signal sweeps over the range of delays in a sweep cycle such that pulses in the sequence are sampled at the pulse repetition rate, and with different delays in the range of delays to produce a sample signal representing magnitude of a received pulse in equivalent time. Automatic gain control circuitry in the receiver controls the magnitude of the equivalent time sample signal. A signal processor analyzes the sample signal to indicate the time-of-flight of the electromagnetic pulses in the sequence. The sample signal in equivalent time is passed through an envelope detection circuit, formed of an absolute value circuit followed by a low pass filter, to convert the sample signal to a unipolar signal to eliminate effects of antenna misorientation. 8 figs.
Time-of-flight radio location system
McEwan, Thomas E.
1997-01-01
A bi-static radar configuration measures the direct time-of-flight of a transmitted RF pulse and is capable of measuring this time-of-flight with a jitter on the order of about one pico-second, or about 0.01 inch of free space distance for an electromagnetic pulse over a range of about one to ten feet. A transmitter transmits a sequence of electromagnetic pulses in response to a transmit timing signal, and a receiver samples the sequence of electromagnetic pulses with controlled timing in response to a receive timing signal, and generates a sample signal in response to the samples. A timing circuit supplies the transmit timing signal to the transmitter and supplies the receive timing signal to the receiver. The receive timing signal causes the receiver to sample the sequence of electromagnetic pulses such that the time between transmission of pulses in the sequence and sampling by the receiver sweeps over a range of delays. The receive timing signal sweeps over the range of delays in a sweep cycle such that pulses in the sequence are sampled at the pulse repetition rate, and with different delays in the range of delays to produce a sample signal representing magnitude of a received pulse in equivalent time. Automatic gain control circuitry in the receiver controls the magnitude of the equivalent time sample signal. A signal processor analyzes the sample signal to indicate the time-of-flight of the electromagnetic pulses in the sequence. The sample signal in equivalent time is passed through an envelope detection circuit, formed of an absolute value circuit followed by a low pass filter, to convert the sample signal to a unipolar signal to eliminate effects of antenna misorientation.
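The equivalent-time sampling scheme described in the radar records above can be illustrated with a short sketch: one real-time sample is taken per transmitted pulse, and the receive delay is swept linearly across the pulse sequence so that successive samples trace out the pulse shape on a much slower equivalent-time axis. The echo shape, delay range, and numbers below are hypothetical.

```python
import numpy as np

def equivalent_time_sample(received, max_delay_s=20e-9, n_pulses=512):
    """Build an equivalent-time record of a repetitive received pulse.

    received(t) returns the received amplitude t seconds after transmission;
    each transmitted pulse contributes one sample taken at a slightly longer delay.
    """
    delays = np.linspace(0.0, max_delay_s, n_pulses)   # swept receive delays
    samples = np.array([received(d) for d in delays])  # one sample per pulse
    return delays, samples

# Hypothetical received echo: a Gaussian pulse arriving 6 ns after transmission
echo = lambda t: np.exp(-((t - 6e-9) / 0.5e-9) ** 2)
delays, waveform = equivalent_time_sample(echo)
print(f"estimated time of flight: {delays[np.argmax(waveform)] * 1e9:.2f} ns")
```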
Holst, Birgitte; Hau, Jann; Rozell, Björn; Abelson, Klas Stig Peter
2014-01-01
Retro-bulbar sinus puncture and facial vein phlebotomy are two widely used methods for blood sampling in laboratory mice. However, the animal welfare implications associated with these techniques are currently debated, and the possible physiological and pathological implications of blood sampling using these methods have been sparsely investigated. Therefore, this study was conducted to assess and compare the impacts of blood sampling by retro-bulbar sinus puncture and facial vein phlebotomy. Blood was obtained from either the retro-bulbar sinus or the facial vein from male C57BL/6J mice at two time points, and the samples were analyzed for plasma corticosterone. Body weights were measured at the day of blood sampling and the day after blood sampling, and the food consumption was recorded automatically during the 24 hours post-procedure. At the end of the study, cheeks and orbital regions were collected for histopathological analysis to assess the degree of tissue trauma. Mice subjected to facial vein phlebotomy had significantly elevated plasma corticosterone levels at both time points in contrast to mice subjected to retro-bulbar sinus puncture, which did not. Both groups of sampled mice lost weight following blood sampling, but the body weight loss was higher in mice subjected to facial vein phlebotomy. The food consumption was not significantly different between the two groups. At gross necropsy, subcutaneous hematomas were found in both groups and the histopathological analyses revealed extensive tissue trauma after both facial vein phlebotomy and retro-bulbar sinus puncture. This study demonstrates that both blood sampling methods have a considerable impact on the animals' physiological condition, which should be considered whenever blood samples are obtained. PMID:25426941
NASA Astrophysics Data System (ADS)
Consolandi, Guido
2017-04-01
The evolution of galaxies can be thought of as the result of the cumulative effects of two broad classes of processes: (i) secular (internal) processes determined by the very nature of the galaxy, and (ii) external processes determined by the environment in which the object is embedded. In this thesis I address both aspects of galaxy evolution. Among secular processes, I investigated the effects of stellar bars on the gaseous components of galaxies and their consequences for galaxy evolution. In particular, I show how bars affect both the ionized and the cold gas in two different samples: the sample of the Halpha3 survey, an Halpha imaging survey of galaxies selected from ALFALFA in the Local and Coma superclusters; and the Herschel Reference Sample, a representative sample of 323 local galaxies observed with the space-based Herschel observatory, which is sensitive to the far-infrared emission of dust, a good tracer of cold gas. Using the Halpha3 data, I demonstrate that main-sequence barred galaxies have a specific star formation rate that is suppressed with respect to pure disks. I propose a simple model in which bars drive the evolution of disk galaxies. Hydrodynamical simulations indeed show that a barred potential funnels the gas inside the corotation radius toward the center of the galaxy, where it reaches high densities, cools, and can be consumed by a burst of star formation. At the same time, the dynamical torque of the bar keeps the gas outside the corotation radius in place, cutting the gas supply to the central region, which consequently stops its star formation activity. Taking advantage of the images of the HRS sample, we show evidence of such quenching. The aforementioned model is further tested by studying the stellar population properties of galaxies belonging to a sample of 6000 galaxies extracted from SDSS. To this aim, I designed in-house IDL codes that automatically perform aperture photometry and isophotal fitting, recovering reliable magnitudes, colors, ellipticities, position angles (P.A.), and color profiles. The automatic procedure is complemented by an automatic bar finder able to extract a fairly pure sample of barred galaxies on the basis of their P.A. and ellipticity profiles. The analysis of the color profiles shows that disk galaxies have central regions that are redder (and therefore quenched) than their outer regions, and that this is more evident at high mass. The high local bar fraction that we extrapolate, as well as the analysis of the average color profile of barred galaxies, shows the strong contribution of bars to the observed colors. In the second part, I present the work done in the field of environmental processes. The work is focused on the analysis of observations, carried out with the IFU MUSE, of a system belonging to the nearby galaxy cluster A1367. These observations mosaicked the galaxies UGC-66967 and CGCG-97087N, two galaxies suffering ram pressure stripping that have possibly interacted, as hinted by the presence of gas in the region between them. Using in-house automatic Python codes and comparing the gas velocities to the stellar kinematics, we could separate the emission of the ionized gas into a stripped component and a component still attached to the potential of the galaxy. While the gas onboard the galaxy shows low velocity dispersions and ionization states consistent with photoionization by stars, the stripped gas is more turbulent and ionized by shocks.
The HII regions that formed in the tail of UGC-66967 (but are absent in the tail of CGCG-97087N) are systematically found in regions where the velocity dispersion of the gas is lower than 50 km/s, while the stripped gas shows typical velocity dispersions of about 100 km/s or greater.
Garrido-Delgado, Rocío; Arce, Lourdes; Valcárcel, Miguel
2012-01-01
The potential of a headspace device coupled to multi-capillary column-ion mobility spectrometry has been studied as a screening system to differentiate virgin olive oils ("lampante," "virgin," and "extra virgin" olive oil). The last two categories are virgin olive oil samples with very similar characteristics, which are very difficult to distinguish with the existing analytical method. The procedure involves the direct introduction of the virgin olive oil sample into a vial, headspace generation, and automatic injection of the volatiles into a gas chromatograph-ion mobility spectrometer. The data obtained after the analysis, in duplicate, of 98 samples of three different categories of virgin olive oil were preprocessed and submitted to a detailed chemometric treatment to classify the virgin olive oil samples according to their sensory quality. The same virgin olive oil samples were also analyzed by an expert panel to establish their category, and these data were used as reference values to check the potential of this new screening system. This comparison confirms the potential of the results presented here. The model was able to classify 97% of the virgin olive oil samples into their corresponding group. Finally, the chemometric method was validated, obtaining a prediction rate of 87%. These results provide promising perspectives for the use of ion mobility spectrometry to differentiate virgin olive oil samples according to their quality instead of using the classical analytical procedure.
Fingerprint Liveness Detection in the Presence of Capable Intruders.
Sequeira, Ana F; Cardoso, Jaime S
2015-06-19
Fingerprint liveness detection methods have been developed as an attempt to overcome the vulnerability of fingerprint biometric systems to spoofing attacks. Traditional approaches have been quite optimistic about the behavior of the intruder assuming the use of a previously known material. This assumption has led to the use of supervised techniques to estimate the performance of the methods, using both live and spoof samples to train the predictive models and evaluate each type of fake samples individually. Additionally, the background was often included in the sample representation, completely distorting the decision process. Therefore, we propose that an automatic segmentation step should be performed to isolate the fingerprint from the background and truly decide on the liveness of the fingerprint and not on the characteristics of the background. Also, we argue that one cannot aim to model the fake samples completely since the material used by the intruder is unknown beforehand. We approach the design by modeling the distribution of the live samples and predicting as fake the samples very unlikely according to that model. Our experiments compare the performance of the supervised approaches with the semi-supervised ones that rely solely on the live samples. The results obtained differ from the ones obtained by the more standard approaches which reinforces our conviction that the results in the literature are misleadingly estimating the true vulnerability of the biometric system.
Quality assurance in the pre-analytical phase of human urine samples by (1)H NMR spectroscopy.
Budde, Kathrin; Gök, Ömer-Necmi; Pietzner, Maik; Meisinger, Christine; Leitzmann, Michael; Nauck, Matthias; Köttgen, Anna; Friedrich, Nele
2016-01-01
Metabolomic approaches investigate changes in metabolite profiles, which may reflect changes in metabolic pathways and provide information correlated with a specific biological process or pathophysiology. High-resolution (1)H NMR spectroscopy is used to identify metabolites in biofluids and tissue samples qualitatively and quantitatively. This pre-analytical study evaluated the effects of storage time and temperature on (1)H NMR spectra from human urine in two settings: firstly, to evaluate short-term effects, probably due to an acute delay in sample handling, and secondly, the effect of prolonged storage of up to one month, to find markers of sample mishandling. A number of statistical procedures were used to assess the differences between samples stored under different conditions, including Projection to Latent Structure Discriminant Analysis (PLS-DA), non-parametric testing, as well as mixed-effect linear regression analysis. The results indicate that human urine samples can be stored at 10 °C for 24 h or at -80 °C for 1 month, as no relevant changes in (1)H NMR fingerprints were observed during these time periods and temperature conditions. However, some metabolites, most likely of microbial origin, showed alterations during prolonged storage, but without facilitating classification. In conclusion, the presented protocol for urine sample handling and semi-automatic metabolite quantification is suitable for large-scale epidemiological studies. Copyright © 2015 Elsevier Inc. All rights reserved.
Fingerprint Liveness Detection in the Presence of Capable Intruders
Sequeira, Ana F.; Cardoso, Jaime S.
2015-01-01
Fingerprint liveness detection methods have been developed as an attempt to overcome the vulnerability of fingerprint biometric systems to spoofing attacks. Traditional approaches have been quite optimistic about the behavior of the intruder assuming the use of a previously known material. This assumption has led to the use of supervised techniques to estimate the performance of the methods, using both live and spoof samples to train the predictive models and evaluate each type of fake samples individually. Additionally, the background was often included in the sample representation, completely distorting the decision process. Therefore, we propose that an automatic segmentation step should be performed to isolate the fingerprint from the background and truly decide on the liveness of the fingerprint and not on the characteristics of the background. Also, we argue that one cannot aim to model the fake samples completely since the material used by the intruder is unknown beforehand. We approach the design by modeling the distribution of the live samples and predicting as fake the samples very unlikely according to that model. Our experiments compare the performance of the supervised approaches with the semi-supervised ones that rely solely on the live samples. The results obtained differ from the ones obtained by the more standard approaches which reinforces our conviction that the results in the literature are misleadingly estimating the true vulnerability of the biometric system. PMID:26102491
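The semi-supervised idea described above (model only the live samples and flag unlikely samples as fake) can be sketched with a simple Gaussian density model. This is not the authors' feature set or classifier: the features are assumed to be already extracted from segmented fingerprints, and all data below are synthetic placeholders.

```python
import numpy as np

class LiveOnlyDetector:
    """Fit a Gaussian to features of live (segmented) fingerprints only and
    flag samples with low density under that model as presumed fakes."""

    def fit(self, live_features, quantile=0.01):
        self.mean = live_features.mean(axis=0)
        self.cov = np.cov(live_features, rowvar=False) + 1e-6 * np.eye(live_features.shape[1])
        self.inv_cov = np.linalg.inv(self.cov)
        d2 = self._mahalanobis_sq(live_features)
        self.threshold = np.quantile(d2, 1 - quantile)   # keep ~99% of live training data
        return self

    def _mahalanobis_sq(self, x):
        diff = x - self.mean
        return np.einsum("ij,jk,ik->i", diff, self.inv_cov, diff)

    def predict(self, features):
        # True -> live, False -> presumed fake (too unlikely under the live model)
        return self._mahalanobis_sq(features) <= self.threshold

# Hypothetical 8-dimensional texture features from segmented fingerprints
rng = np.random.default_rng(42)
live = rng.normal(0.0, 1.0, size=(500, 8))
spoof = rng.normal(2.5, 1.5, size=(50, 8))
det = LiveOnlyDetector().fit(live)
print(det.predict(live).mean(), det.predict(spoof).mean())
```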
Automation of ⁹⁹Tc extraction by LOV prior ICP-MS detection: application to environmental samples.
Rodríguez, Rogelio; Leal, Luz; Miranda, Silvia; Ferrer, Laura; Avivar, Jessica; García, Ariel; Cerdà, Víctor
2015-02-01
A new, fast, automated and inexpensive sample pre-treatment method for (99)Tc determination by inductively coupled plasma-mass spectrometry (ICP-MS) detection is presented. The miniaturized approach is based on a lab-on-valve (LOV) system, allowing automatic separation and preconcentration of (99)Tc. Selectivity is provided by the solid phase extraction system used (TEVA resin) which retains selectively pertechnetate ion in diluted nitric acid solution. The proposed system has some advantages such as minimization of sample handling, reduction of reagents volume, improvement of intermediate precision and sample throughput, offering a significant decrease of both time and cost per analysis in comparison to other flow techniques and batch methods. The proposed LOV system has been successfully applied to different samples of environmental interest (water and soil) with satisfactory recoveries, between 94% and 98%. The detection limit (LOD) of the developed method is 0.005 ng. The high durability of the resin and its low amount (32 mg), its good intermediate precision (RSD 3.8%) and repeatability (RSD 2%) and its high extraction frequency (up to 5 h(-1)) makes this method an inexpensive, high precision and fast tool for monitoring (99)Tc in environmental samples. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.
2013-03-01
Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small-angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beam lines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.
An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions
Li, Weixuan; Lin, Guang
2015-03-21
Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.
BLIND ordering of large-scale transcriptomic developmental timecourses.
Anavy, Leon; Levin, Michal; Khair, Sally; Nakanishi, Nagayasu; Fernandez-Valverde, Selene L; Degnan, Bernard M; Yanai, Itai
2014-03-01
RNA-Seq enables the efficient transcriptome sequencing of many samples from small amounts of material, but the analysis of these data remains challenging. In particular, in developmental studies, RNA-Seq is challenged by the morphological staging of samples, such as embryos, since these often lack clear markers at any particular stage. In such cases, the automatic identification of the stage of a sample would enable previously infeasible experimental designs. Here we present the 'basic linear index determination of transcriptomes' (BLIND) method for ordering samples comprising different developmental stages. The method is an implementation of a traveling salesman algorithm to order the transcriptomes according to their inter-relationships as defined by principal components analysis. To establish the direction of the ordered samples, we show that an appropriate indicator is the entropy of transcriptomic gene expression levels, which increases over developmental time. Using BLIND, we correctly recover the annotated order of previously published embryonic transcriptomic timecourses for frog, mosquito, fly and zebrafish. We further demonstrate the efficacy of BLIND by collecting 59 embryos of the sponge Amphimedon queenslandica and ordering their transcriptomes according to developmental stage. BLIND is thus useful in establishing the temporal order of samples within large datasets and is of particular relevance to the study of organisms with asynchronous development and when morphological staging is difficult.
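The ordering strategy described above can be illustrated with a reduced sketch: project the transcriptomes with PCA, connect them with a greedy nearest-neighbour path (a simple stand-in for the traveling-salesman step), and orient the path so that expression entropy increases. This is not the published BLIND implementation, and the expression matrix below is synthetic.

```python
import numpy as np

def blind_order(expr):
    """Order samples along development from a samples x genes expression matrix.

    Steps: PCA via SVD, a greedy nearest-neighbour tour through the projected
    samples, and orientation by increasing Shannon entropy of expression."""
    centered = expr - expr.mean(axis=0)
    u, s, _ = np.linalg.svd(centered, full_matrices=False)
    coords = u[:, :2] * s[:2]                  # first two principal components

    remaining = list(range(len(coords)))
    path = [remaining.pop(0)]
    while remaining:                           # greedy nearest-neighbour tour
        last = coords[path[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(coords[i] - last))
        remaining.remove(nxt)
        path.append(nxt)

    # Development is taken to run in the direction of increasing entropy
    p = expr / expr.sum(axis=1, keepdims=True)
    entropy = -(p * np.log(p + 1e-12)).sum(axis=1)
    if entropy[path[0]] > entropy[path[-1]]:
        path.reverse()
    return path

# Hypothetical dataset: 10 samples x 200 genes of expression levels
rng = np.random.default_rng(0)
expr = rng.gamma(2.0, 1.0, size=(10, 200))
print(blind_order(expr))
```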
Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S
2015-02-01
With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.
An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Weixuan; Lin, Guang, E-mail: guanglin@purdue.edu
2015-08-01
Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.
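The importance-weighting core of such a scheme can be sketched in a few lines: samples are drawn from a Gaussian-mixture proposal and reweighted by the ratio of the target density to the proposal density. The sketch below uses a fixed two-component mixture and a bimodal toy posterior; the adaptive construction of the GM proposal and the polynomial chaos surrogate described in the paper are not reproduced, and all densities are illustrative placeholders.

import numpy as np

def gauss_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

# Bimodal toy "posterior" standing in for an expensive forward-model-based posterior
def target(x):
    return 0.6 * gauss_pdf(x, -2.0, 0.5) + 0.4 * gauss_pdf(x, 3.0, 0.8)

# Gaussian-mixture proposal: component weights, means, standard deviations
w, mu, sig = np.array([0.5, 0.5]), np.array([-2.0, 3.0]), np.array([1.0, 1.0])

rng = np.random.default_rng(1)
comp = rng.choice(len(w), size=5000, p=w)           # pick a mixture component
x = rng.normal(mu[comp], sig[comp])                 # draw from that component
proposal = sum(w[k] * gauss_pdf(x, mu[k], sig[k]) for k in range(len(w)))
wt = target(x) / proposal                           # importance weights
wt /= wt.sum()                                      # self-normalise

print("posterior mean estimate:", np.sum(wt * x))   # close to 0.6*(-2) + 0.4*3 = 0.0

In the adaptive version, the weighted samples would be used to refit the mixture parameters before the next round of sampling.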
Comparison of water-quality samples collected by siphon samplers and automatic samplers in Wisconsin
Graczyk, David J.; Robertson, Dale M.; Rose, William J.; Steur, Jeffrey J.
2000-01-01
In small streams, flow and water-quality concentrations often change quickly in response to meteorological events. Hydrologists, field technicians, or locally hired stream observers involved in water-data collection are often unable to reach streams quickly enough to observe or measure these rapid changes. Therefore, in hydrologic studies designed to describe changes in water quality, a combination of manual and automated sampling methods has commonly been used: manual methods when flow is relatively stable and automated methods when flow is rapidly changing. Automated sampling, which makes use of equipment programmed to collect samples in response to changes in stage and flow of a stream, has been shown to be an effective method of sampling to describe the rapid changes in water quality (Graczyk and others, 1993). Because of the high cost of automated sampling, however, especially for studies examining a large number of sites, alternative methods have been considered for collecting samples during rapidly changing stream conditions. One such method employs the siphon sampler (fig. 1), also referred to as the "single-stage sampler." Siphon samplers are inexpensive to build (about $25-$50 per sampler), operate, and maintain, so they are cost effective to use at a large number of sites. Their ability to collect samples representing the average quality of water passing through the entire cross section of a stream, however, has not been fully demonstrated for many types of stream sites.
ERIC Educational Resources Information Center
Moses, Tim; Holland, Paul
2009-01-01
This simulation study evaluated the potential of alternative loglinear smoothing strategies for improving equipercentile equating function accuracy. These alternative strategies use cues from the sample data to make automatable and efficient improvements to model fit, either through the use of indicator functions for fitting large residuals or by…
A Class of Population Covariance Matrices in the Bootstrap Approach to Covariance Structure Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro; Yanagihara, Hirokazu
2007-01-01
Model evaluation in covariance structure analysis is critical before the results can be trusted. Due to finite sample sizes and unknown distributions of real data, existing conclusions regarding a particular statistic may not be applicable in practice. The bootstrap procedure automatically takes care of the unknown distribution and, for a given…
A new machine classification method applied to human peripheral blood leukocytes
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.; Fitzpatrick, Steven J.; Vitthal, Sanjay; Ladoulis, Charles T.
1994-01-01
Human beings judge images by complex mental processes, whereas computing machines extract features. By reducing scaled human judgments and machine extracted features to a common metric space and fitting them by regression, the judgments of human experts rendered on a sample of images may be imposed on an image population to provide automatic classification.
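A minimal sketch of the fitting idea described above, using synthetic data: machine-extracted feature vectors for a judged sample are regressed onto the scaled human judgments with ordinary least squares, and the fitted model then imposes those judgments on a larger population. The feature dimensions, data, and threshold rule are placeholders, not the authors' procedure.

import numpy as np

rng = np.random.default_rng(2)
features = rng.random((40, 5))                 # machine-extracted features for a judged sample
judgments = features @ np.array([1.0, -0.5, 0.3, 0.0, 2.0]) + rng.normal(0, 0.1, 40)

# Least-squares fit mapping machine features onto the human judgment scale
A = np.c_[features, np.ones(len(features))]    # add an intercept column
coef, *_ = np.linalg.lstsq(A, judgments, rcond=None)

# Impose the learned judgments on the wider image population
population = rng.random((1000, 5))
scores = np.c_[population, np.ones(len(population))] @ coef
labels = scores > np.median(scores)            # toy two-class assignment
print(labels[:10])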
Constant frequency pulsed phase-locked loop measuring device
NASA Technical Reports Server (NTRS)
Yost, William T. (Inventor); Kushnick, Peter W. (Inventor); Cantrell, John H. (Inventor)
1993-01-01
A measuring apparatus is presented that uses a fixed frequency oscillator to measure small changes in the phase velocity of ultrasonic sound when a sample is exposed to environmental changes such as changes in pressure, temperature, etc. The invention automatically balances electrical phase shifts against the acoustical phase shifts in order to obtain an accurate measurement of electrical phase shifts.
ERIC Educational Resources Information Center
Economou, A.; Tzanavaras, P. D.; Themelis, D. G.
2005-01-01
Sequential-injection analysis (SIA) is an approach to sample handling that enables the automation of manual wet-chemistry procedures in a rapid, precise and efficient manner. The experiments using SIA fit well in the course of Instrumental Chemical Analysis and especially in the section of Automatic Methods of analysis provided by chemistry…
40 CFR 86.107-98 - Sampling and analytical system.
Code of Federal Regulations, 2010 CFR
2010-07-01
... automatic sealing opening of the boot during fueling. There shall be no loss in the gas tightness of the... system (recorder and sensor) shall have an accuracy of ±3 °F (±1.7 °C). The recorder (data processor... ambient temperature sensors, connected to provide one average output, located 3 feet above the floor at...
40 CFR 86.107-98 - Sampling and analytical system.
Code of Federal Regulations, 2011 CFR
2011-07-01
... automatic sealing opening of the boot during fueling. There shall be no loss in the gas tightness of the... system (recorder and sensor) shall have an accuracy of ±3 °F (±1.7 °C). The recorder (data processor... ambient temperature sensors, connected to provide one average output, located 3 feet above the floor at...
2017-03-01
the Center for Technology Enhanced Language Learning (CTELL), a research cell in the Department of Foreign Languages, United States Military Academy...models for automatic speech recognition (ASR), and to, thereby, investigate the utility of ASR in pedagogical technology. The corpus is a sample of...lexical resources, language technology
Automatic devices to take water samples and to raise trash screens at weirs
K. G. Reinhart; R. E. Leonard; G. E. Hart
1960-01-01
Experimentation on small watersheds is assuming increasing importance in watershed-management research. Much has been accomplished in developing adequate instrumentation for use in these experiments. Yet many problems still await solution. One difficulty encountered is that small streams are subject to wide variations in flow and that these variations are generally...
Analysis of helium-ion scattering with a desktop computer
NASA Astrophysics Data System (ADS)
Butler, J. W.
1986-04-01
This paper describes a program written in an enhanced BASIC language for a desktop computer, for simulating the energy spectra of high-energy helium ions scattered into two concurrent detectors (backward and glancing). The program is designed for 512-channel spectra from samples containing up to 8 elements and 55 user-defined layers. The program is intended to meet the needs of analyses in materials sciences, such as metallurgy, where more than a few elements may be present, where several elements may be near each other in the periodic table, and where relatively deep structure may be important. These conditions preclude the use of completely automatic procedures for obtaining the sample composition directly from the scattered ion spectrum. Therefore, efficient methods are needed for entering and editing large amounts of composition data, with many iterations and with much feedback of information from the computer to the user. The internal video screen is used exclusively for verbal and numeric communications between user and computer. The composition matrix is edited on screen with a two-dimension forms-fill-in text editor and with many automatic procedures, such as doubling the number of layers with appropriate interpolations and extrapolations. The control center of the program is a bank of 10 keys that initiate on-event branching of program flow. The experimental and calculated spectra, including those of individual elements if desired, are displayed on an external color monitor, with an optional inset plot of the depth concentration profiles of the elements in the sample.
Yasmin, Rubina; Barber, Cheryl A.; Castro, Talita; Malamud, Daniel; Kim, Beum Jun; Zhu, Hui; Montagna, Richard A.; Abrams, William R.
2018-01-01
In recent years, there have been increasing numbers of infectious disease outbreaks that spread rapidly to population centers resulting from global travel, population vulnerabilities, environmental factors, and ecological disasters such as floods and earthquakes. Some examples of the recent outbreaks are the Ebola epidemic in West Africa, Middle East respiratory syndrome coronavirus (MERS-Co) in the Middle East, and the Zika outbreak through the Americas. We have created a generic protocol for detection of pathogen RNA and/or DNA using loop-mediated isothermal amplification (LAMP) and reverse dot-blot for detection (RDB) and processed automatically in a microfluidic device. In particular, we describe how a microfluidic assay to detect HIV viral RNA was converted to detect Zika virus (ZIKV) RNA. We first optimized the RT-LAMP assay to detect ZIKV RNA using a benchtop isothermal amplification device. Then we implemented the assay in a microfluidic device that will allow analyzing 24 samples simultaneously and automatically from sample introduction to detection by RDB technique. Preliminary data using saliva samples spiked with ZIKV showed that our diagnostic system detects ZIKV RNA in saliva. These results will be validated in further experiments with well-characterized ZIKV human specimens of saliva. The described strategy and methodology to convert the HIV diagnostic assay and platform to a ZIKV RNA detection assay provides a model that can be readily utilized for detection of the next emerging or re-emerging infectious disease. PMID:29401479
A machine learning approach for classification of anatomical coverage in CT
NASA Astrophysics Data System (ADS)
Wang, Xiaoyong; Lo, Pechin; Ramakrishna, Bharath; Goldin, Johnathan; Brown, Matthew
2016-03-01
Automatic classification of anatomical coverage of medical images is critical for big data mining and as a pre-processing step to automatically trigger specific computer aided diagnosis systems. The traditional way to identify scans through DICOM headers has various limitations due to manual entry of series descriptions and non-standardized naming conventions. In this study, we present a machine learning approach where multiple binary classifiers were used to classify different anatomical coverages of CT scans. A one-vs-rest strategy was applied. For a given training set, a template scan was selected from the positive samples and all other scans were registered to it. Each registered scan was then evenly split into k × k × k non-overlapping blocks and for each block the mean intensity was computed. This resulted in a 1 × k^3 feature vector for each scan. The feature vectors were then used to train an SVM-based classifier. In this feasibility study, four classifiers were built to identify anatomic coverages of brain, chest, abdomen-pelvis, and chest-abdomen-pelvis CT scans. Each classifier was trained and tested using a set of 300 scans from different subjects, composed of 150 positive samples and 150 negative samples. Area under the ROC curve (AUC) of the testing set was measured to evaluate the performance in a two-fold cross validation setting. Our results showed good classification performance with an average AUC of 0.96.
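The k × k × k block-mean feature described above is easy to sketch. The code below computes the 1 × k^3 feature vector from an already registered volume; the registration step and the one-vs-rest SVM training are omitted, and the synthetic volume and block size are illustrative assumptions rather than the authors' settings.

import numpy as np

def block_mean_features(volume, k=4):
    """Split a registered CT volume into k*k*k blocks and return the 1 x k^3
    vector of block mean intensities (sketch of the feature step only)."""
    z, y, x = volume.shape
    # Crop so each axis divides evenly into k blocks
    volume = volume[: z - z % k, : y - y % k, : x - x % k]
    bz, by, bx = volume.shape[0] // k, volume.shape[1] // k, volume.shape[2] // k
    blocks = volume.reshape(k, bz, k, by, k, bx)
    return blocks.mean(axis=(1, 3, 5)).ravel()      # length k**3

vol = np.random.default_rng(3).random((64, 64, 64))
feat = block_mean_features(vol, k=4)
print(feat.shape)   # (64,) -> such vectors would be fed to a one-vs-rest SVM classifier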
Fratianni, Alessandra; Irano, Mario; Panfili, Gianfranco; Acquistucci, Rita
2005-04-06
Color is an important parameter involved in the definition of semolina and pasta quality. This character is mainly due to natural pigments (carotenoids) that are present at different levels in cereals and cereal products, due to botanical origin, growing conditions, distribution in the kernel, and technological processes. In food industries, color measurements are usually performed by means of automatic instruments that are rapid and safe, as alternatives to the chemical extraction methods. In this study, automatic measurements (CIE, color-space system L, a, b), water-saturated butanol (WSB), and HPLC determinations have been applied to evaluate the carotenoid content in whole meals and respective semolina samples produced from wheat cultivated in the years 2001 and 2002. In whole meals, total carotenoids, determined by HPLC, were about 3.0 microg/g (2001) and 3.5 microg/g (2002) calculated on dry weight (dw) and about 3.0 and 3.2 microg/g dw in corresponding semolina samples. The b values for the same period were 19.78 and 15.75, respectively, in raw materials and 20.03-21.67 in semolina. Results have confirmed lutein and beta-carotene as the main components mainly responsible for the yellow color in wheat grains. The ability of the index b to express natural dyeing was dependent on sample characteristics as demonstrated by the relationships found between this index and pigments, although the best correlation resulted between HPLC and WSB.
Sabalza, Maite; Yasmin, Rubina; Barber, Cheryl A; Castro, Talita; Malamud, Daniel; Kim, Beum Jun; Zhu, Hui; Montagna, Richard A; Abrams, William R
2018-01-01
In recent years, there have been increasing numbers of infectious disease outbreaks that spread rapidly to population centers resulting from global travel, population vulnerabilities, environmental factors, and ecological disasters such as floods and earthquakes. Some examples of the recent outbreaks are the Ebola epidemic in West Africa, Middle East respiratory syndrome coronavirus (MERS-Co) in the Middle East, and the Zika outbreak through the Americas. We have created a generic protocol for detection of pathogen RNA and/or DNA using loop-mediated isothermal amplification (LAMP) and reverse dot-blot for detection (RDB) and processed automatically in a microfluidic device. In particular, we describe how a microfluidic assay to detect HIV viral RNA was converted to detect Zika virus (ZIKV) RNA. We first optimized the RT-LAMP assay to detect ZIKV RNA using a benchtop isothermal amplification device. Then we implemented the assay in a microfluidic device that will allow analyzing 24 samples simultaneously and automatically from sample introduction to detection by RDB technique. Preliminary data using saliva samples spiked with ZIKV showed that our diagnostic system detects ZIKV RNA in saliva. These results will be validated in further experiments with well-characterized ZIKV human specimens of saliva. The described strategy and methodology to convert the HIV diagnostic assay and platform to a ZIKV RNA detection assay provides a model that can be readily utilized for detection of the next emerging or re-emerging infectious disease.
Li, Pingjing; He, Man; Chen, Beibei; Hu, Bin
2015-10-09
A simple home-made automatic dynamic hollow fiber based liquid-liquid-liquid microextraction (AD-HF-LLLME) device was designed and constructed for the simultaneous extraction of organomercury and inorganic mercury species with the assistance of a programmable flow injection analyzer. With 18-crown-6 as the complexing reagent, mercury species including methyl-, ethyl-, phenyl- and inorganic mercury were extracted into the organic phase (chlorobenzene), and then back-extracted into the acceptor phase of 0.1% (m/v) 3-mercapto-1-propanesulfonic acid (MPS) aqueous solution. Compared with the automatic static (AS)-HF-LLLME system, the extraction equilibrium of the target mercury species was reached in a shorter time with higher extraction efficiency in the AD-HF-LLLME system. On this basis, a new method of AD-HF-LLLME coupled with large volume sample stacking (LVSS)-capillary electrophoresis (CE)/UV detection was developed for the simultaneous analysis of methyl-, phenyl- and inorganic mercury species in biological samples and environmental water. Under the optimized conditions, AD-HF-LLLME provided high enrichment factors (EFs) of 149-253-fold within a relatively short extraction equilibrium time (25 min) and good precision with RSD between 3.8 and 8.1%. By combining AD-HF-LLLME with LVSS-CE/UV, EFs were magnified up to 2195-fold and the limits of detection (at S/N=3) for target mercury species were improved to the sub-ppb level. Copyright © 2015 Elsevier B.V. All rights reserved.
Automatic sequential fluid handling with multilayer microfluidic sample isolated pumping
Liu, Jixiao; Fu, Hai; Yang, Tianhang; Li, Songjing
2015-01-01
To sequentially handle fluids is of great significance in quantitative biology, analytical chemistry, and bioassays. However, the technological options are limited when building such microfluidic sequential processing systems, and one of the encountered challenges is the need for reliable, efficient, and mass-production available microfluidic pumping methods. Herein, we present a bubble-free and pumping-control unified liquid handling method that is compatible with large-scale manufacture, termed multilayer microfluidic sample isolated pumping (mμSIP). The core part of the mμSIP is the selective permeable membrane that isolates the fluidic layer from the pneumatic layer. The air diffusion from the fluidic channel network into the degassing pneumatic channel network leads to fluidic channel pressure variation, which further results in consistent bubble-free liquid pumping into the channels and the dead-end chambers. We characterize the mμSIP by comparing the fluidic actuation processes with different parameters and a flow rate range of 0.013 μl/s to 0.097 μl/s is observed in the experiments. As the proof of concept, we demonstrate an automatic sequential fluid handling system aiming at digital assays and immunoassays, which further proves the unified pumping-control and suggests that the mμSIP is suitable for functional microfluidic assays with minimal operations. We believe that the mμSIP technology and demonstrated automatic sequential fluid handling system would enrich the microfluidic toolbox and benefit further inventions. PMID:26487904
NASA Astrophysics Data System (ADS)
Parro, Víctor; Fernández-Calvo, Patricia; Rodríguez Manfredi, José A.; Moreno-Paz, Mercedes; Rivas, Luis A.; García-Villadangos, Miriam; Bonaccorsi, Rosalba; González-Pastor, José Eduardo; Prieto-Ballesteros, Olga; Schuerger, Andrew C.; Davidson, Mark; Gómez-Elvira, Javier; Stoker, Carol R.
2008-10-01
A field prototype of an antibody array-based life-detector instrument, Signs Of LIfe Detector (SOLID2), has been tested in a Mars drilling mission simulation called MARTE (Mars Astrobiology Research and Technology Experiment). As one of the analytical instruments on the MARTE robotic drilling rig, SOLID2 performed automatic sample processing and analysis of ground core samples (0.5 g) with protein microarrays that contained 157 different antibodies. Core samples from different depths (down to 5.5 m) were analyzed, and positive reactions were obtained in antibodies raised against the Gram-negative bacterium Leptospirillum ferrooxidans, a species of the genus Acidithiobacillus (both common microorganisms in the Río Tinto area), and extracts from biofilms and other natural samples from the Río Tinto area. These positive reactions were absent when the samples were previously subjected to a high-temperature treatment, which indicates the biological origin and structural dependency of the antibody-antigen reactions. We conclude that an antibody array-based life-detector instrument like SOLID2 can detect complex biological material, and it should be considered as a potential analytical instrument for future planetary missions that search for life.
Dantas, Hebertty V; Barbosa, Mayara F; Nascimento, Elaine C L; Moreira, Pablo N T; Galvão, Roberto K H; Araújo, Mário C U
2013-03-15
This paper proposes a NIR spectrometric method for screening analysis of liquefied petroleum gas (LPG) samples. The proposed method is aimed at discriminating samples with low and high propane content, which can be useful for the adjustment of burn settings in industrial applications. A gas flow system was developed to introduce the LPG sample into a NIR flow cell at constant pressure. In addition, a gas chromatograph was employed to determine the propane content of the sample for reference purposes. The results of a principal component analysis, as well as a classification study using SIMCA (soft independent modeling of class analogies), revealed that the samples can be successfully discriminated with respect to propane content by using the NIR spectrum in the range 8100-8800 cm^-1. In addition, by using SPA-LDA (linear discriminant analysis with variables selected by the successive projections algorithm), it was found that perfect discrimination can also be achieved by using only two wavenumbers (8215 and 8324 cm^-1). This finding may be of value for the design of a dedicated, low-cost instrument for routine analyses. Copyright © 2012 Elsevier B.V. All rights reserved.
Parro, Víctor; Fernández-Calvo, Patricia; Rodríguez Manfredi, José A; Moreno-Paz, Mercedes; Rivas, Luis A; García-Villadangos, Miriam; Bonaccorsi, Rosalba; González-Pastor, José Eduardo; Prieto-Ballesteros, Olga; Schuerger, Andrew C; Davidson, Mark; Gómez-Elvira, Javier; Stoker, Carol R
2008-10-01
A field prototype of an antibody array-based life-detector instrument, Signs Of LIfe Detector (SOLID2), has been tested in a Mars drilling mission simulation called MARTE (Mars Astrobiology Research and Technology Experiment). As one of the analytical instruments on the MARTE robotic drilling rig, SOLID2 performed automatic sample processing and analysis of ground core samples (0.5 g) with protein microarrays that contained 157 different antibodies. Core samples from different depths (down to 5.5 m) were analyzed, and positive reactions were obtained in antibodies raised against the Gram-negative bacterium Leptospirillum ferrooxidans, a species of the genus Acidithiobacillus (both common microorganisms in the Río Tinto area), and extracts from biofilms and other natural samples from the Río Tinto area. These positive reactions were absent when the samples were previously subjected to a high-temperature treatment, which indicates the biological origin and structural dependency of the antibody-antigen reactions. We conclude that an antibody array-based life-detector instrument like SOLID2 can detect complex biological material, and it should be considered as a potential analytical instrument for future planetary missions that search for life.
Self-paced model learning for robust visual tracking
NASA Astrophysics Data System (ADS)
Huang, Wenhui; Gu, Jason; Ma, Xin; Li, Yibin
2017-01-01
In visual tracking, learning a robust and efficient appearance model is a challenging task. Model learning determines both the strategy and the frequency of model updating, which contains many details that could affect the tracking results. Self-paced learning (SPL) has recently been attracting considerable interest in the fields of machine learning and computer vision. SPL is inspired by the learning principle underlying the cognitive process of humans, whose learning process is generally from easier samples to more complex aspects of a task. We propose a tracking method that integrates the learning paradigm of SPL into visual tracking, so reliable samples can be automatically selected for model learning. In contrast to many existing model learning strategies in visual tracking, we discover the missing link between sample selection and model learning, which are combined into a single objective function in our approach. Sample weights and model parameters can be learned by minimizing this single objective function. Additionally, to solve the real-valued learning weight of samples, an error-tolerant self-paced function that considers the characteristics of visual tracking is proposed. We demonstrate the robustness and efficiency of our tracker on a recent tracking benchmark data set with 50 video sequences.
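The core self-paced idea of alternating between model fitting and sample selection can be sketched with hard (0/1) sample weights on a simple least-squares model, as below. This is only a generic SPL illustration under synthetic data; the tracker's error-tolerant real-valued self-paced function and appearance model are not reproduced.

import numpy as np

rng = np.random.default_rng(9)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(0, 0.1, 200)
y[:20] += rng.normal(0, 5.0, 20)            # a few "hard"/corrupted samples

w = np.zeros(3)
lam = 1.0                                   # self-paced "age" parameter
for it in range(10):
    loss = (X @ w - y) ** 2
    v = (loss < lam).astype(float)          # select easy samples (hard SPL weights)
    Xv, yv = X[v > 0], y[v > 0]
    w, *_ = np.linalg.lstsq(Xv, yv, rcond=None)   # refit the model on selected samples
    lam *= 1.3                              # gradually admit harder samples

print("estimated weights:", w)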
NASA Technical Reports Server (NTRS)
Halyo, N.
1983-01-01
The design and development of a 3-D Digital Integrated Automatic Landing System (DIALS) for the Terminal Configured Vehicle (TCV) Research Aircraft, a B-737-100, is described. The system was designed using sampled data Linear Quadratic Gaussian (LQG) methods, resulting in a direct digital design with a modern control structure which consists of a Kalman filter followed by a control gain matrix, all operating at 10 Hz. DIALS uses Microwave Landing System (MLS) position, body-mounted accelerometers, as well as on-board sensors usually available on commercial aircraft, but does not use inertial platforms. The phases of the final approach considered are the localizer and glideslope capture which may be performed simultaneously, localizer and steep glideslope track or hold, crab/decrab and flare to touchdown. DIALS captures, tracks and flares from steep glideslopes ranging from 2.5 deg to 5.5 deg, selected prior to glideslope capture. DIALS is the first modern-control-design automatic landing system to be successfully flight tested. The results of an initial nonlinear simulation are presented here.
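The "Kalman filter followed by a control gain matrix" structure at a 10 Hz sample rate can be illustrated with a generic discrete-time linear model. The matrices, gains, and noise levels below are placeholders chosen for a stable toy loop; they are not the DIALS design.

import numpy as np

dt = 0.1                                    # 10 Hz sample rate
A = np.array([[1.0, dt], [0.0, 1.0]])       # toy state transition (position, velocity)
B = np.array([[0.5 * dt**2], [dt]])
H = np.array([[1.0, 0.0]])                  # measure position only (MLS-like measurement)
Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])
K_ctrl = np.array([[2.0, 3.0]])             # control gain matrix (placeholder values)

x_hat, P = np.zeros((2, 1)), np.eye(2)
x_true = np.array([[1.0], [0.0]])
rng = np.random.default_rng(10)

for k in range(50):
    u = -K_ctrl @ x_hat                     # control computed from the estimated state
    # propagate the "true" aircraft state and take a noisy measurement
    x_true = A @ x_true + B @ u + rng.multivariate_normal([0, 0], Q).reshape(2, 1)
    z = H @ x_true + rng.normal(0, np.sqrt(R[0, 0]))
    # Kalman predict
    x_hat = A @ x_hat + B @ u
    P = A @ P @ A.T + Q
    # Kalman update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_hat = x_hat + K @ (z - H @ x_hat)
    P = (np.eye(2) - K @ H) @ P

print("final state estimate:", x_hat.ravel())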
NASA Astrophysics Data System (ADS)
Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab
2015-12-01
Visceral Leishmaniasis is a parasitic disease that affects the liver, spleen and bone marrow. According to the World Health Organization report, definitive diagnosis is possible only by direct observation of the Leishman body in microscopic images taken from bone marrow samples. We utilize morphological operations and the CV level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. A linear contrast stretching method is used for image enhancement, and the morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to speed up the algorithm. Manual segmentation is used as the ground truth to evaluate the proposed method. The method was tested on 28 samples and achieved a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
Accelerating IMRT optimization by voxel sampling
NASA Astrophysics Data System (ADS)
Martin, Benjamin C.; Bortfeld, Thomas R.; Castañon, David A.
2007-12-01
This paper presents a new method for accelerating intensity-modulated radiation therapy (IMRT) optimization using voxel sampling. Rather than calculating the dose to the entire patient at each step in the optimization, the dose is only calculated for some randomly selected voxels. Those voxels are then used to calculate estimates of the objective and gradient which are used in a randomized version of a steepest descent algorithm. By selecting different voxels on each step, we are able to find an optimal solution to the full problem. We also present an algorithm to automatically choose the best sampling rate for each structure within the patient during the optimization. Seeking further improvements, we experimented with several other gradient-based optimization algorithms and found that the delta-bar-delta algorithm performs well despite the randomness. Overall, we were able to achieve approximately an order of magnitude speedup on our test case as compared to steepest descent.
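The voxel-sampling idea can be sketched as a randomized steepest-descent loop on a least-squares dose objective, where the gradient at each step is estimated only from a random subset of voxel rows. The dose-influence matrix, step size, and sampling fraction below are synthetic placeholders; the structure-wise adaptive sampling rates and the delta-bar-delta variant from the paper are not shown.

import numpy as np

rng = np.random.default_rng(4)
n_vox, n_beamlets = 2000, 50
D = rng.random((n_vox, n_beamlets))            # dose-influence matrix (synthetic)
d_target = rng.random(n_vox)                   # prescribed voxel doses (synthetic)
x = np.zeros(n_beamlets)                       # beamlet weights

step, sample_frac = 0.01, 0.1
for it in range(300):
    idx = rng.choice(n_vox, int(sample_frac * n_vox), replace=False)   # sampled voxels
    resid = D[idx] @ x - d_target[idx]
    grad = 2.0 * D[idx].T @ resid / len(idx)   # gradient estimate from the sampled voxels
    x = np.maximum(x - step * grad, 0.0)       # descent step, keep weights non-negative

print("full objective:", np.mean((D @ x - d_target) ** 2))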
A tool for developing an automatic insect identification system based on wing outlines
Yang, He-Ping; Ma, Chun-Sen; Wen, Hui; Zhan, Qing-Bin; Wang, Xin-Li
2015-01-01
For some insect groups, wing outline is an important character for species identification. We have constructed a program as the integral part of an automated system to identify insects based on wing outlines (DAIIS). This program includes two main functions: (1) outline digitization and Elliptic Fourier transformation and (2) classifier model training by pattern recognition of support vector machines and model validation. To demonstrate the utility of this program, a sample of 120 owlflies (Neuroptera: Ascalaphidae) was split into training and validation sets. After training, the sample was sorted into seven species using this tool. In five repeated experiments, the mean accuracy for identification of each species ranged from 90% to 98%. The accuracy increased to 99% when the samples were first divided into two groups based on features of their compound eyes. DAIIS can therefore be a useful tool for developing a system of automated insect identification. PMID:26251292
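A simplified stand-in for the outline-descriptor step is sketched below: the closed wing outline is treated as a sequence of complex numbers, its discrete Fourier coefficients are normalized for translation and scale, and the resulting vector would be passed to an SVM classifier. This plain Fourier descriptor is an assumption for illustration; DAIIS itself uses the elliptic Fourier transformation.

import numpy as np

def outline_descriptor(xy, n_harmonics=10):
    """xy: (N, 2) points along a closed wing outline. Returns a translation- and
    scale-normalised Fourier descriptor (simplified stand-in for elliptic Fourier analysis)."""
    z = xy[:, 0] + 1j * xy[:, 1]          # treat the outline as complex numbers
    coeffs = np.fft.fft(z) / len(z)
    coeffs[0] = 0.0                       # drop the DC term -> translation invariance
    coeffs /= np.abs(coeffs[1])           # scale by the first harmonic -> size invariance
    return np.abs(coeffs[1:n_harmonics + 1])

t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
wing = np.c_[2.0 * np.cos(t), 1.0 * np.sin(t)]    # toy elliptical "wing outline"
print(outline_descriptor(wing))                   # feature vector for an SVM classifier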
Heavy metals phytoremediation potential of Hevea brasiliensis in Bentong, Malaysia
NASA Astrophysics Data System (ADS)
Yusof, Muhammad Jefri Mohd; Latif, Mohd Talip; Yusoff, Siti Fairus Mohd
2018-04-01
Biomonitoring uses living organisms to assess environmental quality and being preferred over conventional methods that use fully or semi-automatic gauges for its lower cost and practicality. Recently, higher plants are widely used for biomonitoring purposes by means of their species identification simplicity, larger availability of biological substantial, and easy to sample. In this study, samples of Hevea brasiliensis (i.e leaves, barks, and latex as well as surrounding soils) from outskirts of Pelangai, Bentong were tested for heavy metals by using inductively coupled plasma optical emission spectroscopy (ICP-OES). Enrichment factor of soils indicated that some metals (B, Ca, Cu, Mn, Pb, Zn, As and Na) were anthropogenic which most likely originated from traffic emissions. In addition, leaves trapped the most heavy metals compared to barks and latex. The accumulation of pollutants in those samples has identified biomonitoring abilities of Hevea brasiliensis.
Universal explosive detection system for homeland security applications
NASA Astrophysics Data System (ADS)
Lee, Vincent Y.; Bromberg, Edward E. A.
2010-04-01
L-3 Communications CyTerra Corporation has developed a high throughput universal explosive detection system (PassPort) to automatically screen the passengers in airports without requiring them to remove their shoes. The technical approach is based on the patented energetic material detection (EMD) technology. By analyzing the results of sample heating with an infrared camera, one can distinguish the deflagration or decomposition of an energetic material from other clutters such as flammables and general background substances. This becomes the basis of a universal explosive detection system that does not require a library and is capable of detecting trace levels of explosives with a low false alarm rate. The PassPort is a simple turnstile type device and integrates a non-intrusive aerodynamic sampling scheme that has been shown capable of detecting trace levels of explosives on shoes. A detailed description of the detection theory and the automated sampling techniques, as well as the field test results, will be presented.
Headspace profiling of cocaine samples for intelligence purposes.
Dujourdy, Laurence; Besacier, Fabrice
2008-08-06
A method for determination of residual solvents in illicit hydrochloride cocaine samples using static headspace-gas chromatography (HS-GC) associated with a storage computerized procedure is described for the profiling and comparison of seizures. The system involves a gas chromatographic separation of 18 occluded solvents followed by fully automatic data analysis and transfer to a PHP/MySQL database. First, a fractional factorial design was used to evaluate the main effects of some critical method parameters (salt choice, vial agitation intensity, oven temperature, pressurization and loop equilibration) on the results with a minimum of experiments. The method was then validated for tactical intelligence purposes (batch comparison) via several studies: selection of solvents and mathematical comparison tool, reproducibility and "cutting" influence studies. The decision threshold to determine the similarity of two samples was set and false positives and negatives evaluated. Finally, application of the method to distinguish geographical origins is discussed.
Patient safety with blood products administration using wireless and bar-code technology.
Porcella, Aleta; Walker, Kristy
2005-01-01
Supported by a grant from the Agency for Healthcare Research and Quality, a University of Iowa Hospitals and Clinics interdisciplinary research team created an online data-capture-response tool utilizing wireless mobile devices and bar code technology to track and improve blood products administration process. The tool captures 1) sample collection, 2) sample arrival in the blood bank, 3) blood product dispense from blood bank, and 4) administration. At each step, the scanned patient wristband ID bar code is automatically compared to scanned identification barcode on requisition, sample, and/or product, and the system presents either a confirmation or an error message to the user. Following an eight-month, 5 unit, staged pilot, a 'big bang,' hospital-wide implementation occurred on February 7, 2005. Preliminary results from pilot data indicate that the new barcode process captures errors 3 to 10 times better than the old manual process.
Influence of Sample Size of Polymer Materials on Aging Characteristics in the Salt Fog Test
NASA Astrophysics Data System (ADS)
Otsubo, Masahisa; Anami, Naoya; Yamashita, Seiji; Honda, Chikahisa; Takenouchi, Osamu; Hashimoto, Yousuke
Polymer insulators have been used worldwide because of several properties superior to those of porcelain insulators: light weight, high mechanical strength, good hydrophobicity, etc. In this paper, the effect of sample size on aging characteristics in the salt fog test is examined. Leakage current was measured using a 100 MHz AD board or a 100 MHz digital oscilloscope and separated into three components, conductive current, corona discharge current and dry-band arc discharge current, using FFT and a newly proposed current differential method. The cumulative charge of each component was estimated automatically by a personal computer. As a result, when the sample size increased under the same average applied electric field, the peak values of the leakage current and of each component current increased. In particular, the cumulative charge and the arc length of the dry-band arc discharge increased remarkably with increasing gap length.
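A rough illustration of the separation-and-accumulation idea is given below: a synthetic leakage-current waveform is split by FFT band selection into a fundamental (conductive) part and higher-frequency content, and the charge of each part is accumulated. The sampling rate, supply frequency, and band limits are assumptions for illustration; the paper's current differential method and three-way separation are not reproduced.

import numpy as np

fs, f0 = 1.0e6, 60.0                              # toy sampling rate and supply frequency
t = np.arange(0, 0.1, 1 / fs)
# synthetic leakage current: 60 Hz conductive part plus a higher-frequency discharge part
i_leak = 1e-3 * np.sin(2 * np.pi * f0 * t) + 2e-4 * np.sin(2 * np.pi * 15 * f0 * t)

spec = np.fft.rfft(i_leak)
freqs = np.fft.rfftfreq(len(i_leak), d=1 / fs)

conductive = np.where(freqs <= 2 * f0, spec, 0)   # keep only the fundamental band
discharge = spec - conductive                     # everything above the fundamental

i_cond = np.fft.irfft(conductive, n=len(t))
i_disc = np.fft.irfft(discharge, n=len(t))
print("conductive charge [C]:", np.sum(np.abs(i_cond)) / fs)
print("discharge-band charge [C]:", np.sum(np.abs(i_disc)) / fs)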
Self-similarity Clustering Event Detection Based on Triggers Guidance
NASA Astrophysics Data System (ADS)
Zhang, Xianfei; Li, Bicheng; Tian, Yuxuan
The traditional approach to Event Detection and Characterization (EDC) treats event detection as a classification task. It uses words as training samples for the classifier, which can lead to an imbalance between positive and negative samples. In addition, this approach suffers from data sparseness when the corpus is small. Instead of classifying events with words as samples, this paper clusters events when judging event types. It uses self-similarity, guided by event triggers, to converge on the value of K in the K-means algorithm, and thereby optimizes the clustering. Then, by combining named entities and their relative position information, the new method further pins down the exact event type. The new method avoids the dependence on event templates found in traditional methods, and its event detection results can be used in automatic text summarization, text retrieval, and topic detection and tracking.
Automatic extraction of discontinuity orientation from rock mass surface 3D point cloud
NASA Astrophysics Data System (ADS)
Chen, Jianqin; Zhu, Hehua; Li, Xiaojun
2016-10-01
This paper presents a new method for extracting discontinuity orientation automatically from rock mass surface 3D point cloud. The proposed method consists of four steps: (1) automatic grouping of discontinuity sets using an improved K-means clustering method, (2) discontinuity segmentation and optimization, (3) discontinuity plane fitting using Random Sample Consensus (RANSAC) method, and (4) coordinate transformation of discontinuity plane. The method is first validated by the point cloud of a small piece of a rock slope acquired by photogrammetry. The extracted discontinuity orientations are compared with measured ones in the field. Then it is applied to a publicly available LiDAR data of a road cut rock slope at Rockbench repository. The extracted discontinuity orientations are compared with the method proposed by Riquelme et al. (2014). The results show that the presented method is reliable and of high accuracy, and can meet the engineering needs.
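Step (3) above, RANSAC plane fitting to a clustered point set, can be sketched as follows; the returned unit normal is what the dip and dip direction of the discontinuity would be derived from. The clustering, segmentation, and coordinate-transformation steps are omitted, and the tolerance, iteration count, and synthetic data are illustrative assumptions.

import numpy as np

def ransac_plane(points, n_iter=500, tol=0.02, rng=np.random.default_rng(5)):
    """Fit a plane to an (N, 3) point cloud with RANSAC; returns (unit normal, inlier indices)."""
    best_normal, best_inliers = None, np.array([], dtype=int)
    for _ in range(n_iter):
        p = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p[1] - p[0], p[2] - p[0])
        if np.linalg.norm(n) < 1e-9:
            continue                                    # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        dist = np.abs((points - p[0]) @ n)              # point-to-plane distances
        inliers = np.where(dist < tol)[0]
        if len(inliers) > len(best_inliers):
            best_normal, best_inliers = n, inliers
    return best_normal, best_inliers

rng = np.random.default_rng(6)
xy = rng.uniform(-1, 1, (300, 2))
pts = np.c_[xy, 0.3 * xy[:, 0] - 0.2 * xy[:, 1] + rng.normal(0, 0.005, 300)]
normal, inliers = ransac_plane(pts)
print("plane normal:", normal, "inliers:", len(inliers))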
Moderators of the Relationship between Implicit and Explicit Evaluation
Nosek, Brian A.
2005-01-01
Automatic and controlled modes of evaluation sometimes provide conflicting reports of the quality of social objects. This paper presents evidence for four moderators of the relationship between automatic (implicit) and controlled (explicit) evaluations. Implicit and explicit preferences were measured for a variety of object pairs using a large sample. The average correlation was r = .36, and 52 of the 57 object pairs showed a significant positive correlation. Results of multilevel modeling analyses suggested that: (a) implicit and explicit preferences are related, (b) the relationship varies as a function of the objects assessed, and (c) at least four variables moderate the relationship – self-presentation, evaluative strength, dimensionality, and distinctiveness. The variables moderated implicit-explicit correspondence across individuals and accounted for much of the observed variation across content domains. The resulting model of the relationship between automatic and controlled evaluative processes is grounded in personal experience with the targets of evaluation. PMID:16316292
Diffraction phase microscopy realized with an automatic digital pinhole
NASA Astrophysics Data System (ADS)
Zheng, Cheng; Zhou, Renjie; Kuang, Cuifang; Zhao, Guangyuan; Zhang, Zhimin; Liu, Xu
2017-12-01
We report a novel approach to diffraction phase microscopy (DPM) with automatic pinhole alignment. The pinhole, which serves as a spatial low-pass filter to generate a uniform reference beam, is made out of a liquid crystal display (LCD) device that allows for electrical control. We have made DPM more accessible to users, while maintaining high phase measurement sensitivity and accuracy, through exploring low cost optical components and replacing the tedious pinhole alignment process with an automatic pinhole optical alignment procedure. Due to its flexibility in modifying the size and shape, this LCD device serves as a universal filter, requiring no future replacement. Moreover, a graphic user interface for real-time phase imaging has been also developed by using a USB CMOS camera. Experimental results of height maps of beads sample and live red blood cells (RBCs) dynamics are also presented, making this system ready for broad adaption to biological imaging and material metrology.
NASA Astrophysics Data System (ADS)
Giorgino, Toni
2018-07-01
The proper choice of collective variables (CVs) is central to biased-sampling free energy reconstruction methods in molecular dynamics simulations. The PLUMED 2 library, for instance, provides several sophisticated CV choices, implemented in a C++ framework; however, developing new CVs is still time consuming due to the need to provide code for the analytical derivatives of all functions with respect to atomic coordinates. We present two solutions to this problem, namely (a) symbolic differentiation and code generation, and (b) automatic code differentiation, in both cases leveraging open-source libraries (SymPy and Stan Math, respectively). The two approaches are demonstrated and discussed in detail implementing a realistic example CV, the local radius of curvature of a polymer. Users may use the code as a template to streamline the implementation of their own CVs using high-level constructs and automatic gradient computation.
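Since the symbolic route in the paper builds on SymPy, a small illustration of that workflow for a much simpler CV (an interatomic distance) is given below: the CV is defined symbolically, its analytical derivatives with respect to every coordinate are obtained with sympy.diff, and C code for the value and gradient is emitted. The curvature CV and the generated PLUMED-ready C++ from the paper are not reproduced.

import sympy as sp

# Coordinates of two atoms as symbols
x1, y1, z1, x2, y2, z2 = sp.symbols('x1 y1 z1 x2 y2 z2', real=True)

# A simple collective variable: the interatomic distance
cv = sp.sqrt((x2 - x1)**2 + (y2 - y1)**2 + (z2 - z1)**2)

# Analytical derivatives with respect to every atomic coordinate
coords = (x1, y1, z1, x2, y2, z2)
grads = [sp.simplify(sp.diff(cv, c)) for c in coords]

# Emit C code for the value and its gradient (the quantities a CV implementation must provide)
print(sp.ccode(cv))
for c, g in zip(coords, grads):
    print(f"d(cv)/d({c}) =", sp.ccode(g))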
Sample Preparation for Electron Probe Microanalysis—Pushing the Limits
Geller, Joseph D.; Engle, Paul D.
2002-01-01
There are two fundamental considerations in preparing samples for electron probe microanalysis (EPMA). The first one may seem obvious, but we often find it is overlooked. That is, the sample analyzed should be representative of the population from which it comes. The second is a direct result of the assumptions in the calculations used to convert x-ray intensity ratios, between the sample and standard, to concentrations. Samples originate from a wide range of sources. During their journey to being excited under the electron beam for the production of x rays there are many possibilities for sample alteration. Handling can contaminate samples by adding extraneous matter. In preparation, the various abrasives used in sizing the sample by sawing, grinding and polishing can embed themselves. The most accurate composition of a contaminated sample is, at best, not representative of the original sample; it is misleading. Our laboratory performs EPMA analysis on customer submitted samples and prepares over 250 different calibration standards including pure elements, compounds, alloys, glasses and minerals. This large variety of samples does not lend itself to mass production techniques, including automatic polishing. Our manual preparation techniques are designed individually for each sample. The use of automated preparation equipment does not lend itself to this environment, and is not included in this manuscript. The final step in quantitative electron probe microanalysis is the conversion of x-ray intensities ratios, known as the “k-ratios,” to composition (in mass fraction or atomic percent) and/or film thickness. Of the many assumptions made in the ZAF (where these letters stand for atomic number, absorption and fluorescence) corrections the localized geometry between the sample and electron beam, or takeoff angle, must be accurately known. Small angular errors can lead to significant errors in the final results. The sample preparation technique then becomes very important, and, under certain conditions, may even be the limiting factor in the analytical uncertainty budget. This paper considers preparing samples to get known geometries. It will not address the analysis of samples with irregular, unprepared surfaces or unknown geometries. PMID:27446757
Reincke, Ulrich; Michelmann, Hans Wilhelm
2009-01-01
Background Both healthy and sick people increasingly use electronic media to obtain medical information and advice. For example, Internet users may send requests to Web-based expert forums, or so-called “ask the doctor” services. Objective To automatically classify lay requests to an Internet medical expert forum using a combination of different text-mining strategies. Methods We first manually classified a sample of 988 requests directed to an involuntary childlessness forum on the German website “Rund ums Baby” (“Everything about Babies”) into one or more of 38 categories belonging to two dimensions (“subject matter” and “expectations”). After creating start and synonym lists, we calculated the average Cramer’s V statistic for the association of each word with each category. We also used principal component analysis and singular value decomposition as further text-mining strategies. With these measures we trained regression models and determined, on the basis of best regression models, for any request the probability of belonging to each of the 38 different categories, with a cutoff of 50%. Recall and precision of a test sample were calculated as a measure of quality for the automatic classification. Results According to the manual classification of 988 documents, 102 (10%) documents fell into the category “in vitro fertilization (IVF),” 81 (8%) into the category “ovulation,” 79 (8%) into “cycle,” and 57 (6%) into “semen analysis.” These were the four most frequent categories in the subject matter dimension (consisting of 32 categories). The expectation dimension comprised six categories; we classified 533 documents (54%) as “general information” and 351 (36%) as a wish for “treatment recommendations.” The generation of indicator variables based on the chi-square analysis and Cramer’s V proved to be the best approach for automatic classification in about half of the categories. In combination with the two other approaches, 100% precision and 100% recall were realized in 18 (47%) out of the 38 categories in the test sample. For 35 (92%) categories, precision and recall were better than 80%. For some categories, the input variables (ie, “words”) also included variables from other categories, most often with a negative sign. For example, absence of words predictive for “menstruation” was a strong indicator for the category “pregnancy test.” Conclusions Our approach suggests a way of automatically classifying and analyzing unstructured information in Internet expert forums. The technique can perform a preliminary categorization of new requests and help Internet medical experts to better handle the mass of information and to give professional feedback. PMID:19632978
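The word-category association score used above can be sketched as Cramér's V computed from a 2 × 2 contingency table of word presence versus category membership; thresholding such scores is how indicator variables would be built. The data, category, and word below are synthetic placeholders, not the forum corpus.

import numpy as np

def cramers_v(word_present, in_category):
    """Cramér's V for two binary vectors (word occurrence vs. category membership)."""
    word_present = np.asarray(word_present, dtype=bool)
    in_category = np.asarray(in_category, dtype=bool)
    n = len(word_present)
    # 2 x 2 contingency table of observed counts
    obs = np.array([[np.sum(word_present & in_category), np.sum(word_present & ~in_category)],
                    [np.sum(~word_present & in_category), np.sum(~word_present & ~in_category)]])
    expected = obs.sum(axis=1, keepdims=True) * obs.sum(axis=0, keepdims=True) / n
    chi2 = np.sum((obs - expected) ** 2 / expected)
    return np.sqrt(chi2 / (n * (min(obs.shape) - 1)))   # min(r,c) - 1 = 1 for a 2 x 2 table

rng = np.random.default_rng(7)
category = rng.random(988) < 0.10                       # e.g. requests in one subject category
word = category & (rng.random(988) < 0.8) | ~category & (rng.random(988) < 0.05)
print("Cramer's V:", cramers_v(word, category))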
Chemiluminescence Study on Thermal Degradation of Aircraft Tire Elastomers
NASA Technical Reports Server (NTRS)
Mendenhall, G. D.; Stanford, T. B.; Nathan, R. A.
1976-01-01
Since the autoxidative process accounts in part for the degradation of rubber, including aircraft tires, it was felt that a study of the chemiluminescence from unsaturated elastomers could contribute significantly to an understanding of the degradation mechanism. The study revealed similarities in chemiluminescence behavior between four elastomers which were investigated, and it shows that similar oxidation mechanisms occur. Oxidative chemiluminescence was observed from purified samples of cis-1,4-polybutadiene, cis-1,4-polyisoprene, trans-polypentenamer, and 1,2-polybutadiene in an oxygen atmosphere at 25-150 C. The elastomer samples were placed in a 600 watt oven which is equipped with gas inlets for introducing any desired atmosphere. Chemiluminescence emission from the samples was focused with a two inch quartz lens onto the detector of a 12" photomultiplier which is connected to a photon counter. A strip-chart recorder, connected to the counter, permitted automatic data collection. Diagrams of the apparatus are included. The chemical reactions which occurred from the thermal decomposition of the polymer samples are described, and results (and tabulated data) are discussed.
[Establishment of Automation System for Detection of Alcohol in Blood].
Tian, L L; Shen, Lei; Xue, J F; Liu, M M; Liang, L J
2017-02-01
To establish an automation system for the detection of alcohol content in blood. The determination was performed by an automated workstation for extraction combined with headspace gas chromatography (HS-GC). Blood collection under negative pressure, the sealing time of the headspace bottle, and the sample needle were checked and optimized for the automation system. Automatic sampling was compared with manual sampling. The quantitative data obtained by the automated extraction-HS-GC workstation for alcohol were stable. The relative differences between two parallel samples were less than 5%. The automated extraction was superior to the manual extraction. A good linear relationship was obtained over the alcohol concentration range of 0.1-3.0 mg/mL (r ≥ 0.999) with good repeatability. The method is simple and quick, with a more standardized experimental process and accurate experimental data. It eliminates experimenter error and has good repeatability, and it can be applied to the qualitative and quantitative detection of alcohol in blood. Copyright© by the Editorial Department of Journal of Forensic Medicine
Poppe, L.J.; Commeau, J.A.; Pense, G.M.
1989-01-01
Silver metal-membrane filters are commonly used as substrates in the preparation of oriented clay-mineral specimens for X-ray powder diffraction (XRD). They are relatively unaffected by organic solvent treatments and specimens can be prepared rapidly. The filter mounts are adaptable to automatic sample changers, have few discrete reflections at higher 2θ angles, and, because of the high atomic number of silver, produce a relatively low overall background compared with other membrane filters, such as cellulose (Poppe and Hathaway, 1979). The silver metal-membrane filters, however, present some problems after heat treatment if either the filters or the samples contain significant amounts of chlorine. At elevated temperature, the chloride ions react with the silver substrate to form crystalline compounds. These compounds change the mass-absorption coefficient of the sample, reducing peak intensities and areas and, therefore, complicating the semiquantitative estimation of clay minerals. A simple procedure that eliminates most of the chloride from a sample and the silver metal-membrane substrate is presented here.
Zhang, Meihua; Bi, Jinhu; Yang, Cui; Li, Donghao; Piao, Xiangfan
2012-01-01
In order to achieve rapid, automatic, and efficient extraction of trace chemicals from samples, a gas-purged headspace liquid phase microextraction (GP-HS-LPME) system has been developed based on the original HS-LPME technique. In this system, a semiconductor condenser and heater, whose refrigerating and heating temperatures were controlled by a microcontroller, were designed to cool the extraction solvent and to heat the sample, respectively. In addition, inert gas, whose flow rate was adjusted by a mass flow controller, was continuously introduced into and discharged from the system. Under optimized parameters, extraction experiments were performed using the GP-HS-LPME system and the original HS-LPME technique to enrich volatile and semivolatile target compounds from the same sample, a standard mixture of 15 PAHs. GC-MS analysis of the two experiments indicated that a higher enrichment factor was obtained with GP-HS-LPME. The enrichment results demonstrate the potential of the GP-HS-LPME system for the determination of volatile and semivolatile analytes in various kinds of samples. PMID:22448341
A piezo-ring-on-chip microfluidic device for simple and low-cost mass spectrometry interfacing.
Tsao, Chia-Wen; Lei, I-Chao; Chen, Pi-Yu; Yang, Yu-Liang
2018-02-12
Mass spectrometry (MS) interfacing technology provides the means for incorporating microfluidic processing with post MS analysis. In this study, we propose a simple piezo-ring-on-chip microfluidic device for the controlled spraying of MALDI-MS targets. This device uses a low-cost, commercially-available ring-shaped piezoelectric acoustic atomizer (piezo-ring) directly integrated into a polydimethylsiloxane microfluidic device to spray the sample onto the MS target substrate. The piezo-ring-on-chip microfluidic device's design, fabrication, and actuation, and its pulsatile pumping effects were evaluated. The spraying performance was examined by depositing organic matrix samples onto the MS target substrate by using both an automatic linear motion motor, and manual deposition. Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) was performed to analyze the peptide samples on the MALDI target substrates. Using our technique, model peptides with 10^-6 M concentration can be successfully detected. The results also indicate that the piezo-ring-on-chip approach forms finer matrix crystals and presents better MS signal uniformity with little sample consumption compared to the conventional pipetting method.
Golden, J.P.; Verbarg, J.; Howell, P.B.; Shriver-Lake, L.C.; Ligler, F.S.
2012-01-01
A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, was demonstrated for detection of Escherichia coli 0157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose–response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. PMID:22960010
Golden, J P; Verbarg, J; Howell, P B; Shriver-Lake, L C; Ligler, F S
2013-02-15
A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, was demonstrated for detection of Escherichia coli 0157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Yan, Yue
2018-03-01
A synthetic aperture radar (SAR) automatic target recognition (ATR) method based on the convolutional neural networks (CNN) trained by augmented training samples is proposed. To enhance the robustness of CNN to various extended operating conditions (EOCs), the original training images are used to generate the noisy samples at different signal-to-noise ratios (SNRs), multiresolution representations, and partially occluded images. Then, the generated images together with the original ones are used to train a designed CNN for target recognition. The augmented training samples can contrapuntally improve the robustness of the trained CNN to the covered EOCs, i.e., the noise corruption, resolution variance, and partial occlusion. Moreover, the significantly larger training set effectively enhances the representation capability for other conditions, e.g., the standard operating condition (SOC), as well as the stability of the network. Therefore, better performance can be achieved by the proposed method for SAR ATR. For experimental evaluation, extensive experiments are conducted on the Moving and Stationary Target Acquisition and Recognition dataset under SOC and several typical EOCs.
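The sample-augmentation step described above can be sketched with three simple generators: additive Gaussian noise at a requested SNR, a coarser-resolution representation, and a random rectangular occlusion. The chip size, SNR values, and occlusion fraction are illustrative assumptions; the CNN architecture and training procedure from the paper are not shown.

import numpy as np

rng = np.random.default_rng(8)

def add_noise(img, snr_db):
    """Add white Gaussian noise so the image reaches the requested SNR (in dB)."""
    p_signal = np.mean(img ** 2)
    p_noise = p_signal / (10 ** (snr_db / 10))
    return img + rng.normal(0.0, np.sqrt(p_noise), img.shape)

def lower_resolution(img, factor=2):
    """Block-average then repeat to mimic a coarser-resolution representation."""
    h, w = img.shape
    small = img[: h - h % factor, : w - w % factor]
    small = small.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, factor, axis=0), factor, axis=1)

def occlude(img, frac=0.25):
    """Zero out a random rectangular patch covering roughly `frac` of the chip."""
    h, w = img.shape
    ph, pw = int(h * np.sqrt(frac)), int(w * np.sqrt(frac))
    r, c = rng.integers(0, h - ph), rng.integers(0, w - pw)
    out = img.copy()
    out[r:r + ph, c:c + pw] = 0.0
    return out

chip = rng.random((64, 64))                       # stand-in for a SAR target chip
augmented = [add_noise(chip, snr) for snr in (0, 5, 10)] + \
            [lower_resolution(chip, 2), occlude(chip, 0.25)]
print(len(augmented), "augmented samples generated from one original image")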